Nov 27 16:03:42 crc systemd[1]: Starting Kubernetes Kubelet...
Nov 27 16:03:42 crc restorecon[4686]: Relabeled /var/lib/kubelet/config.json from system_u:object_r:unlabeled_t:s0 to system_u:object_r:container_var_lib_t:s0
Nov 27 16:03:42 crc restorecon[4686]: /var/lib/kubelet/device-plugins not reset as customized by admin to system_u:object_r:container_file_t:s0
Nov 27 16:03:42 crc restorecon[4686]: /var/lib/kubelet/device-plugins/kubelet.sock not reset as customized by admin to system_u:object_r:container_file_t:s0
Nov 27 16:03:42 crc restorecon[4686]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/volumes/kubernetes.io~configmap/nginx-conf/..2025_02_23_05_40_35.4114275528/nginx.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Nov 27 16:03:42 crc restorecon[4686]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Nov 27 16:03:42 crc restorecon[4686]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/22e96971 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Nov 27 16:03:42 crc restorecon[4686]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/21c98286 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Nov 27 16:03:42 crc restorecon[4686]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/0f1869e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Nov 27 16:03:42 crc restorecon[4686]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Nov 27 16:03:42 crc restorecon[4686]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/46889d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Nov 27 16:03:42 crc restorecon[4686]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/5b6a5969 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Nov 27 16:03:42 crc restorecon[4686]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/6c7921f5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Nov 27 16:03:42 crc restorecon[4686]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4804f443 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Nov 27 16:03:42 crc restorecon[4686]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/2a46b283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Nov 27 16:03:42 crc restorecon[4686]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/a6b5573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Nov 27 16:03:42 crc restorecon[4686]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4f88ee5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Nov 27 16:03:42 crc restorecon[4686]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/5a4eee4b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Nov 27 16:03:42 crc restorecon[4686]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/cd87c521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Nov 27 16:03:42 crc restorecon[4686]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Nov 27 16:03:42 crc restorecon[4686]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Nov 27 16:03:42 crc restorecon[4686]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Nov 27 16:03:42 crc restorecon[4686]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Nov 27 16:03:42 crc restorecon[4686]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/38602af4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/1483b002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/0346718b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/d3ed4ada not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/3bb473a5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/8cd075a9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/00ab4760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/54a21c09 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/70478888 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/43802770 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/955a0edc not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/bca2d009 not reset as customized by admin to system_u:object_r:container_file_t:s0:c140,c1009
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/b295f9bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/bc46ea27 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5731fc1b not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5e1b2a3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/943f0936 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/3f764ee4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/8695e3f9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/aed7aa86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/c64d7448 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/0ba16bd2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/207a939f not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/54aa8cdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/1f5fa595 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/bf9c8153 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/47fba4ea not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/7ae55ce9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7906a268 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/ce43fa69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7fc7ea3a not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/d8c38b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/9ef015fb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/b9db6a41 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/b1733d79 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/afccd338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/9df0a185 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/18938cf8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/7ab4eb23 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/56930be6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_35.630010865 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/0d8e3722 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/d22b2e76 not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/e036759f not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/2734c483 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/57878fe7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/3f3c2e58 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/375bec3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/7bc41e08 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/48c7a72d not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/4b66701f not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/a5a1c202 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_40.1388695756 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/26f3df5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/6d8fb21d not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/50e94777 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208473b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/ec9e08ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3b787c39 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208eaed5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/93aa3a2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3c697968 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/ba950ec9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/cb5cdb37 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/f2df9827 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/fedaa673 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/9ca2df95 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/b2d7460e not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2207853c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/241c1c29 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2d910eaf not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/c6c0f2e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/399edc97 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8049f7cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/0cec5484 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/312446d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c406,c828
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8e56a35d not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Nov 27 16:03:43 crc restorecon[4686]:
/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/2d30ddb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/eca8053d not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/c3a25c9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c168,c522 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/b9609c22 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/e8b0eca9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/b36a9c3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Nov 27 16:03:43 crc restorecon[4686]: 
/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/38af7b07 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/ae821620 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/baa23338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/2c534809 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c661,c999 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/59b29eae not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/c91a8e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/4d87494a not reset as customized by admin to system_u:object_r:container_file_t:s0:c442,c857 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/1e33ca63 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/8dea7be2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d0b04a99 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d84f01e7 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c12,c18 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/4109059b not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/a7258a3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/05bdf2b6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/f3261b51 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/315d045e not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/5fdcf278 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/d053f757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/c2850dc7 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fcfb0b2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c7ac9b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc 
restorecon[4686]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fa0c0d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c609b6ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/2be6c296 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/89a32653 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/4eb9afeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/13af6efa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/b03f9724 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/e3d105cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Nov 27 16:03:43 crc restorecon[4686]: 
/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/3aed4d83 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/0765fa6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/2cefc627 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c18 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/3dcc6345 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/365af391 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Nov 27 16:03:43 crc restorecon[4686]: 
/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b1130c0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/236a5913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b9432e26 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/5ddb0e3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/986dc4fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/8a23ff9a not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c9,c12 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/9728ae68 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/665f31d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Nov 27 16:03:43 crc restorecon[4686]: 
/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Nov 27 16:03:43 crc 
restorecon[4686]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/136c9b42 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/98a1575b not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/cac69136 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/5deb77a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/2ae53400 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Nov 27 
16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/e46f2326 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/dc688d3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/3497c3cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/177eb008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/af5a2afa not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/d780cb1f not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/49b0f374 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Nov 27 16:03:43 crc restorecon[4686]: 
/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/26fbb125 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/cf14125a not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/b7f86972 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c11 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/e51d739c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/88ba6a69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/669a9acf not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/5cd51231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/75349ec7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/15c26839 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/45023dcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/2bb66a50 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/64d03bdd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Nov 27 16:03:43 crc 
restorecon[4686]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/ab8e7ca0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/bb9be25f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/9a0b61d3 not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/d471b9d2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/8cb76b8e not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/11a00840 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/ec355a92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/992f735e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797/config.yaml not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d59cdbbc not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/72133ff0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/c56c834c not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d13724c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/0a498258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Nov 27 16:03:43 crc restorecon[4686]: 
/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa471982 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fc900d92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa7d68da not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/4bacf9b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/424021b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/fc2e31a3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/f51eefac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/c8997f2f not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c9,c22 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/7481f599 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/fdafea19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Nov 27 16:03:43 crc restorecon[4686]: 
/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/d0e1c571 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/ee398915 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/682bb6b8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a3e67855 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a989f289 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/915431bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/7796fdab not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/dcdb5f19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/a3aaa88c not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/5508e3e6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/160585de not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/e99f8da3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/8bc85570 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/a5861c91 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/84db1135 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/9e1a6043 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/c1aba1c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/d55ccd6d not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Nov 27 16:03:43 crc restorecon[4686]: 
/var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/971cc9f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/8f2e3dcf not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/ceb35e9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/1c192745 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/5209e501 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/f83de4df not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/e7b978ac not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/c64304a1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/5384386b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/etc-hosts not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c268,c620 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/cce3e3ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/8fb75465 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/740f573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/32fd1134 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/0a861bd3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/80363026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/bfa952a8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c129,c158 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..2025_02_23_05_33_31.333075221 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Nov 27 16:03:43 crc 
restorecon[4686]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/793bf43d not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/7db1bb6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/4f6a0368 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/c12c7d86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/36c4a773 not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/4c1e98ae not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/a4c8115c not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/setup/7db1802e not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Nov 27 16:03:43 crc restorecon[4686]: 
/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver/a008a7ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-syncer/2c836bac not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-regeneration-controller/0ce62299 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-insecure-readyz/945d2457 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-check-endpoints/7d5c1dd8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/index.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/bundle-v1.15.0.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/channel.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/package.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/bc8d0691 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/6b76097a not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/34d1af30 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/312ba61c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/645d5dd1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/16e825f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/4cf51fc9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/2a23d348 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/075dbd49 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/dd585ddd not reset as customized by admin to system_u:object_r:container_file_t:s0:c377,c642
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/17ebd0ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c343
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/005579f4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_23_11.1287037894 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/bf5f3b9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/af276eb7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/ea28e322 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/692e6683 not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/871746a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/4eb2e958 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/ca9b62da not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/0edd6fce not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/tls-ca-bundle.pem not reset as customized by admin to
system_u:object_r:container_file_t:s0:c14,c22 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/containers/controller-manager/89b4555f not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/655fcd71 not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/0d43c002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/e68efd17 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/9acf9b65 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/5ae3ff11 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/1e59206a not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/27af16d1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c304,c1017 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/7918e729 not reset as customized by admin to system_u:object_r:container_file_t:s0:c853,c893 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/5d976d0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c585,c981 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c25 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c25 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/d7f55cbb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/f0812073 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/1a56cbeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/7fdd437e not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/cdfb5652 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c336,c787 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Nov 27 
16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Nov 27 16:03:43 crc restorecon[4686]: 
/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/fix-audit-permissions/fb93119e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver/f1e8fc0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver-check-endpoints/218511f3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Nov 27 16:03:43 crc restorecon[4686]: 
/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server/serving-certs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/ca8af7b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/72cc8a75 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/6e8a3760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Nov 27 16:03:43 crc 
restorecon[4686]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4c3455c0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/2278acb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4b453e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/3ec09bda not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:03:43 crc restorecon[4686]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2/cacerts.bin not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java/cacerts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl/ca-bundle.trust.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:03:43 crc 
restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/email-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/objsign-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2ae6433e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fde84897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75680d2e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/openshift-service-serving-signer_1740288168.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/facfc4fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f5a969c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CFCA_EV_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9ef4a08a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ingress-operator_1740288202.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2f332aed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/248c8271.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d10a21f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ACCVRAIZ1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a94d09e5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c9a4d3b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40193066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd8c0d63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b936d1c6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CA_Disig_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4fd49c6c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM_SERVIDORES_SEGUROS.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b81b93f0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f9a69fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b30d5fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ANF_Secure_Server_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b433981b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93851c9e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9282e51c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7dd1bc4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Actalis_Authentication_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/930ac5d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f47b495.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e113c810.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5931b5bc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Commercial.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2b349938.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e48193cf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/302904dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a716d4ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Networking.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93bc0acc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/86212b19.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b727005e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbc54cab.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f51bb24c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c28a8a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9c8dfbd4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ccc52f49.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cb1c3204.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ce5e74ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd08c599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6d41d539.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb5fa911.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e35234b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8cb5ee0f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a7c655d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f8fc53da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/de6d66f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d41b5e2a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/41a3f684.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1df5a75f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_2011.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e36a6752.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b872f2b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9576d26b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/228f89db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_ECC_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb717492.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d21b73c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b1b94ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/595e996b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_RSA_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b46e03d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/128f4b91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_3_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81f2d2b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Autoridad_de_Certificacion_Firmaprofesional_CIF_A62634068.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3bde41ac.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d16a5865.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_EC-384_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0179095f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ffa7f1eb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9482e63a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4dae3dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e359ba6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7e067d03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/95aff9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7746a63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Baltimore_CyberTrust_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/653b494a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3ad48a91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_2_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/54657681.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/82223c44.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8de2f56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d9dafe4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d96b65e2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee64a828.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40547a79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5a3f0ff8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a780d93.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/34d996fb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/eed8c118.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/89c02a45.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b1159c4c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d6325660.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4c339cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8312c4c1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_E1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8508e720.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5fdd185d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48bec511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/69105f4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b9bc432.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/32888f65.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b03dec0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/219d9499.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5acf816d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbf06781.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc99f41e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AAA_Certificate_Services.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/985c1f52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8794b4e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_BR_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7c037b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 27 16:03:43 crc restorecon[4686]:
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ef954a4e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_EV_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2add47b6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/90c5a3c8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0f3e76e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/53a1b57a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:03:43 crc restorecon[4686]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_EV_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5ad8a5d6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/68dd7389.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d04f354.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:03:43 crc restorecon[4686]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d6437c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/062cdee6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bd43e1dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7f3d5d1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c491639e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:03:43 crc restorecon[4686]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3513523f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/399e7759.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/feffd413.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d18e9066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/607986c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c90bc37d.0 not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1b0f7e5c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e08bfd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dd8e9d41.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed39abd0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a3418fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bc3f2570.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:03:43 crc restorecon[4686]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_High_Assurance_EV_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/244b5494.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81b9768f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4be590e0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_ECC_P384_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9846683b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:03:43 crc restorecon[4686]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/252252d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e8e7201.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_RSA4096_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d52c538d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c44cc0c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:03:43 crc restorecon[4686]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Trusted_Root_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75d1b2ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a2c66da8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ecccd8db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust.net_Certification_Authority__2048_.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/aee5f10d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:03:43 crc restorecon[4686]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e7271e8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0e59380.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4c3982f2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b99d060.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf64f35b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0a775a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/002c0b4f.0 not reset 
as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cc450945.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_EC1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/106f3e4d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b3fb433b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4042bcee.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:03:43 crc 
restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/02265526.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/455f1b52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0d69c7e1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9f727ac7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5e98733a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0cd152c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:03:43 crc restorecon[4686]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc4d6a89.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6187b673.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/FIRMAPROFESIONAL_CA_ROOT-A_WEB.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ba8887ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/068570d1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f081611a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48a195d8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GDCA_TrustAUTH_R5_ROOT.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f6fa695.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab59055e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b92fd57f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GLOBALTRUST_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fa5da96b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ec40989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7719f463.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:03:43 crc restorecon[4686]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1001acf7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f013ecaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/626dceaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c559d742.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1d3472b9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9479c8c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a81e292b.0 not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4bfab552.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e071171e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/57bcb2da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_ECC_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab5346f4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5046c355.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_RSA_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/865fbdf9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da0cfd1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/85cde254.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_ECC_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:03:43 crc restorecon[4686]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbb3f32b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureSign_RootCA11.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5860aaa6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/31188b5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HiPKI_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c7f1359b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:03:43 crc restorecon[4686]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f15c80c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hongkong_Post_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/09789157.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/18856ac4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e09d511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Commercial_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:03:43 crc restorecon[4686]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cf701eeb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d06393bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Public_Sector_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/10531352.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Izenpe.com.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureTrust_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0ed035a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:03:43 crc restorecon[4686]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsec_e-Szigno_Root_CA_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8160b96c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8651083.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2c63f966.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_ECC_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d89cda1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:03:43 crc restorecon[4686]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/01419da9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_RSA_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7a5b843.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_RSA_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf53fb88.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9591a472.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3afde786.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:03:43 crc restorecon[4686]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Gold_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NAVER_Global_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3fb36b73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d39b0a2c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a89d74c2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd58d51e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7db1890.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:03:43 crc restorecon[4686]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NetLock_Arany__Class_Gold__F__tan__s__tv__ny.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/988a38cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/60afe812.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f39fc864.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5443e9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GB_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e73d606e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:03:43 crc restorecon[4686]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dfc0fe80.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b66938e9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e1eab7c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GC_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/773e07ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c899c73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d59297b8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ddcda989.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_1_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/749e9e03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/52b525c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7e8dc79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a819ef2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:03:43 crc restorecon[4686]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/08063a00.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b483515.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/064e0aa9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1f58a078.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6f7454b3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7fa05551.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3.pem not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76faf6c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9339512a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f387163d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee37c333.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e18bfb83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e442e424.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:03:43 crc restorecon[4686]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fe8a2cd8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/23f4c490.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5cd81ad7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0c70a8d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7892ad52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SZAFIR_ROOT_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:03:43 crc restorecon[4686]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4f316efb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_RSA_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/06dc52d5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/583d0756.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0bf05006.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:03:43 crc restorecon[4686]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/88950faa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9046744a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c860d51.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_RSA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6fa5da56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/33ee480d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Secure_Global_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 16:03:43 crc restorecon[4686]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/63a2c897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_ECC_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bdacca6f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ff34af3f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbff3a01.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_ECC_RootCA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_C1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/406c9bb1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_C3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Services_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Silver_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/99e1b953.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/14bc7599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TUBITAK_Kamu_SM_SSL_Kok_Sertifikasi_-_Surum_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a3adc42.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f459871d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_ECC_Root_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_RSA_Root_2023.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TeliaSonera_Root_CA_v1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telia_Root_CA_v2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f103249.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f058632f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-certificates.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9bf03295.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/98aaf404.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1cef98f5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/073bfcc5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2923b3f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f249de83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/edcbddb5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P256_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b5697b0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ae85e5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b74d2bd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P384_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d887a5bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9aef356c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TunTrust_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd64f3fc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e13665f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Extended_Validation_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f5dc4f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da7377f6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Global_G2_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c01eb047.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/304d27c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed858448.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f30dd6ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/04f60c28.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_ECC_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fc5a8f99.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/35105088.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee532fd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/XRamp_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/706f604c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76579174.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d86cdd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/882de061.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f618aec.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a9d40e02.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e-Szigno_Root_CA_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e868b802.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/83e9984f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ePKI_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca6e4ad9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d6523ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4b718d9b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/869fbf79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/containers/registry/f8d22bdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/6e8bbfac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/54dd7996 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/a4f1bb05 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/207129da not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/c1df39e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/15b8f1cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/77bd6913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/2382c1b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/704ce128 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/70d16fe0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/bfb95535 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/57a8e8e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/1b9d3e5e not reset as customized by admin to system_u:object_r:container_file_t:s0:c107,c917
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/fddb173c not reset as customized by admin to system_u:object_r:container_file_t:s0:c202,c983
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/95d3c6c4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/bfb5fff5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/2aef40aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/c0391cad not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/1119e69d not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/660608b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/8220bd53 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/85f99d5c not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/4b0225f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/9c2a3394 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/e820b243 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/1ca52ea0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/e6988e45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/6655f00b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/98bc3986 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/08e3458a not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/2a191cb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/6c4eeefb not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/f61a549c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/24891863 not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/fbdfd89c not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/9b63b3bc not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/8acde6d6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/node-driver-registrar/59ecbba3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/csi-provisioner/685d4be3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Nov 27 16:03:43 crc restorecon[4686]: 
/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/containers/route-controller-manager/feaea55e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 
27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 
crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc 
restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/63709497 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/d966b7fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/f5773757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/81c9edb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/57bf57ee not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/86f5e6aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/0aabe31d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/d2af85c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/09d157d9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller/catalog.json not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc 
restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator/catalog.json 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c0fe7256 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c30319e4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/e6b1dd45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/2bb643f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/920de426 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/70fa1e87 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/a1c12a2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/9442e6c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/5b45ec72 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 16:03:43 crc restorecon[4686]:
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/3c9f3a59 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/1091c11b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/9a6821c6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/ec0c35e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/517f37e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/6214fe78 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/ba189c8b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/351e4f31 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/c0f219ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/etc-hosts 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/8069f607 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/559c3d82 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/605ad488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/148df488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/3bf6dcb4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/022a2feb not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/938c3924 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/729fe23e not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/1fd5cbd4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c247,c522 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/a96697e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/e155ddca not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/10dd0e0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 27 16:03:43 crc restorecon[4686]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 27 16:03:43 crc 
restorecon[4686]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 27 16:03:43 crc restorecon[4686]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/6f2c8392 not reset as customized by admin to system_u:object_r:container_file_t:s0:c267,c588 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/bd241ad9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/plugins not reset as customized by admin to system_u:object_r:container_file_t:s0 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/plugins/csi-hostpath not reset as customized by admin to system_u:object_r:container_file_t:s0 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/plugins/csi-hostpath/csi.sock not reset as customized by admin to system_u:object_r:container_file_t:s0 Nov 27 16:03:43 crc restorecon[4686]: /var/lib/kubelet/plugins/kubernetes.io not reset as customized by admin to system_u:object_r:container_file_t:s0 Nov 27 16:03:44 crc restorecon[4686]: /var/lib/kubelet/plugins/kubernetes.io/csi not reset as customized by admin to system_u:object_r:container_file_t:s0 Nov 27 16:03:44 crc restorecon[4686]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0 Nov 27 16:03:44 crc restorecon[4686]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983 not reset as customized by admin to 
system_u:object_r:container_file_t:s0 Nov 27 16:03:44 crc restorecon[4686]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount not reset as customized by admin to system_u:object_r:container_file_t:s0 Nov 27 16:03:44 crc restorecon[4686]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/vol_data.json not reset as customized by admin to system_u:object_r:container_file_t:s0 Nov 27 16:03:44 crc restorecon[4686]: /var/lib/kubelet/plugins_registry not reset as customized by admin to system_u:object_r:container_file_t:s0 Nov 27 16:03:44 crc restorecon[4686]: Relabeled /var/usrlocal/bin/kubenswrapper from system_u:object_r:bin_t:s0 to system_u:object_r:kubelet_exec_t:s0 Nov 27 16:03:44 crc kubenswrapper[4707]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Nov 27 16:03:44 crc kubenswrapper[4707]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version. Nov 27 16:03:44 crc kubenswrapper[4707]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Nov 27 16:03:44 crc kubenswrapper[4707]: Flag --register-with-taints has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
Nov 27 16:03:44 crc kubenswrapper[4707]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. Nov 27 16:03:44 crc kubenswrapper[4707]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Nov 27 16:03:44 crc kubenswrapper[4707]: I1127 16:03:44.914176 4707 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Nov 27 16:03:44 crc kubenswrapper[4707]: W1127 16:03:44.927913 4707 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Nov 27 16:03:44 crc kubenswrapper[4707]: W1127 16:03:44.927969 4707 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Nov 27 16:03:44 crc kubenswrapper[4707]: W1127 16:03:44.927981 4707 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Nov 27 16:03:44 crc kubenswrapper[4707]: W1127 16:03:44.927992 4707 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Nov 27 16:03:44 crc kubenswrapper[4707]: W1127 16:03:44.928005 4707 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Nov 27 16:03:44 crc kubenswrapper[4707]: W1127 16:03:44.928017 4707 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Nov 27 16:03:44 crc kubenswrapper[4707]: W1127 16:03:44.928028 4707 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Nov 27 16:03:44 crc kubenswrapper[4707]: W1127 16:03:44.928040 4707 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Nov 27 16:03:44 crc kubenswrapper[4707]: W1127 16:03:44.928052 4707 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Nov 27 16:03:44 crc kubenswrapper[4707]: 
W1127 16:03:44.928062 4707 feature_gate.go:330] unrecognized feature gate: GatewayAPI Nov 27 16:03:44 crc kubenswrapper[4707]: W1127 16:03:44.928072 4707 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Nov 27 16:03:44 crc kubenswrapper[4707]: W1127 16:03:44.928083 4707 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Nov 27 16:03:44 crc kubenswrapper[4707]: W1127 16:03:44.928093 4707 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Nov 27 16:03:44 crc kubenswrapper[4707]: W1127 16:03:44.928103 4707 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Nov 27 16:03:44 crc kubenswrapper[4707]: W1127 16:03:44.928114 4707 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Nov 27 16:03:44 crc kubenswrapper[4707]: W1127 16:03:44.928125 4707 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Nov 27 16:03:44 crc kubenswrapper[4707]: W1127 16:03:44.928135 4707 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Nov 27 16:03:44 crc kubenswrapper[4707]: W1127 16:03:44.928146 4707 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Nov 27 16:03:44 crc kubenswrapper[4707]: W1127 16:03:44.928160 4707 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Nov 27 16:03:44 crc kubenswrapper[4707]: W1127 16:03:44.928175 4707 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Nov 27 16:03:44 crc kubenswrapper[4707]: W1127 16:03:44.928187 4707 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Nov 27 16:03:44 crc kubenswrapper[4707]: W1127 16:03:44.928197 4707 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Nov 27 16:03:44 crc kubenswrapper[4707]: W1127 16:03:44.928208 4707 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Nov 27 16:03:44 crc kubenswrapper[4707]: W1127 16:03:44.928218 4707 feature_gate.go:330] unrecognized feature gate: OVNObservability Nov 27 16:03:44 crc kubenswrapper[4707]: W1127 16:03:44.928228 4707 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Nov 27 16:03:44 crc kubenswrapper[4707]: W1127 16:03:44.928237 4707 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Nov 27 16:03:44 crc kubenswrapper[4707]: W1127 16:03:44.928264 4707 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Nov 27 16:03:44 crc kubenswrapper[4707]: W1127 16:03:44.928274 4707 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Nov 27 16:03:44 crc kubenswrapper[4707]: W1127 16:03:44.928284 4707 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Nov 27 16:03:44 crc kubenswrapper[4707]: W1127 16:03:44.928294 4707 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Nov 27 16:03:44 crc kubenswrapper[4707]: W1127 16:03:44.928304 4707 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Nov 27 16:03:44 crc kubenswrapper[4707]: W1127 16:03:44.928314 4707 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Nov 27 16:03:44 crc kubenswrapper[4707]: W1127 16:03:44.928324 4707 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Nov 27 16:03:44 crc kubenswrapper[4707]: W1127 
16:03:44.928334 4707 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Nov 27 16:03:44 crc kubenswrapper[4707]: W1127 16:03:44.928344 4707 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Nov 27 16:03:44 crc kubenswrapper[4707]: W1127 16:03:44.928354 4707 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Nov 27 16:03:44 crc kubenswrapper[4707]: W1127 16:03:44.928399 4707 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Nov 27 16:03:44 crc kubenswrapper[4707]: W1127 16:03:44.928411 4707 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Nov 27 16:03:44 crc kubenswrapper[4707]: W1127 16:03:44.928421 4707 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Nov 27 16:03:44 crc kubenswrapper[4707]: W1127 16:03:44.928431 4707 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Nov 27 16:03:44 crc kubenswrapper[4707]: W1127 16:03:44.928442 4707 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Nov 27 16:03:44 crc kubenswrapper[4707]: W1127 16:03:44.928452 4707 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Nov 27 16:03:44 crc kubenswrapper[4707]: W1127 16:03:44.928462 4707 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Nov 27 16:03:44 crc kubenswrapper[4707]: W1127 16:03:44.928473 4707 feature_gate.go:330] unrecognized feature gate: Example Nov 27 16:03:44 crc kubenswrapper[4707]: W1127 16:03:44.928484 4707 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Nov 27 16:03:44 crc kubenswrapper[4707]: W1127 16:03:44.928494 4707 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Nov 27 16:03:44 crc kubenswrapper[4707]: W1127 16:03:44.928504 4707 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Nov 27 16:03:44 crc kubenswrapper[4707]: W1127 16:03:44.928518 4707 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. 
It will be removed in a future release. Nov 27 16:03:44 crc kubenswrapper[4707]: W1127 16:03:44.928532 4707 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Nov 27 16:03:44 crc kubenswrapper[4707]: W1127 16:03:44.928546 4707 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Nov 27 16:03:44 crc kubenswrapper[4707]: W1127 16:03:44.928557 4707 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Nov 27 16:03:44 crc kubenswrapper[4707]: W1127 16:03:44.928570 4707 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Nov 27 16:03:44 crc kubenswrapper[4707]: W1127 16:03:44.928583 4707 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Nov 27 16:03:44 crc kubenswrapper[4707]: W1127 16:03:44.928593 4707 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Nov 27 16:03:44 crc kubenswrapper[4707]: W1127 16:03:44.928602 4707 feature_gate.go:330] unrecognized feature gate: NewOLM Nov 27 16:03:44 crc kubenswrapper[4707]: W1127 16:03:44.928613 4707 feature_gate.go:330] unrecognized feature gate: InsightsConfig Nov 27 16:03:44 crc kubenswrapper[4707]: W1127 16:03:44.928623 4707 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Nov 27 16:03:44 crc kubenswrapper[4707]: W1127 16:03:44.928634 4707 feature_gate.go:330] unrecognized feature gate: PinnedImages Nov 27 16:03:44 crc kubenswrapper[4707]: W1127 16:03:44.928645 4707 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Nov 27 16:03:44 crc kubenswrapper[4707]: W1127 16:03:44.928657 4707 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Nov 27 16:03:44 crc kubenswrapper[4707]: W1127 16:03:44.928667 4707 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Nov 27 16:03:44 crc kubenswrapper[4707]: W1127 16:03:44.928677 4707 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Nov 27 
16:03:44 crc kubenswrapper[4707]: W1127 16:03:44.928687 4707 feature_gate.go:330] unrecognized feature gate: PlatformOperators Nov 27 16:03:44 crc kubenswrapper[4707]: W1127 16:03:44.928697 4707 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Nov 27 16:03:44 crc kubenswrapper[4707]: W1127 16:03:44.928713 4707 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Nov 27 16:03:44 crc kubenswrapper[4707]: W1127 16:03:44.928725 4707 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Nov 27 16:03:44 crc kubenswrapper[4707]: W1127 16:03:44.928736 4707 feature_gate.go:330] unrecognized feature gate: SignatureStores Nov 27 16:03:44 crc kubenswrapper[4707]: W1127 16:03:44.928747 4707 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Nov 27 16:03:44 crc kubenswrapper[4707]: W1127 16:03:44.928759 4707 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Nov 27 16:03:44 crc kubenswrapper[4707]: W1127 16:03:44.928770 4707 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Nov 27 16:03:44 crc kubenswrapper[4707]: W1127 16:03:44.928780 4707 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Nov 27 16:03:44 crc kubenswrapper[4707]: I1127 16:03:44.928991 4707 flags.go:64] FLAG: --address="0.0.0.0" Nov 27 16:03:44 crc kubenswrapper[4707]: I1127 16:03:44.929061 4707 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]" Nov 27 16:03:44 crc kubenswrapper[4707]: I1127 16:03:44.929081 4707 flags.go:64] FLAG: --anonymous-auth="true" Nov 27 16:03:44 crc kubenswrapper[4707]: I1127 16:03:44.929097 4707 flags.go:64] FLAG: --application-metrics-count-limit="100" Nov 27 16:03:44 crc kubenswrapper[4707]: I1127 16:03:44.929111 4707 flags.go:64] FLAG: --authentication-token-webhook="false" Nov 27 16:03:44 crc kubenswrapper[4707]: I1127 16:03:44.929122 4707 flags.go:64] FLAG: 
--authentication-token-webhook-cache-ttl="2m0s" Nov 27 16:03:44 crc kubenswrapper[4707]: I1127 16:03:44.929138 4707 flags.go:64] FLAG: --authorization-mode="AlwaysAllow" Nov 27 16:03:44 crc kubenswrapper[4707]: I1127 16:03:44.929154 4707 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s" Nov 27 16:03:44 crc kubenswrapper[4707]: I1127 16:03:44.929166 4707 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s" Nov 27 16:03:44 crc kubenswrapper[4707]: I1127 16:03:44.929179 4707 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id" Nov 27 16:03:44 crc kubenswrapper[4707]: I1127 16:03:44.929191 4707 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig" Nov 27 16:03:44 crc kubenswrapper[4707]: I1127 16:03:44.929204 4707 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki" Nov 27 16:03:44 crc kubenswrapper[4707]: I1127 16:03:44.929216 4707 flags.go:64] FLAG: --cgroup-driver="cgroupfs" Nov 27 16:03:44 crc kubenswrapper[4707]: I1127 16:03:44.929227 4707 flags.go:64] FLAG: --cgroup-root="" Nov 27 16:03:44 crc kubenswrapper[4707]: I1127 16:03:44.929239 4707 flags.go:64] FLAG: --cgroups-per-qos="true" Nov 27 16:03:44 crc kubenswrapper[4707]: I1127 16:03:44.929251 4707 flags.go:64] FLAG: --client-ca-file="" Nov 27 16:03:44 crc kubenswrapper[4707]: I1127 16:03:44.929262 4707 flags.go:64] FLAG: --cloud-config="" Nov 27 16:03:44 crc kubenswrapper[4707]: I1127 16:03:44.929275 4707 flags.go:64] FLAG: --cloud-provider="" Nov 27 16:03:44 crc kubenswrapper[4707]: I1127 16:03:44.929285 4707 flags.go:64] FLAG: --cluster-dns="[]" Nov 27 16:03:44 crc kubenswrapper[4707]: I1127 16:03:44.929304 4707 flags.go:64] FLAG: --cluster-domain="" Nov 27 16:03:44 crc kubenswrapper[4707]: I1127 16:03:44.929315 4707 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf" Nov 27 16:03:44 crc kubenswrapper[4707]: I1127 16:03:44.929326 4707 flags.go:64] FLAG: --config-dir="" Nov 27 16:03:44 crc kubenswrapper[4707]: I1127 16:03:44.929338 
4707 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json" Nov 27 16:03:44 crc kubenswrapper[4707]: I1127 16:03:44.929351 4707 flags.go:64] FLAG: --container-log-max-files="5" Nov 27 16:03:44 crc kubenswrapper[4707]: I1127 16:03:44.929403 4707 flags.go:64] FLAG: --container-log-max-size="10Mi" Nov 27 16:03:44 crc kubenswrapper[4707]: I1127 16:03:44.929415 4707 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock" Nov 27 16:03:44 crc kubenswrapper[4707]: I1127 16:03:44.929426 4707 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock" Nov 27 16:03:44 crc kubenswrapper[4707]: I1127 16:03:44.929438 4707 flags.go:64] FLAG: --containerd-namespace="k8s.io" Nov 27 16:03:44 crc kubenswrapper[4707]: I1127 16:03:44.929450 4707 flags.go:64] FLAG: --contention-profiling="false" Nov 27 16:03:44 crc kubenswrapper[4707]: I1127 16:03:44.929461 4707 flags.go:64] FLAG: --cpu-cfs-quota="true" Nov 27 16:03:44 crc kubenswrapper[4707]: I1127 16:03:44.929472 4707 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms" Nov 27 16:03:44 crc kubenswrapper[4707]: I1127 16:03:44.929485 4707 flags.go:64] FLAG: --cpu-manager-policy="none" Nov 27 16:03:44 crc kubenswrapper[4707]: I1127 16:03:44.929497 4707 flags.go:64] FLAG: --cpu-manager-policy-options="" Nov 27 16:03:44 crc kubenswrapper[4707]: I1127 16:03:44.929513 4707 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s" Nov 27 16:03:44 crc kubenswrapper[4707]: I1127 16:03:44.929526 4707 flags.go:64] FLAG: --enable-controller-attach-detach="true" Nov 27 16:03:44 crc kubenswrapper[4707]: I1127 16:03:44.929537 4707 flags.go:64] FLAG: --enable-debugging-handlers="true" Nov 27 16:03:44 crc kubenswrapper[4707]: I1127 16:03:44.929548 4707 flags.go:64] FLAG: --enable-load-reader="false" Nov 27 16:03:44 crc kubenswrapper[4707]: I1127 16:03:44.929559 4707 flags.go:64] FLAG: --enable-server="true" Nov 27 16:03:44 crc kubenswrapper[4707]: I1127 16:03:44.929571 4707 flags.go:64] FLAG: 
--enforce-node-allocatable="[pods]" Nov 27 16:03:44 crc kubenswrapper[4707]: I1127 16:03:44.929585 4707 flags.go:64] FLAG: --event-burst="100" Nov 27 16:03:44 crc kubenswrapper[4707]: I1127 16:03:44.929596 4707 flags.go:64] FLAG: --event-qps="50" Nov 27 16:03:44 crc kubenswrapper[4707]: I1127 16:03:44.929608 4707 flags.go:64] FLAG: --event-storage-age-limit="default=0" Nov 27 16:03:44 crc kubenswrapper[4707]: I1127 16:03:44.929619 4707 flags.go:64] FLAG: --event-storage-event-limit="default=0" Nov 27 16:03:44 crc kubenswrapper[4707]: I1127 16:03:44.929630 4707 flags.go:64] FLAG: --eviction-hard="" Nov 27 16:03:44 crc kubenswrapper[4707]: I1127 16:03:44.929644 4707 flags.go:64] FLAG: --eviction-max-pod-grace-period="0" Nov 27 16:03:44 crc kubenswrapper[4707]: I1127 16:03:44.929655 4707 flags.go:64] FLAG: --eviction-minimum-reclaim="" Nov 27 16:03:44 crc kubenswrapper[4707]: I1127 16:03:44.929667 4707 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s" Nov 27 16:03:44 crc kubenswrapper[4707]: I1127 16:03:44.929682 4707 flags.go:64] FLAG: --eviction-soft="" Nov 27 16:03:44 crc kubenswrapper[4707]: I1127 16:03:44.929692 4707 flags.go:64] FLAG: --eviction-soft-grace-period="" Nov 27 16:03:44 crc kubenswrapper[4707]: I1127 16:03:44.929703 4707 flags.go:64] FLAG: --exit-on-lock-contention="false" Nov 27 16:03:44 crc kubenswrapper[4707]: I1127 16:03:44.929714 4707 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false" Nov 27 16:03:44 crc kubenswrapper[4707]: I1127 16:03:44.929725 4707 flags.go:64] FLAG: --experimental-mounter-path="" Nov 27 16:03:44 crc kubenswrapper[4707]: I1127 16:03:44.929736 4707 flags.go:64] FLAG: --fail-cgroupv1="false" Nov 27 16:03:44 crc kubenswrapper[4707]: I1127 16:03:44.929749 4707 flags.go:64] FLAG: --fail-swap-on="true" Nov 27 16:03:44 crc kubenswrapper[4707]: I1127 16:03:44.929762 4707 flags.go:64] FLAG: --feature-gates="" Nov 27 16:03:44 crc kubenswrapper[4707]: I1127 16:03:44.929777 4707 flags.go:64] FLAG: 
--file-check-frequency="20s" Nov 27 16:03:44 crc kubenswrapper[4707]: I1127 16:03:44.929789 4707 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Nov 27 16:03:44 crc kubenswrapper[4707]: I1127 16:03:44.929801 4707 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Nov 27 16:03:44 crc kubenswrapper[4707]: I1127 16:03:44.929812 4707 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Nov 27 16:03:44 crc kubenswrapper[4707]: I1127 16:03:44.929824 4707 flags.go:64] FLAG: --healthz-port="10248" Nov 27 16:03:44 crc kubenswrapper[4707]: I1127 16:03:44.929838 4707 flags.go:64] FLAG: --help="false" Nov 27 16:03:44 crc kubenswrapper[4707]: I1127 16:03:44.929850 4707 flags.go:64] FLAG: --hostname-override="" Nov 27 16:03:44 crc kubenswrapper[4707]: I1127 16:03:44.929861 4707 flags.go:64] FLAG: --housekeeping-interval="10s" Nov 27 16:03:44 crc kubenswrapper[4707]: I1127 16:03:44.929874 4707 flags.go:64] FLAG: --http-check-frequency="20s" Nov 27 16:03:44 crc kubenswrapper[4707]: I1127 16:03:44.929887 4707 flags.go:64] FLAG: --image-credential-provider-bin-dir="" Nov 27 16:03:44 crc kubenswrapper[4707]: I1127 16:03:44.929898 4707 flags.go:64] FLAG: --image-credential-provider-config="" Nov 27 16:03:44 crc kubenswrapper[4707]: I1127 16:03:44.929909 4707 flags.go:64] FLAG: --image-gc-high-threshold="85" Nov 27 16:03:44 crc kubenswrapper[4707]: I1127 16:03:44.929921 4707 flags.go:64] FLAG: --image-gc-low-threshold="80" Nov 27 16:03:44 crc kubenswrapper[4707]: I1127 16:03:44.929932 4707 flags.go:64] FLAG: --image-service-endpoint="" Nov 27 16:03:44 crc kubenswrapper[4707]: I1127 16:03:44.929944 4707 flags.go:64] FLAG: --kernel-memcg-notification="false" Nov 27 16:03:44 crc kubenswrapper[4707]: I1127 16:03:44.929955 4707 flags.go:64] FLAG: --kube-api-burst="100" Nov 27 16:03:44 crc kubenswrapper[4707]: I1127 16:03:44.929966 4707 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Nov 27 16:03:44 crc kubenswrapper[4707]: I1127 16:03:44.929979 
4707 flags.go:64] FLAG: --kube-api-qps="50" Nov 27 16:03:44 crc kubenswrapper[4707]: I1127 16:03:44.929989 4707 flags.go:64] FLAG: --kube-reserved="" Nov 27 16:03:44 crc kubenswrapper[4707]: I1127 16:03:44.930002 4707 flags.go:64] FLAG: --kube-reserved-cgroup="" Nov 27 16:03:44 crc kubenswrapper[4707]: I1127 16:03:44.930013 4707 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Nov 27 16:03:44 crc kubenswrapper[4707]: I1127 16:03:44.930025 4707 flags.go:64] FLAG: --kubelet-cgroups="" Nov 27 16:03:44 crc kubenswrapper[4707]: I1127 16:03:44.930036 4707 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Nov 27 16:03:44 crc kubenswrapper[4707]: I1127 16:03:44.930048 4707 flags.go:64] FLAG: --lock-file="" Nov 27 16:03:44 crc kubenswrapper[4707]: I1127 16:03:44.930059 4707 flags.go:64] FLAG: --log-cadvisor-usage="false" Nov 27 16:03:44 crc kubenswrapper[4707]: I1127 16:03:44.930071 4707 flags.go:64] FLAG: --log-flush-frequency="5s" Nov 27 16:03:44 crc kubenswrapper[4707]: I1127 16:03:44.930083 4707 flags.go:64] FLAG: --log-json-info-buffer-size="0" Nov 27 16:03:44 crc kubenswrapper[4707]: I1127 16:03:44.930118 4707 flags.go:64] FLAG: --log-json-split-stream="false" Nov 27 16:03:44 crc kubenswrapper[4707]: I1127 16:03:44.930132 4707 flags.go:64] FLAG: --log-text-info-buffer-size="0" Nov 27 16:03:44 crc kubenswrapper[4707]: I1127 16:03:44.930143 4707 flags.go:64] FLAG: --log-text-split-stream="false" Nov 27 16:03:44 crc kubenswrapper[4707]: I1127 16:03:44.930155 4707 flags.go:64] FLAG: --logging-format="text" Nov 27 16:03:44 crc kubenswrapper[4707]: I1127 16:03:44.930166 4707 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Nov 27 16:03:44 crc kubenswrapper[4707]: I1127 16:03:44.930179 4707 flags.go:64] FLAG: --make-iptables-util-chains="true" Nov 27 16:03:44 crc kubenswrapper[4707]: I1127 16:03:44.930190 4707 flags.go:64] FLAG: --manifest-url="" Nov 27 16:03:44 crc kubenswrapper[4707]: I1127 16:03:44.930202 4707 
flags.go:64] FLAG: --manifest-url-header="" Nov 27 16:03:44 crc kubenswrapper[4707]: I1127 16:03:44.930218 4707 flags.go:64] FLAG: --max-housekeeping-interval="15s" Nov 27 16:03:44 crc kubenswrapper[4707]: I1127 16:03:44.930229 4707 flags.go:64] FLAG: --max-open-files="1000000" Nov 27 16:03:44 crc kubenswrapper[4707]: I1127 16:03:44.930243 4707 flags.go:64] FLAG: --max-pods="110" Nov 27 16:03:44 crc kubenswrapper[4707]: I1127 16:03:44.930256 4707 flags.go:64] FLAG: --maximum-dead-containers="-1" Nov 27 16:03:44 crc kubenswrapper[4707]: I1127 16:03:44.930311 4707 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Nov 27 16:03:44 crc kubenswrapper[4707]: I1127 16:03:44.930325 4707 flags.go:64] FLAG: --memory-manager-policy="None" Nov 27 16:03:44 crc kubenswrapper[4707]: I1127 16:03:44.930339 4707 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Nov 27 16:03:44 crc kubenswrapper[4707]: I1127 16:03:44.930352 4707 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Nov 27 16:03:44 crc kubenswrapper[4707]: I1127 16:03:44.930398 4707 flags.go:64] FLAG: --node-ip="192.168.126.11" Nov 27 16:03:44 crc kubenswrapper[4707]: I1127 16:03:44.930410 4707 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/control-plane=,node-role.kubernetes.io/master=,node.openshift.io/os_id=rhcos" Nov 27 16:03:44 crc kubenswrapper[4707]: I1127 16:03:44.930439 4707 flags.go:64] FLAG: --node-status-max-images="50" Nov 27 16:03:44 crc kubenswrapper[4707]: I1127 16:03:44.930451 4707 flags.go:64] FLAG: --node-status-update-frequency="10s" Nov 27 16:03:44 crc kubenswrapper[4707]: I1127 16:03:44.930463 4707 flags.go:64] FLAG: --oom-score-adj="-999" Nov 27 16:03:44 crc kubenswrapper[4707]: I1127 16:03:44.930476 4707 flags.go:64] FLAG: --pod-cidr="" Nov 27 16:03:44 crc kubenswrapper[4707]: I1127 16:03:44.930487 4707 flags.go:64] FLAG: 
--pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:33549946e22a9ffa738fd94b1345f90921bc8f92fa6137784cb33c77ad806f9d" Nov 27 16:03:44 crc kubenswrapper[4707]: I1127 16:03:44.930503 4707 flags.go:64] FLAG: --pod-manifest-path="" Nov 27 16:03:44 crc kubenswrapper[4707]: I1127 16:03:44.930515 4707 flags.go:64] FLAG: --pod-max-pids="-1" Nov 27 16:03:44 crc kubenswrapper[4707]: I1127 16:03:44.930526 4707 flags.go:64] FLAG: --pods-per-core="0" Nov 27 16:03:44 crc kubenswrapper[4707]: I1127 16:03:44.930537 4707 flags.go:64] FLAG: --port="10250" Nov 27 16:03:44 crc kubenswrapper[4707]: I1127 16:03:44.930549 4707 flags.go:64] FLAG: --protect-kernel-defaults="false" Nov 27 16:03:44 crc kubenswrapper[4707]: I1127 16:03:44.930560 4707 flags.go:64] FLAG: --provider-id="" Nov 27 16:03:44 crc kubenswrapper[4707]: I1127 16:03:44.930571 4707 flags.go:64] FLAG: --qos-reserved="" Nov 27 16:03:44 crc kubenswrapper[4707]: I1127 16:03:44.930582 4707 flags.go:64] FLAG: --read-only-port="10255" Nov 27 16:03:44 crc kubenswrapper[4707]: I1127 16:03:44.930593 4707 flags.go:64] FLAG: --register-node="true" Nov 27 16:03:44 crc kubenswrapper[4707]: I1127 16:03:44.930605 4707 flags.go:64] FLAG: --register-schedulable="true" Nov 27 16:03:44 crc kubenswrapper[4707]: I1127 16:03:44.930616 4707 flags.go:64] FLAG: --register-with-taints="node-role.kubernetes.io/master=:NoSchedule" Nov 27 16:03:44 crc kubenswrapper[4707]: I1127 16:03:44.930636 4707 flags.go:64] FLAG: --registry-burst="10" Nov 27 16:03:44 crc kubenswrapper[4707]: I1127 16:03:44.930647 4707 flags.go:64] FLAG: --registry-qps="5" Nov 27 16:03:44 crc kubenswrapper[4707]: I1127 16:03:44.930658 4707 flags.go:64] FLAG: --reserved-cpus="" Nov 27 16:03:44 crc kubenswrapper[4707]: I1127 16:03:44.930671 4707 flags.go:64] FLAG: --reserved-memory="" Nov 27 16:03:44 crc kubenswrapper[4707]: I1127 16:03:44.930695 4707 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Nov 27 16:03:44 crc kubenswrapper[4707]: I1127 
16:03:44.930707 4707 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Nov 27 16:03:44 crc kubenswrapper[4707]: I1127 16:03:44.930720 4707 flags.go:64] FLAG: --rotate-certificates="false" Nov 27 16:03:44 crc kubenswrapper[4707]: I1127 16:03:44.930733 4707 flags.go:64] FLAG: --rotate-server-certificates="false" Nov 27 16:03:44 crc kubenswrapper[4707]: I1127 16:03:44.930744 4707 flags.go:64] FLAG: --runonce="false" Nov 27 16:03:44 crc kubenswrapper[4707]: I1127 16:03:44.930755 4707 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Nov 27 16:03:44 crc kubenswrapper[4707]: I1127 16:03:44.930767 4707 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Nov 27 16:03:44 crc kubenswrapper[4707]: I1127 16:03:44.930780 4707 flags.go:64] FLAG: --seccomp-default="false" Nov 27 16:03:44 crc kubenswrapper[4707]: I1127 16:03:44.930791 4707 flags.go:64] FLAG: --serialize-image-pulls="true" Nov 27 16:03:44 crc kubenswrapper[4707]: I1127 16:03:44.930803 4707 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Nov 27 16:03:44 crc kubenswrapper[4707]: I1127 16:03:44.930816 4707 flags.go:64] FLAG: --storage-driver-db="cadvisor" Nov 27 16:03:44 crc kubenswrapper[4707]: I1127 16:03:44.930829 4707 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Nov 27 16:03:44 crc kubenswrapper[4707]: I1127 16:03:44.930841 4707 flags.go:64] FLAG: --storage-driver-password="root" Nov 27 16:03:44 crc kubenswrapper[4707]: I1127 16:03:44.930852 4707 flags.go:64] FLAG: --storage-driver-secure="false" Nov 27 16:03:44 crc kubenswrapper[4707]: I1127 16:03:44.930862 4707 flags.go:64] FLAG: --storage-driver-table="stats" Nov 27 16:03:44 crc kubenswrapper[4707]: I1127 16:03:44.930873 4707 flags.go:64] FLAG: --storage-driver-user="root" Nov 27 16:03:44 crc kubenswrapper[4707]: I1127 16:03:44.930884 4707 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Nov 27 16:03:44 crc kubenswrapper[4707]: I1127 16:03:44.930896 4707 flags.go:64] FLAG: --sync-frequency="1m0s" Nov 27 
16:03:44 crc kubenswrapper[4707]: I1127 16:03:44.930908 4707 flags.go:64] FLAG: --system-cgroups="" Nov 27 16:03:44 crc kubenswrapper[4707]: I1127 16:03:44.930919 4707 flags.go:64] FLAG: --system-reserved="cpu=200m,ephemeral-storage=350Mi,memory=350Mi" Nov 27 16:03:44 crc kubenswrapper[4707]: I1127 16:03:44.930938 4707 flags.go:64] FLAG: --system-reserved-cgroup="" Nov 27 16:03:44 crc kubenswrapper[4707]: I1127 16:03:44.930949 4707 flags.go:64] FLAG: --tls-cert-file="" Nov 27 16:03:44 crc kubenswrapper[4707]: I1127 16:03:44.930960 4707 flags.go:64] FLAG: --tls-cipher-suites="[]" Nov 27 16:03:44 crc kubenswrapper[4707]: I1127 16:03:44.930977 4707 flags.go:64] FLAG: --tls-min-version="" Nov 27 16:03:44 crc kubenswrapper[4707]: I1127 16:03:44.930987 4707 flags.go:64] FLAG: --tls-private-key-file="" Nov 27 16:03:44 crc kubenswrapper[4707]: I1127 16:03:44.931000 4707 flags.go:64] FLAG: --topology-manager-policy="none" Nov 27 16:03:44 crc kubenswrapper[4707]: I1127 16:03:44.931011 4707 flags.go:64] FLAG: --topology-manager-policy-options="" Nov 27 16:03:44 crc kubenswrapper[4707]: I1127 16:03:44.931024 4707 flags.go:64] FLAG: --topology-manager-scope="container" Nov 27 16:03:44 crc kubenswrapper[4707]: I1127 16:03:44.931036 4707 flags.go:64] FLAG: --v="2" Nov 27 16:03:44 crc kubenswrapper[4707]: I1127 16:03:44.931053 4707 flags.go:64] FLAG: --version="false" Nov 27 16:03:44 crc kubenswrapper[4707]: I1127 16:03:44.931067 4707 flags.go:64] FLAG: --vmodule="" Nov 27 16:03:44 crc kubenswrapper[4707]: I1127 16:03:44.931080 4707 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Nov 27 16:03:44 crc kubenswrapper[4707]: I1127 16:03:44.931096 4707 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Nov 27 16:03:44 crc kubenswrapper[4707]: W1127 16:03:44.931355 4707 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Nov 27 16:03:44 crc kubenswrapper[4707]: W1127 16:03:44.931405 4707 feature_gate.go:330] unrecognized feature 
gate: BootcNodeManagement Nov 27 16:03:44 crc kubenswrapper[4707]: W1127 16:03:44.931420 4707 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Nov 27 16:03:44 crc kubenswrapper[4707]: W1127 16:03:44.931430 4707 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Nov 27 16:03:44 crc kubenswrapper[4707]: W1127 16:03:44.931439 4707 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Nov 27 16:03:44 crc kubenswrapper[4707]: W1127 16:03:44.931449 4707 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Nov 27 16:03:44 crc kubenswrapper[4707]: W1127 16:03:44.931458 4707 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Nov 27 16:03:44 crc kubenswrapper[4707]: W1127 16:03:44.931466 4707 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Nov 27 16:03:44 crc kubenswrapper[4707]: W1127 16:03:44.931475 4707 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Nov 27 16:03:44 crc kubenswrapper[4707]: W1127 16:03:44.931488 4707 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Nov 27 16:03:44 crc kubenswrapper[4707]: W1127 16:03:44.931499 4707 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Nov 27 16:03:44 crc kubenswrapper[4707]: W1127 16:03:44.931510 4707 feature_gate.go:330] unrecognized feature gate: PlatformOperators Nov 27 16:03:44 crc kubenswrapper[4707]: W1127 16:03:44.931519 4707 feature_gate.go:330] unrecognized feature gate: OVNObservability Nov 27 16:03:44 crc kubenswrapper[4707]: W1127 16:03:44.931529 4707 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Nov 27 16:03:44 crc kubenswrapper[4707]: W1127 16:03:44.931538 4707 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Nov 27 16:03:44 crc kubenswrapper[4707]: W1127 16:03:44.931549 4707 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Nov 27 16:03:44 crc kubenswrapper[4707]: 
W1127 16:03:44.931559 4707 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Nov 27 16:03:44 crc kubenswrapper[4707]: W1127 16:03:44.931568 4707 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Nov 27 16:03:44 crc kubenswrapper[4707]: W1127 16:03:44.931578 4707 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Nov 27 16:03:44 crc kubenswrapper[4707]: W1127 16:03:44.931588 4707 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Nov 27 16:03:44 crc kubenswrapper[4707]: W1127 16:03:44.931600 4707 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Nov 27 16:03:44 crc kubenswrapper[4707]: W1127 16:03:44.931610 4707 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Nov 27 16:03:44 crc kubenswrapper[4707]: W1127 16:03:44.931620 4707 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Nov 27 16:03:44 crc kubenswrapper[4707]: W1127 16:03:44.931629 4707 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Nov 27 16:03:44 crc kubenswrapper[4707]: W1127 16:03:44.931639 4707 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Nov 27 16:03:44 crc kubenswrapper[4707]: W1127 16:03:44.931653 4707 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Nov 27 16:03:44 crc kubenswrapper[4707]: W1127 16:03:44.931666 4707 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Nov 27 16:03:44 crc kubenswrapper[4707]: W1127 16:03:44.931677 4707 feature_gate.go:330] unrecognized feature gate: PinnedImages Nov 27 16:03:44 crc kubenswrapper[4707]: W1127 16:03:44.931690 4707 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Nov 27 16:03:44 crc kubenswrapper[4707]: W1127 16:03:44.931704 4707 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Nov 27 16:03:44 crc kubenswrapper[4707]: W1127 16:03:44.931717 4707 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Nov 27 16:03:44 crc kubenswrapper[4707]: W1127 16:03:44.931728 4707 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Nov 27 16:03:44 crc kubenswrapper[4707]: W1127 16:03:44.931740 4707 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Nov 27 16:03:44 crc kubenswrapper[4707]: W1127 16:03:44.931750 4707 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Nov 27 16:03:44 crc kubenswrapper[4707]: W1127 16:03:44.931761 4707 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Nov 27 16:03:44 crc kubenswrapper[4707]: W1127 16:03:44.931772 4707 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Nov 27 16:03:44 crc kubenswrapper[4707]: W1127 16:03:44.931783 4707 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Nov 27 16:03:44 crc kubenswrapper[4707]: W1127 16:03:44.931793 4707 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Nov 27 16:03:44 crc kubenswrapper[4707]: W1127 16:03:44.931806 4707 feature_gate.go:330] unrecognized feature gate: InsightsConfig Nov 27 16:03:44 crc kubenswrapper[4707]: W1127 16:03:44.931816 4707 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Nov 27 16:03:44 crc kubenswrapper[4707]: W1127 16:03:44.931826 4707 feature_gate.go:330] unrecognized feature gate: GatewayAPI Nov 27 16:03:44 crc kubenswrapper[4707]: W1127 16:03:44.931841 4707 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Nov 27 16:03:44 crc kubenswrapper[4707]: W1127 16:03:44.931855 4707 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Nov 27 16:03:44 crc kubenswrapper[4707]: W1127 16:03:44.931866 4707 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Nov 27 16:03:44 crc kubenswrapper[4707]: W1127 16:03:44.931876 4707 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Nov 27 16:03:44 crc kubenswrapper[4707]: W1127 16:03:44.931887 4707 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Nov 27 16:03:44 crc kubenswrapper[4707]: W1127 16:03:44.931898 4707 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Nov 27 16:03:44 crc kubenswrapper[4707]: W1127 16:03:44.931909 4707 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Nov 27 16:03:44 crc kubenswrapper[4707]: W1127 16:03:44.931919 4707 feature_gate.go:330] unrecognized feature gate: NewOLM Nov 27 16:03:44 crc kubenswrapper[4707]: W1127 16:03:44.931929 4707 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Nov 27 16:03:44 crc kubenswrapper[4707]: W1127 16:03:44.931938 4707 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Nov 27 16:03:44 crc kubenswrapper[4707]: W1127 16:03:44.931949 4707 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Nov 27 16:03:44 crc kubenswrapper[4707]: W1127 16:03:44.931959 4707 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Nov 27 16:03:44 crc kubenswrapper[4707]: W1127 16:03:44.931972 4707 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Nov 27 16:03:44 crc kubenswrapper[4707]: W1127 16:03:44.931983 4707 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Nov 27 16:03:44 crc kubenswrapper[4707]: W1127 16:03:44.931992 4707 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Nov 27 16:03:44 crc kubenswrapper[4707]: W1127 16:03:44.932002 4707 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Nov 27 16:03:44 crc kubenswrapper[4707]: W1127 16:03:44.932013 4707 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Nov 27 16:03:44 crc kubenswrapper[4707]: W1127 16:03:44.932022 4707 feature_gate.go:330] unrecognized feature gate: Example
Nov 27 16:03:44 crc kubenswrapper[4707]: W1127 16:03:44.932032 4707 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Nov 27 16:03:44 crc kubenswrapper[4707]: W1127 16:03:44.932042 4707 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Nov 27 16:03:44 crc kubenswrapper[4707]: W1127 16:03:44.932052 4707 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Nov 27 16:03:44 crc kubenswrapper[4707]: W1127 16:03:44.932063 4707 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Nov 27 16:03:44 crc kubenswrapper[4707]: W1127 16:03:44.932073 4707 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Nov 27 16:03:44 crc kubenswrapper[4707]: W1127 16:03:44.932083 4707 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Nov 27 16:03:44 crc kubenswrapper[4707]: W1127 16:03:44.932091 4707 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Nov 27 16:03:44 crc kubenswrapper[4707]: W1127 16:03:44.932100 4707 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Nov 27 16:03:44 crc kubenswrapper[4707]: W1127 16:03:44.932107 4707 feature_gate.go:330] unrecognized feature gate: SignatureStores
Nov 27 16:03:44 crc kubenswrapper[4707]: W1127 16:03:44.932116 4707 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Nov 27 16:03:44 crc kubenswrapper[4707]: W1127 16:03:44.932123 4707 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Nov 27 16:03:44 crc kubenswrapper[4707]: W1127 16:03:44.932131 4707 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Nov 27 16:03:44 crc kubenswrapper[4707]: I1127 16:03:44.932855 4707 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Nov 27 16:03:44 crc kubenswrapper[4707]: I1127 16:03:44.946617 4707 server.go:491] "Kubelet version" kubeletVersion="v1.31.5"
Nov 27 16:03:44 crc kubenswrapper[4707]: I1127 16:03:44.946697 4707 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Nov 27 16:03:44 crc kubenswrapper[4707]: W1127 16:03:44.946851 4707 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Nov 27 16:03:44 crc kubenswrapper[4707]: W1127 16:03:44.946873 4707 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Nov 27 16:03:44 crc kubenswrapper[4707]: W1127 16:03:44.946884 4707 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Nov 27 16:03:44 crc kubenswrapper[4707]: W1127 16:03:44.946899 4707 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Nov 27 16:03:44 crc kubenswrapper[4707]: W1127 16:03:44.946910 4707 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Nov 27 16:03:44 crc kubenswrapper[4707]: W1127 16:03:44.946919 4707 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Nov 27 16:03:44 crc kubenswrapper[4707]: W1127 16:03:44.946927 4707 feature_gate.go:330] unrecognized feature gate: NewOLM
Nov 27 16:03:44 crc kubenswrapper[4707]: W1127 16:03:44.946936 4707 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Nov 27 16:03:44 crc kubenswrapper[4707]: W1127 16:03:44.946945 4707 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Nov 27 16:03:44 crc kubenswrapper[4707]: W1127 16:03:44.946954 4707 feature_gate.go:330] unrecognized feature gate: SignatureStores
Nov 27 16:03:44 crc kubenswrapper[4707]: W1127 16:03:44.946962 4707 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Nov 27 16:03:44 crc kubenswrapper[4707]: W1127 16:03:44.946970 4707 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Nov 27 16:03:44 crc kubenswrapper[4707]: W1127 16:03:44.946978 4707 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Nov 27 16:03:44 crc kubenswrapper[4707]: W1127 16:03:44.946985 4707 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Nov 27 16:03:44 crc kubenswrapper[4707]: W1127 16:03:44.946993 4707 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Nov 27 16:03:44 crc kubenswrapper[4707]: W1127 16:03:44.947002 4707 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Nov 27 16:03:44 crc kubenswrapper[4707]: W1127 16:03:44.947011 4707 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Nov 27 16:03:44 crc kubenswrapper[4707]: W1127 16:03:44.947020 4707 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Nov 27 16:03:44 crc kubenswrapper[4707]: W1127 16:03:44.947029 4707 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Nov 27 16:03:44 crc kubenswrapper[4707]: W1127 16:03:44.947038 4707 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Nov 27 16:03:44 crc kubenswrapper[4707]: W1127 16:03:44.947046 4707 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Nov 27 16:03:44 crc kubenswrapper[4707]: W1127 16:03:44.947054 4707 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Nov 27 16:03:44 crc kubenswrapper[4707]: W1127 16:03:44.947062 4707 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Nov 27 16:03:44 crc kubenswrapper[4707]: W1127 16:03:44.947070 4707 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Nov 27 16:03:44 crc kubenswrapper[4707]: W1127 16:03:44.947078 4707 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Nov 27 16:03:44 crc kubenswrapper[4707]: W1127 16:03:44.947086 4707 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Nov 27 16:03:44 crc kubenswrapper[4707]: W1127 16:03:44.947094 4707 feature_gate.go:330] unrecognized feature gate: Example
Nov 27 16:03:44 crc kubenswrapper[4707]: W1127 16:03:44.947102 4707 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Nov 27 16:03:44 crc kubenswrapper[4707]: W1127 16:03:44.947111 4707 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Nov 27 16:03:44 crc kubenswrapper[4707]: W1127 16:03:44.947118 4707 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Nov 27 16:03:44 crc kubenswrapper[4707]: W1127 16:03:44.947127 4707 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Nov 27 16:03:44 crc kubenswrapper[4707]: W1127 16:03:44.947135 4707 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Nov 27 16:03:44 crc kubenswrapper[4707]: W1127 16:03:44.947143 4707 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Nov 27 16:03:44 crc kubenswrapper[4707]: W1127 16:03:44.947151 4707 feature_gate.go:330] unrecognized feature gate: OVNObservability
Nov 27 16:03:44 crc kubenswrapper[4707]: W1127 16:03:44.947161 4707 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Nov 27 16:03:44 crc kubenswrapper[4707]: W1127 16:03:44.947170 4707 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Nov 27 16:03:44 crc kubenswrapper[4707]: W1127 16:03:44.947178 4707 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Nov 27 16:03:44 crc kubenswrapper[4707]: W1127 16:03:44.947185 4707 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Nov 27 16:03:44 crc kubenswrapper[4707]: W1127 16:03:44.947193 4707 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Nov 27 16:03:44 crc kubenswrapper[4707]: W1127 16:03:44.947201 4707 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Nov 27 16:03:44 crc kubenswrapper[4707]: W1127 16:03:44.947211 4707 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Nov 27 16:03:44 crc kubenswrapper[4707]: W1127 16:03:44.947270 4707 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Nov 27 16:03:44 crc kubenswrapper[4707]: W1127 16:03:44.947280 4707 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Nov 27 16:03:44 crc kubenswrapper[4707]: W1127 16:03:44.947290 4707 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Nov 27 16:03:44 crc kubenswrapper[4707]: W1127 16:03:44.947299 4707 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Nov 27 16:03:44 crc kubenswrapper[4707]: W1127 16:03:44.947307 4707 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Nov 27 16:03:44 crc kubenswrapper[4707]: W1127 16:03:44.947317 4707 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Nov 27 16:03:44 crc kubenswrapper[4707]: W1127 16:03:44.947325 4707 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Nov 27 16:03:44 crc kubenswrapper[4707]: W1127 16:03:44.947333 4707 feature_gate.go:330] unrecognized feature gate: PinnedImages
Nov 27 16:03:44 crc kubenswrapper[4707]: W1127 16:03:44.947340 4707 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Nov 27 16:03:44 crc kubenswrapper[4707]: W1127 16:03:44.947349 4707 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Nov 27 16:03:44 crc kubenswrapper[4707]: W1127 16:03:44.947357 4707 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Nov 27 16:03:44 crc kubenswrapper[4707]: W1127 16:03:44.947365 4707 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Nov 27 16:03:44 crc kubenswrapper[4707]: W1127 16:03:44.947398 4707 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Nov 27 16:03:44 crc kubenswrapper[4707]: W1127 16:03:44.947408 4707 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Nov 27 16:03:44 crc kubenswrapper[4707]: W1127 16:03:44.947420 4707 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Nov 27 16:03:44 crc kubenswrapper[4707]: W1127 16:03:44.947431 4707 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Nov 27 16:03:44 crc kubenswrapper[4707]: W1127 16:03:44.947439 4707 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Nov 27 16:03:44 crc kubenswrapper[4707]: W1127 16:03:44.947448 4707 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Nov 27 16:03:44 crc kubenswrapper[4707]: W1127 16:03:44.947457 4707 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Nov 27 16:03:44 crc kubenswrapper[4707]: W1127 16:03:44.947465 4707 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Nov 27 16:03:44 crc kubenswrapper[4707]: W1127 16:03:44.947474 4707 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Nov 27 16:03:44 crc kubenswrapper[4707]: W1127 16:03:44.947482 4707 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Nov 27 16:03:44 crc kubenswrapper[4707]: W1127 16:03:44.947490 4707 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Nov 27 16:03:44 crc kubenswrapper[4707]: W1127 16:03:44.947499 4707 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Nov 27 16:03:44 crc kubenswrapper[4707]: W1127 16:03:44.947507 4707 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Nov 27 16:03:44 crc kubenswrapper[4707]: W1127 16:03:44.947516 4707 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Nov 27 16:03:44 crc kubenswrapper[4707]: W1127 16:03:44.947524 4707 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Nov 27 16:03:44 crc kubenswrapper[4707]: W1127 16:03:44.947533 4707 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Nov 27 16:03:44 crc kubenswrapper[4707]: W1127 16:03:44.947542 4707 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Nov 27 16:03:44 crc kubenswrapper[4707]: W1127 16:03:44.947552 4707 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Nov 27 16:03:44 crc kubenswrapper[4707]: I1127 16:03:44.947567 4707 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Nov 27 16:03:44 crc kubenswrapper[4707]: W1127 16:03:44.947814 4707 feature_gate.go:330] unrecognized feature gate: NewOLM
Nov 27 16:03:44 crc kubenswrapper[4707]: W1127 16:03:44.947825 4707 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Nov 27 16:03:44 crc kubenswrapper[4707]: W1127 16:03:44.947835 4707 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Nov 27 16:03:44 crc kubenswrapper[4707]: W1127 16:03:44.947843 4707 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Nov 27 16:03:44 crc kubenswrapper[4707]: W1127 16:03:44.947852 4707 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Nov 27 16:03:44 crc kubenswrapper[4707]: W1127 16:03:44.947860 4707 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Nov 27 16:03:44 crc kubenswrapper[4707]: W1127 16:03:44.947869 4707 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Nov 27 16:03:44 crc kubenswrapper[4707]: W1127 16:03:44.947877 4707 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Nov 27 16:03:44 crc kubenswrapper[4707]: W1127 16:03:44.947885 4707 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Nov 27 16:03:44 crc kubenswrapper[4707]: W1127 16:03:44.947895 4707 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Nov 27 16:03:44 crc kubenswrapper[4707]: W1127 16:03:44.947903 4707 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Nov 27 16:03:44 crc kubenswrapper[4707]: W1127 16:03:44.947911 4707 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Nov 27 16:03:44 crc kubenswrapper[4707]: W1127 16:03:44.947918 4707 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Nov 27 16:03:44 crc kubenswrapper[4707]: W1127 16:03:44.947926 4707 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Nov 27 16:03:44 crc kubenswrapper[4707]: W1127 16:03:44.947934 4707 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Nov 27 16:03:44 crc kubenswrapper[4707]: W1127 16:03:44.947943 4707 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Nov 27 16:03:44 crc kubenswrapper[4707]: W1127 16:03:44.947950 4707 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Nov 27 16:03:44 crc kubenswrapper[4707]: W1127 16:03:44.947958 4707 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Nov 27 16:03:44 crc kubenswrapper[4707]: W1127 16:03:44.947966 4707 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Nov 27 16:03:44 crc kubenswrapper[4707]: W1127 16:03:44.947974 4707 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Nov 27 16:03:44 crc kubenswrapper[4707]: W1127 16:03:44.947982 4707 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Nov 27 16:03:44 crc kubenswrapper[4707]: W1127 16:03:44.947990 4707 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Nov 27 16:03:44 crc kubenswrapper[4707]: W1127 16:03:44.947998 4707 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Nov 27 16:03:44 crc kubenswrapper[4707]: W1127 16:03:44.948006 4707 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Nov 27 16:03:44 crc kubenswrapper[4707]: W1127 16:03:44.948014 4707 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Nov 27 16:03:44 crc kubenswrapper[4707]: W1127 16:03:44.948022 4707 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Nov 27 16:03:44 crc kubenswrapper[4707]: W1127 16:03:44.948032 4707 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Nov 27 16:03:44 crc kubenswrapper[4707]: W1127 16:03:44.948041 4707 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Nov 27 16:03:44 crc kubenswrapper[4707]: W1127 16:03:44.948049 4707 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Nov 27 16:03:44 crc kubenswrapper[4707]: W1127 16:03:44.948057 4707 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Nov 27 16:03:44 crc kubenswrapper[4707]: W1127 16:03:44.948065 4707 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Nov 27 16:03:44 crc kubenswrapper[4707]: W1127 16:03:44.948073 4707 feature_gate.go:330] unrecognized feature gate: SignatureStores
Nov 27 16:03:44 crc kubenswrapper[4707]: W1127 16:03:44.948083 4707 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Nov 27 16:03:44 crc kubenswrapper[4707]: W1127 16:03:44.948093 4707 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Nov 27 16:03:44 crc kubenswrapper[4707]: W1127 16:03:44.948104 4707 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Nov 27 16:03:44 crc kubenswrapper[4707]: W1127 16:03:44.948114 4707 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Nov 27 16:03:44 crc kubenswrapper[4707]: W1127 16:03:44.948123 4707 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Nov 27 16:03:44 crc kubenswrapper[4707]: W1127 16:03:44.948132 4707 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Nov 27 16:03:44 crc kubenswrapper[4707]: W1127 16:03:44.948140 4707 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Nov 27 16:03:44 crc kubenswrapper[4707]: W1127 16:03:44.948149 4707 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Nov 27 16:03:44 crc kubenswrapper[4707]: W1127 16:03:44.948162 4707 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Nov 27 16:03:44 crc kubenswrapper[4707]: W1127 16:03:44.948173 4707 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Nov 27 16:03:44 crc kubenswrapper[4707]: W1127 16:03:44.948183 4707 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Nov 27 16:03:44 crc kubenswrapper[4707]: W1127 16:03:44.948191 4707 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Nov 27 16:03:44 crc kubenswrapper[4707]: W1127 16:03:44.948199 4707 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Nov 27 16:03:44 crc kubenswrapper[4707]: W1127 16:03:44.948206 4707 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Nov 27 16:03:44 crc kubenswrapper[4707]: W1127 16:03:44.948215 4707 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Nov 27 16:03:44 crc kubenswrapper[4707]: W1127 16:03:44.948222 4707 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Nov 27 16:03:44 crc kubenswrapper[4707]: W1127 16:03:44.948230 4707 feature_gate.go:330] unrecognized feature gate: PinnedImages
Nov 27 16:03:44 crc kubenswrapper[4707]: W1127 16:03:44.948238 4707 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Nov 27 16:03:44 crc kubenswrapper[4707]: W1127 16:03:44.948246 4707 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Nov 27 16:03:44 crc kubenswrapper[4707]: W1127 16:03:44.948254 4707 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Nov 27 16:03:44 crc kubenswrapper[4707]: W1127 16:03:44.948262 4707 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Nov 27 16:03:44 crc kubenswrapper[4707]: W1127 16:03:44.948270 4707 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Nov 27 16:03:44 crc kubenswrapper[4707]: W1127 16:03:44.948278 4707 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Nov 27 16:03:44 crc kubenswrapper[4707]: W1127 16:03:44.948285 4707 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Nov 27 16:03:44 crc kubenswrapper[4707]: W1127 16:03:44.948293 4707 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Nov 27 16:03:44 crc kubenswrapper[4707]: W1127 16:03:44.948301 4707 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Nov 27 16:03:44 crc kubenswrapper[4707]: W1127 16:03:44.948308 4707 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Nov 27 16:03:44 crc kubenswrapper[4707]: W1127 16:03:44.948318 4707 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Nov 27 16:03:44 crc kubenswrapper[4707]: W1127 16:03:44.948327 4707 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Nov 27 16:03:44 crc kubenswrapper[4707]: W1127 16:03:44.948335 4707 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Nov 27 16:03:44 crc kubenswrapper[4707]: W1127 16:03:44.948343 4707 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Nov 27 16:03:44 crc kubenswrapper[4707]: W1127 16:03:44.948351 4707 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Nov 27 16:03:44 crc kubenswrapper[4707]: W1127 16:03:44.948359 4707 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Nov 27 16:03:44 crc kubenswrapper[4707]: W1127 16:03:44.948388 4707 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Nov 27 16:03:44 crc kubenswrapper[4707]: W1127 16:03:44.948396 4707 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Nov 27 16:03:44 crc kubenswrapper[4707]: W1127 16:03:44.948405 4707 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Nov 27 16:03:44 crc kubenswrapper[4707]: W1127 16:03:44.948413 4707 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Nov 27 16:03:44 crc kubenswrapper[4707]: W1127 16:03:44.948420 4707 feature_gate.go:330] unrecognized feature gate: OVNObservability
Nov 27 16:03:44 crc kubenswrapper[4707]: W1127 16:03:44.948430 4707 feature_gate.go:330] unrecognized feature gate: Example
Nov 27 16:03:44 crc kubenswrapper[4707]: I1127 16:03:44.948443 4707 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Nov 27 16:03:44 crc kubenswrapper[4707]: I1127 16:03:44.948737 4707 server.go:940] "Client rotation is on, will bootstrap in background"
Nov 27 16:03:44 crc kubenswrapper[4707]: I1127 16:03:44.955013 4707 bootstrap.go:85] "Current kubeconfig file contents are still valid, no bootstrap necessary"
Nov 27 16:03:44 crc kubenswrapper[4707]: I1127 16:03:44.955162 4707 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem".
Nov 27 16:03:44 crc kubenswrapper[4707]: I1127 16:03:44.956812 4707 server.go:997] "Starting client certificate rotation"
Nov 27 16:03:44 crc kubenswrapper[4707]: I1127 16:03:44.956859 4707 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate rotation is enabled
Nov 27 16:03:44 crc kubenswrapper[4707]: I1127 16:03:44.957087 4707 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2026-02-24 05:52:08 +0000 UTC, rotation deadline is 2026-01-11 15:35:29.878958951 +0000 UTC
Nov 27 16:03:44 crc kubenswrapper[4707]: I1127 16:03:44.957188 4707 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 1079h31m44.921775673s for next certificate rotation
Nov 27 16:03:44 crc kubenswrapper[4707]: I1127 16:03:44.991352 4707 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Nov 27 16:03:44 crc kubenswrapper[4707]: I1127 16:03:44.994358 4707 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Nov 27 16:03:45 crc kubenswrapper[4707]: I1127 16:03:45.015132 4707 log.go:25] "Validated CRI v1 runtime API"
Nov 27 16:03:45 crc kubenswrapper[4707]: I1127 16:03:45.058345 4707 log.go:25] "Validated CRI v1 image API"
Nov 27 16:03:45 crc kubenswrapper[4707]: I1127 16:03:45.061529 4707 server.go:1437] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Nov 27 16:03:45 crc kubenswrapper[4707]: I1127 16:03:45.069098 4707 fs.go:133] Filesystem UUIDs: map[0b076daa-c26a-46d2-b3a6-72a8dbc6e257:/dev/vda4 2025-11-27-15-59-14-00:/dev/sr0 7B77-95E7:/dev/vda2 de0497b0-db1b-465a-b278-03db02455c71:/dev/vda3]
Nov 27 16:03:45 crc kubenswrapper[4707]: I1127 16:03:45.069157 4707 fs.go:134] Filesystem partitions: map[/dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /dev/vda3:{mountpoint:/boot major:252 minor:3 fsType:ext4 blockSize:0} /dev/vda4:{mountpoint:/var major:252 minor:4 fsType:xfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /run/user/1000:{mountpoint:/run/user/1000 major:0 minor:42 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:30 fsType:tmpfs blockSize:0} /var/lib/etcd:{mountpoint:/var/lib/etcd major:0 minor:43 fsType:tmpfs blockSize:0}]
Nov 27 16:03:45 crc kubenswrapper[4707]: I1127 16:03:45.096553 4707 manager.go:217] Machine: {Timestamp:2025-11-27 16:03:45.09384139 +0000 UTC m=+0.725290218 CPUVendorID:AuthenticAMD NumCores:12 NumPhysicalCores:1 NumSockets:12 CpuFrequency:2800000 MemoryCapacity:33654124544 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:21801e6708c44f15b81395eb736a7cec SystemUUID:dd7e61b5-f6f4-4240-af78-d8fe5d6daad3 BootID:3f35bf4c-cf6f-4914-b0bb-f9cc6056fcaf Filesystems:[{Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6730825728 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/vda4 DeviceMajor:252 DeviceMinor:4 Capacity:85292941312 Type:vfs Inodes:41679680 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:30 Capacity:16827064320 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/vda3 DeviceMajor:252 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/run/user/1000 DeviceMajor:0 DeviceMinor:42 Capacity:3365408768 Type:vfs Inodes:821633 HasInodes:true} {Device:/var/lib/etcd DeviceMajor:0 DeviceMinor:43 Capacity:1073741824 Type:vfs Inodes:4108169 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16827060224 Type:vfs Inodes:4108169 HasInodes:true}] DiskMap:map[252:0:{Name:vda Major:252 Minor:0 Size:214748364800 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:fa:16:3e:0e:7d:92 Speed:0 Mtu:1500} {Name:br-int MacAddress:d6:39:55:2e:22:71 Speed:0 Mtu:1400} {Name:ens3 MacAddress:fa:16:3e:0e:7d:92 Speed:-1 Mtu:1500} {Name:ens7 MacAddress:fa:16:3e:4f:aa:f7 Speed:-1 Mtu:1500} {Name:ens7.20 MacAddress:52:54:00:4b:2d:ce Speed:-1 Mtu:1496} {Name:ens7.21 MacAddress:52:54:00:b8:4e:f4 Speed:-1 Mtu:1496} {Name:ens7.22 MacAddress:52:54:00:d0:77:64 Speed:-1 Mtu:1496} {Name:eth10 MacAddress:b6:81:9d:1c:56:52 Speed:0 Mtu:1500} {Name:ovn-k8s-mp0 MacAddress:0a:58:0a:d9:00:02 Speed:0 Mtu:1400} {Name:ovs-system MacAddress:ee:59:b4:9c:7b:9c Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33654124544 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:0 Size:16777216 Type:Unified Level:3}] SocketID:0 BookID: DrawerID:} {Id:0 Threads:[1] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:1 Size:16777216 Type:Unified Level:3}] SocketID:1 BookID: DrawerID:} {Id:0 Threads:[10] Caches:[{Id:10 Size:32768 Type:Data Level:1} {Id:10 Size:32768 Type:Instruction Level:1} {Id:10 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:10 Size:16777216 Type:Unified Level:3}] SocketID:10 BookID: DrawerID:} {Id:0 Threads:[11] Caches:[{Id:11 Size:32768 Type:Data Level:1} {Id:11 Size:32768 Type:Instruction Level:1} {Id:11 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:11 Size:16777216 Type:Unified Level:3}] SocketID:11 BookID: DrawerID:} {Id:0 Threads:[2] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:2 Size:16777216 Type:Unified Level:3}] SocketID:2 BookID: DrawerID:} {Id:0 Threads:[3] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:3 Size:16777216 Type:Unified Level:3}] SocketID:3 BookID: DrawerID:} {Id:0 Threads:[4] Caches:[{Id:4 Size:32768 Type:Data Level:1} {Id:4 Size:32768 Type:Instruction Level:1} {Id:4 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:4 Size:16777216 Type:Unified Level:3}] SocketID:4 BookID: DrawerID:} {Id:0 Threads:[5] Caches:[{Id:5 Size:32768 Type:Data Level:1} {Id:5 Size:32768 Type:Instruction Level:1} {Id:5 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:5 Size:16777216 Type:Unified Level:3}] SocketID:5 BookID: DrawerID:} {Id:0 Threads:[6] Caches:[{Id:6 Size:32768 Type:Data Level:1} {Id:6 Size:32768 Type:Instruction Level:1} {Id:6 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:6 Size:16777216 Type:Unified Level:3}] SocketID:6 BookID: DrawerID:} {Id:0 Threads:[7] Caches:[{Id:7 Size:32768 Type:Data Level:1} {Id:7 Size:32768 Type:Instruction Level:1} {Id:7 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:7 Size:16777216 Type:Unified Level:3}] SocketID:7 BookID: DrawerID:} {Id:0 Threads:[8] Caches:[{Id:8 Size:32768 Type:Data Level:1} {Id:8 Size:32768 Type:Instruction Level:1} {Id:8 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:8 Size:16777216 Type:Unified Level:3}] SocketID:8 BookID: DrawerID:} {Id:0 Threads:[9] Caches:[{Id:9 Size:32768 Type:Data Level:1} {Id:9 Size:32768 Type:Instruction Level:1} {Id:9 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:9 Size:16777216 Type:Unified Level:3}] SocketID:9 BookID: DrawerID:}] Caches:[] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Nov 27 16:03:45 crc kubenswrapper[4707]: I1127 16:03:45.097058 4707 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Nov 27 16:03:45 crc kubenswrapper[4707]: I1127 16:03:45.097338 4707 manager.go:233] Version: {KernelVersion:5.14.0-427.50.2.el9_4.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 418.94.202502100215-0 DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Nov 27 16:03:45 crc kubenswrapper[4707]: I1127 16:03:45.099498 4707 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority"
Nov 27 16:03:45 crc kubenswrapper[4707]: I1127 16:03:45.099853 4707 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Nov 27 16:03:45 crc kubenswrapper[4707]: I1127 16:03:45.099924 4707 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"crc","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"200m","ephemeral-storage":"350Mi","memory":"350Mi"},"HardEvictionThresholds":[{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Nov 27 16:03:45 crc kubenswrapper[4707]: I1127 16:03:45.100307 4707 topology_manager.go:138] "Creating topology manager with none policy"
Nov 27 16:03:45 crc kubenswrapper[4707]: I1127 16:03:45.100328 4707 container_manager_linux.go:303] "Creating device plugin manager"
Nov 27 16:03:45 crc kubenswrapper[4707]: I1127 16:03:45.100919 4707 manager.go:142] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Nov 27 16:03:45 crc kubenswrapper[4707]: I1127 16:03:45.100986 4707 server.go:66] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Nov 27 16:03:45 crc kubenswrapper[4707]: I1127 16:03:45.101491 4707 state_mem.go:36] "Initialized new in-memory state store"
Nov 27 16:03:45 crc kubenswrapper[4707]: I1127 16:03:45.101649 4707 server.go:1245] "Using root directory" path="/var/lib/kubelet"
Nov 27 16:03:45 crc kubenswrapper[4707]: I1127 16:03:45.105784 4707 kubelet.go:418] "Attempting to sync node with API server"
Nov 27 16:03:45 crc kubenswrapper[4707]: I1127 16:03:45.105829 4707 kubelet.go:313] "Adding static pod path" path="/etc/kubernetes/manifests"
Nov 27 16:03:45 crc kubenswrapper[4707]: I1127 16:03:45.105857 4707 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Nov 27 16:03:45 crc kubenswrapper[4707]: I1127 16:03:45.105882 4707 kubelet.go:324] "Adding apiserver pod source"
Nov 27 16:03:45 crc kubenswrapper[4707]: I1127 16:03:45.105909 4707 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Nov 27 16:03:45 crc kubenswrapper[4707]: I1127 16:03:45.110135 4707 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="cri-o" version="1.31.5-4.rhaos4.18.gitdad78d5.el9" apiVersion="v1"
Nov 27 16:03:45 crc kubenswrapper[4707]: I1127 16:03:45.111273 4707 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-server-current.pem".
Nov 27 16:03:45 crc kubenswrapper[4707]: I1127 16:03:45.113724 4707 kubelet.go:854] "Not starting ClusterTrustBundle informer because we are in static kubelet mode"
Nov 27 16:03:45 crc kubenswrapper[4707]: I1127 16:03:45.115554 4707 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume"
Nov 27 16:03:45 crc kubenswrapper[4707]: I1127 16:03:45.115601 4707 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir"
Nov 27 16:03:45 crc kubenswrapper[4707]: I1127 16:03:45.115618 4707 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/git-repo"
Nov 27 16:03:45 crc kubenswrapper[4707]: I1127 16:03:45.115633 4707 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/host-path"
Nov 27 16:03:45 crc kubenswrapper[4707]: I1127 16:03:45.115720 4707 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/nfs"
Nov 27 16:03:45 crc kubenswrapper[4707]: I1127 16:03:45.115796 4707 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/secret"
Nov 27 16:03:45 crc kubenswrapper[4707]: I1127 16:03:45.115820 4707 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/iscsi"
Nov 27 16:03:45 crc kubenswrapper[4707]: I1127 16:03:45.115889 4707 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/downward-api"
Nov 27 16:03:45 crc kubenswrapper[4707]: I1127 16:03:45.115909 4707 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/fc"
Nov 27 16:03:45 crc kubenswrapper[4707]: I1127 16:03:45.115923 4707 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/configmap"
Nov 27 16:03:45 crc kubenswrapper[4707]: I1127 16:03:45.115965 4707 
plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/projected" Nov 27 16:03:45 crc kubenswrapper[4707]: I1127 16:03:45.115980 4707 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Nov 27 16:03:45 crc kubenswrapper[4707]: I1127 16:03:45.116027 4707 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/csi" Nov 27 16:03:45 crc kubenswrapper[4707]: W1127 16:03:45.116081 4707 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.30:6443: connect: connection refused Nov 27 16:03:45 crc kubenswrapper[4707]: W1127 16:03:45.116090 4707 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.30:6443: connect: connection refused Nov 27 16:03:45 crc kubenswrapper[4707]: E1127 16:03:45.116206 4707 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.30:6443: connect: connection refused" logger="UnhandledError" Nov 27 16:03:45 crc kubenswrapper[4707]: E1127 16:03:45.116248 4707 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.30:6443: connect: connection refused" logger="UnhandledError" Nov 27 16:03:45 crc kubenswrapper[4707]: I1127 16:03:45.116885 4707 server.go:1280] "Started kubelet" Nov 27 16:03:45 crc kubenswrapper[4707]: I1127 
16:03:45.117139 4707 server.go:163] "Starting to listen" address="0.0.0.0" port=10250 Nov 27 16:03:45 crc kubenswrapper[4707]: I1127 16:03:45.117099 4707 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Nov 27 16:03:45 crc kubenswrapper[4707]: I1127 16:03:45.117683 4707 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.30:6443: connect: connection refused Nov 27 16:03:45 crc kubenswrapper[4707]: I1127 16:03:45.118105 4707 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Nov 27 16:03:45 crc systemd[1]: Started Kubernetes Kubelet. Nov 27 16:03:45 crc kubenswrapper[4707]: I1127 16:03:45.125533 4707 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate rotation is enabled Nov 27 16:03:45 crc kubenswrapper[4707]: I1127 16:03:45.125776 4707 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Nov 27 16:03:45 crc kubenswrapper[4707]: I1127 16:03:45.125781 4707 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-13 14:05:41.532653885 +0000 UTC Nov 27 16:03:45 crc kubenswrapper[4707]: E1127 16:03:45.127426 4707 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Nov 27 16:03:45 crc kubenswrapper[4707]: I1127 16:03:45.127692 4707 volume_manager.go:287] "The desired_state_of_world populator starts" Nov 27 16:03:45 crc kubenswrapper[4707]: I1127 16:03:45.127710 4707 volume_manager.go:289] "Starting Kubelet Volume Manager" Nov 27 16:03:45 crc kubenswrapper[4707]: I1127 16:03:45.127744 4707 server.go:460] "Adding debug handlers to kubelet server" Nov 27 16:03:45 crc kubenswrapper[4707]: I1127 16:03:45.127895 4707 desired_state_of_world_populator.go:146] 
"Desired state populator starts to run" Nov 27 16:03:45 crc kubenswrapper[4707]: I1127 16:03:45.126087 4707 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 1126h1m56.406814811s for next certificate rotation Nov 27 16:03:45 crc kubenswrapper[4707]: E1127 16:03:45.129004 4707 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.30:6443: connect: connection refused" interval="200ms" Nov 27 16:03:45 crc kubenswrapper[4707]: I1127 16:03:45.136118 4707 factory.go:55] Registering systemd factory Nov 27 16:03:45 crc kubenswrapper[4707]: I1127 16:03:45.136165 4707 factory.go:221] Registration of the systemd container factory successfully Nov 27 16:03:45 crc kubenswrapper[4707]: W1127 16:03:45.136466 4707 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.30:6443: connect: connection refused Nov 27 16:03:45 crc kubenswrapper[4707]: E1127 16:03:45.136580 4707 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.30:6443: connect: connection refused" logger="UnhandledError" Nov 27 16:03:45 crc kubenswrapper[4707]: I1127 16:03:45.137865 4707 factory.go:153] Registering CRI-O factory Nov 27 16:03:45 crc kubenswrapper[4707]: I1127 16:03:45.137925 4707 factory.go:221] Registration of the crio container factory successfully Nov 27 16:03:45 crc kubenswrapper[4707]: I1127 16:03:45.138054 4707 factory.go:219] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api 
service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Nov 27 16:03:45 crc kubenswrapper[4707]: I1127 16:03:45.138098 4707 factory.go:103] Registering Raw factory Nov 27 16:03:45 crc kubenswrapper[4707]: I1127 16:03:45.138132 4707 manager.go:1196] Started watching for new ooms in manager Nov 27 16:03:45 crc kubenswrapper[4707]: E1127 16:03:45.136868 4707 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.102.83.30:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.187be89985b8b991 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-11-27 16:03:45.116797329 +0000 UTC m=+0.748246127,LastTimestamp:2025-11-27 16:03:45.116797329 +0000 UTC m=+0.748246127,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Nov 27 16:03:45 crc kubenswrapper[4707]: I1127 16:03:45.139347 4707 manager.go:319] Starting recovery of all containers Nov 27 16:03:45 crc kubenswrapper[4707]: I1127 16:03:45.148671 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" seLinuxMountContext="" Nov 27 16:03:45 crc kubenswrapper[4707]: I1127 16:03:45.148760 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" seLinuxMountContext="" Nov 27 16:03:45 crc kubenswrapper[4707]: I1127 16:03:45.148789 4707 
reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides" seLinuxMountContext="" Nov 27 16:03:45 crc kubenswrapper[4707]: I1127 16:03:45.148851 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d751cbb-f2e2-430d-9754-c882a5e924a5" volumeName="kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl" seLinuxMountContext="" Nov 27 16:03:45 crc kubenswrapper[4707]: I1127 16:03:45.148882 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" seLinuxMountContext="" Nov 27 16:03:45 crc kubenswrapper[4707]: I1127 16:03:45.148913 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" seLinuxMountContext="" Nov 27 16:03:45 crc kubenswrapper[4707]: I1127 16:03:45.148943 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" seLinuxMountContext="" Nov 27 16:03:45 crc kubenswrapper[4707]: I1127 16:03:45.148971 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" seLinuxMountContext="" Nov 27 16:03:45 crc kubenswrapper[4707]: I1127 16:03:45.149053 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" 
pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" seLinuxMountContext="" Nov 27 16:03:45 crc kubenswrapper[4707]: I1127 16:03:45.149085 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" seLinuxMountContext="" Nov 27 16:03:45 crc kubenswrapper[4707]: I1127 16:03:45.149104 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" seLinuxMountContext="" Nov 27 16:03:45 crc kubenswrapper[4707]: I1127 16:03:45.149125 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" seLinuxMountContext="" Nov 27 16:03:45 crc kubenswrapper[4707]: I1127 16:03:45.149181 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" seLinuxMountContext="" Nov 27 16:03:45 crc kubenswrapper[4707]: I1127 16:03:45.149207 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script" seLinuxMountContext="" Nov 27 16:03:45 crc kubenswrapper[4707]: I1127 16:03:45.149228 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" seLinuxMountContext="" Nov 27 16:03:45 crc kubenswrapper[4707]: I1127 16:03:45.149252 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" seLinuxMountContext="" Nov 27 16:03:45 crc kubenswrapper[4707]: I1127 16:03:45.149292 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls" seLinuxMountContext="" Nov 27 16:03:45 crc kubenswrapper[4707]: I1127 16:03:45.149319 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" seLinuxMountContext="" Nov 27 16:03:45 crc kubenswrapper[4707]: I1127 16:03:45.149343 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" seLinuxMountContext="" Nov 27 16:03:45 crc kubenswrapper[4707]: I1127 16:03:45.149509 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm" seLinuxMountContext="" Nov 27 16:03:45 crc kubenswrapper[4707]: I1127 16:03:45.149549 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" 
volumeName="kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" seLinuxMountContext="" Nov 27 16:03:45 crc kubenswrapper[4707]: I1127 16:03:45.149578 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" seLinuxMountContext="" Nov 27 16:03:45 crc kubenswrapper[4707]: I1127 16:03:45.149657 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" seLinuxMountContext="" Nov 27 16:03:45 crc kubenswrapper[4707]: I1127 16:03:45.149683 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" seLinuxMountContext="" Nov 27 16:03:45 crc kubenswrapper[4707]: I1127 16:03:45.149707 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" seLinuxMountContext="" Nov 27 16:03:45 crc kubenswrapper[4707]: I1127 16:03:45.149734 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" seLinuxMountContext="" Nov 27 16:03:45 crc kubenswrapper[4707]: I1127 16:03:45.149761 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" 
volumeName="kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" seLinuxMountContext="" Nov 27 16:03:45 crc kubenswrapper[4707]: I1127 16:03:45.149783 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" seLinuxMountContext="" Nov 27 16:03:45 crc kubenswrapper[4707]: I1127 16:03:45.149801 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" seLinuxMountContext="" Nov 27 16:03:45 crc kubenswrapper[4707]: I1127 16:03:45.149819 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" seLinuxMountContext="" Nov 27 16:03:45 crc kubenswrapper[4707]: I1127 16:03:45.149837 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" seLinuxMountContext="" Nov 27 16:03:45 crc kubenswrapper[4707]: I1127 16:03:45.149856 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" seLinuxMountContext="" Nov 27 16:03:45 crc kubenswrapper[4707]: I1127 16:03:45.149878 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" seLinuxMountContext="" Nov 
27 16:03:45 crc kubenswrapper[4707]: I1127 16:03:45.149896 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" seLinuxMountContext="" Nov 27 16:03:45 crc kubenswrapper[4707]: I1127 16:03:45.149915 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" seLinuxMountContext="" Nov 27 16:03:45 crc kubenswrapper[4707]: I1127 16:03:45.149937 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" seLinuxMountContext="" Nov 27 16:03:45 crc kubenswrapper[4707]: I1127 16:03:45.149960 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" seLinuxMountContext="" Nov 27 16:03:45 crc kubenswrapper[4707]: I1127 16:03:45.149978 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" seLinuxMountContext="" Nov 27 16:03:45 crc kubenswrapper[4707]: I1127 16:03:45.149999 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" seLinuxMountContext="" Nov 27 16:03:45 crc kubenswrapper[4707]: I1127 16:03:45.150017 4707 reconstruct.go:130] "Volume is marked as 
uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" seLinuxMountContext="" Nov 27 16:03:45 crc kubenswrapper[4707]: I1127 16:03:45.150036 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" seLinuxMountContext="" Nov 27 16:03:45 crc kubenswrapper[4707]: I1127 16:03:45.150059 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert" seLinuxMountContext="" Nov 27 16:03:45 crc kubenswrapper[4707]: I1127 16:03:45.150077 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" seLinuxMountContext="" Nov 27 16:03:45 crc kubenswrapper[4707]: I1127 16:03:45.150095 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" seLinuxMountContext="" Nov 27 16:03:45 crc kubenswrapper[4707]: I1127 16:03:45.150115 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" seLinuxMountContext="" Nov 27 16:03:45 crc kubenswrapper[4707]: I1127 16:03:45.150135 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" seLinuxMountContext="" Nov 27 16:03:45 crc kubenswrapper[4707]: I1127 16:03:45.150155 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" seLinuxMountContext="" Nov 27 16:03:45 crc kubenswrapper[4707]: I1127 16:03:45.150174 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" seLinuxMountContext="" Nov 27 16:03:45 crc kubenswrapper[4707]: I1127 16:03:45.150195 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" seLinuxMountContext="" Nov 27 16:03:45 crc kubenswrapper[4707]: I1127 16:03:45.150213 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" seLinuxMountContext="" Nov 27 16:03:45 crc kubenswrapper[4707]: I1127 16:03:45.150234 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" seLinuxMountContext="" Nov 27 16:03:45 crc kubenswrapper[4707]: I1127 16:03:45.150254 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" 
volumeName="kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" seLinuxMountContext="" Nov 27 16:03:45 crc kubenswrapper[4707]: I1127 16:03:45.150355 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" seLinuxMountContext="" Nov 27 16:03:45 crc kubenswrapper[4707]: I1127 16:03:45.150409 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" seLinuxMountContext="" Nov 27 16:03:45 crc kubenswrapper[4707]: I1127 16:03:45.150433 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" seLinuxMountContext="" Nov 27 16:03:45 crc kubenswrapper[4707]: I1127 16:03:45.150454 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" seLinuxMountContext="" Nov 27 16:03:45 crc kubenswrapper[4707]: I1127 16:03:45.150473 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" seLinuxMountContext="" Nov 27 16:03:45 crc kubenswrapper[4707]: I1127 16:03:45.150495 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" 
seLinuxMountContext="" Nov 27 16:03:45 crc kubenswrapper[4707]: I1127 16:03:45.150514 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" volumeName="kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" seLinuxMountContext="" Nov 27 16:03:45 crc kubenswrapper[4707]: I1127 16:03:45.150532 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" seLinuxMountContext="" Nov 27 16:03:45 crc kubenswrapper[4707]: I1127 16:03:45.150555 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" seLinuxMountContext="" Nov 27 16:03:45 crc kubenswrapper[4707]: I1127 16:03:45.150582 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" seLinuxMountContext="" Nov 27 16:03:45 crc kubenswrapper[4707]: I1127 16:03:45.150609 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" seLinuxMountContext="" Nov 27 16:03:45 crc kubenswrapper[4707]: I1127 16:03:45.150637 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" seLinuxMountContext="" Nov 27 16:03:45 crc kubenswrapper[4707]: I1127 
16:03:45.150665 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" seLinuxMountContext="" Nov 27 16:03:45 crc kubenswrapper[4707]: I1127 16:03:45.150685 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" seLinuxMountContext="" Nov 27 16:03:45 crc kubenswrapper[4707]: I1127 16:03:45.150706 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" seLinuxMountContext="" Nov 27 16:03:45 crc kubenswrapper[4707]: I1127 16:03:45.150726 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" seLinuxMountContext="" Nov 27 16:03:45 crc kubenswrapper[4707]: I1127 16:03:45.150745 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" seLinuxMountContext="" Nov 27 16:03:45 crc kubenswrapper[4707]: I1127 16:03:45.150763 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" seLinuxMountContext="" Nov 27 16:03:45 crc kubenswrapper[4707]: I1127 16:03:45.150781 4707 reconstruct.go:130] "Volume is marked as uncertain and added into 
the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" seLinuxMountContext="" Nov 27 16:03:45 crc kubenswrapper[4707]: I1127 16:03:45.150801 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" seLinuxMountContext="" Nov 27 16:03:45 crc kubenswrapper[4707]: I1127 16:03:45.150819 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" seLinuxMountContext="" Nov 27 16:03:45 crc kubenswrapper[4707]: I1127 16:03:45.150842 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" seLinuxMountContext="" Nov 27 16:03:45 crc kubenswrapper[4707]: I1127 16:03:45.150860 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" seLinuxMountContext="" Nov 27 16:03:45 crc kubenswrapper[4707]: I1127 16:03:45.150879 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" seLinuxMountContext="" Nov 27 16:03:45 crc kubenswrapper[4707]: I1127 16:03:45.150898 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" seLinuxMountContext="" Nov 27 16:03:45 crc kubenswrapper[4707]: I1127 16:03:45.150917 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49ef4625-1d3a-4a9f-b595-c2433d32326d" volumeName="kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" seLinuxMountContext="" Nov 27 16:03:45 crc kubenswrapper[4707]: I1127 16:03:45.150947 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" seLinuxMountContext="" Nov 27 16:03:45 crc kubenswrapper[4707]: I1127 16:03:45.150965 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" seLinuxMountContext="" Nov 27 16:03:45 crc kubenswrapper[4707]: I1127 16:03:45.150984 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" seLinuxMountContext="" Nov 27 16:03:45 crc kubenswrapper[4707]: I1127 16:03:45.151002 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" seLinuxMountContext="" Nov 27 16:03:45 crc kubenswrapper[4707]: I1127 16:03:45.151023 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" 
volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" seLinuxMountContext="" Nov 27 16:03:45 crc kubenswrapper[4707]: I1127 16:03:45.151042 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" seLinuxMountContext="" Nov 27 16:03:45 crc kubenswrapper[4707]: I1127 16:03:45.151063 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" seLinuxMountContext="" Nov 27 16:03:45 crc kubenswrapper[4707]: I1127 16:03:45.151084 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" volumeName="kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" seLinuxMountContext="" Nov 27 16:03:45 crc kubenswrapper[4707]: I1127 16:03:45.151104 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" seLinuxMountContext="" Nov 27 16:03:45 crc kubenswrapper[4707]: I1127 16:03:45.151121 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" seLinuxMountContext="" Nov 27 16:03:45 crc kubenswrapper[4707]: I1127 16:03:45.151175 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" 
volumeName="kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" seLinuxMountContext="" Nov 27 16:03:45 crc kubenswrapper[4707]: I1127 16:03:45.151194 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" seLinuxMountContext="" Nov 27 16:03:45 crc kubenswrapper[4707]: I1127 16:03:45.151216 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" seLinuxMountContext="" Nov 27 16:03:45 crc kubenswrapper[4707]: I1127 16:03:45.151234 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" seLinuxMountContext="" Nov 27 16:03:45 crc kubenswrapper[4707]: I1127 16:03:45.151254 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" seLinuxMountContext="" Nov 27 16:03:45 crc kubenswrapper[4707]: I1127 16:03:45.151272 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" seLinuxMountContext="" Nov 27 16:03:45 crc kubenswrapper[4707]: I1127 16:03:45.151291 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="44663579-783b-4372-86d6-acf235a62d72" 
volumeName="kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" seLinuxMountContext="" Nov 27 16:03:45 crc kubenswrapper[4707]: I1127 16:03:45.151310 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" seLinuxMountContext="" Nov 27 16:03:45 crc kubenswrapper[4707]: I1127 16:03:45.151329 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" seLinuxMountContext="" Nov 27 16:03:45 crc kubenswrapper[4707]: I1127 16:03:45.151346 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" seLinuxMountContext="" Nov 27 16:03:45 crc kubenswrapper[4707]: I1127 16:03:45.151364 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" seLinuxMountContext="" Nov 27 16:03:45 crc kubenswrapper[4707]: I1127 16:03:45.151412 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" seLinuxMountContext="" Nov 27 16:03:45 crc kubenswrapper[4707]: I1127 16:03:45.151432 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" 
volumeName="kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" seLinuxMountContext="" Nov 27 16:03:45 crc kubenswrapper[4707]: I1127 16:03:45.151450 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" seLinuxMountContext="" Nov 27 16:03:45 crc kubenswrapper[4707]: I1127 16:03:45.151469 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" seLinuxMountContext="" Nov 27 16:03:45 crc kubenswrapper[4707]: I1127 16:03:45.151501 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" seLinuxMountContext="" Nov 27 16:03:45 crc kubenswrapper[4707]: I1127 16:03:45.151528 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" seLinuxMountContext="" Nov 27 16:03:45 crc kubenswrapper[4707]: I1127 16:03:45.151550 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" seLinuxMountContext="" Nov 27 16:03:45 crc kubenswrapper[4707]: I1127 16:03:45.151571 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" 
volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" seLinuxMountContext="" Nov 27 16:03:45 crc kubenswrapper[4707]: I1127 16:03:45.151593 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" seLinuxMountContext="" Nov 27 16:03:45 crc kubenswrapper[4707]: I1127 16:03:45.151614 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" seLinuxMountContext="" Nov 27 16:03:45 crc kubenswrapper[4707]: I1127 16:03:45.151632 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" seLinuxMountContext="" Nov 27 16:03:45 crc kubenswrapper[4707]: I1127 16:03:45.151651 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" seLinuxMountContext="" Nov 27 16:03:45 crc kubenswrapper[4707]: I1127 16:03:45.151683 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb" seLinuxMountContext="" Nov 27 16:03:45 crc kubenswrapper[4707]: I1127 16:03:45.151713 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" 
volumeName="kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" seLinuxMountContext="" Nov 27 16:03:45 crc kubenswrapper[4707]: I1127 16:03:45.151741 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" seLinuxMountContext="" Nov 27 16:03:45 crc kubenswrapper[4707]: I1127 16:03:45.151768 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" seLinuxMountContext="" Nov 27 16:03:45 crc kubenswrapper[4707]: I1127 16:03:45.151796 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" seLinuxMountContext="" Nov 27 16:03:45 crc kubenswrapper[4707]: I1127 16:03:45.151817 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" seLinuxMountContext="" Nov 27 16:03:45 crc kubenswrapper[4707]: I1127 16:03:45.151835 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" seLinuxMountContext="" Nov 27 16:03:45 crc kubenswrapper[4707]: I1127 16:03:45.151853 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" 
volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" seLinuxMountContext="" Nov 27 16:03:45 crc kubenswrapper[4707]: I1127 16:03:45.151870 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" seLinuxMountContext="" Nov 27 16:03:45 crc kubenswrapper[4707]: I1127 16:03:45.151889 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" seLinuxMountContext="" Nov 27 16:03:45 crc kubenswrapper[4707]: I1127 16:03:45.151910 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" seLinuxMountContext="" Nov 27 16:03:45 crc kubenswrapper[4707]: I1127 16:03:45.151927 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" seLinuxMountContext="" Nov 27 16:03:45 crc kubenswrapper[4707]: I1127 16:03:45.151945 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf" seLinuxMountContext="" Nov 27 16:03:45 crc kubenswrapper[4707]: I1127 16:03:45.151963 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" 
seLinuxMountContext="" Nov 27 16:03:45 crc kubenswrapper[4707]: I1127 16:03:45.151983 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" seLinuxMountContext="" Nov 27 16:03:45 crc kubenswrapper[4707]: I1127 16:03:45.152002 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" seLinuxMountContext="" Nov 27 16:03:45 crc kubenswrapper[4707]: I1127 16:03:45.152021 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" seLinuxMountContext="" Nov 27 16:03:45 crc kubenswrapper[4707]: I1127 16:03:45.152042 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" seLinuxMountContext="" Nov 27 16:03:45 crc kubenswrapper[4707]: I1127 16:03:45.152060 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" seLinuxMountContext="" Nov 27 16:03:45 crc kubenswrapper[4707]: I1127 16:03:45.152077 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" seLinuxMountContext="" Nov 27 16:03:45 crc kubenswrapper[4707]: 
I1127 16:03:45.152095 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf" seLinuxMountContext="" Nov 27 16:03:45 crc kubenswrapper[4707]: I1127 16:03:45.152113 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3b6479f0-333b-4a96-9adf-2099afdc2447" volumeName="kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr" seLinuxMountContext="" Nov 27 16:03:45 crc kubenswrapper[4707]: I1127 16:03:45.152134 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" seLinuxMountContext="" Nov 27 16:03:45 crc kubenswrapper[4707]: I1127 16:03:45.152152 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" seLinuxMountContext="" Nov 27 16:03:45 crc kubenswrapper[4707]: I1127 16:03:45.152169 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" seLinuxMountContext="" Nov 27 16:03:45 crc kubenswrapper[4707]: I1127 16:03:45.152188 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" seLinuxMountContext="" Nov 27 16:03:45 crc kubenswrapper[4707]: I1127 16:03:45.152209 4707 reconstruct.go:130] "Volume is marked as uncertain and 
added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" seLinuxMountContext="" Nov 27 16:03:45 crc kubenswrapper[4707]: I1127 16:03:45.152227 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" seLinuxMountContext="" Nov 27 16:03:45 crc kubenswrapper[4707]: I1127 16:03:45.152245 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" seLinuxMountContext="" Nov 27 16:03:45 crc kubenswrapper[4707]: I1127 16:03:45.152265 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" seLinuxMountContext="" Nov 27 16:03:45 crc kubenswrapper[4707]: I1127 16:03:45.152285 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" seLinuxMountContext="" Nov 27 16:03:45 crc kubenswrapper[4707]: I1127 16:03:45.152304 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" seLinuxMountContext="" Nov 27 16:03:45 crc kubenswrapper[4707]: I1127 16:03:45.152328 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" seLinuxMountContext="" Nov 27 16:03:45 crc kubenswrapper[4707]: I1127 16:03:45.152350 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" seLinuxMountContext="" Nov 27 16:03:45 crc kubenswrapper[4707]: I1127 16:03:45.152418 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" seLinuxMountContext="" Nov 27 16:03:45 crc kubenswrapper[4707]: I1127 16:03:45.152439 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert" seLinuxMountContext="" Nov 27 16:03:45 crc kubenswrapper[4707]: I1127 16:03:45.152457 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" seLinuxMountContext="" Nov 27 16:03:45 crc kubenswrapper[4707]: I1127 16:03:45.152481 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" seLinuxMountContext="" Nov 27 16:03:45 crc kubenswrapper[4707]: I1127 16:03:45.152499 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" seLinuxMountContext="" Nov 27 16:03:45 crc kubenswrapper[4707]: I1127 16:03:45.152517 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" seLinuxMountContext="" Nov 27 16:03:45 crc kubenswrapper[4707]: I1127 16:03:45.152536 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" seLinuxMountContext="" Nov 27 16:03:45 crc kubenswrapper[4707]: I1127 16:03:45.152555 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" seLinuxMountContext="" Nov 27 16:03:45 crc kubenswrapper[4707]: I1127 16:03:45.152577 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" seLinuxMountContext="" Nov 27 16:03:45 crc kubenswrapper[4707]: I1127 16:03:45.152595 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" seLinuxMountContext="" Nov 27 16:03:45 crc kubenswrapper[4707]: I1127 16:03:45.152615 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" 
volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" seLinuxMountContext="" Nov 27 16:03:45 crc kubenswrapper[4707]: I1127 16:03:45.152634 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" seLinuxMountContext="" Nov 27 16:03:45 crc kubenswrapper[4707]: I1127 16:03:45.152655 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" seLinuxMountContext="" Nov 27 16:03:45 crc kubenswrapper[4707]: I1127 16:03:45.152678 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" seLinuxMountContext="" Nov 27 16:03:45 crc kubenswrapper[4707]: I1127 16:03:45.152696 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" seLinuxMountContext="" Nov 27 16:03:45 crc kubenswrapper[4707]: I1127 16:03:45.152716 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" seLinuxMountContext="" Nov 27 16:03:45 crc kubenswrapper[4707]: I1127 16:03:45.152738 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" 
seLinuxMountContext="" Nov 27 16:03:45 crc kubenswrapper[4707]: I1127 16:03:45.152758 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" seLinuxMountContext="" Nov 27 16:03:45 crc kubenswrapper[4707]: I1127 16:03:45.152776 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" seLinuxMountContext="" Nov 27 16:03:45 crc kubenswrapper[4707]: I1127 16:03:45.152822 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" seLinuxMountContext="" Nov 27 16:03:45 crc kubenswrapper[4707]: I1127 16:03:45.152842 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" seLinuxMountContext="" Nov 27 16:03:45 crc kubenswrapper[4707]: I1127 16:03:45.152859 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" seLinuxMountContext="" Nov 27 16:03:45 crc kubenswrapper[4707]: I1127 16:03:45.152880 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" seLinuxMountContext="" Nov 27 16:03:45 crc kubenswrapper[4707]: I1127 16:03:45.152898 4707 
reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" seLinuxMountContext="" Nov 27 16:03:45 crc kubenswrapper[4707]: I1127 16:03:45.152917 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" seLinuxMountContext="" Nov 27 16:03:45 crc kubenswrapper[4707]: I1127 16:03:45.152935 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" seLinuxMountContext="" Nov 27 16:03:45 crc kubenswrapper[4707]: I1127 16:03:45.152954 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" seLinuxMountContext="" Nov 27 16:03:45 crc kubenswrapper[4707]: I1127 16:03:45.152974 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" seLinuxMountContext="" Nov 27 16:03:45 crc kubenswrapper[4707]: I1127 16:03:45.152992 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" seLinuxMountContext="" Nov 27 16:03:45 crc kubenswrapper[4707]: I1127 16:03:45.153010 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" seLinuxMountContext="" Nov 27 16:03:45 crc kubenswrapper[4707]: I1127 16:03:45.153034 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" seLinuxMountContext="" Nov 27 16:03:45 crc kubenswrapper[4707]: I1127 16:03:45.153054 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" seLinuxMountContext="" Nov 27 16:03:45 crc kubenswrapper[4707]: I1127 16:03:45.153072 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" seLinuxMountContext="" Nov 27 16:03:45 crc kubenswrapper[4707]: I1127 16:03:45.153093 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" seLinuxMountContext="" Nov 27 16:03:45 crc kubenswrapper[4707]: I1127 16:03:45.153120 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" seLinuxMountContext="" Nov 27 16:03:45 crc kubenswrapper[4707]: I1127 16:03:45.153148 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" 
volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" seLinuxMountContext="" Nov 27 16:03:45 crc kubenswrapper[4707]: I1127 16:03:45.153173 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" seLinuxMountContext="" Nov 27 16:03:45 crc kubenswrapper[4707]: I1127 16:03:45.153198 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" seLinuxMountContext="" Nov 27 16:03:45 crc kubenswrapper[4707]: I1127 16:03:45.153218 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" seLinuxMountContext="" Nov 27 16:03:45 crc kubenswrapper[4707]: I1127 16:03:45.153236 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" seLinuxMountContext="" Nov 27 16:03:45 crc kubenswrapper[4707]: I1127 16:03:45.153257 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" seLinuxMountContext="" Nov 27 16:03:45 crc kubenswrapper[4707]: I1127 16:03:45.153275 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" 
seLinuxMountContext="" Nov 27 16:03:45 crc kubenswrapper[4707]: I1127 16:03:45.153292 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" seLinuxMountContext="" Nov 27 16:03:45 crc kubenswrapper[4707]: I1127 16:03:45.153312 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" seLinuxMountContext="" Nov 27 16:03:45 crc kubenswrapper[4707]: I1127 16:03:45.156256 4707 reconstruct.go:144] "Volume is marked device as uncertain and added into the actual state" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" deviceMountPath="/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount" Nov 27 16:03:45 crc kubenswrapper[4707]: I1127 16:03:45.156404 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" seLinuxMountContext="" Nov 27 16:03:45 crc kubenswrapper[4707]: I1127 16:03:45.156428 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" seLinuxMountContext="" Nov 27 16:03:45 crc kubenswrapper[4707]: I1127 16:03:45.156457 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" 
volumeName="kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5" seLinuxMountContext="" Nov 27 16:03:45 crc kubenswrapper[4707]: I1127 16:03:45.156474 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" seLinuxMountContext="" Nov 27 16:03:45 crc kubenswrapper[4707]: I1127 16:03:45.156500 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" seLinuxMountContext="" Nov 27 16:03:45 crc kubenswrapper[4707]: I1127 16:03:45.156516 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" seLinuxMountContext="" Nov 27 16:03:45 crc kubenswrapper[4707]: I1127 16:03:45.156531 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" seLinuxMountContext="" Nov 27 16:03:45 crc kubenswrapper[4707]: I1127 16:03:45.156560 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" seLinuxMountContext="" Nov 27 16:03:45 crc kubenswrapper[4707]: I1127 16:03:45.156577 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" 
seLinuxMountContext="" Nov 27 16:03:45 crc kubenswrapper[4707]: I1127 16:03:45.159348 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" seLinuxMountContext="" Nov 27 16:03:45 crc kubenswrapper[4707]: I1127 16:03:45.159582 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" seLinuxMountContext="" Nov 27 16:03:45 crc kubenswrapper[4707]: I1127 16:03:45.159609 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" seLinuxMountContext="" Nov 27 16:03:45 crc kubenswrapper[4707]: I1127 16:03:45.159689 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" seLinuxMountContext="" Nov 27 16:03:45 crc kubenswrapper[4707]: I1127 16:03:45.159711 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" seLinuxMountContext="" Nov 27 16:03:45 crc kubenswrapper[4707]: I1127 16:03:45.159736 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" seLinuxMountContext="" Nov 27 16:03:45 crc kubenswrapper[4707]: I1127 
16:03:45.159768 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" seLinuxMountContext="" Nov 27 16:03:45 crc kubenswrapper[4707]: I1127 16:03:45.159786 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" seLinuxMountContext="" Nov 27 16:03:45 crc kubenswrapper[4707]: I1127 16:03:45.159813 4707 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" seLinuxMountContext="" Nov 27 16:03:45 crc kubenswrapper[4707]: I1127 16:03:45.159826 4707 reconstruct.go:97] "Volume reconstruction finished" Nov 27 16:03:45 crc kubenswrapper[4707]: I1127 16:03:45.159836 4707 reconciler.go:26] "Reconciler: start to sync state" Nov 27 16:03:45 crc kubenswrapper[4707]: I1127 16:03:45.177381 4707 manager.go:324] Recovery completed Nov 27 16:03:45 crc kubenswrapper[4707]: I1127 16:03:45.186891 4707 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 27 16:03:45 crc kubenswrapper[4707]: I1127 16:03:45.190061 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:03:45 crc kubenswrapper[4707]: I1127 16:03:45.190116 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:03:45 crc kubenswrapper[4707]: I1127 16:03:45.190128 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:03:45 crc 
kubenswrapper[4707]: I1127 16:03:45.191237 4707 cpu_manager.go:225] "Starting CPU manager" policy="none" Nov 27 16:03:45 crc kubenswrapper[4707]: I1127 16:03:45.191261 4707 cpu_manager.go:226] "Reconciling" reconcilePeriod="10s" Nov 27 16:03:45 crc kubenswrapper[4707]: I1127 16:03:45.191206 4707 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Nov 27 16:03:45 crc kubenswrapper[4707]: I1127 16:03:45.191280 4707 state_mem.go:36] "Initialized new in-memory state store" Nov 27 16:03:45 crc kubenswrapper[4707]: I1127 16:03:45.193779 4707 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6" Nov 27 16:03:45 crc kubenswrapper[4707]: I1127 16:03:45.193824 4707 status_manager.go:217] "Starting to sync pod status with apiserver" Nov 27 16:03:45 crc kubenswrapper[4707]: I1127 16:03:45.193855 4707 kubelet.go:2335] "Starting kubelet main sync loop" Nov 27 16:03:45 crc kubenswrapper[4707]: W1127 16:03:45.194565 4707 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.30:6443: connect: connection refused Nov 27 16:03:45 crc kubenswrapper[4707]: E1127 16:03:45.194612 4707 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.30:6443: connect: connection refused" logger="UnhandledError" Nov 27 16:03:45 crc kubenswrapper[4707]: E1127 16:03:45.194679 4707 kubelet.go:2359] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Nov 27 16:03:45 crc kubenswrapper[4707]: I1127 16:03:45.209544 4707 policy_none.go:49] "None policy: Start" Nov 27 16:03:45 crc 
kubenswrapper[4707]: I1127 16:03:45.212216 4707 memory_manager.go:170] "Starting memorymanager" policy="None" Nov 27 16:03:45 crc kubenswrapper[4707]: I1127 16:03:45.212469 4707 state_mem.go:35] "Initializing new in-memory state store" Nov 27 16:03:45 crc kubenswrapper[4707]: E1127 16:03:45.227822 4707 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Nov 27 16:03:45 crc kubenswrapper[4707]: I1127 16:03:45.293933 4707 manager.go:334] "Starting Device Plugin manager" Nov 27 16:03:45 crc kubenswrapper[4707]: I1127 16:03:45.294021 4707 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Nov 27 16:03:45 crc kubenswrapper[4707]: I1127 16:03:45.294044 4707 server.go:79] "Starting device plugin registration server" Nov 27 16:03:45 crc kubenswrapper[4707]: I1127 16:03:45.294753 4707 eviction_manager.go:189] "Eviction manager: starting control loop" Nov 27 16:03:45 crc kubenswrapper[4707]: I1127 16:03:45.294789 4707 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Nov 27 16:03:45 crc kubenswrapper[4707]: I1127 16:03:45.294900 4707 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Nov 27 16:03:45 crc kubenswrapper[4707]: E1127 16:03:45.294905 4707 kubelet.go:2359] "Skipping pod synchronization" err="container runtime status check may not have completed yet" Nov 27 16:03:45 crc kubenswrapper[4707]: I1127 16:03:45.295011 4707 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Nov 27 16:03:45 crc kubenswrapper[4707]: I1127 16:03:45.295026 4707 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Nov 27 16:03:45 crc kubenswrapper[4707]: E1127 16:03:45.305719 4707 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Nov 27 16:03:45 crc kubenswrapper[4707]: 
E1127 16:03:45.330440 4707 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.30:6443: connect: connection refused" interval="400ms" Nov 27 16:03:45 crc kubenswrapper[4707]: I1127 16:03:45.395744 4707 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 27 16:03:45 crc kubenswrapper[4707]: I1127 16:03:45.397350 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:03:45 crc kubenswrapper[4707]: I1127 16:03:45.397457 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:03:45 crc kubenswrapper[4707]: I1127 16:03:45.397540 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:03:45 crc kubenswrapper[4707]: I1127 16:03:45.397627 4707 kubelet_node_status.go:76] "Attempting to register node" node="crc" Nov 27 16:03:45 crc kubenswrapper[4707]: E1127 16:03:45.398459 4707 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.30:6443: connect: connection refused" node="crc" Nov 27 16:03:45 crc kubenswrapper[4707]: I1127 16:03:45.495970 4707 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-controller-manager/kube-controller-manager-crc","openshift-kube-scheduler/openshift-kube-scheduler-crc","openshift-machine-config-operator/kube-rbac-proxy-crio-crc","openshift-etcd/etcd-crc","openshift-kube-apiserver/kube-apiserver-crc"] Nov 27 16:03:45 crc kubenswrapper[4707]: I1127 16:03:45.496133 4707 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 27 16:03:45 crc kubenswrapper[4707]: I1127 16:03:45.497822 4707 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:03:45 crc kubenswrapper[4707]: I1127 16:03:45.497874 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:03:45 crc kubenswrapper[4707]: I1127 16:03:45.497912 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:03:45 crc kubenswrapper[4707]: I1127 16:03:45.498177 4707 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 27 16:03:45 crc kubenswrapper[4707]: I1127 16:03:45.498448 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 27 16:03:45 crc kubenswrapper[4707]: I1127 16:03:45.498507 4707 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 27 16:03:45 crc kubenswrapper[4707]: I1127 16:03:45.499675 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:03:45 crc kubenswrapper[4707]: I1127 16:03:45.499711 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:03:45 crc kubenswrapper[4707]: I1127 16:03:45.499725 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:03:45 crc kubenswrapper[4707]: I1127 16:03:45.499847 4707 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 27 16:03:45 crc kubenswrapper[4707]: I1127 16:03:45.500060 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Nov 27 16:03:45 crc kubenswrapper[4707]: I1127 16:03:45.500130 4707 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 27 16:03:45 crc kubenswrapper[4707]: I1127 16:03:45.500640 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:03:45 crc kubenswrapper[4707]: I1127 16:03:45.500694 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:03:45 crc kubenswrapper[4707]: I1127 16:03:45.500717 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:03:45 crc kubenswrapper[4707]: I1127 16:03:45.501558 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:03:45 crc kubenswrapper[4707]: I1127 16:03:45.501604 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:03:45 crc kubenswrapper[4707]: I1127 16:03:45.501623 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:03:45 crc kubenswrapper[4707]: I1127 16:03:45.501817 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:03:45 crc kubenswrapper[4707]: I1127 16:03:45.501889 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:03:45 crc kubenswrapper[4707]: I1127 16:03:45.501947 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:03:45 crc kubenswrapper[4707]: I1127 16:03:45.501906 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Nov 27 16:03:45 crc kubenswrapper[4707]: I1127 16:03:45.502056 4707 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 27 16:03:45 crc kubenswrapper[4707]: I1127 16:03:45.501831 4707 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 27 16:03:45 crc kubenswrapper[4707]: I1127 16:03:45.503025 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:03:45 crc kubenswrapper[4707]: I1127 16:03:45.503054 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:03:45 crc kubenswrapper[4707]: I1127 16:03:45.503065 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:03:45 crc kubenswrapper[4707]: I1127 16:03:45.503289 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:03:45 crc kubenswrapper[4707]: I1127 16:03:45.503310 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:03:45 crc kubenswrapper[4707]: I1127 16:03:45.503321 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:03:45 crc kubenswrapper[4707]: I1127 16:03:45.503719 4707 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 27 16:03:45 crc kubenswrapper[4707]: I1127 16:03:45.503948 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd/etcd-crc" Nov 27 16:03:45 crc kubenswrapper[4707]: I1127 16:03:45.504022 4707 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 27 16:03:45 crc kubenswrapper[4707]: I1127 16:03:45.504599 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:03:45 crc kubenswrapper[4707]: I1127 16:03:45.504643 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:03:45 crc kubenswrapper[4707]: I1127 16:03:45.504659 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:03:45 crc kubenswrapper[4707]: I1127 16:03:45.504903 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 27 16:03:45 crc kubenswrapper[4707]: I1127 16:03:45.504944 4707 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 27 16:03:45 crc kubenswrapper[4707]: I1127 16:03:45.506753 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:03:45 crc kubenswrapper[4707]: I1127 16:03:45.506817 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:03:45 crc kubenswrapper[4707]: I1127 16:03:45.506840 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:03:45 crc kubenswrapper[4707]: I1127 16:03:45.510466 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:03:45 crc kubenswrapper[4707]: I1127 16:03:45.510535 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:03:45 crc kubenswrapper[4707]: I1127 
16:03:45.510561 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:03:45 crc kubenswrapper[4707]: I1127 16:03:45.565062 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 27 16:03:45 crc kubenswrapper[4707]: I1127 16:03:45.565137 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Nov 27 16:03:45 crc kubenswrapper[4707]: I1127 16:03:45.565165 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Nov 27 16:03:45 crc kubenswrapper[4707]: I1127 16:03:45.565199 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Nov 27 16:03:45 crc kubenswrapper[4707]: I1127 16:03:45.565289 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod 
\"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 27 16:03:45 crc kubenswrapper[4707]: I1127 16:03:45.565357 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 27 16:03:45 crc kubenswrapper[4707]: I1127 16:03:45.565434 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Nov 27 16:03:45 crc kubenswrapper[4707]: I1127 16:03:45.565520 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 27 16:03:45 crc kubenswrapper[4707]: I1127 16:03:45.565600 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Nov 27 16:03:45 crc kubenswrapper[4707]: I1127 16:03:45.565667 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: 
\"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Nov 27 16:03:45 crc kubenswrapper[4707]: I1127 16:03:45.565739 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Nov 27 16:03:45 crc kubenswrapper[4707]: I1127 16:03:45.565797 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Nov 27 16:03:45 crc kubenswrapper[4707]: I1127 16:03:45.565833 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Nov 27 16:03:45 crc kubenswrapper[4707]: I1127 16:03:45.565868 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Nov 27 16:03:45 crc kubenswrapper[4707]: I1127 16:03:45.565900 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 27 16:03:45 crc kubenswrapper[4707]: I1127 16:03:45.598582 4707 
kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 27 16:03:45 crc kubenswrapper[4707]: I1127 16:03:45.600537 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:03:45 crc kubenswrapper[4707]: I1127 16:03:45.600634 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:03:45 crc kubenswrapper[4707]: I1127 16:03:45.600653 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:03:45 crc kubenswrapper[4707]: I1127 16:03:45.600698 4707 kubelet_node_status.go:76] "Attempting to register node" node="crc" Nov 27 16:03:45 crc kubenswrapper[4707]: E1127 16:03:45.601478 4707 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.30:6443: connect: connection refused" node="crc" Nov 27 16:03:45 crc kubenswrapper[4707]: I1127 16:03:45.667597 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 27 16:03:45 crc kubenswrapper[4707]: I1127 16:03:45.667679 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 27 16:03:45 crc kubenswrapper[4707]: I1127 16:03:45.667713 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"static-pod-dir\" (UniqueName: 
\"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Nov 27 16:03:45 crc kubenswrapper[4707]: I1127 16:03:45.667732 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Nov 27 16:03:45 crc kubenswrapper[4707]: I1127 16:03:45.667753 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Nov 27 16:03:45 crc kubenswrapper[4707]: I1127 16:03:45.667770 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Nov 27 16:03:45 crc kubenswrapper[4707]: I1127 16:03:45.667785 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Nov 27 16:03:45 crc kubenswrapper[4707]: I1127 16:03:45.667801 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 27 16:03:45 crc kubenswrapper[4707]: I1127 16:03:45.667820 4707 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Nov 27 16:03:45 crc kubenswrapper[4707]: I1127 16:03:45.667839 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Nov 27 16:03:45 crc kubenswrapper[4707]: I1127 16:03:45.667857 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 27 16:03:45 crc kubenswrapper[4707]: I1127 16:03:45.667875 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Nov 27 16:03:45 crc kubenswrapper[4707]: I1127 16:03:45.667898 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 27 16:03:45 crc kubenswrapper[4707]: I1127 16:03:45.667916 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" 
(UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Nov 27 16:03:45 crc kubenswrapper[4707]: I1127 16:03:45.667923 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Nov 27 16:03:45 crc kubenswrapper[4707]: I1127 16:03:45.667965 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Nov 27 16:03:45 crc kubenswrapper[4707]: I1127 16:03:45.667966 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Nov 27 16:03:45 crc kubenswrapper[4707]: I1127 16:03:45.667994 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Nov 27 16:03:45 crc kubenswrapper[4707]: I1127 16:03:45.667931 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Nov 27 16:03:45 crc kubenswrapper[4707]: I1127 16:03:45.667938 4707 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Nov 27 16:03:45 crc kubenswrapper[4707]: I1127 16:03:45.667998 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Nov 27 16:03:45 crc kubenswrapper[4707]: I1127 16:03:45.668068 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 27 16:03:45 crc kubenswrapper[4707]: I1127 16:03:45.668188 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Nov 27 16:03:45 crc kubenswrapper[4707]: I1127 16:03:45.668196 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Nov 27 16:03:45 crc kubenswrapper[4707]: I1127 16:03:45.668191 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: 
\"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 27 16:03:45 crc kubenswrapper[4707]: I1127 16:03:45.668239 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 27 16:03:45 crc kubenswrapper[4707]: I1127 16:03:45.668242 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Nov 27 16:03:45 crc kubenswrapper[4707]: I1127 16:03:45.668269 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 27 16:03:45 crc kubenswrapper[4707]: I1127 16:03:45.668268 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 27 16:03:45 crc kubenswrapper[4707]: I1127 16:03:45.668217 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Nov 27 16:03:45 crc 
kubenswrapper[4707]: E1127 16:03:45.732161 4707 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.30:6443: connect: connection refused" interval="800ms" Nov 27 16:03:45 crc kubenswrapper[4707]: I1127 16:03:45.842867 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 27 16:03:45 crc kubenswrapper[4707]: I1127 16:03:45.857359 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Nov 27 16:03:45 crc kubenswrapper[4707]: I1127 16:03:45.886096 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Nov 27 16:03:45 crc kubenswrapper[4707]: W1127 16:03:45.908055 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf614b9022728cf315e60c057852e563e.slice/crio-7c62b7eaf6b0fdd4d4670810721d2a085edaca637542a348a823f64beed6951c WatchSource:0}: Error finding container 7c62b7eaf6b0fdd4d4670810721d2a085edaca637542a348a823f64beed6951c: Status 404 returned error can't find the container with id 7c62b7eaf6b0fdd4d4670810721d2a085edaca637542a348a823f64beed6951c Nov 27 16:03:45 crc kubenswrapper[4707]: W1127 16:03:45.909151 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3dcd261975c3d6b9a6ad6367fd4facd3.slice/crio-0d2152b558849703b2f8c44fd9e38d227449f02125c8dd024847df8f6e9db980 WatchSource:0}: Error finding container 0d2152b558849703b2f8c44fd9e38d227449f02125c8dd024847df8f6e9db980: Status 404 returned error can't find the container with id 0d2152b558849703b2f8c44fd9e38d227449f02125c8dd024847df8f6e9db980 Nov 27 
16:03:45 crc kubenswrapper[4707]: I1127 16:03:45.913758 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc" Nov 27 16:03:45 crc kubenswrapper[4707]: W1127 16:03:45.918296 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1b160f5dda77d281dd8e69ec8d817f9.slice/crio-214d80088dedf31c8f484c375816032497aa3545a546fa436ce6dd33753ef1e4 WatchSource:0}: Error finding container 214d80088dedf31c8f484c375816032497aa3545a546fa436ce6dd33753ef1e4: Status 404 returned error can't find the container with id 214d80088dedf31c8f484c375816032497aa3545a546fa436ce6dd33753ef1e4 Nov 27 16:03:45 crc kubenswrapper[4707]: I1127 16:03:45.924595 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 27 16:03:45 crc kubenswrapper[4707]: W1127 16:03:45.930808 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2139d3e2895fc6797b9c76a1b4c9886d.slice/crio-6f8477c9e9c6cb8cef7e13547b417360080215002c5c7572b74cad86bb8d6aef WatchSource:0}: Error finding container 6f8477c9e9c6cb8cef7e13547b417360080215002c5c7572b74cad86bb8d6aef: Status 404 returned error can't find the container with id 6f8477c9e9c6cb8cef7e13547b417360080215002c5c7572b74cad86bb8d6aef Nov 27 16:03:45 crc kubenswrapper[4707]: W1127 16:03:45.943299 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4b27818a5e8e43d0dc095d08835c792.slice/crio-c6617bf94dc4eb5bca53a408f2c6e70f9cf17839220e3c08b0f6ab858a14638c WatchSource:0}: Error finding container c6617bf94dc4eb5bca53a408f2c6e70f9cf17839220e3c08b0f6ab858a14638c: Status 404 returned error can't find the container with id c6617bf94dc4eb5bca53a408f2c6e70f9cf17839220e3c08b0f6ab858a14638c Nov 27 16:03:46 crc kubenswrapper[4707]: 
I1127 16:03:46.001635 4707 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 27 16:03:46 crc kubenswrapper[4707]: I1127 16:03:46.003804 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:03:46 crc kubenswrapper[4707]: I1127 16:03:46.003844 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:03:46 crc kubenswrapper[4707]: I1127 16:03:46.003860 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:03:46 crc kubenswrapper[4707]: I1127 16:03:46.003895 4707 kubelet_node_status.go:76] "Attempting to register node" node="crc" Nov 27 16:03:46 crc kubenswrapper[4707]: E1127 16:03:46.004843 4707 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.30:6443: connect: connection refused" node="crc" Nov 27 16:03:46 crc kubenswrapper[4707]: I1127 16:03:46.118783 4707 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.30:6443: connect: connection refused Nov 27 16:03:46 crc kubenswrapper[4707]: I1127 16:03:46.206794 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"6f8477c9e9c6cb8cef7e13547b417360080215002c5c7572b74cad86bb8d6aef"} Nov 27 16:03:46 crc kubenswrapper[4707]: I1127 16:03:46.210491 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"214d80088dedf31c8f484c375816032497aa3545a546fa436ce6dd33753ef1e4"} Nov 27 16:03:46 crc 
kubenswrapper[4707]: I1127 16:03:46.211796 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"0d2152b558849703b2f8c44fd9e38d227449f02125c8dd024847df8f6e9db980"} Nov 27 16:03:46 crc kubenswrapper[4707]: I1127 16:03:46.215548 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"7c62b7eaf6b0fdd4d4670810721d2a085edaca637542a348a823f64beed6951c"} Nov 27 16:03:46 crc kubenswrapper[4707]: I1127 16:03:46.216539 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"c6617bf94dc4eb5bca53a408f2c6e70f9cf17839220e3c08b0f6ab858a14638c"} Nov 27 16:03:46 crc kubenswrapper[4707]: W1127 16:03:46.275808 4707 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.30:6443: connect: connection refused Nov 27 16:03:46 crc kubenswrapper[4707]: E1127 16:03:46.275928 4707 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.30:6443: connect: connection refused" logger="UnhandledError" Nov 27 16:03:46 crc kubenswrapper[4707]: W1127 16:03:46.302800 4707 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 
38.102.83.30:6443: connect: connection refused Nov 27 16:03:46 crc kubenswrapper[4707]: E1127 16:03:46.302907 4707 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.30:6443: connect: connection refused" logger="UnhandledError" Nov 27 16:03:46 crc kubenswrapper[4707]: W1127 16:03:46.348167 4707 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.30:6443: connect: connection refused Nov 27 16:03:46 crc kubenswrapper[4707]: E1127 16:03:46.348279 4707 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.30:6443: connect: connection refused" logger="UnhandledError" Nov 27 16:03:46 crc kubenswrapper[4707]: W1127 16:03:46.502731 4707 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.30:6443: connect: connection refused Nov 27 16:03:46 crc kubenswrapper[4707]: E1127 16:03:46.503846 4707 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.30:6443: connect: connection refused" logger="UnhandledError" Nov 27 16:03:46 crc kubenswrapper[4707]: E1127 16:03:46.567627 4707 controller.go:145] "Failed to ensure lease 
exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.30:6443: connect: connection refused" interval="1.6s" Nov 27 16:03:46 crc kubenswrapper[4707]: I1127 16:03:46.805036 4707 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 27 16:03:46 crc kubenswrapper[4707]: I1127 16:03:46.807984 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:03:46 crc kubenswrapper[4707]: I1127 16:03:46.808023 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:03:46 crc kubenswrapper[4707]: I1127 16:03:46.808038 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:03:46 crc kubenswrapper[4707]: I1127 16:03:46.808072 4707 kubelet_node_status.go:76] "Attempting to register node" node="crc" Nov 27 16:03:46 crc kubenswrapper[4707]: E1127 16:03:46.808658 4707 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.30:6443: connect: connection refused" node="crc" Nov 27 16:03:47 crc kubenswrapper[4707]: I1127 16:03:47.118487 4707 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.30:6443: connect: connection refused Nov 27 16:03:47 crc kubenswrapper[4707]: I1127 16:03:47.221687 4707 generic.go:334] "Generic (PLEG): container finished" podID="3dcd261975c3d6b9a6ad6367fd4facd3" containerID="fda5b69770eee7245fc07549784cad1211d45ceb0d65276764e36021cde11777" exitCode=0 Nov 27 16:03:47 crc kubenswrapper[4707]: I1127 16:03:47.221798 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerDied","Data":"fda5b69770eee7245fc07549784cad1211d45ceb0d65276764e36021cde11777"} Nov 27 16:03:47 crc kubenswrapper[4707]: I1127 16:03:47.221852 4707 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 27 16:03:47 crc kubenswrapper[4707]: I1127 16:03:47.222765 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:03:47 crc kubenswrapper[4707]: I1127 16:03:47.222804 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:03:47 crc kubenswrapper[4707]: I1127 16:03:47.222821 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:03:47 crc kubenswrapper[4707]: I1127 16:03:47.224989 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"1c7144aa332827972c3ff984eb05b835b88c4d9d05552ac631fbca49f13e193a"} Nov 27 16:03:47 crc kubenswrapper[4707]: I1127 16:03:47.225021 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"f0e4a5fa520e0dc048bef42e48e14bd06eb72586ca2b8e145734aa5450aa869d"} Nov 27 16:03:47 crc kubenswrapper[4707]: I1127 16:03:47.225033 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"36f6a5537d5031d21ea157fbcedea40756eac2c44aa1859c0cfbf3dc9a9d9a13"} Nov 27 16:03:47 crc kubenswrapper[4707]: I1127 16:03:47.228080 4707 generic.go:334] "Generic (PLEG): container 
finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="608ad63f8f5352dad650253df603590a1cc20bc05d86dd561d659295264ab4b5" exitCode=0 Nov 27 16:03:47 crc kubenswrapper[4707]: I1127 16:03:47.228129 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"608ad63f8f5352dad650253df603590a1cc20bc05d86dd561d659295264ab4b5"} Nov 27 16:03:47 crc kubenswrapper[4707]: I1127 16:03:47.228193 4707 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 27 16:03:47 crc kubenswrapper[4707]: I1127 16:03:47.228993 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:03:47 crc kubenswrapper[4707]: I1127 16:03:47.229023 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:03:47 crc kubenswrapper[4707]: I1127 16:03:47.229031 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:03:47 crc kubenswrapper[4707]: I1127 16:03:47.230288 4707 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 27 16:03:47 crc kubenswrapper[4707]: I1127 16:03:47.230407 4707 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="6b6be834802616cdf10f8b694bb0ce1b8828d0885e8801c5d890e74837d672a8" exitCode=0 Nov 27 16:03:47 crc kubenswrapper[4707]: I1127 16:03:47.230509 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"6b6be834802616cdf10f8b694bb0ce1b8828d0885e8801c5d890e74837d672a8"} Nov 27 16:03:47 crc kubenswrapper[4707]: I1127 16:03:47.230579 4707 kubelet_node_status.go:401] "Setting node annotation to enable volume controller 
attach/detach" Nov 27 16:03:47 crc kubenswrapper[4707]: I1127 16:03:47.230907 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:03:47 crc kubenswrapper[4707]: I1127 16:03:47.230949 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:03:47 crc kubenswrapper[4707]: I1127 16:03:47.230965 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:03:47 crc kubenswrapper[4707]: I1127 16:03:47.231661 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:03:47 crc kubenswrapper[4707]: I1127 16:03:47.231700 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:03:47 crc kubenswrapper[4707]: I1127 16:03:47.231715 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:03:47 crc kubenswrapper[4707]: I1127 16:03:47.231814 4707 generic.go:334] "Generic (PLEG): container finished" podID="d1b160f5dda77d281dd8e69ec8d817f9" containerID="803038378abbaeb0c775de20a85ccb96a9e420f5e0434a41ad1d8cda81bbd217" exitCode=0 Nov 27 16:03:47 crc kubenswrapper[4707]: I1127 16:03:47.231836 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerDied","Data":"803038378abbaeb0c775de20a85ccb96a9e420f5e0434a41ad1d8cda81bbd217"} Nov 27 16:03:47 crc kubenswrapper[4707]: I1127 16:03:47.231911 4707 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 27 16:03:47 crc kubenswrapper[4707]: I1127 16:03:47.232652 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:03:47 crc 
kubenswrapper[4707]: I1127 16:03:47.232686 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:03:47 crc kubenswrapper[4707]: I1127 16:03:47.232698 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:03:47 crc kubenswrapper[4707]: W1127 16:03:47.915148 4707 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.30:6443: connect: connection refused Nov 27 16:03:47 crc kubenswrapper[4707]: E1127 16:03:47.915303 4707 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.30:6443: connect: connection refused" logger="UnhandledError" Nov 27 16:03:48 crc kubenswrapper[4707]: I1127 16:03:48.119406 4707 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.30:6443: connect: connection refused Nov 27 16:03:48 crc kubenswrapper[4707]: E1127 16:03:48.169130 4707 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.30:6443: connect: connection refused" interval="3.2s" Nov 27 16:03:48 crc kubenswrapper[4707]: I1127 16:03:48.245875 4707 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="de1e5852273af33e41f21c4df29c3e32cbfd6b2eb200b64d9e5353a85718074c" exitCode=0 Nov 27 16:03:48 crc kubenswrapper[4707]: I1127 16:03:48.246004 
4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"de1e5852273af33e41f21c4df29c3e32cbfd6b2eb200b64d9e5353a85718074c"} Nov 27 16:03:48 crc kubenswrapper[4707]: I1127 16:03:48.246014 4707 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 27 16:03:48 crc kubenswrapper[4707]: I1127 16:03:48.247334 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:03:48 crc kubenswrapper[4707]: I1127 16:03:48.247405 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:03:48 crc kubenswrapper[4707]: I1127 16:03:48.247428 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:03:48 crc kubenswrapper[4707]: I1127 16:03:48.248159 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"5738c0efc1bf2bd3189a81861d6548f0599bbbef152df871c689378185304b7f"} Nov 27 16:03:48 crc kubenswrapper[4707]: I1127 16:03:48.248189 4707 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 27 16:03:48 crc kubenswrapper[4707]: I1127 16:03:48.249511 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:03:48 crc kubenswrapper[4707]: I1127 16:03:48.249540 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:03:48 crc kubenswrapper[4707]: I1127 16:03:48.249549 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:03:48 crc kubenswrapper[4707]: I1127 16:03:48.251209 4707 
kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 27 16:03:48 crc kubenswrapper[4707]: I1127 16:03:48.251241 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"25dae96b066cc3da8d6fefd13955cb9d1776c3f9fa93afa6be256bb904f523c7"} Nov 27 16:03:48 crc kubenswrapper[4707]: I1127 16:03:48.251295 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"2df8580fadfd8a1a25979050a7dc698d4fc8177b121311817f40c26e73bf9371"} Nov 27 16:03:48 crc kubenswrapper[4707]: I1127 16:03:48.251306 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"04f66a2313def4600426a075de4f77c75a876b3246344cec810b4cf25ea3505e"} Nov 27 16:03:48 crc kubenswrapper[4707]: I1127 16:03:48.252480 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:03:48 crc kubenswrapper[4707]: I1127 16:03:48.252646 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:03:48 crc kubenswrapper[4707]: I1127 16:03:48.252763 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:03:48 crc kubenswrapper[4707]: I1127 16:03:48.254488 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"8211f74a0c0d16257330ebbb5727247d4229848595af3cb65f1cb752b05f3c37"} Nov 27 16:03:48 crc kubenswrapper[4707]: I1127 16:03:48.254569 4707 
kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 27 16:03:48 crc kubenswrapper[4707]: I1127 16:03:48.255291 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:03:48 crc kubenswrapper[4707]: I1127 16:03:48.255321 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:03:48 crc kubenswrapper[4707]: I1127 16:03:48.255336 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:03:48 crc kubenswrapper[4707]: I1127 16:03:48.261990 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"cf16f5508845d69184573b51f3052f79123556667cbd3ddf4b2adfb135312a3a"} Nov 27 16:03:48 crc kubenswrapper[4707]: I1127 16:03:48.262033 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"7412dcdb5a249551f558a01a8fecef5f873e5e0a16e917d5cbc1ca74a917f228"} Nov 27 16:03:48 crc kubenswrapper[4707]: I1127 16:03:48.262049 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"791f332aec1698d02da5ec585ad2fbfcb77529f24abc0e3a77581be38d7ce277"} Nov 27 16:03:48 crc kubenswrapper[4707]: I1127 16:03:48.262062 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"4a25770e0e3eebd7e807f3775101a1cd5d47073e4a86850da475fecd4a9c88ce"} Nov 27 16:03:48 crc kubenswrapper[4707]: I1127 16:03:48.409145 4707 kubelet_node_status.go:401] "Setting node annotation 
to enable volume controller attach/detach" Nov 27 16:03:48 crc kubenswrapper[4707]: I1127 16:03:48.411475 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:03:48 crc kubenswrapper[4707]: I1127 16:03:48.411527 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:03:48 crc kubenswrapper[4707]: I1127 16:03:48.411538 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:03:48 crc kubenswrapper[4707]: I1127 16:03:48.411559 4707 kubelet_node_status.go:76] "Attempting to register node" node="crc" Nov 27 16:03:48 crc kubenswrapper[4707]: E1127 16:03:48.412156 4707 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.30:6443: connect: connection refused" node="crc" Nov 27 16:03:49 crc kubenswrapper[4707]: I1127 16:03:49.271759 4707 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="c6f7eb0e4b78b6d2e316bd2c8f2ad3081c8eb0ca3c55a7063ccf281572ba2992" exitCode=0 Nov 27 16:03:49 crc kubenswrapper[4707]: I1127 16:03:49.271883 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"c6f7eb0e4b78b6d2e316bd2c8f2ad3081c8eb0ca3c55a7063ccf281572ba2992"} Nov 27 16:03:49 crc kubenswrapper[4707]: I1127 16:03:49.271938 4707 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 27 16:03:49 crc kubenswrapper[4707]: I1127 16:03:49.274473 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:03:49 crc kubenswrapper[4707]: I1127 16:03:49.274591 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Nov 27 16:03:49 crc kubenswrapper[4707]: I1127 16:03:49.274693 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:03:49 crc kubenswrapper[4707]: I1127 16:03:49.277671 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"4f12d0d89698cd6f702c595882dc7ed97fbc585b8a4ef3a8b3755c984a823dd9"} Nov 27 16:03:49 crc kubenswrapper[4707]: I1127 16:03:49.277740 4707 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 27 16:03:49 crc kubenswrapper[4707]: I1127 16:03:49.277817 4707 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 27 16:03:49 crc kubenswrapper[4707]: I1127 16:03:49.277852 4707 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Nov 27 16:03:49 crc kubenswrapper[4707]: I1127 16:03:49.277903 4707 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 27 16:03:49 crc kubenswrapper[4707]: I1127 16:03:49.277919 4707 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 27 16:03:49 crc kubenswrapper[4707]: I1127 16:03:49.278781 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:03:49 crc kubenswrapper[4707]: I1127 16:03:49.278858 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:03:49 crc kubenswrapper[4707]: I1127 16:03:49.278937 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:03:49 crc kubenswrapper[4707]: I1127 16:03:49.279141 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Nov 27 16:03:49 crc kubenswrapper[4707]: I1127 16:03:49.279171 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:03:49 crc kubenswrapper[4707]: I1127 16:03:49.279183 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:03:49 crc kubenswrapper[4707]: I1127 16:03:49.279202 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:03:49 crc kubenswrapper[4707]: I1127 16:03:49.279227 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:03:49 crc kubenswrapper[4707]: I1127 16:03:49.279240 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:03:49 crc kubenswrapper[4707]: I1127 16:03:49.280135 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:03:49 crc kubenswrapper[4707]: I1127 16:03:49.280164 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:03:49 crc kubenswrapper[4707]: I1127 16:03:49.280176 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:03:49 crc kubenswrapper[4707]: I1127 16:03:49.284408 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 27 16:03:49 crc kubenswrapper[4707]: I1127 16:03:49.348989 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 27 16:03:50 crc kubenswrapper[4707]: I1127 16:03:50.286827 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" 
event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"4ea14f8eb17dbb350f7c0ad2285b4bb5ca2d8b2aae6840646d68a397ac2043d3"} Nov 27 16:03:50 crc kubenswrapper[4707]: I1127 16:03:50.287461 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"489349489e52f4a113203c0c196cfe1ef96e97c6fc3c3d777a28e90f22aaee92"} Nov 27 16:03:50 crc kubenswrapper[4707]: I1127 16:03:50.287500 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"30b52b26bdca5b0705186872e6dd2eba3ece6436e866a4a5e30f4c2ecdbbaf20"} Nov 27 16:03:50 crc kubenswrapper[4707]: I1127 16:03:50.286968 4707 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Nov 27 16:03:50 crc kubenswrapper[4707]: I1127 16:03:50.286898 4707 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 27 16:03:50 crc kubenswrapper[4707]: I1127 16:03:50.287606 4707 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 27 16:03:50 crc kubenswrapper[4707]: I1127 16:03:50.289461 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:03:50 crc kubenswrapper[4707]: I1127 16:03:50.289523 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:03:50 crc kubenswrapper[4707]: I1127 16:03:50.289553 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:03:50 crc kubenswrapper[4707]: I1127 16:03:50.290180 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:03:50 crc kubenswrapper[4707]: I1127 16:03:50.290234 4707 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:03:50 crc kubenswrapper[4707]: I1127 16:03:50.290249 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:03:50 crc kubenswrapper[4707]: I1127 16:03:50.537135 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 27 16:03:51 crc kubenswrapper[4707]: I1127 16:03:51.297527 4707 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Nov 27 16:03:51 crc kubenswrapper[4707]: I1127 16:03:51.297606 4707 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 27 16:03:51 crc kubenswrapper[4707]: I1127 16:03:51.297647 4707 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 27 16:03:51 crc kubenswrapper[4707]: I1127 16:03:51.297522 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"6f502a2429578531cb3ebbfde6bb4c049eed201a496ff3772c06ca28cace89b2"} Nov 27 16:03:51 crc kubenswrapper[4707]: I1127 16:03:51.297886 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"775fc674c0dab97e0c5a2548d467b973c592458dad1b88ce92344486a3a28eb5"} Nov 27 16:03:51 crc kubenswrapper[4707]: I1127 16:03:51.299354 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:03:51 crc kubenswrapper[4707]: I1127 16:03:51.299434 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:03:51 crc kubenswrapper[4707]: I1127 16:03:51.299456 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Nov 27 16:03:51 crc kubenswrapper[4707]: I1127 16:03:51.299486 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:03:51 crc kubenswrapper[4707]: I1127 16:03:51.299580 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:03:51 crc kubenswrapper[4707]: I1127 16:03:51.299590 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:03:51 crc kubenswrapper[4707]: I1127 16:03:51.612635 4707 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 27 16:03:51 crc kubenswrapper[4707]: I1127 16:03:51.614544 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:03:51 crc kubenswrapper[4707]: I1127 16:03:51.614629 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:03:51 crc kubenswrapper[4707]: I1127 16:03:51.614650 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:03:51 crc kubenswrapper[4707]: I1127 16:03:51.614699 4707 kubelet_node_status.go:76] "Attempting to register node" node="crc" Nov 27 16:03:51 crc kubenswrapper[4707]: I1127 16:03:51.851038 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-etcd/etcd-crc" Nov 27 16:03:52 crc kubenswrapper[4707]: I1127 16:03:52.301575 4707 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Nov 27 16:03:52 crc kubenswrapper[4707]: I1127 16:03:52.301637 4707 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 27 16:03:52 crc kubenswrapper[4707]: I1127 16:03:52.301686 4707 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 27 
16:03:52 crc kubenswrapper[4707]: I1127 16:03:52.303607 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:03:52 crc kubenswrapper[4707]: I1127 16:03:52.303665 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:03:52 crc kubenswrapper[4707]: I1127 16:03:52.303685 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:03:52 crc kubenswrapper[4707]: I1127 16:03:52.304122 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:03:52 crc kubenswrapper[4707]: I1127 16:03:52.304171 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:03:52 crc kubenswrapper[4707]: I1127 16:03:52.304190 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:03:52 crc kubenswrapper[4707]: I1127 16:03:52.509218 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 27 16:03:52 crc kubenswrapper[4707]: I1127 16:03:52.509720 4707 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 27 16:03:52 crc kubenswrapper[4707]: I1127 16:03:52.511610 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:03:52 crc kubenswrapper[4707]: I1127 16:03:52.511671 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:03:52 crc kubenswrapper[4707]: I1127 16:03:52.511683 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:03:52 crc kubenswrapper[4707]: I1127 16:03:52.517819 4707 kubelet.go:2542] 
"SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 27 16:03:53 crc kubenswrapper[4707]: I1127 16:03:53.304519 4707 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 27 16:03:53 crc kubenswrapper[4707]: I1127 16:03:53.304667 4707 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 27 16:03:53 crc kubenswrapper[4707]: I1127 16:03:53.305625 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 27 16:03:53 crc kubenswrapper[4707]: I1127 16:03:53.306423 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:03:53 crc kubenswrapper[4707]: I1127 16:03:53.306452 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:03:53 crc kubenswrapper[4707]: I1127 16:03:53.306470 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:03:53 crc kubenswrapper[4707]: I1127 16:03:53.306518 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:03:53 crc kubenswrapper[4707]: I1127 16:03:53.306526 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:03:53 crc kubenswrapper[4707]: I1127 16:03:53.306548 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:03:54 crc kubenswrapper[4707]: I1127 16:03:54.308542 4707 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 27 16:03:54 crc kubenswrapper[4707]: I1127 16:03:54.310885 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Nov 27 16:03:54 crc kubenswrapper[4707]: I1127 16:03:54.310989 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:03:54 crc kubenswrapper[4707]: I1127 16:03:54.311013 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:03:54 crc kubenswrapper[4707]: I1127 16:03:54.984558 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Nov 27 16:03:54 crc kubenswrapper[4707]: I1127 16:03:54.985254 4707 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 27 16:03:54 crc kubenswrapper[4707]: I1127 16:03:54.987755 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:03:54 crc kubenswrapper[4707]: I1127 16:03:54.987918 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:03:54 crc kubenswrapper[4707]: I1127 16:03:54.987942 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:03:55 crc kubenswrapper[4707]: E1127 16:03:55.306143 4707 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Nov 27 16:03:55 crc kubenswrapper[4707]: I1127 16:03:55.692813 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 27 16:03:55 crc kubenswrapper[4707]: I1127 16:03:55.693142 4707 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 27 16:03:55 crc kubenswrapper[4707]: I1127 16:03:55.695221 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:03:55 crc 
kubenswrapper[4707]: I1127 16:03:55.695288 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:03:55 crc kubenswrapper[4707]: I1127 16:03:55.695308 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:03:56 crc kubenswrapper[4707]: I1127 16:03:56.850453 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 27 16:03:56 crc kubenswrapper[4707]: I1127 16:03:56.850730 4707 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 27 16:03:56 crc kubenswrapper[4707]: I1127 16:03:56.852932 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:03:56 crc kubenswrapper[4707]: I1127 16:03:56.853008 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:03:56 crc kubenswrapper[4707]: I1127 16:03:56.853037 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:03:56 crc kubenswrapper[4707]: I1127 16:03:56.856016 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 27 16:03:57 crc kubenswrapper[4707]: I1127 16:03:57.317350 4707 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 27 16:03:57 crc kubenswrapper[4707]: I1127 16:03:57.319152 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:03:57 crc kubenswrapper[4707]: I1127 16:03:57.319218 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:03:57 crc kubenswrapper[4707]: I1127 16:03:57.319237 4707 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:03:58 crc kubenswrapper[4707]: I1127 16:03:58.288878 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-etcd/etcd-crc" Nov 27 16:03:58 crc kubenswrapper[4707]: I1127 16:03:58.289243 4707 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 27 16:03:58 crc kubenswrapper[4707]: I1127 16:03:58.291286 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:03:58 crc kubenswrapper[4707]: I1127 16:03:58.291438 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:03:58 crc kubenswrapper[4707]: I1127 16:03:58.291465 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:03:58 crc kubenswrapper[4707]: E1127 16:03:58.904114 4707 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": net/http: TLS handshake timeout" event="&Event{ObjectMeta:{crc.187be89985b8b991 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-11-27 16:03:45.116797329 +0000 UTC m=+0.748246127,LastTimestamp:2025-11-27 16:03:45.116797329 +0000 UTC m=+0.748246127,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Nov 27 16:03:59 crc kubenswrapper[4707]: W1127 16:03:59.070049 4707 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get 
"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": net/http: TLS handshake timeout Nov 27 16:03:59 crc kubenswrapper[4707]: I1127 16:03:59.070247 4707 trace.go:236] Trace[1071779351]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (27-Nov-2025 16:03:49.068) (total time: 10002ms): Nov 27 16:03:59 crc kubenswrapper[4707]: Trace[1071779351]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": net/http: TLS handshake timeout 10002ms (16:03:59.070) Nov 27 16:03:59 crc kubenswrapper[4707]: Trace[1071779351]: [10.002172957s] [10.002172957s] END Nov 27 16:03:59 crc kubenswrapper[4707]: E1127 16:03:59.070293 4707 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError" Nov 27 16:03:59 crc kubenswrapper[4707]: I1127 16:03:59.118941 4707 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": net/http: TLS handshake timeout Nov 27 16:03:59 crc kubenswrapper[4707]: I1127 16:03:59.284316 4707 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="Get \"https://192.168.126.11:6443/livez\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Nov 27 16:03:59 crc kubenswrapper[4707]: I1127 16:03:59.284457 4707 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="Get 
\"https://192.168.126.11:6443/livez\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Nov 27 16:03:59 crc kubenswrapper[4707]: W1127 16:03:59.446080 4707 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": net/http: TLS handshake timeout Nov 27 16:03:59 crc kubenswrapper[4707]: I1127 16:03:59.446208 4707 trace.go:236] Trace[1676419008]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (27-Nov-2025 16:03:49.444) (total time: 10001ms): Nov 27 16:03:59 crc kubenswrapper[4707]: Trace[1676419008]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": net/http: TLS handshake timeout 10001ms (16:03:59.446) Nov 27 16:03:59 crc kubenswrapper[4707]: Trace[1676419008]: [10.00186617s] [10.00186617s] END Nov 27 16:03:59 crc kubenswrapper[4707]: E1127 16:03:59.446243 4707 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError" Nov 27 16:03:59 crc kubenswrapper[4707]: W1127 16:03:59.491339 4707 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": net/http: TLS handshake timeout Nov 27 16:03:59 crc kubenswrapper[4707]: I1127 16:03:59.491492 4707 trace.go:236] Trace[239417892]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (27-Nov-2025 16:03:49.490) (total time: 10001ms): Nov 27 16:03:59 crc kubenswrapper[4707]: Trace[239417892]: ---"Objects listed" 
error:Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": net/http: TLS handshake timeout 10000ms (16:03:59.491) Nov 27 16:03:59 crc kubenswrapper[4707]: Trace[239417892]: [10.001057468s] [10.001057468s] END Nov 27 16:03:59 crc kubenswrapper[4707]: E1127 16:03:59.491518 4707 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError" Nov 27 16:03:59 crc kubenswrapper[4707]: I1127 16:03:59.851041 4707 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Nov 27 16:03:59 crc kubenswrapper[4707]: I1127 16:03:59.851168 4707 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Nov 27 16:04:00 crc kubenswrapper[4707]: I1127 16:04:00.078879 4707 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Nov 27 16:04:00 crc 
kubenswrapper[4707]: I1127 16:04:00.078995 4707 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Nov 27 16:04:03 crc kubenswrapper[4707]: I1127 16:04:03.388936 4707 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Nov 27 16:04:04 crc kubenswrapper[4707]: I1127 16:04:04.290013 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 27 16:04:04 crc kubenswrapper[4707]: I1127 16:04:04.290319 4707 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 27 16:04:04 crc kubenswrapper[4707]: I1127 16:04:04.292043 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:04 crc kubenswrapper[4707]: I1127 16:04:04.292115 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:04 crc kubenswrapper[4707]: I1127 16:04:04.292135 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:04 crc kubenswrapper[4707]: I1127 16:04:04.297590 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 27 16:04:04 crc kubenswrapper[4707]: I1127 16:04:04.337237 4707 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 27 16:04:04 crc kubenswrapper[4707]: I1127 16:04:04.338623 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:04 crc kubenswrapper[4707]: I1127 16:04:04.338732 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Nov 27 16:04:04 crc kubenswrapper[4707]: I1127 16:04:04.338757 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:04 crc kubenswrapper[4707]: I1127 16:04:04.774948 4707 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Nov 27 16:04:04 crc kubenswrapper[4707]: I1127 16:04:04.952668 4707 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Nov 27 16:04:05 crc kubenswrapper[4707]: E1127 16:04:05.078245 4707 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": context deadline exceeded" interval="6.4s" Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.080309 4707 trace.go:236] Trace[1314093213]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (27-Nov-2025 16:03:51.904) (total time: 13175ms): Nov 27 16:04:05 crc kubenswrapper[4707]: Trace[1314093213]: ---"Objects listed" error: 13175ms (16:04:05.080) Nov 27 16:04:05 crc kubenswrapper[4707]: Trace[1314093213]: [13.175906658s] [13.175906658s] END Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.080330 4707 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.081317 4707 reconstruct.go:205] "DevicePaths of reconstructed volumes updated" Nov 27 16:04:05 crc kubenswrapper[4707]: E1127 16:04:05.083092 4707 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes \"crc\" is forbidden: autoscaling.openshift.io/ManagedNode infra config cache not synchronized" node="crc" Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.121012 4707 apiserver.go:52] "Watching apiserver" Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.123181 4707 
patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Liveness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:46950->192.168.126.11:17697: read: connection reset by peer" start-of-body= Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.123239 4707 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:46950->192.168.126.11:17697: read: connection reset by peer" Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.123301 4707 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:46964->192.168.126.11:17697: read: connection reset by peer" start-of-body= Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.123560 4707 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:46964->192.168.126.11:17697: read: connection reset by peer" Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.124322 4707 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" start-of-body= Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.124351 4707 prober.go:107] "Probe failed" 
probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.127884 4707 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.128211 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-network-console/networking-console-plugin-85b44fc459-gdk6g","openshift-network-diagnostics/network-check-source-55646444c4-trplf","openshift-network-diagnostics/network-check-target-xd92c","openshift-network-node-identity/network-node-identity-vrzqb","openshift-network-operator/iptables-alerter-4ln5h","openshift-network-operator/network-operator-58b4c7f79c-55gtf"] Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.129010 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.129057 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.129108 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.129121 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Nov 27 16:04:05 crc kubenswrapper[4707]: E1127 16:04:05.129789 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.129593 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 27 16:04:05 crc kubenswrapper[4707]: E1127 16:04:05.129682 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.129265 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Nov 27 16:04:05 crc kubenswrapper[4707]: E1127 16:04:05.129861 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.131970 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.132149 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.132372 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.132920 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.133218 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.133433 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.133279 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.133296 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.134046 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.161288 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.175204 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.191929 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.220145 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:05Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.229133 4707 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.239132 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:05Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.250503 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.280253 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.282561 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.282611 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.282638 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.282663 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.282685 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.282708 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.282732 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Nov 27 16:04:05 crc 
kubenswrapper[4707]: I1127 16:04:05.282755 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.282779 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.282801 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.282822 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.282843 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.282865 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.282889 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.282916 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") pod \"49ef4625-1d3a-4a9f-b595-c2433d32326d\" (UID: \"49ef4625-1d3a-4a9f-b595-c2433d32326d\") " Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.282940 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.282964 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.282988 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: 
\"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.283015 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.283039 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.283062 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.283052 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" (OuterVolumeSpecName: "kube-api-access-tk88c") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "kube-api-access-tk88c". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.283102 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.283196 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.283222 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.283241 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.283262 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.283282 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started 
for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.283303 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.283319 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.283335 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.283352 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.283385 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: 
\"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.283404 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.283423 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.283440 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.283459 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.283477 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.283499 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.283516 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.283534 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.283552 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.283570 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.283591 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: 
\"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.283610 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.283628 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.283648 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.283668 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.283685 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.283678 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" (OuterVolumeSpecName: "multus-daemon-config") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "multus-daemon-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.283701 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.283717 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.283819 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.283839 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.283860 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbsvg\" (UniqueName: 
\"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.283876 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.283893 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.283910 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.283928 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.283927 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). 
InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.283945 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.283965 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.283980 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.284000 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.284018 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.284033 4707 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.284051 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.284066 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.284081 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.284097 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.284111 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: 
\"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.284110 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" (OuterVolumeSpecName: "kube-api-access-bf2bz") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "kube-api-access-bf2bz". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.284127 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.284144 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.284169 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.284187 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.284207 4707 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.284225 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.284241 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.284256 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.284271 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.284285 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") pod 
\"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.284300 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.284315 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.284331 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.284347 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.284368 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.284396 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.284420 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.284437 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.284452 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.284468 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.284486 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: 
\"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.284506 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.284528 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.284552 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.284567 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.284625 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-serving-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.284763 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.284816 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" (OuterVolumeSpecName: "kube-api-access-fqsjt") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "kube-api-access-fqsjt". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.284856 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" (OuterVolumeSpecName: "etcd-service-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.284974 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.285008 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.285145 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.285296 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" (OuterVolumeSpecName: "kube-api-access-8tdtz") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "kube-api-access-8tdtz". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.285306 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.285480 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" (OuterVolumeSpecName: "kube-api-access-pcxfs") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "kube-api-access-pcxfs". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.285494 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" (OuterVolumeSpecName: "kube-api-access-xcgwh") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "kube-api-access-xcgwh". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.285503 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" (OuterVolumeSpecName: "images") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.285550 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" (OuterVolumeSpecName: "kube-api-access-xcphl") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "kube-api-access-xcphl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 16:04:05 crc kubenswrapper[4707]: E1127 16:04:05.285612 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-27 16:04:05.785580808 +0000 UTC m=+21.417029576 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.285644 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" (OuterVolumeSpecName: "kube-api-access-jkwtn") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "kube-api-access-jkwtn". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.285711 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" (OuterVolumeSpecName: "kube-api-access-x4zgh") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "kube-api-access-x4zgh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.285756 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" (OuterVolumeSpecName: "kube-api-access-d6qdx") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "kube-api-access-d6qdx". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.285929 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" (OuterVolumeSpecName: "certs") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.285935 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.286022 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "srv-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.286138 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" (OuterVolumeSpecName: "kube-api-access-v47cf") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "kube-api-access-v47cf". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.286124 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" (OuterVolumeSpecName: "kube-api-access-pjr6v") pod "49ef4625-1d3a-4a9f-b595-c2433d32326d" (UID: "49ef4625-1d3a-4a9f-b595-c2433d32326d"). InnerVolumeSpecName "kube-api-access-pjr6v". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.288195 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" (OuterVolumeSpecName: "kube-api-access-w4xd4") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "kube-api-access-w4xd4". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.286620 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-error". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.286653 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" (OuterVolumeSpecName: "etcd-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.286696 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.286723 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" (OuterVolumeSpecName: "kube-api-access-dbsvg") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "kube-api-access-dbsvg". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.286797 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" (OuterVolumeSpecName: "utilities") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.288284 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.286816 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" (OuterVolumeSpecName: "service-ca") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.286933 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.286957 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" (OuterVolumeSpecName: "config") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.287031 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" (OuterVolumeSpecName: "signing-key") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.287089 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" (OuterVolumeSpecName: "samples-operator-tls") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "samples-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.287132 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.287204 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "metrics-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.287265 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.287359 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.287461 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.287560 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" (OuterVolumeSpecName: "kube-api-access-fcqwp") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "kube-api-access-fcqwp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.287587 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" (OuterVolumeSpecName: "images") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.287932 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" (OuterVolumeSpecName: "kube-api-access-mnrrd") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "kube-api-access-mnrrd". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.287976 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" (OuterVolumeSpecName: "kube-api-access-x7zkh") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "kube-api-access-x7zkh". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.288693 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "srv-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.288778 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" (OuterVolumeSpecName: "utilities") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.288940 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" (OuterVolumeSpecName: "kube-api-access-kfwg7") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "kube-api-access-kfwg7". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.289028 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.289186 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" (OuterVolumeSpecName: "service-ca") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.289630 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" (OuterVolumeSpecName: "signing-cabundle") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-cabundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.289633 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.289693 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.289648 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.289898 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.289926 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.290134 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.290348 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.290630 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" (OuterVolumeSpecName: "kube-api-access-pj782") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "kube-api-access-pj782". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.291479 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" (OuterVolumeSpecName: "kube-api-access-w7l8j") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "kube-api-access-w7l8j". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.291747 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.291784 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" (OuterVolumeSpecName: "package-server-manager-serving-cert") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "package-server-manager-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.291991 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.292298 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.292673 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.292702 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.292722 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: 
\"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.292742 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.292758 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" (OuterVolumeSpecName: "config") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.292761 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.292802 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" (OuterVolumeSpecName: "config-volume") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.292831 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.292861 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") pod \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\" (UID: \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\") " Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.292882 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.292918 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.292939 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.292958 4707 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.292978 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.292996 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.293009 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.293031 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.293031 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" (OuterVolumeSpecName: "machine-api-operator-tls") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "machine-api-operator-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.293054 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") pod \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\" (UID: \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\") " Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.293076 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.293095 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.293115 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.293138 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.293184 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.293205 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.293238 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.293259 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.293265 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-cliconfig". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.293271 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.293280 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.293357 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.293413 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.293470 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.293502 4707 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.293545 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.293574 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.293599 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.293626 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.293628 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" (OuterVolumeSpecName: "node-bootstrap-token") pod 
"5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "node-bootstrap-token". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.293652 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.293682 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.293710 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.293735 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.293762 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.293790 
4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.293829 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.293853 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.293877 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.293898 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" (OuterVolumeSpecName: "kube-api-access-2d4wz") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "kube-api-access-2d4wz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.293905 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.293935 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.293947 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" (OuterVolumeSpecName: "kube-api-access-lz9wn") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "kube-api-access-lz9wn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.293964 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.293991 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.294014 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.294039 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") pod \"44663579-783b-4372-86d6-acf235a62d72\" (UID: \"44663579-783b-4372-86d6-acf235a62d72\") " Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.294063 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.294088 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tmpfs\" (UniqueName: 
\"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.294114 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.294137 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.294179 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.294203 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.294227 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Nov 27 16:04:05 crc 
kubenswrapper[4707]: I1127 16:04:05.294253 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.294275 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.294298 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.294335 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.294361 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.293791 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" 
(OuterVolumeSpecName: "config") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.294417 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.294445 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.294471 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.294492 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" (OuterVolumeSpecName: "cert") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.294493 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.294618 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.294762 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.294782 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.294967 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.295034 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" (OuterVolumeSpecName: "kube-api-access-jhbk2") pod "bd23aa5c-e532-4e53-bccf-e79f130c5ae8" (UID: "bd23aa5c-e532-4e53-bccf-e79f130c5ae8"). InnerVolumeSpecName "kube-api-access-jhbk2". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.294995 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.295103 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.295131 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.295157 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.295188 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.295240 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.295267 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.295294 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.295321 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.295328 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" (OuterVolumeSpecName: "config") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.295339 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" (OuterVolumeSpecName: "serviceca") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "serviceca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.295348 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.295412 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.295617 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.295638 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 
16:04:05.295656 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.295997 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.295996 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.296020 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" (OuterVolumeSpecName: "control-plane-machine-set-operator-tls") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "control-plane-machine-set-operator-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.296044 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.296094 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.296127 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" (OuterVolumeSpecName: "kube-api-access-7c4vf") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "kube-api-access-7c4vf". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.296339 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.296398 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.296512 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" (OuterVolumeSpecName: "console-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.296711 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" (OuterVolumeSpecName: "audit") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "audit". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.296853 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.296910 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.296909 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.296979 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.297181 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" (OuterVolumeSpecName: "kube-api-access-s4n52") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "kube-api-access-s4n52". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.297297 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" (OuterVolumeSpecName: "kube-api-access-sb6h7") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "kube-api-access-sb6h7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.297451 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.297549 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.297841 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" (OuterVolumeSpecName: "config") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.298106 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.298116 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" (OuterVolumeSpecName: "kube-api-access-279lb") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "kube-api-access-279lb". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.298635 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.298680 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" (OuterVolumeSpecName: "kube-api-access-6ccd8") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "kube-api-access-6ccd8". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.298799 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" (OuterVolumeSpecName: "kube-api-access-d4lsv") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "kube-api-access-d4lsv". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.298951 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "webhook-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.299022 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" (OuterVolumeSpecName: "kube-api-access-w9rds") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "kube-api-access-w9rds". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.299345 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" (OuterVolumeSpecName: "config") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.299527 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.300695 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" (OuterVolumeSpecName: "kube-api-access-zgdk5") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "kube-api-access-zgdk5". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.301062 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" (OuterVolumeSpecName: "config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.301443 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.301936 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.306153 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.309081 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" (OuterVolumeSpecName: "mcd-auth-proxy-config") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "mcd-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.309318 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" (OuterVolumeSpecName: "tmpfs") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "tmpfs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.310026 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" (OuterVolumeSpecName: "kube-api-access-lzf88") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "kube-api-access-lzf88". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.310087 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" (OuterVolumeSpecName: "config") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.310309 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") "
Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.310434 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") "
Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.310598 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") "
Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.310703 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") "
Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.310802 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") "
Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.310916 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") "
Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.311023 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") "
Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.311132 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") "
Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.311232 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") "
Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.311333 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") "
Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.311465 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") "
Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.311574 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") "
Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.311685 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") "
Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.311849 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") "
Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.311943 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") "
Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.312044 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") "
Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.312144 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") "
Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.312247 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") "
Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.312348 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.312475 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") "
Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.312621 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.312735 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c"
Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.312851 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.312959 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf"
Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.313056 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb"
Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.313163 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb"
Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.313263 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb"
Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.313367 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.313506 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h"
Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.313613 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf"
Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.313711 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb"
Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.313815 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h"
Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.313920 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h"
Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.314024 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf"
Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.314237 4707 reconciler_common.go:293] "Volume detached for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") on node \"crc\" DevicePath \"\""
Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.314331 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") on node \"crc\" DevicePath \"\""
Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.314432 4707 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") on node \"crc\" DevicePath \"\""
Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.314519 4707 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") on node \"crc\" DevicePath \"\""
Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.314608 4707 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") on node \"crc\" DevicePath \"\""
Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.314693 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") on node \"crc\" DevicePath \"\""
Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.314772 4707 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") on node \"crc\" DevicePath \"\""
Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.314856 4707 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") on node \"crc\" DevicePath \"\""
Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.314937 4707 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") on node \"crc\" DevicePath \"\""
Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.315015 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") on node \"crc\" DevicePath \"\""
Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.315096 4707 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") on node \"crc\" DevicePath \"\""
Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.315169 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") on node \"crc\" DevicePath \"\""
Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.315246 4707 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") on node \"crc\" DevicePath \"\""
Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.315341 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") on node \"crc\" DevicePath \"\""
Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.319615 4707 reconciler_common.go:293] "Volume detached for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") on node \"crc\" DevicePath \"\""
Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.319721 4707 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") on node \"crc\" DevicePath \"\""
Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.319782 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") on node \"crc\" DevicePath \"\""
Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.319842 4707 reconciler_common.go:293] "Volume detached for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") on node \"crc\" DevicePath \"\""
Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.319900 4707 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") on node \"crc\" DevicePath \"\""
Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.319959 4707 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") on node \"crc\" DevicePath \"\""
Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.320012 4707 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") on node \"crc\" DevicePath \"\""
Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.320069 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") on node \"crc\" DevicePath \"\""
Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.320131 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") on node \"crc\" DevicePath \"\""
Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.320188 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") on node \"crc\" DevicePath \"\""
Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.320249 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") on node \"crc\" DevicePath \"\""
Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.320309 4707 reconciler_common.go:293] "Volume detached for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") on node \"crc\" DevicePath \"\""
Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.320386 4707 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\""
Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.320452 4707 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") on node \"crc\" DevicePath \"\""
Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.320517 4707 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") on node \"crc\" DevicePath \"\""
Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.320571 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") on node \"crc\" DevicePath \"\""
Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.320628 4707 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\""
Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.320687 4707 reconciler_common.go:293] "Volume detached for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") on node \"crc\" DevicePath \"\""
Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.320749 4707 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") on node \"crc\" DevicePath \"\""
Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.320816 4707 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\""
Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.320881 4707 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") on node \"crc\" DevicePath \"\""
Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.320938 4707 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") on node \"crc\" DevicePath \"\""
Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.320993 4707 reconciler_common.go:293] "Volume detached for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") on node \"crc\" DevicePath \"\""
Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.321048 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") on node \"crc\" DevicePath \"\""
Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.321106 4707 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") on node \"crc\" DevicePath \"\""
Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.321163 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") on node \"crc\" DevicePath \"\""
Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.321219 4707 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") on node \"crc\" DevicePath \"\""
Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.321271 4707 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") on node \"crc\" DevicePath \"\""
Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.321322 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") on node \"crc\" DevicePath \"\""
Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.321401 4707 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") on node \"crc\" DevicePath \"\""
Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.321536 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") on node \"crc\" DevicePath \"\""
Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.321641 4707 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") on node \"crc\" DevicePath \"\""
Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.321728 4707 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\""
Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.321794 4707 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") on node \"crc\" DevicePath \"\""
Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.321848 4707 reconciler_common.go:293] "Volume detached for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") on node \"crc\" DevicePath \"\""
Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.321905 4707 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") on node \"crc\" DevicePath \"\""
Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.321960 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") on node \"crc\" DevicePath \"\""
Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.322012 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") on node \"crc\" DevicePath \"\""
Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.322070 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") on node \"crc\" DevicePath \"\""
Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.322124 4707 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") on node \"crc\" DevicePath \"\""
Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.322189 4707 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") on node \"crc\" DevicePath \"\""
Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.322241 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") on node \"crc\" DevicePath \"\""
Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.322297 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") on node \"crc\" DevicePath \"\""
Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.322348 4707 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") on
node \"crc\" DevicePath \"\"" Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.322440 4707 reconciler_common.go:293] "Volume detached for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") on node \"crc\" DevicePath \"\"" Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.322506 4707 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.322566 4707 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.322619 4707 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") on node \"crc\" DevicePath \"\"" Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.322679 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") on node \"crc\" DevicePath \"\"" Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.322742 4707 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") on node \"crc\" DevicePath \"\"" Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.322810 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") on node \"crc\" DevicePath \"\"" Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 
16:04:05.322867 4707 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.322925 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") on node \"crc\" DevicePath \"\"" Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.322978 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") on node \"crc\" DevicePath \"\"" Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.323028 4707 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.323086 4707 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") on node \"crc\" DevicePath \"\"" Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.323140 4707 reconciler_common.go:293] "Volume detached for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") on node \"crc\" DevicePath \"\"" Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.323263 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") on node \"crc\" DevicePath \"\"" Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.323318 4707 reconciler_common.go:293] "Volume detached 
for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") on node \"crc\" DevicePath \"\"" Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.323390 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") on node \"crc\" DevicePath \"\"" Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.323447 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") on node \"crc\" DevicePath \"\"" Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.323498 4707 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") on node \"crc\" DevicePath \"\"" Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.323587 4707 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") on node \"crc\" DevicePath \"\"" Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.323646 4707 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") on node \"crc\" DevicePath \"\"" Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.323705 4707 reconciler_common.go:293] "Volume detached for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") on node \"crc\" DevicePath \"\"" Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.323762 4707 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: 
\"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") on node \"crc\" DevicePath \"\"" Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.323822 4707 reconciler_common.go:293] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") on node \"crc\" DevicePath \"\"" Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.323874 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") on node \"crc\" DevicePath \"\"" Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.323925 4707 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.323985 4707 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.324041 4707 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") on node \"crc\" DevicePath \"\"" Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.324092 4707 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") on node \"crc\" DevicePath \"\"" Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.324144 4707 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 
16:04:05.324198 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") on node \"crc\" DevicePath \"\"" Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.324254 4707 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.324308 4707 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") on node \"crc\" DevicePath \"\"" Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.324363 4707 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.324435 4707 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") on node \"crc\" DevicePath \"\"" Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.324497 4707 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") on node \"crc\" DevicePath \"\"" Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.324553 4707 reconciler_common.go:293] "Volume detached for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") on node \"crc\" DevicePath \"\"" Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.324613 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pj782\" (UniqueName: 
\"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") on node \"crc\" DevicePath \"\"" Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.324668 4707 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") on node \"crc\" DevicePath \"\"" Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.324737 4707 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.324793 4707 reconciler_common.go:293] "Volume detached for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") on node \"crc\" DevicePath \"\"" Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.324844 4707 reconciler_common.go:293] "Volume detached for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") on node \"crc\" DevicePath \"\"" Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.324898 4707 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") on node \"crc\" DevicePath \"\"" Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.324959 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") on node \"crc\" DevicePath \"\"" Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.325016 4707 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: 
\"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.325072 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") on node \"crc\" DevicePath \"\"" Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.325124 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") on node \"crc\" DevicePath \"\"" Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.325180 4707 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") on node \"crc\" DevicePath \"\"" Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.325238 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") on node \"crc\" DevicePath \"\"" Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.325292 4707 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.325347 4707 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") on node \"crc\" DevicePath \"\"" Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.325419 4707 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: 
\"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") on node \"crc\" DevicePath \"\"" Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.325477 4707 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") on node \"crc\" DevicePath \"\"" Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.325538 4707 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") on node \"crc\" DevicePath \"\"" Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.325599 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") on node \"crc\" DevicePath \"\"" Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.325655 4707 reconciler_common.go:293] "Volume detached for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") on node \"crc\" DevicePath \"\"" Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.325711 4707 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") on node \"crc\" DevicePath \"\"" Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.325762 4707 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") on node \"crc\" DevicePath \"\"" Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.320144 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.310355 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.310696 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" (OuterVolumeSpecName: "kube-api-access-gf66m") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "kube-api-access-gf66m". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.310729 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-sysctl-allowlist". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.310910 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.310917 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" (OuterVolumeSpecName: "kube-api-access-zkvpv") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "kube-api-access-zkvpv". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.311065 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" (OuterVolumeSpecName: "config") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.311073 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" (OuterVolumeSpecName: "kube-api-access-nzwt7") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "kube-api-access-nzwt7". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.311144 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" (OuterVolumeSpecName: "kube-api-access-249nr") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "kube-api-access-249nr". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.311769 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" (OuterVolumeSpecName: "kube-api-access-x2m85") pod "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" (UID: "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d"). InnerVolumeSpecName "kube-api-access-x2m85". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.314302 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" (OuterVolumeSpecName: "kube-api-access-6g6sz") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "kube-api-access-6g6sz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.315566 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.316532 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.321484 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.326669 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" (OuterVolumeSpecName: "webhook-certs") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "webhook-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.326848 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" (OuterVolumeSpecName: "kube-api-access-qs4fp") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "kube-api-access-qs4fp". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.326910 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" (OuterVolumeSpecName: "config") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.329444 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.330608 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.330855 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.331493 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" (OuterVolumeSpecName: "client-ca") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.332049 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.332294 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" (OuterVolumeSpecName: "config") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.332357 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.332457 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" (OuterVolumeSpecName: "kube-api-access-htfz6") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "kube-api-access-htfz6". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.332653 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.332754 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" (OuterVolumeSpecName: "ovn-control-plane-metrics-cert") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovn-control-plane-metrics-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.332914 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" (OuterVolumeSpecName: "kube-api-access-2w9zh") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "kube-api-access-2w9zh". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.333488 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.333720 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.333918 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "env-overrides". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.334052 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.334279 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.335162 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" (OuterVolumeSpecName: "config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.336034 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" (OuterVolumeSpecName: "image-import-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "image-import-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 16:04:05 crc kubenswrapper[4707]: E1127 16:04:05.336591 4707 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Nov 27 16:04:05 crc kubenswrapper[4707]: E1127 16:04:05.336660 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-11-27 16:04:05.836639187 +0000 UTC m=+21.468087955 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.337625 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "audit-policies". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.337646 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.337818 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" (OuterVolumeSpecName: "config") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.337978 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.338326 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" (OuterVolumeSpecName: "kube-api-access-9xfj7") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "kube-api-access-9xfj7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.338712 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" (OuterVolumeSpecName: "utilities") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.339148 4707 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.342359 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" (OuterVolumeSpecName: "image-registry-operator-tls") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "image-registry-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.342418 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.348983 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.351347 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" (OuterVolumeSpecName: "available-featuregates") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "available-featuregates". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.352656 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.355033 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "profile-collector-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.355362 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" (OuterVolumeSpecName: "utilities") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.355775 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" (OuterVolumeSpecName: "config") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 16:04:05 crc kubenswrapper[4707]: E1127 16:04:05.356357 4707 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Nov 27 16:04:05 crc kubenswrapper[4707]: E1127 16:04:05.356496 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-11-27 16:04:05.85647576 +0000 UTC m=+21.487924528 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.356784 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.357706 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" (OuterVolumeSpecName: "kube-api-access-cfbct") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "kube-api-access-cfbct". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.362895 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.363348 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:05Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.364531 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.367801 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.359316 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.359501 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" (OuterVolumeSpecName: "mcc-auth-proxy-config") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "mcc-auth-proxy-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.361681 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" (OuterVolumeSpecName: "stats-auth") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "stats-auth". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.361876 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" (OuterVolumeSpecName: "default-certificate") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "default-certificate". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.361977 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.362436 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "bound-sa-token". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.362777 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" (OuterVolumeSpecName: "kube-api-access-vt5rc") pod "44663579-783b-4372-86d6-acf235a62d72" (UID: "44663579-783b-4372-86d6-acf235a62d72"). InnerVolumeSpecName "kube-api-access-vt5rc". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.362831 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.362843 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" (OuterVolumeSpecName: "kube-api-access-4d4hj") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "kube-api-access-4d4hj". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.363060 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" (OuterVolumeSpecName: "kube-api-access-rnphk") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "kube-api-access-rnphk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.363609 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" (OuterVolumeSpecName: "kube-api-access-mg5zb") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "kube-api-access-mg5zb". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.365563 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.367140 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" (OuterVolumeSpecName: "kube-api-access-ngvvp") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "kube-api-access-ngvvp". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.367592 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-router-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.367744 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.367803 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.369493 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.373354 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" (OuterVolumeSpecName: "config") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.373578 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" (OuterVolumeSpecName: "kube-api-access-wxkg8") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "kube-api-access-wxkg8". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.373690 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "apiservice-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.374180 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.374198 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" (OuterVolumeSpecName: "kube-api-access-qg5z5") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "kube-api-access-qg5z5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.377633 4707 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="4f12d0d89698cd6f702c595882dc7ed97fbc585b8a4ef3a8b3755c984a823dd9" exitCode=255 Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.377683 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"4f12d0d89698cd6f702c595882dc7ed97fbc585b8a4ef3a8b3755c984a823dd9"} Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.387756 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" (OuterVolumeSpecName: "machine-approver-tls") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "machine-approver-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.403993 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "cni-binary-copy". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 16:04:05 crc kubenswrapper[4707]: E1127 16:04:05.405138 4707 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Nov 27 16:04:05 crc kubenswrapper[4707]: E1127 16:04:05.405164 4707 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Nov 27 16:04:05 crc kubenswrapper[4707]: E1127 16:04:05.405177 4707 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 27 16:04:05 crc kubenswrapper[4707]: E1127 16:04:05.405251 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-11-27 16:04:05.905229932 +0000 UTC m=+21.536678700 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.405500 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.405740 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.406508 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" (OuterVolumeSpecName: "client-ca") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 16:04:05 crc kubenswrapper[4707]: E1127 16:04:05.407080 4707 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Nov 27 16:04:05 crc kubenswrapper[4707]: E1127 16:04:05.407160 4707 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Nov 27 16:04:05 crc kubenswrapper[4707]: E1127 16:04:05.407201 4707 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 27 16:04:05 crc kubenswrapper[4707]: E1127 16:04:05.407320 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-11-27 16:04:05.907249892 +0000 UTC m=+21.538698650 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.417902 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.420248 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.420881 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.426219 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: 
"a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.426661 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.426760 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.427039 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.427209 4707 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") on node \"crc\" DevicePath \"\"" Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.427226 4707 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.427198 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.427247 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") on node \"crc\" DevicePath \"\"" Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.427270 4707 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") on node \"crc\" DevicePath \"\"" Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.427280 4707 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.427291 4707 reconciler_common.go:293] "Volume detached for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") on node \"crc\" DevicePath \"\"" Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.427306 4707 reconciler_common.go:293] "Volume detached for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") on node \"crc\" DevicePath \"\"" Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.427316 4707 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.427328 
4707 reconciler_common.go:293] "Volume detached for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") on node \"crc\" DevicePath \"\"" Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.427338 4707 reconciler_common.go:293] "Volume detached for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.427357 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") on node \"crc\" DevicePath \"\"" Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.427370 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") on node \"crc\" DevicePath \"\"" Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.427394 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") on node \"crc\" DevicePath \"\"" Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.427406 4707 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.427419 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") on node \"crc\" DevicePath \"\"" Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.427437 4707 reconciler_common.go:293] 
"Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") on node \"crc\" DevicePath \"\"" Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.427445 4707 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.427457 4707 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") on node \"crc\" DevicePath \"\"" Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.427466 4707 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.427491 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") on node \"crc\" DevicePath \"\"" Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.427502 4707 reconciler_common.go:293] "Volume detached for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") on node \"crc\" DevicePath \"\"" Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.427513 4707 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.427522 4707 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: 
\"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.427532 4707 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.427541 4707 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") on node \"crc\" DevicePath \"\"" Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.427552 4707 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") on node \"crc\" DevicePath \"\"" Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.427561 4707 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.429015 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.429123 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") on node \"crc\" DevicePath \"\"" Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.429196 4707 reconciler_common.go:293] "Volume detached for volume 
\"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.429257 4707 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.429313 4707 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") on node \"crc\" DevicePath \"\"" Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.429373 4707 reconciler_common.go:293] "Volume detached for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") on node \"crc\" DevicePath \"\"" Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.429464 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") on node \"crc\" DevicePath \"\"" Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.429522 4707 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") on node \"crc\" DevicePath \"\"" Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.429578 4707 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") on node \"crc\" DevicePath \"\"" Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.429574 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" (OuterVolumeSpecName: 
"console-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.429788 4707 reconciler_common.go:293] "Volume detached for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") on node \"crc\" DevicePath \"\"" Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.429826 4707 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") on node \"crc\" DevicePath \"\"" Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.429839 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") on node \"crc\" DevicePath \"\"" Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.429850 4707 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") on node \"crc\" DevicePath \"\"" Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.429861 4707 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") on node \"crc\" DevicePath \"\"" Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.429872 4707 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") on node \"crc\" DevicePath \"\"" Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.429883 4707 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.429893 4707 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") on node \"crc\" DevicePath \"\"" Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.429903 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") on node \"crc\" DevicePath \"\"" Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.429913 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") on node \"crc\" DevicePath \"\"" Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.429922 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") on node \"crc\" DevicePath \"\"" Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.430124 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") on node \"crc\" DevicePath \"\"" Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.430137 4707 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") on node \"crc\" DevicePath \"\"" Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.430149 4707 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: 
\"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.430160 4707 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") on node \"crc\" DevicePath \"\"" Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.430169 4707 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") on node \"crc\" DevicePath \"\"" Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.430178 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") on node \"crc\" DevicePath \"\"" Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.430189 4707 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.430198 4707 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") on node \"crc\" DevicePath \"\"" Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.430209 4707 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") on node \"crc\" DevicePath \"\"" Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.430218 4707 reconciler_common.go:293] "Volume detached for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") on node \"crc\" DevicePath \"\"" Nov 27 16:04:05 crc 
kubenswrapper[4707]: I1127 16:04:05.430227 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") on node \"crc\" DevicePath \"\"" Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.430243 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") on node \"crc\" DevicePath \"\"" Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.430253 4707 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") on node \"crc\" DevicePath \"\"" Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.430264 4707 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.430274 4707 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.430283 4707 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") on node \"crc\" DevicePath \"\"" Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.430293 4707 reconciler_common.go:293] "Volume detached for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") on node \"crc\" DevicePath \"\"" Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.430302 4707 reconciler_common.go:293] "Volume 
detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.430312 4707 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.430321 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") on node \"crc\" DevicePath \"\"" Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.430330 4707 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.430340 4707 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.430352 4707 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") on node \"crc\" DevicePath \"\"" Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.430362 4707 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.430375 4707 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: 
\"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\"" Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.430407 4707 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") on node \"crc\" DevicePath \"\"" Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.430417 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") on node \"crc\" DevicePath \"\"" Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.430426 4707 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") on node \"crc\" DevicePath \"\"" Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.430435 4707 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") on node \"crc\" DevicePath \"\"" Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.445552 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.445906 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:05Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.466114 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.466684 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.472570 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.473705 4707 scope.go:117] "RemoveContainer" containerID="4f12d0d89698cd6f702c595882dc7ed97fbc585b8a4ef3a8b3755c984a823dd9" Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.475948 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.476699 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.482318 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 27 16:04:05 crc kubenswrapper[4707]: W1127 16:04:05.482460 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod37a5e44f_9a88_4405_be8a_b645485e7312.slice/crio-43836eee88fdcc8fb81802d64cd1fa8b39dee54190b1a368cf46de7c1654614a WatchSource:0}: Error finding container 43836eee88fdcc8fb81802d64cd1fa8b39dee54190b1a368cf46de7c1654614a: Status 404 returned error can't find the container with id 
43836eee88fdcc8fb81802d64cd1fa8b39dee54190b1a368cf46de7c1654614a Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.483608 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 27 16:04:05 crc kubenswrapper[4707]: W1127 16:04:05.491356 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd75a4c96_2883_4a0b_bab2_0fab2b6c0b49.slice/crio-c3e72aacb4c1bad5e8209e2b3f15ffbfa9dd4908123b8eba085ece279ff4286d WatchSource:0}: Error finding container c3e72aacb4c1bad5e8209e2b3f15ffbfa9dd4908123b8eba085ece279ff4286d: Status 404 returned error can't find the container with id c3e72aacb4c1bad5e8209e2b3f15ffbfa9dd4908123b8eba085ece279ff4286d Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.505570 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:05Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.523846 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:05Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.531072 4707 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.531110 4707 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.531123 4707 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.531136 4707 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.534466 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:05Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.548675 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f300806f-b97e-4453-b011-19442ca1240d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a25770e0e3eebd7e807f3775101a1cd5d47073e4a86850da475fecd4a9c88ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7412dcdb5a249551f558a01a8fecef5f873e5e0a16e917d5cbc1ca74a917f228\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://791f332aec1698d02da5ec585ad2fbfcb77529f24abc0e3a77581be38d7ce277\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f12d0d89698cd6f702c595882dc7ed97fbc585b8a4ef3a8b3755c984a823dd9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f12d0d89698cd6f702c595882dc7ed97fbc585b8a4ef3a8b3755c984a823dd9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-27T16:04:05Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1127 16:03:58.832133 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1127 16:03:58.833970 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3721107874/tls.crt::/tmp/serving-cert-3721107874/tls.key\\\\\\\"\\\\nI1127 16:04:05.094602 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1127 16:04:05.099813 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1127 16:04:05.099850 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1127 16:04:05.099879 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1127 16:04:05.099886 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1127 16:04:05.108199 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1127 16:04:05.108225 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1127 16:04:05.108231 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1127 16:04:05.108269 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' 
detected.\\\\nW1127 16:04:05.108276 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1127 16:04:05.108281 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1127 16:04:05.108284 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1127 16:04:05.108289 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1127 16:04:05.114674 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-27T16:03:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf16f5508845d69184573b51f3052f79123556667cbd3ddf4b2adfb135312a3a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:48Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://608ad63f8f5352dad650253df603590a1cc20bc05d86dd561d659295264ab4b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4
\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://608ad63f8f5352dad650253df603590a1cc20bc05d86dd561d659295264ab4b5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:03:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:03:45Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.571138 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.582568 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.592947 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.610106 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:05Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.835990 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 27 16:04:05 crc kubenswrapper[4707]: E1127 16:04:05.836225 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-27 16:04:06.836189122 +0000 UTC m=+22.467637890 (durationBeforeRetry 1s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.936418 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.936463 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.936481 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 27 16:04:05 crc kubenswrapper[4707]: I1127 16:04:05.936503 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: 
\"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 27 16:04:05 crc kubenswrapper[4707]: E1127 16:04:05.936623 4707 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Nov 27 16:04:05 crc kubenswrapper[4707]: E1127 16:04:05.936686 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-11-27 16:04:06.936671279 +0000 UTC m=+22.568120037 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Nov 27 16:04:05 crc kubenswrapper[4707]: E1127 16:04:05.936684 4707 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Nov 27 16:04:05 crc kubenswrapper[4707]: E1127 16:04:05.936717 4707 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Nov 27 16:04:05 crc kubenswrapper[4707]: E1127 16:04:05.936765 4707 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Nov 27 16:04:05 crc 
kubenswrapper[4707]: E1127 16:04:05.936816 4707 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Nov 27 16:04:05 crc kubenswrapper[4707]: E1127 16:04:05.936836 4707 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 27 16:04:05 crc kubenswrapper[4707]: E1127 16:04:05.936730 4707 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Nov 27 16:04:05 crc kubenswrapper[4707]: E1127 16:04:05.936867 4707 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 27 16:04:05 crc kubenswrapper[4707]: E1127 16:04:05.936836 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-11-27 16:04:06.936813713 +0000 UTC m=+22.568262481 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Nov 27 16:04:05 crc kubenswrapper[4707]: E1127 16:04:05.937030 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-11-27 16:04:06.936972526 +0000 UTC m=+22.568421304 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 27 16:04:05 crc kubenswrapper[4707]: E1127 16:04:05.937066 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-11-27 16:04:06.937054239 +0000 UTC m=+22.568503107 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 27 16:04:06 crc kubenswrapper[4707]: I1127 16:04:06.381553 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"c3e72aacb4c1bad5e8209e2b3f15ffbfa9dd4908123b8eba085ece279ff4286d"} Nov 27 16:04:06 crc kubenswrapper[4707]: I1127 16:04:06.382963 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"57225b64e7671f397832e9ba112febbde78ae4c24b37a8f097d4bdeda8931c0f"} Nov 27 16:04:06 crc kubenswrapper[4707]: I1127 16:04:06.382994 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"6960cfa0c6d0dedc432f90bf267ce02fe8264924a43865f79eadd9affbe955f4"} Nov 27 16:04:06 crc kubenswrapper[4707]: I1127 16:04:06.383009 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"af7a7c7c58321d65f7b7fde44196d1bbbaed6619b29124d2f52f52c80dc7fa89"} Nov 27 16:04:06 crc kubenswrapper[4707]: I1127 16:04:06.384303 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" 
event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"b00fa8691d479d1db194704ff73bb85f1108c9541dd09ed4c1b02d0a3f0ab53d"} Nov 27 16:04:06 crc kubenswrapper[4707]: I1127 16:04:06.384404 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"43836eee88fdcc8fb81802d64cd1fa8b39dee54190b1a368cf46de7c1654614a"} Nov 27 16:04:06 crc kubenswrapper[4707]: I1127 16:04:06.388015 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Nov 27 16:04:06 crc kubenswrapper[4707]: I1127 16:04:06.390221 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"4a47eb0592819f0bc514399e163c88e2c63111aa0ec4a28f00f809ce1bdf2aa8"} Nov 27 16:04:06 crc kubenswrapper[4707]: I1127 16:04:06.390597 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 27 16:04:06 crc kubenswrapper[4707]: I1127 16:04:06.403856 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f300806f-b97e-4453-b011-19442ca1240d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a25770e0e3eebd7e807f3775101a1cd5d47073e4a86850da475fecd4a9c88ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7412dcdb5a249551f558a01a8fecef5f873e5e0a16e917d5cbc1ca74a917f228\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://791f332aec1698d02da5ec585ad2fbfcb77529f24abc0e3a77581be38d7ce277\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f12d0d89698cd6f702c595882dc7ed97fbc585b8a4ef3a8b3755c984a823dd9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f12d0d89698cd6f702c595882dc7ed97fbc585b8a4ef3a8b3755c984a823dd9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-27T16:04:05Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": 
net/http: TLS handshake timeout\\\\nI1127 16:03:58.832133 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1127 16:03:58.833970 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3721107874/tls.crt::/tmp/serving-cert-3721107874/tls.key\\\\\\\"\\\\nI1127 16:04:05.094602 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1127 16:04:05.099813 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1127 16:04:05.099850 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1127 16:04:05.099879 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1127 16:04:05.099886 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1127 16:04:05.108199 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1127 16:04:05.108225 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1127 16:04:05.108231 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1127 16:04:05.108269 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1127 16:04:05.108276 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1127 16:04:05.108281 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1127 16:04:05.108284 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1127 16:04:05.108289 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1127 16:04:05.114674 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-27T16:03:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf16f5508845d69184573b51f3052f79123556667cbd3ddf4b2adfb135312a3a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:48Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://608ad63f8f5352dad650253df603590a1cc20bc05d86dd561d659295264ab4b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://608ad63f8f5352dad650253df603590a1cc20bc05d86dd561d659295264ab4b5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:03:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\
\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:03:45Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:06Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:06 crc kubenswrapper[4707]: I1127 16:04:06.418874 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:05Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:06Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:06 crc kubenswrapper[4707]: I1127 16:04:06.430158 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:06Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:06 crc kubenswrapper[4707]: I1127 16:04:06.445233 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://57225b64e7671f397832e9ba112febbde78ae4c24b37a8f097d4bdeda8931c0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6960cfa0c6d0dedc432f90bf267ce02fe8264924a43865f79eadd9affbe955f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:06Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:06 crc kubenswrapper[4707]: I1127 16:04:06.467594 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:05Z\\\",\\\"message\\\":\\\"containers with 
unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:06Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:06 crc kubenswrapper[4707]: I1127 16:04:06.472411 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-bhmsc"] Nov 27 16:04:06 crc kubenswrapper[4707]: I1127 16:04:06.472769 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-bhmsc" Nov 27 16:04:06 crc kubenswrapper[4707]: I1127 16:04:06.474541 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Nov 27 16:04:06 crc kubenswrapper[4707]: I1127 16:04:06.474826 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Nov 27 16:04:06 crc kubenswrapper[4707]: I1127 16:04:06.474981 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Nov 27 16:04:06 crc kubenswrapper[4707]: I1127 16:04:06.484095 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be 
located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:06Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:06 crc kubenswrapper[4707]: I1127 16:04:06.497998 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:05Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:06Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:06 crc kubenswrapper[4707]: I1127 16:04:06.516122 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f300806f-b97e-4453-b011-19442ca1240d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a25770e0e3eebd7e807f3775101a1cd5d47073e4a86850da475fecd4a9c88ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7412dcdb5a249551f558a01a8fecef5f873e5e0a16e917d5cbc1ca74a917f228\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://791f332aec1698d02da5ec585ad2fbfcb77529f24abc0e3a77581be38d7ce277\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a47eb0592819f0bc514399e163c88e2c63111aa0ec4a28f00f809ce1bdf2aa8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f12d0d89698cd6f702c595882dc7ed97fbc585b8a4ef3a8b3755c984a823dd9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-27T16:04:05Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1127 16:03:58.832133 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1127 16:03:58.833970 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3721107874/tls.crt::/tmp/serving-cert-3721107874/tls.key\\\\\\\"\\\\nI1127 16:04:05.094602 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1127 16:04:05.099813 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1127 16:04:05.099850 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1127 16:04:05.099879 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1127 16:04:05.099886 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1127 16:04:05.108199 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1127 16:04:05.108225 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1127 16:04:05.108231 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1127 16:04:05.108269 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1127 16:04:05.108276 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1127 
16:04:05.108281 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1127 16:04:05.108284 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1127 16:04:05.108289 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1127 16:04:05.114674 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-27T16:03:48Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf16f5508845d69184573b51f3052f79123556667cbd3ddf4b2adfb135312a3a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:48Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://608ad63f8f5352dad650253df603590a1cc20bc05d86dd561d659295264ab4b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc
35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://608ad63f8f5352dad650253df603590a1cc20bc05d86dd561d659295264ab4b5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:03:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:03:45Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:06Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:06 crc kubenswrapper[4707]: I1127 16:04:06.529573 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:05Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:06Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:06 crc kubenswrapper[4707]: I1127 16:04:06.541489 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/d1ec526b-6fb0-4c87-bd87-6aaf843e0c78-hosts-file\") pod \"node-resolver-bhmsc\" (UID: \"d1ec526b-6fb0-4c87-bd87-6aaf843e0c78\") " 
pod="openshift-dns/node-resolver-bhmsc" Nov 27 16:04:06 crc kubenswrapper[4707]: I1127 16:04:06.541548 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kvs85\" (UniqueName: \"kubernetes.io/projected/d1ec526b-6fb0-4c87-bd87-6aaf843e0c78-kube-api-access-kvs85\") pod \"node-resolver-bhmsc\" (UID: \"d1ec526b-6fb0-4c87-bd87-6aaf843e0c78\") " pod="openshift-dns/node-resolver-bhmsc" Nov 27 16:04:06 crc kubenswrapper[4707]: I1127 16:04:06.541692 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:06Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:06 crc kubenswrapper[4707]: I1127 16:04:06.554250 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://57225b64e7671f397832e9ba112febbde78ae4c24b37a8f097d4bdeda8931c0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6960cfa0c6d0dedc432f90bf267ce02fe8264924a43865f79eadd9affbe955f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:06Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:06 crc kubenswrapper[4707]: I1127 16:04:06.571160 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:05Z\\\",\\\"message\\\":\\\"containers with 
unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:06Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:06 crc kubenswrapper[4707]: I1127 16:04:06.582660 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-bhmsc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d1ec526b-6fb0-4c87-bd87-6aaf843e0c78\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kvs85\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:04:06Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-bhmsc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:06Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:06 crc kubenswrapper[4707]: I1127 16:04:06.595559 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b00fa8691d479d1db194704ff73bb85f1108c9541dd09ed4c1b02d0a3f0ab53d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:06Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:06 crc kubenswrapper[4707]: I1127 16:04:06.606638 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:05Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:06Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:06 crc kubenswrapper[4707]: I1127 16:04:06.642499 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kvs85\" (UniqueName: \"kubernetes.io/projected/d1ec526b-6fb0-4c87-bd87-6aaf843e0c78-kube-api-access-kvs85\") pod \"node-resolver-bhmsc\" (UID: \"d1ec526b-6fb0-4c87-bd87-6aaf843e0c78\") " pod="openshift-dns/node-resolver-bhmsc" Nov 27 16:04:06 crc kubenswrapper[4707]: I1127 16:04:06.642556 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/d1ec526b-6fb0-4c87-bd87-6aaf843e0c78-hosts-file\") pod \"node-resolver-bhmsc\" (UID: \"d1ec526b-6fb0-4c87-bd87-6aaf843e0c78\") " pod="openshift-dns/node-resolver-bhmsc" Nov 27 16:04:06 crc kubenswrapper[4707]: I1127 16:04:06.642636 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: 
\"kubernetes.io/host-path/d1ec526b-6fb0-4c87-bd87-6aaf843e0c78-hosts-file\") pod \"node-resolver-bhmsc\" (UID: \"d1ec526b-6fb0-4c87-bd87-6aaf843e0c78\") " pod="openshift-dns/node-resolver-bhmsc" Nov 27 16:04:06 crc kubenswrapper[4707]: I1127 16:04:06.659983 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kvs85\" (UniqueName: \"kubernetes.io/projected/d1ec526b-6fb0-4c87-bd87-6aaf843e0c78-kube-api-access-kvs85\") pod \"node-resolver-bhmsc\" (UID: \"d1ec526b-6fb0-4c87-bd87-6aaf843e0c78\") " pod="openshift-dns/node-resolver-bhmsc" Nov 27 16:04:06 crc kubenswrapper[4707]: I1127 16:04:06.784421 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-bhmsc" Nov 27 16:04:06 crc kubenswrapper[4707]: W1127 16:04:06.799482 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1ec526b_6fb0_4c87_bd87_6aaf843e0c78.slice/crio-cb9c3b69aa23d4dab8d5c899b5bae6da1314fc1b02159c757efc3ce3aa974d1c WatchSource:0}: Error finding container cb9c3b69aa23d4dab8d5c899b5bae6da1314fc1b02159c757efc3ce3aa974d1c: Status 404 returned error can't find the container with id cb9c3b69aa23d4dab8d5c899b5bae6da1314fc1b02159c757efc3ce3aa974d1c Nov 27 16:04:06 crc kubenswrapper[4707]: I1127 16:04:06.843659 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 27 16:04:06 crc kubenswrapper[4707]: E1127 16:04:06.843876 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-11-27 16:04:08.843850233 +0000 UTC m=+24.475299001 (durationBeforeRetry 2s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 27 16:04:06 crc kubenswrapper[4707]: I1127 16:04:06.854209 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 27 16:04:06 crc kubenswrapper[4707]: I1127 16:04:06.859129 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-daemon-c995m"] Nov 27 16:04:06 crc kubenswrapper[4707]: I1127 16:04:06.859537 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-c995m" Nov 27 16:04:06 crc kubenswrapper[4707]: I1127 16:04:06.860515 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-additional-cni-plugins-9c4xg"] Nov 27 16:04:06 crc kubenswrapper[4707]: I1127 16:04:06.860944 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-9c4xg" Nov 27 16:04:06 crc kubenswrapper[4707]: I1127 16:04:06.863197 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Nov 27 16:04:06 crc kubenswrapper[4707]: I1127 16:04:06.863979 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Nov 27 16:04:06 crc kubenswrapper[4707]: I1127 16:04:06.864299 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Nov 27 16:04:06 crc kubenswrapper[4707]: I1127 16:04:06.864466 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Nov 27 16:04:06 crc kubenswrapper[4707]: I1127 16:04:06.864605 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Nov 27 16:04:06 crc kubenswrapper[4707]: I1127 16:04:06.864745 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Nov 27 16:04:06 crc kubenswrapper[4707]: I1127 16:04:06.865058 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Nov 27 16:04:06 crc kubenswrapper[4707]: I1127 16:04:06.865816 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-xkmt7"] Nov 27 16:04:06 crc kubenswrapper[4707]: I1127 16:04:06.866896 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-xkmt7" Nov 27 16:04:06 crc kubenswrapper[4707]: I1127 16:04:06.880824 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Nov 27 16:04:06 crc kubenswrapper[4707]: I1127 16:04:06.880858 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Nov 27 16:04:06 crc kubenswrapper[4707]: I1127 16:04:06.881224 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Nov 27 16:04:06 crc kubenswrapper[4707]: I1127 16:04:06.881959 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Nov 27 16:04:06 crc kubenswrapper[4707]: I1127 16:04:06.882324 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-crc"] Nov 27 16:04:06 crc kubenswrapper[4707]: I1127 16:04:06.882665 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 27 16:04:06 crc kubenswrapper[4707]: I1127 16:04:06.882994 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-js6mm"] Nov 27 16:04:06 crc kubenswrapper[4707]: I1127 16:04:06.883613 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-js6mm" Nov 27 16:04:06 crc kubenswrapper[4707]: I1127 16:04:06.884470 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:05Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:06Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:06 crc kubenswrapper[4707]: I1127 16:04:06.886451 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Nov 27 16:04:06 crc kubenswrapper[4707]: I1127 16:04:06.886669 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Nov 27 16:04:06 crc kubenswrapper[4707]: I1127 16:04:06.886729 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Nov 27 16:04:06 crc kubenswrapper[4707]: I1127 16:04:06.887765 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Nov 27 16:04:06 crc kubenswrapper[4707]: I1127 16:04:06.889065 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Nov 27 16:04:06 crc kubenswrapper[4707]: I1127 16:04:06.891302 4707 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-multus"/"multus-daemon-config" Nov 27 16:04:06 crc kubenswrapper[4707]: I1127 16:04:06.891327 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Nov 27 16:04:06 crc kubenswrapper[4707]: I1127 16:04:06.891542 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Nov 27 16:04:06 crc kubenswrapper[4707]: I1127 16:04:06.905300 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:06Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:06 crc kubenswrapper[4707]: I1127 16:04:06.924248 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://57225b64e7671f397832e9ba112febbde78ae4c24b37a8f097d4bdeda8931c0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6960cfa0c6d0dedc432f90bf267ce02fe8264924a43865f79eadd9affbe955f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:06Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:06 crc kubenswrapper[4707]: I1127 16:04:06.940048 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:05Z\\\",\\\"message\\\":\\\"containers with 
unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:06Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:06 crc kubenswrapper[4707]: I1127 16:04:06.944931 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/9ca48c08-f39d-41a2-847a-c893a2111492-multus-cni-dir\") pod \"multus-js6mm\" (UID: \"9ca48c08-f39d-41a2-847a-c893a2111492\") " pod="openshift-multus/multus-js6mm" Nov 27 
16:04:06 crc kubenswrapper[4707]: I1127 16:04:06.945002 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/9ca48c08-f39d-41a2-847a-c893a2111492-cni-binary-copy\") pod \"multus-js6mm\" (UID: \"9ca48c08-f39d-41a2-847a-c893a2111492\") " pod="openshift-multus/multus-js6mm" Nov 27 16:04:06 crc kubenswrapper[4707]: I1127 16:04:06.945052 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/9ca48c08-f39d-41a2-847a-c893a2111492-hostroot\") pod \"multus-js6mm\" (UID: \"9ca48c08-f39d-41a2-847a-c893a2111492\") " pod="openshift-multus/multus-js6mm" Nov 27 16:04:06 crc kubenswrapper[4707]: I1127 16:04:06.945095 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/a83beb0d-8dd1-434a-ace2-933f98e3956f-mcd-auth-proxy-config\") pod \"machine-config-daemon-c995m\" (UID: \"a83beb0d-8dd1-434a-ace2-933f98e3956f\") " pod="openshift-machine-config-operator/machine-config-daemon-c995m" Nov 27 16:04:06 crc kubenswrapper[4707]: I1127 16:04:06.945137 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/9ca48c08-f39d-41a2-847a-c893a2111492-multus-socket-dir-parent\") pod \"multus-js6mm\" (UID: \"9ca48c08-f39d-41a2-847a-c893a2111492\") " pod="openshift-multus/multus-js6mm" Nov 27 16:04:06 crc kubenswrapper[4707]: I1127 16:04:06.945174 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/9ca48c08-f39d-41a2-847a-c893a2111492-host-var-lib-kubelet\") pod \"multus-js6mm\" (UID: \"9ca48c08-f39d-41a2-847a-c893a2111492\") " pod="openshift-multus/multus-js6mm" Nov 27 
16:04:06 crc kubenswrapper[4707]: I1127 16:04:06.945209 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/9ca48c08-f39d-41a2-847a-c893a2111492-os-release\") pod \"multus-js6mm\" (UID: \"9ca48c08-f39d-41a2-847a-c893a2111492\") " pod="openshift-multus/multus-js6mm" Nov 27 16:04:06 crc kubenswrapper[4707]: I1127 16:04:06.945569 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5xtzp\" (UniqueName: \"kubernetes.io/projected/a83beb0d-8dd1-434a-ace2-933f98e3956f-kube-api-access-5xtzp\") pod \"machine-config-daemon-c995m\" (UID: \"a83beb0d-8dd1-434a-ace2-933f98e3956f\") " pod="openshift-machine-config-operator/machine-config-daemon-c995m" Nov 27 16:04:06 crc kubenswrapper[4707]: I1127 16:04:06.945623 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/37bcfa90-e50b-4a63-96b2-cd8a0ce5bbf7-os-release\") pod \"multus-additional-cni-plugins-9c4xg\" (UID: \"37bcfa90-e50b-4a63-96b2-cd8a0ce5bbf7\") " pod="openshift-multus/multus-additional-cni-plugins-9c4xg" Nov 27 16:04:06 crc kubenswrapper[4707]: I1127 16:04:06.945881 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/37bcfa90-e50b-4a63-96b2-cd8a0ce5bbf7-tuning-conf-dir\") pod \"multus-additional-cni-plugins-9c4xg\" (UID: \"37bcfa90-e50b-4a63-96b2-cd8a0ce5bbf7\") " pod="openshift-multus/multus-additional-cni-plugins-9c4xg" Nov 27 16:04:06 crc kubenswrapper[4707]: I1127 16:04:06.946048 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a83beb0d-8dd1-434a-ace2-933f98e3956f-proxy-tls\") pod \"machine-config-daemon-c995m\" (UID: 
\"a83beb0d-8dd1-434a-ace2-933f98e3956f\") " pod="openshift-machine-config-operator/machine-config-daemon-c995m" Nov 27 16:04:06 crc kubenswrapper[4707]: I1127 16:04:06.946137 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7hwzm\" (UniqueName: \"kubernetes.io/projected/37bcfa90-e50b-4a63-96b2-cd8a0ce5bbf7-kube-api-access-7hwzm\") pod \"multus-additional-cni-plugins-9c4xg\" (UID: \"37bcfa90-e50b-4a63-96b2-cd8a0ce5bbf7\") " pod="openshift-multus/multus-additional-cni-plugins-9c4xg" Nov 27 16:04:06 crc kubenswrapper[4707]: I1127 16:04:06.946226 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/9ca48c08-f39d-41a2-847a-c893a2111492-multus-conf-dir\") pod \"multus-js6mm\" (UID: \"9ca48c08-f39d-41a2-847a-c893a2111492\") " pod="openshift-multus/multus-js6mm" Nov 27 16:04:06 crc kubenswrapper[4707]: I1127 16:04:06.946308 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/9ca48c08-f39d-41a2-847a-c893a2111492-multus-daemon-config\") pod \"multus-js6mm\" (UID: \"9ca48c08-f39d-41a2-847a-c893a2111492\") " pod="openshift-multus/multus-js6mm" Nov 27 16:04:06 crc kubenswrapper[4707]: I1127 16:04:06.946400 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/9ca48c08-f39d-41a2-847a-c893a2111492-cnibin\") pod \"multus-js6mm\" (UID: \"9ca48c08-f39d-41a2-847a-c893a2111492\") " pod="openshift-multus/multus-js6mm" Nov 27 16:04:06 crc kubenswrapper[4707]: I1127 16:04:06.946553 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod 
\"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 27 16:04:06 crc kubenswrapper[4707]: I1127 16:04:06.946600 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/9ca48c08-f39d-41a2-847a-c893a2111492-host-run-k8s-cni-cncf-io\") pod \"multus-js6mm\" (UID: \"9ca48c08-f39d-41a2-847a-c893a2111492\") " pod="openshift-multus/multus-js6mm" Nov 27 16:04:06 crc kubenswrapper[4707]: I1127 16:04:06.946631 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/37bcfa90-e50b-4a63-96b2-cd8a0ce5bbf7-cnibin\") pod \"multus-additional-cni-plugins-9c4xg\" (UID: \"37bcfa90-e50b-4a63-96b2-cd8a0ce5bbf7\") " pod="openshift-multus/multus-additional-cni-plugins-9c4xg" Nov 27 16:04:06 crc kubenswrapper[4707]: I1127 16:04:06.946655 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/37bcfa90-e50b-4a63-96b2-cd8a0ce5bbf7-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-9c4xg\" (UID: \"37bcfa90-e50b-4a63-96b2-cd8a0ce5bbf7\") " pod="openshift-multus/multus-additional-cni-plugins-9c4xg" Nov 27 16:04:06 crc kubenswrapper[4707]: I1127 16:04:06.946684 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 27 16:04:06 crc kubenswrapper[4707]: I1127 16:04:06.946712 4707 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/9ca48c08-f39d-41a2-847a-c893a2111492-system-cni-dir\") pod \"multus-js6mm\" (UID: \"9ca48c08-f39d-41a2-847a-c893a2111492\") " pod="openshift-multus/multus-js6mm" Nov 27 16:04:06 crc kubenswrapper[4707]: I1127 16:04:06.946735 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/37bcfa90-e50b-4a63-96b2-cd8a0ce5bbf7-system-cni-dir\") pod \"multus-additional-cni-plugins-9c4xg\" (UID: \"37bcfa90-e50b-4a63-96b2-cd8a0ce5bbf7\") " pod="openshift-multus/multus-additional-cni-plugins-9c4xg" Nov 27 16:04:06 crc kubenswrapper[4707]: I1127 16:04:06.946766 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/37bcfa90-e50b-4a63-96b2-cd8a0ce5bbf7-cni-binary-copy\") pod \"multus-additional-cni-plugins-9c4xg\" (UID: \"37bcfa90-e50b-4a63-96b2-cd8a0ce5bbf7\") " pod="openshift-multus/multus-additional-cni-plugins-9c4xg" Nov 27 16:04:06 crc kubenswrapper[4707]: I1127 16:04:06.946791 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bjt58\" (UniqueName: \"kubernetes.io/projected/9ca48c08-f39d-41a2-847a-c893a2111492-kube-api-access-bjt58\") pod \"multus-js6mm\" (UID: \"9ca48c08-f39d-41a2-847a-c893a2111492\") " pod="openshift-multus/multus-js6mm" Nov 27 16:04:06 crc kubenswrapper[4707]: I1127 16:04:06.946816 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/a83beb0d-8dd1-434a-ace2-933f98e3956f-rootfs\") pod \"machine-config-daemon-c995m\" (UID: \"a83beb0d-8dd1-434a-ace2-933f98e3956f\") " pod="openshift-machine-config-operator/machine-config-daemon-c995m" Nov 27 16:04:06 crc 
kubenswrapper[4707]: I1127 16:04:06.946840 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/9ca48c08-f39d-41a2-847a-c893a2111492-host-var-lib-cni-multus\") pod \"multus-js6mm\" (UID: \"9ca48c08-f39d-41a2-847a-c893a2111492\") " pod="openshift-multus/multus-js6mm" Nov 27 16:04:06 crc kubenswrapper[4707]: I1127 16:04:06.946870 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 27 16:04:06 crc kubenswrapper[4707]: I1127 16:04:06.946896 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/9ca48c08-f39d-41a2-847a-c893a2111492-host-run-netns\") pod \"multus-js6mm\" (UID: \"9ca48c08-f39d-41a2-847a-c893a2111492\") " pod="openshift-multus/multus-js6mm" Nov 27 16:04:06 crc kubenswrapper[4707]: I1127 16:04:06.946919 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/9ca48c08-f39d-41a2-847a-c893a2111492-host-run-multus-certs\") pod \"multus-js6mm\" (UID: \"9ca48c08-f39d-41a2-847a-c893a2111492\") " pod="openshift-multus/multus-js6mm" Nov 27 16:04:06 crc kubenswrapper[4707]: I1127 16:04:06.946952 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 27 16:04:06 crc kubenswrapper[4707]: I1127 16:04:06.946977 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/9ca48c08-f39d-41a2-847a-c893a2111492-host-var-lib-cni-bin\") pod \"multus-js6mm\" (UID: \"9ca48c08-f39d-41a2-847a-c893a2111492\") " pod="openshift-multus/multus-js6mm" Nov 27 16:04:06 crc kubenswrapper[4707]: E1127 16:04:06.946971 4707 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Nov 27 16:04:06 crc kubenswrapper[4707]: I1127 16:04:06.947002 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9ca48c08-f39d-41a2-847a-c893a2111492-etc-kubernetes\") pod \"multus-js6mm\" (UID: \"9ca48c08-f39d-41a2-847a-c893a2111492\") " pod="openshift-multus/multus-js6mm" Nov 27 16:04:06 crc kubenswrapper[4707]: E1127 16:04:06.947214 4707 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Nov 27 16:04:06 crc kubenswrapper[4707]: E1127 16:04:06.947215 4707 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Nov 27 16:04:06 crc kubenswrapper[4707]: E1127 16:04:06.947294 4707 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Nov 27 16:04:06 crc kubenswrapper[4707]: E1127 16:04:06.947345 4707 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod 
openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 27 16:04:06 crc kubenswrapper[4707]: E1127 16:04:06.947298 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-11-27 16:04:08.947276504 +0000 UTC m=+24.578725282 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Nov 27 16:04:06 crc kubenswrapper[4707]: E1127 16:04:06.947400 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-11-27 16:04:08.947374096 +0000 UTC m=+24.578822874 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Nov 27 16:04:06 crc kubenswrapper[4707]: E1127 16:04:06.947491 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. 
No retries permitted until 2025-11-27 16:04:08.947430198 +0000 UTC m=+24.578879206 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 27 16:04:06 crc kubenswrapper[4707]: E1127 16:04:06.947680 4707 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Nov 27 16:04:06 crc kubenswrapper[4707]: E1127 16:04:06.947700 4707 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Nov 27 16:04:06 crc kubenswrapper[4707]: E1127 16:04:06.947734 4707 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 27 16:04:06 crc kubenswrapper[4707]: E1127 16:04:06.947769 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-11-27 16:04:08.947759146 +0000 UTC m=+24.579207924 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 27 16:04:06 crc kubenswrapper[4707]: I1127 16:04:06.954579 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f300806f-b97e-4453-b011-19442ca1240d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a25770e0e3eebd7e807f3775101a1cd5d47073e4a86850da475fecd4a9c88ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7412dcdb5a249551f558a01a8fecef5f873e5e0a16e917d5cbc1ca74a917f228\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://791f332aec1698d02da5ec585ad2fbfcb77529f24abc0e3a77581be38d7ce277\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a47eb0592819f0bc514399e163c88e2c63111aa0ec4a28f00f809ce1bdf2aa8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f12d0d89698cd6f702c595882dc7ed97fbc585b8a4ef3a8b3755c984a823dd9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-27T16:04:05Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1127 16:03:58.832133 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1127 16:03:58.833970 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3721107874/tls.crt::/tmp/serving-cert-3721107874/tls.key\\\\\\\"\\\\nI1127 16:04:05.094602 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1127 16:04:05.099813 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1127 16:04:05.099850 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1127 16:04:05.099879 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1127 16:04:05.099886 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1127 16:04:05.108199 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1127 16:04:05.108225 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1127 16:04:05.108231 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1127 16:04:05.108269 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1127 16:04:05.108276 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1127 16:04:05.108281 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1127 16:04:05.108284 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1127 16:04:05.108289 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1127 16:04:05.114674 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-27T16:03:48Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf16f5508845d69184573b51f3052f79123556667cbd3ddf4b2adfb135312a3a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:48Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://608ad63f8f5352dad650253df603590a1cc20bc05d86dd561d659295264ab4b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://608ad63f8f5352dad650253df603590a1cc20bc05d86dd561d659295264ab4b5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:03:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-11-27T16:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:03:45Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:06Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:06 crc kubenswrapper[4707]: I1127 16:04:06.968399 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b00fa8691d479d1db194704ff73bb85f1108c9541dd09ed4c1b02d0a3f0ab53d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState
\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:06Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:06 crc kubenswrapper[4707]: I1127 16:04:06.983665 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:06Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:06 crc kubenswrapper[4707]: I1127 16:04:06.996042 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-bhmsc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d1ec526b-6fb0-4c87-bd87-6aaf843e0c78\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kvs85\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:04:06Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-bhmsc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:06Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:07 crc kubenswrapper[4707]: I1127 16:04:07.013267 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:05Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:07Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:07 crc kubenswrapper[4707]: I1127 16:04:07.028599 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://57225b64e7671f397832e9ba112febbde78ae4c24b37a8f097d4bdeda8931c0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6960cfa0c6d0dedc432f90bf267ce02fe8264924a43865f79eadd9affbe955f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:07Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:07 crc kubenswrapper[4707]: I1127 16:04:07.047264 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:05Z\\\",\\\"message\\\":\\\"containers with 
unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:07Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:07 crc kubenswrapper[4707]: I1127 16:04:07.047514 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/9ca48c08-f39d-41a2-847a-c893a2111492-system-cni-dir\") pod \"multus-js6mm\" (UID: \"9ca48c08-f39d-41a2-847a-c893a2111492\") " pod="openshift-multus/multus-js6mm" Nov 27 16:04:07 crc 
kubenswrapper[4707]: I1127 16:04:07.047555 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/55af9c67-18ce-46f1-a761-d11ce16f42d6-run-systemd\") pod \"ovnkube-node-xkmt7\" (UID: \"55af9c67-18ce-46f1-a761-d11ce16f42d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-xkmt7" Nov 27 16:04:07 crc kubenswrapper[4707]: I1127 16:04:07.047586 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/37bcfa90-e50b-4a63-96b2-cd8a0ce5bbf7-system-cni-dir\") pod \"multus-additional-cni-plugins-9c4xg\" (UID: \"37bcfa90-e50b-4a63-96b2-cd8a0ce5bbf7\") " pod="openshift-multus/multus-additional-cni-plugins-9c4xg" Nov 27 16:04:07 crc kubenswrapper[4707]: I1127 16:04:07.047603 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/37bcfa90-e50b-4a63-96b2-cd8a0ce5bbf7-cni-binary-copy\") pod \"multus-additional-cni-plugins-9c4xg\" (UID: \"37bcfa90-e50b-4a63-96b2-cd8a0ce5bbf7\") " pod="openshift-multus/multus-additional-cni-plugins-9c4xg" Nov 27 16:04:07 crc kubenswrapper[4707]: I1127 16:04:07.047636 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bjt58\" (UniqueName: \"kubernetes.io/projected/9ca48c08-f39d-41a2-847a-c893a2111492-kube-api-access-bjt58\") pod \"multus-js6mm\" (UID: \"9ca48c08-f39d-41a2-847a-c893a2111492\") " pod="openshift-multus/multus-js6mm" Nov 27 16:04:07 crc kubenswrapper[4707]: I1127 16:04:07.047653 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/55af9c67-18ce-46f1-a761-d11ce16f42d6-host-run-netns\") pod \"ovnkube-node-xkmt7\" (UID: \"55af9c67-18ce-46f1-a761-d11ce16f42d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-xkmt7" Nov 27 16:04:07 crc 
kubenswrapper[4707]: I1127 16:04:07.047659 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/9ca48c08-f39d-41a2-847a-c893a2111492-system-cni-dir\") pod \"multus-js6mm\" (UID: \"9ca48c08-f39d-41a2-847a-c893a2111492\") " pod="openshift-multus/multus-js6mm" Nov 27 16:04:07 crc kubenswrapper[4707]: I1127 16:04:07.047670 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/55af9c67-18ce-46f1-a761-d11ce16f42d6-host-run-ovn-kubernetes\") pod \"ovnkube-node-xkmt7\" (UID: \"55af9c67-18ce-46f1-a761-d11ce16f42d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-xkmt7" Nov 27 16:04:07 crc kubenswrapper[4707]: I1127 16:04:07.047714 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/a83beb0d-8dd1-434a-ace2-933f98e3956f-rootfs\") pod \"machine-config-daemon-c995m\" (UID: \"a83beb0d-8dd1-434a-ace2-933f98e3956f\") " pod="openshift-machine-config-operator/machine-config-daemon-c995m" Nov 27 16:04:07 crc kubenswrapper[4707]: I1127 16:04:07.047743 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/9ca48c08-f39d-41a2-847a-c893a2111492-host-var-lib-cni-multus\") pod \"multus-js6mm\" (UID: \"9ca48c08-f39d-41a2-847a-c893a2111492\") " pod="openshift-multus/multus-js6mm" Nov 27 16:04:07 crc kubenswrapper[4707]: I1127 16:04:07.047752 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/a83beb0d-8dd1-434a-ace2-933f98e3956f-rootfs\") pod \"machine-config-daemon-c995m\" (UID: \"a83beb0d-8dd1-434a-ace2-933f98e3956f\") " pod="openshift-machine-config-operator/machine-config-daemon-c995m" Nov 27 16:04:07 crc kubenswrapper[4707]: I1127 16:04:07.047767 4707 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/37bcfa90-e50b-4a63-96b2-cd8a0ce5bbf7-system-cni-dir\") pod \"multus-additional-cni-plugins-9c4xg\" (UID: \"37bcfa90-e50b-4a63-96b2-cd8a0ce5bbf7\") " pod="openshift-multus/multus-additional-cni-plugins-9c4xg" Nov 27 16:04:07 crc kubenswrapper[4707]: I1127 16:04:07.047824 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/9ca48c08-f39d-41a2-847a-c893a2111492-host-var-lib-cni-multus\") pod \"multus-js6mm\" (UID: \"9ca48c08-f39d-41a2-847a-c893a2111492\") " pod="openshift-multus/multus-js6mm" Nov 27 16:04:07 crc kubenswrapper[4707]: I1127 16:04:07.047807 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/9ca48c08-f39d-41a2-847a-c893a2111492-host-run-netns\") pod \"multus-js6mm\" (UID: \"9ca48c08-f39d-41a2-847a-c893a2111492\") " pod="openshift-multus/multus-js6mm" Nov 27 16:04:07 crc kubenswrapper[4707]: I1127 16:04:07.047789 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/9ca48c08-f39d-41a2-847a-c893a2111492-host-run-netns\") pod \"multus-js6mm\" (UID: \"9ca48c08-f39d-41a2-847a-c893a2111492\") " pod="openshift-multus/multus-js6mm" Nov 27 16:04:07 crc kubenswrapper[4707]: I1127 16:04:07.047888 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/9ca48c08-f39d-41a2-847a-c893a2111492-host-run-multus-certs\") pod \"multus-js6mm\" (UID: \"9ca48c08-f39d-41a2-847a-c893a2111492\") " pod="openshift-multus/multus-js6mm" Nov 27 16:04:07 crc kubenswrapper[4707]: I1127 16:04:07.047921 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: 
\"kubernetes.io/host-path/9ca48c08-f39d-41a2-847a-c893a2111492-host-var-lib-cni-bin\") pod \"multus-js6mm\" (UID: \"9ca48c08-f39d-41a2-847a-c893a2111492\") " pod="openshift-multus/multus-js6mm" Nov 27 16:04:07 crc kubenswrapper[4707]: I1127 16:04:07.047941 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9ca48c08-f39d-41a2-847a-c893a2111492-etc-kubernetes\") pod \"multus-js6mm\" (UID: \"9ca48c08-f39d-41a2-847a-c893a2111492\") " pod="openshift-multus/multus-js6mm" Nov 27 16:04:07 crc kubenswrapper[4707]: I1127 16:04:07.047965 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/55af9c67-18ce-46f1-a761-d11ce16f42d6-host-kubelet\") pod \"ovnkube-node-xkmt7\" (UID: \"55af9c67-18ce-46f1-a761-d11ce16f42d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-xkmt7" Nov 27 16:04:07 crc kubenswrapper[4707]: I1127 16:04:07.047988 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/9ca48c08-f39d-41a2-847a-c893a2111492-multus-cni-dir\") pod \"multus-js6mm\" (UID: \"9ca48c08-f39d-41a2-847a-c893a2111492\") " pod="openshift-multus/multus-js6mm" Nov 27 16:04:07 crc kubenswrapper[4707]: I1127 16:04:07.047996 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/9ca48c08-f39d-41a2-847a-c893a2111492-host-var-lib-cni-bin\") pod \"multus-js6mm\" (UID: \"9ca48c08-f39d-41a2-847a-c893a2111492\") " pod="openshift-multus/multus-js6mm" Nov 27 16:04:07 crc kubenswrapper[4707]: I1127 16:04:07.048009 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/9ca48c08-f39d-41a2-847a-c893a2111492-cni-binary-copy\") pod \"multus-js6mm\" (UID: 
\"9ca48c08-f39d-41a2-847a-c893a2111492\") " pod="openshift-multus/multus-js6mm" Nov 27 16:04:07 crc kubenswrapper[4707]: I1127 16:04:07.048030 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/9ca48c08-f39d-41a2-847a-c893a2111492-host-run-multus-certs\") pod \"multus-js6mm\" (UID: \"9ca48c08-f39d-41a2-847a-c893a2111492\") " pod="openshift-multus/multus-js6mm" Nov 27 16:04:07 crc kubenswrapper[4707]: I1127 16:04:07.048035 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/9ca48c08-f39d-41a2-847a-c893a2111492-hostroot\") pod \"multus-js6mm\" (UID: \"9ca48c08-f39d-41a2-847a-c893a2111492\") " pod="openshift-multus/multus-js6mm" Nov 27 16:04:07 crc kubenswrapper[4707]: I1127 16:04:07.048063 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9ca48c08-f39d-41a2-847a-c893a2111492-etc-kubernetes\") pod \"multus-js6mm\" (UID: \"9ca48c08-f39d-41a2-847a-c893a2111492\") " pod="openshift-multus/multus-js6mm" Nov 27 16:04:07 crc kubenswrapper[4707]: I1127 16:04:07.048061 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p6xmv\" (UniqueName: \"kubernetes.io/projected/55af9c67-18ce-46f1-a761-d11ce16f42d6-kube-api-access-p6xmv\") pod \"ovnkube-node-xkmt7\" (UID: \"55af9c67-18ce-46f1-a761-d11ce16f42d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-xkmt7" Nov 27 16:04:07 crc kubenswrapper[4707]: I1127 16:04:07.048097 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/55af9c67-18ce-46f1-a761-d11ce16f42d6-systemd-units\") pod \"ovnkube-node-xkmt7\" (UID: \"55af9c67-18ce-46f1-a761-d11ce16f42d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-xkmt7" Nov 27 16:04:07 crc 
kubenswrapper[4707]: I1127 16:04:07.048120 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/55af9c67-18ce-46f1-a761-d11ce16f42d6-host-slash\") pod \"ovnkube-node-xkmt7\" (UID: \"55af9c67-18ce-46f1-a761-d11ce16f42d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-xkmt7" Nov 27 16:04:07 crc kubenswrapper[4707]: I1127 16:04:07.048173 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/a83beb0d-8dd1-434a-ace2-933f98e3956f-mcd-auth-proxy-config\") pod \"machine-config-daemon-c995m\" (UID: \"a83beb0d-8dd1-434a-ace2-933f98e3956f\") " pod="openshift-machine-config-operator/machine-config-daemon-c995m" Nov 27 16:04:07 crc kubenswrapper[4707]: I1127 16:04:07.048194 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/9ca48c08-f39d-41a2-847a-c893a2111492-multus-socket-dir-parent\") pod \"multus-js6mm\" (UID: \"9ca48c08-f39d-41a2-847a-c893a2111492\") " pod="openshift-multus/multus-js6mm" Nov 27 16:04:07 crc kubenswrapper[4707]: I1127 16:04:07.048217 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/9ca48c08-f39d-41a2-847a-c893a2111492-host-var-lib-kubelet\") pod \"multus-js6mm\" (UID: \"9ca48c08-f39d-41a2-847a-c893a2111492\") " pod="openshift-multus/multus-js6mm" Nov 27 16:04:07 crc kubenswrapper[4707]: I1127 16:04:07.048255 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/37bcfa90-e50b-4a63-96b2-cd8a0ce5bbf7-tuning-conf-dir\") pod \"multus-additional-cni-plugins-9c4xg\" (UID: \"37bcfa90-e50b-4a63-96b2-cd8a0ce5bbf7\") " pod="openshift-multus/multus-additional-cni-plugins-9c4xg" Nov 27 16:04:07 crc 
kubenswrapper[4707]: I1127 16:04:07.048274 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/9ca48c08-f39d-41a2-847a-c893a2111492-os-release\") pod \"multus-js6mm\" (UID: \"9ca48c08-f39d-41a2-847a-c893a2111492\") " pod="openshift-multus/multus-js6mm" Nov 27 16:04:07 crc kubenswrapper[4707]: I1127 16:04:07.048294 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/55af9c67-18ce-46f1-a761-d11ce16f42d6-ovnkube-script-lib\") pod \"ovnkube-node-xkmt7\" (UID: \"55af9c67-18ce-46f1-a761-d11ce16f42d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-xkmt7" Nov 27 16:04:07 crc kubenswrapper[4707]: I1127 16:04:07.048317 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5xtzp\" (UniqueName: \"kubernetes.io/projected/a83beb0d-8dd1-434a-ace2-933f98e3956f-kube-api-access-5xtzp\") pod \"machine-config-daemon-c995m\" (UID: \"a83beb0d-8dd1-434a-ace2-933f98e3956f\") " pod="openshift-machine-config-operator/machine-config-daemon-c995m" Nov 27 16:04:07 crc kubenswrapper[4707]: I1127 16:04:07.048341 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/37bcfa90-e50b-4a63-96b2-cd8a0ce5bbf7-os-release\") pod \"multus-additional-cni-plugins-9c4xg\" (UID: \"37bcfa90-e50b-4a63-96b2-cd8a0ce5bbf7\") " pod="openshift-multus/multus-additional-cni-plugins-9c4xg" Nov 27 16:04:07 crc kubenswrapper[4707]: I1127 16:04:07.048359 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/55af9c67-18ce-46f1-a761-d11ce16f42d6-log-socket\") pod \"ovnkube-node-xkmt7\" (UID: \"55af9c67-18ce-46f1-a761-d11ce16f42d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-xkmt7" Nov 27 16:04:07 crc 
kubenswrapper[4707]: I1127 16:04:07.048397 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a83beb0d-8dd1-434a-ace2-933f98e3956f-proxy-tls\") pod \"machine-config-daemon-c995m\" (UID: \"a83beb0d-8dd1-434a-ace2-933f98e3956f\") " pod="openshift-machine-config-operator/machine-config-daemon-c995m" Nov 27 16:04:07 crc kubenswrapper[4707]: I1127 16:04:07.048420 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7hwzm\" (UniqueName: \"kubernetes.io/projected/37bcfa90-e50b-4a63-96b2-cd8a0ce5bbf7-kube-api-access-7hwzm\") pod \"multus-additional-cni-plugins-9c4xg\" (UID: \"37bcfa90-e50b-4a63-96b2-cd8a0ce5bbf7\") " pod="openshift-multus/multus-additional-cni-plugins-9c4xg" Nov 27 16:04:07 crc kubenswrapper[4707]: I1127 16:04:07.048439 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/55af9c67-18ce-46f1-a761-d11ce16f42d6-node-log\") pod \"ovnkube-node-xkmt7\" (UID: \"55af9c67-18ce-46f1-a761-d11ce16f42d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-xkmt7" Nov 27 16:04:07 crc kubenswrapper[4707]: I1127 16:04:07.048458 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/9ca48c08-f39d-41a2-847a-c893a2111492-multus-conf-dir\") pod \"multus-js6mm\" (UID: \"9ca48c08-f39d-41a2-847a-c893a2111492\") " pod="openshift-multus/multus-js6mm" Nov 27 16:04:07 crc kubenswrapper[4707]: I1127 16:04:07.048462 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/37bcfa90-e50b-4a63-96b2-cd8a0ce5bbf7-cni-binary-copy\") pod \"multus-additional-cni-plugins-9c4xg\" (UID: \"37bcfa90-e50b-4a63-96b2-cd8a0ce5bbf7\") " pod="openshift-multus/multus-additional-cni-plugins-9c4xg" Nov 27 16:04:07 crc 
kubenswrapper[4707]: I1127 16:04:07.048476 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/9ca48c08-f39d-41a2-847a-c893a2111492-multus-daemon-config\") pod \"multus-js6mm\" (UID: \"9ca48c08-f39d-41a2-847a-c893a2111492\") " pod="openshift-multus/multus-js6mm" Nov 27 16:04:07 crc kubenswrapper[4707]: I1127 16:04:07.048545 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/55af9c67-18ce-46f1-a761-d11ce16f42d6-var-lib-openvswitch\") pod \"ovnkube-node-xkmt7\" (UID: \"55af9c67-18ce-46f1-a761-d11ce16f42d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-xkmt7" Nov 27 16:04:07 crc kubenswrapper[4707]: I1127 16:04:07.048587 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/55af9c67-18ce-46f1-a761-d11ce16f42d6-host-cni-netd\") pod \"ovnkube-node-xkmt7\" (UID: \"55af9c67-18ce-46f1-a761-d11ce16f42d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-xkmt7" Nov 27 16:04:07 crc kubenswrapper[4707]: I1127 16:04:07.048607 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/9ca48c08-f39d-41a2-847a-c893a2111492-multus-cni-dir\") pod \"multus-js6mm\" (UID: \"9ca48c08-f39d-41a2-847a-c893a2111492\") " pod="openshift-multus/multus-js6mm" Nov 27 16:04:07 crc kubenswrapper[4707]: I1127 16:04:07.048626 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/9ca48c08-f39d-41a2-847a-c893a2111492-cnibin\") pod \"multus-js6mm\" (UID: \"9ca48c08-f39d-41a2-847a-c893a2111492\") " pod="openshift-multus/multus-js6mm" Nov 27 16:04:07 crc kubenswrapper[4707]: I1127 16:04:07.048650 4707 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/55af9c67-18ce-46f1-a761-d11ce16f42d6-host-cni-bin\") pod \"ovnkube-node-xkmt7\" (UID: \"55af9c67-18ce-46f1-a761-d11ce16f42d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-xkmt7" Nov 27 16:04:07 crc kubenswrapper[4707]: I1127 16:04:07.048701 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/9ca48c08-f39d-41a2-847a-c893a2111492-cnibin\") pod \"multus-js6mm\" (UID: \"9ca48c08-f39d-41a2-847a-c893a2111492\") " pod="openshift-multus/multus-js6mm" Nov 27 16:04:07 crc kubenswrapper[4707]: I1127 16:04:07.048730 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/55af9c67-18ce-46f1-a761-d11ce16f42d6-ovnkube-config\") pod \"ovnkube-node-xkmt7\" (UID: \"55af9c67-18ce-46f1-a761-d11ce16f42d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-xkmt7" Nov 27 16:04:07 crc kubenswrapper[4707]: I1127 16:04:07.048833 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/9ca48c08-f39d-41a2-847a-c893a2111492-cni-binary-copy\") pod \"multus-js6mm\" (UID: \"9ca48c08-f39d-41a2-847a-c893a2111492\") " pod="openshift-multus/multus-js6mm" Nov 27 16:04:07 crc kubenswrapper[4707]: I1127 16:04:07.048880 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/9ca48c08-f39d-41a2-847a-c893a2111492-host-var-lib-kubelet\") pod \"multus-js6mm\" (UID: \"9ca48c08-f39d-41a2-847a-c893a2111492\") " pod="openshift-multus/multus-js6mm" Nov 27 16:04:07 crc kubenswrapper[4707]: I1127 16:04:07.048888 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: 
\"kubernetes.io/host-path/9ca48c08-f39d-41a2-847a-c893a2111492-os-release\") pod \"multus-js6mm\" (UID: \"9ca48c08-f39d-41a2-847a-c893a2111492\") " pod="openshift-multus/multus-js6mm" Nov 27 16:04:07 crc kubenswrapper[4707]: I1127 16:04:07.048912 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/9ca48c08-f39d-41a2-847a-c893a2111492-hostroot\") pod \"multus-js6mm\" (UID: \"9ca48c08-f39d-41a2-847a-c893a2111492\") " pod="openshift-multus/multus-js6mm" Nov 27 16:04:07 crc kubenswrapper[4707]: I1127 16:04:07.048916 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/9ca48c08-f39d-41a2-847a-c893a2111492-multus-socket-dir-parent\") pod \"multus-js6mm\" (UID: \"9ca48c08-f39d-41a2-847a-c893a2111492\") " pod="openshift-multus/multus-js6mm" Nov 27 16:04:07 crc kubenswrapper[4707]: I1127 16:04:07.048959 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/55af9c67-18ce-46f1-a761-d11ce16f42d6-ovn-node-metrics-cert\") pod \"ovnkube-node-xkmt7\" (UID: \"55af9c67-18ce-46f1-a761-d11ce16f42d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-xkmt7" Nov 27 16:04:07 crc kubenswrapper[4707]: I1127 16:04:07.048985 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/9ca48c08-f39d-41a2-847a-c893a2111492-host-run-k8s-cni-cncf-io\") pod \"multus-js6mm\" (UID: \"9ca48c08-f39d-41a2-847a-c893a2111492\") " pod="openshift-multus/multus-js6mm" Nov 27 16:04:07 crc kubenswrapper[4707]: I1127 16:04:07.049031 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/37bcfa90-e50b-4a63-96b2-cd8a0ce5bbf7-os-release\") pod \"multus-additional-cni-plugins-9c4xg\" (UID: 
\"37bcfa90-e50b-4a63-96b2-cd8a0ce5bbf7\") " pod="openshift-multus/multus-additional-cni-plugins-9c4xg" Nov 27 16:04:07 crc kubenswrapper[4707]: I1127 16:04:07.049071 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/55af9c67-18ce-46f1-a761-d11ce16f42d6-run-openvswitch\") pod \"ovnkube-node-xkmt7\" (UID: \"55af9c67-18ce-46f1-a761-d11ce16f42d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-xkmt7" Nov 27 16:04:07 crc kubenswrapper[4707]: I1127 16:04:07.049093 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/55af9c67-18ce-46f1-a761-d11ce16f42d6-run-ovn\") pod \"ovnkube-node-xkmt7\" (UID: \"55af9c67-18ce-46f1-a761-d11ce16f42d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-xkmt7" Nov 27 16:04:07 crc kubenswrapper[4707]: I1127 16:04:07.049125 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/55af9c67-18ce-46f1-a761-d11ce16f42d6-env-overrides\") pod \"ovnkube-node-xkmt7\" (UID: \"55af9c67-18ce-46f1-a761-d11ce16f42d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-xkmt7" Nov 27 16:04:07 crc kubenswrapper[4707]: I1127 16:04:07.049153 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/9ca48c08-f39d-41a2-847a-c893a2111492-host-run-k8s-cni-cncf-io\") pod \"multus-js6mm\" (UID: \"9ca48c08-f39d-41a2-847a-c893a2111492\") " pod="openshift-multus/multus-js6mm" Nov 27 16:04:07 crc kubenswrapper[4707]: I1127 16:04:07.049177 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/37bcfa90-e50b-4a63-96b2-cd8a0ce5bbf7-cnibin\") pod \"multus-additional-cni-plugins-9c4xg\" (UID: 
\"37bcfa90-e50b-4a63-96b2-cd8a0ce5bbf7\") " pod="openshift-multus/multus-additional-cni-plugins-9c4xg" Nov 27 16:04:07 crc kubenswrapper[4707]: I1127 16:04:07.049186 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/9ca48c08-f39d-41a2-847a-c893a2111492-multus-daemon-config\") pod \"multus-js6mm\" (UID: \"9ca48c08-f39d-41a2-847a-c893a2111492\") " pod="openshift-multus/multus-js6mm" Nov 27 16:04:07 crc kubenswrapper[4707]: I1127 16:04:07.049245 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/37bcfa90-e50b-4a63-96b2-cd8a0ce5bbf7-cnibin\") pod \"multus-additional-cni-plugins-9c4xg\" (UID: \"37bcfa90-e50b-4a63-96b2-cd8a0ce5bbf7\") " pod="openshift-multus/multus-additional-cni-plugins-9c4xg" Nov 27 16:04:07 crc kubenswrapper[4707]: I1127 16:04:07.049274 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/37bcfa90-e50b-4a63-96b2-cd8a0ce5bbf7-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-9c4xg\" (UID: \"37bcfa90-e50b-4a63-96b2-cd8a0ce5bbf7\") " pod="openshift-multus/multus-additional-cni-plugins-9c4xg" Nov 27 16:04:07 crc kubenswrapper[4707]: I1127 16:04:07.049313 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/55af9c67-18ce-46f1-a761-d11ce16f42d6-etc-openvswitch\") pod \"ovnkube-node-xkmt7\" (UID: \"55af9c67-18ce-46f1-a761-d11ce16f42d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-xkmt7" Nov 27 16:04:07 crc kubenswrapper[4707]: I1127 16:04:07.049307 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/9ca48c08-f39d-41a2-847a-c893a2111492-multus-conf-dir\") pod \"multus-js6mm\" (UID: 
\"9ca48c08-f39d-41a2-847a-c893a2111492\") " pod="openshift-multus/multus-js6mm" Nov 27 16:04:07 crc kubenswrapper[4707]: I1127 16:04:07.049445 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/55af9c67-18ce-46f1-a761-d11ce16f42d6-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-xkmt7\" (UID: \"55af9c67-18ce-46f1-a761-d11ce16f42d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-xkmt7" Nov 27 16:04:07 crc kubenswrapper[4707]: I1127 16:04:07.049695 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/37bcfa90-e50b-4a63-96b2-cd8a0ce5bbf7-tuning-conf-dir\") pod \"multus-additional-cni-plugins-9c4xg\" (UID: \"37bcfa90-e50b-4a63-96b2-cd8a0ce5bbf7\") " pod="openshift-multus/multus-additional-cni-plugins-9c4xg" Nov 27 16:04:07 crc kubenswrapper[4707]: I1127 16:04:07.049772 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/a83beb0d-8dd1-434a-ace2-933f98e3956f-mcd-auth-proxy-config\") pod \"machine-config-daemon-c995m\" (UID: \"a83beb0d-8dd1-434a-ace2-933f98e3956f\") " pod="openshift-machine-config-operator/machine-config-daemon-c995m" Nov 27 16:04:07 crc kubenswrapper[4707]: I1127 16:04:07.049864 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/37bcfa90-e50b-4a63-96b2-cd8a0ce5bbf7-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-9c4xg\" (UID: \"37bcfa90-e50b-4a63-96b2-cd8a0ce5bbf7\") " pod="openshift-multus/multus-additional-cni-plugins-9c4xg" Nov 27 16:04:07 crc kubenswrapper[4707]: I1127 16:04:07.052945 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: 
\"kubernetes.io/secret/a83beb0d-8dd1-434a-ace2-933f98e3956f-proxy-tls\") pod \"machine-config-daemon-c995m\" (UID: \"a83beb0d-8dd1-434a-ace2-933f98e3956f\") " pod="openshift-machine-config-operator/machine-config-daemon-c995m" Nov 27 16:04:07 crc kubenswrapper[4707]: I1127 16:04:07.068930 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-9c4xg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37bcfa90-e50b-4a63-96b2-cd8a0ce5bbf7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hwzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hwzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hwzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hwzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hwzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hwzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hwzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:04:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-9c4xg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:07Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:07 crc kubenswrapper[4707]: I1127 16:04:07.069334 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7hwzm\" (UniqueName: \"kubernetes.io/projected/37bcfa90-e50b-4a63-96b2-cd8a0ce5bbf7-kube-api-access-7hwzm\") pod \"multus-additional-cni-plugins-9c4xg\" (UID: \"37bcfa90-e50b-4a63-96b2-cd8a0ce5bbf7\") " pod="openshift-multus/multus-additional-cni-plugins-9c4xg" Nov 27 16:04:07 crc kubenswrapper[4707]: I1127 16:04:07.070557 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bjt58\" (UniqueName: \"kubernetes.io/projected/9ca48c08-f39d-41a2-847a-c893a2111492-kube-api-access-bjt58\") pod \"multus-js6mm\" (UID: \"9ca48c08-f39d-41a2-847a-c893a2111492\") " pod="openshift-multus/multus-js6mm" Nov 27 16:04:07 crc kubenswrapper[4707]: I1127 16:04:07.073932 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5xtzp\" (UniqueName: \"kubernetes.io/projected/a83beb0d-8dd1-434a-ace2-933f98e3956f-kube-api-access-5xtzp\") pod \"machine-config-daemon-c995m\" (UID: \"a83beb0d-8dd1-434a-ace2-933f98e3956f\") " pod="openshift-machine-config-operator/machine-config-daemon-c995m" Nov 27 16:04:07 crc kubenswrapper[4707]: I1127 16:04:07.090769 4707 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-ovn-kubernetes/ovnkube-node-xkmt7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"55af9c67-18ce-46f1-a761-d11ce16f42d6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6xmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6xmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6xmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6xmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6xmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6xmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6xmv\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6xmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6xmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:04:06Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-xkmt7\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:07Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:07 crc kubenswrapper[4707]: I1127 16:04:07.106025 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b00fa8691d479d1db194704ff73bb85f1108c9541dd09ed4c1b02d0a3f0ab53d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"m
etrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:07Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:07 crc kubenswrapper[4707]: I1127 16:04:07.122794 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-js6mm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ca48c08-f39d-41a2-847a-c893a2111492\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bjt58\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:04:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-js6mm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:07Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:07 crc kubenswrapper[4707]: I1127 16:04:07.138020 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f300806f-b97e-4453-b011-19442ca1240d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a25770e0e3eebd7e807f3775101a1cd5d47073e4a86850da475fecd4a9c88ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7412dcdb5a249551f558a01a8fecef5f873e5e0a16e917d5cbc1ca74a917f228\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://791f332aec1698d02da5ec585ad2fbfcb77529f24abc0e3a77581be38d7ce277\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a47eb0592819f0bc514399e163c88e2c63111aa0ec4a28f00f809ce1bdf2aa8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f12d0d89698cd6f702c595882dc7ed97fbc585b8a4ef3a8b3755c984a823dd9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-27T16:04:05Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1127 16:03:58.832133 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1127 16:03:58.833970 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3721107874/tls.crt::/tmp/serving-cert-3721107874/tls.key\\\\\\\"\\\\nI1127 16:04:05.094602 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1127 16:04:05.099813 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1127 16:04:05.099850 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1127 16:04:05.099879 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1127 16:04:05.099886 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1127 16:04:05.108199 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1127 16:04:05.108225 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1127 16:04:05.108231 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1127 16:04:05.108269 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1127 16:04:05.108276 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1127 16:04:05.108281 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1127 16:04:05.108284 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1127 16:04:05.108289 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1127 16:04:05.114674 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-27T16:03:48Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf16f5508845d69184573b51f3052f79123556667cbd3ddf4b2adfb135312a3a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:48Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://608ad63f8f5352dad650253df603590a1cc20bc05d86dd561d659295264ab4b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://608ad63f8f5352dad650253df603590a1cc20bc05d86dd561d659295264ab4b5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:03:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-11-27T16:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:03:45Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:07Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:07 crc kubenswrapper[4707]: I1127 16:04:07.149911 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/55af9c67-18ce-46f1-a761-d11ce16f42d6-ovnkube-script-lib\") pod \"ovnkube-node-xkmt7\" (UID: \"55af9c67-18ce-46f1-a761-d11ce16f42d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-xkmt7" Nov 27 16:04:07 crc kubenswrapper[4707]: I1127 16:04:07.149945 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/55af9c67-18ce-46f1-a761-d11ce16f42d6-node-log\") pod \"ovnkube-node-xkmt7\" (UID: \"55af9c67-18ce-46f1-a761-d11ce16f42d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-xkmt7" Nov 27 16:04:07 crc kubenswrapper[4707]: I1127 16:04:07.149963 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/55af9c67-18ce-46f1-a761-d11ce16f42d6-log-socket\") pod \"ovnkube-node-xkmt7\" (UID: \"55af9c67-18ce-46f1-a761-d11ce16f42d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-xkmt7" Nov 27 16:04:07 crc kubenswrapper[4707]: I1127 16:04:07.149990 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/55af9c67-18ce-46f1-a761-d11ce16f42d6-var-lib-openvswitch\") pod \"ovnkube-node-xkmt7\" (UID: \"55af9c67-18ce-46f1-a761-d11ce16f42d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-xkmt7" Nov 27 16:04:07 crc kubenswrapper[4707]: I1127 16:04:07.150012 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/55af9c67-18ce-46f1-a761-d11ce16f42d6-host-cni-netd\") pod \"ovnkube-node-xkmt7\" (UID: \"55af9c67-18ce-46f1-a761-d11ce16f42d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-xkmt7" Nov 27 16:04:07 crc kubenswrapper[4707]: I1127 16:04:07.150040 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/55af9c67-18ce-46f1-a761-d11ce16f42d6-host-cni-bin\") pod \"ovnkube-node-xkmt7\" (UID: \"55af9c67-18ce-46f1-a761-d11ce16f42d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-xkmt7" Nov 27 16:04:07 crc kubenswrapper[4707]: I1127 16:04:07.150062 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/55af9c67-18ce-46f1-a761-d11ce16f42d6-ovnkube-config\") pod \"ovnkube-node-xkmt7\" (UID: \"55af9c67-18ce-46f1-a761-d11ce16f42d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-xkmt7" Nov 27 16:04:07 crc kubenswrapper[4707]: I1127 16:04:07.150095 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/55af9c67-18ce-46f1-a761-d11ce16f42d6-ovn-node-metrics-cert\") pod \"ovnkube-node-xkmt7\" (UID: \"55af9c67-18ce-46f1-a761-d11ce16f42d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-xkmt7" Nov 27 16:04:07 crc kubenswrapper[4707]: I1127 16:04:07.150138 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/55af9c67-18ce-46f1-a761-d11ce16f42d6-run-openvswitch\") pod \"ovnkube-node-xkmt7\" (UID: \"55af9c67-18ce-46f1-a761-d11ce16f42d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-xkmt7" Nov 27 16:04:07 crc kubenswrapper[4707]: I1127 16:04:07.150156 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/55af9c67-18ce-46f1-a761-d11ce16f42d6-etc-openvswitch\") pod \"ovnkube-node-xkmt7\" (UID: \"55af9c67-18ce-46f1-a761-d11ce16f42d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-xkmt7" Nov 27 16:04:07 crc kubenswrapper[4707]: I1127 16:04:07.150170 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/55af9c67-18ce-46f1-a761-d11ce16f42d6-run-ovn\") pod \"ovnkube-node-xkmt7\" (UID: \"55af9c67-18ce-46f1-a761-d11ce16f42d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-xkmt7" Nov 27 16:04:07 crc kubenswrapper[4707]: I1127 16:04:07.150185 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/55af9c67-18ce-46f1-a761-d11ce16f42d6-env-overrides\") pod \"ovnkube-node-xkmt7\" (UID: \"55af9c67-18ce-46f1-a761-d11ce16f42d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-xkmt7" Nov 27 16:04:07 crc kubenswrapper[4707]: I1127 16:04:07.150205 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/55af9c67-18ce-46f1-a761-d11ce16f42d6-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-xkmt7\" (UID: \"55af9c67-18ce-46f1-a761-d11ce16f42d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-xkmt7" Nov 27 16:04:07 crc kubenswrapper[4707]: I1127 16:04:07.150235 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: 
\"kubernetes.io/host-path/55af9c67-18ce-46f1-a761-d11ce16f42d6-run-systemd\") pod \"ovnkube-node-xkmt7\" (UID: \"55af9c67-18ce-46f1-a761-d11ce16f42d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-xkmt7" Nov 27 16:04:07 crc kubenswrapper[4707]: I1127 16:04:07.150255 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/55af9c67-18ce-46f1-a761-d11ce16f42d6-host-run-netns\") pod \"ovnkube-node-xkmt7\" (UID: \"55af9c67-18ce-46f1-a761-d11ce16f42d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-xkmt7" Nov 27 16:04:07 crc kubenswrapper[4707]: I1127 16:04:07.150279 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/55af9c67-18ce-46f1-a761-d11ce16f42d6-host-run-ovn-kubernetes\") pod \"ovnkube-node-xkmt7\" (UID: \"55af9c67-18ce-46f1-a761-d11ce16f42d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-xkmt7" Nov 27 16:04:07 crc kubenswrapper[4707]: I1127 16:04:07.150320 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/55af9c67-18ce-46f1-a761-d11ce16f42d6-host-kubelet\") pod \"ovnkube-node-xkmt7\" (UID: \"55af9c67-18ce-46f1-a761-d11ce16f42d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-xkmt7" Nov 27 16:04:07 crc kubenswrapper[4707]: I1127 16:04:07.150346 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/55af9c67-18ce-46f1-a761-d11ce16f42d6-systemd-units\") pod \"ovnkube-node-xkmt7\" (UID: \"55af9c67-18ce-46f1-a761-d11ce16f42d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-xkmt7" Nov 27 16:04:07 crc kubenswrapper[4707]: I1127 16:04:07.150363 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: 
\"kubernetes.io/host-path/55af9c67-18ce-46f1-a761-d11ce16f42d6-host-slash\") pod \"ovnkube-node-xkmt7\" (UID: \"55af9c67-18ce-46f1-a761-d11ce16f42d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-xkmt7" Nov 27 16:04:07 crc kubenswrapper[4707]: I1127 16:04:07.150400 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p6xmv\" (UniqueName: \"kubernetes.io/projected/55af9c67-18ce-46f1-a761-d11ce16f42d6-kube-api-access-p6xmv\") pod \"ovnkube-node-xkmt7\" (UID: \"55af9c67-18ce-46f1-a761-d11ce16f42d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-xkmt7" Nov 27 16:04:07 crc kubenswrapper[4707]: I1127 16:04:07.150676 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/55af9c67-18ce-46f1-a761-d11ce16f42d6-node-log\") pod \"ovnkube-node-xkmt7\" (UID: \"55af9c67-18ce-46f1-a761-d11ce16f42d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-xkmt7" Nov 27 16:04:07 crc kubenswrapper[4707]: I1127 16:04:07.150714 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/55af9c67-18ce-46f1-a761-d11ce16f42d6-log-socket\") pod \"ovnkube-node-xkmt7\" (UID: \"55af9c67-18ce-46f1-a761-d11ce16f42d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-xkmt7" Nov 27 16:04:07 crc kubenswrapper[4707]: I1127 16:04:07.150774 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/55af9c67-18ce-46f1-a761-d11ce16f42d6-var-lib-openvswitch\") pod \"ovnkube-node-xkmt7\" (UID: \"55af9c67-18ce-46f1-a761-d11ce16f42d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-xkmt7" Nov 27 16:04:07 crc kubenswrapper[4707]: I1127 16:04:07.150816 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/55af9c67-18ce-46f1-a761-d11ce16f42d6-host-cni-netd\") pod \"ovnkube-node-xkmt7\" (UID: 
\"55af9c67-18ce-46f1-a761-d11ce16f42d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-xkmt7" Nov 27 16:04:07 crc kubenswrapper[4707]: I1127 16:04:07.150840 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/55af9c67-18ce-46f1-a761-d11ce16f42d6-host-cni-bin\") pod \"ovnkube-node-xkmt7\" (UID: \"55af9c67-18ce-46f1-a761-d11ce16f42d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-xkmt7" Nov 27 16:04:07 crc kubenswrapper[4707]: I1127 16:04:07.150834 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/55af9c67-18ce-46f1-a761-d11ce16f42d6-ovnkube-script-lib\") pod \"ovnkube-node-xkmt7\" (UID: \"55af9c67-18ce-46f1-a761-d11ce16f42d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-xkmt7" Nov 27 16:04:07 crc kubenswrapper[4707]: I1127 16:04:07.150876 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/55af9c67-18ce-46f1-a761-d11ce16f42d6-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-xkmt7\" (UID: \"55af9c67-18ce-46f1-a761-d11ce16f42d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-xkmt7" Nov 27 16:04:07 crc kubenswrapper[4707]: I1127 16:04:07.150899 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/55af9c67-18ce-46f1-a761-d11ce16f42d6-host-run-netns\") pod \"ovnkube-node-xkmt7\" (UID: \"55af9c67-18ce-46f1-a761-d11ce16f42d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-xkmt7" Nov 27 16:04:07 crc kubenswrapper[4707]: I1127 16:04:07.150927 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/55af9c67-18ce-46f1-a761-d11ce16f42d6-host-run-ovn-kubernetes\") pod \"ovnkube-node-xkmt7\" (UID: \"55af9c67-18ce-46f1-a761-d11ce16f42d6\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-xkmt7" Nov 27 16:04:07 crc kubenswrapper[4707]: I1127 16:04:07.150963 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/55af9c67-18ce-46f1-a761-d11ce16f42d6-host-kubelet\") pod \"ovnkube-node-xkmt7\" (UID: \"55af9c67-18ce-46f1-a761-d11ce16f42d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-xkmt7" Nov 27 16:04:07 crc kubenswrapper[4707]: I1127 16:04:07.150985 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/55af9c67-18ce-46f1-a761-d11ce16f42d6-systemd-units\") pod \"ovnkube-node-xkmt7\" (UID: \"55af9c67-18ce-46f1-a761-d11ce16f42d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-xkmt7" Nov 27 16:04:07 crc kubenswrapper[4707]: I1127 16:04:07.151005 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/55af9c67-18ce-46f1-a761-d11ce16f42d6-host-slash\") pod \"ovnkube-node-xkmt7\" (UID: \"55af9c67-18ce-46f1-a761-d11ce16f42d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-xkmt7" Nov 27 16:04:07 crc kubenswrapper[4707]: I1127 16:04:07.151027 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/55af9c67-18ce-46f1-a761-d11ce16f42d6-etc-openvswitch\") pod \"ovnkube-node-xkmt7\" (UID: \"55af9c67-18ce-46f1-a761-d11ce16f42d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-xkmt7" Nov 27 16:04:07 crc kubenswrapper[4707]: I1127 16:04:07.151341 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/55af9c67-18ce-46f1-a761-d11ce16f42d6-ovnkube-config\") pod \"ovnkube-node-xkmt7\" (UID: \"55af9c67-18ce-46f1-a761-d11ce16f42d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-xkmt7" Nov 27 16:04:07 crc kubenswrapper[4707]: I1127 16:04:07.151414 4707 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/55af9c67-18ce-46f1-a761-d11ce16f42d6-run-ovn\") pod \"ovnkube-node-xkmt7\" (UID: \"55af9c67-18ce-46f1-a761-d11ce16f42d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-xkmt7" Nov 27 16:04:07 crc kubenswrapper[4707]: I1127 16:04:07.150884 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/55af9c67-18ce-46f1-a761-d11ce16f42d6-run-systemd\") pod \"ovnkube-node-xkmt7\" (UID: \"55af9c67-18ce-46f1-a761-d11ce16f42d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-xkmt7" Nov 27 16:04:07 crc kubenswrapper[4707]: I1127 16:04:07.151610 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/55af9c67-18ce-46f1-a761-d11ce16f42d6-run-openvswitch\") pod \"ovnkube-node-xkmt7\" (UID: \"55af9c67-18ce-46f1-a761-d11ce16f42d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-xkmt7" Nov 27 16:04:07 crc kubenswrapper[4707]: I1127 16:04:07.151742 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/55af9c67-18ce-46f1-a761-d11ce16f42d6-env-overrides\") pod \"ovnkube-node-xkmt7\" (UID: \"55af9c67-18ce-46f1-a761-d11ce16f42d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-xkmt7" Nov 27 16:04:07 crc kubenswrapper[4707]: I1127 16:04:07.154286 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/55af9c67-18ce-46f1-a761-d11ce16f42d6-ovn-node-metrics-cert\") pod \"ovnkube-node-xkmt7\" (UID: \"55af9c67-18ce-46f1-a761-d11ce16f42d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-xkmt7" Nov 27 16:04:07 crc kubenswrapper[4707]: I1127 16:04:07.154278 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:07Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:07 crc kubenswrapper[4707]: I1127 16:04:07.167224 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p6xmv\" (UniqueName: \"kubernetes.io/projected/55af9c67-18ce-46f1-a761-d11ce16f42d6-kube-api-access-p6xmv\") pod \"ovnkube-node-xkmt7\" (UID: \"55af9c67-18ce-46f1-a761-d11ce16f42d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-xkmt7" Nov 27 16:04:07 crc kubenswrapper[4707]: I1127 16:04:07.169555 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-c995m" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a83beb0d-8dd1-434a-ace2-933f98e3956f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5xtzp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5xtzp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:04:06Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-c995m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:07Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:07 crc kubenswrapper[4707]: I1127 16:04:07.181546 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7f481770-a9a1-422a-a707-e841a24a1503\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f0e4a5fa520e0dc048bef42e48e14bd06eb72586ca2b8e145734aa5450aa869d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\
\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://36f6a5537d5031d21ea157fbcedea40756eac2c44aa1859c0cfbf3dc9a9d9a13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c7144aa332827972c3ff984eb05b835b88c4d9d05552ac631fbca49f13e193a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8211f74a0c0d16257330ebbb5727247d4229848595af3cb65f1cb752b05f3c37\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578
bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:03:45Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:07Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:07 crc kubenswrapper[4707]: I1127 16:04:07.195099 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-c995m" Nov 27 16:04:07 crc kubenswrapper[4707]: I1127 16:04:07.195220 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 27 16:04:07 crc kubenswrapper[4707]: I1127 16:04:07.195284 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 27 16:04:07 crc kubenswrapper[4707]: E1127 16:04:07.195401 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 27 16:04:07 crc kubenswrapper[4707]: E1127 16:04:07.195502 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 27 16:04:07 crc kubenswrapper[4707]: I1127 16:04:07.195864 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 27 16:04:07 crc kubenswrapper[4707]: E1127 16:04:07.196016 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 27 16:04:07 crc kubenswrapper[4707]: I1127 16:04:07.206684 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-9c4xg" Nov 27 16:04:07 crc kubenswrapper[4707]: I1127 16:04:07.208057 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01ab3dd5-8196-46d0-ad33-122e2ca51def" path="/var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes" Nov 27 16:04:07 crc kubenswrapper[4707]: I1127 16:04:07.209628 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" path="/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes" Nov 27 16:04:07 crc kubenswrapper[4707]: I1127 16:04:07.210733 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09efc573-dbb6-4249-bd59-9b87aba8dd28" path="/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes" Nov 27 16:04:07 crc kubenswrapper[4707]: I1127 16:04:07.211595 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b574797-001e-440a-8f4e-c0be86edad0f" path="/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes" Nov 27 16:04:07 crc kubenswrapper[4707]: I1127 16:04:07.213158 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b78653f-4ff9-4508-8672-245ed9b561e3" path="/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes" Nov 27 16:04:07 crc kubenswrapper[4707]: I1127 16:04:07.214144 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1386a44e-36a2-460c-96d0-0359d2b6f0f5" path="/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes" Nov 27 16:04:07 crc kubenswrapper[4707]: I1127 16:04:07.214461 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-xkmt7" Nov 27 16:04:07 crc kubenswrapper[4707]: I1127 16:04:07.222028 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bf7eb37-55a3-4c65-b768-a94c82151e69" path="/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes" Nov 27 16:04:07 crc kubenswrapper[4707]: I1127 16:04:07.226234 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d611f23-29be-4491-8495-bee1670e935f" path="/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes" Nov 27 16:04:07 crc kubenswrapper[4707]: I1127 16:04:07.222563 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-js6mm" Nov 27 16:04:07 crc kubenswrapper[4707]: I1127 16:04:07.228156 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:07Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:07 crc kubenswrapper[4707]: I1127 16:04:07.230116 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20b0d48f-5fd6-431c-a545-e3c800c7b866" path="/var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/volumes" Nov 27 16:04:07 crc kubenswrapper[4707]: I1127 16:04:07.230917 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" 
path="/var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes" Nov 27 16:04:07 crc kubenswrapper[4707]: I1127 16:04:07.238784 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22c825df-677d-4ca6-82db-3454ed06e783" path="/var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes" Nov 27 16:04:07 crc kubenswrapper[4707]: I1127 16:04:07.240645 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25e176fe-21b4-4974-b1ed-c8b94f112a7f" path="/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes" Nov 27 16:04:07 crc kubenswrapper[4707]: I1127 16:04:07.242197 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" path="/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes" Nov 27 16:04:07 crc kubenswrapper[4707]: I1127 16:04:07.242835 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31d8b7a1-420e-4252-a5b7-eebe8a111292" path="/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes" Nov 27 16:04:07 crc kubenswrapper[4707]: I1127 16:04:07.245715 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ab1a177-2de0-46d9-b765-d0d0649bb42e" path="/var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/volumes" Nov 27 16:04:07 crc kubenswrapper[4707]: I1127 16:04:07.246341 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" path="/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes" Nov 27 16:04:07 crc kubenswrapper[4707]: I1127 16:04:07.247434 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43509403-f426-496e-be36-56cef71462f5" path="/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes" Nov 27 16:04:07 crc kubenswrapper[4707]: I1127 16:04:07.248175 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44663579-783b-4372-86d6-acf235a62d72" 
path="/var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/volumes" Nov 27 16:04:07 crc kubenswrapper[4707]: I1127 16:04:07.248931 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="496e6271-fb68-4057-954e-a0d97a4afa3f" path="/var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes" Nov 27 16:04:07 crc kubenswrapper[4707]: I1127 16:04:07.249617 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" path="/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes" Nov 27 16:04:07 crc kubenswrapper[4707]: I1127 16:04:07.250597 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49ef4625-1d3a-4a9f-b595-c2433d32326d" path="/var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/volumes" Nov 27 16:04:07 crc kubenswrapper[4707]: I1127 16:04:07.251236 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bb40260-dbaa-4fb0-84df-5e680505d512" path="/var/lib/kubelet/pods/4bb40260-dbaa-4fb0-84df-5e680505d512/volumes" Nov 27 16:04:07 crc kubenswrapper[4707]: I1127 16:04:07.251808 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5225d0e4-402f-4861-b410-819f433b1803" path="/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes" Nov 27 16:04:07 crc kubenswrapper[4707]: I1127 16:04:07.252490 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-bhmsc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d1ec526b-6fb0-4c87-bd87-6aaf843e0c78\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kvs85\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:04:06Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-bhmsc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:07Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:07 crc kubenswrapper[4707]: I1127 16:04:07.253057 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5441d097-087c-4d9a-baa8-b210afa90fc9" path="/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes" Nov 27 16:04:07 crc kubenswrapper[4707]: I1127 16:04:07.253604 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57a731c4-ef35-47a8-b875-bfb08a7f8011" path="/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes" Nov 27 16:04:07 crc kubenswrapper[4707]: W1127 16:04:07.255790 4707 manager.go:1169] Failed 
to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod55af9c67_18ce_46f1_a761_d11ce16f42d6.slice/crio-eb31900fe2142123c7385fe1e5b81150e1ae53282a92d68ddfb6be40f9dc3e46 WatchSource:0}: Error finding container eb31900fe2142123c7385fe1e5b81150e1ae53282a92d68ddfb6be40f9dc3e46: Status 404 returned error can't find the container with id eb31900fe2142123c7385fe1e5b81150e1ae53282a92d68ddfb6be40f9dc3e46 Nov 27 16:04:07 crc kubenswrapper[4707]: I1127 16:04:07.257566 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b88f790-22fa-440e-b583-365168c0b23d" path="/var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/volumes" Nov 27 16:04:07 crc kubenswrapper[4707]: I1127 16:04:07.258343 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fe579f8-e8a6-4643-bce5-a661393c4dde" path="/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/volumes" Nov 27 16:04:07 crc kubenswrapper[4707]: I1127 16:04:07.259502 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6402fda4-df10-493c-b4e5-d0569419652d" path="/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes" Nov 27 16:04:07 crc kubenswrapper[4707]: I1127 16:04:07.262186 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6509e943-70c6-444c-bc41-48a544e36fbd" path="/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes" Nov 27 16:04:07 crc kubenswrapper[4707]: I1127 16:04:07.263522 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6731426b-95fe-49ff-bb5f-40441049fde2" path="/var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/volumes" Nov 27 16:04:07 crc kubenswrapper[4707]: I1127 16:04:07.264251 4707 kubelet_volumes.go:152] "Cleaned up orphaned volume subpath from pod" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volume-subpaths/run-systemd/ovnkube-controller/6" Nov 
27 16:04:07 crc kubenswrapper[4707]: I1127 16:04:07.264438 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volumes" Nov 27 16:04:07 crc kubenswrapper[4707]: I1127 16:04:07.266269 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7539238d-5fe0-46ed-884e-1c3b566537ec" path="/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes" Nov 27 16:04:07 crc kubenswrapper[4707]: I1127 16:04:07.267339 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7583ce53-e0fe-4a16-9e4d-50516596a136" path="/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes" Nov 27 16:04:07 crc kubenswrapper[4707]: I1127 16:04:07.267957 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bb08738-c794-4ee8-9972-3a62ca171029" path="/var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes" Nov 27 16:04:07 crc kubenswrapper[4707]: I1127 16:04:07.269578 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87cf06ed-a83f-41a7-828d-70653580a8cb" path="/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes" Nov 27 16:04:07 crc kubenswrapper[4707]: I1127 16:04:07.270704 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" path="/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes" Nov 27 16:04:07 crc kubenswrapper[4707]: I1127 16:04:07.275209 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="925f1c65-6136-48ba-85aa-3a3b50560753" path="/var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes" Nov 27 16:04:07 crc kubenswrapper[4707]: I1127 16:04:07.276349 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" path="/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/volumes" Nov 
27 16:04:07 crc kubenswrapper[4707]: I1127 16:04:07.277653 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d4552c7-cd75-42dd-8880-30dd377c49a4" path="/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes" Nov 27 16:04:07 crc kubenswrapper[4707]: I1127 16:04:07.278242 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" path="/var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/volumes" Nov 27 16:04:07 crc kubenswrapper[4707]: I1127 16:04:07.279476 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a31745f5-9847-4afe-82a5-3161cc66ca93" path="/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes" Nov 27 16:04:07 crc kubenswrapper[4707]: I1127 16:04:07.280225 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" path="/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes" Nov 27 16:04:07 crc kubenswrapper[4707]: I1127 16:04:07.281499 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6312bbd-5731-4ea0-a20f-81d5a57df44a" path="/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/volumes" Nov 27 16:04:07 crc kubenswrapper[4707]: I1127 16:04:07.282091 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" path="/var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes" Nov 27 16:04:07 crc kubenswrapper[4707]: I1127 16:04:07.283116 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" path="/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes" Nov 27 16:04:07 crc kubenswrapper[4707]: I1127 16:04:07.283728 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" path="/var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/volumes" Nov 
27 16:04:07 crc kubenswrapper[4707]: I1127 16:04:07.286552 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf126b07-da06-4140-9a57-dfd54fc6b486" path="/var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes" Nov 27 16:04:07 crc kubenswrapper[4707]: I1127 16:04:07.287106 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c03ee662-fb2f-4fc4-a2c1-af487c19d254" path="/var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes" Nov 27 16:04:07 crc kubenswrapper[4707]: I1127 16:04:07.288223 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" path="/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/volumes" Nov 27 16:04:07 crc kubenswrapper[4707]: I1127 16:04:07.288913 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7e6199b-1264-4501-8953-767f51328d08" path="/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes" Nov 27 16:04:07 crc kubenswrapper[4707]: I1127 16:04:07.289537 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efdd0498-1daa-4136-9a4a-3b948c2293fc" path="/var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/volumes" Nov 27 16:04:07 crc kubenswrapper[4707]: I1127 16:04:07.292360 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" path="/var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/volumes" Nov 27 16:04:07 crc kubenswrapper[4707]: I1127 16:04:07.293655 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fda69060-fa79-4696-b1a6-7980f124bf7c" path="/var/lib/kubelet/pods/fda69060-fa79-4696-b1a6-7980f124bf7c/volumes" Nov 27 16:04:07 crc kubenswrapper[4707]: I1127 16:04:07.408597 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xkmt7" 
event={"ID":"55af9c67-18ce-46f1-a761-d11ce16f42d6","Type":"ContainerStarted","Data":"c327fee553e5575dec76bc4456bc59424c89a9eea360cd4b23b594eff4c67a8d"} Nov 27 16:04:07 crc kubenswrapper[4707]: I1127 16:04:07.408663 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xkmt7" event={"ID":"55af9c67-18ce-46f1-a761-d11ce16f42d6","Type":"ContainerStarted","Data":"eb31900fe2142123c7385fe1e5b81150e1ae53282a92d68ddfb6be40f9dc3e46"} Nov 27 16:04:07 crc kubenswrapper[4707]: I1127 16:04:07.411739 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-9c4xg" event={"ID":"37bcfa90-e50b-4a63-96b2-cd8a0ce5bbf7","Type":"ContainerStarted","Data":"5fab0e5e8a29c2028defc06ef5b1249c338b6800a89c8b050ae93d654b5f131b"} Nov 27 16:04:07 crc kubenswrapper[4707]: I1127 16:04:07.416523 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-c995m" event={"ID":"a83beb0d-8dd1-434a-ace2-933f98e3956f","Type":"ContainerStarted","Data":"6227d8aa794cce73ef7607c3e6cf8f853c829bf56493acf04c0b555ddd264ba5"} Nov 27 16:04:07 crc kubenswrapper[4707]: I1127 16:04:07.416561 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-c995m" event={"ID":"a83beb0d-8dd1-434a-ace2-933f98e3956f","Type":"ContainerStarted","Data":"9d321f78fce71dfb0f720ddbb56193cb6a04efb1d2783707e5d8e3e945e348e6"} Nov 27 16:04:07 crc kubenswrapper[4707]: I1127 16:04:07.420337 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-bhmsc" event={"ID":"d1ec526b-6fb0-4c87-bd87-6aaf843e0c78","Type":"ContainerStarted","Data":"f3b764fd994edba08f5e909683bd065ecf09fb7c9906457e37bb91343aad14a3"} Nov 27 16:04:07 crc kubenswrapper[4707]: I1127 16:04:07.420410 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-bhmsc" 
event={"ID":"d1ec526b-6fb0-4c87-bd87-6aaf843e0c78","Type":"ContainerStarted","Data":"cb9c3b69aa23d4dab8d5c899b5bae6da1314fc1b02159c757efc3ce3aa974d1c"} Nov 27 16:04:07 crc kubenswrapper[4707]: I1127 16:04:07.421661 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-js6mm" event={"ID":"9ca48c08-f39d-41a2-847a-c893a2111492","Type":"ContainerStarted","Data":"a86c7a94ac1bd31c235cbdcd472ba2631670df94cba103697a944079bb930ccf"} Nov 27 16:04:07 crc kubenswrapper[4707]: I1127 16:04:07.435519 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-c995m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a83beb0d-8dd1-434a-ace2-933f98e3956f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5xtzp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5xtzp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:04:06Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-c995m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:07Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:07 crc kubenswrapper[4707]: I1127 16:04:07.462469 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f300806f-b97e-4453-b011-19442ca1240d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a25770e0e3eebd7e807f3775101a1cd5d47073e4a86850da475fecd4a9c88ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7412dcdb5a249551f558a01a8fecef5f873e5e0a16e917d5cbc1ca74a917f228\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://791f332aec1698d02da5ec585ad2fbfcb77529f24abc0e3a77581be38d7ce277\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a47eb0592819f0bc514399e163c88e2c63111aa0ec4a28f00f809ce1bdf2aa8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f12d0d89698cd6f702c595882dc7ed97fbc585b8a4ef3a8b3755c984a823dd9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-27T16:04:05Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1127 16:03:58.832133 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1127 16:03:58.833970 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3721107874/tls.crt::/tmp/serving-cert-3721107874/tls.key\\\\\\\"\\\\nI1127 16:04:05.094602 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1127 16:04:05.099813 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1127 16:04:05.099850 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1127 16:04:05.099879 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1127 16:04:05.099886 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1127 16:04:05.108199 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1127 16:04:05.108225 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1127 16:04:05.108231 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1127 16:04:05.108269 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1127 16:04:05.108276 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1127 16:04:05.108281 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1127 16:04:05.108284 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1127 16:04:05.108289 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1127 16:04:05.114674 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-27T16:03:48Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf16f5508845d69184573b51f3052f79123556667cbd3ddf4b2adfb135312a3a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:48Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://608ad63f8f5352dad650253df603590a1cc20bc05d86dd561d659295264ab4b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://608ad63f8f5352dad650253df603590a1cc20bc05d86dd561d659295264ab4b5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:03:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-11-27T16:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:03:45Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:07Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:07 crc kubenswrapper[4707]: I1127 16:04:07.483175 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:07Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:07 crc kubenswrapper[4707]: I1127 16:04:07.502442 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7f481770-a9a1-422a-a707-e841a24a1503\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f0e4a5fa520e0dc048bef42e48e14bd06eb72586ca2b8e145734aa5450aa869d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://36f6a5537d5031d21ea157fbcedea40756eac2c44aa1859c0cfbf3dc9a9d9a13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c7144aa332827972c3ff984eb05b835b88c4d9d05552ac631fbca49f13e193a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8211f74a0c0d16257330ebbb5727247d4229848595af3cb65f1cb752b05f3c37\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-11-27T16:03:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:03:45Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:07Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:07 crc kubenswrapper[4707]: I1127 16:04:07.521088 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:07Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:07 crc kubenswrapper[4707]: I1127 16:04:07.531058 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-bhmsc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d1ec526b-6fb0-4c87-bd87-6aaf843e0c78\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kvs85\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:04:06Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-bhmsc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:07Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:07 crc kubenswrapper[4707]: I1127 16:04:07.548132 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://57225b64e7671f397832e9ba112febbde78ae4c24b37a8f097d4bdeda8931c0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6960cfa0c6d0dedc432f90bf267ce02fe8264924a43865f79eadd9affbe955f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:07Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:07 crc kubenswrapper[4707]: I1127 16:04:07.568969 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:05Z\\\",\\\"message\\\":\\\"containers with 
unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:07Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:07 crc kubenswrapper[4707]: I1127 16:04:07.592969 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-9c4xg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37bcfa90-e50b-4a63-96b2-cd8a0ce5bbf7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hwzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hwzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hwzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hwzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hwzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hwzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hwzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:04:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-9c4xg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:07Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:07 crc kubenswrapper[4707]: I1127 16:04:07.617625 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xkmt7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"55af9c67-18ce-46f1-a761-d11ce16f42d6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6xmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6xmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919
d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6xmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6xmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCou
nt\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6xmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6xmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-con
troller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/va
r/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6xmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6xmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c327fee553e5575dec76bc4456bc59424c89a9eea360cd4b23b594eff4c67a8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c327fee553e5575dec76bc4456bc59424c89a9eea360cd4b23b594eff4c67a8d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:04:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:04:07Z\\\"}},\\\"vol
umeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6xmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:04:06Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-xkmt7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:07Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:07 crc kubenswrapper[4707]: I1127 16:04:07.629288 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:07Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:07 crc kubenswrapper[4707]: I1127 16:04:07.643207 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b00fa8691d479d1db194704ff73bb85f1108c9541dd09ed4c1b02d0a3f0ab53d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:07Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:07 crc kubenswrapper[4707]: I1127 16:04:07.664762 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-js6mm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ca48c08-f39d-41a2-847a-c893a2111492\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bjt58\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:04:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-js6mm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:07Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:07 crc kubenswrapper[4707]: I1127 16:04:07.682240 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f300806f-b97e-4453-b011-19442ca1240d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a25770e0e3eebd7e807f3775101a1cd5d47073e4a86850da475fecd4a9c88ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7412dcdb5a249551f558a01a8fecef5f873e5e0a16e917d5cbc1ca74a917f228\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://791f332aec1698d02da5ec585ad2fbfcb77529f24abc0e3a77581be38d7ce277\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a47eb0592819f0bc514399e163c88e2c63111aa0ec4a28f00f809ce1bdf2aa8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f12d0d89698cd6f702c595882dc7ed97fbc585b8a4ef3a8b3755c984a823dd9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-27T16:04:05Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1127 16:03:58.832133 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1127 16:03:58.833970 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3721107874/tls.crt::/tmp/serving-cert-3721107874/tls.key\\\\\\\"\\\\nI1127 16:04:05.094602 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1127 16:04:05.099813 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1127 16:04:05.099850 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1127 16:04:05.099879 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1127 16:04:05.099886 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1127 16:04:05.108199 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1127 16:04:05.108225 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1127 16:04:05.108231 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1127 16:04:05.108269 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1127 16:04:05.108276 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1127 16:04:05.108281 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1127 16:04:05.108284 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1127 16:04:05.108289 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1127 16:04:05.114674 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-27T16:03:48Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf16f5508845d69184573b51f3052f79123556667cbd3ddf4b2adfb135312a3a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:48Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://608ad63f8f5352dad650253df603590a1cc20bc05d86dd561d659295264ab4b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://608ad63f8f5352dad650253df603590a1cc20bc05d86dd561d659295264ab4b5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:03:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-11-27T16:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:03:45Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:07Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:07 crc kubenswrapper[4707]: I1127 16:04:07.696749 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:07Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:07 crc kubenswrapper[4707]: I1127 16:04:07.708485 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-c995m" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a83beb0d-8dd1-434a-ace2-933f98e3956f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5xtzp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5xtzp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:04:06Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-c995m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:07Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:07 crc kubenswrapper[4707]: I1127 16:04:07.725044 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:05Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:07Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:07 crc kubenswrapper[4707]: I1127 16:04:07.741544 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-bhmsc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d1ec526b-6fb0-4c87-bd87-6aaf843e0c78\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f3b764fd994edba08f5e909683bd065ecf09fb7c9906457e37bb91343aad14a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kvs85\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:04:06Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-bhmsc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:07Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:07 crc kubenswrapper[4707]: I1127 16:04:07.755930 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7f481770-a9a1-422a-a707-e841a24a1503\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f0e4a5fa520e0dc048bef42e48e14bd06eb72586ca2b8e145734aa5450aa869d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluste
r-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://36f6a5537d5031d21ea157fbcedea40756eac2c44aa1859c0cfbf3dc9a9d9a13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c7144aa332827972c3ff984eb05b835b88c4d9d05552ac631fbca49f13e193a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kuber
netes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8211f74a0c0d16257330ebbb5727247d4229848595af3cb65f1cb752b05f3c37\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:03:45Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:07Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:07 crc kubenswrapper[4707]: I1127 16:04:07.772142 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:05Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:07Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:07 crc kubenswrapper[4707]: I1127 16:04:07.787737 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://57225b64e7671f397832e9ba112febbde78ae4c24b37a8f097d4bdeda8931c0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6960cfa0c6d0dedc432f90bf267ce02fe8264924a43865f79eadd9affbe955f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:07Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:07 crc kubenswrapper[4707]: I1127 16:04:07.801400 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:05Z\\\",\\\"message\\\":\\\"containers with 
unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:07Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:07 crc kubenswrapper[4707]: I1127 16:04:07.822256 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-9c4xg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37bcfa90-e50b-4a63-96b2-cd8a0ce5bbf7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hwzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hwzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hwzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hwzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hwzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hwzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hwzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:04:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-9c4xg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:07Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:07 crc kubenswrapper[4707]: I1127 16:04:07.842491 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xkmt7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"55af9c67-18ce-46f1-a761-d11ce16f42d6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6xmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6xmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919
d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6xmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6xmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCou
nt\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6xmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6xmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-con
troller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/va
r/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6xmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6xmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c327fee553e5575dec76bc4456bc59424c89a9eea360cd4b23b594eff4c67a8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c327fee553e5575dec76bc4456bc59424c89a9eea360cd4b23b594eff4c67a8d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:04:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:04:07Z\\\"}},\\\"vol
umeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6xmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:04:06Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-xkmt7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:07Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:07 crc kubenswrapper[4707]: I1127 16:04:07.861698 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-js6mm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ca48c08-f39d-41a2-847a-c893a2111492\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"message\\\":\\\"containers 
with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":
\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bjt58\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:04:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-js6mm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:07Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:07 crc kubenswrapper[4707]: I1127 16:04:07.877768 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b00fa8691d479d1db194704ff73bb85f1108c9541dd09ed4c1b02d0a3f0ab53d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909
e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:07Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:08 crc kubenswrapper[4707]: I1127 16:04:08.325805 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-etcd/etcd-crc" Nov 27 16:04:08 crc kubenswrapper[4707]: I1127 16:04:08.340638 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-etcd/etcd-crc" Nov 27 16:04:08 crc kubenswrapper[4707]: I1127 16:04:08.343112 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://57225b64e7671f397832e9ba112febbde78ae4c24b37a8f097d4bdeda8931c0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6960cfa0c6d0dedc432f90bf267ce02fe8264924a43865f79eadd9affbe955f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:08Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:08 crc kubenswrapper[4707]: I1127 16:04:08.343548 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-crc"] Nov 27 16:04:08 crc kubenswrapper[4707]: I1127 16:04:08.358040 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:05Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:08Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:08 crc kubenswrapper[4707]: I1127 16:04:08.373107 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-9c4xg" err="failed to 
patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37bcfa90-e50b-4a63-96b2-cd8a0ce5bbf7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hwzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hwzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hwzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hwzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hwzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hwzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hwzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:04:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-9c4xg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:08Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:08 crc kubenswrapper[4707]: I1127 16:04:08.421963 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xkmt7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"55af9c67-18ce-46f1-a761-d11ce16f42d6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6xmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6xmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919
d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6xmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6xmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCou
nt\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6xmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6xmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-con
troller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/va
r/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6xmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6xmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c327fee553e5575dec76bc4456bc59424c89a9eea360cd4b23b594eff4c67a8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c327fee553e5575dec76bc4456bc59424c89a9eea360cd4b23b594eff4c67a8d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:04:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:04:07Z\\\"}},\\\"vol
umeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6xmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:04:06Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-xkmt7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:08Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:08 crc kubenswrapper[4707]: I1127 16:04:08.435012 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-js6mm" event={"ID":"9ca48c08-f39d-41a2-847a-c893a2111492","Type":"ContainerStarted","Data":"be6a704ee152614d36d4831778b016c009b0aab2d6cf1c2ec9767f8ecdeb7af6"} Nov 27 16:04:08 crc kubenswrapper[4707]: I1127 16:04:08.439122 4707 generic.go:334] "Generic (PLEG): container finished" podID="55af9c67-18ce-46f1-a761-d11ce16f42d6" containerID="c327fee553e5575dec76bc4456bc59424c89a9eea360cd4b23b594eff4c67a8d" exitCode=0 Nov 27 16:04:08 crc kubenswrapper[4707]: I1127 16:04:08.439162 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xkmt7" event={"ID":"55af9c67-18ce-46f1-a761-d11ce16f42d6","Type":"ContainerDied","Data":"c327fee553e5575dec76bc4456bc59424c89a9eea360cd4b23b594eff4c67a8d"} Nov 27 16:04:08 crc kubenswrapper[4707]: I1127 16:04:08.439224 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xkmt7" event={"ID":"55af9c67-18ce-46f1-a761-d11ce16f42d6","Type":"ContainerStarted","Data":"44699a190cf1065e537f588d7479a10aab8fd4ffd1df2348c616d999ffdf7077"} Nov 27 
16:04:08 crc kubenswrapper[4707]: I1127 16:04:08.439235 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xkmt7" event={"ID":"55af9c67-18ce-46f1-a761-d11ce16f42d6","Type":"ContainerStarted","Data":"e6432d53914cee9e016c2134d99e48fa9ff1c05c0b88033992c2e0c70fc40d2e"} Nov 27 16:04:08 crc kubenswrapper[4707]: I1127 16:04:08.439245 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xkmt7" event={"ID":"55af9c67-18ce-46f1-a761-d11ce16f42d6","Type":"ContainerStarted","Data":"3718d2f738ef780c738566c163d52afe355de6b8eb6004aa8e1dd5f37aa84d68"} Nov 27 16:04:08 crc kubenswrapper[4707]: I1127 16:04:08.439255 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xkmt7" event={"ID":"55af9c67-18ce-46f1-a761-d11ce16f42d6","Type":"ContainerStarted","Data":"2a4eadca4b464dec02ca7aba6bc23d95b9c3965294d4b0224299279c630f3718"} Nov 27 16:04:08 crc kubenswrapper[4707]: I1127 16:04:08.439265 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xkmt7" event={"ID":"55af9c67-18ce-46f1-a761-d11ce16f42d6","Type":"ContainerStarted","Data":"93825fe5d9cf2d63c3e723b063f417f7e92442b14365852f935a06d316f443fe"} Nov 27 16:04:08 crc kubenswrapper[4707]: I1127 16:04:08.439276 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xkmt7" event={"ID":"55af9c67-18ce-46f1-a761-d11ce16f42d6","Type":"ContainerStarted","Data":"0ae3fab13b078702f87ca8c586c9592c4554c95ead662d81ca65b50f368b6dcb"} Nov 27 16:04:08 crc kubenswrapper[4707]: I1127 16:04:08.448435 4707 generic.go:334] "Generic (PLEG): container finished" podID="37bcfa90-e50b-4a63-96b2-cd8a0ce5bbf7" containerID="8cd7c5e0c832499cb9ddc7b18c7d260de06df34208f86284ebe77964bfcf1c12" exitCode=0 Nov 27 16:04:08 crc kubenswrapper[4707]: I1127 16:04:08.448550 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-multus/multus-additional-cni-plugins-9c4xg" event={"ID":"37bcfa90-e50b-4a63-96b2-cd8a0ce5bbf7","Type":"ContainerDied","Data":"8cd7c5e0c832499cb9ddc7b18c7d260de06df34208f86284ebe77964bfcf1c12"} Nov 27 16:04:08 crc kubenswrapper[4707]: I1127 16:04:08.454455 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"6cf743e8776aa2de4138704d9e163af8baf7b8e5db139c13775673cf32a1ee88"} Nov 27 16:04:08 crc kubenswrapper[4707]: I1127 16:04:08.457035 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-c995m" event={"ID":"a83beb0d-8dd1-434a-ace2-933f98e3956f","Type":"ContainerStarted","Data":"419d9b9f3f298028cf5f49afd438f1de90b8d1203cdfb09b924878b59e0fb723"} Nov 27 16:04:08 crc kubenswrapper[4707]: I1127 16:04:08.464907 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:08Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:08 crc kubenswrapper[4707]: I1127 16:04:08.485425 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b00fa8691d479d1db194704ff73bb85f1108c9541dd09ed4c1b02d0a3f0ab53d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:08Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:08 crc kubenswrapper[4707]: I1127 16:04:08.506206 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-js6mm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ca48c08-f39d-41a2-847a-c893a2111492\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bjt58\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:04:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-js6mm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:08Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:08 crc kubenswrapper[4707]: I1127 16:04:08.528604 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-c995m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a83beb0d-8dd1-434a-ace2-933f98e3956f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5xtzp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5xtzp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],
\\\"startTime\\\":\\\"2025-11-27T16:04:06Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-c995m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:08Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:08 crc kubenswrapper[4707]: I1127 16:04:08.543832 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f300806f-b97e-4453-b011-19442ca1240d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a25770e0e3eebd7e807f3775101a1cd5d47073e4a86850da475fecd4a9c88ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7412dcdb5a249551f558a01a8fecef5f873e5e0a16e917d5cbc1ca74a917f228\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://791f332aec1698d02da5ec585ad2fbfcb77529f24abc0e3a77581be38d7ce277\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a47eb0592819f0bc514399e163c88e2c63111aa0ec4a28f00f809ce1bdf2aa8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f12d0d89698cd6f702c595882dc7ed97fbc585b8a4ef3a8b3755c984a823dd9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-27T16:04:05Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1127 16:03:58.832133 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1127 16:03:58.833970 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3721107874/tls.crt::/tmp/serving-cert-3721107874/tls.key\\\\\\\"\\\\nI1127 16:04:05.094602 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1127 16:04:05.099813 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1127 16:04:05.099850 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1127 16:04:05.099879 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1127 16:04:05.099886 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1127 16:04:05.108199 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1127 16:04:05.108225 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1127 16:04:05.108231 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1127 16:04:05.108269 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1127 16:04:05.108276 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1127 16:04:05.108281 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1127 16:04:05.108284 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1127 16:04:05.108289 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1127 16:04:05.114674 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-27T16:03:48Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf16f5508845d69184573b51f3052f79123556667cbd3ddf4b2adfb135312a3a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:48Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://608ad63f8f5352dad650253df603590a1cc20bc05d86dd561d659295264ab4b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://608ad63f8f5352dad650253df603590a1cc20bc05d86dd561d659295264ab4b5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:03:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-11-27T16:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:03:45Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:08Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:08 crc kubenswrapper[4707]: I1127 16:04:08.558777 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:08Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:08 crc kubenswrapper[4707]: I1127 16:04:08.576803 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7f481770-a9a1-422a-a707-e841a24a1503\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f0e4a5fa520e0dc048bef42e48e14bd06eb72586ca2b8e145734aa5450aa869d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://36f6a5537d5031d21ea157fbcedea40756eac2c44aa1859c0cfbf3dc9a9d9a13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c7144aa332827972c3ff984eb05b835b88c4d9d05552ac631fbca49f13e193a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8211f74a0c0d16257330ebbb5727247d4229848595af3cb65f1cb752b05f3c37\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-11-27T16:03:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:03:45Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:08Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:08 crc kubenswrapper[4707]: I1127 16:04:08.596451 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:08Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:08 crc kubenswrapper[4707]: I1127 16:04:08.608096 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-bhmsc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d1ec526b-6fb0-4c87-bd87-6aaf843e0c78\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f3b764fd994edba08f5e909683bd065ecf09fb7c9906457e37bb91343aad14a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kvs85\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:04:06Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-bhmsc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:08Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:08 crc kubenswrapper[4707]: I1127 16:04:08.622423 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7f481770-a9a1-422a-a707-e841a24a1503\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f0e4a5fa520e0dc048bef42e48e14bd06eb72586ca2b8e145734aa5450aa869d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluste
r-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://36f6a5537d5031d21ea157fbcedea40756eac2c44aa1859c0cfbf3dc9a9d9a13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c7144aa332827972c3ff984eb05b835b88c4d9d05552ac631fbca49f13e193a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kuber
netes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8211f74a0c0d16257330ebbb5727247d4229848595af3cb65f1cb752b05f3c37\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:03:45Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:08Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:08 crc kubenswrapper[4707]: I1127 16:04:08.635252 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:05Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:08Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:08 crc kubenswrapper[4707]: I1127 16:04:08.645683 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-bhmsc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d1ec526b-6fb0-4c87-bd87-6aaf843e0c78\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f3b764fd994edba08f5e909683bd065ecf09fb7c9906457e37bb91343aad14a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kvs85\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:04:06Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-bhmsc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:08Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:08 crc kubenswrapper[4707]: I1127 16:04:08.661144 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-9c4xg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37bcfa90-e50b-4a63-96b2-cd8a0ce5bbf7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hwzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8cd7c5e0c832499cb9ddc7b18c7d260de06df34208f86284ebe77964bfcf1c12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8cd7c5e0c832499cb9ddc7b18c7d260de06df34208f86284ebe77964bfcf1c12\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:04:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hwzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hwzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube
-api-access-7hwzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hwzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hwzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\
\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hwzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:04:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-9c4xg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:08Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:08 crc kubenswrapper[4707]: I1127 16:04:08.681775 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xkmt7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"55af9c67-18ce-46f1-a761-d11ce16f42d6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6xmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6xmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6xmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6xmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6xmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6xmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6xmv\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6xmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c327fee553e5575dec76bc4456bc59424c89a9eea360cd4b23b594eff4c67a8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c327fee553e5575dec76bc4456bc59424c89a9eea360cd4b23b594eff4c67a8d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:04:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6xmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:04:06Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-xkmt7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:08Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:08 crc kubenswrapper[4707]: I1127 16:04:08.697074 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:08Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:08 crc kubenswrapper[4707]: I1127 16:04:08.712035 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://57225b64e7671f397832e9ba112febbde78ae4c24b37a8f097d4bdeda8931c0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6960cfa0c6d0dedc432f90bf267ce02fe8264924a43865f79eadd9affbe955f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:08Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:08 crc kubenswrapper[4707]: I1127 16:04:08.727693 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6cf743e8776aa2de4138704d9e163af8baf7b8e5db139c13775673cf32a1ee88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-11-27T16:04:08Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:08 crc kubenswrapper[4707]: I1127 16:04:08.742609 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b00fa8691d479d1db194704ff73bb85f1108c9541dd09ed4c1b02d0a3f0ab53d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:08Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:08 crc kubenswrapper[4707]: I1127 16:04:08.755572 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-js6mm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ca48c08-f39d-41a2-847a-c893a2111492\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://be6a704ee152614d36d4831778b016c009b0aab2d6cf1c2ec9767f8ecdeb7af6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\
\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bjt58\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\
"}],\\\"startTime\\\":\\\"2025-11-27T16:04:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-js6mm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:08Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:08 crc kubenswrapper[4707]: I1127 16:04:08.776595 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"59e09a52-f64d-41da-a23b-7d3e34e8f109\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://489349489e52f4a113203c0c196cfe1ef96e97c6fc3c3d777a28e90f22aaee92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ea14f8eb17dbb350f7c0ad2285b4bb5ca2d8b2aae6840646d68a397ac2043d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://775fc674c0dab97e0c5a2548d467b973c592458dad1b88ce92344486a3a28eb5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f502a2429578531cb3ebbfde6bb4c049eed201a496ff3772c06ca28cace89b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://30b52b26bdca5b0705186872e6dd2eba3ece6436e866a4a5e30f4c2ecdbbaf20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b6be834802616cdf10f8b694bb0ce1b88
28d0885e8801c5d890e74837d672a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b6be834802616cdf10f8b694bb0ce1b8828d0885e8801c5d890e74837d672a8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:03:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://de1e5852273af33e41f21c4df29c3e32cbfd6b2eb200b64d9e5353a85718074c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://de1e5852273af33e41f21c4df29c3e32cbfd6b2eb200b64d9e5353a85718074c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:03:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:03:47Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://c6f7eb0e4b78b6d2e316bd2c8f2ad3081c8eb0ca3c55a7063ccf281572ba2992\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"
lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c6f7eb0e4b78b6d2e316bd2c8f2ad3081c8eb0ca3c55a7063ccf281572ba2992\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:03:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:03:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:03:45Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:08Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:08 crc kubenswrapper[4707]: I1127 16:04:08.791113 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f300806f-b97e-4453-b011-19442ca1240d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a25770e0e3eebd7e807f3775101a1cd5d47073e4a86850da475fecd4a9c88ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7412dcdb5a249551f558a01a8fecef5f873e5e0a16e917d5cbc1ca74a917f228\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://791f332aec1698d02da5ec585ad2fbfcb77529f24abc0e3a77581be38d7ce277\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a47eb0592819f0bc514399e163c88e2c63111aa0ec4a28f00f809ce1bdf2aa8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f12d0d89698cd6f702c595882dc7ed97fbc585b8a4ef3a8b3755c984a823dd9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-27T16:04:05Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1127 16:03:58.832133 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1127 16:03:58.833970 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3721107874/tls.crt::/tmp/serving-cert-3721107874/tls.key\\\\\\\"\\\\nI1127 16:04:05.094602 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1127 16:04:05.099813 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1127 16:04:05.099850 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1127 16:04:05.099879 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1127 16:04:05.099886 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1127 16:04:05.108199 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1127 16:04:05.108225 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1127 16:04:05.108231 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1127 16:04:05.108269 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1127 16:04:05.108276 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1127 16:04:05.108281 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1127 16:04:05.108284 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1127 16:04:05.108289 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1127 16:04:05.114674 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-27T16:03:48Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf16f5508845d69184573b51f3052f79123556667cbd3ddf4b2adfb135312a3a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:48Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://608ad63f8f5352dad650253df603590a1cc20bc05d86dd561d659295264ab4b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://608ad63f8f5352dad650253df603590a1cc20bc05d86dd561d659295264ab4b5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:03:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-11-27T16:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:03:45Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:08Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:08 crc kubenswrapper[4707]: I1127 16:04:08.805942 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:08Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:08 crc kubenswrapper[4707]: I1127 16:04:08.821346 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-c995m" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a83beb0d-8dd1-434a-ace2-933f98e3956f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://419d9b9f3f298028cf5f49afd438f1de90b8d1203cdfb09b924878b59e0fb723\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5xtzp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6227d8aa794cce73ef7607c3e6cf8f853c829bf5
6493acf04c0b555ddd264ba5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5xtzp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:04:06Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-c995m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:08Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:08 crc kubenswrapper[4707]: I1127 16:04:08.875219 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 27 16:04:08 crc kubenswrapper[4707]: E1127 16:04:08.875487 4707 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-27 16:04:12.875448633 +0000 UTC m=+28.506897601 (durationBeforeRetry 4s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 27 16:04:08 crc kubenswrapper[4707]: I1127 16:04:08.976224 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 27 16:04:08 crc kubenswrapper[4707]: I1127 16:04:08.976291 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 27 16:04:08 crc kubenswrapper[4707]: I1127 16:04:08.976317 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 27 16:04:08 crc kubenswrapper[4707]: I1127 16:04:08.976354 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 27 16:04:08 crc kubenswrapper[4707]: E1127 16:04:08.976463 4707 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Nov 27 16:04:08 crc kubenswrapper[4707]: E1127 16:04:08.976491 4707 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Nov 27 16:04:08 crc kubenswrapper[4707]: E1127 16:04:08.976563 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-11-27 16:04:12.976543015 +0000 UTC m=+28.607991783 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Nov 27 16:04:08 crc kubenswrapper[4707]: E1127 16:04:08.976566 4707 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Nov 27 16:04:08 crc kubenswrapper[4707]: E1127 16:04:08.976520 4707 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Nov 27 16:04:08 crc kubenswrapper[4707]: E1127 16:04:08.976601 4707 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Nov 27 16:04:08 crc kubenswrapper[4707]: E1127 16:04:08.976589 4707 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Nov 27 16:04:08 crc kubenswrapper[4707]: E1127 16:04:08.976617 4707 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 27 16:04:08 crc kubenswrapper[4707]: E1127 16:04:08.976621 4707 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object 
"openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 27 16:04:08 crc kubenswrapper[4707]: E1127 16:04:08.976642 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-11-27 16:04:12.976600307 +0000 UTC m=+28.608049125 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Nov 27 16:04:08 crc kubenswrapper[4707]: E1127 16:04:08.976686 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-11-27 16:04:12.976670158 +0000 UTC m=+28.608118926 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 27 16:04:08 crc kubenswrapper[4707]: E1127 16:04:08.976709 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-11-27 16:04:12.976701199 +0000 UTC m=+28.608149967 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 27 16:04:09 crc kubenswrapper[4707]: I1127 16:04:09.195098 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 27 16:04:09 crc kubenswrapper[4707]: I1127 16:04:09.195208 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 27 16:04:09 crc kubenswrapper[4707]: E1127 16:04:09.195240 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 27 16:04:09 crc kubenswrapper[4707]: E1127 16:04:09.195361 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 27 16:04:09 crc kubenswrapper[4707]: I1127 16:04:09.195503 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 27 16:04:09 crc kubenswrapper[4707]: E1127 16:04:09.195790 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 27 16:04:09 crc kubenswrapper[4707]: I1127 16:04:09.474015 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-9c4xg" event={"ID":"37bcfa90-e50b-4a63-96b2-cd8a0ce5bbf7","Type":"ContainerStarted","Data":"4438fd6ab08cd53dff20617ccadaa81e4bddbd7ced310f190d02613079f96f64"} Nov 27 16:04:09 crc kubenswrapper[4707]: I1127 16:04:09.502969 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"59e09a52-f64d-41da-a23b-7d3e34e8f109\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://489349489e52f4a113203c0c196cfe1ef96e97c6fc3c3d777a28e90f22aaee92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ea14f8eb17dbb350f7c0ad2285b4bb5ca2d8b2aae6840646d68a397ac2043d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://775fc674c0dab97e0c5a2548d467b973c592458dad1b88ce92344486a3a28eb5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f502a2429578531cb3ebbfde6bb4c049eed201a496ff3772c06ca28cace89b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://30b52b26bdca5b0705186872e6dd2eba3ece6436e866a4a5e30f4c2ecdbbaf20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b6be834802616cdf10f8b694bb0ce1b8828d0885e8801c5d890e74837d672a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b6be834802616cdf10f8b694bb0ce1b8828d0885e8801c5d890e74837d672a8\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-11-27T16:03:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://de1e5852273af33e41f21c4df29c3e32cbfd6b2eb200b64d9e5353a85718074c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://de1e5852273af33e41f21c4df29c3e32cbfd6b2eb200b64d9e5353a85718074c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:03:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:03:47Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://c6f7eb0e4b78b6d2e316bd2c8f2ad3081c8eb0ca3c55a7063ccf281572ba2992\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c6f7eb0e4b78b6d2e316bd2c8f2ad3081c8eb0ca3c55a7063ccf281572ba2992\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:03:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:03:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:03:45Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:09Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:09 crc kubenswrapper[4707]: I1127 16:04:09.525004 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f300806f-b97e-4453-b011-19442ca1240d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a25770e0e3eebd7e807f3775101a1cd5d47073e4a86850da475fecd4a9c88ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7412dcdb5a249551f558a01a8fecef5f873e5e0a16e917d5cbc1ca74a917f228\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://791f332aec1698d02da5ec585ad2fbfcb77529f24abc0e3a77581be38d7ce277\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a47eb0592819f0bc514399e163c88e2c63111aa0ec4a28f00f809ce1bdf2aa8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f12d0d89698cd6f702c595882dc7ed97fbc585b8a4ef3a8b3755c984a823dd9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-27T16:04:05Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1127 16:03:58.832133 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1127 16:03:58.833970 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3721107874/tls.crt::/tmp/serving-cert-3721107874/tls.key\\\\\\\"\\\\nI1127 16:04:05.094602 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1127 16:04:05.099813 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1127 16:04:05.099850 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1127 16:04:05.099879 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1127 16:04:05.099886 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1127 16:04:05.108199 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1127 16:04:05.108225 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1127 16:04:05.108231 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1127 16:04:05.108269 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1127 16:04:05.108276 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1127 16:04:05.108281 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1127 16:04:05.108284 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1127 16:04:05.108289 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1127 16:04:05.114674 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-27T16:03:48Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf16f5508845d69184573b51f3052f79123556667cbd3ddf4b2adfb135312a3a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:48Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://608ad63f8f5352dad650253df603590a1cc20bc05d86dd561d659295264ab4b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://608ad63f8f5352dad650253df603590a1cc20bc05d86dd561d659295264ab4b5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:03:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-11-27T16:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:03:45Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:09Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:09 crc kubenswrapper[4707]: I1127 16:04:09.540753 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:09Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:09 crc kubenswrapper[4707]: I1127 16:04:09.555126 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-c995m" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a83beb0d-8dd1-434a-ace2-933f98e3956f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://419d9b9f3f298028cf5f49afd438f1de90b8d1203cdfb09b924878b59e0fb723\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5xtzp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6227d8aa794cce73ef7607c3e6cf8f853c829bf5
6493acf04c0b555ddd264ba5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5xtzp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:04:06Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-c995m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:09Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:09 crc kubenswrapper[4707]: I1127 16:04:09.570863 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7f481770-a9a1-422a-a707-e841a24a1503\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f0e4a5fa520e0dc048bef42e48e14bd06eb72586ca2b8e145734aa5450aa869d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://36f6a5537d5031d21ea157fbcedea40756eac2c44aa1859c0cfbf3dc9a9d9a13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c7144aa332827972c3ff984eb05b835b88c4d9d05552ac631fbca49f13e193a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8211f74a0c0d16257330ebbb5727247d4229848595af3cb65f1cb752b05f3c37\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-11-27T16:03:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:03:45Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:09Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:09 crc kubenswrapper[4707]: I1127 16:04:09.583742 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:09Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:09 crc kubenswrapper[4707]: I1127 16:04:09.593607 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-bhmsc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d1ec526b-6fb0-4c87-bd87-6aaf843e0c78\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f3b764fd994edba08f5e909683bd065ecf09fb7c9906457e37bb91343aad14a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kvs85\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:04:06Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-bhmsc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:09Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:09 crc kubenswrapper[4707]: I1127 16:04:09.608310 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-9c4xg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37bcfa90-e50b-4a63-96b2-cd8a0ce5bbf7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hwzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8cd7c5e0c832499cb9ddc7b18c7d260de06df34208f86284ebe77964bfcf1c12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8cd7c5e0c832499cb9ddc7b18c7d260de06df34208f86284ebe77964bfcf1c12\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:04:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hwzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4438fd6ab08cd53dff20617ccadaa81e4bddbd7ced310f190d02613079f96f64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hwzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath
\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hwzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hwzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hwzm\\\",\\\"readOn
ly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hwzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:04:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-9c4xg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:09Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:09 crc kubenswrapper[4707]: I1127 16:04:09.637344 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xkmt7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"55af9c67-18ce-46f1-a761-d11ce16f42d6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6xmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6xmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6xmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6xmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6xmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6xmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6xmv\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6xmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c327fee553e5575dec76bc4456bc59424c89a9eea360cd4b23b594eff4c67a8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c327fee553e5575dec76bc4456bc59424c89a9eea360cd4b23b594eff4c67a8d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:04:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6xmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:04:06Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-xkmt7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:09Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:09 crc kubenswrapper[4707]: I1127 16:04:09.651794 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:09Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:09 crc kubenswrapper[4707]: I1127 16:04:09.667152 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://57225b64e7671f397832e9ba112febbde78ae4c24b37a8f097d4bdeda8931c0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6960cfa0c6d0dedc432f90bf267ce02fe8264924a43865f79eadd9affbe955f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:09Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:09 crc kubenswrapper[4707]: I1127 16:04:09.681396 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6cf743e8776aa2de4138704d9e163af8baf7b8e5db139c13775673cf32a1ee88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-11-27T16:04:09Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:09 crc kubenswrapper[4707]: I1127 16:04:09.694036 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b00fa8691d479d1db194704ff73bb85f1108c9541dd09ed4c1b02d0a3f0ab53d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:09Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:09 crc kubenswrapper[4707]: I1127 16:04:09.716734 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-js6mm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ca48c08-f39d-41a2-847a-c893a2111492\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://be6a704ee152614d36d4831778b016c009b0aab2d6cf1c2ec9767f8ecdeb7af6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\
\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bjt58\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\
"}],\\\"startTime\\\":\\\"2025-11-27T16:04:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-js6mm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:09Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:10 crc kubenswrapper[4707]: I1127 16:04:10.481291 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xkmt7" event={"ID":"55af9c67-18ce-46f1-a761-d11ce16f42d6","Type":"ContainerStarted","Data":"db3362aeb65b368d2d9181cc898270fffb60938d0e9d32a862130d06a9f572a0"} Nov 27 16:04:10 crc kubenswrapper[4707]: I1127 16:04:10.483416 4707 generic.go:334] "Generic (PLEG): container finished" podID="37bcfa90-e50b-4a63-96b2-cd8a0ce5bbf7" containerID="4438fd6ab08cd53dff20617ccadaa81e4bddbd7ced310f190d02613079f96f64" exitCode=0 Nov 27 16:04:10 crc kubenswrapper[4707]: I1127 16:04:10.483454 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-9c4xg" event={"ID":"37bcfa90-e50b-4a63-96b2-cd8a0ce5bbf7","Type":"ContainerDied","Data":"4438fd6ab08cd53dff20617ccadaa81e4bddbd7ced310f190d02613079f96f64"} Nov 27 16:04:10 crc kubenswrapper[4707]: I1127 16:04:10.497906 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6cf743e8776aa2de4138704d9e163af8baf7b8e5db139c13775673cf32a1ee88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-11-27T16:04:10Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:10 crc kubenswrapper[4707]: I1127 16:04:10.518713 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-9c4xg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37bcfa90-e50b-4a63-96b2-cd8a0ce5bbf7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hwzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8cd7c5e0c832499cb9ddc7b18c7d260de06df34208f86284ebe77964bfcf1c12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8cd7c5e0c832499cb9ddc7b18c7d260de06df34208f86284ebe77964bfcf1c12\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:04:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hwzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4438fd6ab08cd53dff20617ccadaa81e4bddbd7ced310f190d02613079f96f64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4438fd6ab08cd53dff20617ccadaa81e4bddbd7ced310f190d02613079f96f64\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:04:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:04:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hwzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodIn
itializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hwzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hwzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-
release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hwzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hwzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:04:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-9c4xg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:10Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:10 crc kubenswrapper[4707]: I1127 16:04:10.542688 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xkmt7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"55af9c67-18ce-46f1-a761-d11ce16f42d6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6xmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6xmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6xmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6xmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6xmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6xmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6xmv\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6xmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c327fee553e5575dec76bc4456bc59424c89a9eea360cd4b23b594eff4c67a8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c327fee553e5575dec76bc4456bc59424c89a9eea360cd4b23b594eff4c67a8d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:04:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6xmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:04:06Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-xkmt7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:10Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:10 crc kubenswrapper[4707]: I1127 16:04:10.556623 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:10Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:10 crc kubenswrapper[4707]: I1127 16:04:10.568260 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://57225b64e7671f397832e9ba112febbde78ae4c24b37a8f097d4bdeda8931c0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6960cfa0c6d0dedc432f90bf267ce02fe8264924a43865f79eadd9affbe955f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:10Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:10 crc kubenswrapper[4707]: I1127 16:04:10.582535 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b00fa8691d479d1db194704ff73bb85f1108c9541dd09ed4c1b02d0a3f0ab53d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:10Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:10 crc kubenswrapper[4707]: I1127 16:04:10.596818 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-js6mm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ca48c08-f39d-41a2-847a-c893a2111492\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://be6a704ee152614d36d4831778b016c009b0aab2d6cf1c2ec9767f8ecdeb7af6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\"
,\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bjt58\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:04:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-js6mm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: 
certificate has expired or is not yet valid: current time 2025-11-27T16:04:10Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:10 crc kubenswrapper[4707]: I1127 16:04:10.622484 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"59e09a52-f64d-41da-a23b-7d3e34e8f109\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://489349489e52f4a113203c0c196cfe1ef96e97c6fc3c3d777a28e90f22aaee92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kuber
netes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ea14f8eb17dbb350f7c0ad2285b4bb5ca2d8b2aae6840646d68a397ac2043d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://775fc674c0dab97e0c5a2548d467b973c592458dad1b88ce92344486a3a28eb5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f502a2429578531cb3ebbfde6bb4c049eed201a496ff3772c06ca28cace89b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f
8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://30b52b26bdca5b0705186872e6dd2eba3ece6436e866a4a5e30f4c2ecdbbaf20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b6be834802616cdf10f8b694bb0ce1b8828d0885e8801c5d890e74837d672a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a6731
4731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b6be834802616cdf10f8b694bb0ce1b8828d0885e8801c5d890e74837d672a8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:03:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://de1e5852273af33e41f21c4df29c3e32cbfd6b2eb200b64d9e5353a85718074c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://de1e5852273af33e41f21c4df29c3e32cbfd6b2eb200b64d9e5353a85718074c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:03:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:03:47Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://c6f7eb0e4b78b6d2e316bd2c8f2ad3081c8eb0ca3c55a7063ccf281572ba2992\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c6f7eb0e4b78b6d2e316bd2c8f2ad3081c8eb0ca3c55a7063ccf281572ba2992\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-
27T16:03:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:03:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:03:45Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:10Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:10 crc kubenswrapper[4707]: I1127 16:04:10.646212 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f300806f-b97e-4453-b011-19442ca1240d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a25770e0e3eebd7e807f3775101a1cd5d47073e4a86850da475fecd4a9c88ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7412dcdb5a249551f558a01a8fecef5f873e5e0a16e917d5cbc1ca74a917f228\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://791f332aec1698d02da5ec585ad2fbfcb77529f24abc0e3a77581be38d7ce277\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a47eb0592819f0bc514399e163c88e2c63111aa0ec4a28f00f809ce1bdf2aa8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f12d0d89698cd6f702c595882dc7ed97fbc585b8a4ef3a8b3755c984a823dd9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-27T16:04:05Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1127 16:03:58.832133 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1127 16:03:58.833970 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3721107874/tls.crt::/tmp/serving-cert-3721107874/tls.key\\\\\\\"\\\\nI1127 16:04:05.094602 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1127 16:04:05.099813 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1127 16:04:05.099850 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1127 16:04:05.099879 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1127 16:04:05.099886 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1127 16:04:05.108199 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1127 16:04:05.108225 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1127 16:04:05.108231 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1127 16:04:05.108269 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1127 16:04:05.108276 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1127 16:04:05.108281 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1127 16:04:05.108284 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1127 16:04:05.108289 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1127 16:04:05.114674 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-27T16:03:48Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf16f5508845d69184573b51f3052f79123556667cbd3ddf4b2adfb135312a3a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:48Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://608ad63f8f5352dad650253df603590a1cc20bc05d86dd561d659295264ab4b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://608ad63f8f5352dad650253df603590a1cc20bc05d86dd561d659295264ab4b5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:03:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-11-27T16:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:03:45Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:10Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:10 crc kubenswrapper[4707]: I1127 16:04:10.664212 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:10Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:10 crc kubenswrapper[4707]: I1127 16:04:10.677066 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-c995m" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a83beb0d-8dd1-434a-ace2-933f98e3956f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://419d9b9f3f298028cf5f49afd438f1de90b8d1203cdfb09b924878b59e0fb723\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5xtzp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6227d8aa794cce73ef7607c3e6cf8f853c829bf5
6493acf04c0b555ddd264ba5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5xtzp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:04:06Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-c995m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:10Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:10 crc kubenswrapper[4707]: I1127 16:04:10.691287 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7f481770-a9a1-422a-a707-e841a24a1503\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f0e4a5fa520e0dc048bef42e48e14bd06eb72586ca2b8e145734aa5450aa869d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://36f6a5537d5031d21ea157fbcedea40756eac2c44aa1859c0cfbf3dc9a9d9a13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c7144aa332827972c3ff984eb05b835b88c4d9d05552ac631fbca49f13e193a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8211f74a0c0d16257330ebbb5727247d4229848595af3cb65f1cb752b05f3c37\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-11-27T16:03:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:03:45Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:10Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:10 crc kubenswrapper[4707]: I1127 16:04:10.704513 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:10Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:10 crc kubenswrapper[4707]: I1127 16:04:10.715746 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-bhmsc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d1ec526b-6fb0-4c87-bd87-6aaf843e0c78\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f3b764fd994edba08f5e909683bd065ecf09fb7c9906457e37bb91343aad14a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kvs85\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:04:06Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-bhmsc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:10Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:11 crc kubenswrapper[4707]: I1127 16:04:11.194469 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 27 16:04:11 crc kubenswrapper[4707]: E1127 16:04:11.195072 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 27 16:04:11 crc kubenswrapper[4707]: I1127 16:04:11.194482 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 27 16:04:11 crc kubenswrapper[4707]: E1127 16:04:11.195190 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 27 16:04:11 crc kubenswrapper[4707]: I1127 16:04:11.194482 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 27 16:04:11 crc kubenswrapper[4707]: E1127 16:04:11.195276 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 27 16:04:11 crc kubenswrapper[4707]: I1127 16:04:11.484153 4707 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 27 16:04:11 crc kubenswrapper[4707]: I1127 16:04:11.487087 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:11 crc kubenswrapper[4707]: I1127 16:04:11.487164 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:11 crc kubenswrapper[4707]: I1127 16:04:11.487183 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:11 crc kubenswrapper[4707]: I1127 16:04:11.487474 4707 kubelet_node_status.go:76] "Attempting to register node" node="crc" Nov 27 16:04:11 crc kubenswrapper[4707]: I1127 16:04:11.493676 4707 generic.go:334] "Generic (PLEG): container finished" podID="37bcfa90-e50b-4a63-96b2-cd8a0ce5bbf7" containerID="19a31a44d1ecfee004f2a705df22a188782cd9f235848ddf5f48bd740833c6a8" exitCode=0 Nov 27 16:04:11 crc kubenswrapper[4707]: I1127 16:04:11.493737 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-9c4xg" event={"ID":"37bcfa90-e50b-4a63-96b2-cd8a0ce5bbf7","Type":"ContainerDied","Data":"19a31a44d1ecfee004f2a705df22a188782cd9f235848ddf5f48bd740833c6a8"} Nov 27 16:04:11 crc kubenswrapper[4707]: 
I1127 16:04:11.500548 4707 kubelet_node_status.go:115] "Node was previously registered" node="crc" Nov 27 16:04:11 crc kubenswrapper[4707]: I1127 16:04:11.502093 4707 kubelet_node_status.go:79] "Successfully registered node" node="crc" Nov 27 16:04:11 crc kubenswrapper[4707]: I1127 16:04:11.504032 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:11 crc kubenswrapper[4707]: I1127 16:04:11.504075 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:11 crc kubenswrapper[4707]: I1127 16:04:11.504095 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:11 crc kubenswrapper[4707]: I1127 16:04:11.504124 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:11 crc kubenswrapper[4707]: I1127 16:04:11.504147 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:11Z","lastTransitionTime":"2025-11-27T16:04:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:04:11 crc kubenswrapper[4707]: I1127 16:04:11.516576 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7f481770-a9a1-422a-a707-e841a24a1503\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f0e4a5fa520e0dc048bef42e48e14bd06eb72586ca2b8e145734aa5450aa869d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://36f6a5537d5
031d21ea157fbcedea40756eac2c44aa1859c0cfbf3dc9a9d9a13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c7144aa332827972c3ff984eb05b835b88c4d9d05552ac631fbca49f13e193a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8211f74a0c0d16257330ebbb5727247d4229848595af3cb65f1cb752b05f3c37\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:03:45Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:11Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:11 crc kubenswrapper[4707]: E1127 16:04:11.530049 4707 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-27T16:04:11Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-27T16:04:11Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:11Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-27T16:04:11Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-27T16:04:11Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:11Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"3f35bf4c-cf6f-4914-b0bb-f9cc6056fcaf\\\",\\\"systemUUID\\\":\\\"dd7e61b5-f6f4-4240-af78-d8fe5d6daad3\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:11Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:11 crc kubenswrapper[4707]: I1127 16:04:11.536236 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:11 crc kubenswrapper[4707]: I1127 16:04:11.536318 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:11 crc kubenswrapper[4707]: I1127 16:04:11.536340 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:11 crc kubenswrapper[4707]: I1127 16:04:11.536396 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:11 crc kubenswrapper[4707]: I1127 16:04:11.536417 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:11Z","lastTransitionTime":"2025-11-27T16:04:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:04:11 crc kubenswrapper[4707]: I1127 16:04:11.538028 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:05Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:11Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:11 crc kubenswrapper[4707]: I1127 16:04:11.553777 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-bhmsc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d1ec526b-6fb0-4c87-bd87-6aaf843e0c78\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f3b764fd994edba08f5e909683bd065ecf09fb7c9906457e37bb91343aad14a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kvs85\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:04:06Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-bhmsc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:11Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:11 crc kubenswrapper[4707]: E1127 16:04:11.558385 4707 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-27T16:04:11Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-27T16:04:11Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:11Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-27T16:04:11Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-27T16:04:11Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:11Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"3f35bf4c-cf6f-4914-b0bb-f9cc6056fcaf\\\",\\\"systemUUID\\\":\\\"dd7e61b5-f6f4-4240-af78-d8fe5d6daad3\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:11Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:11 crc kubenswrapper[4707]: I1127 16:04:11.563885 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:11 crc kubenswrapper[4707]: I1127 16:04:11.563924 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:11 crc kubenswrapper[4707]: I1127 16:04:11.563955 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:11 crc kubenswrapper[4707]: I1127 16:04:11.563978 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:11 crc kubenswrapper[4707]: I1127 16:04:11.563995 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:11Z","lastTransitionTime":"2025-11-27T16:04:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:04:11 crc kubenswrapper[4707]: I1127 16:04:11.571075 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-9c4xg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37bcfa90-e50b-4a63-96b2-cd8a0ce5bbf7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hwzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8cd7c5e0c832499cb9ddc7b18c7d260de06df34208f86284ebe77964bfcf1c12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8cd7c5e0c832499cb9ddc7b18c7d260de06df34208f86284ebe77964bfcf1c12\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:04:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hwzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4438fd6ab08cd53dff20617ccadaa81e4bddbd7ced310f190d02613079f96f64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4438fd6ab08cd53dff20617ccadaa81e4bddbd7ced310f190d02613079f96f64\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:04:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:04:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hwzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19a31a44d1ecfee004f2a705df22a188782cd9f235848ddf5f48bd740833c6a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://19a31a44d1ecfee004f2a705df22a188782cd9f235848ddf5f48bd740833c6a8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:04:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:04:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hwzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hwzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\
\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hwzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hwzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:04:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-9c4xg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:11Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:11 crc kubenswrapper[4707]: E1127 
16:04:11.582137 4707 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-27T16:04:11Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-27T16:04:11Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:11Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-27T16:04:11Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-27T16:04:11Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:11Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"3f35bf4c-cf6f-4914-b0bb-f9cc6056fcaf\\\",\\\"systemUUID\\\":\\\"dd7e61b5-f6f4-4240-af78-d8fe5d6daad3\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:11Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:11 crc kubenswrapper[4707]: I1127 16:04:11.588368 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:11 crc kubenswrapper[4707]: I1127 16:04:11.588504 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:11 crc kubenswrapper[4707]: I1127 16:04:11.588564 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:11 crc kubenswrapper[4707]: I1127 16:04:11.588647 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:11 crc kubenswrapper[4707]: I1127 16:04:11.588718 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:11Z","lastTransitionTime":"2025-11-27T16:04:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:04:11 crc kubenswrapper[4707]: I1127 16:04:11.594772 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xkmt7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"55af9c67-18ce-46f1-a761-d11ce16f42d6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6xmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6xmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6xmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6xmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6xmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6xmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6xmv\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6xmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c327fee553e5575dec76bc4456bc59424c89a9eea360cd4b23b594eff4c67a8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c327fee553e5575dec76bc4456bc59424c89a9eea360cd4b23b594eff4c67a8d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:04:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6xmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:04:06Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-xkmt7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:11Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:11 crc kubenswrapper[4707]: E1127 16:04:11.603787 4707 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-27T16:04:11Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-27T16:04:11Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:11Z\\\",\\\"message\\\":\\\"kubelet has no disk 
pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-27T16:04:11Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-27T16:04:11Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:11Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2
ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9810067
4616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.
io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a07
2c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa73
83b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"3f35bf4c-cf6f-4914-b0bb-f9cc6056fcaf\\\",\\\"systemUUID\\\":\\\"dd7e61b5-f6f4-4240-af78-d8fe5d6daad3\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:11Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:11 crc kubenswrapper[4707]: I1127 16:04:11.608200 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:11 crc kubenswrapper[4707]: I1127 16:04:11.608254 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:11 crc kubenswrapper[4707]: I1127 16:04:11.608270 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:11 crc kubenswrapper[4707]: I1127 16:04:11.608287 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:11 crc kubenswrapper[4707]: I1127 16:04:11.608300 4707 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:11Z","lastTransitionTime":"2025-11-27T16:04:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:04:11 crc kubenswrapper[4707]: I1127 16:04:11.609915 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:05Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:11Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:11 crc kubenswrapper[4707]: E1127 16:04:11.621067 4707 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-27T16:04:11Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-27T16:04:11Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:11Z\\\",\\\"message\\\":\\\"kubelet 
has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-27T16:04:11Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-27T16:04:11Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:11Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800
f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\
":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256
:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc300
5909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"3f35bf4c-cf6f-4914-b0bb-f9cc6056fcaf\\\",\\\"systemUUID\\\":\\\"dd7e61b5-f6f4-4240-af78-d8fe5d6daad3\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:11Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:11 crc kubenswrapper[4707]: E1127 16:04:11.621181 4707 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Nov 27 16:04:11 crc kubenswrapper[4707]: I1127 16:04:11.622887 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:11 crc kubenswrapper[4707]: I1127 16:04:11.622912 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:11 crc kubenswrapper[4707]: I1127 16:04:11.622920 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:11 crc kubenswrapper[4707]: I1127 16:04:11.622936 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:11 crc kubenswrapper[4707]: I1127 16:04:11.622947 4707 setters.go:603] "Node 
became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:11Z","lastTransitionTime":"2025-11-27T16:04:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:04:11 crc kubenswrapper[4707]: I1127 16:04:11.625061 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://57225b64e7671f397832e9ba112febbde78ae4c24b37a8f097d4bdeda8931c0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath
\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6960cfa0c6d0dedc432f90bf267ce02fe8264924a43865f79eadd9affbe955f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:11Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:11 crc kubenswrapper[4707]: I1127 16:04:11.638695 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6cf743e8776aa2de4138704d9e163af8baf7b8e5db139c13775673cf32a1ee88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-11-27T16:04:11Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:11 crc kubenswrapper[4707]: I1127 16:04:11.652947 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b00fa8691d479d1db194704ff73bb85f1108c9541dd09ed4c1b02d0a3f0ab53d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:11Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:11 crc kubenswrapper[4707]: I1127 16:04:11.667101 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-js6mm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ca48c08-f39d-41a2-847a-c893a2111492\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://be6a704ee152614d36d4831778b016c009b0aab2d6cf1c2ec9767f8ecdeb7af6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\
\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bjt58\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\
"}],\\\"startTime\\\":\\\"2025-11-27T16:04:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-js6mm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:11Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:11 crc kubenswrapper[4707]: I1127 16:04:11.688489 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"59e09a52-f64d-41da-a23b-7d3e34e8f109\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://489349489e52f4a113203c0c196cfe1ef96e97c6fc3c3d777a28e90f22aaee92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ea14f8eb17dbb350f7c0ad2285b4bb5ca2d8b2aae6840646d68a397ac2043d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://775fc674c0dab97e0c5a2548d467b973c592458dad1b88ce92344486a3a28eb5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f502a2429578531cb3ebbfde6bb4c049eed201a496ff3772c06ca28cace89b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://30b52b26bdca5b0705186872e6dd2eba3ece6436e866a4a5e30f4c2ecdbbaf20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b6be834802616cdf10f8b694bb0ce1b88
28d0885e8801c5d890e74837d672a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b6be834802616cdf10f8b694bb0ce1b8828d0885e8801c5d890e74837d672a8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:03:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://de1e5852273af33e41f21c4df29c3e32cbfd6b2eb200b64d9e5353a85718074c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://de1e5852273af33e41f21c4df29c3e32cbfd6b2eb200b64d9e5353a85718074c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:03:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:03:47Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://c6f7eb0e4b78b6d2e316bd2c8f2ad3081c8eb0ca3c55a7063ccf281572ba2992\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"
lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c6f7eb0e4b78b6d2e316bd2c8f2ad3081c8eb0ca3c55a7063ccf281572ba2992\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:03:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:03:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:03:45Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:11Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:11 crc kubenswrapper[4707]: I1127 16:04:11.703419 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f300806f-b97e-4453-b011-19442ca1240d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a25770e0e3eebd7e807f3775101a1cd5d47073e4a86850da475fecd4a9c88ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7412dcdb5a249551f558a01a8fecef5f873e5e0a16e917d5cbc1ca74a917f228\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://791f332aec1698d02da5ec585ad2fbfcb77529f24abc0e3a77581be38d7ce277\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a47eb0592819f0bc514399e163c88e2c63111aa0ec4a28f00f809ce1bdf2aa8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f12d0d89698cd6f702c595882dc7ed97fbc585b8a4ef3a8b3755c984a823dd9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-27T16:04:05Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1127 16:03:58.832133 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1127 16:03:58.833970 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3721107874/tls.crt::/tmp/serving-cert-3721107874/tls.key\\\\\\\"\\\\nI1127 16:04:05.094602 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1127 16:04:05.099813 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1127 16:04:05.099850 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1127 16:04:05.099879 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1127 16:04:05.099886 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1127 16:04:05.108199 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1127 16:04:05.108225 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1127 16:04:05.108231 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1127 16:04:05.108269 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1127 16:04:05.108276 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1127 16:04:05.108281 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1127 16:04:05.108284 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1127 16:04:05.108289 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1127 16:04:05.114674 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-27T16:03:48Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf16f5508845d69184573b51f3052f79123556667cbd3ddf4b2adfb135312a3a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:48Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://608ad63f8f5352dad650253df603590a1cc20bc05d86dd561d659295264ab4b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://608ad63f8f5352dad650253df603590a1cc20bc05d86dd561d659295264ab4b5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:03:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-11-27T16:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:03:45Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:11Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:11 crc kubenswrapper[4707]: I1127 16:04:11.718024 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:11Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:11 crc kubenswrapper[4707]: I1127 16:04:11.726438 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:11 crc kubenswrapper[4707]: I1127 16:04:11.726471 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:11 crc kubenswrapper[4707]: I1127 16:04:11.726485 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:11 crc 
kubenswrapper[4707]: I1127 16:04:11.726507 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:11 crc kubenswrapper[4707]: I1127 16:04:11.726520 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:11Z","lastTransitionTime":"2025-11-27T16:04:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:04:11 crc kubenswrapper[4707]: I1127 16:04:11.731705 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-c995m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a83beb0d-8dd1-434a-ace2-933f98e3956f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://419d9b9f3f298028cf5f49afd438f1de90b8d1203cdfb09b924878b59e0fb723\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@
sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5xtzp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6227d8aa794cce73ef7607c3e6cf8f853c829bf56493acf04c0b555ddd264ba5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5xtzp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:04:06Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-c995m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:11Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:11 crc kubenswrapper[4707]: I1127 16:04:11.829938 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:11 crc kubenswrapper[4707]: I1127 16:04:11.830004 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:11 crc kubenswrapper[4707]: I1127 16:04:11.830024 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:11 crc kubenswrapper[4707]: I1127 16:04:11.830051 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:11 crc kubenswrapper[4707]: I1127 16:04:11.830068 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:11Z","lastTransitionTime":"2025-11-27T16:04:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:04:11 crc kubenswrapper[4707]: I1127 16:04:11.932749 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:11 crc kubenswrapper[4707]: I1127 16:04:11.933053 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:11 crc kubenswrapper[4707]: I1127 16:04:11.933117 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:11 crc kubenswrapper[4707]: I1127 16:04:11.933183 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:11 crc kubenswrapper[4707]: I1127 16:04:11.933240 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:11Z","lastTransitionTime":"2025-11-27T16:04:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:04:12 crc kubenswrapper[4707]: I1127 16:04:12.036612 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:12 crc kubenswrapper[4707]: I1127 16:04:12.036992 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:12 crc kubenswrapper[4707]: I1127 16:04:12.037083 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:12 crc kubenswrapper[4707]: I1127 16:04:12.037181 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:12 crc kubenswrapper[4707]: I1127 16:04:12.037274 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:12Z","lastTransitionTime":"2025-11-27T16:04:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:04:12 crc kubenswrapper[4707]: I1127 16:04:12.140682 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:12 crc kubenswrapper[4707]: I1127 16:04:12.140798 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:12 crc kubenswrapper[4707]: I1127 16:04:12.140814 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:12 crc kubenswrapper[4707]: I1127 16:04:12.140839 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:12 crc kubenswrapper[4707]: I1127 16:04:12.140860 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:12Z","lastTransitionTime":"2025-11-27T16:04:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:04:12 crc kubenswrapper[4707]: I1127 16:04:12.244534 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:12 crc kubenswrapper[4707]: I1127 16:04:12.244601 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:12 crc kubenswrapper[4707]: I1127 16:04:12.244618 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:12 crc kubenswrapper[4707]: I1127 16:04:12.244645 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:12 crc kubenswrapper[4707]: I1127 16:04:12.244681 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:12Z","lastTransitionTime":"2025-11-27T16:04:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:04:12 crc kubenswrapper[4707]: I1127 16:04:12.361622 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:12 crc kubenswrapper[4707]: I1127 16:04:12.361700 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:12 crc kubenswrapper[4707]: I1127 16:04:12.361723 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:12 crc kubenswrapper[4707]: I1127 16:04:12.361749 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:12 crc kubenswrapper[4707]: I1127 16:04:12.361766 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:12Z","lastTransitionTime":"2025-11-27T16:04:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:04:12 crc kubenswrapper[4707]: I1127 16:04:12.464839 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:12 crc kubenswrapper[4707]: I1127 16:04:12.465141 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:12 crc kubenswrapper[4707]: I1127 16:04:12.465149 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:12 crc kubenswrapper[4707]: I1127 16:04:12.465166 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:12 crc kubenswrapper[4707]: I1127 16:04:12.465176 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:12Z","lastTransitionTime":"2025-11-27T16:04:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:04:12 crc kubenswrapper[4707]: I1127 16:04:12.507039 4707 generic.go:334] "Generic (PLEG): container finished" podID="37bcfa90-e50b-4a63-96b2-cd8a0ce5bbf7" containerID="db8a59276d18141805bc74721d625e2004f2cea1f47efc90b3290c68eba2a35b" exitCode=0 Nov 27 16:04:12 crc kubenswrapper[4707]: I1127 16:04:12.507089 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-9c4xg" event={"ID":"37bcfa90-e50b-4a63-96b2-cd8a0ce5bbf7","Type":"ContainerDied","Data":"db8a59276d18141805bc74721d625e2004f2cea1f47efc90b3290c68eba2a35b"} Nov 27 16:04:12 crc kubenswrapper[4707]: I1127 16:04:12.529422 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7f481770-a9a1-422a-a707-e841a24a1503\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f0e4a5fa520e0dc048bef42e48e14bd06eb72586ca2b8e145734aa5450aa869d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/ope
nshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://36f6a5537d5031d21ea157fbcedea40756eac2c44aa1859c0cfbf3dc9a9d9a13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c7144aa332827972c3ff984eb05b835b88c4d9d05552ac631fbca49f13e193a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:47Z\\\"}
},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8211f74a0c0d16257330ebbb5727247d4229848595af3cb65f1cb752b05f3c37\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:03:45Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:12Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:12 crc kubenswrapper[4707]: I1127 16:04:12.545092 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:05Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:12Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:12 crc kubenswrapper[4707]: I1127 16:04:12.566142 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-bhmsc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d1ec526b-6fb0-4c87-bd87-6aaf843e0c78\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f3b764fd994edba08f5e909683bd065ecf09fb7c9906457e37bb91343aad14a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kvs85\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:04:06Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-bhmsc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:12Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:12 crc kubenswrapper[4707]: I1127 16:04:12.579854 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:12 crc kubenswrapper[4707]: I1127 16:04:12.579918 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:12 crc kubenswrapper[4707]: I1127 16:04:12.579935 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:12 crc kubenswrapper[4707]: I1127 16:04:12.579961 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:12 crc kubenswrapper[4707]: I1127 16:04:12.579979 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:12Z","lastTransitionTime":"2025-11-27T16:04:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:04:12 crc kubenswrapper[4707]: I1127 16:04:12.586768 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:05Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:12Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:12 crc kubenswrapper[4707]: I1127 16:04:12.606622 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://57225b64e7671f397832e9ba112febbde78ae4c24b37a8f097d4bdeda8931c0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6960cfa0c6d0dedc432f90bf267ce02fe8264924a43865f79eadd9affbe955f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:12Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:12 crc kubenswrapper[4707]: I1127 16:04:12.622812 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6cf743e8776aa2de4138704d9e163af8baf7b8e5db139c13775673cf32a1ee88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-11-27T16:04:12Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:12 crc kubenswrapper[4707]: I1127 16:04:12.644231 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-9c4xg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37bcfa90-e50b-4a63-96b2-cd8a0ce5bbf7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hwzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8cd7c5e0c832499cb9ddc7b18c7d260de06df34208f86284ebe77964bfcf1c12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8cd7c5e0c832499cb9ddc7b18c7d260de06df34208f86284ebe77964bfcf1c12\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:04:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hwzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4438fd6ab08cd53dff20617ccadaa81e4bddbd7ced310f190d02613079f96f64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4438fd6ab08cd53dff20617ccadaa81e4bddbd7ced310f190d02613079f96f64\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:04:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:04:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hwzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19a31a44d1ecfee004f2a705df22a188782cd9f235848ddf5f48bd740833c6a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://19a31a44d1ecfee004f2a705df22a188782cd9f235848ddf5f48bd740833c6a8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:04:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:04:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hwzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://db8a59276d18141805bc74721d625e2004f2cea1f47efc90b3290c68eba2a35b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://db8a59276d18141805bc74721d625e2004f2cea1f47efc90b3290c68eba2a35b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:04:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:04:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hwzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hwzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hwzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:04:06Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-additional-cni-plugins-9c4xg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:12Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:12 crc kubenswrapper[4707]: I1127 16:04:12.678801 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xkmt7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"55af9c67-18ce-46f1-a761-d11ce16f42d6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6xmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6xmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6xmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6xmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6xmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6xmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6xmv\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6xmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c327fee553e5575dec76bc4456bc59424c89a9eea360cd4b23b594eff4c67a8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c327fee553e5575dec76bc4456bc59424c89a9eea360cd4b23b594eff4c67a8d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:04:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6xmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:04:06Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-xkmt7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:12Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:12 crc kubenswrapper[4707]: I1127 16:04:12.686091 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:12 crc kubenswrapper[4707]: I1127 16:04:12.686149 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:12 crc kubenswrapper[4707]: I1127 16:04:12.686170 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:12 crc kubenswrapper[4707]: I1127 16:04:12.686201 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:12 crc kubenswrapper[4707]: I1127 16:04:12.686220 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:12Z","lastTransitionTime":"2025-11-27T16:04:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:04:12 crc kubenswrapper[4707]: I1127 16:04:12.699749 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b00fa8691d479d1db194704ff73bb85f1108c9541dd09ed4c1b02d0a3f0ab53d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:12Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:12 crc kubenswrapper[4707]: I1127 16:04:12.719927 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-js6mm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ca48c08-f39d-41a2-847a-c893a2111492\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://be6a704ee152614d36d4831778b016c009b0aab2d6cf1c2ec9767f8ecdeb7af6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\
\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bjt58\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16
:04:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-js6mm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:12Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:12 crc kubenswrapper[4707]: I1127 16:04:12.749121 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"59e09a52-f64d-41da-a23b-7d3e34e8f109\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://489349489e52f4a113203c0c196cfe1ef96e97c6fc3c3d777a28e90f22aaee92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"20
25-11-27T16:03:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ea14f8eb17dbb350f7c0ad2285b4bb5ca2d8b2aae6840646d68a397ac2043d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://775fc674c0dab97e0c5a2548d467b973c592458dad1b88ce92344486a3a28eb5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-po
d-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f502a2429578531cb3ebbfde6bb4c049eed201a496ff3772c06ca28cace89b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://30b52b26bdca5b0705186872e6dd2eba3ece6436e866a4a5e30f4c2ecdbbaf20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b6be834802616cdf10f8b694bb0ce1b8828d0885e8801c5d890e74837d672a8\\\",\\\"
image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b6be834802616cdf10f8b694bb0ce1b8828d0885e8801c5d890e74837d672a8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:03:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://de1e5852273af33e41f21c4df29c3e32cbfd6b2eb200b64d9e5353a85718074c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://de1e5852273af33e41f21c4df29c3e32cbfd6b2eb200b64d9e5353a85718074c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:03:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:03:47Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://c6f7eb0e4b78b6d2e316bd2c8f2ad3081c8eb0ca3c55a7063ccf281572ba2992\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-
resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c6f7eb0e4b78b6d2e316bd2c8f2ad3081c8eb0ca3c55a7063ccf281572ba2992\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:03:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:03:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:03:45Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:12Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:12 crc kubenswrapper[4707]: I1127 16:04:12.769394 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f300806f-b97e-4453-b011-19442ca1240d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a25770e0e3eebd7e807f3775101a1cd5d47073e4a86850da475fecd4a9c88ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7412dcdb5a249551f558a01a8fecef5f873e5e0a16e917d5cbc1ca74a917f228\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://791f332aec1698d02da5ec585ad2fbfcb77529f24abc0e3a77581be38d7ce277\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a47eb0592819f0bc514399e163c88e2c63111aa0ec4a28f00f809ce1bdf2aa8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f12d0d89698cd6f702c595882dc7ed97fbc585b8a4ef3a8b3755c984a823dd9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-27T16:04:05Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1127 16:03:58.832133 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1127 16:03:58.833970 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3721107874/tls.crt::/tmp/serving-cert-3721107874/tls.key\\\\\\\"\\\\nI1127 16:04:05.094602 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1127 16:04:05.099813 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1127 16:04:05.099850 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1127 16:04:05.099879 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1127 16:04:05.099886 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1127 16:04:05.108199 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1127 16:04:05.108225 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1127 16:04:05.108231 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1127 16:04:05.108269 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1127 16:04:05.108276 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1127 16:04:05.108281 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1127 16:04:05.108284 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1127 16:04:05.108289 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1127 16:04:05.114674 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-27T16:03:48Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf16f5508845d69184573b51f3052f79123556667cbd3ddf4b2adfb135312a3a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:48Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://608ad63f8f5352dad650253df603590a1cc20bc05d86dd561d659295264ab4b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://608ad63f8f5352dad650253df603590a1cc20bc05d86dd561d659295264ab4b5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:03:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-11-27T16:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:03:45Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:12Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:12 crc kubenswrapper[4707]: I1127 16:04:12.783690 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:12Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:12 crc kubenswrapper[4707]: I1127 16:04:12.789107 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:12 crc kubenswrapper[4707]: I1127 16:04:12.789177 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:12 crc kubenswrapper[4707]: I1127 16:04:12.789198 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:12 crc 
kubenswrapper[4707]: I1127 16:04:12.789226 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:12 crc kubenswrapper[4707]: I1127 16:04:12.789246 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:12Z","lastTransitionTime":"2025-11-27T16:04:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:04:12 crc kubenswrapper[4707]: I1127 16:04:12.798836 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-c995m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a83beb0d-8dd1-434a-ace2-933f98e3956f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://419d9b9f3f298028cf5f49afd438f1de90b8d1203cdfb09b924878b59e0fb723\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@
sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5xtzp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6227d8aa794cce73ef7607c3e6cf8f853c829bf56493acf04c0b555ddd264ba5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5xtzp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:04:06Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-c995m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:12Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:12 crc kubenswrapper[4707]: I1127 16:04:12.892142 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:12 crc kubenswrapper[4707]: I1127 16:04:12.892200 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:12 crc kubenswrapper[4707]: I1127 16:04:12.892210 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:12 crc kubenswrapper[4707]: I1127 16:04:12.892233 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:12 crc kubenswrapper[4707]: I1127 16:04:12.892246 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:12Z","lastTransitionTime":"2025-11-27T16:04:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:04:12 crc kubenswrapper[4707]: I1127 16:04:12.918598 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 27 16:04:12 crc kubenswrapper[4707]: E1127 16:04:12.918872 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-27 16:04:20.918840208 +0000 UTC m=+36.550289016 (durationBeforeRetry 8s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 27 16:04:12 crc kubenswrapper[4707]: I1127 16:04:12.995888 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:12 crc kubenswrapper[4707]: I1127 16:04:12.995956 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:12 crc kubenswrapper[4707]: I1127 16:04:12.995973 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:12 crc kubenswrapper[4707]: I1127 16:04:12.995998 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:12 crc kubenswrapper[4707]: 
I1127 16:04:12.996016 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:12Z","lastTransitionTime":"2025-11-27T16:04:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:04:13 crc kubenswrapper[4707]: I1127 16:04:13.020020 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 27 16:04:13 crc kubenswrapper[4707]: I1127 16:04:13.020086 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 27 16:04:13 crc kubenswrapper[4707]: I1127 16:04:13.020131 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 27 16:04:13 crc kubenswrapper[4707]: I1127 16:04:13.020165 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: 
\"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 27 16:04:13 crc kubenswrapper[4707]: E1127 16:04:13.020290 4707 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Nov 27 16:04:13 crc kubenswrapper[4707]: E1127 16:04:13.020358 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-11-27 16:04:21.0203363 +0000 UTC m=+36.651785108 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Nov 27 16:04:13 crc kubenswrapper[4707]: E1127 16:04:13.020861 4707 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Nov 27 16:04:13 crc kubenswrapper[4707]: E1127 16:04:13.020893 4707 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Nov 27 16:04:13 crc kubenswrapper[4707]: E1127 16:04:13.020915 4707 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object 
"openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 27 16:04:13 crc kubenswrapper[4707]: E1127 16:04:13.020963 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-11-27 16:04:21.020948586 +0000 UTC m=+36.652397394 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 27 16:04:13 crc kubenswrapper[4707]: E1127 16:04:13.021045 4707 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Nov 27 16:04:13 crc kubenswrapper[4707]: E1127 16:04:13.021084 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-11-27 16:04:21.021072129 +0000 UTC m=+36.652520937 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Nov 27 16:04:13 crc kubenswrapper[4707]: E1127 16:04:13.021162 4707 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Nov 27 16:04:13 crc kubenswrapper[4707]: E1127 16:04:13.021179 4707 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Nov 27 16:04:13 crc kubenswrapper[4707]: E1127 16:04:13.021193 4707 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 27 16:04:13 crc kubenswrapper[4707]: E1127 16:04:13.021229 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-11-27 16:04:21.021217482 +0000 UTC m=+36.652666280 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 27 16:04:13 crc kubenswrapper[4707]: I1127 16:04:13.099719 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:13 crc kubenswrapper[4707]: I1127 16:04:13.099790 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:13 crc kubenswrapper[4707]: I1127 16:04:13.099810 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:13 crc kubenswrapper[4707]: I1127 16:04:13.099838 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:13 crc kubenswrapper[4707]: I1127 16:04:13.099855 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:13Z","lastTransitionTime":"2025-11-27T16:04:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:04:13 crc kubenswrapper[4707]: I1127 16:04:13.195049 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 27 16:04:13 crc kubenswrapper[4707]: I1127 16:04:13.195175 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 27 16:04:13 crc kubenswrapper[4707]: E1127 16:04:13.195304 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 27 16:04:13 crc kubenswrapper[4707]: I1127 16:04:13.195449 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 27 16:04:13 crc kubenswrapper[4707]: E1127 16:04:13.195762 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 27 16:04:13 crc kubenswrapper[4707]: E1127 16:04:13.195891 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 27 16:04:13 crc kubenswrapper[4707]: I1127 16:04:13.202745 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:13 crc kubenswrapper[4707]: I1127 16:04:13.202829 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:13 crc kubenswrapper[4707]: I1127 16:04:13.202864 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:13 crc kubenswrapper[4707]: I1127 16:04:13.202899 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:13 crc kubenswrapper[4707]: I1127 16:04:13.202923 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:13Z","lastTransitionTime":"2025-11-27T16:04:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:04:13 crc kubenswrapper[4707]: I1127 16:04:13.306895 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:13 crc kubenswrapper[4707]: I1127 16:04:13.307020 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:13 crc kubenswrapper[4707]: I1127 16:04:13.307045 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:13 crc kubenswrapper[4707]: I1127 16:04:13.307083 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:13 crc kubenswrapper[4707]: I1127 16:04:13.307111 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:13Z","lastTransitionTime":"2025-11-27T16:04:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:04:13 crc kubenswrapper[4707]: I1127 16:04:13.411639 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:13 crc kubenswrapper[4707]: I1127 16:04:13.411712 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:13 crc kubenswrapper[4707]: I1127 16:04:13.411731 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:13 crc kubenswrapper[4707]: I1127 16:04:13.411758 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:13 crc kubenswrapper[4707]: I1127 16:04:13.411781 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:13Z","lastTransitionTime":"2025-11-27T16:04:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:04:13 crc kubenswrapper[4707]: I1127 16:04:13.517521 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:13 crc kubenswrapper[4707]: I1127 16:04:13.517586 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:13 crc kubenswrapper[4707]: I1127 16:04:13.517606 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:13 crc kubenswrapper[4707]: I1127 16:04:13.517634 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:13 crc kubenswrapper[4707]: I1127 16:04:13.517656 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:13Z","lastTransitionTime":"2025-11-27T16:04:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:04:13 crc kubenswrapper[4707]: I1127 16:04:13.526901 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xkmt7" event={"ID":"55af9c67-18ce-46f1-a761-d11ce16f42d6","Type":"ContainerStarted","Data":"dbb7a8d9915fa6019298b207c3f4bf09fd27323f12d5dc7930f1efcc2371a11a"} Nov 27 16:04:13 crc kubenswrapper[4707]: I1127 16:04:13.527299 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-xkmt7" Nov 27 16:04:13 crc kubenswrapper[4707]: I1127 16:04:13.527360 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-xkmt7" Nov 27 16:04:13 crc kubenswrapper[4707]: I1127 16:04:13.527404 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-xkmt7" Nov 27 16:04:13 crc kubenswrapper[4707]: I1127 16:04:13.538416 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-9c4xg" event={"ID":"37bcfa90-e50b-4a63-96b2-cd8a0ce5bbf7","Type":"ContainerStarted","Data":"21efa1504e98747329f8a9220126ea259edde61bd4d272ea55691e9952950252"} Nov 27 16:04:13 crc kubenswrapper[4707]: I1127 16:04:13.547646 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-bhmsc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d1ec526b-6fb0-4c87-bd87-6aaf843e0c78\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f3b764fd994edba08f5e909683bd065ecf09fb7c9906457e37bb91343aad14a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kvs85\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:04:06Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-bhmsc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:13Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:13 crc kubenswrapper[4707]: I1127 16:04:13.566961 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7f481770-a9a1-422a-a707-e841a24a1503\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f0e4a5fa520e0dc048bef42e48e14bd06eb72586ca2b8e145734aa5450aa869d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluste
r-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://36f6a5537d5031d21ea157fbcedea40756eac2c44aa1859c0cfbf3dc9a9d9a13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c7144aa332827972c3ff984eb05b835b88c4d9d05552ac631fbca49f13e193a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kuber
netes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8211f74a0c0d16257330ebbb5727247d4229848595af3cb65f1cb752b05f3c37\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:03:45Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:13Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:13 crc kubenswrapper[4707]: I1127 16:04:13.570998 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-xkmt7" Nov 27 16:04:13 crc kubenswrapper[4707]: I1127 16:04:13.580610 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-xkmt7" Nov 27 16:04:13 crc 
kubenswrapper[4707]: I1127 16:04:13.584194 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:05Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:13Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:13 crc kubenswrapper[4707]: I1127 16:04:13.603532 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:05Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:13Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:13 crc kubenswrapper[4707]: I1127 16:04:13.626379 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:13 crc kubenswrapper[4707]: I1127 16:04:13.626439 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:13 crc 
kubenswrapper[4707]: I1127 16:04:13.626454 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:13 crc kubenswrapper[4707]: I1127 16:04:13.626478 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:13 crc kubenswrapper[4707]: I1127 16:04:13.626495 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:13Z","lastTransitionTime":"2025-11-27T16:04:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:04:13 crc kubenswrapper[4707]: I1127 16:04:13.629398 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://57225b64e7671f397832e9ba112febbde78ae4c24b37a8f097d4bdeda8931c0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6960cfa0c6d0dedc432f90bf267ce02fe8264924a43865f79eadd9affbe955f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:13Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:13 crc kubenswrapper[4707]: I1127 16:04:13.648345 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6cf743e8776aa2de4138704d9e163af8baf7b8e5db139c13775673cf32a1ee88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-11-27T16:04:13Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:13 crc kubenswrapper[4707]: I1127 16:04:13.669631 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-9c4xg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37bcfa90-e50b-4a63-96b2-cd8a0ce5bbf7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hwzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8cd7c5e0c832499cb9ddc7b18c7d260de06df34208f86284ebe77964bfcf1c12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8cd7c5e0c832499cb9ddc7b18c7d260de06df34208f86284ebe77964bfcf1c12\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:04:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hwzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4438fd6ab08cd53dff20617ccadaa81e4bddbd7ced310f190d02613079f96f64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4438fd6ab08cd53dff20617ccadaa81e4bddbd7ced310f190d02613079f96f64\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:04:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:04:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hwzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19a31a44d1ecfee004f2a705df22a188782cd9f235848ddf5f48bd740833c6a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://19a31a44d1ecfee004f2a705df22a188782cd9f235848ddf5f48bd740833c6a8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:04:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:04:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hwzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://db8a59276d18141805bc74721d625e2004f2cea1f47efc90b3290c68eba2a35b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://db8a59276d18141805bc74721d625e2004f2cea1f47efc90b3290c68eba2a35b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:04:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:04:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hwzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hwzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hwzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:04:06Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-additional-cni-plugins-9c4xg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:13Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:13 crc kubenswrapper[4707]: I1127 16:04:13.689719 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xkmt7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"55af9c67-18ce-46f1-a761-d11ce16f42d6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a4eadca4b464dec02ca7aba6bc23d95b9c3965294d4b0224299279c630f3718\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6xmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3718d2f738ef780c738566c163d52afe355de6b8eb6004aa8e1dd5f37aa84d68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\
",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6xmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://44699a190cf1065e537f588d7479a10aab8fd4ffd1df2348c616d999ffdf7077\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6xmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6432d53914cee9e016c2134d99e48fa9ff1c05c0b88033992c2e0c70fc40d2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:07Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6xmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://93825fe5d9cf2d63c3e723b063f417f7e92442b14365852f935a06d316f443fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6xmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ae3fab13b078702f87ca8c586c9592c4554c95ead662d81ca65b50f368b6dcb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6xmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbb7a8d9915fa6019298b207c3f4bf09fd27323f12d5dc7930f1efcc2371a11a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6xmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://db3362aeb65b368d2d9181cc898270fffb60938d0e9d32a862130d06a9f572a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6xmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c327fee553e5575dec76bc4456bc59424c89a9eea360cd4b23b594eff4c67a8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c327fee553e5575dec76bc4456bc59424c89a9eea360cd4b23b594eff4c67a8d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:04:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6xmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Ru
nning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:04:06Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-xkmt7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:13Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:13 crc kubenswrapper[4707]: I1127 16:04:13.708489 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b00fa8691d479d1db194704ff73bb85f1108c9541dd09ed4c1b02d0a3f0ab53d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-
27T16:04:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:13Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:13 crc kubenswrapper[4707]: I1127 16:04:13.730208 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:13 crc kubenswrapper[4707]: I1127 16:04:13.730265 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:13 crc kubenswrapper[4707]: I1127 16:04:13.730276 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:13 crc kubenswrapper[4707]: I1127 16:04:13.730296 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:13 crc kubenswrapper[4707]: I1127 16:04:13.730310 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:13Z","lastTransitionTime":"2025-11-27T16:04:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:04:13 crc kubenswrapper[4707]: I1127 16:04:13.730839 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-js6mm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ca48c08-f39d-41a2-847a-c893a2111492\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://be6a704ee152614d36d4831778b016c009b0aab2d6cf1c2ec9767f8ecdeb7af6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\
":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bjt58\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:04:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-js6mm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: 
current time 2025-11-27T16:04:13Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:13 crc kubenswrapper[4707]: I1127 16:04:13.752354 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:13Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:13 crc kubenswrapper[4707]: I1127 16:04:13.771830 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-c995m" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a83beb0d-8dd1-434a-ace2-933f98e3956f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://419d9b9f3f298028cf5f49afd438f1de90b8d1203cdfb09b924878b59e0fb723\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5xtzp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6227d8aa794cce73ef7607c3e6cf8f853c829bf5
6493acf04c0b555ddd264ba5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5xtzp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:04:06Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-c995m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:13Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:13 crc kubenswrapper[4707]: I1127 16:04:13.804993 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"59e09a52-f64d-41da-a23b-7d3e34e8f109\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://489349489e52f4a113203c0c196cfe1ef96e97c6fc3c3d777a28e90f22aaee92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ea14f8eb17dbb350f7c0ad2285b4bb5ca2d8b2aae6840646d68a397ac2043d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://775fc674c0dab97e0c5a2548d467b973c592458dad1b88ce92344486a3a28eb5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f502a2429578531cb3ebbfde6bb4c049eed201a496ff3772c06ca28cace89b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://30b52b26bdca5b0705186872e6dd2eba3ece6436e866a4a5e30f4c2ecdbbaf20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b6be834802616cdf10f8b694bb0ce1b8828d0885e8801c5d890e74837d672a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b6be834802616cdf10f8b694bb0ce1b8828d0885e8801c5d890e74837d672a8\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-11-27T16:03:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://de1e5852273af33e41f21c4df29c3e32cbfd6b2eb200b64d9e5353a85718074c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://de1e5852273af33e41f21c4df29c3e32cbfd6b2eb200b64d9e5353a85718074c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:03:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:03:47Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://c6f7eb0e4b78b6d2e316bd2c8f2ad3081c8eb0ca3c55a7063ccf281572ba2992\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c6f7eb0e4b78b6d2e316bd2c8f2ad3081c8eb0ca3c55a7063ccf281572ba2992\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:03:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:03:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:03:45Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:13Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:13 crc kubenswrapper[4707]: I1127 16:04:13.829853 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f300806f-b97e-4453-b011-19442ca1240d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a25770e0e3eebd7e807f3775101a1cd5d47073e4a86850da475fecd4a9c88ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7412dcdb5a249551f558a01a8fecef5f873e5e0a16e917d5cbc1ca74a917f228\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://791f332aec1698d02da5ec585ad2fbfcb77529f24abc0e3a77581be38d7ce277\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a47eb0592819f0bc514399e163c88e2c63111aa0ec4a28f00f809ce1bdf2aa8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f12d0d89698cd6f702c595882dc7ed97fbc585b8a4ef3a8b3755c984a823dd9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-27T16:04:05Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1127 16:03:58.832133 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1127 16:03:58.833970 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3721107874/tls.crt::/tmp/serving-cert-3721107874/tls.key\\\\\\\"\\\\nI1127 16:04:05.094602 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1127 16:04:05.099813 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1127 16:04:05.099850 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1127 16:04:05.099879 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1127 16:04:05.099886 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1127 16:04:05.108199 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1127 16:04:05.108225 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1127 16:04:05.108231 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1127 16:04:05.108269 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1127 16:04:05.108276 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1127 16:04:05.108281 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1127 16:04:05.108284 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1127 16:04:05.108289 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1127 16:04:05.114674 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-27T16:03:48Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf16f5508845d69184573b51f3052f79123556667cbd3ddf4b2adfb135312a3a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:48Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://608ad63f8f5352dad650253df603590a1cc20bc05d86dd561d659295264ab4b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://608ad63f8f5352dad650253df603590a1cc20bc05d86dd561d659295264ab4b5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:03:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-11-27T16:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:03:45Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:13Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:13 crc kubenswrapper[4707]: I1127 16:04:13.832798 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:13 crc kubenswrapper[4707]: I1127 16:04:13.832841 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:13 crc kubenswrapper[4707]: I1127 16:04:13.832859 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:13 crc kubenswrapper[4707]: I1127 16:04:13.832890 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:13 crc kubenswrapper[4707]: I1127 16:04:13.832906 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:13Z","lastTransitionTime":"2025-11-27T16:04:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:04:13 crc kubenswrapper[4707]: I1127 16:04:13.852548 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b00fa8691d479d1db194704ff73bb85f1108c9541dd09ed4c1b02d0a3f0ab53d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:13Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:13 crc kubenswrapper[4707]: I1127 16:04:13.867306 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-js6mm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ca48c08-f39d-41a2-847a-c893a2111492\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://be6a704ee152614d36d4831778b016c009b0aab2d6cf1c2ec9767f8ecdeb7af6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\
\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bjt58\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16
:04:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-js6mm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:13Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:13 crc kubenswrapper[4707]: I1127 16:04:13.902276 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"59e09a52-f64d-41da-a23b-7d3e34e8f109\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://489349489e52f4a113203c0c196cfe1ef96e97c6fc3c3d777a28e90f22aaee92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"20
25-11-27T16:03:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ea14f8eb17dbb350f7c0ad2285b4bb5ca2d8b2aae6840646d68a397ac2043d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://775fc674c0dab97e0c5a2548d467b973c592458dad1b88ce92344486a3a28eb5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-po
d-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f502a2429578531cb3ebbfde6bb4c049eed201a496ff3772c06ca28cace89b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://30b52b26bdca5b0705186872e6dd2eba3ece6436e866a4a5e30f4c2ecdbbaf20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b6be834802616cdf10f8b694bb0ce1b8828d0885e8801c5d890e74837d672a8\\\",\\\"
image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b6be834802616cdf10f8b694bb0ce1b8828d0885e8801c5d890e74837d672a8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:03:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://de1e5852273af33e41f21c4df29c3e32cbfd6b2eb200b64d9e5353a85718074c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://de1e5852273af33e41f21c4df29c3e32cbfd6b2eb200b64d9e5353a85718074c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:03:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:03:47Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://c6f7eb0e4b78b6d2e316bd2c8f2ad3081c8eb0ca3c55a7063ccf281572ba2992\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-
resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c6f7eb0e4b78b6d2e316bd2c8f2ad3081c8eb0ca3c55a7063ccf281572ba2992\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:03:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:03:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:03:45Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:13Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:13 crc kubenswrapper[4707]: I1127 16:04:13.923904 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f300806f-b97e-4453-b011-19442ca1240d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a25770e0e3eebd7e807f3775101a1cd5d47073e4a86850da475fecd4a9c88ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7412dcdb5a249551f558a01a8fecef5f873e5e0a16e917d5cbc1ca74a917f228\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://791f332aec1698d02da5ec585ad2fbfcb77529f24abc0e3a77581be38d7ce277\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a47eb0592819f0bc514399e163c88e2c63111aa0ec4a28f00f809ce1bdf2aa8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f12d0d89698cd6f702c595882dc7ed97fbc585b8a4ef3a8b3755c984a823dd9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-27T16:04:05Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1127 16:03:58.832133 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1127 16:03:58.833970 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3721107874/tls.crt::/tmp/serving-cert-3721107874/tls.key\\\\\\\"\\\\nI1127 16:04:05.094602 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1127 16:04:05.099813 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1127 16:04:05.099850 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1127 16:04:05.099879 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1127 16:04:05.099886 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1127 16:04:05.108199 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1127 16:04:05.108225 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1127 16:04:05.108231 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1127 16:04:05.108269 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1127 16:04:05.108276 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1127 16:04:05.108281 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1127 16:04:05.108284 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1127 16:04:05.108289 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1127 16:04:05.114674 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-27T16:03:48Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf16f5508845d69184573b51f3052f79123556667cbd3ddf4b2adfb135312a3a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:48Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://608ad63f8f5352dad650253df603590a1cc20bc05d86dd561d659295264ab4b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://608ad63f8f5352dad650253df603590a1cc20bc05d86dd561d659295264ab4b5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:03:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-11-27T16:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:03:45Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:13Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:13 crc kubenswrapper[4707]: I1127 16:04:13.936251 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:13 crc kubenswrapper[4707]: I1127 16:04:13.936531 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:13 crc kubenswrapper[4707]: I1127 16:04:13.936622 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:13 crc kubenswrapper[4707]: I1127 16:04:13.936719 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:13 crc kubenswrapper[4707]: I1127 16:04:13.936798 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:13Z","lastTransitionTime":"2025-11-27T16:04:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:04:13 crc kubenswrapper[4707]: I1127 16:04:13.955544 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:13Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:13 crc kubenswrapper[4707]: I1127 16:04:13.974714 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-c995m" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a83beb0d-8dd1-434a-ace2-933f98e3956f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://419d9b9f3f298028cf5f49afd438f1de90b8d1203cdfb09b924878b59e0fb723\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5xtzp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6227d8aa794cce73ef7607c3e6cf8f853c829bf5
6493acf04c0b555ddd264ba5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5xtzp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:04:06Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-c995m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:13Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:13 crc kubenswrapper[4707]: I1127 16:04:13.997932 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7f481770-a9a1-422a-a707-e841a24a1503\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f0e4a5fa520e0dc048bef42e48e14bd06eb72586ca2b8e145734aa5450aa869d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://36f6a5537d5031d21ea157fbcedea40756eac2c44aa1859c0cfbf3dc9a9d9a13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c7144aa332827972c3ff984eb05b835b88c4d9d05552ac631fbca49f13e193a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8211f74a0c0d16257330ebbb5727247d4229848595af3cb65f1cb752b05f3c37\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-11-27T16:03:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:03:45Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:13Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:14 crc kubenswrapper[4707]: I1127 16:04:14.017513 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:14Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:14 crc kubenswrapper[4707]: I1127 16:04:14.035059 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-bhmsc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d1ec526b-6fb0-4c87-bd87-6aaf843e0c78\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f3b764fd994edba08f5e909683bd065ecf09fb7c9906457e37bb91343aad14a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kvs85\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:04:06Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-bhmsc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:14Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:14 crc kubenswrapper[4707]: I1127 16:04:14.039968 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:14 crc kubenswrapper[4707]: I1127 16:04:14.040017 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:14 crc kubenswrapper[4707]: I1127 16:04:14.040077 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:14 crc kubenswrapper[4707]: I1127 16:04:14.040100 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:14 crc kubenswrapper[4707]: I1127 16:04:14.040112 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:14Z","lastTransitionTime":"2025-11-27T16:04:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:04:14 crc kubenswrapper[4707]: I1127 16:04:14.053219 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:05Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:14Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:14 crc kubenswrapper[4707]: I1127 16:04:14.072725 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://57225b64e7671f397832e9ba112febbde78ae4c24b37a8f097d4bdeda8931c0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6960cfa0c6d0dedc432f90bf267ce02fe8264924a43865f79eadd9affbe955f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:14Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:14 crc kubenswrapper[4707]: I1127 16:04:14.086465 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6cf743e8776aa2de4138704d9e163af8baf7b8e5db139c13775673cf32a1ee88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-11-27T16:04:14Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:14 crc kubenswrapper[4707]: I1127 16:04:14.101261 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-9c4xg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37bcfa90-e50b-4a63-96b2-cd8a0ce5bbf7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hwzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8cd7c5e0c832499cb9ddc7b18c7d260de06df34208f86284ebe77964bfcf1c12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8cd7c5e0c832499cb9ddc7b18c7d260de06df34208f86284ebe77964bfcf1c12\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:04:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hwzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4438fd6ab08cd53dff20617ccadaa81e4bddbd7ced310f190d02613079f96f64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4438fd6ab08cd53dff20617ccadaa81e4bddbd7ced310f190d02613079f96f64\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:04:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:04:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hwzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19a31a44d1ecfee004f2a705df22a188782cd9f235848ddf5f48bd740833c6a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://19a31a44d1ecfee004f2a705df22a188782cd9f235848ddf5f48bd740833c6a8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:04:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:04:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hwzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://db8a59276d18141805bc74721d625e2004f2cea1f47efc90b3290c68eba2a35b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://db8a59276d18141805bc74721d625e2004f2cea1f47efc90b3290c68eba2a35b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:04:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:04:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hwzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://21efa1504e98747329f8a9220126ea259edde61bd4d272ea55691e9952950252\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hwzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hw
zm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:04:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-9c4xg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:14Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:14 crc kubenswrapper[4707]: I1127 16:04:14.131641 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xkmt7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"55af9c67-18ce-46f1-a761-d11ce16f42d6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a4eadca4b464dec02ca7aba6bc23d95b9c3965294d4b0224299279c630f3718\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6xmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3718d2f738ef780c738566c163d52afe355de6b8eb6004aa8e1dd5f37aa84d68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6xmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://44699a190cf1065e537f588d7479a10aab8fd4ffd1df2348c616d999ffdf7077\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6xmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6432d53914cee9e016c2134d99e48fa9ff1c05c0b88033992c2e0c70fc40d2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:07Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6xmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://93825fe5d9cf2d63c3e723b063f417f7e92442b14365852f935a06d316f443fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6xmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ae3fab13b078702f87ca8c586c9592c4554c95ead662d81ca65b50f368b6dcb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6xmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbb7a8d9915fa6019298b207c3f4bf09fd27323f12d5dc7930f1efcc2371a11a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6xmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://db3362aeb65b368d2d9181cc898270fffb60938d0e9d32a862130d06a9f572a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6xmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c327fee553e5575dec76bc4456bc59424c89a9eea360cd4b23b594eff4c67a8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c327fee553e5575dec76bc4456bc59424c89a9eea360cd4b23b594eff4c67a8d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:04:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6xmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:04:06Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-xkmt7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:14Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:14 crc kubenswrapper[4707]: I1127 16:04:14.143428 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:14 crc kubenswrapper[4707]: I1127 16:04:14.143487 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:14 crc kubenswrapper[4707]: I1127 16:04:14.143504 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:14 crc kubenswrapper[4707]: I1127 16:04:14.143534 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:14 crc kubenswrapper[4707]: I1127 16:04:14.143551 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:14Z","lastTransitionTime":"2025-11-27T16:04:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:04:14 crc kubenswrapper[4707]: I1127 16:04:14.246273 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:14 crc kubenswrapper[4707]: I1127 16:04:14.246334 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:14 crc kubenswrapper[4707]: I1127 16:04:14.246350 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:14 crc kubenswrapper[4707]: I1127 16:04:14.246400 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:14 crc kubenswrapper[4707]: I1127 16:04:14.246419 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:14Z","lastTransitionTime":"2025-11-27T16:04:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:04:14 crc kubenswrapper[4707]: I1127 16:04:14.349756 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:14 crc kubenswrapper[4707]: I1127 16:04:14.349859 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:14 crc kubenswrapper[4707]: I1127 16:04:14.349909 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:14 crc kubenswrapper[4707]: I1127 16:04:14.349941 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:14 crc kubenswrapper[4707]: I1127 16:04:14.349964 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:14Z","lastTransitionTime":"2025-11-27T16:04:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:04:14 crc kubenswrapper[4707]: I1127 16:04:14.454146 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:14 crc kubenswrapper[4707]: I1127 16:04:14.454224 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:14 crc kubenswrapper[4707]: I1127 16:04:14.454247 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:14 crc kubenswrapper[4707]: I1127 16:04:14.454310 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:14 crc kubenswrapper[4707]: I1127 16:04:14.454333 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:14Z","lastTransitionTime":"2025-11-27T16:04:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:04:14 crc kubenswrapper[4707]: I1127 16:04:14.555683 4707 generic.go:334] "Generic (PLEG): container finished" podID="37bcfa90-e50b-4a63-96b2-cd8a0ce5bbf7" containerID="21efa1504e98747329f8a9220126ea259edde61bd4d272ea55691e9952950252" exitCode=0 Nov 27 16:04:14 crc kubenswrapper[4707]: I1127 16:04:14.555854 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-9c4xg" event={"ID":"37bcfa90-e50b-4a63-96b2-cd8a0ce5bbf7","Type":"ContainerDied","Data":"21efa1504e98747329f8a9220126ea259edde61bd4d272ea55691e9952950252"} Nov 27 16:04:14 crc kubenswrapper[4707]: I1127 16:04:14.557554 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:14 crc kubenswrapper[4707]: I1127 16:04:14.557603 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:14 crc kubenswrapper[4707]: I1127 16:04:14.557622 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:14 crc kubenswrapper[4707]: I1127 16:04:14.557649 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:14 crc kubenswrapper[4707]: I1127 16:04:14.557669 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:14Z","lastTransitionTime":"2025-11-27T16:04:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:04:14 crc kubenswrapper[4707]: I1127 16:04:14.584304 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:05Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:14Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:14 crc kubenswrapper[4707]: I1127 16:04:14.613123 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://57225b64e7671f397832e9ba112febbde78ae4c24b37a8f097d4bdeda8931c0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6960cfa0c6d0dedc432f90bf267ce02fe8264924a43865f79eadd9affbe955f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:14Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:14 crc kubenswrapper[4707]: I1127 16:04:14.630875 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6cf743e8776aa2de4138704d9e163af8baf7b8e5db139c13775673cf32a1ee88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-11-27T16:04:14Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:14 crc kubenswrapper[4707]: I1127 16:04:14.653228 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-9c4xg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37bcfa90-e50b-4a63-96b2-cd8a0ce5bbf7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hwzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8cd7c5e0c832499cb9ddc7b18c7d260de06df34208f86284ebe77964bfcf1c12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8cd7c5e0c832499cb9ddc7b18c7d260de06df34208f86284ebe77964bfcf1c12\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:04:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hwzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4438fd6ab08cd53dff20617ccadaa81e4bddbd7ced310f190d02613079f96f64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4438fd6ab08cd53dff20617ccadaa81e4bddbd7ced310f190d02613079f96f64\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:04:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:04:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hwzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19a31a44d1ecfee004f2a705df22a188782cd9f235848ddf5f48bd740833c6a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://19a31a44d1ecfee004f2a705df22a188782cd9f235848ddf5f48bd740833c6a8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:04:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:04:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hwzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://db8a59276d18141805bc74721d625e2004f2cea1f47efc90b3290c68eba2a35b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://db8a59276d18141805bc74721d625e2004f2cea1f47efc90b3290c68eba2a35b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:04:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:04:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hwzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://21efa1504e98747329f8a9220126ea259edde61bd4d272ea55691e9952950252\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://21efa1504e98747329f8a9220126ea259edde61bd4d272ea55691e9952950252\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:04:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:04:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hwzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"c
nibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hwzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:04:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-9c4xg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:14Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:14 crc kubenswrapper[4707]: I1127 16:04:14.664270 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:14 crc kubenswrapper[4707]: I1127 16:04:14.664512 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:14 crc kubenswrapper[4707]: I1127 16:04:14.664528 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:14 crc kubenswrapper[4707]: I1127 16:04:14.664550 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:14 crc kubenswrapper[4707]: I1127 16:04:14.664563 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:14Z","lastTransitionTime":"2025-11-27T16:04:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:04:14 crc kubenswrapper[4707]: I1127 16:04:14.688060 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xkmt7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"55af9c67-18ce-46f1-a761-d11ce16f42d6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a4eadca4b464dec02ca7aba6bc23d95b9c3965294d4b0224299279c630f3718\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6xmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3718d2f738ef780c738566c163d52afe355de6b8eb6004aa8e1dd5f37aa84d68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6xmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://44699a190cf1065e537f588d7479a10aab8fd4ffd1df2348c616d999ffdf7077\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6xmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6432d53914cee9e016c2134d99e48fa9ff1c05c0b88033992c2e0c70fc40d2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:07Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6xmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://93825fe5d9cf2d63c3e723b063f417f7e92442b14365852f935a06d316f443fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6xmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ae3fab13b078702f87ca8c586c9592c4554c95ead662d81ca65b50f368b6dcb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6xmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbb7a8d9915fa6019298b207c3f4bf09fd27323f12d5dc7930f1efcc2371a11a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6xmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://db3362aeb65b368d2d9181cc898270fffb60938d0e9d32a862130d06a9f572a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6xmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c327fee553e5575dec76bc4456bc59424c89a9eea360cd4b23b594eff4c67a8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c327fee553e5575dec76bc4456bc59424c89a9eea360cd4b23b594eff4c67a8d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:04:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6xmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:04:06Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-xkmt7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:14Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:14 crc kubenswrapper[4707]: I1127 16:04:14.702035 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b00fa8691d479d1db194704ff73bb85f1108c9541dd09ed4c1b02d0a3f0ab53d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-2
7T16:04:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:14Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:14 crc kubenswrapper[4707]: I1127 16:04:14.716696 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-js6mm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ca48c08-f39d-41a2-847a-c893a2111492\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://be6a704ee152614d36d4831778b016c009b0aab2d6cf1c2ec9767f8ecdeb7af6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bjt58\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:04:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-js6mm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:14Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:14 crc kubenswrapper[4707]: I1127 16:04:14.732078 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:14Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:14 crc kubenswrapper[4707]: I1127 16:04:14.745004 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-c995m" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a83beb0d-8dd1-434a-ace2-933f98e3956f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://419d9b9f3f298028cf5f49afd438f1de90b8d1203cdfb09b924878b59e0fb723\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5xtzp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6227d8aa794cce73ef7607c3e6cf8f853c829bf5
6493acf04c0b555ddd264ba5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5xtzp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:04:06Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-c995m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:14Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:14 crc kubenswrapper[4707]: I1127 16:04:14.767018 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:14 crc kubenswrapper[4707]: I1127 16:04:14.767062 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:14 crc kubenswrapper[4707]: I1127 16:04:14.767074 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:14 crc 
kubenswrapper[4707]: I1127 16:04:14.767096 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:14 crc kubenswrapper[4707]: I1127 16:04:14.767110 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:14Z","lastTransitionTime":"2025-11-27T16:04:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:04:14 crc kubenswrapper[4707]: I1127 16:04:14.770224 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"59e09a52-f64d-41da-a23b-7d3e34e8f109\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://489349489e52f4a113203c0c196cfe1ef96e97c6fc3c3d777a28e90f22aaee92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-rel
ease-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ea14f8eb17dbb350f7c0ad2285b4bb5ca2d8b2aae6840646d68a397ac2043d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://775fc674c0dab97e0c5a2548d467b973c592458dad1b88ce92344486a3a28eb5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true
,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f502a2429578531cb3ebbfde6bb4c049eed201a496ff3772c06ca28cace89b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://30b52b26bdca5b0705186872e6dd2eba3ece6436e866a4a5e30f4c2ecdbbaf20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/
\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b6be834802616cdf10f8b694bb0ce1b8828d0885e8801c5d890e74837d672a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b6be834802616cdf10f8b694bb0ce1b8828d0885e8801c5d890e74837d672a8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:03:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://de1e5852273af33e41f21c4df29c3e32cbfd6b2eb200b64d9e5353a85718074c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://de1e5852273af33e41f21c4df29c3e32cbfd6b2eb200b64d9e5353a85718074c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:03:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:03:47Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://c6f7eb0e4b78b6d2e316bd2c8f2ad3081c8eb0ca3c55a7063ccf281572ba2992\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c6f7eb0e4b78b6d2e316bd2c8f2ad3081c8eb0ca3c55a7063ccf281572ba2992\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:03:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:03:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:03:45Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:14Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:14 crc kubenswrapper[4707]: I1127 16:04:14.791743 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f300806f-b97e-4453-b011-19442ca1240d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a25770e0e3eebd7e807f3775101a1cd5d47073e4a86850da475fecd4a9c88ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7412dcdb5a249551f558a01a8fecef5f873e5e0a16e917d5cbc1ca74a917f228\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://791f332aec1698d02da5ec585ad2fbfcb77529f24abc0e3a77581be38d7ce277\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a47eb0592819f0bc514399e163c88e2c63111aa0ec4a28f00f809ce1bdf2aa8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f12d0d89698cd6f702c595882dc7ed97fbc585b8a4ef3a8b3755c984a823dd9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-27T16:04:05Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1127 16:03:58.832133 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1127 16:03:58.833970 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3721107874/tls.crt::/tmp/serving-cert-3721107874/tls.key\\\\\\\"\\\\nI1127 16:04:05.094602 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1127 16:04:05.099813 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1127 16:04:05.099850 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1127 16:04:05.099879 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1127 16:04:05.099886 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1127 16:04:05.108199 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1127 16:04:05.108225 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1127 16:04:05.108231 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1127 16:04:05.108269 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1127 16:04:05.108276 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1127 
16:04:05.108281 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1127 16:04:05.108284 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1127 16:04:05.108289 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1127 16:04:05.114674 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-27T16:03:48Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf16f5508845d69184573b51f3052f79123556667cbd3ddf4b2adfb135312a3a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:48Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://608ad63f8f5352dad650253df603590a1cc20bc05d86dd561d659295264ab4b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc
35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://608ad63f8f5352dad650253df603590a1cc20bc05d86dd561d659295264ab4b5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:03:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:03:45Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:14Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:14 crc kubenswrapper[4707]: I1127 16:04:14.805127 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-bhmsc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d1ec526b-6fb0-4c87-bd87-6aaf843e0c78\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f3b764fd994edba08f5e909683bd065ecf09fb7c9906457e37bb91343aad14a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kvs85\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:04:06Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-bhmsc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:14Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:14 crc kubenswrapper[4707]: I1127 16:04:14.821923 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7f481770-a9a1-422a-a707-e841a24a1503\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f0e4a5fa520e0dc048bef42e48e14bd06eb72586ca2b8e145734aa5450aa869d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluste
r-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://36f6a5537d5031d21ea157fbcedea40756eac2c44aa1859c0cfbf3dc9a9d9a13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c7144aa332827972c3ff984eb05b835b88c4d9d05552ac631fbca49f13e193a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kuber
netes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8211f74a0c0d16257330ebbb5727247d4229848595af3cb65f1cb752b05f3c37\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:03:45Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:14Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:14 crc kubenswrapper[4707]: I1127 16:04:14.839722 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:05Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:14Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:14 crc kubenswrapper[4707]: I1127 16:04:14.870389 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:14 crc kubenswrapper[4707]: I1127 16:04:14.870445 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:14 crc kubenswrapper[4707]: I1127 16:04:14.870457 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:14 crc kubenswrapper[4707]: I1127 16:04:14.870474 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:14 crc kubenswrapper[4707]: I1127 16:04:14.870484 4707 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:14Z","lastTransitionTime":"2025-11-27T16:04:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:04:14 crc kubenswrapper[4707]: I1127 16:04:14.973965 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:14 crc kubenswrapper[4707]: I1127 16:04:14.974048 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:14 crc kubenswrapper[4707]: I1127 16:04:14.974073 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:14 crc kubenswrapper[4707]: I1127 16:04:14.974106 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:14 crc kubenswrapper[4707]: I1127 16:04:14.974127 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:14Z","lastTransitionTime":"2025-11-27T16:04:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:04:15 crc kubenswrapper[4707]: I1127 16:04:15.077115 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:15 crc kubenswrapper[4707]: I1127 16:04:15.077408 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:15 crc kubenswrapper[4707]: I1127 16:04:15.077488 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:15 crc kubenswrapper[4707]: I1127 16:04:15.077565 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:15 crc kubenswrapper[4707]: I1127 16:04:15.077653 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:15Z","lastTransitionTime":"2025-11-27T16:04:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:04:15 crc kubenswrapper[4707]: I1127 16:04:15.179980 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:15 crc kubenswrapper[4707]: I1127 16:04:15.180254 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:15 crc kubenswrapper[4707]: I1127 16:04:15.180309 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:15 crc kubenswrapper[4707]: I1127 16:04:15.180381 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:15 crc kubenswrapper[4707]: I1127 16:04:15.180440 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:15Z","lastTransitionTime":"2025-11-27T16:04:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:04:15 crc kubenswrapper[4707]: I1127 16:04:15.196439 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 27 16:04:15 crc kubenswrapper[4707]: E1127 16:04:15.196690 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 27 16:04:15 crc kubenswrapper[4707]: I1127 16:04:15.196763 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 27 16:04:15 crc kubenswrapper[4707]: I1127 16:04:15.196587 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 27 16:04:15 crc kubenswrapper[4707]: E1127 16:04:15.196943 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 27 16:04:15 crc kubenswrapper[4707]: E1127 16:04:15.197193 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 27 16:04:15 crc kubenswrapper[4707]: I1127 16:04:15.213211 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:05Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:15Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:15 crc kubenswrapper[4707]: I1127 16:04:15.229298 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://57225b64e7671f397832e9ba112febbde78ae4c24b37a8f097d4bdeda8931c0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6960cfa0c6d0dedc432f90bf267ce02fe8264924a43865f79eadd9affbe955f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:15Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:15 crc kubenswrapper[4707]: I1127 16:04:15.244956 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6cf743e8776aa2de4138704d9e163af8baf7b8e5db139c13775673cf32a1ee88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-11-27T16:04:15Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:15 crc kubenswrapper[4707]: I1127 16:04:15.264405 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-9c4xg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37bcfa90-e50b-4a63-96b2-cd8a0ce5bbf7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hwzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8cd7c5e0c832499cb9ddc7b18c7d260de06df34208f86284ebe77964bfcf1c12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8cd7c5e0c832499cb9ddc7b18c7d260de06df34208f86284ebe77964bfcf1c12\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:04:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hwzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4438fd6ab08cd53dff20617ccadaa81e4bddbd7ced310f190d02613079f96f64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4438fd6ab08cd53dff20617ccadaa81e4bddbd7ced310f190d02613079f96f64\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:04:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:04:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hwzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19a31a44d1ecfee004f2a705df22a188782cd9f235848ddf5f48bd740833c6a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://19a31a44d1ecfee004f2a705df22a188782cd9f235848ddf5f48bd740833c6a8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:04:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:04:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hwzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://db8a59276d18141805bc74721d625e2004f2cea1f47efc90b3290c68eba2a35b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://db8a59276d18141805bc74721d625e2004f2cea1f47efc90b3290c68eba2a35b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:04:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:04:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hwzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://21efa1504e98747329f8a9220126ea259edde61bd4d272ea55691e9952950252\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://21efa1504e98747329f8a9220126ea259edde61bd4d272ea55691e9952950252\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:04:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:04:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hwzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"c
nibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hwzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:04:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-9c4xg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:15Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:15 crc kubenswrapper[4707]: I1127 16:04:15.283352 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:15 crc kubenswrapper[4707]: I1127 16:04:15.283421 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:15 crc kubenswrapper[4707]: I1127 16:04:15.283434 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:15 crc kubenswrapper[4707]: I1127 16:04:15.283454 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:15 crc kubenswrapper[4707]: I1127 16:04:15.283468 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:15Z","lastTransitionTime":"2025-11-27T16:04:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:04:15 crc kubenswrapper[4707]: I1127 16:04:15.292854 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xkmt7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"55af9c67-18ce-46f1-a761-d11ce16f42d6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a4eadca4b464dec02ca7aba6bc23d95b9c3965294d4b0224299279c630f3718\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6xmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3718d2f738ef780c738566c163d52afe355de6b8eb6004aa8e1dd5f37aa84d68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6xmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://44699a190cf1065e537f588d7479a10aab8fd4ffd1df2348c616d999ffdf7077\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6xmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6432d53914cee9e016c2134d99e48fa9ff1c05c0b88033992c2e0c70fc40d2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:07Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6xmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://93825fe5d9cf2d63c3e723b063f417f7e92442b14365852f935a06d316f443fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6xmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ae3fab13b078702f87ca8c586c9592c4554c95ead662d81ca65b50f368b6dcb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6xmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbb7a8d9915fa6019298b207c3f4bf09fd27323f12d5dc7930f1efcc2371a11a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6xmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://db3362aeb65b368d2d9181cc898270fffb60938d0e9d32a862130d06a9f572a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6xmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c327fee553e5575dec76bc4456bc59424c89a9eea360cd4b23b594eff4c67a8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c327fee553e5575dec76bc4456bc59424c89a9eea360cd4b23b594eff4c67a8d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:04:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6xmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:04:06Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-xkmt7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:15Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:15 crc kubenswrapper[4707]: I1127 16:04:15.314119 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b00fa8691d479d1db194704ff73bb85f1108c9541dd09ed4c1b02d0a3f0ab53d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-2
7T16:04:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:15Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:15 crc kubenswrapper[4707]: I1127 16:04:15.335711 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-js6mm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ca48c08-f39d-41a2-847a-c893a2111492\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://be6a704ee152614d36d4831778b016c009b0aab2d6cf1c2ec9767f8ecdeb7af6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bjt58\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:04:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-js6mm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:15Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:15 crc kubenswrapper[4707]: I1127 16:04:15.368672 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"59e09a52-f64d-41da-a23b-7d3e34e8f109\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://489349489e52f4a113203c0c196cfe1ef96e97c6fc3c3d777a28e90f22aaee92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ea14f8eb17dbb350f7c0ad2285b4bb5ca2d8b2aae6840646d68a397ac2043d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://775fc674c0dab97e0c5a2548d467b973c592458dad1b88ce92344486a3a28eb5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f502a2429578531cb3ebbfde6bb4c049eed201a496ff3772c06ca28cace89b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://30b52b26bdca5b0705186872e6dd2eba3ece6436e866a4a5e30f4c2ecdbbaf20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b6be834802616cdf10f8b694bb0ce1b8828d0885e8801c5d890e74837d672a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b6be834802616cdf10f8b694bb0ce1b8828d0885e8801c5d890e74837d672a8\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-11-27T16:03:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://de1e5852273af33e41f21c4df29c3e32cbfd6b2eb200b64d9e5353a85718074c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://de1e5852273af33e41f21c4df29c3e32cbfd6b2eb200b64d9e5353a85718074c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:03:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:03:47Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://c6f7eb0e4b78b6d2e316bd2c8f2ad3081c8eb0ca3c55a7063ccf281572ba2992\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c6f7eb0e4b78b6d2e316bd2c8f2ad3081c8eb0ca3c55a7063ccf281572ba2992\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:03:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:03:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:03:45Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:15Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:15 crc kubenswrapper[4707]: I1127 16:04:15.386951 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:15 crc kubenswrapper[4707]: I1127 16:04:15.387003 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:15 crc kubenswrapper[4707]: I1127 16:04:15.387013 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:15 crc kubenswrapper[4707]: I1127 16:04:15.387030 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:15 crc kubenswrapper[4707]: I1127 16:04:15.386932 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f300806f-b97e-4453-b011-19442ca1240d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a25770e0e3eebd7e807f3775101a1cd5d47073e4a86850da475fecd4a9c88ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7412dcdb5a249551f558a01a8fecef5f873e5e0a16e917d5cbc1ca74a917f228\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://791f332aec1698d02da5ec585ad2fbfcb77529f24abc0e3a77581be38d7ce277\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a47eb0592819f0bc514399e163c88e2c63111aa0ec4a28f00f809ce1bdf2aa8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f12d0d89698cd6f702c595882dc7ed97fbc585b8a4ef3a8b3755c984a823dd9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-27T16:04:05Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1127 16:03:58.832133 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1127 16:03:58.833970 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3721107874/tls.crt::/tmp/serving-cert-3721107874/tls.key\\\\\\\"\\\\nI1127 16:04:05.094602 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1127 16:04:05.099813 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1127 16:04:05.099850 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1127 16:04:05.099879 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1127 16:04:05.099886 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1127 16:04:05.108199 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1127 16:04:05.108225 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1127 16:04:05.108231 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1127 16:04:05.108269 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1127 16:04:05.108276 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1127 
16:04:05.108281 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1127 16:04:05.108284 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1127 16:04:05.108289 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1127 16:04:05.114674 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-27T16:03:48Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf16f5508845d69184573b51f3052f79123556667cbd3ddf4b2adfb135312a3a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:48Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://608ad63f8f5352dad650253df603590a1cc20bc05d86dd561d659295264ab4b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc
35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://608ad63f8f5352dad650253df603590a1cc20bc05d86dd561d659295264ab4b5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:03:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:03:45Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:15Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:15 crc kubenswrapper[4707]: I1127 16:04:15.387044 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:15Z","lastTransitionTime":"2025-11-27T16:04:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:04:15 crc kubenswrapper[4707]: I1127 16:04:15.403602 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:15Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:15 crc kubenswrapper[4707]: I1127 16:04:15.415434 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-c995m" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a83beb0d-8dd1-434a-ace2-933f98e3956f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://419d9b9f3f298028cf5f49afd438f1de90b8d1203cdfb09b924878b59e0fb723\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5xtzp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6227d8aa794cce73ef7607c3e6cf8f853c829bf5
6493acf04c0b555ddd264ba5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5xtzp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:04:06Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-c995m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:15Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:15 crc kubenswrapper[4707]: I1127 16:04:15.432613 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7f481770-a9a1-422a-a707-e841a24a1503\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f0e4a5fa520e0dc048bef42e48e14bd06eb72586ca2b8e145734aa5450aa869d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://36f6a5537d5031d21ea157fbcedea40756eac2c44aa1859c0cfbf3dc9a9d9a13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c7144aa332827972c3ff984eb05b835b88c4d9d05552ac631fbca49f13e193a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8211f74a0c0d16257330ebbb5727247d4229848595af3cb65f1cb752b05f3c37\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-11-27T16:03:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:03:45Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:15Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:15 crc kubenswrapper[4707]: I1127 16:04:15.443491 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:15Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:15 crc kubenswrapper[4707]: I1127 16:04:15.464162 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-bhmsc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d1ec526b-6fb0-4c87-bd87-6aaf843e0c78\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f3b764fd994edba08f5e909683bd065ecf09fb7c9906457e37bb91343aad14a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kvs85\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:04:06Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-bhmsc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:15Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:15 crc kubenswrapper[4707]: I1127 16:04:15.490699 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:15 crc kubenswrapper[4707]: I1127 16:04:15.490762 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:15 crc kubenswrapper[4707]: I1127 16:04:15.490786 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:15 crc kubenswrapper[4707]: I1127 16:04:15.490813 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:15 crc kubenswrapper[4707]: I1127 16:04:15.490832 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:15Z","lastTransitionTime":"2025-11-27T16:04:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:04:15 crc kubenswrapper[4707]: I1127 16:04:15.566274 4707 generic.go:334] "Generic (PLEG): container finished" podID="37bcfa90-e50b-4a63-96b2-cd8a0ce5bbf7" containerID="851c05dbd0947990d464d1bd72d62577bdf12517aea808904aa01fc79fd2d7c9" exitCode=0 Nov 27 16:04:15 crc kubenswrapper[4707]: I1127 16:04:15.566440 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-9c4xg" event={"ID":"37bcfa90-e50b-4a63-96b2-cd8a0ce5bbf7","Type":"ContainerDied","Data":"851c05dbd0947990d464d1bd72d62577bdf12517aea808904aa01fc79fd2d7c9"} Nov 27 16:04:15 crc kubenswrapper[4707]: I1127 16:04:15.580174 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7f481770-a9a1-422a-a707-e841a24a1503\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f0e4a5fa520e0dc048bef42e48e14bd06eb72586ca2b8e145734aa5450aa869d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/ope
nshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://36f6a5537d5031d21ea157fbcedea40756eac2c44aa1859c0cfbf3dc9a9d9a13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c7144aa332827972c3ff984eb05b835b88c4d9d05552ac631fbca49f13e193a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:47Z\\\"}
},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8211f74a0c0d16257330ebbb5727247d4229848595af3cb65f1cb752b05f3c37\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:03:45Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:15Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:15 crc kubenswrapper[4707]: I1127 16:04:15.593877 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:15 crc kubenswrapper[4707]: I1127 16:04:15.593953 4707 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:15 crc kubenswrapper[4707]: I1127 16:04:15.593972 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:15 crc kubenswrapper[4707]: I1127 16:04:15.594000 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:15 crc kubenswrapper[4707]: I1127 16:04:15.594023 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:15Z","lastTransitionTime":"2025-11-27T16:04:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:04:15 crc kubenswrapper[4707]: I1127 16:04:15.597206 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:15Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:15 crc kubenswrapper[4707]: I1127 16:04:15.614340 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-bhmsc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d1ec526b-6fb0-4c87-bd87-6aaf843e0c78\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f3b764fd994edba08f5e909683bd065ecf09fb7c9906457e37bb91343aad14a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kvs85\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:04:06Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-bhmsc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:15Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:15 crc kubenswrapper[4707]: I1127 16:04:15.636997 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://57225b64e7671f397832e9ba112febbde78ae4c24b37a8f097d4bdeda8931c0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\
\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6960cfa0c6d0dedc432f90bf267ce02fe8264924a43865f79eadd9affbe955f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:15Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:15 crc kubenswrapper[4707]: I1127 16:04:15.653880 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6cf743e8776aa2de4138704d9e163af8baf7b8e5db139c13775673cf32a1ee88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-11-27T16:04:15Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:15 crc kubenswrapper[4707]: I1127 16:04:15.674306 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-9c4xg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37bcfa90-e50b-4a63-96b2-cd8a0ce5bbf7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hwzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8cd7c5e0c832499cb9ddc7b18c7d260de06df34208f86284ebe77964bfcf1c12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8cd7c5e0c832499cb9ddc7b18c7d260de06df34208f86284ebe77964bfcf1c12\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:04:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hwzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4438fd6ab08cd53dff20617ccadaa81e4bddbd7ced310f190d02613079f96f64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4438fd6ab08cd53dff20617ccadaa81e4bddbd7ced310f190d02613079f96f64\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:04:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:04:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hwzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19a31a44d1ecfee004f2a705df22a188782cd9f235848ddf5f48bd740833c6a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://19a31a44d1ecfee004f2a705df22a188782cd9f235848ddf5f48bd740833c6a8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:04:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:04:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hwzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://db8a59276d18141805bc74721d625e2004f2cea1f47efc90b3290c68eba2a35b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://db8a59276d18141805bc74721d625e2004f2cea1f47efc90b3290c68eba2a35b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:04:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:04:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hwzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://21efa1504e98747329f8a9220126ea259edde61bd4d272ea55691e9952950252\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://21efa1504e98747329f8a9220126ea259edde61bd4d272ea55691e9952950252\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:04:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:04:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hwzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://851c05dbd0947990d464d1bd72d62577bdf12517aea808904aa01fc79fd2d7c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://851c05dbd0947990d464d1bd72d62577bdf12517aea808904aa01fc79fd2d7c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:04:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:04:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hwzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:04:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-9c4xg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:15Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:15 crc kubenswrapper[4707]: I1127 16:04:15.700234 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:15 crc kubenswrapper[4707]: I1127 16:04:15.700280 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:15 crc kubenswrapper[4707]: I1127 16:04:15.700293 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:15 crc kubenswrapper[4707]: I1127 16:04:15.700310 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:15 crc kubenswrapper[4707]: I1127 16:04:15.700322 4707 
setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:15Z","lastTransitionTime":"2025-11-27T16:04:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:04:15 crc kubenswrapper[4707]: I1127 16:04:15.704794 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 27 16:04:15 crc kubenswrapper[4707]: I1127 16:04:15.705168 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xkmt7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"55af9c67-18ce-46f1-a761-d11ce16f42d6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a4eadca4b464dec02ca7aba6bc23d95b9c3965294d4b0224299279c630f3718\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6xmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3718d2f738ef780c738566c163d52afe355de6b8eb6004aa8e1dd5f37aa84d68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6xmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://44699a190cf1065e537f588d7479a10aab8fd4ffd1df2348c616d999ffdf7077\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6xmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6432d53914cee9e016c2134d99e48fa9ff1c05c0b88033992c2e0c70fc40d2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:07Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6xmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://93825fe5d9cf2d63c3e723b063f417f7e92442b14365852f935a06d316f443fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6xmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ae3fab13b078702f87ca8c586c9592c4554c95ead662d81ca65b50f368b6dcb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6xmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbb7a8d9915fa6019298b207c3f4bf09fd27323f12d5dc7930f1efcc2371a11a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6xmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://db3362aeb65b368d2d9181cc898270fffb60938d0e9d32a862130d06a9f572a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6xmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c327fee553e5575dec76bc4456bc59424c89a9eea360cd4b23b594eff4c67a8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c327fee553e5575dec76bc4456bc59424c89a9eea360cd4b23b594eff4c67a8d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:04:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6xmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:04:06Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-xkmt7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:15Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:15 crc kubenswrapper[4707]: I1127 16:04:15.724087 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:05Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:15Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:15 crc kubenswrapper[4707]: I1127 16:04:15.744337 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b00fa8691d479d1db194704ff73bb85f1108c9541dd09ed4c1b02d0a3f0ab53d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:15Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:15 crc kubenswrapper[4707]: I1127 16:04:15.762916 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-js6mm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ca48c08-f39d-41a2-847a-c893a2111492\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://be6a704ee152614d36d4831778b016c009b0aab2d6cf1c2ec9767f8ecdeb7af6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\"
,\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bjt58\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:04:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-js6mm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: 
certificate has expired or is not yet valid: current time 2025-11-27T16:04:15Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:15 crc kubenswrapper[4707]: I1127 16:04:15.778981 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-c995m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a83beb0d-8dd1-434a-ace2-933f98e3956f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://419d9b9f3f298028cf5f49afd438f1de90b8d1203cdfb09b924878b59e0fb723\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\"
:\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5xtzp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6227d8aa794cce73ef7607c3e6cf8f853c829bf56493acf04c0b555ddd264ba5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5xtzp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:04:06Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-c995m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:15Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:15 crc kubenswrapper[4707]: I1127 16:04:15.803654 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:15 crc kubenswrapper[4707]: 
I1127 16:04:15.803716 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:15 crc kubenswrapper[4707]: I1127 16:04:15.803732 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:15 crc kubenswrapper[4707]: I1127 16:04:15.803756 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:15 crc kubenswrapper[4707]: I1127 16:04:15.803772 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:15Z","lastTransitionTime":"2025-11-27T16:04:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:04:15 crc kubenswrapper[4707]: I1127 16:04:15.805642 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"59e09a52-f64d-41da-a23b-7d3e34e8f109\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://489349489e52f4a113203c0c196cfe1ef96e97c6fc3c3d777a28e90f22aaee92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ea14f8eb17dbb350f7c0ad2285b4bb5ca2d8b2aae6840646d68a397ac2043d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://775fc674c0dab97e0c5a2548d467b973c592458dad1b88ce92344486a3a28eb5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f502a2429578531cb3ebbfde6bb4c049eed201a496ff3772c06ca28cace89b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://30b52b26bdca5b0705186872e6dd2eba3ece6436e866a4a5e30f4c2ecdbbaf20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b6be834802616cdf10f8b694bb0ce1b8828d0885e8801c5d890e74837d672a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b6be834802616cdf10f8b694bb0ce1b8828d0885e8801c5d890e74837d672a8\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-11-27T16:03:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://de1e5852273af33e41f21c4df29c3e32cbfd6b2eb200b64d9e5353a85718074c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://de1e5852273af33e41f21c4df29c3e32cbfd6b2eb200b64d9e5353a85718074c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:03:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:03:47Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://c6f7eb0e4b78b6d2e316bd2c8f2ad3081c8eb0ca3c55a7063ccf281572ba2992\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c6f7eb0e4b78b6d2e316bd2c8f2ad3081c8eb0ca3c55a7063ccf281572ba2992\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:03:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:03:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:03:45Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:15Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:15 crc kubenswrapper[4707]: I1127 16:04:15.826056 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f300806f-b97e-4453-b011-19442ca1240d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a25770e0e3eebd7e807f3775101a1cd5d47073e4a86850da475fecd4a9c88ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7412dcdb5a249551f558a01a8fecef5f873e5e0a16e917d5cbc1ca74a917f228\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://791f332aec1698d02da5ec585ad2fbfcb77529f24abc0e3a77581be38d7ce277\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a47eb0592819f0bc514399e163c88e2c63111aa0ec4a28f00f809ce1bdf2aa8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f12d0d89698cd6f702c595882dc7ed97fbc585b8a4ef3a8b3755c984a823dd9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-27T16:04:05Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1127 16:03:58.832133 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1127 16:03:58.833970 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3721107874/tls.crt::/tmp/serving-cert-3721107874/tls.key\\\\\\\"\\\\nI1127 16:04:05.094602 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1127 16:04:05.099813 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1127 16:04:05.099850 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1127 16:04:05.099879 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1127 16:04:05.099886 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1127 16:04:05.108199 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1127 16:04:05.108225 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1127 16:04:05.108231 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1127 16:04:05.108269 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1127 16:04:05.108276 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1127 16:04:05.108281 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1127 16:04:05.108284 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1127 16:04:05.108289 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1127 16:04:05.114674 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-27T16:03:48Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf16f5508845d69184573b51f3052f79123556667cbd3ddf4b2adfb135312a3a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:48Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://608ad63f8f5352dad650253df603590a1cc20bc05d86dd561d659295264ab4b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://608ad63f8f5352dad650253df603590a1cc20bc05d86dd561d659295264ab4b5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:03:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-11-27T16:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:03:45Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:15Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:15 crc kubenswrapper[4707]: I1127 16:04:15.838934 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:15Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:15 crc kubenswrapper[4707]: I1127 16:04:15.872933 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"59e09a52-f64d-41da-a23b-7d3e34e8f109\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://489349489e52f4a113203c0c196cfe1ef96e97c6fc3c3d777a28e90f22aaee92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ea14f8eb17dbb350f7c0ad2285b4bb5ca2d8b2aae6840646d68a397ac2043d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://775fc674c0dab97e0c5a2548d467b973c592458dad1b88ce92344486a3a28eb5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f502a2429578531cb3ebbfde6bb4c049eed201a496ff3772c06ca28cace89b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://30b52b26bdca5b0705186872e6dd2eba3ece6436e866a4a5e30f4c2ecdbbaf20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b6be834802616cdf10f8b694bb0ce1b8828d0885e8801c5d890e74837d672a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b6be834802616cdf10f8b694bb0ce1b8828d0885e8801c5d890e74837d672a8\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-11-27T16:03:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://de1e5852273af33e41f21c4df29c3e32cbfd6b2eb200b64d9e5353a85718074c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://de1e5852273af33e41f21c4df29c3e32cbfd6b2eb200b64d9e5353a85718074c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:03:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:03:47Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://c6f7eb0e4b78b6d2e316bd2c8f2ad3081c8eb0ca3c55a7063ccf281572ba2992\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c6f7eb0e4b78b6d2e316bd2c8f2ad3081c8eb0ca3c55a7063ccf281572ba2992\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:03:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:03:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:03:45Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:15Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:15 crc kubenswrapper[4707]: I1127 16:04:15.894912 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f300806f-b97e-4453-b011-19442ca1240d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a25770e0e3eebd7e807f3775101a1cd5d47073e4a86850da475fecd4a9c88ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-rel
ease-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7412dcdb5a249551f558a01a8fecef5f873e5e0a16e917d5cbc1ca74a917f228\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://791f332aec1698d02da5ec585ad2fbfcb77529f24abc0e3a77581be38d7ce277\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:47Z\\\"}},\\\"vo
lumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a47eb0592819f0bc514399e163c88e2c63111aa0ec4a28f00f809ce1bdf2aa8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f12d0d89698cd6f702c595882dc7ed97fbc585b8a4ef3a8b3755c984a823dd9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-27T16:04:05Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1127 16:03:58.832133 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1127 16:03:58.833970 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3721107874/tls.crt::/tmp/serving-cert-3721107874/tls.key\\\\\\\"\\\\nI1127 16:04:05.094602 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1127 16:04:05.099813 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1127 16:04:05.099850 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1127 16:04:05.099879 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1127 16:04:05.099886 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1127 16:04:05.108199 1 
secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1127 16:04:05.108225 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1127 16:04:05.108231 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1127 16:04:05.108269 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1127 16:04:05.108276 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1127 16:04:05.108281 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1127 16:04:05.108284 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1127 16:04:05.108289 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1127 16:04:05.114674 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-27T16:03:48Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf16f5508845d69184573b51f3052f79123556667cbd3ddf4b2adfb135312a3a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:48Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://608ad63f8f5352dad650253df603590a1cc20bc05d86dd561d659295264ab4b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://608ad63f8f5352dad650253df603590a1cc20bc05d86dd561d659295264ab4b5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:03:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:03:45Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:15Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:15 crc kubenswrapper[4707]: I1127 16:04:15.906151 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:15 crc kubenswrapper[4707]: I1127 16:04:15.906205 4707 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:15 crc kubenswrapper[4707]: I1127 16:04:15.906219 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:15 crc kubenswrapper[4707]: I1127 16:04:15.906240 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:15 crc kubenswrapper[4707]: I1127 16:04:15.906255 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:15Z","lastTransitionTime":"2025-11-27T16:04:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:04:15 crc kubenswrapper[4707]: I1127 16:04:15.910565 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:15Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:15 crc kubenswrapper[4707]: I1127 16:04:15.924258 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-c995m" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a83beb0d-8dd1-434a-ace2-933f98e3956f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://419d9b9f3f298028cf5f49afd438f1de90b8d1203cdfb09b924878b59e0fb723\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5xtzp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6227d8aa794cce73ef7607c3e6cf8f853c829bf5
6493acf04c0b555ddd264ba5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5xtzp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:04:06Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-c995m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:15Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:15 crc kubenswrapper[4707]: I1127 16:04:15.937151 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7f481770-a9a1-422a-a707-e841a24a1503\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f0e4a5fa520e0dc048bef42e48e14bd06eb72586ca2b8e145734aa5450aa869d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://36f6a5537d5031d21ea157fbcedea40756eac2c44aa1859c0cfbf3dc9a9d9a13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c7144aa332827972c3ff984eb05b835b88c4d9d05552ac631fbca49f13e193a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8211f74a0c0d16257330ebbb5727247d4229848595af3cb65f1cb752b05f3c37\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-11-27T16:03:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:03:45Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:15Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:15 crc kubenswrapper[4707]: I1127 16:04:15.949027 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:15Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:15 crc kubenswrapper[4707]: I1127 16:04:15.962803 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-bhmsc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d1ec526b-6fb0-4c87-bd87-6aaf843e0c78\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f3b764fd994edba08f5e909683bd065ecf09fb7c9906457e37bb91343aad14a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kvs85\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:04:06Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-bhmsc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:15Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:15 crc kubenswrapper[4707]: I1127 16:04:15.988122 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xkmt7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"55af9c67-18ce-46f1-a761-d11ce16f42d6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a4eadca4b464dec02ca7aba6bc23d95b9c3965294d4b0224299279c630f3718\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6xmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3718d2f738ef780c738566c163d52afe355de6b8eb6004aa8e1dd5f37aa84d68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6xmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://44699a190cf1065e537f588d7479a10aab8fd4ffd1df2348c616d999ffdf7077\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6xmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6432d53914cee9e016c2134d99e48fa9ff1c05c0b88033992c2e0c70fc40d2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:07Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6xmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://93825fe5d9cf2d63c3e723b063f417f7e92442b14365852f935a06d316f443fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6xmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ae3fab13b078702f87ca8c586c9592c4554c95ead662d81ca65b50f368b6dcb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6xmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbb7a8d9915fa6019298b207c3f4bf09fd27323f12d5dc7930f1efcc2371a11a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6xmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://db3362aeb65b368d2d9181cc898270fffb60938d0e9d32a862130d06a9f572a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6xmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c327fee553e5575dec76bc4456bc59424c89a9eea360cd4b23b594eff4c67a8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c327fee553e5575dec76bc4456bc59424c89a9eea360cd4b23b594eff4c67a8d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:04:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6xmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:04:06Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-xkmt7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:15Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:16 crc kubenswrapper[4707]: I1127 16:04:16.004124 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:05Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:16Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:16 crc kubenswrapper[4707]: I1127 16:04:16.009676 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:16 crc kubenswrapper[4707]: I1127 16:04:16.009709 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:16 crc kubenswrapper[4707]: I1127 16:04:16.009718 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:16 crc kubenswrapper[4707]: I1127 16:04:16.009732 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:16 crc kubenswrapper[4707]: I1127 16:04:16.009748 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:16Z","lastTransitionTime":"2025-11-27T16:04:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:04:16 crc kubenswrapper[4707]: I1127 16:04:16.019717 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://57225b64e7671f397832e9ba112febbde78ae4c24b37a8f097d4bdeda8931c0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"conta
inerID\\\":\\\"cri-o://6960cfa0c6d0dedc432f90bf267ce02fe8264924a43865f79eadd9affbe955f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:16Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:16 crc kubenswrapper[4707]: I1127 16:04:16.037711 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6cf743e8776aa2de4138704d9e163af8baf7b8e5db139c13775673cf32a1ee88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-11-27T16:04:16Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:16 crc kubenswrapper[4707]: I1127 16:04:16.055529 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-9c4xg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37bcfa90-e50b-4a63-96b2-cd8a0ce5bbf7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hwzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8cd7c5e0c832499cb9ddc7b18c7d260de06df34208f86284ebe77964bfcf1c12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8cd7c5e0c832499cb9ddc7b18c7d260de06df34208f86284ebe77964bfcf1c12\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:04:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hwzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4438fd6ab08cd53dff20617ccadaa81e4bddbd7ced310f190d02613079f96f64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4438fd6ab08cd53dff20617ccadaa81e4bddbd7ced310f190d02613079f96f64\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:04:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:04:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hwzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19a31a44d1ecfee004f2a705df22a188782cd9f235848ddf5f48bd740833c6a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://19a31a44d1ecfee004f2a705df22a188782cd9f235848ddf5f48bd740833c6a8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:04:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:04:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hwzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://db8a59276d18141805bc74721d625e2004f2cea1f47efc90b3290c68eba2a35b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://db8a59276d18141805bc74721d625e2004f2cea1f47efc90b3290c68eba2a35b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:04:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:04:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hwzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://21efa1504e98747329f8a9220126ea259edde61bd4d272ea55691e9952950252\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://21efa1504e98747329f8a9220126ea259edde61bd4d272ea55691e9952950252\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:04:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:04:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hwzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://851c05dbd0947990d464d1bd72d62577bdf12517aea808904aa01fc79fd2d7c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://851c05dbd0947990d464d1bd72d62577bdf12517aea808904aa01fc79fd2d7c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:04:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:04:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hwzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:04:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-9c4xg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:16Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:16 crc kubenswrapper[4707]: I1127 16:04:16.075502 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b00fa8691d479d1db194704ff73bb85f1108c9541dd09ed4c1b02d0a3f0ab53d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:16Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:16 crc kubenswrapper[4707]: I1127 16:04:16.111786 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-js6mm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ca48c08-f39d-41a2-847a-c893a2111492\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://be6a704ee152614d36d4831778b016c009b0aab2d6cf1c2ec9767f8ecdeb7af6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\"
,\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bjt58\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:04:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-js6mm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: 
certificate has expired or is not yet valid: current time 2025-11-27T16:04:16Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:16 crc kubenswrapper[4707]: I1127 16:04:16.113087 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:16 crc kubenswrapper[4707]: I1127 16:04:16.113125 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:16 crc kubenswrapper[4707]: I1127 16:04:16.113139 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:16 crc kubenswrapper[4707]: I1127 16:04:16.113158 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:16 crc kubenswrapper[4707]: I1127 16:04:16.113174 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:16Z","lastTransitionTime":"2025-11-27T16:04:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:04:16 crc kubenswrapper[4707]: I1127 16:04:16.216417 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:16 crc kubenswrapper[4707]: I1127 16:04:16.216462 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:16 crc kubenswrapper[4707]: I1127 16:04:16.216473 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:16 crc kubenswrapper[4707]: I1127 16:04:16.216494 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:16 crc kubenswrapper[4707]: I1127 16:04:16.216505 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:16Z","lastTransitionTime":"2025-11-27T16:04:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:04:16 crc kubenswrapper[4707]: I1127 16:04:16.320062 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:16 crc kubenswrapper[4707]: I1127 16:04:16.320138 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:16 crc kubenswrapper[4707]: I1127 16:04:16.320157 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:16 crc kubenswrapper[4707]: I1127 16:04:16.320183 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:16 crc kubenswrapper[4707]: I1127 16:04:16.320205 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:16Z","lastTransitionTime":"2025-11-27T16:04:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:04:16 crc kubenswrapper[4707]: I1127 16:04:16.424020 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:16 crc kubenswrapper[4707]: I1127 16:04:16.424085 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:16 crc kubenswrapper[4707]: I1127 16:04:16.424109 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:16 crc kubenswrapper[4707]: I1127 16:04:16.424158 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:16 crc kubenswrapper[4707]: I1127 16:04:16.424180 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:16Z","lastTransitionTime":"2025-11-27T16:04:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:04:16 crc kubenswrapper[4707]: I1127 16:04:16.527150 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:16 crc kubenswrapper[4707]: I1127 16:04:16.527235 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:16 crc kubenswrapper[4707]: I1127 16:04:16.527253 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:16 crc kubenswrapper[4707]: I1127 16:04:16.527282 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:16 crc kubenswrapper[4707]: I1127 16:04:16.527300 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:16Z","lastTransitionTime":"2025-11-27T16:04:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:04:16 crc kubenswrapper[4707]: I1127 16:04:16.577107 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-9c4xg" event={"ID":"37bcfa90-e50b-4a63-96b2-cd8a0ce5bbf7","Type":"ContainerStarted","Data":"66dcf851b2fb739c3dae214852a24405d2c8b4462dde998903b9dfe7c9e61e00"} Nov 27 16:04:16 crc kubenswrapper[4707]: I1127 16:04:16.579780 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xkmt7_55af9c67-18ce-46f1-a761-d11ce16f42d6/ovnkube-controller/0.log" Nov 27 16:04:16 crc kubenswrapper[4707]: I1127 16:04:16.585630 4707 generic.go:334] "Generic (PLEG): container finished" podID="55af9c67-18ce-46f1-a761-d11ce16f42d6" containerID="dbb7a8d9915fa6019298b207c3f4bf09fd27323f12d5dc7930f1efcc2371a11a" exitCode=1 Nov 27 16:04:16 crc kubenswrapper[4707]: I1127 16:04:16.585689 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xkmt7" event={"ID":"55af9c67-18ce-46f1-a761-d11ce16f42d6","Type":"ContainerDied","Data":"dbb7a8d9915fa6019298b207c3f4bf09fd27323f12d5dc7930f1efcc2371a11a"} Nov 27 16:04:16 crc kubenswrapper[4707]: I1127 16:04:16.586763 4707 scope.go:117] "RemoveContainer" containerID="dbb7a8d9915fa6019298b207c3f4bf09fd27323f12d5dc7930f1efcc2371a11a" Nov 27 16:04:16 crc kubenswrapper[4707]: I1127 16:04:16.607316 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:05Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:16Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:16 crc kubenswrapper[4707]: I1127 16:04:16.631045 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://57225b64e7671f397832e9ba112febbde78ae4c24b37a8f097d4bdeda8931c0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6960cfa0c6d0dedc432f90bf267ce02fe8264924a43865f79eadd9affbe955f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:16Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:16 crc kubenswrapper[4707]: I1127 16:04:16.631726 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:16 crc kubenswrapper[4707]: I1127 16:04:16.631777 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:16 crc kubenswrapper[4707]: I1127 16:04:16.631795 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:16 crc kubenswrapper[4707]: I1127 16:04:16.631821 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:16 crc kubenswrapper[4707]: I1127 16:04:16.631840 4707 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:16Z","lastTransitionTime":"2025-11-27T16:04:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:04:16 crc kubenswrapper[4707]: I1127 16:04:16.644020 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6cf743e8776aa2de4138704d9e163af8baf7b8e5db139c13775673cf32a1ee88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\
"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:16Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:16 crc kubenswrapper[4707]: I1127 16:04:16.660409 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-9c4xg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37bcfa90-e50b-4a63-96b2-cd8a0ce5bbf7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://66dcf851b2fb739c3dae214852a24405d2c8b4462dde998903b9dfe7c9e61e00\\\",\\\"image\\\":\\\"quay.io/openshift-rele
ase-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hwzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8cd7c5e0c832499cb9ddc7b18c7d260de06df34208f86284ebe77964bfcf1c12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8cd7c5e0c832499cb9ddc7b18c7d260de06df34208f86284ebe77964bfcf1c12\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:04:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\
\":\\\"kube-api-access-7hwzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4438fd6ab08cd53dff20617ccadaa81e4bddbd7ced310f190d02613079f96f64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4438fd6ab08cd53dff20617ccadaa81e4bddbd7ced310f190d02613079f96f64\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:04:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:04:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hwzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19a31a44d1ecfee004f2a705df22a188782cd9f235848ddf5f48bd740833c6a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\
\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://19a31a44d1ecfee004f2a705df22a188782cd9f235848ddf5f48bd740833c6a8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:04:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:04:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hwzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://db8a59276d18141805bc74721d625e2004f2cea1f47efc90b3290c68eba2a35b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://db8a59276d18141805bc74721d625e2004f2cea1f47efc90b3290c68eba2a35b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:04:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:04:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run
/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hwzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://21efa1504e98747329f8a9220126ea259edde61bd4d272ea55691e9952950252\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://21efa1504e98747329f8a9220126ea259edde61bd4d272ea55691e9952950252\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:04:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:04:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hwzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://851c05dbd0947990d464d1bd72d62577bdf12517aea808904aa01fc79fd2d7c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{
\\\"containerID\\\":\\\"cri-o://851c05dbd0947990d464d1bd72d62577bdf12517aea808904aa01fc79fd2d7c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:04:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:04:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hwzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:04:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-9c4xg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:16Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:16 crc kubenswrapper[4707]: I1127 16:04:16.700888 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xkmt7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"55af9c67-18ce-46f1-a761-d11ce16f42d6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a4eadca4b464dec02ca7aba6bc23d95b9c3965294d4b0224299279c630f3718\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6xmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3718d2f738ef780c738566c163d52afe355de6b8eb6004aa8e1dd5f37aa84d68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6xmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://44699a190cf1065e537f588d7479a10aab8fd4ffd1df2348c616d999ffdf7077\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6xmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6432d53914cee9e016c2134d99e48fa9ff1c05c0b88033992c2e0c70fc40d2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:07Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6xmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://93825fe5d9cf2d63c3e723b063f417f7e92442b14365852f935a06d316f443fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6xmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ae3fab13b078702f87ca8c586c9592c4554c95ead662d81ca65b50f368b6dcb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6xmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbb7a8d9915fa6019298b207c3f4bf09fd27323f12d5dc7930f1efcc2371a11a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6xmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://db3362aeb65b368d2d9181cc898270fffb60938d0e9d32a862130d06a9f572a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6xmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c327fee553e5575dec76bc4456bc59424c89a9eea360cd4b23b594eff4c67a8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c327fee553e5575dec76bc4456bc59424c89a9eea360cd4b23b594eff4c67a8d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:04:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6xmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:04:06Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-xkmt7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:16Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:16 crc kubenswrapper[4707]: I1127 16:04:16.722769 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b00fa8691d479d1db194704ff73bb85f1108c9541dd09ed4c1b02d0a3f0ab53d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-2
7T16:04:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:16Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:16 crc kubenswrapper[4707]: I1127 16:04:16.735218 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:16 crc kubenswrapper[4707]: I1127 16:04:16.735290 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:16 crc kubenswrapper[4707]: I1127 16:04:16.735307 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:16 crc kubenswrapper[4707]: I1127 16:04:16.735336 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:16 crc kubenswrapper[4707]: I1127 16:04:16.735359 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:16Z","lastTransitionTime":"2025-11-27T16:04:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:04:16 crc kubenswrapper[4707]: I1127 16:04:16.743512 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-js6mm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ca48c08-f39d-41a2-847a-c893a2111492\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://be6a704ee152614d36d4831778b016c009b0aab2d6cf1c2ec9767f8ecdeb7af6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\
":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bjt58\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:04:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-js6mm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: 
current time 2025-11-27T16:04:16Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:16 crc kubenswrapper[4707]: I1127 16:04:16.774684 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"59e09a52-f64d-41da-a23b-7d3e34e8f109\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://489349489e52f4a113203c0c196cfe1ef96e97c6fc3c3d777a28e90f22aaee92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"c
ert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ea14f8eb17dbb350f7c0ad2285b4bb5ca2d8b2aae6840646d68a397ac2043d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://775fc674c0dab97e0c5a2548d467b973c592458dad1b88ce92344486a3a28eb5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f502a2429578531cb3ebbfde6bb4c049eed201a496ff3772c06ca28cace89b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.i
o/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://30b52b26bdca5b0705186872e6dd2eba3ece6436e866a4a5e30f4c2ecdbbaf20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b6be834802616cdf10f8b694bb0ce1b8828d0885e8801c5d890e74837d672a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":
\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b6be834802616cdf10f8b694bb0ce1b8828d0885e8801c5d890e74837d672a8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:03:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://de1e5852273af33e41f21c4df29c3e32cbfd6b2eb200b64d9e5353a85718074c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://de1e5852273af33e41f21c4df29c3e32cbfd6b2eb200b64d9e5353a85718074c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:03:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:03:47Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://c6f7eb0e4b78b6d2e316bd2c8f2ad3081c8eb0ca3c55a7063ccf281572ba2992\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c6f7eb0e4b78b6d2e316bd2c8f2ad3081c8eb0ca3c55a7063ccf281572ba2992\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:03:48Z\\\",\\\"reason\\\":\\\"Completed
\\\",\\\"startedAt\\\":\\\"2025-11-27T16:03:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:03:45Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:16Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:16 crc kubenswrapper[4707]: I1127 16:04:16.797579 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f300806f-b97e-4453-b011-19442ca1240d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a25770e0e3eebd7e807f3775101a1cd5d47073e4a86850da475fecd4a9c88ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7412dcdb5a249551f558a01a8fecef5f873e5e0a16e917d5cbc1ca74a917f228\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://791f332aec1698d02da5ec585ad2fbfcb77529f24abc0e3a77581be38d7ce277\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a47eb0592819f0bc514399e163c88e2c63111aa0ec4a28f00f809ce1bdf2aa8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f12d0d89698cd6f702c595882dc7ed97fbc585b8a4ef3a8b3755c984a823dd9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-27T16:04:05Z\\\"
,\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1127 16:03:58.832133 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1127 16:03:58.833970 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3721107874/tls.crt::/tmp/serving-cert-3721107874/tls.key\\\\\\\"\\\\nI1127 16:04:05.094602 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1127 16:04:05.099813 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1127 16:04:05.099850 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1127 16:04:05.099879 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1127 16:04:05.099886 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1127 16:04:05.108199 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1127 16:04:05.108225 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1127 16:04:05.108231 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1127 16:04:05.108269 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1127 16:04:05.108276 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1127 16:04:05.108281 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1127 16:04:05.108284 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1127 16:04:05.108289 1 secure_serving.go:69] Use 
of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1127 16:04:05.114674 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-27T16:03:48Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf16f5508845d69184573b51f3052f79123556667cbd3ddf4b2adfb135312a3a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:48Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://608ad63f8f5352dad650253df603590a1cc20bc05d86dd561d659295264ab4b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://608ad63f8f5352dad650253df603590a
1cc20bc05d86dd561d659295264ab4b5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:03:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:03:45Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:16Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:16 crc kubenswrapper[4707]: I1127 16:04:16.817333 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:16Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:16 crc kubenswrapper[4707]: I1127 16:04:16.835508 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-c995m" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a83beb0d-8dd1-434a-ace2-933f98e3956f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://419d9b9f3f298028cf5f49afd438f1de90b8d1203cdfb09b924878b59e0fb723\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5xtzp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6227d8aa794cce73ef7607c3e6cf8f853c829bf5
6493acf04c0b555ddd264ba5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5xtzp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:04:06Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-c995m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:16Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:16 crc kubenswrapper[4707]: I1127 16:04:16.837895 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:16 crc kubenswrapper[4707]: I1127 16:04:16.837951 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:16 crc kubenswrapper[4707]: I1127 16:04:16.837966 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:16 crc 
kubenswrapper[4707]: I1127 16:04:16.837997 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:16 crc kubenswrapper[4707]: I1127 16:04:16.838015 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:16Z","lastTransitionTime":"2025-11-27T16:04:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:04:16 crc kubenswrapper[4707]: I1127 16:04:16.857209 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7f481770-a9a1-422a-a707-e841a24a1503\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f0e4a5fa520e0dc048bef42e48e14bd06eb72586ca2b8e145734aa5450aa869d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\
"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://36f6a5537d5031d21ea157fbcedea40756eac2c44aa1859c0cfbf3dc9a9d9a13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c7144aa332827972c3ff984eb05b835b88c4d9d05552ac631fbca49f13e193a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\
\\"2025-11-27T16:03:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8211f74a0c0d16257330ebbb5727247d4229848595af3cb65f1cb752b05f3c37\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:03:45Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:16Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:16 crc kubenswrapper[4707]: I1127 16:04:16.878928 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:05Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:16Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:16 crc kubenswrapper[4707]: I1127 16:04:16.897728 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-bhmsc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d1ec526b-6fb0-4c87-bd87-6aaf843e0c78\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f3b764fd994edba08f5e909683bd065ecf09fb7c9906457e37bb91343aad14a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kvs85\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:04:06Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-bhmsc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:16Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:16 crc kubenswrapper[4707]: I1127 16:04:16.921867 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b00fa8691d479d1db194704ff73bb85f1108c9541dd09ed4c1b02d0a3f0ab53d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/
etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:16Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:16 crc kubenswrapper[4707]: I1127 16:04:16.938980 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-js6mm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ca48c08-f39d-41a2-847a-c893a2111492\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://be6a704ee152614d36d
4831778b016c009b0aab2d6cf1c2ec9767f8ecdeb7af6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bjt58\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:04:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-js6mm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:16Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:16 crc kubenswrapper[4707]: I1127 16:04:16.942505 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:16 crc kubenswrapper[4707]: I1127 16:04:16.942578 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:16 crc kubenswrapper[4707]: I1127 16:04:16.942596 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:16 crc kubenswrapper[4707]: I1127 16:04:16.942619 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:16 crc kubenswrapper[4707]: I1127 16:04:16.942634 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:16Z","lastTransitionTime":"2025-11-27T16:04:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:04:16 crc kubenswrapper[4707]: I1127 16:04:16.974725 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"59e09a52-f64d-41da-a23b-7d3e34e8f109\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://489349489e52f4a113203c0c196cfe1ef96e97c6fc3c3d777a28e90f22aaee92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ea14f8eb17dbb350f7c0ad2285b4bb5ca2d8b2aae6840646d68a397ac2043d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://775fc674c0dab97e0c5a2548d467b973c592458dad1b88ce92344486a3a28eb5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f502a2429578531cb3ebbfde6bb4c049eed201a496ff3772c06ca28cace89b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://30b52b26bdca5b0705186872e6dd2eba3ece6436e866a4a5e30f4c2ecdbbaf20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b6be834802616cdf10f8b694bb0ce1b8828d0885e8801c5d890e74837d672a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b6be834802616cdf10f8b694bb0ce1b8828d0885e8801c5d890e74837d672a8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:03:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://de1e5852273af33e41f21c4df29c3e32cbfd6b2eb200b64d9e5353a85718074c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://de1e5852273af33e41f21c4df29c3e32cbfd6b2eb200b64d9e5353a85718074c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:03:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:03:47Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://c6f7eb0e4b78b6d2e316bd2c8f2ad3081c8eb0ca3c55a7063ccf281572ba2992\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c6f7eb0e4b78b6d2e316bd2c8f2ad3081c8eb0ca3c55a7063ccf281572ba2992\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:03:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
025-11-27T16:03:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:03:45Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:16Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:16 crc kubenswrapper[4707]: I1127 16:04:16.999078 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f300806f-b97e-4453-b011-19442ca1240d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a25770e0e3eebd7e807f3775101a1cd5d47073e4a86850da475fecd4a9c88ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7412dcdb5a249551f558a01a8fecef5f873e5e0a16e917d5cbc1ca74a917f228\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://791f332aec1698d02da5ec585ad2fbfcb77529f24abc0e3a77581be38d7ce277\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a47eb0592819f0bc514399e163c88e2c63111aa0ec4a28f00f809ce1bdf2aa8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f12d0d89698cd6f702c595882dc7ed97fbc585b8a4ef3a8b3755c984a823dd9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-27T16:04:05Z\\\"
,\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1127 16:03:58.832133 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1127 16:03:58.833970 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3721107874/tls.crt::/tmp/serving-cert-3721107874/tls.key\\\\\\\"\\\\nI1127 16:04:05.094602 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1127 16:04:05.099813 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1127 16:04:05.099850 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1127 16:04:05.099879 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1127 16:04:05.099886 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1127 16:04:05.108199 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1127 16:04:05.108225 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1127 16:04:05.108231 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1127 16:04:05.108269 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1127 16:04:05.108276 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1127 16:04:05.108281 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1127 16:04:05.108284 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1127 16:04:05.108289 1 secure_serving.go:69] Use 
of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1127 16:04:05.114674 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-27T16:03:48Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf16f5508845d69184573b51f3052f79123556667cbd3ddf4b2adfb135312a3a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:48Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://608ad63f8f5352dad650253df603590a1cc20bc05d86dd561d659295264ab4b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://608ad63f8f5352dad650253df603590a
1cc20bc05d86dd561d659295264ab4b5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:03:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:03:45Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:16Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:17 crc kubenswrapper[4707]: I1127 16:04:17.021767 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:17Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:17 crc kubenswrapper[4707]: I1127 16:04:17.041308 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-c995m" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a83beb0d-8dd1-434a-ace2-933f98e3956f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://419d9b9f3f298028cf5f49afd438f1de90b8d1203cdfb09b924878b59e0fb723\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5xtzp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6227d8aa794cce73ef7607c3e6cf8f853c829bf5
6493acf04c0b555ddd264ba5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5xtzp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:04:06Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-c995m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:17Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:17 crc kubenswrapper[4707]: I1127 16:04:17.046264 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:17 crc kubenswrapper[4707]: I1127 16:04:17.046489 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:17 crc kubenswrapper[4707]: I1127 16:04:17.046591 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:17 crc 
kubenswrapper[4707]: I1127 16:04:17.046701 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:17 crc kubenswrapper[4707]: I1127 16:04:17.046780 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:17Z","lastTransitionTime":"2025-11-27T16:04:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:04:17 crc kubenswrapper[4707]: I1127 16:04:17.062073 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7f481770-a9a1-422a-a707-e841a24a1503\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f0e4a5fa520e0dc048bef42e48e14bd06eb72586ca2b8e145734aa5450aa869d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\
"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://36f6a5537d5031d21ea157fbcedea40756eac2c44aa1859c0cfbf3dc9a9d9a13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c7144aa332827972c3ff984eb05b835b88c4d9d05552ac631fbca49f13e193a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\
\\"2025-11-27T16:03:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8211f74a0c0d16257330ebbb5727247d4229848595af3cb65f1cb752b05f3c37\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:03:45Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:17Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:17 crc kubenswrapper[4707]: I1127 16:04:17.085158 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:05Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:17Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:17 crc kubenswrapper[4707]: I1127 16:04:17.102572 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-bhmsc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d1ec526b-6fb0-4c87-bd87-6aaf843e0c78\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f3b764fd994edba08f5e909683bd065ecf09fb7c9906457e37bb91343aad14a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kvs85\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:04:06Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-bhmsc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:17Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:17 crc kubenswrapper[4707]: I1127 16:04:17.135244 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xkmt7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"55af9c67-18ce-46f1-a761-d11ce16f42d6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a4eadca4b464dec02ca7aba6bc23d95b9c3965294d4b0224299279c630f3718\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6xmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3718d2f738ef780c738566c163d52afe355de6b8eb6004aa8e1dd5f37aa84d68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6xmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://44699a190cf1065e537f588d7479a10aab8fd4ffd1df2348c616d999ffdf7077\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6xmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6432d53914cee9e016c2134d99e48fa9ff1c05c0b88033992c2e0c70fc40d2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:07Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6xmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://93825fe5d9cf2d63c3e723b063f417f7e92442b14365852f935a06d316f443fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6xmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ae3fab13b078702f87ca8c586c9592c4554c95ead662d81ca65b50f368b6dcb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6xmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbb7a8d9915fa6019298b207c3f4bf09fd27323f12d5dc7930f1efcc2371a11a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dbb7a8d9915fa6019298b207c3f4bf09fd27323f12d5dc7930f1efcc2371a11a\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-27T16:04:15Z\\\",\\\"message\\\":\\\"-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1127 16:04:15.866832 5913 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI1127 16:04:15.866884 5913 handler.go:190] Sending *v1.Pod event handler 6 
for removal\\\\nI1127 16:04:15.866938 5913 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1127 16:04:15.866942 5913 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1127 16:04:15.866969 5913 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1127 16:04:15.866990 5913 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1127 16:04:15.866997 5913 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1127 16:04:15.866998 5913 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1127 16:04:15.867030 5913 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1127 16:04:15.867049 5913 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1127 16:04:15.867059 5913 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1127 16:04:15.867068 5913 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1127 16:04:15.871516 5913 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1127 16:04:15.871590 5913 factory.go:656] Stopping watch factory\\\\nI1127 16:04:15.871610 5913 ovnkube.go:599] Stopped ovnkube\\\\nI1127 
16:04:1\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-27T16:04:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":
\\\"kube-api-access-p6xmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://db3362aeb65b368d2d9181cc898270fffb60938d0e9d32a862130d06a9f572a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6xmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c327fee553e5575dec76bc4456bc59424c89a9eea360cd4b23b594eff4c67a8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c327fee553e5575dec76bc4456bc59424c89a9eea360cd4b23b594eff4c6
7a8d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:04:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6xmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:04:06Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-xkmt7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:17Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:17 crc kubenswrapper[4707]: I1127 16:04:17.150347 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:17 crc kubenswrapper[4707]: I1127 16:04:17.150477 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:17 crc kubenswrapper[4707]: I1127 16:04:17.150501 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:17 crc kubenswrapper[4707]: I1127 16:04:17.150536 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:17 crc kubenswrapper[4707]: I1127 16:04:17.150556 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:17Z","lastTransitionTime":"2025-11-27T16:04:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:04:17 crc kubenswrapper[4707]: I1127 16:04:17.153103 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:05Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:17Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:17 crc kubenswrapper[4707]: I1127 16:04:17.172141 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://57225b64e7671f397832e9ba112febbde78ae4c24b37a8f097d4bdeda8931c0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6960cfa0c6d0dedc432f90bf267ce02fe8264924a43865f79eadd9affbe955f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:17Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:17 crc kubenswrapper[4707]: I1127 16:04:17.188325 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6cf743e8776aa2de4138704d9e163af8baf7b8e5db139c13775673cf32a1ee88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-11-27T16:04:17Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:17 crc kubenswrapper[4707]: I1127 16:04:17.194491 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 27 16:04:17 crc kubenswrapper[4707]: I1127 16:04:17.194593 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 27 16:04:17 crc kubenswrapper[4707]: E1127 16:04:17.195122 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 27 16:04:17 crc kubenswrapper[4707]: I1127 16:04:17.194672 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 27 16:04:17 crc kubenswrapper[4707]: E1127 16:04:17.195342 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 27 16:04:17 crc kubenswrapper[4707]: E1127 16:04:17.195138 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 27 16:04:17 crc kubenswrapper[4707]: I1127 16:04:17.207673 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-9c4xg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37bcfa90-e50b-4a63-96b2-cd8a0ce5bbf7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://66dcf851b2fb739c3dae214852a24405d2c8b4462dde998903b9dfe7c9e61e00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a9
5b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hwzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8cd7c5e0c832499cb9ddc7b18c7d260de06df34208f86284ebe77964bfcf1c12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8cd7c5e0c832499cb9ddc7b18c7d260de06df34208f86284ebe77964bfcf1c12\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:04:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hwzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disa
bled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4438fd6ab08cd53dff20617ccadaa81e4bddbd7ced310f190d02613079f96f64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4438fd6ab08cd53dff20617ccadaa81e4bddbd7ced310f190d02613079f96f64\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:04:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:04:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hwzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19a31a44d1ecfee004f2a705df22a188782cd9f235848ddf5f48bd740833c6a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"ter
minated\\\":{\\\"containerID\\\":\\\"cri-o://19a31a44d1ecfee004f2a705df22a188782cd9f235848ddf5f48bd740833c6a8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:04:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:04:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hwzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://db8a59276d18141805bc74721d625e2004f2cea1f47efc90b3290c68eba2a35b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://db8a59276d18141805bc74721d625e2004f2cea1f47efc90b3290c68eba2a35b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:04:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:04:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hwzm\\\",\\\"
readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://21efa1504e98747329f8a9220126ea259edde61bd4d272ea55691e9952950252\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://21efa1504e98747329f8a9220126ea259edde61bd4d272ea55691e9952950252\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:04:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:04:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hwzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://851c05dbd0947990d464d1bd72d62577bdf12517aea808904aa01fc79fd2d7c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://851c05dbd0947990d464d1bd72d62577bdf12517aea808904aa01fc79
fd2d7c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:04:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:04:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hwzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:04:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-9c4xg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:17Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:17 crc kubenswrapper[4707]: I1127 16:04:17.254202 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:17 crc kubenswrapper[4707]: I1127 16:04:17.254589 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:17 crc kubenswrapper[4707]: I1127 16:04:17.254747 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:17 crc kubenswrapper[4707]: I1127 16:04:17.254937 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:17 crc kubenswrapper[4707]: I1127 16:04:17.255068 4707 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:17Z","lastTransitionTime":"2025-11-27T16:04:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:04:17 crc kubenswrapper[4707]: I1127 16:04:17.359754 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:17 crc kubenswrapper[4707]: I1127 16:04:17.359806 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:17 crc kubenswrapper[4707]: I1127 16:04:17.359817 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:17 crc kubenswrapper[4707]: I1127 16:04:17.359836 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:17 crc kubenswrapper[4707]: I1127 16:04:17.359857 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:17Z","lastTransitionTime":"2025-11-27T16:04:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:04:17 crc kubenswrapper[4707]: I1127 16:04:17.463284 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:17 crc kubenswrapper[4707]: I1127 16:04:17.463338 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:17 crc kubenswrapper[4707]: I1127 16:04:17.463357 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:17 crc kubenswrapper[4707]: I1127 16:04:17.463412 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:17 crc kubenswrapper[4707]: I1127 16:04:17.463429 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:17Z","lastTransitionTime":"2025-11-27T16:04:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:04:17 crc kubenswrapper[4707]: I1127 16:04:17.566311 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:17 crc kubenswrapper[4707]: I1127 16:04:17.566414 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:17 crc kubenswrapper[4707]: I1127 16:04:17.566433 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:17 crc kubenswrapper[4707]: I1127 16:04:17.566461 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:17 crc kubenswrapper[4707]: I1127 16:04:17.566481 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:17Z","lastTransitionTime":"2025-11-27T16:04:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:04:17 crc kubenswrapper[4707]: I1127 16:04:17.593687 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xkmt7_55af9c67-18ce-46f1-a761-d11ce16f42d6/ovnkube-controller/0.log" Nov 27 16:04:17 crc kubenswrapper[4707]: I1127 16:04:17.597178 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xkmt7" event={"ID":"55af9c67-18ce-46f1-a761-d11ce16f42d6","Type":"ContainerStarted","Data":"def6189807ad0285980b2859cdfe683dd5c8f7c79104d57abeef37450f73655d"} Nov 27 16:04:17 crc kubenswrapper[4707]: I1127 16:04:17.598307 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-xkmt7" Nov 27 16:04:17 crc kubenswrapper[4707]: I1127 16:04:17.633854 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7f481770-a9a1-422a-a707-e841a24a1503\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f0e4a5fa520e0dc048bef42e48e14bd06eb72586ca2b8e145734aa5450aa869d\\\",\\\"image\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://36f6a5537d5031d21ea157fbcedea40756eac2c44aa1859c0cfbf3dc9a9d9a13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c7144aa332827972c3ff984eb05b835b88c4d9d05552ac631fbca49f13e193a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert
-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8211f74a0c0d16257330ebbb5727247d4229848595af3cb65f1cb752b05f3c37\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:03:45Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:17Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:17 crc kubenswrapper[4707]: I1127 16:04:17.661571 4707 status_manager.go:875] 
"Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:05Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:17Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:17 crc kubenswrapper[4707]: I1127 16:04:17.669955 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:17 crc kubenswrapper[4707]: I1127 16:04:17.670030 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:17 crc kubenswrapper[4707]: I1127 16:04:17.670229 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:17 crc kubenswrapper[4707]: I1127 16:04:17.670258 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:17 crc kubenswrapper[4707]: I1127 16:04:17.670544 4707 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:17Z","lastTransitionTime":"2025-11-27T16:04:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:04:17 crc kubenswrapper[4707]: I1127 16:04:17.674971 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-bhmsc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d1ec526b-6fb0-4c87-bd87-6aaf843e0c78\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f3b764fd994edba08f5e909683bd065ecf09fb7c9906457e37bb91343aad14a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\
\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kvs85\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:04:06Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-bhmsc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:17Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:17 crc kubenswrapper[4707]: I1127 16:04:17.699404 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xkmt7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"55af9c67-18ce-46f1-a761-d11ce16f42d6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a4eadca4b464dec02ca7aba6bc23d95b9c3965294d4b0224299279c630f3718\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6xmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3718d2f738ef780c738566c163d52afe355de6b8eb6004aa8e1dd5f37aa84d68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6xmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://44699a190cf1065e537f588d7479a10aab8fd4ffd1df2348c616d999ffdf7077\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6xmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6432d53914cee9e016c2134d99e48fa9ff1c05c0b88033992c2e0c70fc40d2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:07Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6xmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://93825fe5d9cf2d63c3e723b063f417f7e92442b14365852f935a06d316f443fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6xmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ae3fab13b078702f87ca8c586c9592c4554c95ead662d81ca65b50f368b6dcb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6xmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://def6189807ad0285980b2859cdfe683dd5c8f7c79104d57abeef37450f73655d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dbb7a8d9915fa6019298b207c3f4bf09fd27323f12d5dc7930f1efcc2371a11a\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-27T16:04:15Z\\\",\\\"message\\\":\\\"-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1127 16:04:15.866832 5913 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI1127 16:04:15.866884 5913 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1127 16:04:15.866938 5913 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1127 
16:04:15.866942 5913 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1127 16:04:15.866969 5913 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1127 16:04:15.866990 5913 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1127 16:04:15.866997 5913 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1127 16:04:15.866998 5913 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1127 16:04:15.867030 5913 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1127 16:04:15.867049 5913 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1127 16:04:15.867059 5913 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1127 16:04:15.867068 5913 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1127 16:04:15.871516 5913 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1127 16:04:15.871590 5913 factory.go:656] Stopping watch factory\\\\nI1127 16:04:15.871610 5913 ovnkube.go:599] Stopped ovnkube\\\\nI1127 
16:04:1\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-27T16:04:12Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"na
me\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6xmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://db3362aeb65b368d2d9181cc898270fffb60938d0e9d32a862130d06a9f572a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6xmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c327fee553e5575dec76bc4456bc59424c89a9eea360cd4b23b594eff4c67a8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"re
ady\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c327fee553e5575dec76bc4456bc59424c89a9eea360cd4b23b594eff4c67a8d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:04:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6xmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:04:06Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-xkmt7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:17Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:17 crc kubenswrapper[4707]: I1127 16:04:17.717950 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:05Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:17Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:17 crc kubenswrapper[4707]: I1127 16:04:17.734798 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://57225b64e7671f397832e9ba112febbde78ae4c24b37a8f097d4bdeda8931c0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6960cfa0c6d0dedc432f90bf267ce02fe8264924a43865f79eadd9affbe955f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:17Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:17 crc kubenswrapper[4707]: I1127 16:04:17.751247 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6cf743e8776aa2de4138704d9e163af8baf7b8e5db139c13775673cf32a1ee88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-11-27T16:04:17Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:17 crc kubenswrapper[4707]: I1127 16:04:17.772865 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-9c4xg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37bcfa90-e50b-4a63-96b2-cd8a0ce5bbf7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://66dcf851b2fb739c3dae214852a24405d2c8b4462dde998903b9dfe7c9e61e00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-a
ccess-7hwzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8cd7c5e0c832499cb9ddc7b18c7d260de06df34208f86284ebe77964bfcf1c12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8cd7c5e0c832499cb9ddc7b18c7d260de06df34208f86284ebe77964bfcf1c12\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:04:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hwzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4438fd6ab08cd53dff20617ccadaa81e4bddbd7ced310f190d02613079f96f64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"
started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4438fd6ab08cd53dff20617ccadaa81e4bddbd7ced310f190d02613079f96f64\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:04:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:04:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hwzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19a31a44d1ecfee004f2a705df22a188782cd9f235848ddf5f48bd740833c6a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://19a31a44d1ecfee004f2a705df22a188782cd9f235848ddf5f48bd740833c6a8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:04:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:04:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\
\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hwzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://db8a59276d18141805bc74721d625e2004f2cea1f47efc90b3290c68eba2a35b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://db8a59276d18141805bc74721d625e2004f2cea1f47efc90b3290c68eba2a35b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:04:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:04:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hwzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://21efa1504e98747329f8a9220126ea259edde61bd4d272ea55691e9952950252\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabout
s-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://21efa1504e98747329f8a9220126ea259edde61bd4d272ea55691e9952950252\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:04:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:04:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hwzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://851c05dbd0947990d464d1bd72d62577bdf12517aea808904aa01fc79fd2d7c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://851c05dbd0947990d464d1bd72d62577bdf12517aea808904aa01fc79fd2d7c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:04:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:04:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hwzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:04:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-9c4xg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:17Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:17 crc kubenswrapper[4707]: I1127 16:04:17.773267 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:17 crc kubenswrapper[4707]: I1127 16:04:17.773344 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:17 crc kubenswrapper[4707]: I1127 16:04:17.773404 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:17 crc kubenswrapper[4707]: I1127 16:04:17.773452 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:17 crc kubenswrapper[4707]: I1127 16:04:17.773477 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:17Z","lastTransitionTime":"2025-11-27T16:04:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:04:17 crc kubenswrapper[4707]: I1127 16:04:17.792867 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b00fa8691d479d1db194704ff73bb85f1108c9541dd09ed4c1b02d0a3f0ab53d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:17Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:17 crc kubenswrapper[4707]: I1127 16:04:17.811577 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-js6mm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ca48c08-f39d-41a2-847a-c893a2111492\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://be6a704ee152614d36d4831778b016c009b0aab2d6cf1c2ec9767f8ecdeb7af6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\
\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bjt58\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16
:04:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-js6mm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:17Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:17 crc kubenswrapper[4707]: I1127 16:04:17.835277 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"59e09a52-f64d-41da-a23b-7d3e34e8f109\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://489349489e52f4a113203c0c196cfe1ef96e97c6fc3c3d777a28e90f22aaee92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"20
25-11-27T16:03:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ea14f8eb17dbb350f7c0ad2285b4bb5ca2d8b2aae6840646d68a397ac2043d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://775fc674c0dab97e0c5a2548d467b973c592458dad1b88ce92344486a3a28eb5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-po
d-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f502a2429578531cb3ebbfde6bb4c049eed201a496ff3772c06ca28cace89b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://30b52b26bdca5b0705186872e6dd2eba3ece6436e866a4a5e30f4c2ecdbbaf20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b6be834802616cdf10f8b694bb0ce1b8828d0885e8801c5d890e74837d672a8\\\",\\\"
image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b6be834802616cdf10f8b694bb0ce1b8828d0885e8801c5d890e74837d672a8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:03:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://de1e5852273af33e41f21c4df29c3e32cbfd6b2eb200b64d9e5353a85718074c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://de1e5852273af33e41f21c4df29c3e32cbfd6b2eb200b64d9e5353a85718074c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:03:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:03:47Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://c6f7eb0e4b78b6d2e316bd2c8f2ad3081c8eb0ca3c55a7063ccf281572ba2992\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-
resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c6f7eb0e4b78b6d2e316bd2c8f2ad3081c8eb0ca3c55a7063ccf281572ba2992\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:03:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:03:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:03:45Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:17Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:17 crc kubenswrapper[4707]: I1127 16:04:17.855999 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f300806f-b97e-4453-b011-19442ca1240d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a25770e0e3eebd7e807f3775101a1cd5d47073e4a86850da475fecd4a9c88ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7412dcdb5a249551f558a01a8fecef5f873e5e0a16e917d5cbc1ca74a917f228\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://791f332aec1698d02da5ec585ad2fbfcb77529f24abc0e3a77581be38d7ce277\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a47eb0592819f0bc514399e163c88e2c63111aa0ec4a28f00f809ce1bdf2aa8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f12d0d89698cd6f702c595882dc7ed97fbc585b8a4ef3a8b3755c984a823dd9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-27T16:04:05Z\\\"
,\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1127 16:03:58.832133 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1127 16:03:58.833970 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3721107874/tls.crt::/tmp/serving-cert-3721107874/tls.key\\\\\\\"\\\\nI1127 16:04:05.094602 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1127 16:04:05.099813 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1127 16:04:05.099850 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1127 16:04:05.099879 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1127 16:04:05.099886 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1127 16:04:05.108199 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1127 16:04:05.108225 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1127 16:04:05.108231 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1127 16:04:05.108269 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1127 16:04:05.108276 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1127 16:04:05.108281 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1127 16:04:05.108284 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1127 16:04:05.108289 1 secure_serving.go:69] Use 
of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1127 16:04:05.114674 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-27T16:03:48Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf16f5508845d69184573b51f3052f79123556667cbd3ddf4b2adfb135312a3a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:48Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://608ad63f8f5352dad650253df603590a1cc20bc05d86dd561d659295264ab4b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://608ad63f8f5352dad650253df603590a
1cc20bc05d86dd561d659295264ab4b5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:03:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:03:45Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:17Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:17 crc kubenswrapper[4707]: I1127 16:04:17.874114 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:17Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:17 crc kubenswrapper[4707]: I1127 16:04:17.875856 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:17 crc kubenswrapper[4707]: I1127 16:04:17.875892 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:17 crc kubenswrapper[4707]: I1127 16:04:17.875902 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:17 crc 
kubenswrapper[4707]: I1127 16:04:17.875916 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:17 crc kubenswrapper[4707]: I1127 16:04:17.875927 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:17Z","lastTransitionTime":"2025-11-27T16:04:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:04:17 crc kubenswrapper[4707]: I1127 16:04:17.886858 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-c995m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a83beb0d-8dd1-434a-ace2-933f98e3956f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://419d9b9f3f298028cf5f49afd438f1de90b8d1203cdfb09b924878b59e0fb723\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@
sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5xtzp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6227d8aa794cce73ef7607c3e6cf8f853c829bf56493acf04c0b555ddd264ba5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5xtzp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:04:06Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-c995m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:17Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:17 crc kubenswrapper[4707]: I1127 16:04:17.978312 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:17 crc kubenswrapper[4707]: I1127 16:04:17.978416 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:17 crc kubenswrapper[4707]: I1127 16:04:17.978436 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:17 crc kubenswrapper[4707]: I1127 16:04:17.978466 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:17 crc kubenswrapper[4707]: I1127 16:04:17.978492 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:17Z","lastTransitionTime":"2025-11-27T16:04:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:04:18 crc kubenswrapper[4707]: I1127 16:04:18.081903 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:18 crc kubenswrapper[4707]: I1127 16:04:18.081957 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:18 crc kubenswrapper[4707]: I1127 16:04:18.081969 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:18 crc kubenswrapper[4707]: I1127 16:04:18.081989 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:18 crc kubenswrapper[4707]: I1127 16:04:18.082002 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:18Z","lastTransitionTime":"2025-11-27T16:04:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:04:18 crc kubenswrapper[4707]: I1127 16:04:18.185275 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:18 crc kubenswrapper[4707]: I1127 16:04:18.185331 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:18 crc kubenswrapper[4707]: I1127 16:04:18.185341 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:18 crc kubenswrapper[4707]: I1127 16:04:18.185362 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:18 crc kubenswrapper[4707]: I1127 16:04:18.185391 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:18Z","lastTransitionTime":"2025-11-27T16:04:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:04:18 crc kubenswrapper[4707]: I1127 16:04:18.288676 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:18 crc kubenswrapper[4707]: I1127 16:04:18.288745 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:18 crc kubenswrapper[4707]: I1127 16:04:18.288759 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:18 crc kubenswrapper[4707]: I1127 16:04:18.288781 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:18 crc kubenswrapper[4707]: I1127 16:04:18.288798 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:18Z","lastTransitionTime":"2025-11-27T16:04:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:04:18 crc kubenswrapper[4707]: I1127 16:04:18.392020 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:18 crc kubenswrapper[4707]: I1127 16:04:18.392096 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:18 crc kubenswrapper[4707]: I1127 16:04:18.392116 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:18 crc kubenswrapper[4707]: I1127 16:04:18.392152 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:18 crc kubenswrapper[4707]: I1127 16:04:18.392182 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:18Z","lastTransitionTime":"2025-11-27T16:04:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:04:18 crc kubenswrapper[4707]: I1127 16:04:18.496106 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:18 crc kubenswrapper[4707]: I1127 16:04:18.496200 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:18 crc kubenswrapper[4707]: I1127 16:04:18.496218 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:18 crc kubenswrapper[4707]: I1127 16:04:18.496241 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:18 crc kubenswrapper[4707]: I1127 16:04:18.496259 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:18Z","lastTransitionTime":"2025-11-27T16:04:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:04:18 crc kubenswrapper[4707]: I1127 16:04:18.600014 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:18 crc kubenswrapper[4707]: I1127 16:04:18.600120 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:18 crc kubenswrapper[4707]: I1127 16:04:18.600146 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:18 crc kubenswrapper[4707]: I1127 16:04:18.600178 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:18 crc kubenswrapper[4707]: I1127 16:04:18.600203 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:18Z","lastTransitionTime":"2025-11-27T16:04:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:04:18 crc kubenswrapper[4707]: I1127 16:04:18.604567 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xkmt7_55af9c67-18ce-46f1-a761-d11ce16f42d6/ovnkube-controller/1.log" Nov 27 16:04:18 crc kubenswrapper[4707]: I1127 16:04:18.605998 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xkmt7_55af9c67-18ce-46f1-a761-d11ce16f42d6/ovnkube-controller/0.log" Nov 27 16:04:18 crc kubenswrapper[4707]: I1127 16:04:18.610476 4707 generic.go:334] "Generic (PLEG): container finished" podID="55af9c67-18ce-46f1-a761-d11ce16f42d6" containerID="def6189807ad0285980b2859cdfe683dd5c8f7c79104d57abeef37450f73655d" exitCode=1 Nov 27 16:04:18 crc kubenswrapper[4707]: I1127 16:04:18.610531 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xkmt7" event={"ID":"55af9c67-18ce-46f1-a761-d11ce16f42d6","Type":"ContainerDied","Data":"def6189807ad0285980b2859cdfe683dd5c8f7c79104d57abeef37450f73655d"} Nov 27 16:04:18 crc kubenswrapper[4707]: I1127 16:04:18.610610 4707 scope.go:117] "RemoveContainer" containerID="dbb7a8d9915fa6019298b207c3f4bf09fd27323f12d5dc7930f1efcc2371a11a" Nov 27 16:04:18 crc kubenswrapper[4707]: I1127 16:04:18.611883 4707 scope.go:117] "RemoveContainer" containerID="def6189807ad0285980b2859cdfe683dd5c8f7c79104d57abeef37450f73655d" Nov 27 16:04:18 crc kubenswrapper[4707]: E1127 16:04:18.612243 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-xkmt7_openshift-ovn-kubernetes(55af9c67-18ce-46f1-a761-d11ce16f42d6)\"" pod="openshift-ovn-kubernetes/ovnkube-node-xkmt7" podUID="55af9c67-18ce-46f1-a761-d11ce16f42d6" Nov 27 16:04:18 crc kubenswrapper[4707]: I1127 16:04:18.632683 4707 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b00fa8691d479d1db194704ff73bb85f1108c9541dd09ed4c1b02d0a3f0ab53d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": 
failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:18Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:18 crc kubenswrapper[4707]: I1127 16:04:18.649436 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-js6mm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ca48c08-f39d-41a2-847a-c893a2111492\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://be6a704ee152614d36d4831778b016c009b0aab2d6cf1c2ec9767f8ecdeb7af6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\
\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bjt58\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:04:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-js6mm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed 
to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:18Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:18 crc kubenswrapper[4707]: I1127 16:04:18.687418 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"59e09a52-f64d-41da-a23b-7d3e34e8f109\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://489349489e52f4a113203c0c196cfe1ef96e97c6fc3c3d777a28e90f22aaee92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/e
tc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ea14f8eb17dbb350f7c0ad2285b4bb5ca2d8b2aae6840646d68a397ac2043d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://775fc674c0dab97e0c5a2548d467b973c592458dad1b88ce92344486a3a28eb5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f502a2429578531cb3ebbfde6bb4c049eed201a496ff3772c06ca28cace89b2\\\",\\\"image\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://30b52b26bdca5b0705186872e6dd2eba3ece6436e866a4a5e30f4c2ecdbbaf20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b6be834802616cdf10f8b694bb0ce1b8828d0885e8801c5d890e74837d672a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.
io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b6be834802616cdf10f8b694bb0ce1b8828d0885e8801c5d890e74837d672a8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:03:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://de1e5852273af33e41f21c4df29c3e32cbfd6b2eb200b64d9e5353a85718074c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://de1e5852273af33e41f21c4df29c3e32cbfd6b2eb200b64d9e5353a85718074c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:03:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:03:47Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://c6f7eb0e4b78b6d2e316bd2c8f2ad3081c8eb0ca3c55a7063ccf281572ba2992\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c6f7eb0e4b78
b6d2e316bd2c8f2ad3081c8eb0ca3c55a7063ccf281572ba2992\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:03:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:03:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:03:45Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:18Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:18 crc kubenswrapper[4707]: I1127 16:04:18.703488 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:18 crc kubenswrapper[4707]: I1127 16:04:18.703554 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:18 crc kubenswrapper[4707]: I1127 16:04:18.703574 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:18 crc kubenswrapper[4707]: I1127 16:04:18.703601 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:18 crc kubenswrapper[4707]: I1127 16:04:18.703622 4707 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:18Z","lastTransitionTime":"2025-11-27T16:04:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:04:18 crc kubenswrapper[4707]: I1127 16:04:18.710547 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f300806f-b97e-4453-b011-19442ca1240d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a25770e0e3eebd7e807f3775101a1cd5d47073e4a86850da475fecd4a9c88ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"r
unning\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7412dcdb5a249551f558a01a8fecef5f873e5e0a16e917d5cbc1ca74a917f228\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://791f332aec1698d02da5ec585ad2fbfcb77529f24abc0e3a77581be38d7ce277\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4
a47eb0592819f0bc514399e163c88e2c63111aa0ec4a28f00f809ce1bdf2aa8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f12d0d89698cd6f702c595882dc7ed97fbc585b8a4ef3a8b3755c984a823dd9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-27T16:04:05Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1127 16:03:58.832133 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1127 16:03:58.833970 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3721107874/tls.crt::/tmp/serving-cert-3721107874/tls.key\\\\\\\"\\\\nI1127 16:04:05.094602 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1127 16:04:05.099813 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1127 16:04:05.099850 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1127 16:04:05.099879 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1127 16:04:05.099886 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1127 16:04:05.108199 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1127 16:04:05.108225 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1127 16:04:05.108231 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1127 16:04:05.108269 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1127 16:04:05.108276 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1127 16:04:05.108281 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1127 16:04:05.108284 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1127 16:04:05.108289 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1127 16:04:05.114674 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-27T16:03:48Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf16f5508845d69184573b51f3052f79123556667cbd3ddf4b2adfb135312a3a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:48Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\
"containerID\\\":\\\"cri-o://608ad63f8f5352dad650253df603590a1cc20bc05d86dd561d659295264ab4b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://608ad63f8f5352dad650253df603590a1cc20bc05d86dd561d659295264ab4b5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:03:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:03:45Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:18Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:18 crc kubenswrapper[4707]: I1127 16:04:18.730931 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:18Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:18 crc kubenswrapper[4707]: I1127 16:04:18.750474 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-c995m" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a83beb0d-8dd1-434a-ace2-933f98e3956f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://419d9b9f3f298028cf5f49afd438f1de90b8d1203cdfb09b924878b59e0fb723\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5xtzp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6227d8aa794cce73ef7607c3e6cf8f853c829bf5
6493acf04c0b555ddd264ba5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5xtzp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:04:06Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-c995m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:18Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:18 crc kubenswrapper[4707]: I1127 16:04:18.769145 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7f481770-a9a1-422a-a707-e841a24a1503\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f0e4a5fa520e0dc048bef42e48e14bd06eb72586ca2b8e145734aa5450aa869d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://36f6a5537d5031d21ea157fbcedea40756eac2c44aa1859c0cfbf3dc9a9d9a13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c7144aa332827972c3ff984eb05b835b88c4d9d05552ac631fbca49f13e193a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8211f74a0c0d16257330ebbb5727247d4229848595af3cb65f1cb752b05f3c37\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-11-27T16:03:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:03:45Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:18Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:18 crc kubenswrapper[4707]: I1127 16:04:18.791975 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:18Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:18 crc kubenswrapper[4707]: I1127 16:04:18.806078 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:18 crc kubenswrapper[4707]: I1127 16:04:18.806115 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:18 crc kubenswrapper[4707]: I1127 16:04:18.806127 4707 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:18 crc kubenswrapper[4707]: I1127 16:04:18.806147 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:18 crc kubenswrapper[4707]: I1127 16:04:18.806161 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:18Z","lastTransitionTime":"2025-11-27T16:04:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:04:18 crc kubenswrapper[4707]: I1127 16:04:18.810872 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-bhmsc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d1ec526b-6fb0-4c87-bd87-6aaf843e0c78\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f3b764fd994edba08f5e909683bd065ecf09fb7
c9906457e37bb91343aad14a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kvs85\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:04:06Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-bhmsc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:18Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:18 crc kubenswrapper[4707]: I1127 16:04:18.828294 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:05Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:18Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:18 crc kubenswrapper[4707]: I1127 16:04:18.852692 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://57225b64e7671f397832e9ba112febbde78ae4c24b37a8f097d4bdeda8931c0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6960cfa0c6d0dedc432f90bf267ce02fe8264924a43865f79eadd9affbe955f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:18Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:18 crc kubenswrapper[4707]: I1127 16:04:18.871257 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6cf743e8776aa2de4138704d9e163af8baf7b8e5db139c13775673cf32a1ee88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-11-27T16:04:18Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:18 crc kubenswrapper[4707]: I1127 16:04:18.895787 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-9c4xg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37bcfa90-e50b-4a63-96b2-cd8a0ce5bbf7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://66dcf851b2fb739c3dae214852a24405d2c8b4462dde998903b9dfe7c9e61e00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-a
ccess-7hwzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8cd7c5e0c832499cb9ddc7b18c7d260de06df34208f86284ebe77964bfcf1c12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8cd7c5e0c832499cb9ddc7b18c7d260de06df34208f86284ebe77964bfcf1c12\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:04:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hwzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4438fd6ab08cd53dff20617ccadaa81e4bddbd7ced310f190d02613079f96f64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"
started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4438fd6ab08cd53dff20617ccadaa81e4bddbd7ced310f190d02613079f96f64\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:04:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:04:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hwzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19a31a44d1ecfee004f2a705df22a188782cd9f235848ddf5f48bd740833c6a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://19a31a44d1ecfee004f2a705df22a188782cd9f235848ddf5f48bd740833c6a8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:04:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:04:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\
\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hwzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://db8a59276d18141805bc74721d625e2004f2cea1f47efc90b3290c68eba2a35b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://db8a59276d18141805bc74721d625e2004f2cea1f47efc90b3290c68eba2a35b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:04:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:04:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hwzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://21efa1504e98747329f8a9220126ea259edde61bd4d272ea55691e9952950252\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabout
s-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://21efa1504e98747329f8a9220126ea259edde61bd4d272ea55691e9952950252\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:04:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:04:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hwzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://851c05dbd0947990d464d1bd72d62577bdf12517aea808904aa01fc79fd2d7c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://851c05dbd0947990d464d1bd72d62577bdf12517aea808904aa01fc79fd2d7c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:04:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:04:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hwzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:04:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-9c4xg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:18Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:18 crc kubenswrapper[4707]: I1127 16:04:18.909255 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:18 crc kubenswrapper[4707]: I1127 16:04:18.909313 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:18 crc kubenswrapper[4707]: I1127 16:04:18.909332 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:18 crc kubenswrapper[4707]: I1127 16:04:18.909359 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:18 crc kubenswrapper[4707]: I1127 16:04:18.909414 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:18Z","lastTransitionTime":"2025-11-27T16:04:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:04:18 crc kubenswrapper[4707]: I1127 16:04:18.922164 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xkmt7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"55af9c67-18ce-46f1-a761-d11ce16f42d6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a4eadca4b464dec02ca7aba6bc23d95b9c3965294d4b0224299279c630f3718\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6xmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3718d2f738ef780c738566c163d52afe355de6b8eb6004aa8e1dd5f37aa84d68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6xmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://44699a190cf1065e537f588d7479a10aab8fd4ffd1df2348c616d999ffdf7077\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6xmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6432d53914cee9e016c2134d99e48fa9ff1c05c0b88033992c2e0c70fc40d2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:07Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6xmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://93825fe5d9cf2d63c3e723b063f417f7e92442b14365852f935a06d316f443fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6xmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ae3fab13b078702f87ca8c586c9592c4554c95ead662d81ca65b50f368b6dcb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6xmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://def6189807ad0285980b2859cdfe683dd5c8f7c79104d57abeef37450f73655d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dbb7a8d9915fa6019298b207c3f4bf09fd27323f12d5dc7930f1efcc2371a11a\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-27T16:04:15Z\\\",\\\"message\\\":\\\"-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1127 16:04:15.866832 5913 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI1127 16:04:15.866884 5913 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1127 16:04:15.866938 5913 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1127 
16:04:15.866942 5913 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1127 16:04:15.866969 5913 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1127 16:04:15.866990 5913 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1127 16:04:15.866997 5913 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1127 16:04:15.866998 5913 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1127 16:04:15.867030 5913 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1127 16:04:15.867049 5913 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1127 16:04:15.867059 5913 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1127 16:04:15.867068 5913 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1127 16:04:15.871516 5913 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1127 16:04:15.871590 5913 factory.go:656] Stopping watch factory\\\\nI1127 16:04:15.871610 5913 ovnkube.go:599] Stopped ovnkube\\\\nI1127 16:04:1\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-27T16:04:12Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://def6189807ad0285980b2859cdfe683dd5c8f7c79104d57abeef37450f73655d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-27T16:04:17Z\\\",\\\"message\\\":\\\"lversions/factory.go:140\\\\nI1127 16:04:17.841915 6098 reflector.go:311] Stopping reflector *v1.EgressService (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressservice/v1/apis/informers/externalversions/factory.go:140\\\\nI1127 16:04:17.841959 6098 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1127 16:04:17.842031 6098 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1127 16:04:17.842064 6098 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1127 
16:04:17.842086 6098 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1127 16:04:17.842164 6098 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1127 16:04:17.842205 6098 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1127 16:04:17.842258 6098 factory.go:656] Stopping watch factory\\\\nI1127 16:04:17.842299 6098 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1127 16:04:17.842297 6098 handler.go:208] Removed *v1.Node event handler 2\\\\nI1127 16:04:17.842315 6098 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1127 16:04:17.842332 6098 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1127 16:04:17.842334 6098 handler.go:208] Removed *v1.Node event handler 7\\\\nI1127 16:04:17.842341 6098 handler.go:208] Removed *v1.Namespace ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-27T16:04:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\
":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6xmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://db3362aeb65b368d2d9181cc898270fffb60938d0e9d32a862130d06a9f572a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6
xmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c327fee553e5575dec76bc4456bc59424c89a9eea360cd4b23b594eff4c67a8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c327fee553e5575dec76bc4456bc59424c89a9eea360cd4b23b594eff4c67a8d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:04:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6xmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:04:06Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-xkmt7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:18Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:19 crc kubenswrapper[4707]: I1127 16:04:19.012968 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 
27 16:04:19 crc kubenswrapper[4707]: I1127 16:04:19.013055 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:19 crc kubenswrapper[4707]: I1127 16:04:19.013084 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:19 crc kubenswrapper[4707]: I1127 16:04:19.013121 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:19 crc kubenswrapper[4707]: I1127 16:04:19.013142 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:19Z","lastTransitionTime":"2025-11-27T16:04:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:04:19 crc kubenswrapper[4707]: I1127 16:04:19.117942 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:19 crc kubenswrapper[4707]: I1127 16:04:19.118035 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:19 crc kubenswrapper[4707]: I1127 16:04:19.118057 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:19 crc kubenswrapper[4707]: I1127 16:04:19.118094 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:19 crc kubenswrapper[4707]: I1127 16:04:19.118116 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:19Z","lastTransitionTime":"2025-11-27T16:04:19Z","reason":"KubeletNotReady","message":"container runtime network 
not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:04:19 crc kubenswrapper[4707]: I1127 16:04:19.194504 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 27 16:04:19 crc kubenswrapper[4707]: E1127 16:04:19.194704 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 27 16:04:19 crc kubenswrapper[4707]: I1127 16:04:19.194792 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 27 16:04:19 crc kubenswrapper[4707]: I1127 16:04:19.194845 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 27 16:04:19 crc kubenswrapper[4707]: E1127 16:04:19.194998 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 27 16:04:19 crc kubenswrapper[4707]: E1127 16:04:19.195131 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 27 16:04:19 crc kubenswrapper[4707]: I1127 16:04:19.222327 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-5x488"] Nov 27 16:04:19 crc kubenswrapper[4707]: I1127 16:04:19.222749 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:19 crc kubenswrapper[4707]: I1127 16:04:19.222861 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:19 crc kubenswrapper[4707]: I1127 16:04:19.222922 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:19 crc kubenswrapper[4707]: I1127 16:04:19.222995 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:19 crc kubenswrapper[4707]: I1127 16:04:19.223022 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:19Z","lastTransitionTime":"2025-11-27T16:04:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:04:19 crc kubenswrapper[4707]: I1127 16:04:19.223151 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-5x488" Nov 27 16:04:19 crc kubenswrapper[4707]: I1127 16:04:19.227675 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Nov 27 16:04:19 crc kubenswrapper[4707]: I1127 16:04:19.227715 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Nov 27 16:04:19 crc kubenswrapper[4707]: I1127 16:04:19.250922 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b00fa8691d479d1db194704ff73bb85f1108c9541dd09ed4c1b02d0a3f0ab53d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\"
:true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:19Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:19 crc kubenswrapper[4707]: I1127 16:04:19.270751 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-js6mm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ca48c08-f39d-41a2-847a-c893a2111492\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://be6a704ee152614d36d4831778b016c009b0aab2d6cf1c2ec9767f8ecdeb7af6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bjt58\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:04:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-js6mm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:19Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:19 crc kubenswrapper[4707]: I1127 16:04:19.297148 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: 
\"kubernetes.io/configmap/433a450e-2371-4893-b21e-19707b40e28f-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-5x488\" (UID: \"433a450e-2371-4893-b21e-19707b40e28f\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-5x488" Nov 27 16:04:19 crc kubenswrapper[4707]: I1127 16:04:19.297205 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-52jnc\" (UniqueName: \"kubernetes.io/projected/433a450e-2371-4893-b21e-19707b40e28f-kube-api-access-52jnc\") pod \"ovnkube-control-plane-749d76644c-5x488\" (UID: \"433a450e-2371-4893-b21e-19707b40e28f\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-5x488" Nov 27 16:04:19 crc kubenswrapper[4707]: I1127 16:04:19.297227 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/433a450e-2371-4893-b21e-19707b40e28f-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-5x488\" (UID: \"433a450e-2371-4893-b21e-19707b40e28f\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-5x488" Nov 27 16:04:19 crc kubenswrapper[4707]: I1127 16:04:19.297270 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/433a450e-2371-4893-b21e-19707b40e28f-env-overrides\") pod \"ovnkube-control-plane-749d76644c-5x488\" (UID: \"433a450e-2371-4893-b21e-19707b40e28f\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-5x488" Nov 27 16:04:19 crc kubenswrapper[4707]: I1127 16:04:19.311190 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"59e09a52-f64d-41da-a23b-7d3e34e8f109\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://489349489e52f4a113203c0c196cfe1ef96e97c6fc3c3d777a28e90f22aaee92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ea14f8eb17dbb350f7c0ad2285b4bb5ca2d8b2aae6840646d68a397ac2043d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://775fc674c0dab97e0c5a2548d467b973c592458dad1b88ce92344486a3a28eb5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f502a2429578531cb3ebbfde6bb4c049eed201a496ff3772c06ca28cace89b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://30b52b26bdca5b0705186872e6dd2eba3ece6436e866a4a5e30f4c2ecdbbaf20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b6be834802616cdf10f8b694bb0ce1b8828d0885e8801c5d890e74837d672a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b6be834802616cdf10f8b694bb0ce1b8828d0885e8801c5d890e74837d672a8\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-11-27T16:03:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://de1e5852273af33e41f21c4df29c3e32cbfd6b2eb200b64d9e5353a85718074c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://de1e5852273af33e41f21c4df29c3e32cbfd6b2eb200b64d9e5353a85718074c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:03:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:03:47Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://c6f7eb0e4b78b6d2e316bd2c8f2ad3081c8eb0ca3c55a7063ccf281572ba2992\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c6f7eb0e4b78b6d2e316bd2c8f2ad3081c8eb0ca3c55a7063ccf281572ba2992\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:03:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:03:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:03:45Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:19Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:19 crc kubenswrapper[4707]: I1127 16:04:19.327494 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:19 crc kubenswrapper[4707]: I1127 16:04:19.327576 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:19 crc kubenswrapper[4707]: I1127 16:04:19.327600 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:19 crc kubenswrapper[4707]: I1127 16:04:19.327631 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:19 crc kubenswrapper[4707]: I1127 16:04:19.327653 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:19Z","lastTransitionTime":"2025-11-27T16:04:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:04:19 crc kubenswrapper[4707]: I1127 16:04:19.340825 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f300806f-b97e-4453-b011-19442ca1240d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a25770e0e3eebd7e807f3775101a1cd5d47073e4a86850da475fecd4a9c88ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-d
ir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7412dcdb5a249551f558a01a8fecef5f873e5e0a16e917d5cbc1ca74a917f228\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://791f332aec1698d02da5ec585ad2fbfcb77529f24abc0e3a77581be38d7ce277\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a47eb0592819f0bc514399e163c88e2c63111aa0ec4a28f00f809ce1bdf2aa8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945
c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f12d0d89698cd6f702c595882dc7ed97fbc585b8a4ef3a8b3755c984a823dd9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-27T16:04:05Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1127 16:03:58.832133 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1127 16:03:58.833970 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3721107874/tls.crt::/tmp/serving-cert-3721107874/tls.key\\\\\\\"\\\\nI1127 16:04:05.094602 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1127 16:04:05.099813 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1127 16:04:05.099850 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1127 16:04:05.099879 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1127 16:04:05.099886 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1127 16:04:05.108199 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1127 16:04:05.108225 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1127 16:04:05.108231 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1127 16:04:05.108269 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1127 16:04:05.108276 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1127 16:04:05.108281 1 
secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1127 16:04:05.108284 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1127 16:04:05.108289 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1127 16:04:05.114674 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-27T16:03:48Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf16f5508845d69184573b51f3052f79123556667cbd3ddf4b2adfb135312a3a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:48Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://608ad63f8f5352dad650253df603590a1cc20bc05d86dd561d659295264ab4b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d347
20243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://608ad63f8f5352dad650253df603590a1cc20bc05d86dd561d659295264ab4b5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:03:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:03:45Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:19Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:19 crc kubenswrapper[4707]: I1127 16:04:19.363353 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:19Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:19 crc kubenswrapper[4707]: I1127 16:04:19.384912 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-c995m" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a83beb0d-8dd1-434a-ace2-933f98e3956f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://419d9b9f3f298028cf5f49afd438f1de90b8d1203cdfb09b924878b59e0fb723\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5xtzp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6227d8aa794cce73ef7607c3e6cf8f853c829bf5
6493acf04c0b555ddd264ba5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5xtzp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:04:06Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-c995m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:19Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:19 crc kubenswrapper[4707]: I1127 16:04:19.398928 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/433a450e-2371-4893-b21e-19707b40e28f-env-overrides\") pod \"ovnkube-control-plane-749d76644c-5x488\" (UID: \"433a450e-2371-4893-b21e-19707b40e28f\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-5x488" Nov 27 16:04:19 crc kubenswrapper[4707]: I1127 16:04:19.399083 4707 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/433a450e-2371-4893-b21e-19707b40e28f-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-5x488\" (UID: \"433a450e-2371-4893-b21e-19707b40e28f\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-5x488" Nov 27 16:04:19 crc kubenswrapper[4707]: I1127 16:04:19.399130 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-52jnc\" (UniqueName: \"kubernetes.io/projected/433a450e-2371-4893-b21e-19707b40e28f-kube-api-access-52jnc\") pod \"ovnkube-control-plane-749d76644c-5x488\" (UID: \"433a450e-2371-4893-b21e-19707b40e28f\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-5x488" Nov 27 16:04:19 crc kubenswrapper[4707]: I1127 16:04:19.399175 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/433a450e-2371-4893-b21e-19707b40e28f-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-5x488\" (UID: \"433a450e-2371-4893-b21e-19707b40e28f\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-5x488" Nov 27 16:04:19 crc kubenswrapper[4707]: I1127 16:04:19.399946 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/433a450e-2371-4893-b21e-19707b40e28f-env-overrides\") pod \"ovnkube-control-plane-749d76644c-5x488\" (UID: \"433a450e-2371-4893-b21e-19707b40e28f\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-5x488" Nov 27 16:04:19 crc kubenswrapper[4707]: I1127 16:04:19.400995 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/433a450e-2371-4893-b21e-19707b40e28f-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-5x488\" (UID: \"433a450e-2371-4893-b21e-19707b40e28f\") " 
pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-5x488" Nov 27 16:04:19 crc kubenswrapper[4707]: I1127 16:04:19.406315 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7f481770-a9a1-422a-a707-e841a24a1503\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f0e4a5fa520e0dc048bef42e48e14bd06eb72586ca2b8e145734aa5450aa869d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"contai
nerID\\\":\\\"cri-o://36f6a5537d5031d21ea157fbcedea40756eac2c44aa1859c0cfbf3dc9a9d9a13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c7144aa332827972c3ff984eb05b835b88c4d9d05552ac631fbca49f13e193a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8211f74a0c0d16257330ebbb5727247d4229848595af3cb65f1cb752b05f3c37\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-contr
oller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:03:45Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:19Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:19 crc kubenswrapper[4707]: I1127 16:04:19.410771 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/433a450e-2371-4893-b21e-19707b40e28f-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-5x488\" (UID: \"433a450e-2371-4893-b21e-19707b40e28f\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-5x488" Nov 27 16:04:19 crc kubenswrapper[4707]: I1127 16:04:19.428846 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:05Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:19Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:19 crc kubenswrapper[4707]: I1127 16:04:19.430755 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:19 crc kubenswrapper[4707]: I1127 16:04:19.430830 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:19 crc kubenswrapper[4707]: I1127 16:04:19.430849 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:19 crc kubenswrapper[4707]: I1127 16:04:19.431356 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:19 crc kubenswrapper[4707]: I1127 16:04:19.431441 4707 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:19Z","lastTransitionTime":"2025-11-27T16:04:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:04:19 crc kubenswrapper[4707]: I1127 16:04:19.434309 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-52jnc\" (UniqueName: \"kubernetes.io/projected/433a450e-2371-4893-b21e-19707b40e28f-kube-api-access-52jnc\") pod \"ovnkube-control-plane-749d76644c-5x488\" (UID: \"433a450e-2371-4893-b21e-19707b40e28f\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-5x488" Nov 27 16:04:19 crc kubenswrapper[4707]: I1127 16:04:19.452432 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-bhmsc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d1ec526b-6fb0-4c87-bd87-6aaf843e0c78\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f3b764fd994edba08f5e909683bd065ecf09fb7c9906457e37bb91343aad14a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kvs85\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:04:06Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-bhmsc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:19Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:19 crc kubenswrapper[4707]: I1127 16:04:19.474761 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:05Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:19Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:19 crc kubenswrapper[4707]: I1127 16:04:19.495978 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://57225b64e7671f397832e9ba112febbde78ae4c24b37a8f097d4bdeda8931c0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6960cfa0c6d0dedc432f90bf267ce02fe8264924a43865f79eadd9affbe955f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:19Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:19 crc kubenswrapper[4707]: I1127 16:04:19.516399 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6cf743e8776aa2de4138704d9e163af8baf7b8e5db139c13775673cf32a1ee88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-11-27T16:04:19Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:19 crc kubenswrapper[4707]: I1127 16:04:19.534941 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:19 crc kubenswrapper[4707]: I1127 16:04:19.535222 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:19 crc kubenswrapper[4707]: I1127 16:04:19.535353 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:19 crc kubenswrapper[4707]: I1127 16:04:19.535534 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:19 crc kubenswrapper[4707]: I1127 16:04:19.535655 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:19Z","lastTransitionTime":"2025-11-27T16:04:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:04:19 crc kubenswrapper[4707]: I1127 16:04:19.542799 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-9c4xg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37bcfa90-e50b-4a63-96b2-cd8a0ce5bbf7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://66dcf851b2fb739c3dae214852a24405d2c8b4462dde998903b9dfe7c9e61e00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hwzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8cd7c5e0c832499cb9ddc7b18c7d260de06df34208f86284ebe77964bfcf1c12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8cd7c5e0c832499cb9ddc7b18c7d260de06df34208f86284ebe77964bfcf1c12\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:04:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hwzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4438fd6ab08cd53dff20617ccadaa81e4bddbd7ced310f190d02613079f96f64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"
containerID\\\":\\\"cri-o://4438fd6ab08cd53dff20617ccadaa81e4bddbd7ced310f190d02613079f96f64\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:04:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:04:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hwzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19a31a44d1ecfee004f2a705df22a188782cd9f235848ddf5f48bd740833c6a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://19a31a44d1ecfee004f2a705df22a188782cd9f235848ddf5f48bd740833c6a8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:04:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:04:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hwzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://db8a59276d18141805bc74721d625e2004f2cea1f47efc90b3290c68eba2a35b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://db8a59276d18141805bc74721d625e2004f2cea1f47efc90b3290c68eba2a35b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:04:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:04:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hwzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://21efa1504e98747329f8a9220126ea259edde61bd4d272ea55691e9952950252\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\"
:0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://21efa1504e98747329f8a9220126ea259edde61bd4d272ea55691e9952950252\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:04:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:04:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hwzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://851c05dbd0947990d464d1bd72d62577bdf12517aea808904aa01fc79fd2d7c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://851c05dbd0947990d464d1bd72d62577bdf12517aea808904aa01fc79fd2d7c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:04:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:04:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hwzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:04:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-9c4xg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:19Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:19 crc kubenswrapper[4707]: I1127 16:04:19.546089 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-5x488" Nov 27 16:04:19 crc kubenswrapper[4707]: W1127 16:04:19.573294 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod433a450e_2371_4893_b21e_19707b40e28f.slice/crio-0c796594eb58ba483bdce41e1dd1a4a66822627503061434148e52d78115e32d WatchSource:0}: Error finding container 0c796594eb58ba483bdce41e1dd1a4a66822627503061434148e52d78115e32d: Status 404 returned error can't find the container with id 0c796594eb58ba483bdce41e1dd1a4a66822627503061434148e52d78115e32d Nov 27 16:04:19 crc kubenswrapper[4707]: I1127 16:04:19.582407 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xkmt7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"55af9c67-18ce-46f1-a761-d11ce16f42d6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a4eadca4b464dec02ca7aba6bc23d95b9c3965294d4b0224299279c630f3718\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6xmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3718d2f738ef780c738566c163d52afe355de6b8eb6004aa8e1dd5f37aa84d68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6xmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://44699a190cf1065e537f588d7479a10aab8fd4ffd1df2348c616d999ffdf7077\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6xmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6432d53914cee9e016c2134d99e48fa9ff1c05c0b88033992c2e0c70fc40d2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:07Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6xmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://93825fe5d9cf2d63c3e723b063f417f7e92442b14365852f935a06d316f443fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6xmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ae3fab13b078702f87ca8c586c9592c4554c95ead662d81ca65b50f368b6dcb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6xmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://def6189807ad0285980b2859cdfe683dd5c8f7c79104d57abeef37450f73655d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dbb7a8d9915fa6019298b207c3f4bf09fd27323f12d5dc7930f1efcc2371a11a\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-27T16:04:15Z\\\",\\\"message\\\":\\\"-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1127 16:04:15.866832 5913 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI1127 16:04:15.866884 5913 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1127 16:04:15.866938 5913 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1127 
16:04:15.866942 5913 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1127 16:04:15.866969 5913 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1127 16:04:15.866990 5913 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1127 16:04:15.866997 5913 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1127 16:04:15.866998 5913 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1127 16:04:15.867030 5913 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1127 16:04:15.867049 5913 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1127 16:04:15.867059 5913 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1127 16:04:15.867068 5913 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1127 16:04:15.871516 5913 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1127 16:04:15.871590 5913 factory.go:656] Stopping watch factory\\\\nI1127 16:04:15.871610 5913 ovnkube.go:599] Stopped ovnkube\\\\nI1127 16:04:1\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-27T16:04:12Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://def6189807ad0285980b2859cdfe683dd5c8f7c79104d57abeef37450f73655d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-27T16:04:17Z\\\",\\\"message\\\":\\\"lversions/factory.go:140\\\\nI1127 16:04:17.841915 6098 reflector.go:311] Stopping reflector *v1.EgressService (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressservice/v1/apis/informers/externalversions/factory.go:140\\\\nI1127 16:04:17.841959 6098 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1127 16:04:17.842031 6098 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1127 16:04:17.842064 6098 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1127 
16:04:17.842086 6098 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1127 16:04:17.842164 6098 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1127 16:04:17.842205 6098 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1127 16:04:17.842258 6098 factory.go:656] Stopping watch factory\\\\nI1127 16:04:17.842299 6098 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1127 16:04:17.842297 6098 handler.go:208] Removed *v1.Node event handler 2\\\\nI1127 16:04:17.842315 6098 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1127 16:04:17.842332 6098 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1127 16:04:17.842334 6098 handler.go:208] Removed *v1.Node event handler 7\\\\nI1127 16:04:17.842341 6098 handler.go:208] Removed *v1.Namespace ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-27T16:04:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\
":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6xmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://db3362aeb65b368d2d9181cc898270fffb60938d0e9d32a862130d06a9f572a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6
xmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c327fee553e5575dec76bc4456bc59424c89a9eea360cd4b23b594eff4c67a8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c327fee553e5575dec76bc4456bc59424c89a9eea360cd4b23b594eff4c67a8d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:04:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6xmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:04:06Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-xkmt7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:19Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:19 crc kubenswrapper[4707]: I1127 16:04:19.600661 4707 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-5x488" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"433a450e-2371-4893-b21e-19707b40e28f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:19Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:19Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52jnc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52jnc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:04:19Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-5x488\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:19Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:19 crc kubenswrapper[4707]: I1127 16:04:19.624250 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-5x488" event={"ID":"433a450e-2371-4893-b21e-19707b40e28f","Type":"ContainerStarted","Data":"0c796594eb58ba483bdce41e1dd1a4a66822627503061434148e52d78115e32d"} Nov 27 16:04:19 crc kubenswrapper[4707]: I1127 16:04:19.632244 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xkmt7_55af9c67-18ce-46f1-a761-d11ce16f42d6/ovnkube-controller/1.log" Nov 27 16:04:19 crc kubenswrapper[4707]: I1127 16:04:19.639478 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:19 crc kubenswrapper[4707]: I1127 16:04:19.639549 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:19 crc kubenswrapper[4707]: I1127 16:04:19.639563 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:19 crc kubenswrapper[4707]: I1127 16:04:19.639580 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:19 crc kubenswrapper[4707]: I1127 16:04:19.639593 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:19Z","lastTransitionTime":"2025-11-27T16:04:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:04:19 crc kubenswrapper[4707]: I1127 16:04:19.640886 4707 scope.go:117] "RemoveContainer" containerID="def6189807ad0285980b2859cdfe683dd5c8f7c79104d57abeef37450f73655d" Nov 27 16:04:19 crc kubenswrapper[4707]: E1127 16:04:19.641282 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-xkmt7_openshift-ovn-kubernetes(55af9c67-18ce-46f1-a761-d11ce16f42d6)\"" pod="openshift-ovn-kubernetes/ovnkube-node-xkmt7" podUID="55af9c67-18ce-46f1-a761-d11ce16f42d6" Nov 27 16:04:19 crc kubenswrapper[4707]: I1127 16:04:19.668366 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://57225b64e7671f397832e9ba112febbde78ae4c24b37a8f097d4bdeda8931c0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\
\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6960cfa0c6d0dedc432f90bf267ce02fe8264924a43865f79eadd9affbe955f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:19Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:19 crc kubenswrapper[4707]: 
I1127 16:04:19.685534 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6cf743e8776aa2de4138704d9e163af8baf7b8e5db139c13775673cf32a1ee88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:19Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:19 crc kubenswrapper[4707]: I1127 16:04:19.708021 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-9c4xg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37bcfa90-e50b-4a63-96b2-cd8a0ce5bbf7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://66dcf851b2fb739c3dae214852a24405d2c8b4462dde998903b9dfe7c9e61e00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\
":{\\\"startedAt\\\":\\\"2025-11-27T16:04:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hwzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8cd7c5e0c832499cb9ddc7b18c7d260de06df34208f86284ebe77964bfcf1c12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8cd7c5e0c832499cb9ddc7b18c7d260de06df34208f86284ebe77964bfcf1c12\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:04:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hwzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4438fd6ab08cd53dff20617ccadaa81e4bddbd7ced310f190d02613079f96f64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4438fd6ab08cd53dff20617ccadaa81e4bddbd7ced310f190d02613079f96f64\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:04:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:04:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hwzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19a31a44d1ecfee004f2a705df22a188782cd9f235848ddf5f48bd740833c6a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://19a31a44d1ecfee004f2a705df22a188782cd9f235848ddf5f48bd740833c6a8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:04:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:04:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",
\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hwzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://db8a59276d18141805bc74721d625e2004f2cea1f47efc90b3290c68eba2a35b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://db8a59276d18141805bc74721d625e2004f2cea1f47efc90b3290c68eba2a35b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:04:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:04:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hwzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://21efa1504e98747329f8a9220126ea259edde61bd4d272ea55691e9952950252\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://21efa1504e98747329f8a9220126ea259edde61bd4d272ea55691e9952950252\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:04:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:04:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hwzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://851c05dbd0947990d464d1bd72d62577bdf12517aea808904aa01fc79fd2d7c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://851c05dbd0947990d464d1bd72d62577bdf12517aea808904aa01fc79fd2d7c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:04:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:04:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"sys
tem-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hwzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:04:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-9c4xg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:19Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:19 crc kubenswrapper[4707]: I1127 16:04:19.729615 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xkmt7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"55af9c67-18ce-46f1-a761-d11ce16f42d6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"message\\\":\\\"containers with unready 
status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a4eadca4b464dec02ca7aba6bc23d95b9c3965294d4b0224299279c630f3718\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6xmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3718d2f738ef780c738566c163d52afe355de6b8eb6004aa8e1dd5f37aa84d68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metric
s-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6xmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://44699a190cf1065e537f588d7479a10aab8fd4ffd1df2348c616d999ffdf7077\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6xmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6432d53914cee9e016c2134d99e48fa9ff1c05c0b88033992c2e0c70fc40d2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:07Z\\\"}},\\\"volumeMount
s\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6xmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://93825fe5d9cf2d63c3e723b063f417f7e92442b14365852f935a06d316f443fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6xmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ae3fab13b078702f87ca8c586c9592c4554c95ead662d81ca65b50f368b6dcb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-contro
ller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6xmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://def6189807ad0285980b2859cdfe683dd5c8f7c79104d57abeef37450f73655d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://def6189807ad0285980b2859cdfe683dd5c8f7c79104d57abeef37450f73655d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-27T16:04:17Z\\\",\\\"message\\\":\\\"lversions/factory.go:140\\\\nI1127 16:04:17.841915 6098 reflector.go:311] Stopping reflector *v1.EgressService (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressservice/v1/apis/informers/externalversions/factory.go:140\\\\nI1127 16:04:17.841959 6098 handler.go:190] Sending *v1.Namespace event handler 1 for 
removal\\\\nI1127 16:04:17.842031 6098 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1127 16:04:17.842064 6098 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1127 16:04:17.842086 6098 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1127 16:04:17.842164 6098 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1127 16:04:17.842205 6098 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1127 16:04:17.842258 6098 factory.go:656] Stopping watch factory\\\\nI1127 16:04:17.842299 6098 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1127 16:04:17.842297 6098 handler.go:208] Removed *v1.Node event handler 2\\\\nI1127 16:04:17.842315 6098 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1127 16:04:17.842332 6098 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1127 16:04:17.842334 6098 handler.go:208] Removed *v1.Node event handler 7\\\\nI1127 16:04:17.842341 6098 handler.go:208] Removed *v1.Namespace ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-27T16:04:16Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-xkmt7_openshift-ovn-kubernetes(55af9c67-18ce-46f1-a761-d11ce16f42d6)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6xmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://db3362aeb65b368d2d9181cc898270fffb60938d0e9d32a862130d06a9f572a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6xmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c327fee553e5575dec76bc4456bc59424c89a9eea360cd4b23b594eff4c67a8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c327fee553e5575dec
76bc4456bc59424c89a9eea360cd4b23b594eff4c67a8d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:04:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6xmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:04:06Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-xkmt7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:19Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:19 crc kubenswrapper[4707]: I1127 16:04:19.742724 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:19 crc kubenswrapper[4707]: I1127 16:04:19.742763 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:19 crc kubenswrapper[4707]: I1127 16:04:19.742774 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:19 crc kubenswrapper[4707]: I1127 16:04:19.742793 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:19 crc kubenswrapper[4707]: I1127 16:04:19.742807 4707 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:19Z","lastTransitionTime":"2025-11-27T16:04:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:04:19 crc kubenswrapper[4707]: I1127 16:04:19.745034 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-5x488" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"433a450e-2371-4893-b21e-19707b40e28f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:19Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:19Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52jnc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52jnc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:04:19Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-5x488\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:19Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:19 crc kubenswrapper[4707]: I1127 16:04:19.760562 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:05Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:19Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:19 crc kubenswrapper[4707]: I1127 16:04:19.774673 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b00fa8691d479d1db194704ff73bb85f1108c9541dd09ed4c1b02d0a3f0ab53d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:19Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:19 crc kubenswrapper[4707]: I1127 16:04:19.786908 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-js6mm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ca48c08-f39d-41a2-847a-c893a2111492\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://be6a704ee152614d36d4831778b016c009b0aab2d6cf1c2ec9767f8ecdeb7af6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\"
,\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bjt58\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:04:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-js6mm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: 
certificate has expired or is not yet valid: current time 2025-11-27T16:04:19Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:19 crc kubenswrapper[4707]: I1127 16:04:19.800742 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-c995m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a83beb0d-8dd1-434a-ace2-933f98e3956f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://419d9b9f3f298028cf5f49afd438f1de90b8d1203cdfb09b924878b59e0fb723\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\"
:\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5xtzp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6227d8aa794cce73ef7607c3e6cf8f853c829bf56493acf04c0b555ddd264ba5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5xtzp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:04:06Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-c995m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:19Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:19 crc kubenswrapper[4707]: I1127 16:04:19.823073 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"59e09a52-f64d-41da-a23b-7d3e34e8f109\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://489349489e52f4a113203c0c196cfe1ef96e97c6fc3c3d777a28e90f22aaee92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ea14f8eb17dbb350f7c0ad2285b4bb5ca2d8b2aae6840646d68a397ac2043d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://775fc674c0dab97e0c5a2548d467b973c592458dad1b88ce92344486a3a28eb5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f502a2429578531cb3ebbfde6bb4c049eed201a496ff3772c06ca28cace89b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://30b52b26bdca5b0705186872e6dd2eba3ece6436e866a4a5e30f4c2ecdbbaf20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b6be834802616cdf10f8b694bb0ce1b8828d0885e8801c5d890e74837d672a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b6be834802616cdf10f8b694bb0ce1b8828d0885e8801c5d890e74837d672a8\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-11-27T16:03:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://de1e5852273af33e41f21c4df29c3e32cbfd6b2eb200b64d9e5353a85718074c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://de1e5852273af33e41f21c4df29c3e32cbfd6b2eb200b64d9e5353a85718074c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:03:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:03:47Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://c6f7eb0e4b78b6d2e316bd2c8f2ad3081c8eb0ca3c55a7063ccf281572ba2992\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c6f7eb0e4b78b6d2e316bd2c8f2ad3081c8eb0ca3c55a7063ccf281572ba2992\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:03:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:03:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:03:45Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:19Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:19 crc kubenswrapper[4707]: I1127 16:04:19.842845 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f300806f-b97e-4453-b011-19442ca1240d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a25770e0e3eebd7e807f3775101a1cd5d47073e4a86850da475fecd4a9c88ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-rel
ease-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7412dcdb5a249551f558a01a8fecef5f873e5e0a16e917d5cbc1ca74a917f228\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://791f332aec1698d02da5ec585ad2fbfcb77529f24abc0e3a77581be38d7ce277\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:47Z\\\"}},\\\"vo
lumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a47eb0592819f0bc514399e163c88e2c63111aa0ec4a28f00f809ce1bdf2aa8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f12d0d89698cd6f702c595882dc7ed97fbc585b8a4ef3a8b3755c984a823dd9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-27T16:04:05Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1127 16:03:58.832133 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1127 16:03:58.833970 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3721107874/tls.crt::/tmp/serving-cert-3721107874/tls.key\\\\\\\"\\\\nI1127 16:04:05.094602 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1127 16:04:05.099813 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1127 16:04:05.099850 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1127 16:04:05.099879 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1127 16:04:05.099886 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1127 16:04:05.108199 1 
secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1127 16:04:05.108225 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1127 16:04:05.108231 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1127 16:04:05.108269 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1127 16:04:05.108276 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1127 16:04:05.108281 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1127 16:04:05.108284 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1127 16:04:05.108289 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1127 16:04:05.114674 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-27T16:03:48Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf16f5508845d69184573b51f3052f79123556667cbd3ddf4b2adfb135312a3a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:48Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://608ad63f8f5352dad650253df603590a1cc20bc05d86dd561d659295264ab4b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://608ad63f8f5352dad650253df603590a1cc20bc05d86dd561d659295264ab4b5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:03:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:03:45Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:19Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:19 crc kubenswrapper[4707]: I1127 16:04:19.846721 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:19 crc kubenswrapper[4707]: I1127 16:04:19.846765 4707 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:19 crc kubenswrapper[4707]: I1127 16:04:19.846776 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:19 crc kubenswrapper[4707]: I1127 16:04:19.846795 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:19 crc kubenswrapper[4707]: I1127 16:04:19.846808 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:19Z","lastTransitionTime":"2025-11-27T16:04:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:04:19 crc kubenswrapper[4707]: I1127 16:04:19.859874 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:19Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:19 crc kubenswrapper[4707]: I1127 16:04:19.874636 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7f481770-a9a1-422a-a707-e841a24a1503\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f0e4a5fa520e0dc048bef42e48e14bd06eb72586ca2b8e145734aa5450aa869d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://36f6a5537d5031d21ea157fbcedea40756eac2c44aa1859c0cfbf3dc9a9d9a13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c7144aa332827972c3ff984eb05b835b88c4d9d05552ac631fbca49f13e193a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8211f74a0c0d16257330ebbb5727247d4229848595af3cb65f1cb752b05f3c37\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-11-27T16:03:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:03:45Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:19Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:19 crc kubenswrapper[4707]: I1127 16:04:19.890614 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:19Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:19 crc kubenswrapper[4707]: I1127 16:04:19.905700 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-bhmsc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d1ec526b-6fb0-4c87-bd87-6aaf843e0c78\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f3b764fd994edba08f5e909683bd065ecf09fb7c9906457e37bb91343aad14a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kvs85\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:04:06Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-bhmsc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:19Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:19 crc kubenswrapper[4707]: I1127 16:04:19.949344 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:19 crc kubenswrapper[4707]: I1127 16:04:19.949420 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:19 crc kubenswrapper[4707]: I1127 16:04:19.949432 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:19 crc kubenswrapper[4707]: I1127 16:04:19.949455 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:19 crc kubenswrapper[4707]: I1127 16:04:19.949471 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:19Z","lastTransitionTime":"2025-11-27T16:04:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:04:20 crc kubenswrapper[4707]: I1127 16:04:20.053259 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:20 crc kubenswrapper[4707]: I1127 16:04:20.053341 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:20 crc kubenswrapper[4707]: I1127 16:04:20.053362 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:20 crc kubenswrapper[4707]: I1127 16:04:20.053418 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:20 crc kubenswrapper[4707]: I1127 16:04:20.053438 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:20Z","lastTransitionTime":"2025-11-27T16:04:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:04:20 crc kubenswrapper[4707]: I1127 16:04:20.156416 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:20 crc kubenswrapper[4707]: I1127 16:04:20.156481 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:20 crc kubenswrapper[4707]: I1127 16:04:20.156498 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:20 crc kubenswrapper[4707]: I1127 16:04:20.156525 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:20 crc kubenswrapper[4707]: I1127 16:04:20.156553 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:20Z","lastTransitionTime":"2025-11-27T16:04:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:04:20 crc kubenswrapper[4707]: I1127 16:04:20.259536 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:20 crc kubenswrapper[4707]: I1127 16:04:20.259600 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:20 crc kubenswrapper[4707]: I1127 16:04:20.259619 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:20 crc kubenswrapper[4707]: I1127 16:04:20.259645 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:20 crc kubenswrapper[4707]: I1127 16:04:20.259667 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:20Z","lastTransitionTime":"2025-11-27T16:04:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:04:20 crc kubenswrapper[4707]: I1127 16:04:20.362070 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:20 crc kubenswrapper[4707]: I1127 16:04:20.362125 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:20 crc kubenswrapper[4707]: I1127 16:04:20.362146 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:20 crc kubenswrapper[4707]: I1127 16:04:20.362173 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:20 crc kubenswrapper[4707]: I1127 16:04:20.362189 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:20Z","lastTransitionTime":"2025-11-27T16:04:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:04:20 crc kubenswrapper[4707]: I1127 16:04:20.464963 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:20 crc kubenswrapper[4707]: I1127 16:04:20.465058 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:20 crc kubenswrapper[4707]: I1127 16:04:20.465086 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:20 crc kubenswrapper[4707]: I1127 16:04:20.465128 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:20 crc kubenswrapper[4707]: I1127 16:04:20.465156 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:20Z","lastTransitionTime":"2025-11-27T16:04:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:04:20 crc kubenswrapper[4707]: I1127 16:04:20.569313 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:20 crc kubenswrapper[4707]: I1127 16:04:20.569419 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:20 crc kubenswrapper[4707]: I1127 16:04:20.569447 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:20 crc kubenswrapper[4707]: I1127 16:04:20.569476 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:20 crc kubenswrapper[4707]: I1127 16:04:20.569499 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:20Z","lastTransitionTime":"2025-11-27T16:04:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:04:20 crc kubenswrapper[4707]: I1127 16:04:20.645957 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-5x488" event={"ID":"433a450e-2371-4893-b21e-19707b40e28f","Type":"ContainerStarted","Data":"35888508a810deea27c61cbf1ec7362be9f61f99a63d4ab2b4c296a9ed47c65f"} Nov 27 16:04:20 crc kubenswrapper[4707]: I1127 16:04:20.646023 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-5x488" event={"ID":"433a450e-2371-4893-b21e-19707b40e28f","Type":"ContainerStarted","Data":"12b178a129ce5efbfdb77d023236ed9d674d0aba3c3a2903589aae995a5a753c"} Nov 27 16:04:20 crc kubenswrapper[4707]: I1127 16:04:20.672454 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:20 crc kubenswrapper[4707]: I1127 16:04:20.672725 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:20 crc kubenswrapper[4707]: I1127 16:04:20.673096 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:20 crc kubenswrapper[4707]: I1127 16:04:20.673120 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:20 crc kubenswrapper[4707]: I1127 16:04:20.673134 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:20Z","lastTransitionTime":"2025-11-27T16:04:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:04:20 crc kubenswrapper[4707]: I1127 16:04:20.674168 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7f481770-a9a1-422a-a707-e841a24a1503\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f0e4a5fa520e0dc048bef42e48e14bd06eb72586ca2b8e145734aa5450aa869d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://36f6a5537d5
031d21ea157fbcedea40756eac2c44aa1859c0cfbf3dc9a9d9a13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c7144aa332827972c3ff984eb05b835b88c4d9d05552ac631fbca49f13e193a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8211f74a0c0d16257330ebbb5727247d4229848595af3cb65f1cb752b05f3c37\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:03:45Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:20Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:20 crc kubenswrapper[4707]: I1127 16:04:20.692000 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:05Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:20Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:20 crc kubenswrapper[4707]: I1127 16:04:20.705347 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-bhmsc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d1ec526b-6fb0-4c87-bd87-6aaf843e0c78\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f3b764fd994edba08f5e909683bd065ecf09fb7c9906457e37bb91343aad14a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kvs85\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:04:06Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-bhmsc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:20Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:20 crc kubenswrapper[4707]: I1127 16:04:20.736280 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xkmt7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"55af9c67-18ce-46f1-a761-d11ce16f42d6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a4eadca4b464dec02ca7aba6bc23d95b9c3965294d4b0224299279c630f3718\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6xmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3718d2f738ef780c738566c163d52afe355de6b8eb6004aa8e1dd5f37aa84d68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6xmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://44699a190cf1065e537f588d7479a10aab8fd4ffd1df2348c616d999ffdf7077\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6xmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6432d53914cee9e016c2134d99e48fa9ff1c05c0b88033992c2e0c70fc40d2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:07Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6xmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://93825fe5d9cf2d63c3e723b063f417f7e92442b14365852f935a06d316f443fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6xmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ae3fab13b078702f87ca8c586c9592c4554c95ead662d81ca65b50f368b6dcb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6xmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://def6189807ad0285980b2859cdfe683dd5c8f7c79104d57abeef37450f73655d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://def6189807ad0285980b2859cdfe683dd5c8f7c79104d57abeef37450f73655d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-27T16:04:17Z\\\",\\\"message\\\":\\\"lversions/factory.go:140\\\\nI1127 16:04:17.841915 6098 reflector.go:311] Stopping reflector *v1.EgressService (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressservice/v1/apis/informers/externalversions/factory.go:140\\\\nI1127 16:04:17.841959 6098 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1127 
16:04:17.842031 6098 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1127 16:04:17.842064 6098 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1127 16:04:17.842086 6098 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1127 16:04:17.842164 6098 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1127 16:04:17.842205 6098 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1127 16:04:17.842258 6098 factory.go:656] Stopping watch factory\\\\nI1127 16:04:17.842299 6098 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1127 16:04:17.842297 6098 handler.go:208] Removed *v1.Node event handler 2\\\\nI1127 16:04:17.842315 6098 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1127 16:04:17.842332 6098 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1127 16:04:17.842334 6098 handler.go:208] Removed *v1.Node event handler 7\\\\nI1127 16:04:17.842341 6098 handler.go:208] Removed *v1.Namespace ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-27T16:04:16Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-xkmt7_openshift-ovn-kubernetes(55af9c67-18ce-46f1-a761-d11ce16f42d6)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6xmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://db3362aeb65b368d2d9181cc898270fffb60938d0e9d32a862130d06a9f572a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6xmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c327fee553e5575dec76bc4456bc59424c89a9eea360cd4b23b594eff4c67a8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c327fee553e5575dec
76bc4456bc59424c89a9eea360cd4b23b594eff4c67a8d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:04:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6xmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:04:06Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-xkmt7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:20Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:20 crc kubenswrapper[4707]: I1127 16:04:20.753605 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-5x488" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"433a450e-2371-4893-b21e-19707b40e28f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://12b178a129ce5efbfdb77d023236ed9d674d0aba3c3a2903589aae995a5a753c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52jnc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://35888508a810deea27c61cbf1ec7362be9f61
f99a63d4ab2b4c296a9ed47c65f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52jnc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:04:19Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-5x488\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:20Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:20 crc kubenswrapper[4707]: I1127 16:04:20.762482 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/node-ca-pdngz"] Nov 27 16:04:20 crc kubenswrapper[4707]: I1127 16:04:20.763082 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-pdngz" Nov 27 16:04:20 crc kubenswrapper[4707]: I1127 16:04:20.765248 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Nov 27 16:04:20 crc kubenswrapper[4707]: I1127 16:04:20.765778 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Nov 27 16:04:20 crc kubenswrapper[4707]: I1127 16:04:20.766222 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Nov 27 16:04:20 crc kubenswrapper[4707]: I1127 16:04:20.766452 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Nov 27 16:04:20 crc kubenswrapper[4707]: I1127 16:04:20.775661 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:20 crc kubenswrapper[4707]: I1127 16:04:20.775792 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:20 crc kubenswrapper[4707]: I1127 16:04:20.775856 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:20 crc kubenswrapper[4707]: I1127 16:04:20.775923 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:20 crc kubenswrapper[4707]: I1127 16:04:20.775999 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:20Z","lastTransitionTime":"2025-11-27T16:04:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:04:20 crc kubenswrapper[4707]: I1127 16:04:20.777050 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:05Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:20Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:20 crc kubenswrapper[4707]: I1127 16:04:20.799101 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://57225b64e7671f397832e9ba112febbde78ae4c24b37a8f097d4bdeda8931c0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6960cfa0c6d0dedc432f90bf267ce02fe8264924a43865f79eadd9affbe955f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:20Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:20 crc kubenswrapper[4707]: I1127 16:04:20.819581 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/5a0b58a3-9ac2-4446-b507-88eba42aa060-host\") pod \"node-ca-pdngz\" (UID: \"5a0b58a3-9ac2-4446-b507-88eba42aa060\") " pod="openshift-image-registry/node-ca-pdngz" Nov 27 16:04:20 crc kubenswrapper[4707]: I1127 16:04:20.819743 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6sq6r\" (UniqueName: \"kubernetes.io/projected/5a0b58a3-9ac2-4446-b507-88eba42aa060-kube-api-access-6sq6r\") pod \"node-ca-pdngz\" (UID: \"5a0b58a3-9ac2-4446-b507-88eba42aa060\") " pod="openshift-image-registry/node-ca-pdngz" Nov 27 16:04:20 crc kubenswrapper[4707]: I1127 16:04:20.819885 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/5a0b58a3-9ac2-4446-b507-88eba42aa060-serviceca\") pod \"node-ca-pdngz\" (UID: \"5a0b58a3-9ac2-4446-b507-88eba42aa060\") " pod="openshift-image-registry/node-ca-pdngz" Nov 27 16:04:20 crc kubenswrapper[4707]: I1127 16:04:20.823989 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6cf743e8776aa2de4138704d9e163af8baf7b8e5db139c13775673cf32a1ee88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secre
ts/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:20Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:20 crc kubenswrapper[4707]: I1127 16:04:20.841354 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-9c4xg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37bcfa90-e50b-4a63-96b2-cd8a0ce5bbf7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://66dcf851b2fb739c3dae214852a24405d2c8b4462dde998903b9dfe7c9e61e00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release
-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hwzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8cd7c5e0c832499cb9ddc7b18c7d260de06df34208f86284ebe77964bfcf1c12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8cd7c5e0c832499cb9ddc7b18c7d260de06df34208f86284ebe77964bfcf1c12\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:04:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hwzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4438fd6ab08c
d53dff20617ccadaa81e4bddbd7ced310f190d02613079f96f64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4438fd6ab08cd53dff20617ccadaa81e4bddbd7ced310f190d02613079f96f64\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:04:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:04:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hwzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19a31a44d1ecfee004f2a705df22a188782cd9f235848ddf5f48bd740833c6a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://19a31a44d1ec
fee004f2a705df22a188782cd9f235848ddf5f48bd740833c6a8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:04:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:04:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hwzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://db8a59276d18141805bc74721d625e2004f2cea1f47efc90b3290c68eba2a35b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://db8a59276d18141805bc74721d625e2004f2cea1f47efc90b3290c68eba2a35b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:04:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:04:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hwzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\
\\"}]},{\\\"containerID\\\":\\\"cri-o://21efa1504e98747329f8a9220126ea259edde61bd4d272ea55691e9952950252\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://21efa1504e98747329f8a9220126ea259edde61bd4d272ea55691e9952950252\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:04:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:04:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hwzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://851c05dbd0947990d464d1bd72d62577bdf12517aea808904aa01fc79fd2d7c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://851c05dbd0947990d464d1bd72d62577bdf12517aea808904aa01fc79fd2d7c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"202
5-11-27T16:04:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:04:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hwzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:04:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-9c4xg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:20Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:20 crc kubenswrapper[4707]: I1127 16:04:20.862818 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b00fa8691d479d1db194704ff73bb85f1108c9541dd09ed4c1b02d0a3f0ab53d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:20Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:20 crc kubenswrapper[4707]: I1127 16:04:20.879086 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:20 crc kubenswrapper[4707]: I1127 16:04:20.879165 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:20 crc kubenswrapper[4707]: I1127 16:04:20.879186 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:20 crc kubenswrapper[4707]: I1127 16:04:20.879217 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:20 crc kubenswrapper[4707]: I1127 16:04:20.879237 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:20Z","lastTransitionTime":"2025-11-27T16:04:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:04:20 crc kubenswrapper[4707]: I1127 16:04:20.884184 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-js6mm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ca48c08-f39d-41a2-847a-c893a2111492\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://be6a704ee152614d36d4831778b016c009b0aab2d6cf1c2ec9767f8ecdeb7af6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bjt58\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:04:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-js6mm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:20Z 
is after 2025-08-24T17:21:41Z" Nov 27 16:04:20 crc kubenswrapper[4707]: I1127 16:04:20.920994 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 27 16:04:20 crc kubenswrapper[4707]: I1127 16:04:20.921580 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/5a0b58a3-9ac2-4446-b507-88eba42aa060-host\") pod \"node-ca-pdngz\" (UID: \"5a0b58a3-9ac2-4446-b507-88eba42aa060\") " pod="openshift-image-registry/node-ca-pdngz" Nov 27 16:04:20 crc kubenswrapper[4707]: E1127 16:04:20.921721 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-27 16:04:36.921679743 +0000 UTC m=+52.553128591 (durationBeforeRetry 16s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 27 16:04:20 crc kubenswrapper[4707]: I1127 16:04:20.921765 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/5a0b58a3-9ac2-4446-b507-88eba42aa060-host\") pod \"node-ca-pdngz\" (UID: \"5a0b58a3-9ac2-4446-b507-88eba42aa060\") " pod="openshift-image-registry/node-ca-pdngz" Nov 27 16:04:20 crc kubenswrapper[4707]: I1127 16:04:20.922029 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6sq6r\" (UniqueName: \"kubernetes.io/projected/5a0b58a3-9ac2-4446-b507-88eba42aa060-kube-api-access-6sq6r\") pod \"node-ca-pdngz\" (UID: \"5a0b58a3-9ac2-4446-b507-88eba42aa060\") " pod="openshift-image-registry/node-ca-pdngz" Nov 27 16:04:20 crc kubenswrapper[4707]: I1127 16:04:20.922263 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/5a0b58a3-9ac2-4446-b507-88eba42aa060-serviceca\") pod \"node-ca-pdngz\" (UID: \"5a0b58a3-9ac2-4446-b507-88eba42aa060\") " pod="openshift-image-registry/node-ca-pdngz" Nov 27 16:04:20 crc kubenswrapper[4707]: I1127 16:04:20.923669 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/5a0b58a3-9ac2-4446-b507-88eba42aa060-serviceca\") pod \"node-ca-pdngz\" (UID: \"5a0b58a3-9ac2-4446-b507-88eba42aa060\") " pod="openshift-image-registry/node-ca-pdngz" Nov 27 16:04:20 crc kubenswrapper[4707]: I1127 16:04:20.930270 4707 status_manager.go:875] 
"Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"59e09a52-f64d-41da-a23b-7d3e34e8f109\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://489349489e52f4a113203c0c196cfe1ef96e97c6fc3c3d777a28e90f22aaee92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ea14f8eb17dbb350f7c0ad2285b4bb5ca2
d8b2aae6840646d68a397ac2043d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://775fc674c0dab97e0c5a2548d467b973c592458dad1b88ce92344486a3a28eb5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f502a2429578531cb3ebbfde6bb4c049eed201a496ff3772c06ca28cace89b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"e
tcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://30b52b26bdca5b0705186872e6dd2eba3ece6436e866a4a5e30f4c2ecdbbaf20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b6be834802616cdf10f8b694bb0ce1b8828d0885e8801c5d890e74837d672a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b6be834802
616cdf10f8b694bb0ce1b8828d0885e8801c5d890e74837d672a8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:03:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://de1e5852273af33e41f21c4df29c3e32cbfd6b2eb200b64d9e5353a85718074c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://de1e5852273af33e41f21c4df29c3e32cbfd6b2eb200b64d9e5353a85718074c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:03:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:03:47Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://c6f7eb0e4b78b6d2e316bd2c8f2ad3081c8eb0ca3c55a7063ccf281572ba2992\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c6f7eb0e4b78b6d2e316bd2c8f2ad3081c8eb0ca3c55a7063ccf281572ba2992\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:03:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:03:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\
"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:03:45Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:20Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:20 crc kubenswrapper[4707]: I1127 16:04:20.949980 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f300806f-b97e-4453-b011-19442ca1240d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a25770e0e3eebd7e807f3775101a1cd5d47073e4a86850da475fecd4a9c88ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee122
0d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7412dcdb5a249551f558a01a8fecef5f873e5e0a16e917d5cbc1ca74a917f228\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://791f332aec1698d02da5ec585ad2fbfcb77529f24abc0e3a77581be38d7ce277\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":
true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a47eb0592819f0bc514399e163c88e2c63111aa0ec4a28f00f809ce1bdf2aa8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f12d0d89698cd6f702c595882dc7ed97fbc585b8a4ef3a8b3755c984a823dd9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-27T16:04:05Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1127 16:03:58.832133 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1127 16:03:58.833970 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3721107874/tls.crt::/tmp/serving-cert-3721107874/tls.key\\\\\\\"\\\\nI1127 16:04:05.094602 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1127 16:04:05.099813 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1127 16:04:05.099850 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1127 16:04:05.099879 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1127 16:04:05.099886 1 maxinflight.go:120] \\\\\\\"Set 
denominator for mutating requests\\\\\\\" limit=200\\\\nI1127 16:04:05.108199 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1127 16:04:05.108225 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1127 16:04:05.108231 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1127 16:04:05.108269 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1127 16:04:05.108276 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1127 16:04:05.108281 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1127 16:04:05.108284 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1127 16:04:05.108289 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1127 16:04:05.114674 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-27T16:03:48Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf16f5508845d69184573b51f3052f79123556667cbd3ddf4b2adfb135312a3a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:48Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://608ad63f8f5352dad650253df603590a1cc20bc05d86dd561d659295264ab4b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://608ad63f8f5352dad650253df603590a1cc20bc05d86dd561d659295264ab4b5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:03:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"sta
rtedAt\\\":\\\"2025-11-27T16:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:03:45Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:20Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:20 crc kubenswrapper[4707]: I1127 16:04:20.953168 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6sq6r\" (UniqueName: \"kubernetes.io/projected/5a0b58a3-9ac2-4446-b507-88eba42aa060-kube-api-access-6sq6r\") pod \"node-ca-pdngz\" (UID: \"5a0b58a3-9ac2-4446-b507-88eba42aa060\") " pod="openshift-image-registry/node-ca-pdngz" Nov 27 16:04:20 crc kubenswrapper[4707]: I1127 16:04:20.975727 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:20Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:20 crc kubenswrapper[4707]: I1127 16:04:20.982012 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:20 crc kubenswrapper[4707]: I1127 16:04:20.982047 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Nov 27 16:04:20 crc kubenswrapper[4707]: I1127 16:04:20.982056 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:20 crc kubenswrapper[4707]: I1127 16:04:20.982073 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:20 crc kubenswrapper[4707]: I1127 16:04:20.982084 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:20Z","lastTransitionTime":"2025-11-27T16:04:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:04:20 crc kubenswrapper[4707]: I1127 16:04:20.992898 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-c995m" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a83beb0d-8dd1-434a-ace2-933f98e3956f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://419d9b9f3f298028cf5f49afd438f1de90b8d1203cdfb09b924878b59e0fb723\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5xtzp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6227d8aa794cce73ef7607c3e6cf8f853c829bf5
6493acf04c0b555ddd264ba5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5xtzp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:04:06Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-c995m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:20Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:21 crc kubenswrapper[4707]: I1127 16:04:21.024049 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 27 16:04:21 crc kubenswrapper[4707]: I1127 16:04:21.024134 4707 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 27 16:04:21 crc kubenswrapper[4707]: I1127 16:04:21.024180 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 27 16:04:21 crc kubenswrapper[4707]: I1127 16:04:21.024219 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 27 16:04:21 crc kubenswrapper[4707]: E1127 16:04:21.024339 4707 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Nov 27 16:04:21 crc kubenswrapper[4707]: E1127 16:04:21.024453 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-11-27 16:04:37.024427096 +0000 UTC m=+52.655875894 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Nov 27 16:04:21 crc kubenswrapper[4707]: E1127 16:04:21.024787 4707 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Nov 27 16:04:21 crc kubenswrapper[4707]: E1127 16:04:21.024951 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-11-27 16:04:37.024915719 +0000 UTC m=+52.656364527 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Nov 27 16:04:21 crc kubenswrapper[4707]: E1127 16:04:21.024966 4707 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Nov 27 16:04:21 crc kubenswrapper[4707]: E1127 16:04:21.025063 4707 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Nov 27 16:04:21 crc kubenswrapper[4707]: E1127 16:04:21.025142 4707 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod 
openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 27 16:04:21 crc kubenswrapper[4707]: E1127 16:04:21.025009 4707 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Nov 27 16:04:21 crc kubenswrapper[4707]: E1127 16:04:21.025206 4707 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Nov 27 16:04:21 crc kubenswrapper[4707]: E1127 16:04:21.025228 4707 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 27 16:04:21 crc kubenswrapper[4707]: E1127 16:04:21.025328 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-11-27 16:04:37.025260277 +0000 UTC m=+52.656709325 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 27 16:04:21 crc kubenswrapper[4707]: E1127 16:04:21.025434 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-11-27 16:04:37.025353149 +0000 UTC m=+52.656802217 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 27 16:04:21 crc kubenswrapper[4707]: I1127 16:04:21.026841 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"59e09a52-f64d-41da-a23b-7d3e34e8f109\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://489349489e52f4a113203c0c196cfe1ef96e97c6fc3c3d777a28e90f22aaee92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ea14f8eb17dbb350f7c0ad2285b4bb5ca2d8b2aae6840646d68a397ac2043d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://775fc674c0dab97e0c5a2548d467b973c592458dad1b88ce92344486a3a28eb5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f502a2429578531cb3ebbfde6bb4c049eed201a496ff3772c06ca28cace89b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://30b52b26bdca5b0705186872e6dd2eba3ece6436e866a4a5e30f4c2ecdbbaf20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b6be834802616cdf10f8b694bb0ce1b8828d0885e8801c5d890e74837d672a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b6be834802616cdf10f8b694bb0ce1b8828d0885e8801c5d890e74837d672a8\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-11-27T16:03:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://de1e5852273af33e41f21c4df29c3e32cbfd6b2eb200b64d9e5353a85718074c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://de1e5852273af33e41f21c4df29c3e32cbfd6b2eb200b64d9e5353a85718074c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:03:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:03:47Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://c6f7eb0e4b78b6d2e316bd2c8f2ad3081c8eb0ca3c55a7063ccf281572ba2992\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c6f7eb0e4b78b6d2e316bd2c8f2ad3081c8eb0ca3c55a7063ccf281572ba2992\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:03:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:03:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:03:45Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:21Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:21 crc kubenswrapper[4707]: I1127 16:04:21.050572 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f300806f-b97e-4453-b011-19442ca1240d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a25770e0e3eebd7e807f3775101a1cd5d47073e4a86850da475fecd4a9c88ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-rel
ease-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7412dcdb5a249551f558a01a8fecef5f873e5e0a16e917d5cbc1ca74a917f228\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://791f332aec1698d02da5ec585ad2fbfcb77529f24abc0e3a77581be38d7ce277\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:47Z\\\"}},\\\"vo
lumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a47eb0592819f0bc514399e163c88e2c63111aa0ec4a28f00f809ce1bdf2aa8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f12d0d89698cd6f702c595882dc7ed97fbc585b8a4ef3a8b3755c984a823dd9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-27T16:04:05Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1127 16:03:58.832133 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1127 16:03:58.833970 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3721107874/tls.crt::/tmp/serving-cert-3721107874/tls.key\\\\\\\"\\\\nI1127 16:04:05.094602 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1127 16:04:05.099813 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1127 16:04:05.099850 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1127 16:04:05.099879 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1127 16:04:05.099886 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1127 16:04:05.108199 1 
secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1127 16:04:05.108225 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1127 16:04:05.108231 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1127 16:04:05.108269 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1127 16:04:05.108276 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1127 16:04:05.108281 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1127 16:04:05.108284 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1127 16:04:05.108289 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1127 16:04:05.114674 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-27T16:03:48Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf16f5508845d69184573b51f3052f79123556667cbd3ddf4b2adfb135312a3a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:48Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://608ad63f8f5352dad650253df603590a1cc20bc05d86dd561d659295264ab4b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://608ad63f8f5352dad650253df603590a1cc20bc05d86dd561d659295264ab4b5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:03:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:03:45Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:21Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:21 crc kubenswrapper[4707]: I1127 16:04:21.072533 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:21Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:21 crc kubenswrapper[4707]: I1127 16:04:21.078055 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-pdngz" Nov 27 16:04:21 crc kubenswrapper[4707]: I1127 16:04:21.086047 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:21 crc kubenswrapper[4707]: I1127 16:04:21.086097 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:21 crc kubenswrapper[4707]: I1127 16:04:21.086113 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:21 crc kubenswrapper[4707]: I1127 16:04:21.086138 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:21 crc kubenswrapper[4707]: I1127 16:04:21.086160 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:21Z","lastTransitionTime":"2025-11-27T16:04:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:04:21 crc kubenswrapper[4707]: I1127 16:04:21.095301 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-c995m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a83beb0d-8dd1-434a-ace2-933f98e3956f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://419d9b9f3f298028cf5f49afd438f1de90b8d1203cdfb09b924878b59e0fb723\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5xtzp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6227d8aa794cce73ef7607c3e6cf8f853c829bf56493acf04c0b555ddd264ba5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5xtzp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:04:06Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-c995m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:21Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:21 crc kubenswrapper[4707]: W1127 16:04:21.098725 4707 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5a0b58a3_9ac2_4446_b507_88eba42aa060.slice/crio-3c9fce4f3896bd262ab251dc965b5720dfb81e0932ae7f06c1f48c26eccdf333 WatchSource:0}: Error finding container 3c9fce4f3896bd262ab251dc965b5720dfb81e0932ae7f06c1f48c26eccdf333: Status 404 returned error can't find the container with id 3c9fce4f3896bd262ab251dc965b5720dfb81e0932ae7f06c1f48c26eccdf333 Nov 27 16:04:21 crc kubenswrapper[4707]: I1127 16:04:21.118481 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7f481770-a9a1-422a-a707-e841a24a1503\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f0e4a5fa520e0dc048bef42e48e14bd06eb72586ca2b8e145734aa5450aa869d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"resta
rtCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://36f6a5537d5031d21ea157fbcedea40756eac2c44aa1859c0cfbf3dc9a9d9a13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c7144aa332827972c3ff984eb05b835b88c4d9d05552ac631fbca49f13e193a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir
\\\"}]},{\\\"containerID\\\":\\\"cri-o://8211f74a0c0d16257330ebbb5727247d4229848595af3cb65f1cb752b05f3c37\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:03:45Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:21Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:21 crc kubenswrapper[4707]: I1127 16:04:21.137531 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:05Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:21Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:21 crc kubenswrapper[4707]: I1127 16:04:21.155475 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-bhmsc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d1ec526b-6fb0-4c87-bd87-6aaf843e0c78\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f3b764fd994edba08f5e909683bd065ecf09fb7c9906457e37bb91343aad14a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kvs85\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:04:06Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-bhmsc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:21Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:21 crc kubenswrapper[4707]: I1127 16:04:21.171731 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6cf743e8776aa2de4138704d9e163af8baf7b8e5db139c13775673cf32a1ee88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-al
erter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:21Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:21 crc kubenswrapper[4707]: I1127 16:04:21.189874 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:21 crc kubenswrapper[4707]: I1127 16:04:21.189945 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:21 crc kubenswrapper[4707]: I1127 16:04:21.189963 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:21 crc kubenswrapper[4707]: I1127 16:04:21.189991 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:21 crc kubenswrapper[4707]: I1127 16:04:21.190038 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:21Z","lastTransitionTime":"2025-11-27T16:04:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:04:21 crc kubenswrapper[4707]: I1127 16:04:21.198480 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 27 16:04:21 crc kubenswrapper[4707]: E1127 16:04:21.198651 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 27 16:04:21 crc kubenswrapper[4707]: I1127 16:04:21.198736 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 27 16:04:21 crc kubenswrapper[4707]: E1127 16:04:21.198796 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 27 16:04:21 crc kubenswrapper[4707]: I1127 16:04:21.199057 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 27 16:04:21 crc kubenswrapper[4707]: E1127 16:04:21.199383 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 27 16:04:21 crc kubenswrapper[4707]: I1127 16:04:21.199060 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-9c4xg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37bcfa90-e50b-4a63-96b2-cd8a0ce5bbf7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://66dcf851b2fb739c3dae214852a24405d2c8b4462dde998903b9dfe7c9e61e00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\
\\"name\\\":\\\"kube-api-access-7hwzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8cd7c5e0c832499cb9ddc7b18c7d260de06df34208f86284ebe77964bfcf1c12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8cd7c5e0c832499cb9ddc7b18c7d260de06df34208f86284ebe77964bfcf1c12\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:04:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hwzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4438fd6ab08cd53dff20617ccadaa81e4bddbd7ced310f190d02613079f96f64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\
\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4438fd6ab08cd53dff20617ccadaa81e4bddbd7ced310f190d02613079f96f64\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:04:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:04:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hwzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19a31a44d1ecfee004f2a705df22a188782cd9f235848ddf5f48bd740833c6a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://19a31a44d1ecfee004f2a705df22a188782cd9f235848ddf5f48bd740833c6a8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:04:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:04:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-
release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hwzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://db8a59276d18141805bc74721d625e2004f2cea1f47efc90b3290c68eba2a35b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://db8a59276d18141805bc74721d625e2004f2cea1f47efc90b3290c68eba2a35b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:04:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:04:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hwzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://21efa1504e98747329f8a9220126ea259edde61bd4d272ea55691e9952950252\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\
\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://21efa1504e98747329f8a9220126ea259edde61bd4d272ea55691e9952950252\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:04:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:04:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hwzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://851c05dbd0947990d464d1bd72d62577bdf12517aea808904aa01fc79fd2d7c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://851c05dbd0947990d464d1bd72d62577bdf12517aea808904aa01fc79fd2d7c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:04:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:04:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hwzm\\\",\\\"readOnly\\\
":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:04:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-9c4xg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:21Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:21 crc kubenswrapper[4707]: I1127 16:04:21.228932 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xkmt7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"55af9c67-18ce-46f1-a761-d11ce16f42d6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a4eadca4b464dec02ca7aba6bc23d95b9c3965294d4b0224299279c630f3718\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6xmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3718d2f738ef780c738566c163d52afe355de6b8eb6004aa8e1dd5f37aa84d68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6xmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://44699a190cf1065e537f588d7479a10aab8fd4ffd1df2348c616d999ffdf7077\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6xmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6432d53914cee9e016c2134d99e48fa9ff1c05c0b88033992c2e0c70fc40d2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:07Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6xmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://93825fe5d9cf2d63c3e723b063f417f7e92442b14365852f935a06d316f443fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6xmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ae3fab13b078702f87ca8c586c9592c4554c95ead662d81ca65b50f368b6dcb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6xmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://def6189807ad0285980b2859cdfe683dd5c8f7c79104d57abeef37450f73655d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://def6189807ad0285980b2859cdfe683dd5c8f7c79104d57abeef37450f73655d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-27T16:04:17Z\\\",\\\"message\\\":\\\"lversions/factory.go:140\\\\nI1127 16:04:17.841915 6098 reflector.go:311] Stopping reflector *v1.EgressService (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressservice/v1/apis/informers/externalversions/factory.go:140\\\\nI1127 16:04:17.841959 6098 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1127 
16:04:17.842031 6098 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1127 16:04:17.842064 6098 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1127 16:04:17.842086 6098 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1127 16:04:17.842164 6098 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1127 16:04:17.842205 6098 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1127 16:04:17.842258 6098 factory.go:656] Stopping watch factory\\\\nI1127 16:04:17.842299 6098 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1127 16:04:17.842297 6098 handler.go:208] Removed *v1.Node event handler 2\\\\nI1127 16:04:17.842315 6098 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1127 16:04:17.842332 6098 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1127 16:04:17.842334 6098 handler.go:208] Removed *v1.Node event handler 7\\\\nI1127 16:04:17.842341 6098 handler.go:208] Removed *v1.Namespace ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-27T16:04:16Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-xkmt7_openshift-ovn-kubernetes(55af9c67-18ce-46f1-a761-d11ce16f42d6)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6xmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://db3362aeb65b368d2d9181cc898270fffb60938d0e9d32a862130d06a9f572a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6xmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c327fee553e5575dec76bc4456bc59424c89a9eea360cd4b23b594eff4c67a8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c327fee553e5575dec
76bc4456bc59424c89a9eea360cd4b23b594eff4c67a8d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:04:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6xmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:04:06Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-xkmt7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:21Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:21 crc kubenswrapper[4707]: I1127 16:04:21.248006 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-5x488" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"433a450e-2371-4893-b21e-19707b40e28f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://12b178a129ce5efbfdb77d023236ed9d674d0aba3c3a2903589aae995a5a753c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52jnc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://35888508a810deea27c61cbf1ec7362be9f61
f99a63d4ab2b4c296a9ed47c65f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52jnc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:04:19Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-5x488\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:21Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:21 crc kubenswrapper[4707]: I1127 16:04:21.267279 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:05Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:21Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:21 crc kubenswrapper[4707]: I1127 16:04:21.287697 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://57225b64e7671f397832e9ba112febbde78ae4c24b37a8f097d4bdeda8931c0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6960cfa0c6d0dedc432f90bf267ce02fe8264924a43865f79eadd9affbe955f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:21Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:21 crc kubenswrapper[4707]: I1127 16:04:21.293124 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:21 crc kubenswrapper[4707]: I1127 16:04:21.293169 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:21 crc kubenswrapper[4707]: I1127 16:04:21.293182 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:21 crc kubenswrapper[4707]: I1127 16:04:21.293203 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:21 crc kubenswrapper[4707]: I1127 16:04:21.293217 4707 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:21Z","lastTransitionTime":"2025-11-27T16:04:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:04:21 crc kubenswrapper[4707]: I1127 16:04:21.302907 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-pdngz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5a0b58a3-9ac2-4446-b507-88eba42aa060\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:20Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6sq6r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:04:20Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-pdngz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:21Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:21 crc kubenswrapper[4707]: I1127 16:04:21.337832 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b00fa8691d479d1db194704ff73bb85f1108c9541dd09ed4c1b02d0a3f0ab53d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:21Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:21 crc kubenswrapper[4707]: I1127 16:04:21.379471 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-js6mm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ca48c08-f39d-41a2-847a-c893a2111492\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://be6a704ee152614d36d4831778b016c009b0aab2d6cf1c2ec9767f8ecdeb7af6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\"
,\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bjt58\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:04:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-js6mm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: 
certificate has expired or is not yet valid: current time 2025-11-27T16:04:21Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:21 crc kubenswrapper[4707]: I1127 16:04:21.396554 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:21 crc kubenswrapper[4707]: I1127 16:04:21.396604 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:21 crc kubenswrapper[4707]: I1127 16:04:21.396614 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:21 crc kubenswrapper[4707]: I1127 16:04:21.396634 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:21 crc kubenswrapper[4707]: I1127 16:04:21.396645 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:21Z","lastTransitionTime":"2025-11-27T16:04:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:04:21 crc kubenswrapper[4707]: I1127 16:04:21.499459 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:21 crc kubenswrapper[4707]: I1127 16:04:21.499529 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:21 crc kubenswrapper[4707]: I1127 16:04:21.499548 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:21 crc kubenswrapper[4707]: I1127 16:04:21.499574 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:21 crc kubenswrapper[4707]: I1127 16:04:21.499593 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:21Z","lastTransitionTime":"2025-11-27T16:04:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:04:21 crc kubenswrapper[4707]: I1127 16:04:21.505507 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/network-metrics-daemon-qcl5k"] Nov 27 16:04:21 crc kubenswrapper[4707]: I1127 16:04:21.505972 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qcl5k" Nov 27 16:04:21 crc kubenswrapper[4707]: E1127 16:04:21.506043 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-qcl5k" podUID="7d382481-3c1e-49ed-8e27-265d495aa776" Nov 27 16:04:21 crc kubenswrapper[4707]: I1127 16:04:21.524159 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b00fa8691d479d1db194704ff73bb85f1108c9541dd09ed4c1b02d0a3f0ab53d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveRea
dOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:21Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:21 crc kubenswrapper[4707]: I1127 16:04:21.541430 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-js6mm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ca48c08-f39d-41a2-847a-c893a2111492\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://be6a704ee152614d36d4831778b016c009b0aab2d6cf1c2ec9767f8ecdeb7af6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bjt58\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.1
1\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:04:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-js6mm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:21Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:21 crc kubenswrapper[4707]: I1127 16:04:21.568926 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"59e09a52-f64d-41da-a23b-7d3e34e8f109\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://489349489e52f4a113203c0c196cfe1ef96e97c6fc3c3d777a28e90f22aaee92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\"
:{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ea14f8eb17dbb350f7c0ad2285b4bb5ca2d8b2aae6840646d68a397ac2043d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://775fc674c0dab97e0c5a2548d467b973c592458dad1b88ce92344486a3a28eb5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"
mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f502a2429578531cb3ebbfde6bb4c049eed201a496ff3772c06ca28cace89b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://30b52b26bdca5b0705186872e6dd2eba3ece6436e866a4a5e30f4c2ecdbbaf20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b6be834802616cdf10f8b694bb0ce
1b8828d0885e8801c5d890e74837d672a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b6be834802616cdf10f8b694bb0ce1b8828d0885e8801c5d890e74837d672a8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:03:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://de1e5852273af33e41f21c4df29c3e32cbfd6b2eb200b64d9e5353a85718074c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://de1e5852273af33e41f21c4df29c3e32cbfd6b2eb200b64d9e5353a85718074c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:03:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:03:47Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://c6f7eb0e4b78b6d2e316bd2c8f2ad3081c8eb0ca3c55a7063ccf281572ba2992\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",
\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c6f7eb0e4b78b6d2e316bd2c8f2ad3081c8eb0ca3c55a7063ccf281572ba2992\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:03:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:03:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:03:45Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:21Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:21 crc kubenswrapper[4707]: I1127 16:04:21.591057 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f300806f-b97e-4453-b011-19442ca1240d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a25770e0e3eebd7e807f3775101a1cd5d47073e4a86850da475fecd4a9c88ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7412dcdb5a249551f558a01a8fecef5f873e5e0a16e917d5cbc1ca74a917f228\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://791f332aec1698d02da5ec585ad2fbfcb77529f24abc0e3a77581be38d7ce277\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a47eb0592819f0bc514399e163c88e2c63111aa0ec4a28f00f809ce1bdf2aa8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f12d0d89698cd6f702c595882dc7ed97fbc585b8a4ef3a8b3755c984a823dd9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-27T16:04:05Z\\\"
,\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1127 16:03:58.832133 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1127 16:03:58.833970 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3721107874/tls.crt::/tmp/serving-cert-3721107874/tls.key\\\\\\\"\\\\nI1127 16:04:05.094602 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1127 16:04:05.099813 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1127 16:04:05.099850 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1127 16:04:05.099879 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1127 16:04:05.099886 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1127 16:04:05.108199 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1127 16:04:05.108225 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1127 16:04:05.108231 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1127 16:04:05.108269 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1127 16:04:05.108276 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1127 16:04:05.108281 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1127 16:04:05.108284 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1127 16:04:05.108289 1 secure_serving.go:69] Use 
of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1127 16:04:05.114674 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-27T16:03:48Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf16f5508845d69184573b51f3052f79123556667cbd3ddf4b2adfb135312a3a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:48Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://608ad63f8f5352dad650253df603590a1cc20bc05d86dd561d659295264ab4b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://608ad63f8f5352dad650253df603590a
1cc20bc05d86dd561d659295264ab4b5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:03:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:03:45Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:21Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:21 crc kubenswrapper[4707]: I1127 16:04:21.602750 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:21 crc kubenswrapper[4707]: I1127 16:04:21.602837 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:21 crc kubenswrapper[4707]: I1127 16:04:21.602855 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:21 crc kubenswrapper[4707]: I1127 16:04:21.602882 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:21 crc kubenswrapper[4707]: I1127 16:04:21.602902 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:21Z","lastTransitionTime":"2025-11-27T16:04:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:04:21 crc kubenswrapper[4707]: I1127 16:04:21.614061 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:21Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:21 crc kubenswrapper[4707]: I1127 16:04:21.628123 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-c995m" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a83beb0d-8dd1-434a-ace2-933f98e3956f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://419d9b9f3f298028cf5f49afd438f1de90b8d1203cdfb09b924878b59e0fb723\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5xtzp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6227d8aa794cce73ef7607c3e6cf8f853c829bf5
6493acf04c0b555ddd264ba5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5xtzp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:04:06Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-c995m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:21Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:21 crc kubenswrapper[4707]: I1127 16:04:21.631518 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7d382481-3c1e-49ed-8e27-265d495aa776-metrics-certs\") pod \"network-metrics-daemon-qcl5k\" (UID: \"7d382481-3c1e-49ed-8e27-265d495aa776\") " pod="openshift-multus/network-metrics-daemon-qcl5k" Nov 27 16:04:21 crc kubenswrapper[4707]: I1127 16:04:21.631584 4707 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j2krv\" (UniqueName: \"kubernetes.io/projected/7d382481-3c1e-49ed-8e27-265d495aa776-kube-api-access-j2krv\") pod \"network-metrics-daemon-qcl5k\" (UID: \"7d382481-3c1e-49ed-8e27-265d495aa776\") " pod="openshift-multus/network-metrics-daemon-qcl5k" Nov 27 16:04:21 crc kubenswrapper[4707]: I1127 16:04:21.644278 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7f481770-a9a1-422a-a707-e841a24a1503\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f0e4a5fa520e0dc048bef42e48e14bd06eb72586ca2b8e145734aa5450aa869d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"start
edAt\\\":\\\"2025-11-27T16:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://36f6a5537d5031d21ea157fbcedea40756eac2c44aa1859c0cfbf3dc9a9d9a13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c7144aa332827972c3ff984eb05b835b88c4d9d05552ac631fbca49f13e193a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8211f74a0c0d16257330ebbb5727247d422
9848595af3cb65f1cb752b05f3c37\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:03:45Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:21Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:21 crc kubenswrapper[4707]: I1127 16:04:21.651279 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-pdngz" event={"ID":"5a0b58a3-9ac2-4446-b507-88eba42aa060","Type":"ContainerStarted","Data":"feac173e7b0d6e3cb53bd8d407d3d08e59e099a63702a377b7515283f85703ef"} Nov 27 16:04:21 crc kubenswrapper[4707]: I1127 16:04:21.651419 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-pdngz" 
event={"ID":"5a0b58a3-9ac2-4446-b507-88eba42aa060","Type":"ContainerStarted","Data":"3c9fce4f3896bd262ab251dc965b5720dfb81e0932ae7f06c1f48c26eccdf333"} Nov 27 16:04:21 crc kubenswrapper[4707]: I1127 16:04:21.661938 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:05Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:21Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:21 crc kubenswrapper[4707]: I1127 16:04:21.729471 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-bhmsc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d1ec526b-6fb0-4c87-bd87-6aaf843e0c78\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f3b764fd994edba08f5e909683bd065ecf09fb7c9906457e37bb91343aad14a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kvs85\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:04:06Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-bhmsc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:21Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:21 crc kubenswrapper[4707]: I1127 16:04:21.731405 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:21 crc kubenswrapper[4707]: I1127 16:04:21.731478 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:21 crc kubenswrapper[4707]: I1127 16:04:21.731494 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:21 crc kubenswrapper[4707]: I1127 16:04:21.731519 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:21 crc kubenswrapper[4707]: I1127 16:04:21.731539 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:21Z","lastTransitionTime":"2025-11-27T16:04:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:04:21 crc kubenswrapper[4707]: I1127 16:04:21.732592 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7d382481-3c1e-49ed-8e27-265d495aa776-metrics-certs\") pod \"network-metrics-daemon-qcl5k\" (UID: \"7d382481-3c1e-49ed-8e27-265d495aa776\") " pod="openshift-multus/network-metrics-daemon-qcl5k" Nov 27 16:04:21 crc kubenswrapper[4707]: I1127 16:04:21.732673 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j2krv\" (UniqueName: \"kubernetes.io/projected/7d382481-3c1e-49ed-8e27-265d495aa776-kube-api-access-j2krv\") pod \"network-metrics-daemon-qcl5k\" (UID: \"7d382481-3c1e-49ed-8e27-265d495aa776\") " pod="openshift-multus/network-metrics-daemon-qcl5k" Nov 27 16:04:21 crc kubenswrapper[4707]: E1127 16:04:21.733525 4707 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Nov 27 16:04:21 crc kubenswrapper[4707]: E1127 16:04:21.733590 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7d382481-3c1e-49ed-8e27-265d495aa776-metrics-certs podName:7d382481-3c1e-49ed-8e27-265d495aa776 nodeName:}" failed. No retries permitted until 2025-11-27 16:04:22.23357033 +0000 UTC m=+37.865019108 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/7d382481-3c1e-49ed-8e27-265d495aa776-metrics-certs") pod "network-metrics-daemon-qcl5k" (UID: "7d382481-3c1e-49ed-8e27-265d495aa776") : object "openshift-multus"/"metrics-daemon-secret" not registered Nov 27 16:04:21 crc kubenswrapper[4707]: I1127 16:04:21.747098 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-5x488" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"433a450e-2371-4893-b21e-19707b40e28f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://12b178a129ce5efbfdb77d023236ed9d674d0aba3c3a2903589aae995a5a753c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\
\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52jnc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://35888508a810deea27c61cbf1ec7362be9f61f99a63d4ab2b4c296a9ed47c65f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52jnc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:04:19Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-5x488\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: 
current time 2025-11-27T16:04:21Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:21 crc kubenswrapper[4707]: I1127 16:04:21.755537 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j2krv\" (UniqueName: \"kubernetes.io/projected/7d382481-3c1e-49ed-8e27-265d495aa776-kube-api-access-j2krv\") pod \"network-metrics-daemon-qcl5k\" (UID: \"7d382481-3c1e-49ed-8e27-265d495aa776\") " pod="openshift-multus/network-metrics-daemon-qcl5k" Nov 27 16:04:21 crc kubenswrapper[4707]: I1127 16:04:21.768524 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:05Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:21Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:21 crc kubenswrapper[4707]: I1127 16:04:21.787882 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://57225b64e7671f397832e9ba112febbde78ae4c24b37a8f097d4bdeda8931c0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6960cfa0c6d0dedc432f90bf267ce02fe8264924a43865f79eadd9affbe955f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:21Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:21 crc kubenswrapper[4707]: I1127 16:04:21.806713 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6cf743e8776aa2de4138704d9e163af8baf7b8e5db139c13775673cf32a1ee88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-11-27T16:04:21Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:21 crc kubenswrapper[4707]: I1127 16:04:21.828763 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-9c4xg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37bcfa90-e50b-4a63-96b2-cd8a0ce5bbf7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://66dcf851b2fb739c3dae214852a24405d2c8b4462dde998903b9dfe7c9e61e00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-a
ccess-7hwzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8cd7c5e0c832499cb9ddc7b18c7d260de06df34208f86284ebe77964bfcf1c12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8cd7c5e0c832499cb9ddc7b18c7d260de06df34208f86284ebe77964bfcf1c12\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:04:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hwzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4438fd6ab08cd53dff20617ccadaa81e4bddbd7ced310f190d02613079f96f64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"
started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4438fd6ab08cd53dff20617ccadaa81e4bddbd7ced310f190d02613079f96f64\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:04:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:04:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hwzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19a31a44d1ecfee004f2a705df22a188782cd9f235848ddf5f48bd740833c6a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://19a31a44d1ecfee004f2a705df22a188782cd9f235848ddf5f48bd740833c6a8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:04:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:04:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\
\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hwzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://db8a59276d18141805bc74721d625e2004f2cea1f47efc90b3290c68eba2a35b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://db8a59276d18141805bc74721d625e2004f2cea1f47efc90b3290c68eba2a35b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:04:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:04:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hwzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://21efa1504e98747329f8a9220126ea259edde61bd4d272ea55691e9952950252\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabout
s-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://21efa1504e98747329f8a9220126ea259edde61bd4d272ea55691e9952950252\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:04:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:04:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hwzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://851c05dbd0947990d464d1bd72d62577bdf12517aea808904aa01fc79fd2d7c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://851c05dbd0947990d464d1bd72d62577bdf12517aea808904aa01fc79fd2d7c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:04:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:04:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hwzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:04:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-9c4xg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:21Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:21 crc kubenswrapper[4707]: I1127 16:04:21.837058 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:21 crc kubenswrapper[4707]: I1127 16:04:21.837109 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:21 crc kubenswrapper[4707]: I1127 16:04:21.837123 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:21 crc kubenswrapper[4707]: I1127 16:04:21.837145 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:21 crc kubenswrapper[4707]: I1127 16:04:21.837159 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:21Z","lastTransitionTime":"2025-11-27T16:04:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:04:21 crc kubenswrapper[4707]: I1127 16:04:21.850087 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xkmt7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"55af9c67-18ce-46f1-a761-d11ce16f42d6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a4eadca4b464dec02ca7aba6bc23d95b9c3965294d4b0224299279c630f3718\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6xmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3718d2f738ef780c738566c163d52afe355de6b8eb6004aa8e1dd5f37aa84d68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6xmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://44699a190cf1065e537f588d7479a10aab8fd4ffd1df2348c616d999ffdf7077\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6xmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6432d53914cee9e016c2134d99e48fa9ff1c05c0b88033992c2e0c70fc40d2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:07Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6xmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://93825fe5d9cf2d63c3e723b063f417f7e92442b14365852f935a06d316f443fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6xmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ae3fab13b078702f87ca8c586c9592c4554c95ead662d81ca65b50f368b6dcb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6xmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://def6189807ad0285980b2859cdfe683dd5c8f7c79104d57abeef37450f73655d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://def6189807ad0285980b2859cdfe683dd5c8f7c79104d57abeef37450f73655d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-27T16:04:17Z\\\",\\\"message\\\":\\\"lversions/factory.go:140\\\\nI1127 16:04:17.841915 6098 reflector.go:311] Stopping reflector *v1.EgressService (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressservice/v1/apis/informers/externalversions/factory.go:140\\\\nI1127 16:04:17.841959 6098 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1127 
16:04:17.842031 6098 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1127 16:04:17.842064 6098 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1127 16:04:17.842086 6098 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1127 16:04:17.842164 6098 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1127 16:04:17.842205 6098 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1127 16:04:17.842258 6098 factory.go:656] Stopping watch factory\\\\nI1127 16:04:17.842299 6098 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1127 16:04:17.842297 6098 handler.go:208] Removed *v1.Node event handler 2\\\\nI1127 16:04:17.842315 6098 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1127 16:04:17.842332 6098 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1127 16:04:17.842334 6098 handler.go:208] Removed *v1.Node event handler 7\\\\nI1127 16:04:17.842341 6098 handler.go:208] Removed *v1.Namespace ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-27T16:04:16Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-xkmt7_openshift-ovn-kubernetes(55af9c67-18ce-46f1-a761-d11ce16f42d6)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6xmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://db3362aeb65b368d2d9181cc898270fffb60938d0e9d32a862130d06a9f572a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6xmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c327fee553e5575dec76bc4456bc59424c89a9eea360cd4b23b594eff4c67a8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c327fee553e5575dec
76bc4456bc59424c89a9eea360cd4b23b594eff4c67a8d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:04:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6xmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:04:06Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-xkmt7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:21Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:21 crc kubenswrapper[4707]: I1127 16:04:21.868272 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-pdngz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5a0b58a3-9ac2-4446-b507-88eba42aa060\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:20Z\\\",\\\"message\\\":\\\"containers with 
unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:20Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6sq6r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:04:20Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-pdngz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:21Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:21 crc kubenswrapper[4707]: I1127 16:04:21.883015 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-qcl5k" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d382481-3c1e-49ed-8e27-265d495aa776\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:21Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:21Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j2krv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j2krv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:04:21Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-qcl5k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:21Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:21 crc 
kubenswrapper[4707]: I1127 16:04:21.902697 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:05Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:21Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:21 crc kubenswrapper[4707]: I1127 16:04:21.923461 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://57225b64e7671f397832e9ba112febbde78ae4c24b37a8f097d4bdeda8931c0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6960cfa0c6d0dedc432f90bf267ce02fe8264924a43865f79eadd9affbe955f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:21Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:21 crc kubenswrapper[4707]: I1127 16:04:21.939455 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:21 crc kubenswrapper[4707]: I1127 16:04:21.939490 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:21 crc kubenswrapper[4707]: I1127 16:04:21.939499 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:21 crc kubenswrapper[4707]: I1127 16:04:21.939515 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:21 crc kubenswrapper[4707]: I1127 16:04:21.939528 4707 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:21Z","lastTransitionTime":"2025-11-27T16:04:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:04:21 crc kubenswrapper[4707]: I1127 16:04:21.943450 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6cf743e8776aa2de4138704d9e163af8baf7b8e5db139c13775673cf32a1ee88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\
"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:21Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:21 crc kubenswrapper[4707]: I1127 16:04:21.962008 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:21 crc kubenswrapper[4707]: I1127 16:04:21.962075 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:21 crc kubenswrapper[4707]: I1127 16:04:21.962102 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:21 crc kubenswrapper[4707]: I1127 16:04:21.962144 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:21 crc kubenswrapper[4707]: I1127 16:04:21.962170 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:21Z","lastTransitionTime":"2025-11-27T16:04:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:04:21 crc kubenswrapper[4707]: I1127 16:04:21.963398 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-9c4xg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37bcfa90-e50b-4a63-96b2-cd8a0ce5bbf7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://66dcf851b2fb739c3dae214852a24405d2c8b4462dde998903b9dfe7c9e61e00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hwzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8cd7c5e0c832499cb9ddc7b18c7d260de06df34208f86284ebe77964bfcf1c12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8cd7c5e0c832499cb9ddc7b18c7d260de06df34208f86284ebe77964bfcf1c12\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:04:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hwzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4438fd6ab08cd53dff20617ccadaa81e4bddbd7ced310f190d02613079f96f64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"
containerID\\\":\\\"cri-o://4438fd6ab08cd53dff20617ccadaa81e4bddbd7ced310f190d02613079f96f64\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:04:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:04:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hwzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19a31a44d1ecfee004f2a705df22a188782cd9f235848ddf5f48bd740833c6a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://19a31a44d1ecfee004f2a705df22a188782cd9f235848ddf5f48bd740833c6a8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:04:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:04:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hwzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://db8a59276d18141805bc74721d625e2004f2cea1f47efc90b3290c68eba2a35b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://db8a59276d18141805bc74721d625e2004f2cea1f47efc90b3290c68eba2a35b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:04:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:04:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hwzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://21efa1504e98747329f8a9220126ea259edde61bd4d272ea55691e9952950252\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\"
:0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://21efa1504e98747329f8a9220126ea259edde61bd4d272ea55691e9952950252\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:04:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:04:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hwzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://851c05dbd0947990d464d1bd72d62577bdf12517aea808904aa01fc79fd2d7c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://851c05dbd0947990d464d1bd72d62577bdf12517aea808904aa01fc79fd2d7c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:04:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:04:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hwzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:04:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-9c4xg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:21Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:21 crc kubenswrapper[4707]: E1127 16:04:21.979391 4707 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-27T16:04:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-27T16:04:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:21Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-27T16:04:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-27T16:04:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:21Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"3f35bf4c-cf6f-4914-b0bb-f9cc6056fcaf\\\",\\\"systemUUID\\\":\\\"dd7e61b5-f6f4-4240-af78-d8fe5d6daad3\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:21Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:21 crc kubenswrapper[4707]: I1127 16:04:21.984119 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:21 crc kubenswrapper[4707]: I1127 16:04:21.984185 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:21 crc kubenswrapper[4707]: I1127 16:04:21.984203 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:21 crc kubenswrapper[4707]: I1127 16:04:21.984231 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:21 crc kubenswrapper[4707]: I1127 16:04:21.984251 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:21Z","lastTransitionTime":"2025-11-27T16:04:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:04:21 crc kubenswrapper[4707]: I1127 16:04:21.994147 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xkmt7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"55af9c67-18ce-46f1-a761-d11ce16f42d6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a4eadca4b464dec02ca7aba6bc23d95b9c3965294d4b0224299279c630f3718\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6xmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3718d2f738ef780c738566c163d52afe355de6b8eb6004aa8e1dd5f37aa84d68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6xmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://44699a190cf1065e537f588d7479a10aab8fd4ffd1df2348c616d999ffdf7077\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6xmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6432d53914cee9e016c2134d99e48fa9ff1c05c0b88033992c2e0c70fc40d2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:07Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6xmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://93825fe5d9cf2d63c3e723b063f417f7e92442b14365852f935a06d316f443fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6xmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ae3fab13b078702f87ca8c586c9592c4554c95ead662d81ca65b50f368b6dcb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6xmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://def6189807ad0285980b2859cdfe683dd5c8f7c79104d57abeef37450f73655d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://def6189807ad0285980b2859cdfe683dd5c8f7c79104d57abeef37450f73655d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-27T16:04:17Z\\\",\\\"message\\\":\\\"lversions/factory.go:140\\\\nI1127 16:04:17.841915 6098 reflector.go:311] Stopping reflector *v1.EgressService (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressservice/v1/apis/informers/externalversions/factory.go:140\\\\nI1127 16:04:17.841959 6098 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1127 
16:04:17.842031 6098 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1127 16:04:17.842064 6098 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1127 16:04:17.842086 6098 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1127 16:04:17.842164 6098 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1127 16:04:17.842205 6098 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1127 16:04:17.842258 6098 factory.go:656] Stopping watch factory\\\\nI1127 16:04:17.842299 6098 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1127 16:04:17.842297 6098 handler.go:208] Removed *v1.Node event handler 2\\\\nI1127 16:04:17.842315 6098 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1127 16:04:17.842332 6098 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1127 16:04:17.842334 6098 handler.go:208] Removed *v1.Node event handler 7\\\\nI1127 16:04:17.842341 6098 handler.go:208] Removed *v1.Namespace ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-27T16:04:16Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-xkmt7_openshift-ovn-kubernetes(55af9c67-18ce-46f1-a761-d11ce16f42d6)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6xmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://db3362aeb65b368d2d9181cc898270fffb60938d0e9d32a862130d06a9f572a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6xmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c327fee553e5575dec76bc4456bc59424c89a9eea360cd4b23b594eff4c67a8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c327fee553e5575dec
76bc4456bc59424c89a9eea360cd4b23b594eff4c67a8d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:04:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6xmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:04:06Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-xkmt7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:21Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:22 crc kubenswrapper[4707]: E1127 16:04:22.003293 4707 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-27T16:04:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-27T16:04:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:21Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-27T16:04:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-27T16:04:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:21Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"3f35bf4c-cf6f-4914-b0bb-f9cc6056fcaf\\\",\\\"systemUUID\\\":\\\"dd7e61b5-f6f4-4240-af78-d8fe5d6daad3\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:22Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:22 crc kubenswrapper[4707]: I1127 16:04:22.009413 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-5x488" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"433a450e-2371-4893-b21e-19707b40e28f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://12b178a129ce5efbfdb77d023236ed9d674d0aba3c3a2903589aae995a5a753c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0
,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52jnc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://35888508a810deea27c61cbf1ec7362be9f61f99a63d4ab2b4c296a9ed47c65f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52jnc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:04:19Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-5x488\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate 
has expired or is not yet valid: current time 2025-11-27T16:04:22Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:22 crc kubenswrapper[4707]: I1127 16:04:22.010426 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:22 crc kubenswrapper[4707]: I1127 16:04:22.010462 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:22 crc kubenswrapper[4707]: I1127 16:04:22.010475 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:22 crc kubenswrapper[4707]: I1127 16:04:22.010495 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:22 crc kubenswrapper[4707]: I1127 16:04:22.010509 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:22Z","lastTransitionTime":"2025-11-27T16:04:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:04:22 crc kubenswrapper[4707]: I1127 16:04:22.025809 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-pdngz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5a0b58a3-9ac2-4446-b507-88eba42aa060\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://feac173e7b0d6e3cb53bd8d407d3d08e59e099a63702a377b7515283f85703ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes
.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6sq6r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:04:20Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-pdngz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:22Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:22 crc kubenswrapper[4707]: E1127 16:04:22.031677 4707 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-27T16:04:22Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-27T16:04:22Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:22Z\\\",\\\"message\\\":\\\"kubelet has no disk 
pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-27T16:04:22Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-27T16:04:22Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:22Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2
ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9810067
4616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.
io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a07
2c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa73
83b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"3f35bf4c-cf6f-4914-b0bb-f9cc6056fcaf\\\",\\\"systemUUID\\\":\\\"dd7e61b5-f6f4-4240-af78-d8fe5d6daad3\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:22Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:22 crc kubenswrapper[4707]: I1127 16:04:22.040104 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:22 crc kubenswrapper[4707]: I1127 16:04:22.040159 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:22 crc kubenswrapper[4707]: I1127 16:04:22.040177 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:22 crc kubenswrapper[4707]: I1127 16:04:22.040207 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:22 crc kubenswrapper[4707]: I1127 16:04:22.040226 4707 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:22Z","lastTransitionTime":"2025-11-27T16:04:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:04:22 crc kubenswrapper[4707]: I1127 16:04:22.044234 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-qcl5k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d382481-3c1e-49ed-8e27-265d495aa776\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:21Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:21Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j2krv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j2krv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:04:21Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-qcl5k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:22Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:22 crc 
kubenswrapper[4707]: E1127 16:04:22.057186 4707 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-27T16:04:22Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-27T16:04:22Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:22Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-27T16:04:22Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-27T16:04:22Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:22Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"3f35bf4c-cf6f-4914-b0bb-f9cc6056fcaf\\\",\\\"systemUUID\\\":\\\"dd7e61b5-f6f4-4240-af78-d8fe5d6daad3\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:22Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:22 crc kubenswrapper[4707]: I1127 16:04:22.062078 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:22 crc kubenswrapper[4707]: I1127 16:04:22.062126 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:22 crc kubenswrapper[4707]: I1127 16:04:22.062139 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:22 crc kubenswrapper[4707]: I1127 16:04:22.062160 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:22 crc kubenswrapper[4707]: I1127 16:04:22.062172 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:22Z","lastTransitionTime":"2025-11-27T16:04:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:04:22 crc kubenswrapper[4707]: I1127 16:04:22.062544 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b00fa8691d479d1db194704ff73bb85f1108c9541dd09ed4c1b02d0a3f0ab53d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:22Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:22 crc kubenswrapper[4707]: E1127 16:04:22.078008 4707 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-27T16:04:22Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-27T16:04:22Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:22Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-27T16:04:22Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-27T16:04:22Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:22Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"3f35bf4c-cf6f-4914-b0bb-f9cc6056fcaf\\\",\\\"systemUUID\\\":\\\"dd7e61b5-f6f4-4240-af78-d8fe5d6daad3\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:22Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:22 crc kubenswrapper[4707]: E1127 16:04:22.078325 4707 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Nov 27 16:04:22 crc kubenswrapper[4707]: I1127 16:04:22.083464 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-js6mm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ca48c08-f39d-41a2-847a-c893a2111492\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://be6a704ee152614d36d4831778b016c009b0aab2d6cf1c2ec9767f8ecdeb7af6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bjt58\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:04:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-js6mm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:22Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:22 crc kubenswrapper[4707]: I1127 16:04:22.087958 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:22 crc 
kubenswrapper[4707]: I1127 16:04:22.088034 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:22 crc kubenswrapper[4707]: I1127 16:04:22.088058 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:22 crc kubenswrapper[4707]: I1127 16:04:22.088093 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:22 crc kubenswrapper[4707]: I1127 16:04:22.088117 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:22Z","lastTransitionTime":"2025-11-27T16:04:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:04:22 crc kubenswrapper[4707]: I1127 16:04:22.114043 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"59e09a52-f64d-41da-a23b-7d3e34e8f109\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://489349489e52f4a113203c0c196cfe1ef96e97c6fc3c3d777a28e90f22aaee92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ea14f8eb17dbb350f7c0ad2285b4bb5ca2d8b2aae6840646d68a397ac2043d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://775fc674c0dab97e0c5a2548d467b973c592458dad1b88ce92344486a3a28eb5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f502a2429578531cb3ebbfde6bb4c049eed201a496ff3772c06ca28cace89b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://30b52b26bdca5b0705186872e6dd2eba3ece6436e866a4a5e30f4c2ecdbbaf20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b6be834802616cdf10f8b694bb0ce1b8828d0885e8801c5d890e74837d672a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b6be834802616cdf10f8b694bb0ce1b8828d0885e8801c5d890e74837d672a8\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-11-27T16:03:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://de1e5852273af33e41f21c4df29c3e32cbfd6b2eb200b64d9e5353a85718074c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://de1e5852273af33e41f21c4df29c3e32cbfd6b2eb200b64d9e5353a85718074c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:03:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:03:47Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://c6f7eb0e4b78b6d2e316bd2c8f2ad3081c8eb0ca3c55a7063ccf281572ba2992\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c6f7eb0e4b78b6d2e316bd2c8f2ad3081c8eb0ca3c55a7063ccf281572ba2992\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:03:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:03:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:03:45Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:22Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:22 crc kubenswrapper[4707]: I1127 16:04:22.137404 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f300806f-b97e-4453-b011-19442ca1240d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a25770e0e3eebd7e807f3775101a1cd5d47073e4a86850da475fecd4a9c88ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-rel
ease-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7412dcdb5a249551f558a01a8fecef5f873e5e0a16e917d5cbc1ca74a917f228\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://791f332aec1698d02da5ec585ad2fbfcb77529f24abc0e3a77581be38d7ce277\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:47Z\\\"}},\\\"vo
lumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a47eb0592819f0bc514399e163c88e2c63111aa0ec4a28f00f809ce1bdf2aa8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f12d0d89698cd6f702c595882dc7ed97fbc585b8a4ef3a8b3755c984a823dd9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-27T16:04:05Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1127 16:03:58.832133 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1127 16:03:58.833970 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3721107874/tls.crt::/tmp/serving-cert-3721107874/tls.key\\\\\\\"\\\\nI1127 16:04:05.094602 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1127 16:04:05.099813 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1127 16:04:05.099850 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1127 16:04:05.099879 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1127 16:04:05.099886 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1127 16:04:05.108199 1 
secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1127 16:04:05.108225 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1127 16:04:05.108231 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1127 16:04:05.108269 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1127 16:04:05.108276 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1127 16:04:05.108281 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1127 16:04:05.108284 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1127 16:04:05.108289 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1127 16:04:05.114674 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-27T16:03:48Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf16f5508845d69184573b51f3052f79123556667cbd3ddf4b2adfb135312a3a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:48Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://608ad63f8f5352dad650253df603590a1cc20bc05d86dd561d659295264ab4b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://608ad63f8f5352dad650253df603590a1cc20bc05d86dd561d659295264ab4b5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:03:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:03:45Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:22Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:22 crc kubenswrapper[4707]: I1127 16:04:22.157334 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:22Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:22 crc kubenswrapper[4707]: I1127 16:04:22.177520 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-c995m" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a83beb0d-8dd1-434a-ace2-933f98e3956f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://419d9b9f3f298028cf5f49afd438f1de90b8d1203cdfb09b924878b59e0fb723\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5xtzp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6227d8aa794cce73ef7607c3e6cf8f853c829bf5
6493acf04c0b555ddd264ba5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5xtzp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:04:06Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-c995m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:22Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:22 crc kubenswrapper[4707]: I1127 16:04:22.191569 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:22 crc kubenswrapper[4707]: I1127 16:04:22.191615 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:22 crc kubenswrapper[4707]: I1127 16:04:22.191624 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:22 crc 
kubenswrapper[4707]: I1127 16:04:22.191643 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:22 crc kubenswrapper[4707]: I1127 16:04:22.191654 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:22Z","lastTransitionTime":"2025-11-27T16:04:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:04:22 crc kubenswrapper[4707]: I1127 16:04:22.199425 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7f481770-a9a1-422a-a707-e841a24a1503\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f0e4a5fa520e0dc048bef42e48e14bd06eb72586ca2b8e145734aa5450aa869d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\
"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://36f6a5537d5031d21ea157fbcedea40756eac2c44aa1859c0cfbf3dc9a9d9a13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c7144aa332827972c3ff984eb05b835b88c4d9d05552ac631fbca49f13e193a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\
\\"2025-11-27T16:03:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8211f74a0c0d16257330ebbb5727247d4229848595af3cb65f1cb752b05f3c37\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:03:45Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:22Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:22 crc kubenswrapper[4707]: I1127 16:04:22.221727 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:05Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:22Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:22 crc kubenswrapper[4707]: I1127 16:04:22.239349 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7d382481-3c1e-49ed-8e27-265d495aa776-metrics-certs\") pod \"network-metrics-daemon-qcl5k\" (UID: \"7d382481-3c1e-49ed-8e27-265d495aa776\") " pod="openshift-multus/network-metrics-daemon-qcl5k" Nov 27 16:04:22 crc kubenswrapper[4707]: E1127 16:04:22.239547 4707 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Nov 27 16:04:22 crc kubenswrapper[4707]: E1127 16:04:22.239641 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7d382481-3c1e-49ed-8e27-265d495aa776-metrics-certs podName:7d382481-3c1e-49ed-8e27-265d495aa776 nodeName:}" failed. 
No retries permitted until 2025-11-27 16:04:23.239620177 +0000 UTC m=+38.871068945 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/7d382481-3c1e-49ed-8e27-265d495aa776-metrics-certs") pod "network-metrics-daemon-qcl5k" (UID: "7d382481-3c1e-49ed-8e27-265d495aa776") : object "openshift-multus"/"metrics-daemon-secret" not registered Nov 27 16:04:22 crc kubenswrapper[4707]: I1127 16:04:22.240041 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-bhmsc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d1ec526b-6fb0-4c87-bd87-6aaf843e0c78\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f3b764fd994edba08f5e909683bd065ecf09fb7c9906457e37bb91343aad14a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resol
ver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kvs85\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:04:06Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-bhmsc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:22Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:22 crc kubenswrapper[4707]: I1127 16:04:22.294100 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:22 crc kubenswrapper[4707]: I1127 16:04:22.294635 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:22 crc kubenswrapper[4707]: I1127 16:04:22.294658 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:22 crc kubenswrapper[4707]: I1127 16:04:22.294688 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:22 crc kubenswrapper[4707]: I1127 16:04:22.294710 4707 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:22Z","lastTransitionTime":"2025-11-27T16:04:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:04:22 crc kubenswrapper[4707]: I1127 16:04:22.398834 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:22 crc kubenswrapper[4707]: I1127 16:04:22.398905 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:22 crc kubenswrapper[4707]: I1127 16:04:22.398921 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:22 crc kubenswrapper[4707]: I1127 16:04:22.398950 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:22 crc kubenswrapper[4707]: I1127 16:04:22.398966 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:22Z","lastTransitionTime":"2025-11-27T16:04:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:04:22 crc kubenswrapper[4707]: I1127 16:04:22.502805 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:22 crc kubenswrapper[4707]: I1127 16:04:22.502866 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:22 crc kubenswrapper[4707]: I1127 16:04:22.502883 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:22 crc kubenswrapper[4707]: I1127 16:04:22.502912 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:22 crc kubenswrapper[4707]: I1127 16:04:22.502932 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:22Z","lastTransitionTime":"2025-11-27T16:04:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:04:22 crc kubenswrapper[4707]: I1127 16:04:22.607074 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:22 crc kubenswrapper[4707]: I1127 16:04:22.607146 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:22 crc kubenswrapper[4707]: I1127 16:04:22.607168 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:22 crc kubenswrapper[4707]: I1127 16:04:22.607195 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:22 crc kubenswrapper[4707]: I1127 16:04:22.607216 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:22Z","lastTransitionTime":"2025-11-27T16:04:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:04:22 crc kubenswrapper[4707]: I1127 16:04:22.710918 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:22 crc kubenswrapper[4707]: I1127 16:04:22.710983 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:22 crc kubenswrapper[4707]: I1127 16:04:22.711001 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:22 crc kubenswrapper[4707]: I1127 16:04:22.711028 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:22 crc kubenswrapper[4707]: I1127 16:04:22.711051 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:22Z","lastTransitionTime":"2025-11-27T16:04:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:04:22 crc kubenswrapper[4707]: I1127 16:04:22.814660 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:22 crc kubenswrapper[4707]: I1127 16:04:22.814740 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:22 crc kubenswrapper[4707]: I1127 16:04:22.814760 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:22 crc kubenswrapper[4707]: I1127 16:04:22.814789 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:22 crc kubenswrapper[4707]: I1127 16:04:22.814823 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:22Z","lastTransitionTime":"2025-11-27T16:04:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:04:22 crc kubenswrapper[4707]: I1127 16:04:22.918481 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:22 crc kubenswrapper[4707]: I1127 16:04:22.918540 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:22 crc kubenswrapper[4707]: I1127 16:04:22.918551 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:22 crc kubenswrapper[4707]: I1127 16:04:22.918580 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:22 crc kubenswrapper[4707]: I1127 16:04:22.918593 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:22Z","lastTransitionTime":"2025-11-27T16:04:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:04:23 crc kubenswrapper[4707]: I1127 16:04:23.021118 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:23 crc kubenswrapper[4707]: I1127 16:04:23.021168 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:23 crc kubenswrapper[4707]: I1127 16:04:23.021181 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:23 crc kubenswrapper[4707]: I1127 16:04:23.021214 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:23 crc kubenswrapper[4707]: I1127 16:04:23.021226 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:23Z","lastTransitionTime":"2025-11-27T16:04:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:04:23 crc kubenswrapper[4707]: I1127 16:04:23.125322 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:23 crc kubenswrapper[4707]: I1127 16:04:23.125466 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:23 crc kubenswrapper[4707]: I1127 16:04:23.125489 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:23 crc kubenswrapper[4707]: I1127 16:04:23.125517 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:23 crc kubenswrapper[4707]: I1127 16:04:23.125541 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:23Z","lastTransitionTime":"2025-11-27T16:04:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:04:23 crc kubenswrapper[4707]: I1127 16:04:23.195009 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 27 16:04:23 crc kubenswrapper[4707]: I1127 16:04:23.195071 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 27 16:04:23 crc kubenswrapper[4707]: E1127 16:04:23.195243 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 27 16:04:23 crc kubenswrapper[4707]: I1127 16:04:23.195314 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 27 16:04:23 crc kubenswrapper[4707]: I1127 16:04:23.195511 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qcl5k" Nov 27 16:04:23 crc kubenswrapper[4707]: E1127 16:04:23.195789 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 27 16:04:23 crc kubenswrapper[4707]: E1127 16:04:23.196533 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qcl5k" podUID="7d382481-3c1e-49ed-8e27-265d495aa776" Nov 27 16:04:23 crc kubenswrapper[4707]: E1127 16:04:23.196643 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 27 16:04:23 crc kubenswrapper[4707]: I1127 16:04:23.229330 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:23 crc kubenswrapper[4707]: I1127 16:04:23.229437 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:23 crc kubenswrapper[4707]: I1127 16:04:23.229464 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:23 crc kubenswrapper[4707]: I1127 16:04:23.229546 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:23 crc kubenswrapper[4707]: I1127 16:04:23.229571 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:23Z","lastTransitionTime":"2025-11-27T16:04:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:04:23 crc kubenswrapper[4707]: I1127 16:04:23.250573 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7d382481-3c1e-49ed-8e27-265d495aa776-metrics-certs\") pod \"network-metrics-daemon-qcl5k\" (UID: \"7d382481-3c1e-49ed-8e27-265d495aa776\") " pod="openshift-multus/network-metrics-daemon-qcl5k" Nov 27 16:04:23 crc kubenswrapper[4707]: E1127 16:04:23.250855 4707 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Nov 27 16:04:23 crc kubenswrapper[4707]: E1127 16:04:23.250970 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7d382481-3c1e-49ed-8e27-265d495aa776-metrics-certs podName:7d382481-3c1e-49ed-8e27-265d495aa776 nodeName:}" failed. No retries permitted until 2025-11-27 16:04:25.25093498 +0000 UTC m=+40.882383788 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/7d382481-3c1e-49ed-8e27-265d495aa776-metrics-certs") pod "network-metrics-daemon-qcl5k" (UID: "7d382481-3c1e-49ed-8e27-265d495aa776") : object "openshift-multus"/"metrics-daemon-secret" not registered Nov 27 16:04:23 crc kubenswrapper[4707]: I1127 16:04:23.332957 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:23 crc kubenswrapper[4707]: I1127 16:04:23.333033 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:23 crc kubenswrapper[4707]: I1127 16:04:23.333052 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:23 crc kubenswrapper[4707]: I1127 16:04:23.333079 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:23 crc kubenswrapper[4707]: I1127 16:04:23.333098 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:23Z","lastTransitionTime":"2025-11-27T16:04:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:04:23 crc kubenswrapper[4707]: I1127 16:04:23.436237 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:23 crc kubenswrapper[4707]: I1127 16:04:23.436348 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:23 crc kubenswrapper[4707]: I1127 16:04:23.436394 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:23 crc kubenswrapper[4707]: I1127 16:04:23.436423 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:23 crc kubenswrapper[4707]: I1127 16:04:23.436443 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:23Z","lastTransitionTime":"2025-11-27T16:04:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:04:23 crc kubenswrapper[4707]: I1127 16:04:23.540499 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:23 crc kubenswrapper[4707]: I1127 16:04:23.540595 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:23 crc kubenswrapper[4707]: I1127 16:04:23.540615 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:23 crc kubenswrapper[4707]: I1127 16:04:23.540640 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:23 crc kubenswrapper[4707]: I1127 16:04:23.540658 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:23Z","lastTransitionTime":"2025-11-27T16:04:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:04:23 crc kubenswrapper[4707]: I1127 16:04:23.644482 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:23 crc kubenswrapper[4707]: I1127 16:04:23.644561 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:23 crc kubenswrapper[4707]: I1127 16:04:23.644580 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:23 crc kubenswrapper[4707]: I1127 16:04:23.644607 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:23 crc kubenswrapper[4707]: I1127 16:04:23.644630 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:23Z","lastTransitionTime":"2025-11-27T16:04:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:04:23 crc kubenswrapper[4707]: I1127 16:04:23.747273 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:23 crc kubenswrapper[4707]: I1127 16:04:23.748329 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:23 crc kubenswrapper[4707]: I1127 16:04:23.748530 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:23 crc kubenswrapper[4707]: I1127 16:04:23.748704 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:23 crc kubenswrapper[4707]: I1127 16:04:23.748860 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:23Z","lastTransitionTime":"2025-11-27T16:04:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:04:23 crc kubenswrapper[4707]: I1127 16:04:23.852486 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:23 crc kubenswrapper[4707]: I1127 16:04:23.852547 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:23 crc kubenswrapper[4707]: I1127 16:04:23.852563 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:23 crc kubenswrapper[4707]: I1127 16:04:23.852591 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:23 crc kubenswrapper[4707]: I1127 16:04:23.852613 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:23Z","lastTransitionTime":"2025-11-27T16:04:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:04:23 crc kubenswrapper[4707]: I1127 16:04:23.956482 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:23 crc kubenswrapper[4707]: I1127 16:04:23.956556 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:23 crc kubenswrapper[4707]: I1127 16:04:23.956574 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:23 crc kubenswrapper[4707]: I1127 16:04:23.956601 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:23 crc kubenswrapper[4707]: I1127 16:04:23.956619 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:23Z","lastTransitionTime":"2025-11-27T16:04:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:04:24 crc kubenswrapper[4707]: I1127 16:04:24.060698 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:24 crc kubenswrapper[4707]: I1127 16:04:24.060749 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:24 crc kubenswrapper[4707]: I1127 16:04:24.060765 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:24 crc kubenswrapper[4707]: I1127 16:04:24.060789 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:24 crc kubenswrapper[4707]: I1127 16:04:24.060805 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:24Z","lastTransitionTime":"2025-11-27T16:04:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:04:24 crc kubenswrapper[4707]: I1127 16:04:24.164146 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:24 crc kubenswrapper[4707]: I1127 16:04:24.164202 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:24 crc kubenswrapper[4707]: I1127 16:04:24.164214 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:24 crc kubenswrapper[4707]: I1127 16:04:24.164236 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:24 crc kubenswrapper[4707]: I1127 16:04:24.164249 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:24Z","lastTransitionTime":"2025-11-27T16:04:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:04:24 crc kubenswrapper[4707]: I1127 16:04:24.266864 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:24 crc kubenswrapper[4707]: I1127 16:04:24.266915 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:24 crc kubenswrapper[4707]: I1127 16:04:24.266926 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:24 crc kubenswrapper[4707]: I1127 16:04:24.266954 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:24 crc kubenswrapper[4707]: I1127 16:04:24.266966 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:24Z","lastTransitionTime":"2025-11-27T16:04:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:04:24 crc kubenswrapper[4707]: I1127 16:04:24.369591 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:24 crc kubenswrapper[4707]: I1127 16:04:24.369905 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:24 crc kubenswrapper[4707]: I1127 16:04:24.369983 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:24 crc kubenswrapper[4707]: I1127 16:04:24.370077 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:24 crc kubenswrapper[4707]: I1127 16:04:24.370150 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:24Z","lastTransitionTime":"2025-11-27T16:04:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:04:24 crc kubenswrapper[4707]: I1127 16:04:24.473401 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:24 crc kubenswrapper[4707]: I1127 16:04:24.473429 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:24 crc kubenswrapper[4707]: I1127 16:04:24.473439 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:24 crc kubenswrapper[4707]: I1127 16:04:24.473456 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:24 crc kubenswrapper[4707]: I1127 16:04:24.473467 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:24Z","lastTransitionTime":"2025-11-27T16:04:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:04:24 crc kubenswrapper[4707]: I1127 16:04:24.577063 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:24 crc kubenswrapper[4707]: I1127 16:04:24.577165 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:24 crc kubenswrapper[4707]: I1127 16:04:24.577189 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:24 crc kubenswrapper[4707]: I1127 16:04:24.577222 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:24 crc kubenswrapper[4707]: I1127 16:04:24.577243 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:24Z","lastTransitionTime":"2025-11-27T16:04:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:04:24 crc kubenswrapper[4707]: I1127 16:04:24.680755 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:24 crc kubenswrapper[4707]: I1127 16:04:24.681152 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:24 crc kubenswrapper[4707]: I1127 16:04:24.681220 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:24 crc kubenswrapper[4707]: I1127 16:04:24.681531 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:24 crc kubenswrapper[4707]: I1127 16:04:24.681602 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:24Z","lastTransitionTime":"2025-11-27T16:04:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:04:24 crc kubenswrapper[4707]: I1127 16:04:24.785753 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:24 crc kubenswrapper[4707]: I1127 16:04:24.786596 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:24 crc kubenswrapper[4707]: I1127 16:04:24.786656 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:24 crc kubenswrapper[4707]: I1127 16:04:24.786699 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:24 crc kubenswrapper[4707]: I1127 16:04:24.786738 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:24Z","lastTransitionTime":"2025-11-27T16:04:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:04:24 crc kubenswrapper[4707]: I1127 16:04:24.890305 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:24 crc kubenswrapper[4707]: I1127 16:04:24.890618 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:24 crc kubenswrapper[4707]: I1127 16:04:24.890639 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:24 crc kubenswrapper[4707]: I1127 16:04:24.890663 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:24 crc kubenswrapper[4707]: I1127 16:04:24.890680 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:24Z","lastTransitionTime":"2025-11-27T16:04:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:04:24 crc kubenswrapper[4707]: I1127 16:04:24.994262 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:24 crc kubenswrapper[4707]: I1127 16:04:24.994331 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:24 crc kubenswrapper[4707]: I1127 16:04:24.994352 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:24 crc kubenswrapper[4707]: I1127 16:04:24.994419 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:24 crc kubenswrapper[4707]: I1127 16:04:24.994449 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:24Z","lastTransitionTime":"2025-11-27T16:04:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:04:25 crc kubenswrapper[4707]: I1127 16:04:25.098085 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:25 crc kubenswrapper[4707]: I1127 16:04:25.098149 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:25 crc kubenswrapper[4707]: I1127 16:04:25.098165 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:25 crc kubenswrapper[4707]: I1127 16:04:25.098229 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:25 crc kubenswrapper[4707]: I1127 16:04:25.098248 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:25Z","lastTransitionTime":"2025-11-27T16:04:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:04:25 crc kubenswrapper[4707]: I1127 16:04:25.195360 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 27 16:04:25 crc kubenswrapper[4707]: E1127 16:04:25.195520 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 27 16:04:25 crc kubenswrapper[4707]: I1127 16:04:25.195584 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 27 16:04:25 crc kubenswrapper[4707]: E1127 16:04:25.195626 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 27 16:04:25 crc kubenswrapper[4707]: I1127 16:04:25.195892 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 27 16:04:25 crc kubenswrapper[4707]: E1127 16:04:25.195954 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 27 16:04:25 crc kubenswrapper[4707]: I1127 16:04:25.196088 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-qcl5k" Nov 27 16:04:25 crc kubenswrapper[4707]: E1127 16:04:25.196178 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qcl5k" podUID="7d382481-3c1e-49ed-8e27-265d495aa776" Nov 27 16:04:25 crc kubenswrapper[4707]: I1127 16:04:25.200398 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:25 crc kubenswrapper[4707]: I1127 16:04:25.200430 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:25 crc kubenswrapper[4707]: I1127 16:04:25.200441 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:25 crc kubenswrapper[4707]: I1127 16:04:25.200457 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:25 crc kubenswrapper[4707]: I1127 16:04:25.200471 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:25Z","lastTransitionTime":"2025-11-27T16:04:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:04:25 crc kubenswrapper[4707]: I1127 16:04:25.216005 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7f481770-a9a1-422a-a707-e841a24a1503\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f0e4a5fa520e0dc048bef42e48e14bd06eb72586ca2b8e145734aa5450aa869d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://36f6a5537d5
031d21ea157fbcedea40756eac2c44aa1859c0cfbf3dc9a9d9a13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c7144aa332827972c3ff984eb05b835b88c4d9d05552ac631fbca49f13e193a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8211f74a0c0d16257330ebbb5727247d4229848595af3cb65f1cb752b05f3c37\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:03:45Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:25Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:25 crc kubenswrapper[4707]: I1127 16:04:25.228556 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:05Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:25Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:25 crc kubenswrapper[4707]: I1127 16:04:25.242324 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-bhmsc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d1ec526b-6fb0-4c87-bd87-6aaf843e0c78\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f3b764fd994edba08f5e909683bd065ecf09fb7c9906457e37bb91343aad14a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kvs85\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:04:06Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-bhmsc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:25Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:25 crc kubenswrapper[4707]: I1127 16:04:25.264794 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:05Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:25Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:25 crc kubenswrapper[4707]: I1127 16:04:25.273534 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7d382481-3c1e-49ed-8e27-265d495aa776-metrics-certs\") pod \"network-metrics-daemon-qcl5k\" (UID: \"7d382481-3c1e-49ed-8e27-265d495aa776\") " pod="openshift-multus/network-metrics-daemon-qcl5k" Nov 27 16:04:25 crc kubenswrapper[4707]: E1127 16:04:25.273802 4707 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Nov 27 16:04:25 crc kubenswrapper[4707]: E1127 16:04:25.273913 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7d382481-3c1e-49ed-8e27-265d495aa776-metrics-certs podName:7d382481-3c1e-49ed-8e27-265d495aa776 nodeName:}" failed. No retries permitted until 2025-11-27 16:04:29.273887334 +0000 UTC m=+44.905336142 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/7d382481-3c1e-49ed-8e27-265d495aa776-metrics-certs") pod "network-metrics-daemon-qcl5k" (UID: "7d382481-3c1e-49ed-8e27-265d495aa776") : object "openshift-multus"/"metrics-daemon-secret" not registered Nov 27 16:04:25 crc kubenswrapper[4707]: I1127 16:04:25.287173 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://57225b64e7671f397832e9ba112febbde78ae4c24b37a8f097d4bdeda8931c0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run
/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6960cfa0c6d0dedc432f90bf267ce02fe8264924a43865f79eadd9affbe955f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:25Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:25 crc kubenswrapper[4707]: I1127 16:04:25.304673 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:25 crc kubenswrapper[4707]: I1127 16:04:25.304776 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:25 crc kubenswrapper[4707]: I1127 16:04:25.304796 
4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:25 crc kubenswrapper[4707]: I1127 16:04:25.304832 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:25 crc kubenswrapper[4707]: I1127 16:04:25.304850 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:25Z","lastTransitionTime":"2025-11-27T16:04:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:04:25 crc kubenswrapper[4707]: I1127 16:04:25.307252 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6cf743e8776aa2de4138704d9e163af8baf7b8e5db139c13775673cf32a1ee88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f
55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:25Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:25 crc kubenswrapper[4707]: I1127 16:04:25.335305 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-9c4xg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37bcfa90-e50b-4a63-96b2-cd8a0ce5bbf7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://66dcf851b2fb739c3dae214852a24405d2c8b4462dde998903b9dfe7c9e61e00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hwzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8cd7c5e0c832499cb9ddc7b18c7d260de06df34208f86284ebe77964bfcf1c12\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8cd7c5e0c832499cb9ddc7b18c7d260de06df34208f86284ebe77964bfcf1c12\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:04:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hwzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4438fd6ab08cd53dff20617ccadaa81e4bddbd7ced310f190d02613079f96f64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4438fd6ab08cd53dff20617ccadaa81e4bddbd7ced310f190d02613079f96f64\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:04:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:04:09Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hwzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19a31a44d1ecfee004f2a705df22a188782cd9f235848ddf5f48bd740833c6a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://19a31a44d1ecfee004f2a705df22a188782cd9f235848ddf5f48bd740833c6a8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:04:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:04:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hwzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://db8a5
9276d18141805bc74721d625e2004f2cea1f47efc90b3290c68eba2a35b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://db8a59276d18141805bc74721d625e2004f2cea1f47efc90b3290c68eba2a35b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:04:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:04:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hwzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://21efa1504e98747329f8a9220126ea259edde61bd4d272ea55691e9952950252\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://21efa1504e98747329f8a9220126ea259edde61bd4d272ea55691e9952950252\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:04:13Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-11-27T16:04:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hwzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://851c05dbd0947990d464d1bd72d62577bdf12517aea808904aa01fc79fd2d7c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://851c05dbd0947990d464d1bd72d62577bdf12517aea808904aa01fc79fd2d7c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:04:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:04:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hwzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:04:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-9c4xg\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:25Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:25 crc kubenswrapper[4707]: I1127 16:04:25.362334 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xkmt7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"55af9c67-18ce-46f1-a761-d11ce16f42d6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a4eadca4b464dec02ca7aba6bc23d95b9c3965294d4b0224299279c630f3718\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6xmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3718d2f738ef780c738566c163d52afe355de6b8eb6004aa8e1dd5f37aa84d68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6xmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://44699a190cf1065e537f588d7479a10aab8fd4ffd1df2348c616d999ffdf7077\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6xmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6432d53914cee9e016c2134d99e48fa9ff1c05c0b88033992c2e0c70fc40d2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:07Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6xmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://93825fe5d9cf2d63c3e723b063f417f7e92442b14365852f935a06d316f443fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6xmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ae3fab13b078702f87ca8c586c9592c4554c95ead662d81ca65b50f368b6dcb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6xmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://def6189807ad0285980b2859cdfe683dd5c8f7c79104d57abeef37450f73655d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://def6189807ad0285980b2859cdfe683dd5c8f7c79104d57abeef37450f73655d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-27T16:04:17Z\\\",\\\"message\\\":\\\"lversions/factory.go:140\\\\nI1127 16:04:17.841915 6098 reflector.go:311] Stopping reflector *v1.EgressService (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressservice/v1/apis/informers/externalversions/factory.go:140\\\\nI1127 16:04:17.841959 6098 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1127 
16:04:17.842031 6098 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1127 16:04:17.842064 6098 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1127 16:04:17.842086 6098 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1127 16:04:17.842164 6098 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1127 16:04:17.842205 6098 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1127 16:04:17.842258 6098 factory.go:656] Stopping watch factory\\\\nI1127 16:04:17.842299 6098 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1127 16:04:17.842297 6098 handler.go:208] Removed *v1.Node event handler 2\\\\nI1127 16:04:17.842315 6098 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1127 16:04:17.842332 6098 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1127 16:04:17.842334 6098 handler.go:208] Removed *v1.Node event handler 7\\\\nI1127 16:04:17.842341 6098 handler.go:208] Removed *v1.Namespace ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-27T16:04:16Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-xkmt7_openshift-ovn-kubernetes(55af9c67-18ce-46f1-a761-d11ce16f42d6)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6xmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://db3362aeb65b368d2d9181cc898270fffb60938d0e9d32a862130d06a9f572a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6xmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c327fee553e5575dec76bc4456bc59424c89a9eea360cd4b23b594eff4c67a8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c327fee553e5575dec
76bc4456bc59424c89a9eea360cd4b23b594eff4c67a8d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:04:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6xmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:04:06Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-xkmt7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:25Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:25 crc kubenswrapper[4707]: I1127 16:04:25.387186 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-5x488" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"433a450e-2371-4893-b21e-19707b40e28f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://12b178a129ce5efbfdb77d023236ed9d674d0aba3c3a2903589aae995a5a753c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52jnc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://35888508a810deea27c61cbf1ec7362be9f61
f99a63d4ab2b4c296a9ed47c65f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52jnc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:04:19Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-5x488\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:25Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:25 crc kubenswrapper[4707]: I1127 16:04:25.407607 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-pdngz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5a0b58a3-9ac2-4446-b507-88eba42aa060\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://feac173e7b0d6e3cb53bd8d407d3d08e59e099a63702a377b7515283f85703ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6sq6r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:04:20Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-pdngz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:25Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:25 crc kubenswrapper[4707]: I1127 16:04:25.411104 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:25 crc kubenswrapper[4707]: I1127 16:04:25.411306 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:25 crc kubenswrapper[4707]: I1127 16:04:25.411423 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:25 crc kubenswrapper[4707]: I1127 16:04:25.411537 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:25 crc kubenswrapper[4707]: I1127 16:04:25.411635 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:25Z","lastTransitionTime":"2025-11-27T16:04:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:04:25 crc kubenswrapper[4707]: I1127 16:04:25.427725 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-qcl5k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d382481-3c1e-49ed-8e27-265d495aa776\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:21Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:21Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j2krv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j2krv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:04:21Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-qcl5k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:25Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:25 crc 
kubenswrapper[4707]: I1127 16:04:25.450295 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b00fa8691d479d1db194704ff73bb85f1108c9541dd09ed4c1b02d0a3f0ab53d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:25Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:25 crc kubenswrapper[4707]: I1127 16:04:25.474567 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-js6mm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ca48c08-f39d-41a2-847a-c893a2111492\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://be6a704ee152614d36d4831778b016c009b0aab2d6cf1c2ec9767f8ecdeb7af6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\
\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bjt58\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:04:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-js6mm\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:25Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:25 crc kubenswrapper[4707]: I1127 16:04:25.498229 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"59e09a52-f64d-41da-a23b-7d3e34e8f109\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://489349489e52f4a113203c0c196cfe1ef96e97c6fc3c3d777a28e90f22aaee92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\
\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ea14f8eb17dbb350f7c0ad2285b4bb5ca2d8b2aae6840646d68a397ac2043d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://775fc674c0dab97e0c5a2548d467b973c592458dad1b88ce92344486a3a28eb5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\
\":\\\"cri-o://6f502a2429578531cb3ebbfde6bb4c049eed201a496ff3772c06ca28cace89b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://30b52b26bdca5b0705186872e6dd2eba3ece6436e866a4a5e30f4c2ecdbbaf20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b6be834802616cdf10f8b694bb0ce1b8828d0885e8801c5d890e74837d672a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b6be834802616cdf10f8b694bb0ce1b8828d0885e8801c5d890e74837d672a8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:03:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://de1e5852273af33e41f21c4df29c3e32cbfd6b2eb200b64d9e5353a85718074c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://de1e5852273af33e41f21c4df29c3e32cbfd6b2eb200b64d9e5353a85718074c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:03:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:03:47Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://c6f7eb0e4b78b6d2e316bd2c8f2ad3081c8eb0ca3c55a7063ccf281572ba2992\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\
\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c6f7eb0e4b78b6d2e316bd2c8f2ad3081c8eb0ca3c55a7063ccf281572ba2992\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:03:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:03:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:03:45Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:25Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:25 crc kubenswrapper[4707]: I1127 16:04:25.515738 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:25 crc kubenswrapper[4707]: I1127 16:04:25.515791 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:25 crc kubenswrapper[4707]: I1127 16:04:25.515810 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:25 crc kubenswrapper[4707]: I1127 16:04:25.515837 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:25 crc kubenswrapper[4707]: I1127 16:04:25.515858 4707 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:25Z","lastTransitionTime":"2025-11-27T16:04:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:04:25 crc kubenswrapper[4707]: I1127 16:04:25.521478 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f300806f-b97e-4453-b011-19442ca1240d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a25770e0e3eebd7e807f3775101a1cd5d47073e4a86850da475fecd4a9c88ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"r
unning\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7412dcdb5a249551f558a01a8fecef5f873e5e0a16e917d5cbc1ca74a917f228\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://791f332aec1698d02da5ec585ad2fbfcb77529f24abc0e3a77581be38d7ce277\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4
a47eb0592819f0bc514399e163c88e2c63111aa0ec4a28f00f809ce1bdf2aa8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f12d0d89698cd6f702c595882dc7ed97fbc585b8a4ef3a8b3755c984a823dd9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-27T16:04:05Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1127 16:03:58.832133 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1127 16:03:58.833970 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3721107874/tls.crt::/tmp/serving-cert-3721107874/tls.key\\\\\\\"\\\\nI1127 16:04:05.094602 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1127 16:04:05.099813 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1127 16:04:05.099850 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1127 16:04:05.099879 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1127 16:04:05.099886 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1127 16:04:05.108199 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1127 16:04:05.108225 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1127 16:04:05.108231 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1127 16:04:05.108269 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1127 16:04:05.108276 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1127 16:04:05.108281 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1127 16:04:05.108284 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1127 16:04:05.108289 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1127 16:04:05.114674 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-27T16:03:48Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf16f5508845d69184573b51f3052f79123556667cbd3ddf4b2adfb135312a3a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:48Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\
"containerID\\\":\\\"cri-o://608ad63f8f5352dad650253df603590a1cc20bc05d86dd561d659295264ab4b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://608ad63f8f5352dad650253df603590a1cc20bc05d86dd561d659295264ab4b5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:03:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:03:45Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:25Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:25 crc kubenswrapper[4707]: I1127 16:04:25.543709 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:25Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:25 crc kubenswrapper[4707]: I1127 16:04:25.561545 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-c995m" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a83beb0d-8dd1-434a-ace2-933f98e3956f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://419d9b9f3f298028cf5f49afd438f1de90b8d1203cdfb09b924878b59e0fb723\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5xtzp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6227d8aa794cce73ef7607c3e6cf8f853c829bf5
6493acf04c0b555ddd264ba5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5xtzp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:04:06Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-c995m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:25Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:25 crc kubenswrapper[4707]: I1127 16:04:25.620422 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:25 crc kubenswrapper[4707]: I1127 16:04:25.620481 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:25 crc kubenswrapper[4707]: I1127 16:04:25.620492 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:25 crc 
kubenswrapper[4707]: I1127 16:04:25.620512 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:25 crc kubenswrapper[4707]: I1127 16:04:25.620523 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:25Z","lastTransitionTime":"2025-11-27T16:04:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:04:25 crc kubenswrapper[4707]: I1127 16:04:25.724324 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:25 crc kubenswrapper[4707]: I1127 16:04:25.724448 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:25 crc kubenswrapper[4707]: I1127 16:04:25.724466 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:25 crc kubenswrapper[4707]: I1127 16:04:25.724493 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:25 crc kubenswrapper[4707]: I1127 16:04:25.724511 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:25Z","lastTransitionTime":"2025-11-27T16:04:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:04:25 crc kubenswrapper[4707]: I1127 16:04:25.828356 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:25 crc kubenswrapper[4707]: I1127 16:04:25.828456 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:25 crc kubenswrapper[4707]: I1127 16:04:25.828475 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:25 crc kubenswrapper[4707]: I1127 16:04:25.828507 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:25 crc kubenswrapper[4707]: I1127 16:04:25.828530 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:25Z","lastTransitionTime":"2025-11-27T16:04:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:04:25 crc kubenswrapper[4707]: I1127 16:04:25.931753 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:25 crc kubenswrapper[4707]: I1127 16:04:25.931832 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:25 crc kubenswrapper[4707]: I1127 16:04:25.931851 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:25 crc kubenswrapper[4707]: I1127 16:04:25.931882 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:25 crc kubenswrapper[4707]: I1127 16:04:25.931903 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:25Z","lastTransitionTime":"2025-11-27T16:04:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:04:26 crc kubenswrapper[4707]: I1127 16:04:26.035158 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:26 crc kubenswrapper[4707]: I1127 16:04:26.035325 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:26 crc kubenswrapper[4707]: I1127 16:04:26.035353 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:26 crc kubenswrapper[4707]: I1127 16:04:26.035416 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:26 crc kubenswrapper[4707]: I1127 16:04:26.035443 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:26Z","lastTransitionTime":"2025-11-27T16:04:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:04:26 crc kubenswrapper[4707]: I1127 16:04:26.138991 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:26 crc kubenswrapper[4707]: I1127 16:04:26.139074 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:26 crc kubenswrapper[4707]: I1127 16:04:26.139097 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:26 crc kubenswrapper[4707]: I1127 16:04:26.139134 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:26 crc kubenswrapper[4707]: I1127 16:04:26.139157 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:26Z","lastTransitionTime":"2025-11-27T16:04:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:04:26 crc kubenswrapper[4707]: I1127 16:04:26.241855 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:26 crc kubenswrapper[4707]: I1127 16:04:26.241936 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:26 crc kubenswrapper[4707]: I1127 16:04:26.241955 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:26 crc kubenswrapper[4707]: I1127 16:04:26.241984 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:26 crc kubenswrapper[4707]: I1127 16:04:26.242005 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:26Z","lastTransitionTime":"2025-11-27T16:04:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:04:26 crc kubenswrapper[4707]: I1127 16:04:26.344750 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:26 crc kubenswrapper[4707]: I1127 16:04:26.344821 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:26 crc kubenswrapper[4707]: I1127 16:04:26.344836 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:26 crc kubenswrapper[4707]: I1127 16:04:26.344857 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:26 crc kubenswrapper[4707]: I1127 16:04:26.344870 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:26Z","lastTransitionTime":"2025-11-27T16:04:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:04:26 crc kubenswrapper[4707]: I1127 16:04:26.448153 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:26 crc kubenswrapper[4707]: I1127 16:04:26.448213 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:26 crc kubenswrapper[4707]: I1127 16:04:26.448226 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:26 crc kubenswrapper[4707]: I1127 16:04:26.448254 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:26 crc kubenswrapper[4707]: I1127 16:04:26.448267 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:26Z","lastTransitionTime":"2025-11-27T16:04:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:04:26 crc kubenswrapper[4707]: I1127 16:04:26.552444 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:26 crc kubenswrapper[4707]: I1127 16:04:26.552512 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:26 crc kubenswrapper[4707]: I1127 16:04:26.552532 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:26 crc kubenswrapper[4707]: I1127 16:04:26.552559 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:26 crc kubenswrapper[4707]: I1127 16:04:26.552581 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:26Z","lastTransitionTime":"2025-11-27T16:04:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:04:26 crc kubenswrapper[4707]: I1127 16:04:26.656248 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:26 crc kubenswrapper[4707]: I1127 16:04:26.656350 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:26 crc kubenswrapper[4707]: I1127 16:04:26.656399 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:26 crc kubenswrapper[4707]: I1127 16:04:26.656427 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:26 crc kubenswrapper[4707]: I1127 16:04:26.656448 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:26Z","lastTransitionTime":"2025-11-27T16:04:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:04:26 crc kubenswrapper[4707]: I1127 16:04:26.762221 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:26 crc kubenswrapper[4707]: I1127 16:04:26.762313 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:26 crc kubenswrapper[4707]: I1127 16:04:26.762331 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:26 crc kubenswrapper[4707]: I1127 16:04:26.762358 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:26 crc kubenswrapper[4707]: I1127 16:04:26.762411 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:26Z","lastTransitionTime":"2025-11-27T16:04:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:04:26 crc kubenswrapper[4707]: I1127 16:04:26.865596 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:26 crc kubenswrapper[4707]: I1127 16:04:26.865681 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:26 crc kubenswrapper[4707]: I1127 16:04:26.865709 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:26 crc kubenswrapper[4707]: I1127 16:04:26.865741 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:26 crc kubenswrapper[4707]: I1127 16:04:26.865767 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:26Z","lastTransitionTime":"2025-11-27T16:04:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:04:26 crc kubenswrapper[4707]: I1127 16:04:26.969553 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:26 crc kubenswrapper[4707]: I1127 16:04:26.969626 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:26 crc kubenswrapper[4707]: I1127 16:04:26.969644 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:26 crc kubenswrapper[4707]: I1127 16:04:26.969670 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:26 crc kubenswrapper[4707]: I1127 16:04:26.969688 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:26Z","lastTransitionTime":"2025-11-27T16:04:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:04:27 crc kubenswrapper[4707]: I1127 16:04:27.073486 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:27 crc kubenswrapper[4707]: I1127 16:04:27.073536 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:27 crc kubenswrapper[4707]: I1127 16:04:27.073552 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:27 crc kubenswrapper[4707]: I1127 16:04:27.073576 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:27 crc kubenswrapper[4707]: I1127 16:04:27.073594 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:27Z","lastTransitionTime":"2025-11-27T16:04:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Nov 27 16:04:27 crc kubenswrapper[4707]: I1127 16:04:27.176969 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 27 16:04:27 crc kubenswrapper[4707]: I1127 16:04:27.177044 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 27 16:04:27 crc kubenswrapper[4707]: I1127 16:04:27.177067 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 27 16:04:27 crc kubenswrapper[4707]: I1127 16:04:27.177103 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 27 16:04:27 crc kubenswrapper[4707]: I1127 16:04:27.177131 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:27Z","lastTransitionTime":"2025-11-27T16:04:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Nov 27 16:04:27 crc kubenswrapper[4707]: I1127 16:04:27.195134 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qcl5k"
Nov 27 16:04:27 crc kubenswrapper[4707]: I1127 16:04:27.195154 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Nov 27 16:04:27 crc kubenswrapper[4707]: I1127 16:04:27.195187 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Nov 27 16:04:27 crc kubenswrapper[4707]: I1127 16:04:27.195236 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Nov 27 16:04:27 crc kubenswrapper[4707]: E1127 16:04:27.195343 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qcl5k" podUID="7d382481-3c1e-49ed-8e27-265d495aa776"
Nov 27 16:04:27 crc kubenswrapper[4707]: E1127 16:04:27.195534 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Nov 27 16:04:27 crc kubenswrapper[4707]: E1127 16:04:27.195695 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Nov 27 16:04:27 crc kubenswrapper[4707]: E1127 16:04:27.195861 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Nov 27 16:04:27 crc kubenswrapper[4707]: I1127 16:04:27.280177 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 27 16:04:27 crc kubenswrapper[4707]: I1127 16:04:27.280258 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 27 16:04:27 crc kubenswrapper[4707]: I1127 16:04:27.280282 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 27 16:04:27 crc kubenswrapper[4707]: I1127 16:04:27.280314 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 27 16:04:27 crc kubenswrapper[4707]: I1127 16:04:27.280335 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:27Z","lastTransitionTime":"2025-11-27T16:04:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Nov 27 16:04:27 crc kubenswrapper[4707]: I1127 16:04:27.383921 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 27 16:04:27 crc kubenswrapper[4707]: I1127 16:04:27.384348 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 27 16:04:27 crc kubenswrapper[4707]: I1127 16:04:27.384626 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 27 16:04:27 crc kubenswrapper[4707]: I1127 16:04:27.384800 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 27 16:04:27 crc kubenswrapper[4707]: I1127 16:04:27.384928 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:27Z","lastTransitionTime":"2025-11-27T16:04:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Nov 27 16:04:27 crc kubenswrapper[4707]: I1127 16:04:27.488432 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 27 16:04:27 crc kubenswrapper[4707]: I1127 16:04:27.488508 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 27 16:04:27 crc kubenswrapper[4707]: I1127 16:04:27.488527 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 27 16:04:27 crc kubenswrapper[4707]: I1127 16:04:27.488556 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 27 16:04:27 crc kubenswrapper[4707]: I1127 16:04:27.488576 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:27Z","lastTransitionTime":"2025-11-27T16:04:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Nov 27 16:04:27 crc kubenswrapper[4707]: I1127 16:04:27.591892 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 27 16:04:27 crc kubenswrapper[4707]: I1127 16:04:27.591987 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 27 16:04:27 crc kubenswrapper[4707]: I1127 16:04:27.592017 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 27 16:04:27 crc kubenswrapper[4707]: I1127 16:04:27.592057 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 27 16:04:27 crc kubenswrapper[4707]: I1127 16:04:27.592083 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:27Z","lastTransitionTime":"2025-11-27T16:04:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Nov 27 16:04:27 crc kubenswrapper[4707]: I1127 16:04:27.696083 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 27 16:04:27 crc kubenswrapper[4707]: I1127 16:04:27.696797 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 27 16:04:27 crc kubenswrapper[4707]: I1127 16:04:27.696952 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 27 16:04:27 crc kubenswrapper[4707]: I1127 16:04:27.697117 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 27 16:04:27 crc kubenswrapper[4707]: I1127 16:04:27.697252 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:27Z","lastTransitionTime":"2025-11-27T16:04:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Nov 27 16:04:27 crc kubenswrapper[4707]: I1127 16:04:27.801118 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 27 16:04:27 crc kubenswrapper[4707]: I1127 16:04:27.801192 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 27 16:04:27 crc kubenswrapper[4707]: I1127 16:04:27.801216 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 27 16:04:27 crc kubenswrapper[4707]: I1127 16:04:27.801249 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 27 16:04:27 crc kubenswrapper[4707]: I1127 16:04:27.801272 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:27Z","lastTransitionTime":"2025-11-27T16:04:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Nov 27 16:04:27 crc kubenswrapper[4707]: I1127 16:04:27.905073 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 27 16:04:27 crc kubenswrapper[4707]: I1127 16:04:27.905123 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 27 16:04:27 crc kubenswrapper[4707]: I1127 16:04:27.905134 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 27 16:04:27 crc kubenswrapper[4707]: I1127 16:04:27.905152 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 27 16:04:27 crc kubenswrapper[4707]: I1127 16:04:27.905165 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:27Z","lastTransitionTime":"2025-11-27T16:04:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Nov 27 16:04:28 crc kubenswrapper[4707]: I1127 16:04:28.008268 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 27 16:04:28 crc kubenswrapper[4707]: I1127 16:04:28.008366 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 27 16:04:28 crc kubenswrapper[4707]: I1127 16:04:28.008431 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 27 16:04:28 crc kubenswrapper[4707]: I1127 16:04:28.008471 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 27 16:04:28 crc kubenswrapper[4707]: I1127 16:04:28.008518 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:28Z","lastTransitionTime":"2025-11-27T16:04:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Nov 27 16:04:28 crc kubenswrapper[4707]: I1127 16:04:28.111879 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 27 16:04:28 crc kubenswrapper[4707]: I1127 16:04:28.111954 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 27 16:04:28 crc kubenswrapper[4707]: I1127 16:04:28.111978 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 27 16:04:28 crc kubenswrapper[4707]: I1127 16:04:28.112006 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 27 16:04:28 crc kubenswrapper[4707]: I1127 16:04:28.112027 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:28Z","lastTransitionTime":"2025-11-27T16:04:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Nov 27 16:04:28 crc kubenswrapper[4707]: I1127 16:04:28.215747 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 27 16:04:28 crc kubenswrapper[4707]: I1127 16:04:28.215802 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 27 16:04:28 crc kubenswrapper[4707]: I1127 16:04:28.215821 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 27 16:04:28 crc kubenswrapper[4707]: I1127 16:04:28.215845 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 27 16:04:28 crc kubenswrapper[4707]: I1127 16:04:28.215864 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:28Z","lastTransitionTime":"2025-11-27T16:04:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Nov 27 16:04:28 crc kubenswrapper[4707]: I1127 16:04:28.318977 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 27 16:04:28 crc kubenswrapper[4707]: I1127 16:04:28.319047 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 27 16:04:28 crc kubenswrapper[4707]: I1127 16:04:28.319069 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 27 16:04:28 crc kubenswrapper[4707]: I1127 16:04:28.319098 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 27 16:04:28 crc kubenswrapper[4707]: I1127 16:04:28.319119 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:28Z","lastTransitionTime":"2025-11-27T16:04:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Nov 27 16:04:28 crc kubenswrapper[4707]: I1127 16:04:28.422749 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 27 16:04:28 crc kubenswrapper[4707]: I1127 16:04:28.422810 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 27 16:04:28 crc kubenswrapper[4707]: I1127 16:04:28.422829 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 27 16:04:28 crc kubenswrapper[4707]: I1127 16:04:28.422865 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 27 16:04:28 crc kubenswrapper[4707]: I1127 16:04:28.422891 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:28Z","lastTransitionTime":"2025-11-27T16:04:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Nov 27 16:04:28 crc kubenswrapper[4707]: I1127 16:04:28.525694 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 27 16:04:28 crc kubenswrapper[4707]: I1127 16:04:28.525760 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 27 16:04:28 crc kubenswrapper[4707]: I1127 16:04:28.525779 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 27 16:04:28 crc kubenswrapper[4707]: I1127 16:04:28.525806 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 27 16:04:28 crc kubenswrapper[4707]: I1127 16:04:28.525825 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:28Z","lastTransitionTime":"2025-11-27T16:04:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Nov 27 16:04:28 crc kubenswrapper[4707]: I1127 16:04:28.628860 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 27 16:04:28 crc kubenswrapper[4707]: I1127 16:04:28.628924 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 27 16:04:28 crc kubenswrapper[4707]: I1127 16:04:28.628942 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 27 16:04:28 crc kubenswrapper[4707]: I1127 16:04:28.628971 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 27 16:04:28 crc kubenswrapper[4707]: I1127 16:04:28.628991 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:28Z","lastTransitionTime":"2025-11-27T16:04:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Nov 27 16:04:28 crc kubenswrapper[4707]: I1127 16:04:28.732801 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 27 16:04:28 crc kubenswrapper[4707]: I1127 16:04:28.732864 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 27 16:04:28 crc kubenswrapper[4707]: I1127 16:04:28.732883 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 27 16:04:28 crc kubenswrapper[4707]: I1127 16:04:28.732912 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 27 16:04:28 crc kubenswrapper[4707]: I1127 16:04:28.732933 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:28Z","lastTransitionTime":"2025-11-27T16:04:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Nov 27 16:04:28 crc kubenswrapper[4707]: I1127 16:04:28.836904 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 27 16:04:28 crc kubenswrapper[4707]: I1127 16:04:28.836962 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 27 16:04:28 crc kubenswrapper[4707]: I1127 16:04:28.836980 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 27 16:04:28 crc kubenswrapper[4707]: I1127 16:04:28.837004 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 27 16:04:28 crc kubenswrapper[4707]: I1127 16:04:28.837024 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:28Z","lastTransitionTime":"2025-11-27T16:04:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Nov 27 16:04:28 crc kubenswrapper[4707]: I1127 16:04:28.940145 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 27 16:04:28 crc kubenswrapper[4707]: I1127 16:04:28.940212 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 27 16:04:28 crc kubenswrapper[4707]: I1127 16:04:28.940237 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 27 16:04:28 crc kubenswrapper[4707]: I1127 16:04:28.940270 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 27 16:04:28 crc kubenswrapper[4707]: I1127 16:04:28.940294 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:28Z","lastTransitionTime":"2025-11-27T16:04:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Nov 27 16:04:29 crc kubenswrapper[4707]: I1127 16:04:29.043405 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 27 16:04:29 crc kubenswrapper[4707]: I1127 16:04:29.043454 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 27 16:04:29 crc kubenswrapper[4707]: I1127 16:04:29.043466 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 27 16:04:29 crc kubenswrapper[4707]: I1127 16:04:29.043484 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 27 16:04:29 crc kubenswrapper[4707]: I1127 16:04:29.043496 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:29Z","lastTransitionTime":"2025-11-27T16:04:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Nov 27 16:04:29 crc kubenswrapper[4707]: I1127 16:04:29.146332 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 27 16:04:29 crc kubenswrapper[4707]: I1127 16:04:29.146737 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 27 16:04:29 crc kubenswrapper[4707]: I1127 16:04:29.146969 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 27 16:04:29 crc kubenswrapper[4707]: I1127 16:04:29.147131 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 27 16:04:29 crc kubenswrapper[4707]: I1127 16:04:29.147252 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:29Z","lastTransitionTime":"2025-11-27T16:04:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Nov 27 16:04:29 crc kubenswrapper[4707]: I1127 16:04:29.194740 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Nov 27 16:04:29 crc kubenswrapper[4707]: I1127 16:04:29.194775 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Nov 27 16:04:29 crc kubenswrapper[4707]: I1127 16:04:29.194834 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Nov 27 16:04:29 crc kubenswrapper[4707]: I1127 16:04:29.194933 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qcl5k"
Nov 27 16:04:29 crc kubenswrapper[4707]: E1127 16:04:29.195088 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Nov 27 16:04:29 crc kubenswrapper[4707]: E1127 16:04:29.195253 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Nov 27 16:04:29 crc kubenswrapper[4707]: E1127 16:04:29.195508 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Nov 27 16:04:29 crc kubenswrapper[4707]: E1127 16:04:29.195716 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qcl5k" podUID="7d382481-3c1e-49ed-8e27-265d495aa776"
Nov 27 16:04:29 crc kubenswrapper[4707]: I1127 16:04:29.251098 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 27 16:04:29 crc kubenswrapper[4707]: I1127 16:04:29.251165 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 27 16:04:29 crc kubenswrapper[4707]: I1127 16:04:29.251190 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 27 16:04:29 crc kubenswrapper[4707]: I1127 16:04:29.251225 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 27 16:04:29 crc kubenswrapper[4707]: I1127 16:04:29.251248 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:29Z","lastTransitionTime":"2025-11-27T16:04:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Nov 27 16:04:29 crc kubenswrapper[4707]: I1127 16:04:29.322859 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7d382481-3c1e-49ed-8e27-265d495aa776-metrics-certs\") pod \"network-metrics-daemon-qcl5k\" (UID: \"7d382481-3c1e-49ed-8e27-265d495aa776\") " pod="openshift-multus/network-metrics-daemon-qcl5k"
Nov 27 16:04:29 crc kubenswrapper[4707]: E1127 16:04:29.323100 4707 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Nov 27 16:04:29 crc kubenswrapper[4707]: E1127 16:04:29.323937 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7d382481-3c1e-49ed-8e27-265d495aa776-metrics-certs podName:7d382481-3c1e-49ed-8e27-265d495aa776 nodeName:}" failed. No retries permitted until 2025-11-27 16:04:37.323895153 +0000 UTC m=+52.955343961 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/7d382481-3c1e-49ed-8e27-265d495aa776-metrics-certs") pod "network-metrics-daemon-qcl5k" (UID: "7d382481-3c1e-49ed-8e27-265d495aa776") : object "openshift-multus"/"metrics-daemon-secret" not registered
Nov 27 16:04:29 crc kubenswrapper[4707]: I1127 16:04:29.354846 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 27 16:04:29 crc kubenswrapper[4707]: I1127 16:04:29.354926 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 27 16:04:29 crc kubenswrapper[4707]: I1127 16:04:29.354948 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 27 16:04:29 crc kubenswrapper[4707]: I1127 16:04:29.354977 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 27 16:04:29 crc kubenswrapper[4707]: I1127 16:04:29.354999 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:29Z","lastTransitionTime":"2025-11-27T16:04:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Nov 27 16:04:29 crc kubenswrapper[4707]: I1127 16:04:29.458094 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 27 16:04:29 crc kubenswrapper[4707]: I1127 16:04:29.458144 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 27 16:04:29 crc kubenswrapper[4707]: I1127 16:04:29.458162 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 27 16:04:29 crc kubenswrapper[4707]: I1127 16:04:29.458189 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 27 16:04:29 crc kubenswrapper[4707]: I1127 16:04:29.458208 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:29Z","lastTransitionTime":"2025-11-27T16:04:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Nov 27 16:04:29 crc kubenswrapper[4707]: I1127 16:04:29.561348 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 27 16:04:29 crc kubenswrapper[4707]: I1127 16:04:29.561754 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 27 16:04:29 crc kubenswrapper[4707]: I1127 16:04:29.561933 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 27 16:04:29 crc kubenswrapper[4707]: I1127 16:04:29.562063 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 27 16:04:29 crc kubenswrapper[4707]: I1127 16:04:29.562184 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:29Z","lastTransitionTime":"2025-11-27T16:04:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Nov 27 16:04:29 crc kubenswrapper[4707]: I1127 16:04:29.666841 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 27 16:04:29 crc kubenswrapper[4707]: I1127 16:04:29.667290 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 27 16:04:29 crc kubenswrapper[4707]: I1127 16:04:29.667566 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 27 16:04:29 crc kubenswrapper[4707]: I1127 16:04:29.667738 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 27 16:04:29 crc kubenswrapper[4707]: I1127 16:04:29.667879 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:29Z","lastTransitionTime":"2025-11-27T16:04:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Nov 27 16:04:29 crc kubenswrapper[4707]: I1127 16:04:29.771243 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 27 16:04:29 crc kubenswrapper[4707]: I1127 16:04:29.771283 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 27 16:04:29 crc kubenswrapper[4707]: I1127 16:04:29.771293 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 27 16:04:29 crc kubenswrapper[4707]: I1127 16:04:29.771309 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 27 16:04:29 crc kubenswrapper[4707]: I1127 16:04:29.771318 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:29Z","lastTransitionTime":"2025-11-27T16:04:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Nov 27 16:04:29 crc kubenswrapper[4707]: I1127 16:04:29.874656 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 27 16:04:29 crc kubenswrapper[4707]: I1127 16:04:29.874706 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 27 16:04:29 crc kubenswrapper[4707]: I1127 16:04:29.874715 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 27 16:04:29 crc kubenswrapper[4707]: I1127 16:04:29.874739 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 27 16:04:29 crc kubenswrapper[4707]: I1127 16:04:29.874750 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:29Z","lastTransitionTime":"2025-11-27T16:04:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:04:29 crc kubenswrapper[4707]: I1127 16:04:29.982047 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:29 crc kubenswrapper[4707]: I1127 16:04:29.982114 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:29 crc kubenswrapper[4707]: I1127 16:04:29.982142 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:29 crc kubenswrapper[4707]: I1127 16:04:29.982179 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:29 crc kubenswrapper[4707]: I1127 16:04:29.982206 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:29Z","lastTransitionTime":"2025-11-27T16:04:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:04:30 crc kubenswrapper[4707]: I1127 16:04:30.086091 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:30 crc kubenswrapper[4707]: I1127 16:04:30.086154 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:30 crc kubenswrapper[4707]: I1127 16:04:30.086167 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:30 crc kubenswrapper[4707]: I1127 16:04:30.086191 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:30 crc kubenswrapper[4707]: I1127 16:04:30.086207 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:30Z","lastTransitionTime":"2025-11-27T16:04:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:04:30 crc kubenswrapper[4707]: I1127 16:04:30.189987 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:30 crc kubenswrapper[4707]: I1127 16:04:30.190174 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:30 crc kubenswrapper[4707]: I1127 16:04:30.190196 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:30 crc kubenswrapper[4707]: I1127 16:04:30.190228 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:30 crc kubenswrapper[4707]: I1127 16:04:30.190252 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:30Z","lastTransitionTime":"2025-11-27T16:04:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:04:30 crc kubenswrapper[4707]: I1127 16:04:30.294010 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:30 crc kubenswrapper[4707]: I1127 16:04:30.294047 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:30 crc kubenswrapper[4707]: I1127 16:04:30.294056 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:30 crc kubenswrapper[4707]: I1127 16:04:30.294076 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:30 crc kubenswrapper[4707]: I1127 16:04:30.294087 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:30Z","lastTransitionTime":"2025-11-27T16:04:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:04:30 crc kubenswrapper[4707]: I1127 16:04:30.396702 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:30 crc kubenswrapper[4707]: I1127 16:04:30.396742 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:30 crc kubenswrapper[4707]: I1127 16:04:30.396754 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:30 crc kubenswrapper[4707]: I1127 16:04:30.396773 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:30 crc kubenswrapper[4707]: I1127 16:04:30.396785 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:30Z","lastTransitionTime":"2025-11-27T16:04:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:04:30 crc kubenswrapper[4707]: I1127 16:04:30.500183 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:30 crc kubenswrapper[4707]: I1127 16:04:30.500538 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:30 crc kubenswrapper[4707]: I1127 16:04:30.500640 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:30 crc kubenswrapper[4707]: I1127 16:04:30.500738 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:30 crc kubenswrapper[4707]: I1127 16:04:30.500831 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:30Z","lastTransitionTime":"2025-11-27T16:04:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:04:30 crc kubenswrapper[4707]: I1127 16:04:30.604697 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:30 crc kubenswrapper[4707]: I1127 16:04:30.604763 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:30 crc kubenswrapper[4707]: I1127 16:04:30.604782 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:30 crc kubenswrapper[4707]: I1127 16:04:30.604809 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:30 crc kubenswrapper[4707]: I1127 16:04:30.604831 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:30Z","lastTransitionTime":"2025-11-27T16:04:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:04:30 crc kubenswrapper[4707]: I1127 16:04:30.707857 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:30 crc kubenswrapper[4707]: I1127 16:04:30.707926 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:30 crc kubenswrapper[4707]: I1127 16:04:30.707945 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:30 crc kubenswrapper[4707]: I1127 16:04:30.707972 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:30 crc kubenswrapper[4707]: I1127 16:04:30.707993 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:30Z","lastTransitionTime":"2025-11-27T16:04:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:04:30 crc kubenswrapper[4707]: I1127 16:04:30.811812 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:30 crc kubenswrapper[4707]: I1127 16:04:30.811849 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:30 crc kubenswrapper[4707]: I1127 16:04:30.811858 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:30 crc kubenswrapper[4707]: I1127 16:04:30.811876 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:30 crc kubenswrapper[4707]: I1127 16:04:30.811887 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:30Z","lastTransitionTime":"2025-11-27T16:04:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:04:30 crc kubenswrapper[4707]: I1127 16:04:30.916130 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:30 crc kubenswrapper[4707]: I1127 16:04:30.916199 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:30 crc kubenswrapper[4707]: I1127 16:04:30.916223 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:30 crc kubenswrapper[4707]: I1127 16:04:30.916255 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:30 crc kubenswrapper[4707]: I1127 16:04:30.916279 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:30Z","lastTransitionTime":"2025-11-27T16:04:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:04:31 crc kubenswrapper[4707]: I1127 16:04:31.020667 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:31 crc kubenswrapper[4707]: I1127 16:04:31.020745 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:31 crc kubenswrapper[4707]: I1127 16:04:31.020763 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:31 crc kubenswrapper[4707]: I1127 16:04:31.020791 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:31 crc kubenswrapper[4707]: I1127 16:04:31.020815 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:31Z","lastTransitionTime":"2025-11-27T16:04:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:04:31 crc kubenswrapper[4707]: I1127 16:04:31.125262 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:31 crc kubenswrapper[4707]: I1127 16:04:31.125357 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:31 crc kubenswrapper[4707]: I1127 16:04:31.125419 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:31 crc kubenswrapper[4707]: I1127 16:04:31.125453 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:31 crc kubenswrapper[4707]: I1127 16:04:31.125472 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:31Z","lastTransitionTime":"2025-11-27T16:04:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:04:31 crc kubenswrapper[4707]: I1127 16:04:31.195100 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 27 16:04:31 crc kubenswrapper[4707]: E1127 16:04:31.195339 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 27 16:04:31 crc kubenswrapper[4707]: I1127 16:04:31.195751 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 27 16:04:31 crc kubenswrapper[4707]: I1127 16:04:31.195827 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qcl5k" Nov 27 16:04:31 crc kubenswrapper[4707]: I1127 16:04:31.196042 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 27 16:04:31 crc kubenswrapper[4707]: E1127 16:04:31.196054 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 27 16:04:31 crc kubenswrapper[4707]: E1127 16:04:31.196199 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 27 16:04:31 crc kubenswrapper[4707]: E1127 16:04:31.196694 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qcl5k" podUID="7d382481-3c1e-49ed-8e27-265d495aa776" Nov 27 16:04:31 crc kubenswrapper[4707]: I1127 16:04:31.228230 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:31 crc kubenswrapper[4707]: I1127 16:04:31.228290 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:31 crc kubenswrapper[4707]: I1127 16:04:31.228310 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:31 crc kubenswrapper[4707]: I1127 16:04:31.228335 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:31 crc kubenswrapper[4707]: I1127 16:04:31.228358 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:31Z","lastTransitionTime":"2025-11-27T16:04:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:04:31 crc kubenswrapper[4707]: I1127 16:04:31.332426 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:31 crc kubenswrapper[4707]: I1127 16:04:31.332508 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:31 crc kubenswrapper[4707]: I1127 16:04:31.332532 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:31 crc kubenswrapper[4707]: I1127 16:04:31.332561 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:31 crc kubenswrapper[4707]: I1127 16:04:31.332584 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:31Z","lastTransitionTime":"2025-11-27T16:04:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:04:31 crc kubenswrapper[4707]: I1127 16:04:31.435909 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:31 crc kubenswrapper[4707]: I1127 16:04:31.435969 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:31 crc kubenswrapper[4707]: I1127 16:04:31.435988 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:31 crc kubenswrapper[4707]: I1127 16:04:31.436014 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:31 crc kubenswrapper[4707]: I1127 16:04:31.436033 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:31Z","lastTransitionTime":"2025-11-27T16:04:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:04:31 crc kubenswrapper[4707]: I1127 16:04:31.539358 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:31 crc kubenswrapper[4707]: I1127 16:04:31.539455 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:31 crc kubenswrapper[4707]: I1127 16:04:31.539474 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:31 crc kubenswrapper[4707]: I1127 16:04:31.539505 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:31 crc kubenswrapper[4707]: I1127 16:04:31.539526 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:31Z","lastTransitionTime":"2025-11-27T16:04:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:04:31 crc kubenswrapper[4707]: I1127 16:04:31.642831 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:31 crc kubenswrapper[4707]: I1127 16:04:31.642912 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:31 crc kubenswrapper[4707]: I1127 16:04:31.642931 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:31 crc kubenswrapper[4707]: I1127 16:04:31.642957 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:31 crc kubenswrapper[4707]: I1127 16:04:31.642977 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:31Z","lastTransitionTime":"2025-11-27T16:04:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:04:31 crc kubenswrapper[4707]: I1127 16:04:31.746776 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:31 crc kubenswrapper[4707]: I1127 16:04:31.746826 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:31 crc kubenswrapper[4707]: I1127 16:04:31.746847 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:31 crc kubenswrapper[4707]: I1127 16:04:31.746877 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:31 crc kubenswrapper[4707]: I1127 16:04:31.746897 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:31Z","lastTransitionTime":"2025-11-27T16:04:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:04:31 crc kubenswrapper[4707]: I1127 16:04:31.850731 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:31 crc kubenswrapper[4707]: I1127 16:04:31.850796 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:31 crc kubenswrapper[4707]: I1127 16:04:31.850821 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:31 crc kubenswrapper[4707]: I1127 16:04:31.850853 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:31 crc kubenswrapper[4707]: I1127 16:04:31.850881 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:31Z","lastTransitionTime":"2025-11-27T16:04:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:04:31 crc kubenswrapper[4707]: I1127 16:04:31.955817 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:31 crc kubenswrapper[4707]: I1127 16:04:31.955898 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:31 crc kubenswrapper[4707]: I1127 16:04:31.955925 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:31 crc kubenswrapper[4707]: I1127 16:04:31.955954 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:31 crc kubenswrapper[4707]: I1127 16:04:31.955971 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:31Z","lastTransitionTime":"2025-11-27T16:04:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:04:32 crc kubenswrapper[4707]: I1127 16:04:32.060476 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:32 crc kubenswrapper[4707]: I1127 16:04:32.060617 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:32 crc kubenswrapper[4707]: I1127 16:04:32.060638 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:32 crc kubenswrapper[4707]: I1127 16:04:32.060668 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:32 crc kubenswrapper[4707]: I1127 16:04:32.060687 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:32Z","lastTransitionTime":"2025-11-27T16:04:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:04:32 crc kubenswrapper[4707]: I1127 16:04:32.165486 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:32 crc kubenswrapper[4707]: I1127 16:04:32.165573 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:32 crc kubenswrapper[4707]: I1127 16:04:32.165615 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:32 crc kubenswrapper[4707]: I1127 16:04:32.165651 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:32 crc kubenswrapper[4707]: I1127 16:04:32.165676 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:32Z","lastTransitionTime":"2025-11-27T16:04:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:04:32 crc kubenswrapper[4707]: I1127 16:04:32.196157 4707 scope.go:117] "RemoveContainer" containerID="def6189807ad0285980b2859cdfe683dd5c8f7c79104d57abeef37450f73655d" Nov 27 16:04:32 crc kubenswrapper[4707]: I1127 16:04:32.202324 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:32 crc kubenswrapper[4707]: I1127 16:04:32.202455 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:32 crc kubenswrapper[4707]: I1127 16:04:32.202619 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:32 crc kubenswrapper[4707]: I1127 16:04:32.202703 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:32 crc kubenswrapper[4707]: I1127 16:04:32.202727 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:32Z","lastTransitionTime":"2025-11-27T16:04:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Nov 27 16:04:32 crc kubenswrapper[4707]: E1127 16:04:32.227864 4707 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-27T16:04:32Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-27T16:04:32Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:32Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-27T16:04:32Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-27T16:04:32Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:32Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"3f35bf4c-cf6f-4914-b0bb-f9cc6056fcaf\\\",\\\"systemUUID\\\":\\\"dd7e61b5-f6f4-4240-af78-d8fe5d6daad3\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:32Z is after 2025-08-24T17:21:41Z"
Nov 27 16:04:32 crc kubenswrapper[4707]: I1127 16:04:32.235203 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 27 16:04:32 crc kubenswrapper[4707]: I1127 16:04:32.235276 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 27 16:04:32 crc kubenswrapper[4707]: I1127 16:04:32.235295 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 27 16:04:32 crc kubenswrapper[4707]: I1127 16:04:32.235326 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 27 16:04:32 crc kubenswrapper[4707]: I1127 16:04:32.235348 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:32Z","lastTransitionTime":"2025-11-27T16:04:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Nov 27 16:04:32 crc kubenswrapper[4707]: E1127 16:04:32.258205 4707 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-27T16:04:32Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-27T16:04:32Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:32Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-27T16:04:32Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-27T16:04:32Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:32Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"3f35bf4c-cf6f-4914-b0bb-f9cc6056fcaf\\\",\\\"systemUUID\\\":\\\"dd7e61b5-f6f4-4240-af78-d8fe5d6daad3\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:32Z is after 2025-08-24T17:21:41Z"
Nov 27 16:04:32 crc kubenswrapper[4707]: I1127 16:04:32.267592 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 27 16:04:32 crc kubenswrapper[4707]: I1127 16:04:32.267647 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 27 16:04:32 crc kubenswrapper[4707]: I1127 16:04:32.267664 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 27 16:04:32 crc kubenswrapper[4707]: I1127 16:04:32.267687 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 27 16:04:32 crc kubenswrapper[4707]: I1127 16:04:32.267701 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:32Z","lastTransitionTime":"2025-11-27T16:04:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Nov 27 16:04:32 crc kubenswrapper[4707]: E1127 16:04:32.288965 4707 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-27T16:04:32Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-27T16:04:32Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:32Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-27T16:04:32Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-27T16:04:32Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:32Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"3f35bf4c-cf6f-4914-b0bb-f9cc6056fcaf\\\",\\\"systemUUID\\\":\\\"dd7e61b5-f6f4-4240-af78-d8fe5d6daad3\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:32Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:32 crc kubenswrapper[4707]: I1127 16:04:32.295458 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:32 crc kubenswrapper[4707]: I1127 16:04:32.295582 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:32 crc kubenswrapper[4707]: I1127 16:04:32.295614 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:32 crc kubenswrapper[4707]: I1127 16:04:32.295653 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:32 crc kubenswrapper[4707]: I1127 16:04:32.295684 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:32Z","lastTransitionTime":"2025-11-27T16:04:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:04:32 crc kubenswrapper[4707]: E1127 16:04:32.323668 4707 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-27T16:04:32Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-27T16:04:32Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:32Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-27T16:04:32Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-27T16:04:32Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:32Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"3f35bf4c-cf6f-4914-b0bb-f9cc6056fcaf\\\",\\\"systemUUID\\\":\\\"dd7e61b5-f6f4-4240-af78-d8fe5d6daad3\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:32Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:32 crc kubenswrapper[4707]: I1127 16:04:32.330846 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:32 crc kubenswrapper[4707]: I1127 16:04:32.330921 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:32 crc kubenswrapper[4707]: I1127 16:04:32.330938 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:32 crc kubenswrapper[4707]: I1127 16:04:32.330960 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:32 crc kubenswrapper[4707]: I1127 16:04:32.330976 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:32Z","lastTransitionTime":"2025-11-27T16:04:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:04:32 crc kubenswrapper[4707]: E1127 16:04:32.352772 4707 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-27T16:04:32Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-27T16:04:32Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:32Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-27T16:04:32Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-27T16:04:32Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:32Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"3f35bf4c-cf6f-4914-b0bb-f9cc6056fcaf\\\",\\\"systemUUID\\\":\\\"dd7e61b5-f6f4-4240-af78-d8fe5d6daad3\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:32Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:32 crc kubenswrapper[4707]: E1127 16:04:32.353103 4707 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Nov 27 16:04:32 crc kubenswrapper[4707]: I1127 16:04:32.355302 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:32 crc kubenswrapper[4707]: I1127 16:04:32.355363 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:32 crc kubenswrapper[4707]: I1127 16:04:32.355414 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:32 crc kubenswrapper[4707]: I1127 16:04:32.355443 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:32 crc kubenswrapper[4707]: I1127 16:04:32.355462 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:32Z","lastTransitionTime":"2025-11-27T16:04:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:04:32 crc kubenswrapper[4707]: I1127 16:04:32.460358 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:32 crc kubenswrapper[4707]: I1127 16:04:32.460456 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:32 crc kubenswrapper[4707]: I1127 16:04:32.460478 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:32 crc kubenswrapper[4707]: I1127 16:04:32.460504 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:32 crc kubenswrapper[4707]: I1127 16:04:32.460517 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:32Z","lastTransitionTime":"2025-11-27T16:04:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:04:32 crc kubenswrapper[4707]: I1127 16:04:32.564278 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:32 crc kubenswrapper[4707]: I1127 16:04:32.564329 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:32 crc kubenswrapper[4707]: I1127 16:04:32.564339 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:32 crc kubenswrapper[4707]: I1127 16:04:32.564360 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:32 crc kubenswrapper[4707]: I1127 16:04:32.564394 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:32Z","lastTransitionTime":"2025-11-27T16:04:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:04:32 crc kubenswrapper[4707]: I1127 16:04:32.667487 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:32 crc kubenswrapper[4707]: I1127 16:04:32.667544 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:32 crc kubenswrapper[4707]: I1127 16:04:32.667562 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:32 crc kubenswrapper[4707]: I1127 16:04:32.667586 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:32 crc kubenswrapper[4707]: I1127 16:04:32.667605 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:32Z","lastTransitionTime":"2025-11-27T16:04:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:04:32 crc kubenswrapper[4707]: I1127 16:04:32.712555 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xkmt7_55af9c67-18ce-46f1-a761-d11ce16f42d6/ovnkube-controller/1.log" Nov 27 16:04:32 crc kubenswrapper[4707]: I1127 16:04:32.719060 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xkmt7" event={"ID":"55af9c67-18ce-46f1-a761-d11ce16f42d6","Type":"ContainerStarted","Data":"6b6a6c3ac88a829633b920185df5c454e35c8447a7b1a9af096ab26e4fcabba5"} Nov 27 16:04:32 crc kubenswrapper[4707]: I1127 16:04:32.719838 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-xkmt7" Nov 27 16:04:32 crc kubenswrapper[4707]: I1127 16:04:32.739505 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7f481770-a9a1-422a-a707-e841a24a1503\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f0e4a5fa520e0dc048bef42e48e14bd06eb72586ca2b8e145734aa5450aa869d\\\",\\\"image\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://36f6a5537d5031d21ea157fbcedea40756eac2c44aa1859c0cfbf3dc9a9d9a13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c7144aa332827972c3ff984eb05b835b88c4d9d05552ac631fbca49f13e193a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert
-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8211f74a0c0d16257330ebbb5727247d4229848595af3cb65f1cb752b05f3c37\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:03:45Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:32Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:32 crc kubenswrapper[4707]: I1127 16:04:32.758617 4707 status_manager.go:875] 
"Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:05Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:32Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:32 crc kubenswrapper[4707]: I1127 16:04:32.770236 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:32 crc kubenswrapper[4707]: I1127 16:04:32.770290 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:32 crc kubenswrapper[4707]: I1127 16:04:32.770310 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:32 crc kubenswrapper[4707]: I1127 16:04:32.770341 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:32 crc kubenswrapper[4707]: I1127 16:04:32.770362 4707 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:32Z","lastTransitionTime":"2025-11-27T16:04:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:04:32 crc kubenswrapper[4707]: I1127 16:04:32.786103 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-bhmsc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d1ec526b-6fb0-4c87-bd87-6aaf843e0c78\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f3b764fd994edba08f5e909683bd065ecf09fb7c9906457e37bb91343aad14a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\
\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kvs85\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:04:06Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-bhmsc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:32Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:32 crc kubenswrapper[4707]: I1127 16:04:32.832299 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-5x488" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"433a450e-2371-4893-b21e-19707b40e28f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://12b178a129ce5efbfdb77d023236ed9d674d0aba3c3a2903589aae995a5a753c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52jnc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://35888508a810deea27c61cbf1ec7362be9f61
f99a63d4ab2b4c296a9ed47c65f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52jnc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:04:19Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-5x488\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:32Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:32 crc kubenswrapper[4707]: I1127 16:04:32.850264 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:05Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:32Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:32 crc kubenswrapper[4707]: I1127 16:04:32.869175 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://57225b64e7671f397832e9ba112febbde78ae4c24b37a8f097d4bdeda8931c0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6960cfa0c6d0dedc432f90bf267ce02fe8264924a43865f79eadd9affbe955f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:32Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:32 crc kubenswrapper[4707]: I1127 16:04:32.872983 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:32 crc kubenswrapper[4707]: I1127 16:04:32.873037 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:32 crc kubenswrapper[4707]: I1127 16:04:32.873054 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:32 crc kubenswrapper[4707]: I1127 16:04:32.873082 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:32 crc kubenswrapper[4707]: I1127 16:04:32.873097 4707 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:32Z","lastTransitionTime":"2025-11-27T16:04:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:04:32 crc kubenswrapper[4707]: I1127 16:04:32.881705 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6cf743e8776aa2de4138704d9e163af8baf7b8e5db139c13775673cf32a1ee88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\
"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:32Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:32 crc kubenswrapper[4707]: I1127 16:04:32.897044 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-9c4xg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37bcfa90-e50b-4a63-96b2-cd8a0ce5bbf7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://66dcf851b2fb739c3dae214852a24405d2c8b4462dde998903b9dfe7c9e61e00\\\",\\\"image\\\":\\\"quay.io/openshift-rele
ase-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hwzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8cd7c5e0c832499cb9ddc7b18c7d260de06df34208f86284ebe77964bfcf1c12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8cd7c5e0c832499cb9ddc7b18c7d260de06df34208f86284ebe77964bfcf1c12\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:04:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\
\":\\\"kube-api-access-7hwzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4438fd6ab08cd53dff20617ccadaa81e4bddbd7ced310f190d02613079f96f64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4438fd6ab08cd53dff20617ccadaa81e4bddbd7ced310f190d02613079f96f64\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:04:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:04:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hwzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19a31a44d1ecfee004f2a705df22a188782cd9f235848ddf5f48bd740833c6a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\
\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://19a31a44d1ecfee004f2a705df22a188782cd9f235848ddf5f48bd740833c6a8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:04:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:04:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hwzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://db8a59276d18141805bc74721d625e2004f2cea1f47efc90b3290c68eba2a35b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://db8a59276d18141805bc74721d625e2004f2cea1f47efc90b3290c68eba2a35b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:04:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:04:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run
/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hwzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://21efa1504e98747329f8a9220126ea259edde61bd4d272ea55691e9952950252\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://21efa1504e98747329f8a9220126ea259edde61bd4d272ea55691e9952950252\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:04:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:04:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hwzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://851c05dbd0947990d464d1bd72d62577bdf12517aea808904aa01fc79fd2d7c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{
\\\"containerID\\\":\\\"cri-o://851c05dbd0947990d464d1bd72d62577bdf12517aea808904aa01fc79fd2d7c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:04:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:04:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hwzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:04:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-9c4xg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:32Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:32 crc kubenswrapper[4707]: I1127 16:04:32.929875 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xkmt7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"55af9c67-18ce-46f1-a761-d11ce16f42d6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a4eadca4b464dec02ca7aba6bc23d95b9c3965294d4b0224299279c630f3718\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6xmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3718d2f738ef780c738566c163d52afe355de6b8eb6004aa8e1dd5f37aa84d68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6xmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://44699a190cf1065e537f588d7479a10aab8fd4ffd1df2348c616d999ffdf7077\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6xmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6432d53914cee9e016c2134d99e48fa9ff1c05c0b88033992c2e0c70fc40d2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:07Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6xmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://93825fe5d9cf2d63c3e723b063f417f7e92442b14365852f935a06d316f443fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6xmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ae3fab13b078702f87ca8c586c9592c4554c95ead662d81ca65b50f368b6dcb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6xmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b6a6c3ac88a829633b920185df5c454e35c8447a7b1a9af096ab26e4fcabba5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://def6189807ad0285980b2859cdfe683dd5c8f7c79104d57abeef37450f73655d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-27T16:04:17Z\\\",\\\"message\\\":\\\"lversions/factory.go:140\\\\nI1127 16:04:17.841915 6098 reflector.go:311] Stopping reflector *v1.EgressService (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressservice/v1/apis/informers/externalversions/factory.go:140\\\\nI1127 16:04:17.841959 6098 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1127 
16:04:17.842031 6098 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1127 16:04:17.842064 6098 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1127 16:04:17.842086 6098 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1127 16:04:17.842164 6098 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1127 16:04:17.842205 6098 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1127 16:04:17.842258 6098 factory.go:656] Stopping watch factory\\\\nI1127 16:04:17.842299 6098 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1127 16:04:17.842297 6098 handler.go:208] Removed *v1.Node event handler 2\\\\nI1127 16:04:17.842315 6098 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1127 16:04:17.842332 6098 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1127 16:04:17.842334 6098 handler.go:208] Removed *v1.Node event handler 7\\\\nI1127 16:04:17.842341 6098 handler.go:208] Removed *v1.Namespace 
ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-27T16:04:16Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\
":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6xmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://db3362aeb65b368d2d9181cc898270fffb60938d0e9d32a862130d06a9f572a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6xmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c327fee553e5575dec76bc4456bc59424c89a9eea360cd4b23b594eff4c67a8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\
\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c327fee553e5575dec76bc4456bc59424c89a9eea360cd4b23b594eff4c67a8d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:04:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6xmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:04:06Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-xkmt7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:32Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:32 crc kubenswrapper[4707]: I1127 16:04:32.943021 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-pdngz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5a0b58a3-9ac2-4446-b507-88eba42aa060\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://feac173e7b0d6e3cb53bd8d407d3d08e59e099a63702a377b7515283f85703ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6sq6r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:04:20Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-pdngz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:32Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:32 crc kubenswrapper[4707]: I1127 16:04:32.958779 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-qcl5k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d382481-3c1e-49ed-8e27-265d495aa776\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:21Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:21Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j2krv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j2krv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:04:21Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-qcl5k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:32Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:32 crc 
kubenswrapper[4707]: I1127 16:04:32.972288 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b00fa8691d479d1db194704ff73bb85f1108c9541dd09ed4c1b02d0a3f0ab53d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:32Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:32 crc kubenswrapper[4707]: I1127 16:04:32.976667 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:32 crc kubenswrapper[4707]: I1127 16:04:32.976745 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:32 crc kubenswrapper[4707]: I1127 16:04:32.976767 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:32 crc kubenswrapper[4707]: I1127 16:04:32.976795 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:32 crc kubenswrapper[4707]: I1127 16:04:32.976816 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:32Z","lastTransitionTime":"2025-11-27T16:04:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:04:32 crc kubenswrapper[4707]: I1127 16:04:32.990997 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-js6mm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ca48c08-f39d-41a2-847a-c893a2111492\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://be6a704ee152614d36d4831778b016c009b0aab2d6cf1c2ec9767f8ecdeb7af6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bjt58\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:04:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-js6mm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:32Z 
is after 2025-08-24T17:21:41Z" Nov 27 16:04:33 crc kubenswrapper[4707]: I1127 16:04:33.010397 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"59e09a52-f64d-41da-a23b-7d3e34e8f109\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://489349489e52f4a113203c0c196cfe1ef96e97c6fc3c3d777a28e90f22aaee92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\
\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ea14f8eb17dbb350f7c0ad2285b4bb5ca2d8b2aae6840646d68a397ac2043d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://775fc674c0dab97e0c5a2548d467b973c592458dad1b88ce92344486a3a28eb5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f502a2429578531cb3ebbfde6bb4c049eed201a496ff3772c06ca28cace89b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://30b52b26bdca5b0705186872e6dd2eba3ece6436e866a4a5e30f4c2ecdbbaf20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b6be834802616cdf10f8b694bb0ce1b8828d0885e8801c5d890e74837d672a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\
\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b6be834802616cdf10f8b694bb0ce1b8828d0885e8801c5d890e74837d672a8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:03:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://de1e5852273af33e41f21c4df29c3e32cbfd6b2eb200b64d9e5353a85718074c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://de1e5852273af33e41f21c4df29c3e32cbfd6b2eb200b64d9e5353a85718074c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:03:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:03:47Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://c6f7eb0e4b78b6d2e316bd2c8f2ad3081c8eb0ca3c55a7063ccf281572ba2992\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c6f7eb0e4b78b6d2e316bd2c8f2ad3081c8eb0ca3c55a7063ccf281572ba2992\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:03:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11
-27T16:03:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:03:45Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:33Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:33 crc kubenswrapper[4707]: I1127 16:04:33.027678 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f300806f-b97e-4453-b011-19442ca1240d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a25770e0e3eebd7e807f3775101a1cd5d47073e4a86850da475fecd4a9c88ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7412dcdb5a249551f558a01a8fecef5f873e5e0a16e917d5cbc1ca74a917f228\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://791f332aec1698d02da5ec585ad2fbfcb77529f24abc0e3a77581be38d7ce277\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a47eb0592819f0bc514399e163c88e2c63111aa0ec4a28f00f809ce1bdf2aa8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f12d0d89698cd6f702c595882dc7ed97fbc585b8a4ef3a8b3755c984a823dd9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-27T16:04:05Z\\\"
,\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1127 16:03:58.832133 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1127 16:03:58.833970 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3721107874/tls.crt::/tmp/serving-cert-3721107874/tls.key\\\\\\\"\\\\nI1127 16:04:05.094602 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1127 16:04:05.099813 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1127 16:04:05.099850 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1127 16:04:05.099879 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1127 16:04:05.099886 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1127 16:04:05.108199 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1127 16:04:05.108225 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1127 16:04:05.108231 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1127 16:04:05.108269 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1127 16:04:05.108276 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1127 16:04:05.108281 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1127 16:04:05.108284 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1127 16:04:05.108289 1 secure_serving.go:69] Use 
of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1127 16:04:05.114674 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-27T16:03:48Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf16f5508845d69184573b51f3052f79123556667cbd3ddf4b2adfb135312a3a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:48Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://608ad63f8f5352dad650253df603590a1cc20bc05d86dd561d659295264ab4b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://608ad63f8f5352dad650253df603590a
1cc20bc05d86dd561d659295264ab4b5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:03:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:03:45Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:33Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:33 crc kubenswrapper[4707]: I1127 16:04:33.046537 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:33Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:33 crc kubenswrapper[4707]: I1127 16:04:33.058284 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-c995m" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a83beb0d-8dd1-434a-ace2-933f98e3956f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://419d9b9f3f298028cf5f49afd438f1de90b8d1203cdfb09b924878b59e0fb723\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5xtzp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6227d8aa794cce73ef7607c3e6cf8f853c829bf5
6493acf04c0b555ddd264ba5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5xtzp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:04:06Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-c995m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:33Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:33 crc kubenswrapper[4707]: I1127 16:04:33.079190 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:33 crc kubenswrapper[4707]: I1127 16:04:33.079231 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:33 crc kubenswrapper[4707]: I1127 16:04:33.079258 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:33 crc 
kubenswrapper[4707]: I1127 16:04:33.079280 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:33 crc kubenswrapper[4707]: I1127 16:04:33.079293 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:33Z","lastTransitionTime":"2025-11-27T16:04:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:04:33 crc kubenswrapper[4707]: I1127 16:04:33.182627 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:33 crc kubenswrapper[4707]: I1127 16:04:33.182678 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:33 crc kubenswrapper[4707]: I1127 16:04:33.182689 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:33 crc kubenswrapper[4707]: I1127 16:04:33.182710 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:33 crc kubenswrapper[4707]: I1127 16:04:33.182722 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:33Z","lastTransitionTime":"2025-11-27T16:04:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:04:33 crc kubenswrapper[4707]: I1127 16:04:33.194603 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-qcl5k" Nov 27 16:04:33 crc kubenswrapper[4707]: E1127 16:04:33.194731 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qcl5k" podUID="7d382481-3c1e-49ed-8e27-265d495aa776" Nov 27 16:04:33 crc kubenswrapper[4707]: I1127 16:04:33.194603 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 27 16:04:33 crc kubenswrapper[4707]: I1127 16:04:33.194770 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 27 16:04:33 crc kubenswrapper[4707]: I1127 16:04:33.194867 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 27 16:04:33 crc kubenswrapper[4707]: E1127 16:04:33.194947 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 27 16:04:33 crc kubenswrapper[4707]: E1127 16:04:33.194904 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 27 16:04:33 crc kubenswrapper[4707]: E1127 16:04:33.195129 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 27 16:04:33 crc kubenswrapper[4707]: I1127 16:04:33.285644 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:33 crc kubenswrapper[4707]: I1127 16:04:33.285700 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:33 crc kubenswrapper[4707]: I1127 16:04:33.285713 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:33 crc kubenswrapper[4707]: I1127 16:04:33.285735 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:33 crc kubenswrapper[4707]: I1127 16:04:33.285749 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:33Z","lastTransitionTime":"2025-11-27T16:04:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:04:33 crc kubenswrapper[4707]: I1127 16:04:33.388533 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:33 crc kubenswrapper[4707]: I1127 16:04:33.388579 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:33 crc kubenswrapper[4707]: I1127 16:04:33.388593 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:33 crc kubenswrapper[4707]: I1127 16:04:33.388613 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:33 crc kubenswrapper[4707]: I1127 16:04:33.388630 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:33Z","lastTransitionTime":"2025-11-27T16:04:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:04:33 crc kubenswrapper[4707]: I1127 16:04:33.491287 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:33 crc kubenswrapper[4707]: I1127 16:04:33.491421 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:33 crc kubenswrapper[4707]: I1127 16:04:33.491449 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:33 crc kubenswrapper[4707]: I1127 16:04:33.491490 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:33 crc kubenswrapper[4707]: I1127 16:04:33.491861 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:33Z","lastTransitionTime":"2025-11-27T16:04:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:04:33 crc kubenswrapper[4707]: I1127 16:04:33.594135 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:33 crc kubenswrapper[4707]: I1127 16:04:33.594192 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:33 crc kubenswrapper[4707]: I1127 16:04:33.594209 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:33 crc kubenswrapper[4707]: I1127 16:04:33.594232 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:33 crc kubenswrapper[4707]: I1127 16:04:33.594248 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:33Z","lastTransitionTime":"2025-11-27T16:04:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:04:33 crc kubenswrapper[4707]: I1127 16:04:33.697470 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:33 crc kubenswrapper[4707]: I1127 16:04:33.697538 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:33 crc kubenswrapper[4707]: I1127 16:04:33.697560 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:33 crc kubenswrapper[4707]: I1127 16:04:33.697588 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:33 crc kubenswrapper[4707]: I1127 16:04:33.697609 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:33Z","lastTransitionTime":"2025-11-27T16:04:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:04:33 crc kubenswrapper[4707]: I1127 16:04:33.726338 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xkmt7_55af9c67-18ce-46f1-a761-d11ce16f42d6/ovnkube-controller/2.log" Nov 27 16:04:33 crc kubenswrapper[4707]: I1127 16:04:33.727361 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xkmt7_55af9c67-18ce-46f1-a761-d11ce16f42d6/ovnkube-controller/1.log" Nov 27 16:04:33 crc kubenswrapper[4707]: I1127 16:04:33.732796 4707 generic.go:334] "Generic (PLEG): container finished" podID="55af9c67-18ce-46f1-a761-d11ce16f42d6" containerID="6b6a6c3ac88a829633b920185df5c454e35c8447a7b1a9af096ab26e4fcabba5" exitCode=1 Nov 27 16:04:33 crc kubenswrapper[4707]: I1127 16:04:33.732858 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xkmt7" event={"ID":"55af9c67-18ce-46f1-a761-d11ce16f42d6","Type":"ContainerDied","Data":"6b6a6c3ac88a829633b920185df5c454e35c8447a7b1a9af096ab26e4fcabba5"} Nov 27 16:04:33 crc kubenswrapper[4707]: I1127 16:04:33.732917 4707 scope.go:117] "RemoveContainer" containerID="def6189807ad0285980b2859cdfe683dd5c8f7c79104d57abeef37450f73655d" Nov 27 16:04:33 crc kubenswrapper[4707]: I1127 16:04:33.734175 4707 scope.go:117] "RemoveContainer" containerID="6b6a6c3ac88a829633b920185df5c454e35c8447a7b1a9af096ab26e4fcabba5" Nov 27 16:04:33 crc kubenswrapper[4707]: E1127 16:04:33.734503 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-xkmt7_openshift-ovn-kubernetes(55af9c67-18ce-46f1-a761-d11ce16f42d6)\"" pod="openshift-ovn-kubernetes/ovnkube-node-xkmt7" podUID="55af9c67-18ce-46f1-a761-d11ce16f42d6" Nov 27 16:04:33 crc kubenswrapper[4707]: I1127 16:04:33.765301 4707 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:05Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:33Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:33 crc kubenswrapper[4707]: I1127 16:04:33.785786 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-bhmsc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d1ec526b-6fb0-4c87-bd87-6aaf843e0c78\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f3b764fd994edba08f5e909683bd065ecf09fb7c9906457e37bb91343aad14a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kvs85\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:04:06Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-bhmsc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:33Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:33 crc kubenswrapper[4707]: I1127 16:04:33.801507 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:33 crc kubenswrapper[4707]: I1127 16:04:33.801562 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:33 crc kubenswrapper[4707]: I1127 16:04:33.801579 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:33 crc kubenswrapper[4707]: I1127 16:04:33.801599 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:33 crc kubenswrapper[4707]: I1127 16:04:33.801614 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:33Z","lastTransitionTime":"2025-11-27T16:04:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:04:33 crc kubenswrapper[4707]: I1127 16:04:33.811547 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7f481770-a9a1-422a-a707-e841a24a1503\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f0e4a5fa520e0dc048bef42e48e14bd06eb72586ca2b8e145734aa5450aa869d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://36f6a5537d5
031d21ea157fbcedea40756eac2c44aa1859c0cfbf3dc9a9d9a13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c7144aa332827972c3ff984eb05b835b88c4d9d05552ac631fbca49f13e193a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8211f74a0c0d16257330ebbb5727247d4229848595af3cb65f1cb752b05f3c37\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:03:45Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:33Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:33 crc kubenswrapper[4707]: I1127 16:04:33.834397 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:05Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:33Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:33 crc kubenswrapper[4707]: I1127 16:04:33.854421 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://57225b64e7671f397832e9ba112febbde78ae4c24b37a8f097d4bdeda8931c0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6960cfa0c6d0dedc432f90bf267ce02fe8264924a43865f79eadd9affbe955f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:33Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:33 crc kubenswrapper[4707]: I1127 16:04:33.875808 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6cf743e8776aa2de4138704d9e163af8baf7b8e5db139c13775673cf32a1ee88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-11-27T16:04:33Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:33 crc kubenswrapper[4707]: I1127 16:04:33.903589 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-9c4xg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37bcfa90-e50b-4a63-96b2-cd8a0ce5bbf7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://66dcf851b2fb739c3dae214852a24405d2c8b4462dde998903b9dfe7c9e61e00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-a
ccess-7hwzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8cd7c5e0c832499cb9ddc7b18c7d260de06df34208f86284ebe77964bfcf1c12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8cd7c5e0c832499cb9ddc7b18c7d260de06df34208f86284ebe77964bfcf1c12\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:04:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hwzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4438fd6ab08cd53dff20617ccadaa81e4bddbd7ced310f190d02613079f96f64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"
started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4438fd6ab08cd53dff20617ccadaa81e4bddbd7ced310f190d02613079f96f64\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:04:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:04:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hwzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19a31a44d1ecfee004f2a705df22a188782cd9f235848ddf5f48bd740833c6a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://19a31a44d1ecfee004f2a705df22a188782cd9f235848ddf5f48bd740833c6a8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:04:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:04:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\
\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hwzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://db8a59276d18141805bc74721d625e2004f2cea1f47efc90b3290c68eba2a35b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://db8a59276d18141805bc74721d625e2004f2cea1f47efc90b3290c68eba2a35b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:04:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:04:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hwzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://21efa1504e98747329f8a9220126ea259edde61bd4d272ea55691e9952950252\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabout
s-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://21efa1504e98747329f8a9220126ea259edde61bd4d272ea55691e9952950252\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:04:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:04:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hwzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://851c05dbd0947990d464d1bd72d62577bdf12517aea808904aa01fc79fd2d7c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://851c05dbd0947990d464d1bd72d62577bdf12517aea808904aa01fc79fd2d7c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:04:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:04:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hwzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:04:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-9c4xg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:33Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:33 crc kubenswrapper[4707]: I1127 16:04:33.905334 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:33 crc kubenswrapper[4707]: I1127 16:04:33.905438 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:33 crc kubenswrapper[4707]: I1127 16:04:33.905465 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:33 crc kubenswrapper[4707]: I1127 16:04:33.905502 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:33 crc kubenswrapper[4707]: I1127 16:04:33.905526 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:33Z","lastTransitionTime":"2025-11-27T16:04:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:04:33 crc kubenswrapper[4707]: I1127 16:04:33.939069 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xkmt7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"55af9c67-18ce-46f1-a761-d11ce16f42d6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a4eadca4b464dec02ca7aba6bc23d95b9c3965294d4b0224299279c630f3718\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6xmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3718d2f738ef780c738566c163d52afe355de6b8eb6004aa8e1dd5f37aa84d68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6xmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://44699a190cf1065e537f588d7479a10aab8fd4ffd1df2348c616d999ffdf7077\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6xmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6432d53914cee9e016c2134d99e48fa9ff1c05c0b88033992c2e0c70fc40d2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:07Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6xmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://93825fe5d9cf2d63c3e723b063f417f7e92442b14365852f935a06d316f443fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6xmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ae3fab13b078702f87ca8c586c9592c4554c95ead662d81ca65b50f368b6dcb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6xmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b6a6c3ac88a829633b920185df5c454e35c8447a7b1a9af096ab26e4fcabba5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://def6189807ad0285980b2859cdfe683dd5c8f7c79104d57abeef37450f73655d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-27T16:04:17Z\\\",\\\"message\\\":\\\"lversions/factory.go:140\\\\nI1127 16:04:17.841915 6098 reflector.go:311] Stopping reflector *v1.EgressService (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressservice/v1/apis/informers/externalversions/factory.go:140\\\\nI1127 16:04:17.841959 6098 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1127 
16:04:17.842031 6098 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1127 16:04:17.842064 6098 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1127 16:04:17.842086 6098 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1127 16:04:17.842164 6098 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1127 16:04:17.842205 6098 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1127 16:04:17.842258 6098 factory.go:656] Stopping watch factory\\\\nI1127 16:04:17.842299 6098 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1127 16:04:17.842297 6098 handler.go:208] Removed *v1.Node event handler 2\\\\nI1127 16:04:17.842315 6098 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1127 16:04:17.842332 6098 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1127 16:04:17.842334 6098 handler.go:208] Removed *v1.Node event handler 7\\\\nI1127 16:04:17.842341 6098 handler.go:208] Removed *v1.Namespace ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-27T16:04:16Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b6a6c3ac88a829633b920185df5c454e35c8447a7b1a9af096ab26e4fcabba5\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-27T16:04:33Z\\\",\\\"message\\\":\\\"oSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.167:8441: 10.217.4.167:8442: 10.217.4.167:8444:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {62af83f3-e0c8-4632-aaaa-17488566a9d8}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1127 16:04:33.229791 6338 ovn.go:134] Ensuring zone local for Pod openshift-machine-config-operator/machine-config-daemon-c995m in node crc\\\\nI1127 16:04:33.233295 6338 obj_retry.go:386] Retry 
successful for *v1.Pod openshift-machine-config-operator/machine-config-daemon-c995m after 0 failed attempt(s)\\\\nI1127 16:04:33.233305 6338 default_network_controller.go:776] Recording success event on pod openshift-machine-config-operator/machine-config-daemon-c995m\\\\nI1127 16:04:33.229689 6338 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-diagnostics/network-check-source-55646444c4-trplf\\\\nI1127 16:04:33.233329 6338 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-diagnostics/network-check-source-55646444c4-trplf\\\\nF1127 16:04:33.229645 6338 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-27T16:04:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"
host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6xmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://db3362aeb65b368d2d9181cc898270fffb60938d0e9d32a862130d06a9f572a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\"
,\\\"name\\\":\\\"kube-api-access-p6xmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c327fee553e5575dec76bc4456bc59424c89a9eea360cd4b23b594eff4c67a8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c327fee553e5575dec76bc4456bc59424c89a9eea360cd4b23b594eff4c67a8d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:04:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6xmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:04:06Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-xkmt7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:33Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:33 crc kubenswrapper[4707]: I1127 16:04:33.960563 4707 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-5x488" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"433a450e-2371-4893-b21e-19707b40e28f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://12b178a129ce5efbfdb77d023236ed9d674d0aba3c3a2903589aae995a5a753c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52jnc\\\",\\\"readOnly\\\":true,\\\"recursiveReadO
nly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://35888508a810deea27c61cbf1ec7362be9f61f99a63d4ab2b4c296a9ed47c65f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52jnc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:04:19Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-5x488\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:33Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:33 crc kubenswrapper[4707]: I1127 16:04:33.979097 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-qcl5k" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d382481-3c1e-49ed-8e27-265d495aa776\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:21Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:21Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j2krv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j2krv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:04:21Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-qcl5k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:33Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:34 crc 
kubenswrapper[4707]: I1127 16:04:34.008001 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-pdngz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5a0b58a3-9ac2-4446-b507-88eba42aa060\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://feac173e7b0d6e3cb53bd8d407d3d08e59e099a63702a377b7515283f85703ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6
sq6r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:04:20Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-pdngz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:33Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:34 crc kubenswrapper[4707]: I1127 16:04:34.009862 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:34 crc kubenswrapper[4707]: I1127 16:04:34.009908 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:34 crc kubenswrapper[4707]: I1127 16:04:34.009924 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:34 crc kubenswrapper[4707]: I1127 16:04:34.009949 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:34 crc kubenswrapper[4707]: I1127 16:04:34.009969 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:34Z","lastTransitionTime":"2025-11-27T16:04:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:04:34 crc kubenswrapper[4707]: I1127 16:04:34.033043 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-js6mm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ca48c08-f39d-41a2-847a-c893a2111492\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://be6a704ee152614d36d4831778b016c009b0aab2d6cf1c2ec9767f8ecdeb7af6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bjt58\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:04:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-js6mm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:34Z 
is after 2025-08-24T17:21:41Z" Nov 27 16:04:34 crc kubenswrapper[4707]: I1127 16:04:34.055691 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b00fa8691d479d1db194704ff73bb85f1108c9541dd09ed4c1b02d0a3f0ab53d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:34Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:34 crc kubenswrapper[4707]: I1127 16:04:34.075880 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f300806f-b97e-4453-b011-19442ca1240d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a25770e0e3eebd7e807f3775101a1cd5d47073e4a86850da475fecd4a9c88ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"
running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7412dcdb5a249551f558a01a8fecef5f873e5e0a16e917d5cbc1ca74a917f228\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://791f332aec1698d02da5ec585ad2fbfcb77529f24abc0e3a77581be38d7ce277\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://
4a47eb0592819f0bc514399e163c88e2c63111aa0ec4a28f00f809ce1bdf2aa8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f12d0d89698cd6f702c595882dc7ed97fbc585b8a4ef3a8b3755c984a823dd9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-27T16:04:05Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1127 16:03:58.832133 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1127 16:03:58.833970 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3721107874/tls.crt::/tmp/serving-cert-3721107874/tls.key\\\\\\\"\\\\nI1127 16:04:05.094602 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1127 16:04:05.099813 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1127 16:04:05.099850 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1127 16:04:05.099879 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1127 16:04:05.099886 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1127 16:04:05.108199 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1127 16:04:05.108225 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1127 16:04:05.108231 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1127 16:04:05.108269 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1127 16:04:05.108276 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1127 16:04:05.108281 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1127 16:04:05.108284 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1127 16:04:05.108289 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1127 16:04:05.114674 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-27T16:03:48Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf16f5508845d69184573b51f3052f79123556667cbd3ddf4b2adfb135312a3a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:48Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\
"containerID\\\":\\\"cri-o://608ad63f8f5352dad650253df603590a1cc20bc05d86dd561d659295264ab4b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://608ad63f8f5352dad650253df603590a1cc20bc05d86dd561d659295264ab4b5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:03:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:03:45Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:34Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:34 crc kubenswrapper[4707]: I1127 16:04:34.098861 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:34Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:34 crc kubenswrapper[4707]: I1127 16:04:34.113603 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:34 crc kubenswrapper[4707]: I1127 16:04:34.113662 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:34 crc kubenswrapper[4707]: I1127 16:04:34.113682 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:34 crc kubenswrapper[4707]: I1127 16:04:34.113710 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:34 crc kubenswrapper[4707]: I1127 16:04:34.113728 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:34Z","lastTransitionTime":"2025-11-27T16:04:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:04:34 crc kubenswrapper[4707]: I1127 16:04:34.117250 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-c995m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a83beb0d-8dd1-434a-ace2-933f98e3956f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://419d9b9f3f298028cf5f49afd438f1de90b8d1203cdfb09b924878b59e0fb723\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\
\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5xtzp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6227d8aa794cce73ef7607c3e6cf8f853c829bf56493acf04c0b555ddd264ba5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5xtzp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:04:06Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-c995m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:34Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:34 crc kubenswrapper[4707]: I1127 16:04:34.150495 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"59e09a52-f64d-41da-a23b-7d3e34e8f109\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://489349489e52f4a113203c0c196cfe1ef96e97c6fc3c3d777a28e90f22aaee92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ea14f8eb17dbb350f7c0ad2285b4bb5ca2d8b2aae6840646d68a397ac2043d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://775fc674c0dab97e0c5a2548d467b973c592458dad1b88ce92344486a3a28eb5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f502a2429578531cb3ebbfde6bb4c049eed201a496ff3772c06ca28cace89b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://30b52b26bdca5b0705186872e6dd2eba3ece6436e866a4a5e30f4c2ecdbbaf20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b6be834802616cdf10f8b694bb0ce1b8828d0885e8801c5d890e74837d672a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b6be834802616cdf10f8b694bb0ce1b8828d0885e8801c5d890e74837d672a8\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-11-27T16:03:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://de1e5852273af33e41f21c4df29c3e32cbfd6b2eb200b64d9e5353a85718074c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://de1e5852273af33e41f21c4df29c3e32cbfd6b2eb200b64d9e5353a85718074c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:03:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:03:47Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://c6f7eb0e4b78b6d2e316bd2c8f2ad3081c8eb0ca3c55a7063ccf281572ba2992\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c6f7eb0e4b78b6d2e316bd2c8f2ad3081c8eb0ca3c55a7063ccf281572ba2992\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:03:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:03:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:03:45Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:34Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:34 crc kubenswrapper[4707]: I1127 16:04:34.216208 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:34 crc kubenswrapper[4707]: I1127 16:04:34.216290 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:34 crc kubenswrapper[4707]: I1127 16:04:34.216316 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:34 crc kubenswrapper[4707]: I1127 16:04:34.216347 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:34 crc kubenswrapper[4707]: I1127 16:04:34.216425 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:34Z","lastTransitionTime":"2025-11-27T16:04:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:04:34 crc kubenswrapper[4707]: I1127 16:04:34.319877 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:34 crc kubenswrapper[4707]: I1127 16:04:34.319942 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:34 crc kubenswrapper[4707]: I1127 16:04:34.319964 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:34 crc kubenswrapper[4707]: I1127 16:04:34.319998 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:34 crc kubenswrapper[4707]: I1127 16:04:34.320020 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:34Z","lastTransitionTime":"2025-11-27T16:04:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:04:34 crc kubenswrapper[4707]: I1127 16:04:34.423666 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:34 crc kubenswrapper[4707]: I1127 16:04:34.423715 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:34 crc kubenswrapper[4707]: I1127 16:04:34.423732 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:34 crc kubenswrapper[4707]: I1127 16:04:34.423753 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:34 crc kubenswrapper[4707]: I1127 16:04:34.423771 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:34Z","lastTransitionTime":"2025-11-27T16:04:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:04:34 crc kubenswrapper[4707]: I1127 16:04:34.527223 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:34 crc kubenswrapper[4707]: I1127 16:04:34.527613 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:34 crc kubenswrapper[4707]: I1127 16:04:34.527806 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:34 crc kubenswrapper[4707]: I1127 16:04:34.527973 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:34 crc kubenswrapper[4707]: I1127 16:04:34.528101 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:34Z","lastTransitionTime":"2025-11-27T16:04:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:04:34 crc kubenswrapper[4707]: I1127 16:04:34.630944 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:34 crc kubenswrapper[4707]: I1127 16:04:34.630996 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:34 crc kubenswrapper[4707]: I1127 16:04:34.631016 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:34 crc kubenswrapper[4707]: I1127 16:04:34.631042 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:34 crc kubenswrapper[4707]: I1127 16:04:34.631078 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:34Z","lastTransitionTime":"2025-11-27T16:04:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:04:34 crc kubenswrapper[4707]: I1127 16:04:34.734784 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:34 crc kubenswrapper[4707]: I1127 16:04:34.734858 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:34 crc kubenswrapper[4707]: I1127 16:04:34.734883 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:34 crc kubenswrapper[4707]: I1127 16:04:34.734918 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:34 crc kubenswrapper[4707]: I1127 16:04:34.734938 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:34Z","lastTransitionTime":"2025-11-27T16:04:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:04:34 crc kubenswrapper[4707]: I1127 16:04:34.739499 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xkmt7_55af9c67-18ce-46f1-a761-d11ce16f42d6/ovnkube-controller/2.log" Nov 27 16:04:34 crc kubenswrapper[4707]: I1127 16:04:34.745172 4707 scope.go:117] "RemoveContainer" containerID="6b6a6c3ac88a829633b920185df5c454e35c8447a7b1a9af096ab26e4fcabba5" Nov 27 16:04:34 crc kubenswrapper[4707]: E1127 16:04:34.745502 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-xkmt7_openshift-ovn-kubernetes(55af9c67-18ce-46f1-a761-d11ce16f42d6)\"" pod="openshift-ovn-kubernetes/ovnkube-node-xkmt7" podUID="55af9c67-18ce-46f1-a761-d11ce16f42d6" Nov 27 16:04:34 crc kubenswrapper[4707]: I1127 16:04:34.766026 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7f481770-a9a1-422a-a707-e841a24a1503\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f0e4a5fa520e0dc048bef42e48e14bd06eb72586ca2b8e145734aa5450aa869d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://36f6a5537d5031d21ea157fbcedea40756eac2c44aa1859c0cfbf3dc9a9d9a13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c7144aa332827972c3ff984eb05b835b88c4d9d05552ac631fbca49f13e193a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8211f74a0c0d16257330ebbb5727247d4229848595af3cb65f1cb752b05f3c37\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-11-27T16:03:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:03:45Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:34Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:34 crc kubenswrapper[4707]: I1127 16:04:34.788886 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:34Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:34 crc kubenswrapper[4707]: I1127 16:04:34.807228 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-bhmsc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d1ec526b-6fb0-4c87-bd87-6aaf843e0c78\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f3b764fd994edba08f5e909683bd065ecf09fb7c9906457e37bb91343aad14a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kvs85\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:04:06Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-bhmsc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:34Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:34 crc kubenswrapper[4707]: I1127 16:04:34.829124 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:05Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:34Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:34 crc kubenswrapper[4707]: I1127 16:04:34.838699 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:34 crc kubenswrapper[4707]: I1127 16:04:34.838753 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:34 crc kubenswrapper[4707]: I1127 16:04:34.838771 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:34 crc kubenswrapper[4707]: I1127 16:04:34.838797 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:34 crc kubenswrapper[4707]: I1127 16:04:34.838816 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:34Z","lastTransitionTime":"2025-11-27T16:04:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:04:34 crc kubenswrapper[4707]: I1127 16:04:34.853344 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://57225b64e7671f397832e9ba112febbde78ae4c24b37a8f097d4bdeda8931c0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"conta
inerID\\\":\\\"cri-o://6960cfa0c6d0dedc432f90bf267ce02fe8264924a43865f79eadd9affbe955f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:34Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:34 crc kubenswrapper[4707]: I1127 16:04:34.873542 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6cf743e8776aa2de4138704d9e163af8baf7b8e5db139c13775673cf32a1ee88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-11-27T16:04:34Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:34 crc kubenswrapper[4707]: I1127 16:04:34.893795 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-9c4xg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37bcfa90-e50b-4a63-96b2-cd8a0ce5bbf7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://66dcf851b2fb739c3dae214852a24405d2c8b4462dde998903b9dfe7c9e61e00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-a
ccess-7hwzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8cd7c5e0c832499cb9ddc7b18c7d260de06df34208f86284ebe77964bfcf1c12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8cd7c5e0c832499cb9ddc7b18c7d260de06df34208f86284ebe77964bfcf1c12\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:04:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hwzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4438fd6ab08cd53dff20617ccadaa81e4bddbd7ced310f190d02613079f96f64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"
started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4438fd6ab08cd53dff20617ccadaa81e4bddbd7ced310f190d02613079f96f64\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:04:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:04:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hwzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19a31a44d1ecfee004f2a705df22a188782cd9f235848ddf5f48bd740833c6a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://19a31a44d1ecfee004f2a705df22a188782cd9f235848ddf5f48bd740833c6a8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:04:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:04:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\
\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hwzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://db8a59276d18141805bc74721d625e2004f2cea1f47efc90b3290c68eba2a35b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://db8a59276d18141805bc74721d625e2004f2cea1f47efc90b3290c68eba2a35b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:04:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:04:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hwzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://21efa1504e98747329f8a9220126ea259edde61bd4d272ea55691e9952950252\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabout
s-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://21efa1504e98747329f8a9220126ea259edde61bd4d272ea55691e9952950252\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:04:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:04:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hwzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://851c05dbd0947990d464d1bd72d62577bdf12517aea808904aa01fc79fd2d7c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://851c05dbd0947990d464d1bd72d62577bdf12517aea808904aa01fc79fd2d7c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:04:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:04:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hwzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:04:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-9c4xg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:34Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:34 crc kubenswrapper[4707]: I1127 16:04:34.928872 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xkmt7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"55af9c67-18ce-46f1-a761-d11ce16f42d6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a4eadca4b464dec02ca7aba6bc23d95b9c3965294d4b0224299279c630f3718\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6xmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3718d2f738ef780c738566c163d52afe355de6b8eb6004aa8e1dd5f37aa84d68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6xmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://44699a190cf1065e537f588d7479a10aab8fd4ffd1df2348c616d999ffdf7077\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6xmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6432d53914cee9e016c2134d99e48fa9ff1c05c0b88033992c2e0c70fc40d2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:07Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6xmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://93825fe5d9cf2d63c3e723b063f417f7e92442b14365852f935a06d316f443fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6xmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ae3fab13b078702f87ca8c586c9592c4554c95ead662d81ca65b50f368b6dcb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6xmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b6a6c3ac88a829633b920185df5c454e35c8447a7b1a9af096ab26e4fcabba5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b6a6c3ac88a829633b920185df5c454e35c8447a7b1a9af096ab26e4fcabba5\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-27T16:04:33Z\\\",\\\"message\\\":\\\"oSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.167:8441: 10.217.4.167:8442: 10.217.4.167:8444:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {62af83f3-e0c8-4632-aaaa-17488566a9d8}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: 
UUIDName:}]\\\\nI1127 16:04:33.229791 6338 ovn.go:134] Ensuring zone local for Pod openshift-machine-config-operator/machine-config-daemon-c995m in node crc\\\\nI1127 16:04:33.233295 6338 obj_retry.go:386] Retry successful for *v1.Pod openshift-machine-config-operator/machine-config-daemon-c995m after 0 failed attempt(s)\\\\nI1127 16:04:33.233305 6338 default_network_controller.go:776] Recording success event on pod openshift-machine-config-operator/machine-config-daemon-c995m\\\\nI1127 16:04:33.229689 6338 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-diagnostics/network-check-source-55646444c4-trplf\\\\nI1127 16:04:33.233329 6338 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-diagnostics/network-check-source-55646444c4-trplf\\\\nF1127 16:04:33.229645 6338 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-27T16:04:32Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-xkmt7_openshift-ovn-kubernetes(55af9c67-18ce-46f1-a761-d11ce16f42d6)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6xmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://db3362aeb65b368d2d9181cc898270fffb60938d0e9d32a862130d06a9f572a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6xmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c327fee553e5575dec76bc4456bc59424c89a9eea360cd4b23b594eff4c67a8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c327fee553e5575dec
76bc4456bc59424c89a9eea360cd4b23b594eff4c67a8d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:04:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6xmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:04:06Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-xkmt7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:34Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:34 crc kubenswrapper[4707]: I1127 16:04:34.946715 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:34 crc kubenswrapper[4707]: I1127 16:04:34.946792 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:34 crc kubenswrapper[4707]: I1127 16:04:34.946820 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:34 crc kubenswrapper[4707]: I1127 16:04:34.946847 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:34 crc kubenswrapper[4707]: I1127 16:04:34.946864 4707 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:34Z","lastTransitionTime":"2025-11-27T16:04:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:04:34 crc kubenswrapper[4707]: I1127 16:04:34.953492 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-5x488" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"433a450e-2371-4893-b21e-19707b40e28f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://12b178a129ce5efbfdb77d023236ed9d674d0aba3c3a2903589aae995a5a753c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kub
e-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52jnc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://35888508a810deea27c61cbf1ec7362be9f61f99a63d4ab2b4c296a9ed47c65f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52jnc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:04:19Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-5x488\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:34Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:34 crc kubenswrapper[4707]: I1127 16:04:34.971330 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-pdngz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5a0b58a3-9ac2-4446-b507-88eba42aa060\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://feac173e7b0d6e3cb53bd8d407d3d08e59e099a63702a377b7515283f85703ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\
",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6sq6r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:04:20Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-pdngz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:34Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:34 crc kubenswrapper[4707]: I1127 16:04:34.989280 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-qcl5k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d382481-3c1e-49ed-8e27-265d495aa776\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:21Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:21Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j2krv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j2krv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:04:21Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-qcl5k\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:34Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:34 crc kubenswrapper[4707]: I1127 16:04:34.992300 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Nov 27 16:04:35 crc kubenswrapper[4707]: I1127 16:04:35.010404 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc"] Nov 27 16:04:35 crc kubenswrapper[4707]: I1127 16:04:35.014814 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b00fa8691d479d1db194704ff73bb85f1108c9541dd09ed4c1b02d0a3f0ab53d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\
":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:35Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:35 crc kubenswrapper[4707]: I1127 16:04:35.037481 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-js6mm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ca48c08-f39d-41a2-847a-c893a2111492\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://be6a704ee152614d36d4831778b016c009b0aab2d6cf1c2ec9767f8ecdeb7af6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bjt58\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:04:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-js6mm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:35Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:35 crc kubenswrapper[4707]: I1127 16:04:35.051261 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:35 crc 
kubenswrapper[4707]: I1127 16:04:35.051418 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:35 crc kubenswrapper[4707]: I1127 16:04:35.051434 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:35 crc kubenswrapper[4707]: I1127 16:04:35.051455 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:35 crc kubenswrapper[4707]: I1127 16:04:35.051499 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:35Z","lastTransitionTime":"2025-11-27T16:04:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:04:35 crc kubenswrapper[4707]: I1127 16:04:35.074722 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"59e09a52-f64d-41da-a23b-7d3e34e8f109\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://489349489e52f4a113203c0c196cfe1ef96e97c6fc3c3d777a28e90f22aaee92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ea14f8eb17dbb350f7c0ad2285b4bb5ca2d8b2aae6840646d68a397ac2043d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://775fc674c0dab97e0c5a2548d467b973c592458dad1b88ce92344486a3a28eb5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f502a2429578531cb3ebbfde6bb4c049eed201a496ff3772c06ca28cace89b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://30b52b26bdca5b0705186872e6dd2eba3ece6436e866a4a5e30f4c2ecdbbaf20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b6be834802616cdf10f8b694bb0ce1b8828d0885e8801c5d890e74837d672a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b6be834802616cdf10f8b694bb0ce1b8828d0885e8801c5d890e74837d672a8\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-11-27T16:03:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://de1e5852273af33e41f21c4df29c3e32cbfd6b2eb200b64d9e5353a85718074c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://de1e5852273af33e41f21c4df29c3e32cbfd6b2eb200b64d9e5353a85718074c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:03:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:03:47Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://c6f7eb0e4b78b6d2e316bd2c8f2ad3081c8eb0ca3c55a7063ccf281572ba2992\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c6f7eb0e4b78b6d2e316bd2c8f2ad3081c8eb0ca3c55a7063ccf281572ba2992\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:03:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:03:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:03:45Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:35Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:35 crc kubenswrapper[4707]: I1127 16:04:35.099033 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f300806f-b97e-4453-b011-19442ca1240d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a25770e0e3eebd7e807f3775101a1cd5d47073e4a86850da475fecd4a9c88ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-rel
ease-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7412dcdb5a249551f558a01a8fecef5f873e5e0a16e917d5cbc1ca74a917f228\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://791f332aec1698d02da5ec585ad2fbfcb77529f24abc0e3a77581be38d7ce277\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:47Z\\\"}},\\\"vo
lumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a47eb0592819f0bc514399e163c88e2c63111aa0ec4a28f00f809ce1bdf2aa8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f12d0d89698cd6f702c595882dc7ed97fbc585b8a4ef3a8b3755c984a823dd9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-27T16:04:05Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1127 16:03:58.832133 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1127 16:03:58.833970 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3721107874/tls.crt::/tmp/serving-cert-3721107874/tls.key\\\\\\\"\\\\nI1127 16:04:05.094602 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1127 16:04:05.099813 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1127 16:04:05.099850 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1127 16:04:05.099879 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1127 16:04:05.099886 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1127 16:04:05.108199 1 
secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1127 16:04:05.108225 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1127 16:04:05.108231 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1127 16:04:05.108269 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1127 16:04:05.108276 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1127 16:04:05.108281 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1127 16:04:05.108284 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1127 16:04:05.108289 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1127 16:04:05.114674 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-27T16:03:48Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf16f5508845d69184573b51f3052f79123556667cbd3ddf4b2adfb135312a3a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:48Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://608ad63f8f5352dad650253df603590a1cc20bc05d86dd561d659295264ab4b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://608ad63f8f5352dad650253df603590a1cc20bc05d86dd561d659295264ab4b5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:03:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:03:45Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:35Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:35 crc kubenswrapper[4707]: I1127 16:04:35.119876 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:35Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:35 crc kubenswrapper[4707]: I1127 16:04:35.140761 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-c995m" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a83beb0d-8dd1-434a-ace2-933f98e3956f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://419d9b9f3f298028cf5f49afd438f1de90b8d1203cdfb09b924878b59e0fb723\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5xtzp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6227d8aa794cce73ef7607c3e6cf8f853c829bf5
6493acf04c0b555ddd264ba5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5xtzp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:04:06Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-c995m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:35Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:35 crc kubenswrapper[4707]: I1127 16:04:35.154650 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:35 crc kubenswrapper[4707]: I1127 16:04:35.154694 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:35 crc kubenswrapper[4707]: I1127 16:04:35.154707 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:35 crc 
kubenswrapper[4707]: I1127 16:04:35.154727 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:35 crc kubenswrapper[4707]: I1127 16:04:35.154743 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:35Z","lastTransitionTime":"2025-11-27T16:04:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:04:35 crc kubenswrapper[4707]: I1127 16:04:35.162462 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://57225b64e7671f397832e9ba112febbde78ae4c24b37a8f097d4bdeda8931c0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":t
rue,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6960cfa0c6d0dedc432f90bf267ce02fe8264924a43865f79eadd9affbe955f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:35Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:35 crc kubenswrapper[4707]: I1127 
16:04:35.181732 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6cf743e8776aa2de4138704d9e163af8baf7b8e5db139c13775673cf32a1ee88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:35Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:35 crc kubenswrapper[4707]: I1127 16:04:35.195702 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 27 16:04:35 crc kubenswrapper[4707]: E1127 16:04:35.195888 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 27 16:04:35 crc kubenswrapper[4707]: I1127 16:04:35.196457 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 27 16:04:35 crc kubenswrapper[4707]: E1127 16:04:35.196537 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 27 16:04:35 crc kubenswrapper[4707]: I1127 16:04:35.196597 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-qcl5k" Nov 27 16:04:35 crc kubenswrapper[4707]: E1127 16:04:35.196666 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qcl5k" podUID="7d382481-3c1e-49ed-8e27-265d495aa776" Nov 27 16:04:35 crc kubenswrapper[4707]: I1127 16:04:35.196935 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 27 16:04:35 crc kubenswrapper[4707]: E1127 16:04:35.197210 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 27 16:04:35 crc kubenswrapper[4707]: I1127 16:04:35.205328 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-9c4xg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37bcfa90-e50b-4a63-96b2-cd8a0ce5bbf7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://66dcf851b2fb739c3dae214852a24405d2c8b4462dde998903b9dfe7c9e61e00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",
\\\"name\\\":\\\"kube-api-access-7hwzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8cd7c5e0c832499cb9ddc7b18c7d260de06df34208f86284ebe77964bfcf1c12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8cd7c5e0c832499cb9ddc7b18c7d260de06df34208f86284ebe77964bfcf1c12\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:04:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hwzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4438fd6ab08cd53dff20617ccadaa81e4bddbd7ced310f190d02613079f96f64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,
\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4438fd6ab08cd53dff20617ccadaa81e4bddbd7ced310f190d02613079f96f64\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:04:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:04:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hwzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19a31a44d1ecfee004f2a705df22a188782cd9f235848ddf5f48bd740833c6a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://19a31a44d1ecfee004f2a705df22a188782cd9f235848ddf5f48bd740833c6a8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:04:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:04:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os
-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hwzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://db8a59276d18141805bc74721d625e2004f2cea1f47efc90b3290c68eba2a35b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://db8a59276d18141805bc74721d625e2004f2cea1f47efc90b3290c68eba2a35b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:04:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:04:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hwzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://21efa1504e98747329f8a9220126ea259edde61bd4d272ea55691e9952950252\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},
\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://21efa1504e98747329f8a9220126ea259edde61bd4d272ea55691e9952950252\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:04:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:04:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hwzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://851c05dbd0947990d464d1bd72d62577bdf12517aea808904aa01fc79fd2d7c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://851c05dbd0947990d464d1bd72d62577bdf12517aea808904aa01fc79fd2d7c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:04:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:04:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hwzm\\\",\\\"readOnly\\
\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:04:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-9c4xg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:35Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:35 crc kubenswrapper[4707]: I1127 16:04:35.242527 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xkmt7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"55af9c67-18ce-46f1-a761-d11ce16f42d6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a4eadca4b464dec02ca7aba6bc23d95b9c3965294d4b0224299279c630f3718\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6xmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3718d2f738ef780c738566c163d52afe355de6b8eb6004aa8e1dd5f37aa84d68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6xmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://44699a190cf1065e537f588d7479a10aab8fd4ffd1df2348c616d999ffdf7077\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6xmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6432d53914cee9e016c2134d99e48fa9ff1c05c0b88033992c2e0c70fc40d2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:07Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6xmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://93825fe5d9cf2d63c3e723b063f417f7e92442b14365852f935a06d316f443fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6xmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ae3fab13b078702f87ca8c586c9592c4554c95ead662d81ca65b50f368b6dcb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6xmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b6a6c3ac88a829633b920185df5c454e35c8447a7b1a9af096ab26e4fcabba5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b6a6c3ac88a829633b920185df5c454e35c8447a7b1a9af096ab26e4fcabba5\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-27T16:04:33Z\\\",\\\"message\\\":\\\"oSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.167:8441: 10.217.4.167:8442: 10.217.4.167:8444:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {62af83f3-e0c8-4632-aaaa-17488566a9d8}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: 
UUIDName:}]\\\\nI1127 16:04:33.229791 6338 ovn.go:134] Ensuring zone local for Pod openshift-machine-config-operator/machine-config-daemon-c995m in node crc\\\\nI1127 16:04:33.233295 6338 obj_retry.go:386] Retry successful for *v1.Pod openshift-machine-config-operator/machine-config-daemon-c995m after 0 failed attempt(s)\\\\nI1127 16:04:33.233305 6338 default_network_controller.go:776] Recording success event on pod openshift-machine-config-operator/machine-config-daemon-c995m\\\\nI1127 16:04:33.229689 6338 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-diagnostics/network-check-source-55646444c4-trplf\\\\nI1127 16:04:33.233329 6338 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-diagnostics/network-check-source-55646444c4-trplf\\\\nF1127 16:04:33.229645 6338 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-27T16:04:32Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-xkmt7_openshift-ovn-kubernetes(55af9c67-18ce-46f1-a761-d11ce16f42d6)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6xmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://db3362aeb65b368d2d9181cc898270fffb60938d0e9d32a862130d06a9f572a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6xmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c327fee553e5575dec76bc4456bc59424c89a9eea360cd4b23b594eff4c67a8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c327fee553e5575dec
76bc4456bc59424c89a9eea360cd4b23b594eff4c67a8d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:04:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6xmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:04:06Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-xkmt7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:35Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:35 crc kubenswrapper[4707]: I1127 16:04:35.258171 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:35 crc kubenswrapper[4707]: I1127 16:04:35.258643 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:35 crc kubenswrapper[4707]: I1127 16:04:35.258798 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:35 crc kubenswrapper[4707]: I1127 16:04:35.258949 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:35 crc kubenswrapper[4707]: I1127 16:04:35.259109 4707 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:35Z","lastTransitionTime":"2025-11-27T16:04:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:04:35 crc kubenswrapper[4707]: I1127 16:04:35.263093 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-5x488" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"433a450e-2371-4893-b21e-19707b40e28f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://12b178a129ce5efbfdb77d023236ed9d674d0aba3c3a2903589aae995a5a753c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kub
e-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52jnc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://35888508a810deea27c61cbf1ec7362be9f61f99a63d4ab2b4c296a9ed47c65f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52jnc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:04:19Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-5x488\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:35Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:35 crc kubenswrapper[4707]: I1127 16:04:35.281829 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a94c3c43-c57d-4580-82d4-22eb0ac9387c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04f66a2313def4600426a075de4f77c75a876b3246344cec810b4cf25ea3505e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"
},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2df8580fadfd8a1a25979050a7dc698d4fc8177b121311817f40c26e73bf9371\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://25dae96b066cc3da8d6fefd13955cb9d1776c3f9fa93afa6be256bb904f523c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fda5b69770eee7245fc07549784cad1211d45ceb0d65276764e36021cde11777\\\",\\\"image\\\":\\\"quay.io/openshift-release-d
ev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fda5b69770eee7245fc07549784cad1211d45ceb0d65276764e36021cde11777\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:03:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:03:46Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:03:45Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:35Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:35 crc kubenswrapper[4707]: I1127 16:04:35.301902 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:05Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:35Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:35 crc kubenswrapper[4707]: I1127 16:04:35.318957 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-pdngz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5a0b58a3-9ac2-4446-b507-88eba42aa060\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://feac173e7b0d6e3cb53bd8d407d3d08e59e099a63702a377b7515283f85703ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6sq6r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:04:20Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-pdngz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:35Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:35 crc kubenswrapper[4707]: I1127 16:04:35.341657 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-qcl5k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d382481-3c1e-49ed-8e27-265d495aa776\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:21Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:21Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j2krv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j2krv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:04:21Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-qcl5k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:35Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:35 crc 
kubenswrapper[4707]: I1127 16:04:35.361234 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b00fa8691d479d1db194704ff73bb85f1108c9541dd09ed4c1b02d0a3f0ab53d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:35Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:35 crc kubenswrapper[4707]: I1127 16:04:35.362944 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:35 crc kubenswrapper[4707]: I1127 16:04:35.362990 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:35 crc kubenswrapper[4707]: I1127 16:04:35.363002 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:35 crc kubenswrapper[4707]: I1127 16:04:35.363027 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:35 crc kubenswrapper[4707]: I1127 16:04:35.363040 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:35Z","lastTransitionTime":"2025-11-27T16:04:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:04:35 crc kubenswrapper[4707]: I1127 16:04:35.379982 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-js6mm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ca48c08-f39d-41a2-847a-c893a2111492\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://be6a704ee152614d36d4831778b016c009b0aab2d6cf1c2ec9767f8ecdeb7af6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bjt58\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:04:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-js6mm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:35Z 
is after 2025-08-24T17:21:41Z" Nov 27 16:04:35 crc kubenswrapper[4707]: I1127 16:04:35.394563 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-c995m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a83beb0d-8dd1-434a-ace2-933f98e3956f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://419d9b9f3f298028cf5f49afd438f1de90b8d1203cdfb09b924878b59e0fb723\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"
mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5xtzp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6227d8aa794cce73ef7607c3e6cf8f853c829bf56493acf04c0b555ddd264ba5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5xtzp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:04:06Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-c995m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:35Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:35 crc kubenswrapper[4707]: I1127 16:04:35.427223 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"59e09a52-f64d-41da-a23b-7d3e34e8f109\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://489349489e52f4a113203c0c196cfe1ef96e97c6fc3c3d777a28e90f22aaee92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ea14f8eb17dbb350f7c0ad2285b4bb5ca2d8b2aae6840646d68a397ac2043d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://775fc674c0dab97e0c5a2548d467b973c592458dad1b88ce92344486a3a28eb5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f502a2429578531cb3ebbfde6bb4c049eed201a496ff3772c06ca28cace89b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://30b52b26bdca5b0705186872e6dd2eba3ece6436e866a4a5e30f4c2ecdbbaf20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b6be834802616cdf10f8b694bb0ce1b8828d0885e8801c5d890e74837d672a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b6be834802616cdf10f8b694bb0ce1b8828d0885e8801c5d890e74837d672a8\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-11-27T16:03:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://de1e5852273af33e41f21c4df29c3e32cbfd6b2eb200b64d9e5353a85718074c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://de1e5852273af33e41f21c4df29c3e32cbfd6b2eb200b64d9e5353a85718074c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:03:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:03:47Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://c6f7eb0e4b78b6d2e316bd2c8f2ad3081c8eb0ca3c55a7063ccf281572ba2992\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c6f7eb0e4b78b6d2e316bd2c8f2ad3081c8eb0ca3c55a7063ccf281572ba2992\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:03:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:03:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:03:45Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:35Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:35 crc kubenswrapper[4707]: I1127 16:04:35.443886 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f300806f-b97e-4453-b011-19442ca1240d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a25770e0e3eebd7e807f3775101a1cd5d47073e4a86850da475fecd4a9c88ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-rel
ease-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7412dcdb5a249551f558a01a8fecef5f873e5e0a16e917d5cbc1ca74a917f228\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://791f332aec1698d02da5ec585ad2fbfcb77529f24abc0e3a77581be38d7ce277\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:47Z\\\"}},\\\"vo
lumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a47eb0592819f0bc514399e163c88e2c63111aa0ec4a28f00f809ce1bdf2aa8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f12d0d89698cd6f702c595882dc7ed97fbc585b8a4ef3a8b3755c984a823dd9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-27T16:04:05Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1127 16:03:58.832133 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1127 16:03:58.833970 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3721107874/tls.crt::/tmp/serving-cert-3721107874/tls.key\\\\\\\"\\\\nI1127 16:04:05.094602 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1127 16:04:05.099813 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1127 16:04:05.099850 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1127 16:04:05.099879 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1127 16:04:05.099886 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1127 16:04:05.108199 1 
secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1127 16:04:05.108225 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1127 16:04:05.108231 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1127 16:04:05.108269 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1127 16:04:05.108276 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1127 16:04:05.108281 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1127 16:04:05.108284 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1127 16:04:05.108289 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1127 16:04:05.114674 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-27T16:03:48Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf16f5508845d69184573b51f3052f79123556667cbd3ddf4b2adfb135312a3a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:48Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://608ad63f8f5352dad650253df603590a1cc20bc05d86dd561d659295264ab4b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://608ad63f8f5352dad650253df603590a1cc20bc05d86dd561d659295264ab4b5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:03:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:03:45Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:35Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:35 crc kubenswrapper[4707]: I1127 16:04:35.459676 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:35Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:35 crc kubenswrapper[4707]: I1127 16:04:35.467185 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:35 crc kubenswrapper[4707]: I1127 16:04:35.467237 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:35 crc kubenswrapper[4707]: I1127 16:04:35.467250 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:35 crc kubenswrapper[4707]: I1127 16:04:35.467287 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:35 crc kubenswrapper[4707]: I1127 16:04:35.467304 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:35Z","lastTransitionTime":"2025-11-27T16:04:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:04:35 crc kubenswrapper[4707]: I1127 16:04:35.477238 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7f481770-a9a1-422a-a707-e841a24a1503\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f0e4a5fa520e0dc048bef42e48e14bd06eb72586ca2b8e145734aa5450aa869d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/sta
tic-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://36f6a5537d5031d21ea157fbcedea40756eac2c44aa1859c0cfbf3dc9a9d9a13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c7144aa332827972c3ff984eb05b835b88c4d9d05552ac631fbca49f13e193a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8211f74a0c0d16257330ebbb5727247d4229848595af3cb65f1cb752b05f3c37\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"i
mageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:03:45Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:35Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:35 crc kubenswrapper[4707]: I1127 16:04:35.492350 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers 
with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:05Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:35Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:35 crc kubenswrapper[4707]: I1127 16:04:35.504892 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-bhmsc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d1ec526b-6fb0-4c87-bd87-6aaf843e0c78\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f3b764fd994edba08f5e909683bd065ecf09fb7c9906457e37bb91343aad14a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kvs85\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:04:06Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-bhmsc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:35Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:35 crc kubenswrapper[4707]: I1127 16:04:35.520269 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7f481770-a9a1-422a-a707-e841a24a1503\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f0e4a5fa520e0dc048bef42e48e14bd06eb72586ca2b8e145734aa5450aa869d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluste
r-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://36f6a5537d5031d21ea157fbcedea40756eac2c44aa1859c0cfbf3dc9a9d9a13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c7144aa332827972c3ff984eb05b835b88c4d9d05552ac631fbca49f13e193a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kuber
netes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8211f74a0c0d16257330ebbb5727247d4229848595af3cb65f1cb752b05f3c37\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:03:45Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:35Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:35 crc kubenswrapper[4707]: I1127 16:04:35.537198 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:05Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:35Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:35 crc kubenswrapper[4707]: I1127 16:04:35.551144 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-bhmsc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d1ec526b-6fb0-4c87-bd87-6aaf843e0c78\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f3b764fd994edba08f5e909683bd065ecf09fb7c9906457e37bb91343aad14a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kvs85\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:04:06Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-bhmsc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:35Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:35 crc kubenswrapper[4707]: I1127 16:04:35.564525 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a94c3c43-c57d-4580-82d4-22eb0ac9387c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04f66a2313def4600426a075de4f77c75a876b3246344cec810b4cf25ea3505e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler
\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2df8580fadfd8a1a25979050a7dc698d4fc8177b121311817f40c26e73bf9371\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://25dae96b066cc3da8d6fefd13955cb9d1776c3f9fa93afa6be256bb904f523c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\
\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fda5b69770eee7245fc07549784cad1211d45ceb0d65276764e36021cde11777\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fda5b69770eee7245fc07549784cad1211d45ceb0d65276764e36021cde11777\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:03:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:03:46Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:03:45Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:35Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:35 crc kubenswrapper[4707]: I1127 16:04:35.570636 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:35 crc kubenswrapper[4707]: I1127 16:04:35.570696 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:35 crc kubenswrapper[4707]: I1127 16:04:35.570717 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 
27 16:04:35 crc kubenswrapper[4707]: I1127 16:04:35.570750 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:35 crc kubenswrapper[4707]: I1127 16:04:35.570769 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:35Z","lastTransitionTime":"2025-11-27T16:04:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:04:35 crc kubenswrapper[4707]: I1127 16:04:35.580003 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:35Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:35 crc kubenswrapper[4707]: I1127 16:04:35.594443 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://57225b64e7671f397832e9ba112febbde78ae4c24b37a8f097d4bdeda8931c0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6960cfa0c6d0dedc432f90bf267ce02fe8264924a43865f79eadd9affbe955f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:35Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:35 crc kubenswrapper[4707]: I1127 16:04:35.612824 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6cf743e8776aa2de4138704d9e163af8baf7b8e5db139c13775673cf32a1ee88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-11-27T16:04:35Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:35 crc kubenswrapper[4707]: I1127 16:04:35.627054 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-9c4xg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37bcfa90-e50b-4a63-96b2-cd8a0ce5bbf7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://66dcf851b2fb739c3dae214852a24405d2c8b4462dde998903b9dfe7c9e61e00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-a
ccess-7hwzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8cd7c5e0c832499cb9ddc7b18c7d260de06df34208f86284ebe77964bfcf1c12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8cd7c5e0c832499cb9ddc7b18c7d260de06df34208f86284ebe77964bfcf1c12\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:04:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hwzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4438fd6ab08cd53dff20617ccadaa81e4bddbd7ced310f190d02613079f96f64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"
started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4438fd6ab08cd53dff20617ccadaa81e4bddbd7ced310f190d02613079f96f64\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:04:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:04:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hwzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19a31a44d1ecfee004f2a705df22a188782cd9f235848ddf5f48bd740833c6a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://19a31a44d1ecfee004f2a705df22a188782cd9f235848ddf5f48bd740833c6a8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:04:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:04:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\
\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hwzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://db8a59276d18141805bc74721d625e2004f2cea1f47efc90b3290c68eba2a35b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://db8a59276d18141805bc74721d625e2004f2cea1f47efc90b3290c68eba2a35b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:04:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:04:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hwzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://21efa1504e98747329f8a9220126ea259edde61bd4d272ea55691e9952950252\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabout
s-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://21efa1504e98747329f8a9220126ea259edde61bd4d272ea55691e9952950252\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:04:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:04:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hwzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://851c05dbd0947990d464d1bd72d62577bdf12517aea808904aa01fc79fd2d7c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://851c05dbd0947990d464d1bd72d62577bdf12517aea808904aa01fc79fd2d7c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:04:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:04:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hwzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:04:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-9c4xg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:35Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:35 crc kubenswrapper[4707]: I1127 16:04:35.644668 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xkmt7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"55af9c67-18ce-46f1-a761-d11ce16f42d6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a4eadca4b464dec02ca7aba6bc23d95b9c3965294d4b0224299279c630f3718\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6xmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3718d2f738ef780c738566c163d52afe355de6b8eb6004aa8e1dd5f37aa84d68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6xmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://44699a190cf1065e537f588d7479a10aab8fd4ffd1df2348c616d999ffdf7077\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6xmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6432d53914cee9e016c2134d99e48fa9ff1c05c0b88033992c2e0c70fc40d2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:07Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6xmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://93825fe5d9cf2d63c3e723b063f417f7e92442b14365852f935a06d316f443fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6xmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ae3fab13b078702f87ca8c586c9592c4554c95ead662d81ca65b50f368b6dcb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6xmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b6a6c3ac88a829633b920185df5c454e35c8447a7b1a9af096ab26e4fcabba5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b6a6c3ac88a829633b920185df5c454e35c8447a7b1a9af096ab26e4fcabba5\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-27T16:04:33Z\\\",\\\"message\\\":\\\"oSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.167:8441: 10.217.4.167:8442: 10.217.4.167:8444:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {62af83f3-e0c8-4632-aaaa-17488566a9d8}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: 
UUIDName:}]\\\\nI1127 16:04:33.229791 6338 ovn.go:134] Ensuring zone local for Pod openshift-machine-config-operator/machine-config-daemon-c995m in node crc\\\\nI1127 16:04:33.233295 6338 obj_retry.go:386] Retry successful for *v1.Pod openshift-machine-config-operator/machine-config-daemon-c995m after 0 failed attempt(s)\\\\nI1127 16:04:33.233305 6338 default_network_controller.go:776] Recording success event on pod openshift-machine-config-operator/machine-config-daemon-c995m\\\\nI1127 16:04:33.229689 6338 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-diagnostics/network-check-source-55646444c4-trplf\\\\nI1127 16:04:33.233329 6338 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-diagnostics/network-check-source-55646444c4-trplf\\\\nF1127 16:04:33.229645 6338 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-27T16:04:32Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-xkmt7_openshift-ovn-kubernetes(55af9c67-18ce-46f1-a761-d11ce16f42d6)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6xmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://db3362aeb65b368d2d9181cc898270fffb60938d0e9d32a862130d06a9f572a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6xmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c327fee553e5575dec76bc4456bc59424c89a9eea360cd4b23b594eff4c67a8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c327fee553e5575dec
76bc4456bc59424c89a9eea360cd4b23b594eff4c67a8d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:04:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6xmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:04:06Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-xkmt7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:35Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:35 crc kubenswrapper[4707]: I1127 16:04:35.657728 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-5x488" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"433a450e-2371-4893-b21e-19707b40e28f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://12b178a129ce5efbfdb77d023236ed9d674d0aba3c3a2903589aae995a5a753c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52jnc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://35888508a810deea27c61cbf1ec7362be9f61
f99a63d4ab2b4c296a9ed47c65f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52jnc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:04:19Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-5x488\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:35Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:35 crc kubenswrapper[4707]: I1127 16:04:35.671820 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-pdngz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5a0b58a3-9ac2-4446-b507-88eba42aa060\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://feac173e7b0d6e3cb53bd8d407d3d08e59e099a63702a377b7515283f85703ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6sq6r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:04:20Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-pdngz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:35Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:35 crc kubenswrapper[4707]: I1127 16:04:35.673327 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:35 crc kubenswrapper[4707]: I1127 16:04:35.673436 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:35 crc kubenswrapper[4707]: I1127 16:04:35.673472 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:35 crc kubenswrapper[4707]: I1127 16:04:35.673501 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:35 crc kubenswrapper[4707]: I1127 16:04:35.673517 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:35Z","lastTransitionTime":"2025-11-27T16:04:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:04:35 crc kubenswrapper[4707]: I1127 16:04:35.686256 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-qcl5k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d382481-3c1e-49ed-8e27-265d495aa776\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:21Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:21Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j2krv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j2krv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:04:21Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-qcl5k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:35Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:35 crc 
kubenswrapper[4707]: I1127 16:04:35.702178 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b00fa8691d479d1db194704ff73bb85f1108c9541dd09ed4c1b02d0a3f0ab53d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:35Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:35 crc kubenswrapper[4707]: I1127 16:04:35.715306 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-js6mm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ca48c08-f39d-41a2-847a-c893a2111492\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://be6a704ee152614d36d4831778b016c009b0aab2d6cf1c2ec9767f8ecdeb7af6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\
\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bjt58\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:04:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-js6mm\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:35Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:35 crc kubenswrapper[4707]: I1127 16:04:35.734846 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"59e09a52-f64d-41da-a23b-7d3e34e8f109\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://489349489e52f4a113203c0c196cfe1ef96e97c6fc3c3d777a28e90f22aaee92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\
\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ea14f8eb17dbb350f7c0ad2285b4bb5ca2d8b2aae6840646d68a397ac2043d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://775fc674c0dab97e0c5a2548d467b973c592458dad1b88ce92344486a3a28eb5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\
\":\\\"cri-o://6f502a2429578531cb3ebbfde6bb4c049eed201a496ff3772c06ca28cace89b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://30b52b26bdca5b0705186872e6dd2eba3ece6436e866a4a5e30f4c2ecdbbaf20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b6be834802616cdf10f8b694bb0ce1b8828d0885e8801c5d890e74837d672a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b6be834802616cdf10f8b694bb0ce1b8828d0885e8801c5d890e74837d672a8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:03:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://de1e5852273af33e41f21c4df29c3e32cbfd6b2eb200b64d9e5353a85718074c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://de1e5852273af33e41f21c4df29c3e32cbfd6b2eb200b64d9e5353a85718074c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:03:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:03:47Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://c6f7eb0e4b78b6d2e316bd2c8f2ad3081c8eb0ca3c55a7063ccf281572ba2992\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\
\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c6f7eb0e4b78b6d2e316bd2c8f2ad3081c8eb0ca3c55a7063ccf281572ba2992\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:03:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:03:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:03:45Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:35Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:35 crc kubenswrapper[4707]: I1127 16:04:35.751972 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f300806f-b97e-4453-b011-19442ca1240d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a25770e0e3eebd7e807f3775101a1cd5d47073e4a86850da475fecd4a9c88ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7412dcdb5a249551f558a01a8fecef5f873e5e0a16e917d5cbc1ca74a917f228\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://791f332aec1698d02da5ec585ad2fbfcb77529f24abc0e3a77581be38d7ce277\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a47eb0592819f0bc514399e163c88e2c63111aa0ec4a28f00f809ce1bdf2aa8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f12d0d89698cd6f702c595882dc7ed97fbc585b8a4ef3a8b3755c984a823dd9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-27T16:04:05Z\\\"
,\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1127 16:03:58.832133 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1127 16:03:58.833970 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3721107874/tls.crt::/tmp/serving-cert-3721107874/tls.key\\\\\\\"\\\\nI1127 16:04:05.094602 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1127 16:04:05.099813 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1127 16:04:05.099850 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1127 16:04:05.099879 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1127 16:04:05.099886 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1127 16:04:05.108199 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1127 16:04:05.108225 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1127 16:04:05.108231 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1127 16:04:05.108269 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1127 16:04:05.108276 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1127 16:04:05.108281 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1127 16:04:05.108284 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1127 16:04:05.108289 1 secure_serving.go:69] Use 
of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1127 16:04:05.114674 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-27T16:03:48Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf16f5508845d69184573b51f3052f79123556667cbd3ddf4b2adfb135312a3a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:48Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://608ad63f8f5352dad650253df603590a1cc20bc05d86dd561d659295264ab4b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://608ad63f8f5352dad650253df603590a
1cc20bc05d86dd561d659295264ab4b5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:03:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:03:45Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:35Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:35 crc kubenswrapper[4707]: I1127 16:04:35.766867 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:35Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:35 crc kubenswrapper[4707]: I1127 16:04:35.776782 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:35 crc kubenswrapper[4707]: I1127 16:04:35.776828 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:35 crc kubenswrapper[4707]: I1127 16:04:35.776841 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:35 crc 
kubenswrapper[4707]: I1127 16:04:35.776880 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:35 crc kubenswrapper[4707]: I1127 16:04:35.776893 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:35Z","lastTransitionTime":"2025-11-27T16:04:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:04:35 crc kubenswrapper[4707]: I1127 16:04:35.780738 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-c995m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a83beb0d-8dd1-434a-ace2-933f98e3956f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://419d9b9f3f298028cf5f49afd438f1de90b8d1203cdfb09b924878b59e0fb723\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@
sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5xtzp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6227d8aa794cce73ef7607c3e6cf8f853c829bf56493acf04c0b555ddd264ba5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5xtzp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:04:06Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-c995m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:35Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:35 crc kubenswrapper[4707]: I1127 16:04:35.879778 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:35 crc kubenswrapper[4707]: I1127 16:04:35.880091 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:35 crc kubenswrapper[4707]: I1127 16:04:35.880108 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:35 crc kubenswrapper[4707]: I1127 16:04:35.880128 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:35 crc kubenswrapper[4707]: I1127 16:04:35.880140 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:35Z","lastTransitionTime":"2025-11-27T16:04:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:04:35 crc kubenswrapper[4707]: I1127 16:04:35.982912 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:35 crc kubenswrapper[4707]: I1127 16:04:35.982963 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:35 crc kubenswrapper[4707]: I1127 16:04:35.982975 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:35 crc kubenswrapper[4707]: I1127 16:04:35.982995 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:35 crc kubenswrapper[4707]: I1127 16:04:35.983005 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:35Z","lastTransitionTime":"2025-11-27T16:04:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:04:36 crc kubenswrapper[4707]: I1127 16:04:36.085899 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:36 crc kubenswrapper[4707]: I1127 16:04:36.085947 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:36 crc kubenswrapper[4707]: I1127 16:04:36.085956 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:36 crc kubenswrapper[4707]: I1127 16:04:36.085977 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:36 crc kubenswrapper[4707]: I1127 16:04:36.085990 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:36Z","lastTransitionTime":"2025-11-27T16:04:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:04:36 crc kubenswrapper[4707]: I1127 16:04:36.189441 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:36 crc kubenswrapper[4707]: I1127 16:04:36.189539 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:36 crc kubenswrapper[4707]: I1127 16:04:36.189560 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:36 crc kubenswrapper[4707]: I1127 16:04:36.189597 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:36 crc kubenswrapper[4707]: I1127 16:04:36.189622 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:36Z","lastTransitionTime":"2025-11-27T16:04:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:04:36 crc kubenswrapper[4707]: I1127 16:04:36.292568 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:36 crc kubenswrapper[4707]: I1127 16:04:36.292625 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:36 crc kubenswrapper[4707]: I1127 16:04:36.292633 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:36 crc kubenswrapper[4707]: I1127 16:04:36.292655 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:36 crc kubenswrapper[4707]: I1127 16:04:36.292666 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:36Z","lastTransitionTime":"2025-11-27T16:04:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:04:36 crc kubenswrapper[4707]: I1127 16:04:36.395870 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:36 crc kubenswrapper[4707]: I1127 16:04:36.395975 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:36 crc kubenswrapper[4707]: I1127 16:04:36.396001 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:36 crc kubenswrapper[4707]: I1127 16:04:36.396041 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:36 crc kubenswrapper[4707]: I1127 16:04:36.396067 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:36Z","lastTransitionTime":"2025-11-27T16:04:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:04:36 crc kubenswrapper[4707]: I1127 16:04:36.500475 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:36 crc kubenswrapper[4707]: I1127 16:04:36.500547 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:36 crc kubenswrapper[4707]: I1127 16:04:36.500572 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:36 crc kubenswrapper[4707]: I1127 16:04:36.500608 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:36 crc kubenswrapper[4707]: I1127 16:04:36.500632 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:36Z","lastTransitionTime":"2025-11-27T16:04:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:04:36 crc kubenswrapper[4707]: I1127 16:04:36.603968 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:36 crc kubenswrapper[4707]: I1127 16:04:36.604016 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:36 crc kubenswrapper[4707]: I1127 16:04:36.604027 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:36 crc kubenswrapper[4707]: I1127 16:04:36.604044 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:36 crc kubenswrapper[4707]: I1127 16:04:36.604055 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:36Z","lastTransitionTime":"2025-11-27T16:04:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:04:36 crc kubenswrapper[4707]: I1127 16:04:36.707843 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:36 crc kubenswrapper[4707]: I1127 16:04:36.707914 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:36 crc kubenswrapper[4707]: I1127 16:04:36.707933 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:36 crc kubenswrapper[4707]: I1127 16:04:36.707962 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:36 crc kubenswrapper[4707]: I1127 16:04:36.707981 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:36Z","lastTransitionTime":"2025-11-27T16:04:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:04:36 crc kubenswrapper[4707]: I1127 16:04:36.811433 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:36 crc kubenswrapper[4707]: I1127 16:04:36.811496 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:36 crc kubenswrapper[4707]: I1127 16:04:36.811514 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:36 crc kubenswrapper[4707]: I1127 16:04:36.811542 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:36 crc kubenswrapper[4707]: I1127 16:04:36.811562 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:36Z","lastTransitionTime":"2025-11-27T16:04:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:04:36 crc kubenswrapper[4707]: I1127 16:04:36.916246 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:36 crc kubenswrapper[4707]: I1127 16:04:36.916324 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:36 crc kubenswrapper[4707]: I1127 16:04:36.916352 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:36 crc kubenswrapper[4707]: I1127 16:04:36.916413 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:36 crc kubenswrapper[4707]: I1127 16:04:36.916441 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:36Z","lastTransitionTime":"2025-11-27T16:04:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:04:36 crc kubenswrapper[4707]: I1127 16:04:36.928129 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 27 16:04:36 crc kubenswrapper[4707]: E1127 16:04:36.928305 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-11-27 16:05:08.928266318 +0000 UTC m=+84.559715146 (durationBeforeRetry 32s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 27 16:04:37 crc kubenswrapper[4707]: I1127 16:04:37.020142 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:37 crc kubenswrapper[4707]: I1127 16:04:37.020204 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:37 crc kubenswrapper[4707]: I1127 16:04:37.020222 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:37 crc kubenswrapper[4707]: I1127 16:04:37.020247 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:37 crc kubenswrapper[4707]: I1127 16:04:37.020265 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:37Z","lastTransitionTime":"2025-11-27T16:04:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:04:37 crc kubenswrapper[4707]: I1127 16:04:37.030636 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 27 16:04:37 crc kubenswrapper[4707]: I1127 16:04:37.030801 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 27 16:04:37 crc kubenswrapper[4707]: I1127 16:04:37.030893 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 27 16:04:37 crc kubenswrapper[4707]: I1127 16:04:37.030957 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 27 16:04:37 crc kubenswrapper[4707]: E1127 16:04:37.031305 4707 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" 
not registered Nov 27 16:04:37 crc kubenswrapper[4707]: E1127 16:04:37.031345 4707 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Nov 27 16:04:37 crc kubenswrapper[4707]: E1127 16:04:37.031392 4707 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 27 16:04:37 crc kubenswrapper[4707]: E1127 16:04:37.031510 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-11-27 16:05:09.031469043 +0000 UTC m=+84.662917841 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 27 16:04:37 crc kubenswrapper[4707]: E1127 16:04:37.032444 4707 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Nov 27 16:04:37 crc kubenswrapper[4707]: E1127 16:04:37.032476 4707 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Nov 27 16:04:37 crc kubenswrapper[4707]: E1127 16:04:37.032496 4707 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 27 16:04:37 crc kubenswrapper[4707]: E1127 16:04:37.032565 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-11-27 16:05:09.03254701 +0000 UTC m=+84.663995818 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 27 16:04:37 crc kubenswrapper[4707]: E1127 16:04:37.032741 4707 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Nov 27 16:04:37 crc kubenswrapper[4707]: E1127 16:04:37.032805 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-11-27 16:05:09.032777256 +0000 UTC m=+84.664226054 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Nov 27 16:04:37 crc kubenswrapper[4707]: E1127 16:04:37.032874 4707 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Nov 27 16:04:37 crc kubenswrapper[4707]: E1127 16:04:37.032921 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. 
No retries permitted until 2025-11-27 16:05:09.032907949 +0000 UTC m=+84.664356757 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Nov 27 16:04:37 crc kubenswrapper[4707]: I1127 16:04:37.124593 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:37 crc kubenswrapper[4707]: I1127 16:04:37.124686 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:37 crc kubenswrapper[4707]: I1127 16:04:37.124707 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:37 crc kubenswrapper[4707]: I1127 16:04:37.124734 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:37 crc kubenswrapper[4707]: I1127 16:04:37.124753 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:37Z","lastTransitionTime":"2025-11-27T16:04:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:04:37 crc kubenswrapper[4707]: I1127 16:04:37.194789 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 27 16:04:37 crc kubenswrapper[4707]: I1127 16:04:37.194793 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-qcl5k" Nov 27 16:04:37 crc kubenswrapper[4707]: I1127 16:04:37.194890 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 27 16:04:37 crc kubenswrapper[4707]: I1127 16:04:37.195575 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 27 16:04:37 crc kubenswrapper[4707]: E1127 16:04:37.195792 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 27 16:04:37 crc kubenswrapper[4707]: E1127 16:04:37.195995 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qcl5k" podUID="7d382481-3c1e-49ed-8e27-265d495aa776" Nov 27 16:04:37 crc kubenswrapper[4707]: E1127 16:04:37.196162 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 27 16:04:37 crc kubenswrapper[4707]: E1127 16:04:37.196295 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 27 16:04:37 crc kubenswrapper[4707]: I1127 16:04:37.227686 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:37 crc kubenswrapper[4707]: I1127 16:04:37.227778 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:37 crc kubenswrapper[4707]: I1127 16:04:37.227803 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:37 crc kubenswrapper[4707]: I1127 16:04:37.227833 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:37 crc kubenswrapper[4707]: I1127 16:04:37.227859 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:37Z","lastTransitionTime":"2025-11-27T16:04:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:04:37 crc kubenswrapper[4707]: I1127 16:04:37.331230 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:37 crc kubenswrapper[4707]: I1127 16:04:37.331275 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:37 crc kubenswrapper[4707]: I1127 16:04:37.331298 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:37 crc kubenswrapper[4707]: I1127 16:04:37.331321 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:37 crc kubenswrapper[4707]: I1127 16:04:37.331337 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:37Z","lastTransitionTime":"2025-11-27T16:04:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:04:37 crc kubenswrapper[4707]: I1127 16:04:37.334114 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7d382481-3c1e-49ed-8e27-265d495aa776-metrics-certs\") pod \"network-metrics-daemon-qcl5k\" (UID: \"7d382481-3c1e-49ed-8e27-265d495aa776\") " pod="openshift-multus/network-metrics-daemon-qcl5k" Nov 27 16:04:37 crc kubenswrapper[4707]: E1127 16:04:37.334324 4707 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Nov 27 16:04:37 crc kubenswrapper[4707]: E1127 16:04:37.334410 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7d382481-3c1e-49ed-8e27-265d495aa776-metrics-certs podName:7d382481-3c1e-49ed-8e27-265d495aa776 nodeName:}" failed. No retries permitted until 2025-11-27 16:04:53.334390622 +0000 UTC m=+68.965839400 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/7d382481-3c1e-49ed-8e27-265d495aa776-metrics-certs") pod "network-metrics-daemon-qcl5k" (UID: "7d382481-3c1e-49ed-8e27-265d495aa776") : object "openshift-multus"/"metrics-daemon-secret" not registered Nov 27 16:04:37 crc kubenswrapper[4707]: I1127 16:04:37.434852 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:37 crc kubenswrapper[4707]: I1127 16:04:37.434913 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:37 crc kubenswrapper[4707]: I1127 16:04:37.434932 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:37 crc kubenswrapper[4707]: I1127 16:04:37.434954 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:37 crc kubenswrapper[4707]: I1127 16:04:37.434968 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:37Z","lastTransitionTime":"2025-11-27T16:04:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:04:37 crc kubenswrapper[4707]: I1127 16:04:37.538539 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:37 crc kubenswrapper[4707]: I1127 16:04:37.538611 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:37 crc kubenswrapper[4707]: I1127 16:04:37.538626 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:37 crc kubenswrapper[4707]: I1127 16:04:37.538650 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:37 crc kubenswrapper[4707]: I1127 16:04:37.538664 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:37Z","lastTransitionTime":"2025-11-27T16:04:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:04:37 crc kubenswrapper[4707]: I1127 16:04:37.679132 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:37 crc kubenswrapper[4707]: I1127 16:04:37.679192 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:37 crc kubenswrapper[4707]: I1127 16:04:37.679211 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:37 crc kubenswrapper[4707]: I1127 16:04:37.679242 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:37 crc kubenswrapper[4707]: I1127 16:04:37.679264 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:37Z","lastTransitionTime":"2025-11-27T16:04:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:04:37 crc kubenswrapper[4707]: I1127 16:04:37.781956 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:37 crc kubenswrapper[4707]: I1127 16:04:37.782035 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:37 crc kubenswrapper[4707]: I1127 16:04:37.782053 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:37 crc kubenswrapper[4707]: I1127 16:04:37.782080 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:37 crc kubenswrapper[4707]: I1127 16:04:37.782104 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:37Z","lastTransitionTime":"2025-11-27T16:04:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:04:37 crc kubenswrapper[4707]: I1127 16:04:37.885458 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:37 crc kubenswrapper[4707]: I1127 16:04:37.885907 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:37 crc kubenswrapper[4707]: I1127 16:04:37.886052 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:37 crc kubenswrapper[4707]: I1127 16:04:37.886196 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:37 crc kubenswrapper[4707]: I1127 16:04:37.886315 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:37Z","lastTransitionTime":"2025-11-27T16:04:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:04:37 crc kubenswrapper[4707]: I1127 16:04:37.990070 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:37 crc kubenswrapper[4707]: I1127 16:04:37.990131 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:37 crc kubenswrapper[4707]: I1127 16:04:37.990150 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:37 crc kubenswrapper[4707]: I1127 16:04:37.990175 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:37 crc kubenswrapper[4707]: I1127 16:04:37.990195 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:37Z","lastTransitionTime":"2025-11-27T16:04:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:04:38 crc kubenswrapper[4707]: I1127 16:04:38.093303 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:38 crc kubenswrapper[4707]: I1127 16:04:38.093401 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:38 crc kubenswrapper[4707]: I1127 16:04:38.093422 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:38 crc kubenswrapper[4707]: I1127 16:04:38.093451 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:38 crc kubenswrapper[4707]: I1127 16:04:38.093471 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:38Z","lastTransitionTime":"2025-11-27T16:04:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:04:38 crc kubenswrapper[4707]: I1127 16:04:38.197287 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:38 crc kubenswrapper[4707]: I1127 16:04:38.197400 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:38 crc kubenswrapper[4707]: I1127 16:04:38.197418 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:38 crc kubenswrapper[4707]: I1127 16:04:38.197450 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:38 crc kubenswrapper[4707]: I1127 16:04:38.197469 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:38Z","lastTransitionTime":"2025-11-27T16:04:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:04:38 crc kubenswrapper[4707]: I1127 16:04:38.300947 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:38 crc kubenswrapper[4707]: I1127 16:04:38.301009 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:38 crc kubenswrapper[4707]: I1127 16:04:38.301026 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:38 crc kubenswrapper[4707]: I1127 16:04:38.301059 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:38 crc kubenswrapper[4707]: I1127 16:04:38.301077 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:38Z","lastTransitionTime":"2025-11-27T16:04:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:04:38 crc kubenswrapper[4707]: I1127 16:04:38.404185 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:38 crc kubenswrapper[4707]: I1127 16:04:38.404251 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:38 crc kubenswrapper[4707]: I1127 16:04:38.404275 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:38 crc kubenswrapper[4707]: I1127 16:04:38.404303 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:38 crc kubenswrapper[4707]: I1127 16:04:38.404327 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:38Z","lastTransitionTime":"2025-11-27T16:04:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:04:38 crc kubenswrapper[4707]: I1127 16:04:38.507549 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:38 crc kubenswrapper[4707]: I1127 16:04:38.507611 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:38 crc kubenswrapper[4707]: I1127 16:04:38.507629 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:38 crc kubenswrapper[4707]: I1127 16:04:38.507659 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:38 crc kubenswrapper[4707]: I1127 16:04:38.507678 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:38Z","lastTransitionTime":"2025-11-27T16:04:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:04:38 crc kubenswrapper[4707]: I1127 16:04:38.611473 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:38 crc kubenswrapper[4707]: I1127 16:04:38.611547 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:38 crc kubenswrapper[4707]: I1127 16:04:38.611566 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:38 crc kubenswrapper[4707]: I1127 16:04:38.611599 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:38 crc kubenswrapper[4707]: I1127 16:04:38.611618 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:38Z","lastTransitionTime":"2025-11-27T16:04:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:04:38 crc kubenswrapper[4707]: I1127 16:04:38.715192 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:38 crc kubenswrapper[4707]: I1127 16:04:38.715616 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:38 crc kubenswrapper[4707]: I1127 16:04:38.715807 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:38 crc kubenswrapper[4707]: I1127 16:04:38.715949 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:38 crc kubenswrapper[4707]: I1127 16:04:38.716084 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:38Z","lastTransitionTime":"2025-11-27T16:04:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:04:38 crc kubenswrapper[4707]: I1127 16:04:38.819100 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:38 crc kubenswrapper[4707]: I1127 16:04:38.819162 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:38 crc kubenswrapper[4707]: I1127 16:04:38.819235 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:38 crc kubenswrapper[4707]: I1127 16:04:38.819265 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:38 crc kubenswrapper[4707]: I1127 16:04:38.819287 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:38Z","lastTransitionTime":"2025-11-27T16:04:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:04:38 crc kubenswrapper[4707]: I1127 16:04:38.923740 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:38 crc kubenswrapper[4707]: I1127 16:04:38.923795 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:38 crc kubenswrapper[4707]: I1127 16:04:38.923813 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:38 crc kubenswrapper[4707]: I1127 16:04:38.923839 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:38 crc kubenswrapper[4707]: I1127 16:04:38.923859 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:38Z","lastTransitionTime":"2025-11-27T16:04:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:04:39 crc kubenswrapper[4707]: I1127 16:04:39.027231 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:39 crc kubenswrapper[4707]: I1127 16:04:39.027291 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:39 crc kubenswrapper[4707]: I1127 16:04:39.027310 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:39 crc kubenswrapper[4707]: I1127 16:04:39.027340 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:39 crc kubenswrapper[4707]: I1127 16:04:39.027360 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:39Z","lastTransitionTime":"2025-11-27T16:04:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:04:39 crc kubenswrapper[4707]: I1127 16:04:39.131033 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:39 crc kubenswrapper[4707]: I1127 16:04:39.131117 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:39 crc kubenswrapper[4707]: I1127 16:04:39.131136 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:39 crc kubenswrapper[4707]: I1127 16:04:39.131163 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:39 crc kubenswrapper[4707]: I1127 16:04:39.131185 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:39Z","lastTransitionTime":"2025-11-27T16:04:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:04:39 crc kubenswrapper[4707]: I1127 16:04:39.195057 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 27 16:04:39 crc kubenswrapper[4707]: E1127 16:04:39.195249 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 27 16:04:39 crc kubenswrapper[4707]: I1127 16:04:39.195632 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 27 16:04:39 crc kubenswrapper[4707]: E1127 16:04:39.195778 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 27 16:04:39 crc kubenswrapper[4707]: I1127 16:04:39.195858 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qcl5k" Nov 27 16:04:39 crc kubenswrapper[4707]: E1127 16:04:39.195960 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qcl5k" podUID="7d382481-3c1e-49ed-8e27-265d495aa776" Nov 27 16:04:39 crc kubenswrapper[4707]: I1127 16:04:39.196130 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 27 16:04:39 crc kubenswrapper[4707]: E1127 16:04:39.196250 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 27 16:04:39 crc kubenswrapper[4707]: I1127 16:04:39.235161 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:39 crc kubenswrapper[4707]: I1127 16:04:39.235232 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:39 crc kubenswrapper[4707]: I1127 16:04:39.235242 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:39 crc kubenswrapper[4707]: I1127 16:04:39.235264 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:39 crc kubenswrapper[4707]: I1127 16:04:39.235278 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:39Z","lastTransitionTime":"2025-11-27T16:04:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:04:39 crc kubenswrapper[4707]: I1127 16:04:39.339232 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:39 crc kubenswrapper[4707]: I1127 16:04:39.339306 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:39 crc kubenswrapper[4707]: I1127 16:04:39.339327 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:39 crc kubenswrapper[4707]: I1127 16:04:39.339393 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:39 crc kubenswrapper[4707]: I1127 16:04:39.339415 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:39Z","lastTransitionTime":"2025-11-27T16:04:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:04:39 crc kubenswrapper[4707]: I1127 16:04:39.443022 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:39 crc kubenswrapper[4707]: I1127 16:04:39.443101 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:39 crc kubenswrapper[4707]: I1127 16:04:39.443129 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:39 crc kubenswrapper[4707]: I1127 16:04:39.443165 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:39 crc kubenswrapper[4707]: I1127 16:04:39.443186 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:39Z","lastTransitionTime":"2025-11-27T16:04:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:04:39 crc kubenswrapper[4707]: I1127 16:04:39.546645 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:39 crc kubenswrapper[4707]: I1127 16:04:39.546719 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:39 crc kubenswrapper[4707]: I1127 16:04:39.546736 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:39 crc kubenswrapper[4707]: I1127 16:04:39.546763 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:39 crc kubenswrapper[4707]: I1127 16:04:39.546783 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:39Z","lastTransitionTime":"2025-11-27T16:04:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:04:39 crc kubenswrapper[4707]: I1127 16:04:39.650148 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:39 crc kubenswrapper[4707]: I1127 16:04:39.650212 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:39 crc kubenswrapper[4707]: I1127 16:04:39.650231 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:39 crc kubenswrapper[4707]: I1127 16:04:39.650257 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:39 crc kubenswrapper[4707]: I1127 16:04:39.650278 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:39Z","lastTransitionTime":"2025-11-27T16:04:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:04:39 crc kubenswrapper[4707]: I1127 16:04:39.753462 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:39 crc kubenswrapper[4707]: I1127 16:04:39.753521 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:39 crc kubenswrapper[4707]: I1127 16:04:39.753546 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:39 crc kubenswrapper[4707]: I1127 16:04:39.753573 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:39 crc kubenswrapper[4707]: I1127 16:04:39.753594 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:39Z","lastTransitionTime":"2025-11-27T16:04:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:04:39 crc kubenswrapper[4707]: I1127 16:04:39.858600 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:39 crc kubenswrapper[4707]: I1127 16:04:39.858660 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:39 crc kubenswrapper[4707]: I1127 16:04:39.858712 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:39 crc kubenswrapper[4707]: I1127 16:04:39.858747 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:39 crc kubenswrapper[4707]: I1127 16:04:39.858769 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:39Z","lastTransitionTime":"2025-11-27T16:04:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:04:39 crc kubenswrapper[4707]: I1127 16:04:39.962062 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:39 crc kubenswrapper[4707]: I1127 16:04:39.962127 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:39 crc kubenswrapper[4707]: I1127 16:04:39.962148 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:39 crc kubenswrapper[4707]: I1127 16:04:39.962179 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:39 crc kubenswrapper[4707]: I1127 16:04:39.962200 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:39Z","lastTransitionTime":"2025-11-27T16:04:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:04:40 crc kubenswrapper[4707]: I1127 16:04:40.065566 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:40 crc kubenswrapper[4707]: I1127 16:04:40.065634 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:40 crc kubenswrapper[4707]: I1127 16:04:40.065649 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:40 crc kubenswrapper[4707]: I1127 16:04:40.065674 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:40 crc kubenswrapper[4707]: I1127 16:04:40.065690 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:40Z","lastTransitionTime":"2025-11-27T16:04:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:04:40 crc kubenswrapper[4707]: I1127 16:04:40.169176 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:40 crc kubenswrapper[4707]: I1127 16:04:40.169257 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:40 crc kubenswrapper[4707]: I1127 16:04:40.169273 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:40 crc kubenswrapper[4707]: I1127 16:04:40.169413 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:40 crc kubenswrapper[4707]: I1127 16:04:40.169433 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:40Z","lastTransitionTime":"2025-11-27T16:04:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:04:40 crc kubenswrapper[4707]: I1127 16:04:40.272775 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:40 crc kubenswrapper[4707]: I1127 16:04:40.272852 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:40 crc kubenswrapper[4707]: I1127 16:04:40.272870 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:40 crc kubenswrapper[4707]: I1127 16:04:40.272898 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:40 crc kubenswrapper[4707]: I1127 16:04:40.272919 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:40Z","lastTransitionTime":"2025-11-27T16:04:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:04:40 crc kubenswrapper[4707]: I1127 16:04:40.376571 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:40 crc kubenswrapper[4707]: I1127 16:04:40.376650 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:40 crc kubenswrapper[4707]: I1127 16:04:40.376670 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:40 crc kubenswrapper[4707]: I1127 16:04:40.376696 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:40 crc kubenswrapper[4707]: I1127 16:04:40.376714 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:40Z","lastTransitionTime":"2025-11-27T16:04:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:04:40 crc kubenswrapper[4707]: I1127 16:04:40.479500 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:40 crc kubenswrapper[4707]: I1127 16:04:40.479540 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:40 crc kubenswrapper[4707]: I1127 16:04:40.479549 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:40 crc kubenswrapper[4707]: I1127 16:04:40.479568 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:40 crc kubenswrapper[4707]: I1127 16:04:40.479582 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:40Z","lastTransitionTime":"2025-11-27T16:04:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:04:40 crc kubenswrapper[4707]: I1127 16:04:40.582505 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:40 crc kubenswrapper[4707]: I1127 16:04:40.582556 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:40 crc kubenswrapper[4707]: I1127 16:04:40.582573 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:40 crc kubenswrapper[4707]: I1127 16:04:40.582600 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:40 crc kubenswrapper[4707]: I1127 16:04:40.582620 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:40Z","lastTransitionTime":"2025-11-27T16:04:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:04:40 crc kubenswrapper[4707]: I1127 16:04:40.684667 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:40 crc kubenswrapper[4707]: I1127 16:04:40.684729 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:40 crc kubenswrapper[4707]: I1127 16:04:40.684750 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:40 crc kubenswrapper[4707]: I1127 16:04:40.684778 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:40 crc kubenswrapper[4707]: I1127 16:04:40.684798 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:40Z","lastTransitionTime":"2025-11-27T16:04:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:04:40 crc kubenswrapper[4707]: I1127 16:04:40.788306 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:40 crc kubenswrapper[4707]: I1127 16:04:40.788410 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:40 crc kubenswrapper[4707]: I1127 16:04:40.788432 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:40 crc kubenswrapper[4707]: I1127 16:04:40.788461 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:40 crc kubenswrapper[4707]: I1127 16:04:40.788481 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:40Z","lastTransitionTime":"2025-11-27T16:04:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:04:40 crc kubenswrapper[4707]: I1127 16:04:40.891617 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:40 crc kubenswrapper[4707]: I1127 16:04:40.891689 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:40 crc kubenswrapper[4707]: I1127 16:04:40.891703 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:40 crc kubenswrapper[4707]: I1127 16:04:40.891726 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:40 crc kubenswrapper[4707]: I1127 16:04:40.891748 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:40Z","lastTransitionTime":"2025-11-27T16:04:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:04:40 crc kubenswrapper[4707]: I1127 16:04:40.995846 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:40 crc kubenswrapper[4707]: I1127 16:04:40.995938 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:40 crc kubenswrapper[4707]: I1127 16:04:40.995958 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:40 crc kubenswrapper[4707]: I1127 16:04:40.995989 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:40 crc kubenswrapper[4707]: I1127 16:04:40.996009 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:40Z","lastTransitionTime":"2025-11-27T16:04:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:04:41 crc kubenswrapper[4707]: I1127 16:04:41.099908 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:41 crc kubenswrapper[4707]: I1127 16:04:41.099976 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:41 crc kubenswrapper[4707]: I1127 16:04:41.099995 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:41 crc kubenswrapper[4707]: I1127 16:04:41.100029 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:41 crc kubenswrapper[4707]: I1127 16:04:41.100049 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:41Z","lastTransitionTime":"2025-11-27T16:04:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:04:41 crc kubenswrapper[4707]: I1127 16:04:41.194576 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 27 16:04:41 crc kubenswrapper[4707]: I1127 16:04:41.194613 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 27 16:04:41 crc kubenswrapper[4707]: I1127 16:04:41.194574 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 27 16:04:41 crc kubenswrapper[4707]: I1127 16:04:41.194702 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-qcl5k" Nov 27 16:04:41 crc kubenswrapper[4707]: E1127 16:04:41.195006 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 27 16:04:41 crc kubenswrapper[4707]: E1127 16:04:41.195271 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 27 16:04:41 crc kubenswrapper[4707]: E1127 16:04:41.195579 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 27 16:04:41 crc kubenswrapper[4707]: E1127 16:04:41.195720 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-qcl5k" podUID="7d382481-3c1e-49ed-8e27-265d495aa776" Nov 27 16:04:41 crc kubenswrapper[4707]: I1127 16:04:41.204750 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:41 crc kubenswrapper[4707]: I1127 16:04:41.204808 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:41 crc kubenswrapper[4707]: I1127 16:04:41.204828 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:41 crc kubenswrapper[4707]: I1127 16:04:41.204857 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:41 crc kubenswrapper[4707]: I1127 16:04:41.204879 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:41Z","lastTransitionTime":"2025-11-27T16:04:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:04:41 crc kubenswrapper[4707]: I1127 16:04:41.308792 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:41 crc kubenswrapper[4707]: I1127 16:04:41.308873 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:41 crc kubenswrapper[4707]: I1127 16:04:41.308896 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:41 crc kubenswrapper[4707]: I1127 16:04:41.308929 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:41 crc kubenswrapper[4707]: I1127 16:04:41.308953 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:41Z","lastTransitionTime":"2025-11-27T16:04:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:04:41 crc kubenswrapper[4707]: I1127 16:04:41.412584 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:41 crc kubenswrapper[4707]: I1127 16:04:41.412652 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:41 crc kubenswrapper[4707]: I1127 16:04:41.412669 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:41 crc kubenswrapper[4707]: I1127 16:04:41.412702 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:41 crc kubenswrapper[4707]: I1127 16:04:41.412723 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:41Z","lastTransitionTime":"2025-11-27T16:04:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:04:41 crc kubenswrapper[4707]: I1127 16:04:41.516100 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:41 crc kubenswrapper[4707]: I1127 16:04:41.516159 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:41 crc kubenswrapper[4707]: I1127 16:04:41.516175 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:41 crc kubenswrapper[4707]: I1127 16:04:41.516201 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:41 crc kubenswrapper[4707]: I1127 16:04:41.516220 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:41Z","lastTransitionTime":"2025-11-27T16:04:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:04:41 crc kubenswrapper[4707]: I1127 16:04:41.619268 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:41 crc kubenswrapper[4707]: I1127 16:04:41.619336 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:41 crc kubenswrapper[4707]: I1127 16:04:41.619359 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:41 crc kubenswrapper[4707]: I1127 16:04:41.619427 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:41 crc kubenswrapper[4707]: I1127 16:04:41.619451 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:41Z","lastTransitionTime":"2025-11-27T16:04:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:04:41 crc kubenswrapper[4707]: I1127 16:04:41.722664 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:41 crc kubenswrapper[4707]: I1127 16:04:41.722744 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:41 crc kubenswrapper[4707]: I1127 16:04:41.722767 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:41 crc kubenswrapper[4707]: I1127 16:04:41.722796 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:41 crc kubenswrapper[4707]: I1127 16:04:41.722818 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:41Z","lastTransitionTime":"2025-11-27T16:04:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:04:41 crc kubenswrapper[4707]: I1127 16:04:41.826054 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:41 crc kubenswrapper[4707]: I1127 16:04:41.826116 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:41 crc kubenswrapper[4707]: I1127 16:04:41.826135 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:41 crc kubenswrapper[4707]: I1127 16:04:41.826163 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:41 crc kubenswrapper[4707]: I1127 16:04:41.826192 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:41Z","lastTransitionTime":"2025-11-27T16:04:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:04:41 crc kubenswrapper[4707]: I1127 16:04:41.929689 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:41 crc kubenswrapper[4707]: I1127 16:04:41.929774 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:41 crc kubenswrapper[4707]: I1127 16:04:41.929792 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:41 crc kubenswrapper[4707]: I1127 16:04:41.929849 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:41 crc kubenswrapper[4707]: I1127 16:04:41.929869 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:41Z","lastTransitionTime":"2025-11-27T16:04:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:04:42 crc kubenswrapper[4707]: I1127 16:04:42.033813 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:42 crc kubenswrapper[4707]: I1127 16:04:42.033868 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:42 crc kubenswrapper[4707]: I1127 16:04:42.033882 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:42 crc kubenswrapper[4707]: I1127 16:04:42.033906 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:42 crc kubenswrapper[4707]: I1127 16:04:42.033922 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:42Z","lastTransitionTime":"2025-11-27T16:04:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:04:42 crc kubenswrapper[4707]: I1127 16:04:42.137855 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:42 crc kubenswrapper[4707]: I1127 16:04:42.137921 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:42 crc kubenswrapper[4707]: I1127 16:04:42.137940 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:42 crc kubenswrapper[4707]: I1127 16:04:42.137966 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:42 crc kubenswrapper[4707]: I1127 16:04:42.137985 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:42Z","lastTransitionTime":"2025-11-27T16:04:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:04:42 crc kubenswrapper[4707]: I1127 16:04:42.241074 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:42 crc kubenswrapper[4707]: I1127 16:04:42.241152 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:42 crc kubenswrapper[4707]: I1127 16:04:42.241170 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:42 crc kubenswrapper[4707]: I1127 16:04:42.241196 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:42 crc kubenswrapper[4707]: I1127 16:04:42.241216 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:42Z","lastTransitionTime":"2025-11-27T16:04:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:04:42 crc kubenswrapper[4707]: I1127 16:04:42.344859 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:42 crc kubenswrapper[4707]: I1127 16:04:42.344943 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:42 crc kubenswrapper[4707]: I1127 16:04:42.344965 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:42 crc kubenswrapper[4707]: I1127 16:04:42.344997 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:42 crc kubenswrapper[4707]: I1127 16:04:42.345020 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:42Z","lastTransitionTime":"2025-11-27T16:04:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:04:42 crc kubenswrapper[4707]: I1127 16:04:42.448233 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:42 crc kubenswrapper[4707]: I1127 16:04:42.448326 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:42 crc kubenswrapper[4707]: I1127 16:04:42.448350 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:42 crc kubenswrapper[4707]: I1127 16:04:42.448416 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:42 crc kubenswrapper[4707]: I1127 16:04:42.448437 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:42Z","lastTransitionTime":"2025-11-27T16:04:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:04:42 crc kubenswrapper[4707]: I1127 16:04:42.527587 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:42 crc kubenswrapper[4707]: I1127 16:04:42.527651 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:42 crc kubenswrapper[4707]: I1127 16:04:42.527667 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:42 crc kubenswrapper[4707]: I1127 16:04:42.527693 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:42 crc kubenswrapper[4707]: I1127 16:04:42.527713 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:42Z","lastTransitionTime":"2025-11-27T16:04:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:04:42 crc kubenswrapper[4707]: E1127 16:04:42.550060 4707 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-27T16:04:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-27T16:04:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:42Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-27T16:04:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-27T16:04:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:42Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"3f35bf4c-cf6f-4914-b0bb-f9cc6056fcaf\\\",\\\"systemUUID\\\":\\\"dd7e61b5-f6f4-4240-af78-d8fe5d6daad3\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:42Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:42 crc kubenswrapper[4707]: I1127 16:04:42.555528 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:42 crc kubenswrapper[4707]: I1127 16:04:42.555586 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:42 crc kubenswrapper[4707]: I1127 16:04:42.555606 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:42 crc kubenswrapper[4707]: I1127 16:04:42.555631 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:42 crc kubenswrapper[4707]: I1127 16:04:42.555651 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:42Z","lastTransitionTime":"2025-11-27T16:04:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:04:42 crc kubenswrapper[4707]: E1127 16:04:42.575727 4707 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-27T16:04:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-27T16:04:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:42Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-27T16:04:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-27T16:04:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:42Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"3f35bf4c-cf6f-4914-b0bb-f9cc6056fcaf\\\",\\\"systemUUID\\\":\\\"dd7e61b5-f6f4-4240-af78-d8fe5d6daad3\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:42Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:42 crc kubenswrapper[4707]: I1127 16:04:42.580940 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:42 crc kubenswrapper[4707]: I1127 16:04:42.580995 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:42 crc kubenswrapper[4707]: I1127 16:04:42.581013 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:42 crc kubenswrapper[4707]: I1127 16:04:42.581042 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:42 crc kubenswrapper[4707]: I1127 16:04:42.581059 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:42Z","lastTransitionTime":"2025-11-27T16:04:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:04:42 crc kubenswrapper[4707]: E1127 16:04:42.601366 4707 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-27T16:04:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-27T16:04:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:42Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-27T16:04:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-27T16:04:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:42Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"3f35bf4c-cf6f-4914-b0bb-f9cc6056fcaf\\\",\\\"systemUUID\\\":\\\"dd7e61b5-f6f4-4240-af78-d8fe5d6daad3\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:42Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:42 crc kubenswrapper[4707]: I1127 16:04:42.606396 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:42 crc kubenswrapper[4707]: I1127 16:04:42.606517 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:42 crc kubenswrapper[4707]: I1127 16:04:42.606537 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:42 crc kubenswrapper[4707]: I1127 16:04:42.606559 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:42 crc kubenswrapper[4707]: I1127 16:04:42.606579 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:42Z","lastTransitionTime":"2025-11-27T16:04:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:04:42 crc kubenswrapper[4707]: E1127 16:04:42.622030 4707 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-27T16:04:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-27T16:04:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:42Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-27T16:04:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-27T16:04:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:42Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"3f35bf4c-cf6f-4914-b0bb-f9cc6056fcaf\\\",\\\"systemUUID\\\":\\\"dd7e61b5-f6f4-4240-af78-d8fe5d6daad3\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:42Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:42 crc kubenswrapper[4707]: I1127 16:04:42.628187 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:42 crc kubenswrapper[4707]: I1127 16:04:42.628264 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:42 crc kubenswrapper[4707]: I1127 16:04:42.628282 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:42 crc kubenswrapper[4707]: I1127 16:04:42.628311 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:42 crc kubenswrapper[4707]: I1127 16:04:42.628331 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:42Z","lastTransitionTime":"2025-11-27T16:04:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:04:42 crc kubenswrapper[4707]: E1127 16:04:42.650611 4707 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-27T16:04:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-27T16:04:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:42Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-27T16:04:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-27T16:04:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:42Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"3f35bf4c-cf6f-4914-b0bb-f9cc6056fcaf\\\",\\\"systemUUID\\\":\\\"dd7e61b5-f6f4-4240-af78-d8fe5d6daad3\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:42Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:42 crc kubenswrapper[4707]: E1127 16:04:42.650984 4707 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Nov 27 16:04:42 crc kubenswrapper[4707]: I1127 16:04:42.653719 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:42 crc kubenswrapper[4707]: I1127 16:04:42.653777 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:42 crc kubenswrapper[4707]: I1127 16:04:42.653801 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:42 crc kubenswrapper[4707]: I1127 16:04:42.653832 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:42 crc kubenswrapper[4707]: I1127 16:04:42.653854 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:42Z","lastTransitionTime":"2025-11-27T16:04:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:04:42 crc kubenswrapper[4707]: I1127 16:04:42.757088 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:42 crc kubenswrapper[4707]: I1127 16:04:42.757153 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:42 crc kubenswrapper[4707]: I1127 16:04:42.757178 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:42 crc kubenswrapper[4707]: I1127 16:04:42.757205 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:42 crc kubenswrapper[4707]: I1127 16:04:42.757228 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:42Z","lastTransitionTime":"2025-11-27T16:04:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:04:42 crc kubenswrapper[4707]: I1127 16:04:42.860722 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:42 crc kubenswrapper[4707]: I1127 16:04:42.860794 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:42 crc kubenswrapper[4707]: I1127 16:04:42.860816 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:42 crc kubenswrapper[4707]: I1127 16:04:42.860848 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:42 crc kubenswrapper[4707]: I1127 16:04:42.860868 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:42Z","lastTransitionTime":"2025-11-27T16:04:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:04:42 crc kubenswrapper[4707]: I1127 16:04:42.964283 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:42 crc kubenswrapper[4707]: I1127 16:04:42.964393 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:42 crc kubenswrapper[4707]: I1127 16:04:42.964414 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:42 crc kubenswrapper[4707]: I1127 16:04:42.964441 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:42 crc kubenswrapper[4707]: I1127 16:04:42.964460 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:42Z","lastTransitionTime":"2025-11-27T16:04:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:04:43 crc kubenswrapper[4707]: I1127 16:04:43.067870 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:43 crc kubenswrapper[4707]: I1127 16:04:43.067960 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:43 crc kubenswrapper[4707]: I1127 16:04:43.067980 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:43 crc kubenswrapper[4707]: I1127 16:04:43.068008 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:43 crc kubenswrapper[4707]: I1127 16:04:43.068028 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:43Z","lastTransitionTime":"2025-11-27T16:04:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:04:43 crc kubenswrapper[4707]: I1127 16:04:43.171489 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:43 crc kubenswrapper[4707]: I1127 16:04:43.171551 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:43 crc kubenswrapper[4707]: I1127 16:04:43.171571 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:43 crc kubenswrapper[4707]: I1127 16:04:43.171598 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:43 crc kubenswrapper[4707]: I1127 16:04:43.171615 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:43Z","lastTransitionTime":"2025-11-27T16:04:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:04:43 crc kubenswrapper[4707]: I1127 16:04:43.195001 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 27 16:04:43 crc kubenswrapper[4707]: I1127 16:04:43.195067 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 27 16:04:43 crc kubenswrapper[4707]: I1127 16:04:43.195067 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qcl5k" Nov 27 16:04:43 crc kubenswrapper[4707]: I1127 16:04:43.195189 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 27 16:04:43 crc kubenswrapper[4707]: E1127 16:04:43.195179 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 27 16:04:43 crc kubenswrapper[4707]: E1127 16:04:43.195297 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qcl5k" podUID="7d382481-3c1e-49ed-8e27-265d495aa776" Nov 27 16:04:43 crc kubenswrapper[4707]: E1127 16:04:43.195572 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 27 16:04:43 crc kubenswrapper[4707]: E1127 16:04:43.195865 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 27 16:04:43 crc kubenswrapper[4707]: I1127 16:04:43.274470 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:43 crc kubenswrapper[4707]: I1127 16:04:43.274534 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:43 crc kubenswrapper[4707]: I1127 16:04:43.274551 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:43 crc kubenswrapper[4707]: I1127 16:04:43.274575 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:43 crc kubenswrapper[4707]: I1127 16:04:43.274593 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:43Z","lastTransitionTime":"2025-11-27T16:04:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:04:43 crc kubenswrapper[4707]: I1127 16:04:43.376906 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:43 crc kubenswrapper[4707]: I1127 16:04:43.376967 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:43 crc kubenswrapper[4707]: I1127 16:04:43.376983 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:43 crc kubenswrapper[4707]: I1127 16:04:43.377007 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:43 crc kubenswrapper[4707]: I1127 16:04:43.377024 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:43Z","lastTransitionTime":"2025-11-27T16:04:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:04:43 crc kubenswrapper[4707]: I1127 16:04:43.479806 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:43 crc kubenswrapper[4707]: I1127 16:04:43.479847 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:43 crc kubenswrapper[4707]: I1127 16:04:43.479856 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:43 crc kubenswrapper[4707]: I1127 16:04:43.479873 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:43 crc kubenswrapper[4707]: I1127 16:04:43.479883 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:43Z","lastTransitionTime":"2025-11-27T16:04:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:04:43 crc kubenswrapper[4707]: I1127 16:04:43.583611 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:43 crc kubenswrapper[4707]: I1127 16:04:43.583686 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:43 crc kubenswrapper[4707]: I1127 16:04:43.583710 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:43 crc kubenswrapper[4707]: I1127 16:04:43.583746 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:43 crc kubenswrapper[4707]: I1127 16:04:43.583773 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:43Z","lastTransitionTime":"2025-11-27T16:04:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:04:43 crc kubenswrapper[4707]: I1127 16:04:43.687568 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:43 crc kubenswrapper[4707]: I1127 16:04:43.687639 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:43 crc kubenswrapper[4707]: I1127 16:04:43.687653 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:43 crc kubenswrapper[4707]: I1127 16:04:43.687682 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:43 crc kubenswrapper[4707]: I1127 16:04:43.687700 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:43Z","lastTransitionTime":"2025-11-27T16:04:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:04:43 crc kubenswrapper[4707]: I1127 16:04:43.790538 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:43 crc kubenswrapper[4707]: I1127 16:04:43.790608 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:43 crc kubenswrapper[4707]: I1127 16:04:43.790626 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:43 crc kubenswrapper[4707]: I1127 16:04:43.790654 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:43 crc kubenswrapper[4707]: I1127 16:04:43.790677 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:43Z","lastTransitionTime":"2025-11-27T16:04:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:04:43 crc kubenswrapper[4707]: I1127 16:04:43.894416 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:43 crc kubenswrapper[4707]: I1127 16:04:43.894500 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:43 crc kubenswrapper[4707]: I1127 16:04:43.894520 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:43 crc kubenswrapper[4707]: I1127 16:04:43.894551 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:43 crc kubenswrapper[4707]: I1127 16:04:43.894580 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:43Z","lastTransitionTime":"2025-11-27T16:04:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:04:43 crc kubenswrapper[4707]: I1127 16:04:43.998458 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:43 crc kubenswrapper[4707]: I1127 16:04:43.998524 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:43 crc kubenswrapper[4707]: I1127 16:04:43.998543 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:43 crc kubenswrapper[4707]: I1127 16:04:43.998569 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:43 crc kubenswrapper[4707]: I1127 16:04:43.998588 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:43Z","lastTransitionTime":"2025-11-27T16:04:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:04:44 crc kubenswrapper[4707]: I1127 16:04:44.101776 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:44 crc kubenswrapper[4707]: I1127 16:04:44.101826 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:44 crc kubenswrapper[4707]: I1127 16:04:44.101840 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:44 crc kubenswrapper[4707]: I1127 16:04:44.101863 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:44 crc kubenswrapper[4707]: I1127 16:04:44.101880 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:44Z","lastTransitionTime":"2025-11-27T16:04:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:04:44 crc kubenswrapper[4707]: I1127 16:04:44.205439 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:44 crc kubenswrapper[4707]: I1127 16:04:44.205503 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:44 crc kubenswrapper[4707]: I1127 16:04:44.205521 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:44 crc kubenswrapper[4707]: I1127 16:04:44.205547 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:44 crc kubenswrapper[4707]: I1127 16:04:44.205565 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:44Z","lastTransitionTime":"2025-11-27T16:04:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:04:44 crc kubenswrapper[4707]: I1127 16:04:44.309342 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:44 crc kubenswrapper[4707]: I1127 16:04:44.309476 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:44 crc kubenswrapper[4707]: I1127 16:04:44.309502 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:44 crc kubenswrapper[4707]: I1127 16:04:44.309533 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:44 crc kubenswrapper[4707]: I1127 16:04:44.309552 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:44Z","lastTransitionTime":"2025-11-27T16:04:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:04:44 crc kubenswrapper[4707]: I1127 16:04:44.412937 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:44 crc kubenswrapper[4707]: I1127 16:04:44.413000 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:44 crc kubenswrapper[4707]: I1127 16:04:44.413018 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:44 crc kubenswrapper[4707]: I1127 16:04:44.413042 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:44 crc kubenswrapper[4707]: I1127 16:04:44.413061 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:44Z","lastTransitionTime":"2025-11-27T16:04:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:04:44 crc kubenswrapper[4707]: I1127 16:04:44.516921 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:44 crc kubenswrapper[4707]: I1127 16:04:44.516994 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:44 crc kubenswrapper[4707]: I1127 16:04:44.517018 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:44 crc kubenswrapper[4707]: I1127 16:04:44.517050 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:44 crc kubenswrapper[4707]: I1127 16:04:44.517071 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:44Z","lastTransitionTime":"2025-11-27T16:04:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:04:44 crc kubenswrapper[4707]: I1127 16:04:44.620467 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:44 crc kubenswrapper[4707]: I1127 16:04:44.620538 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:44 crc kubenswrapper[4707]: I1127 16:04:44.620564 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:44 crc kubenswrapper[4707]: I1127 16:04:44.620602 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:44 crc kubenswrapper[4707]: I1127 16:04:44.620629 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:44Z","lastTransitionTime":"2025-11-27T16:04:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:04:44 crc kubenswrapper[4707]: I1127 16:04:44.724653 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:44 crc kubenswrapper[4707]: I1127 16:04:44.724736 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:44 crc kubenswrapper[4707]: I1127 16:04:44.724755 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:44 crc kubenswrapper[4707]: I1127 16:04:44.724783 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:44 crc kubenswrapper[4707]: I1127 16:04:44.724802 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:44Z","lastTransitionTime":"2025-11-27T16:04:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:04:44 crc kubenswrapper[4707]: I1127 16:04:44.828023 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:44 crc kubenswrapper[4707]: I1127 16:04:44.828080 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:44 crc kubenswrapper[4707]: I1127 16:04:44.828097 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:44 crc kubenswrapper[4707]: I1127 16:04:44.828123 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:44 crc kubenswrapper[4707]: I1127 16:04:44.828141 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:44Z","lastTransitionTime":"2025-11-27T16:04:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:04:44 crc kubenswrapper[4707]: I1127 16:04:44.931975 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:44 crc kubenswrapper[4707]: I1127 16:04:44.932043 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:44 crc kubenswrapper[4707]: I1127 16:04:44.932062 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:44 crc kubenswrapper[4707]: I1127 16:04:44.932095 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:44 crc kubenswrapper[4707]: I1127 16:04:44.932115 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:44Z","lastTransitionTime":"2025-11-27T16:04:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:04:45 crc kubenswrapper[4707]: I1127 16:04:45.037060 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:45 crc kubenswrapper[4707]: I1127 16:04:45.037148 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:45 crc kubenswrapper[4707]: I1127 16:04:45.037174 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:45 crc kubenswrapper[4707]: I1127 16:04:45.037208 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:45 crc kubenswrapper[4707]: I1127 16:04:45.037232 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:45Z","lastTransitionTime":"2025-11-27T16:04:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:04:45 crc kubenswrapper[4707]: I1127 16:04:45.140947 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:45 crc kubenswrapper[4707]: I1127 16:04:45.141014 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:45 crc kubenswrapper[4707]: I1127 16:04:45.141038 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:45 crc kubenswrapper[4707]: I1127 16:04:45.141070 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:45 crc kubenswrapper[4707]: I1127 16:04:45.141091 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:45Z","lastTransitionTime":"2025-11-27T16:04:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:04:45 crc kubenswrapper[4707]: I1127 16:04:45.194423 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 27 16:04:45 crc kubenswrapper[4707]: I1127 16:04:45.194424 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 27 16:04:45 crc kubenswrapper[4707]: E1127 16:04:45.194662 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 27 16:04:45 crc kubenswrapper[4707]: I1127 16:04:45.194710 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qcl5k" Nov 27 16:04:45 crc kubenswrapper[4707]: E1127 16:04:45.195001 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 27 16:04:45 crc kubenswrapper[4707]: E1127 16:04:45.195191 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qcl5k" podUID="7d382481-3c1e-49ed-8e27-265d495aa776" Nov 27 16:04:45 crc kubenswrapper[4707]: I1127 16:04:45.195644 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 27 16:04:45 crc kubenswrapper[4707]: E1127 16:04:45.195893 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 27 16:04:45 crc kubenswrapper[4707]: I1127 16:04:45.230478 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"59e09a52-f64d-41da-a23b-7d3e34e8f109\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://489349489e52f4a113203c0c196cfe1ef96e97c6fc3c3d777a28e90f22aaee92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubern
etes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ea14f8eb17dbb350f7c0ad2285b4bb5ca2d8b2aae6840646d68a397ac2043d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://775fc674c0dab97e0c5a2548d467b973c592458dad1b88ce92344486a3a28eb5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f502a2429578531cb3ebbfde6bb4c049eed201a496ff3772c06ca28cace89b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8
ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://30b52b26bdca5b0705186872e6dd2eba3ece6436e866a4a5e30f4c2ecdbbaf20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b6be834802616cdf10f8b694bb0ce1b8828d0885e8801c5d890e74837d672a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314
731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b6be834802616cdf10f8b694bb0ce1b8828d0885e8801c5d890e74837d672a8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:03:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://de1e5852273af33e41f21c4df29c3e32cbfd6b2eb200b64d9e5353a85718074c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://de1e5852273af33e41f21c4df29c3e32cbfd6b2eb200b64d9e5353a85718074c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:03:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:03:47Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://c6f7eb0e4b78b6d2e316bd2c8f2ad3081c8eb0ca3c55a7063ccf281572ba2992\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c6f7eb0e4b78b6d2e316bd2c8f2ad3081c8eb0ca3c55a7063ccf281572ba2992\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-2
7T16:03:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:03:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:03:45Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:45Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:45 crc kubenswrapper[4707]: I1127 16:04:45.244809 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:45 crc kubenswrapper[4707]: I1127 16:04:45.244879 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:45 crc kubenswrapper[4707]: I1127 16:04:45.244902 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:45 crc kubenswrapper[4707]: I1127 16:04:45.244930 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:45 crc kubenswrapper[4707]: I1127 16:04:45.244950 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:45Z","lastTransitionTime":"2025-11-27T16:04:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin 
returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:04:45 crc kubenswrapper[4707]: I1127 16:04:45.254297 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f300806f-b97e-4453-b011-19442ca1240d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a25770e0e3eebd7e807f3775101a1cd5d47073e4a86850da475fecd4a9c88ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\
\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7412dcdb5a249551f558a01a8fecef5f873e5e0a16e917d5cbc1ca74a917f228\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://791f332aec1698d02da5ec585ad2fbfcb77529f24abc0e3a77581be38d7ce277\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a47eb0592819f0bc514399e163c88e2c63111aa0ec4a28f00f809ce1bdf2aa8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-clus
ter-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f12d0d89698cd6f702c595882dc7ed97fbc585b8a4ef3a8b3755c984a823dd9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-27T16:04:05Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1127 16:03:58.832133 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1127 16:03:58.833970 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3721107874/tls.crt::/tmp/serving-cert-3721107874/tls.key\\\\\\\"\\\\nI1127 16:04:05.094602 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1127 16:04:05.099813 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1127 16:04:05.099850 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1127 16:04:05.099879 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1127 16:04:05.099886 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1127 16:04:05.108199 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1127 16:04:05.108225 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1127 16:04:05.108231 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1127 16:04:05.108269 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1127 16:04:05.108276 1 secure_serving.go:69] Use of insecure cipher 
'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1127 16:04:05.108281 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1127 16:04:05.108284 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1127 16:04:05.108289 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1127 16:04:05.114674 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-27T16:03:48Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf16f5508845d69184573b51f3052f79123556667cbd3ddf4b2adfb135312a3a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:48Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://608ad63f8f5352dad650253df603590a1cc20bc05d86dd561d659295264ab4b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.
io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://608ad63f8f5352dad650253df603590a1cc20bc05d86dd561d659295264ab4b5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:03:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:03:45Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:45Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:45 crc kubenswrapper[4707]: I1127 16:04:45.274497 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:45Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:45 crc kubenswrapper[4707]: I1127 16:04:45.294055 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-c995m" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a83beb0d-8dd1-434a-ace2-933f98e3956f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://419d9b9f3f298028cf5f49afd438f1de90b8d1203cdfb09b924878b59e0fb723\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5xtzp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6227d8aa794cce73ef7607c3e6cf8f853c829bf5
6493acf04c0b555ddd264ba5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5xtzp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:04:06Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-c995m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:45Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:45 crc kubenswrapper[4707]: I1127 16:04:45.319331 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7f481770-a9a1-422a-a707-e841a24a1503\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f0e4a5fa520e0dc048bef42e48e14bd06eb72586ca2b8e145734aa5450aa869d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://36f6a5537d5031d21ea157fbcedea40756eac2c44aa1859c0cfbf3dc9a9d9a13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c7144aa332827972c3ff984eb05b835b88c4d9d05552ac631fbca49f13e193a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8211f74a0c0d16257330ebbb5727247d4229848595af3cb65f1cb752b05f3c37\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-11-27T16:03:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:03:45Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:45Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:45 crc kubenswrapper[4707]: I1127 16:04:45.342454 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:45Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:45 crc kubenswrapper[4707]: I1127 16:04:45.348646 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:45 crc kubenswrapper[4707]: I1127 16:04:45.348703 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:45 crc kubenswrapper[4707]: I1127 16:04:45.348724 4707 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:45 crc kubenswrapper[4707]: I1127 16:04:45.348750 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:45 crc kubenswrapper[4707]: I1127 16:04:45.348769 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:45Z","lastTransitionTime":"2025-11-27T16:04:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:04:45 crc kubenswrapper[4707]: I1127 16:04:45.363526 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-bhmsc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d1ec526b-6fb0-4c87-bd87-6aaf843e0c78\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f3b764fd994edba08f5e909683bd065ecf09fb7
c9906457e37bb91343aad14a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kvs85\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:04:06Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-bhmsc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:45Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:45 crc kubenswrapper[4707]: I1127 16:04:45.391872 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-9c4xg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37bcfa90-e50b-4a63-96b2-cd8a0ce5bbf7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://66dcf851b2fb739c3dae214852a24405d2c8b4462dde998903b9dfe7c9e61e00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hwzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8cd7c5e0c832499cb9ddc7b18c7d260de06df34208f86284ebe77964bfcf1c12\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8cd7c5e0c832499cb9ddc7b18c7d260de06df34208f86284ebe77964bfcf1c12\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:04:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hwzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4438fd6ab08cd53dff20617ccadaa81e4bddbd7ced310f190d02613079f96f64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4438fd6ab08cd53dff20617ccadaa81e4bddbd7ced310f190d02613079f96f64\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:04:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:04:09Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hwzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19a31a44d1ecfee004f2a705df22a188782cd9f235848ddf5f48bd740833c6a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://19a31a44d1ecfee004f2a705df22a188782cd9f235848ddf5f48bd740833c6a8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:04:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:04:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hwzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://db8a5
9276d18141805bc74721d625e2004f2cea1f47efc90b3290c68eba2a35b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://db8a59276d18141805bc74721d625e2004f2cea1f47efc90b3290c68eba2a35b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:04:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:04:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hwzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://21efa1504e98747329f8a9220126ea259edde61bd4d272ea55691e9952950252\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://21efa1504e98747329f8a9220126ea259edde61bd4d272ea55691e9952950252\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:04:13Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-11-27T16:04:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hwzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://851c05dbd0947990d464d1bd72d62577bdf12517aea808904aa01fc79fd2d7c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://851c05dbd0947990d464d1bd72d62577bdf12517aea808904aa01fc79fd2d7c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:04:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:04:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hwzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:04:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-9c4xg\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:45Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:45 crc kubenswrapper[4707]: I1127 16:04:45.427094 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xkmt7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"55af9c67-18ce-46f1-a761-d11ce16f42d6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a4eadca4b464dec02ca7aba6bc23d95b9c3965294d4b0224299279c630f3718\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6xmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3718d2f738ef780c738566c163d52afe355de6b8eb6004aa8e1dd5f37aa84d68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6xmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://44699a190cf1065e537f588d7479a10aab8fd4ffd1df2348c616d999ffdf7077\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6xmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6432d53914cee9e016c2134d99e48fa9ff1c05c0b88033992c2e0c70fc40d2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:07Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6xmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://93825fe5d9cf2d63c3e723b063f417f7e92442b14365852f935a06d316f443fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6xmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ae3fab13b078702f87ca8c586c9592c4554c95ead662d81ca65b50f368b6dcb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6xmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b6a6c3ac88a829633b920185df5c454e35c8447a7b1a9af096ab26e4fcabba5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b6a6c3ac88a829633b920185df5c454e35c8447a7b1a9af096ab26e4fcabba5\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-27T16:04:33Z\\\",\\\"message\\\":\\\"oSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.167:8441: 10.217.4.167:8442: 10.217.4.167:8444:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {62af83f3-e0c8-4632-aaaa-17488566a9d8}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: 
UUIDName:}]\\\\nI1127 16:04:33.229791 6338 ovn.go:134] Ensuring zone local for Pod openshift-machine-config-operator/machine-config-daemon-c995m in node crc\\\\nI1127 16:04:33.233295 6338 obj_retry.go:386] Retry successful for *v1.Pod openshift-machine-config-operator/machine-config-daemon-c995m after 0 failed attempt(s)\\\\nI1127 16:04:33.233305 6338 default_network_controller.go:776] Recording success event on pod openshift-machine-config-operator/machine-config-daemon-c995m\\\\nI1127 16:04:33.229689 6338 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-diagnostics/network-check-source-55646444c4-trplf\\\\nI1127 16:04:33.233329 6338 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-diagnostics/network-check-source-55646444c4-trplf\\\\nF1127 16:04:33.229645 6338 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-27T16:04:32Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-xkmt7_openshift-ovn-kubernetes(55af9c67-18ce-46f1-a761-d11ce16f42d6)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6xmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://db3362aeb65b368d2d9181cc898270fffb60938d0e9d32a862130d06a9f572a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6xmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c327fee553e5575dec76bc4456bc59424c89a9eea360cd4b23b594eff4c67a8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c327fee553e5575dec
76bc4456bc59424c89a9eea360cd4b23b594eff4c67a8d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:04:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6xmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:04:06Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-xkmt7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:45Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:45 crc kubenswrapper[4707]: I1127 16:04:45.441444 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-5x488" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"433a450e-2371-4893-b21e-19707b40e28f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://12b178a129ce5efbfdb77d023236ed9d674d0aba3c3a2903589aae995a5a753c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52jnc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://35888508a810deea27c61cbf1ec7362be9f61
f99a63d4ab2b4c296a9ed47c65f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52jnc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:04:19Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-5x488\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:45Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:45 crc kubenswrapper[4707]: I1127 16:04:45.452335 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:45 crc kubenswrapper[4707]: I1127 16:04:45.452399 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:45 crc kubenswrapper[4707]: I1127 16:04:45.452412 4707 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:45 crc kubenswrapper[4707]: I1127 16:04:45.452435 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:45 crc kubenswrapper[4707]: I1127 16:04:45.452451 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:45Z","lastTransitionTime":"2025-11-27T16:04:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:04:45 crc kubenswrapper[4707]: I1127 16:04:45.459845 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a94c3c43-c57d-4580-82d4-22eb0ac9387c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04f66a2313def4600426a075de4f77c75a876b3246344cec810b4cf25ea3505e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2df8580fadfd8a1a25979050a7dc698d4fc8177b121311817f40c26e73bf9371\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://25dae96b066cc3da8d6fefd13955cb9d1776c3f9fa93afa6be256bb904f523c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\"
:{\\\"startedAt\\\":\\\"2025-11-27T16:03:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fda5b69770eee7245fc07549784cad1211d45ceb0d65276764e36021cde11777\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fda5b69770eee7245fc07549784cad1211d45ceb0d65276764e36021cde11777\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:03:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:03:46Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:03:45Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:45Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:45 crc kubenswrapper[4707]: I1127 16:04:45.474164 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:05Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:45Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:45 crc kubenswrapper[4707]: I1127 16:04:45.490109 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://57225b64e7671f397832e9ba112febbde78ae4c24b37a8f097d4bdeda8931c0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6960cfa0c6d0dedc432f90bf267ce02fe8264924a43865f79eadd9affbe955f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:45Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:45 crc kubenswrapper[4707]: I1127 16:04:45.503478 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6cf743e8776aa2de4138704d9e163af8baf7b8e5db139c13775673cf32a1ee88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-11-27T16:04:45Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:45 crc kubenswrapper[4707]: I1127 16:04:45.516477 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-pdngz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5a0b58a3-9ac2-4446-b507-88eba42aa060\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://feac173e7b0d6e3cb53bd8d407d3d08e59e099a63702a377b7515283f85703ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"hos
t\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6sq6r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:04:20Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-pdngz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:45Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:45 crc kubenswrapper[4707]: I1127 16:04:45.532280 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-qcl5k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d382481-3c1e-49ed-8e27-265d495aa776\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:21Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:21Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j2krv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j2krv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:04:21Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-qcl5k\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:45Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:45 crc kubenswrapper[4707]: I1127 16:04:45.554859 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b00fa8691d479d1db194704ff73bb85f1108c9541dd09ed4c1b02d0a3f0ab53d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"m
etrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:45Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:45 crc kubenswrapper[4707]: I1127 16:04:45.555741 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:45 crc kubenswrapper[4707]: I1127 16:04:45.555807 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:45 crc kubenswrapper[4707]: I1127 16:04:45.555826 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:45 crc kubenswrapper[4707]: I1127 16:04:45.555857 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:45 crc kubenswrapper[4707]: I1127 16:04:45.555878 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:45Z","lastTransitionTime":"2025-11-27T16:04:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:04:45 crc kubenswrapper[4707]: I1127 16:04:45.576041 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-js6mm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ca48c08-f39d-41a2-847a-c893a2111492\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://be6a704ee152614d36d4831778b016c009b0aab2d6cf1c2ec9767f8ecdeb7af6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bjt58\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:04:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-js6mm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:45Z 
is after 2025-08-24T17:21:41Z" Nov 27 16:04:45 crc kubenswrapper[4707]: I1127 16:04:45.658877 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:45 crc kubenswrapper[4707]: I1127 16:04:45.659112 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:45 crc kubenswrapper[4707]: I1127 16:04:45.659262 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:45 crc kubenswrapper[4707]: I1127 16:04:45.659477 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:45 crc kubenswrapper[4707]: I1127 16:04:45.659642 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:45Z","lastTransitionTime":"2025-11-27T16:04:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:04:45 crc kubenswrapper[4707]: I1127 16:04:45.763151 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:45 crc kubenswrapper[4707]: I1127 16:04:45.763235 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:45 crc kubenswrapper[4707]: I1127 16:04:45.763256 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:45 crc kubenswrapper[4707]: I1127 16:04:45.763283 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:45 crc kubenswrapper[4707]: I1127 16:04:45.763304 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:45Z","lastTransitionTime":"2025-11-27T16:04:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:04:45 crc kubenswrapper[4707]: I1127 16:04:45.866709 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:45 crc kubenswrapper[4707]: I1127 16:04:45.866760 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:45 crc kubenswrapper[4707]: I1127 16:04:45.866779 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:45 crc kubenswrapper[4707]: I1127 16:04:45.866804 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:45 crc kubenswrapper[4707]: I1127 16:04:45.866822 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:45Z","lastTransitionTime":"2025-11-27T16:04:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:04:45 crc kubenswrapper[4707]: I1127 16:04:45.970154 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:45 crc kubenswrapper[4707]: I1127 16:04:45.970218 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:45 crc kubenswrapper[4707]: I1127 16:04:45.970236 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:45 crc kubenswrapper[4707]: I1127 16:04:45.970261 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:45 crc kubenswrapper[4707]: I1127 16:04:45.970279 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:45Z","lastTransitionTime":"2025-11-27T16:04:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:04:46 crc kubenswrapper[4707]: I1127 16:04:46.073057 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:46 crc kubenswrapper[4707]: I1127 16:04:46.073128 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:46 crc kubenswrapper[4707]: I1127 16:04:46.073156 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:46 crc kubenswrapper[4707]: I1127 16:04:46.073185 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:46 crc kubenswrapper[4707]: I1127 16:04:46.073208 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:46Z","lastTransitionTime":"2025-11-27T16:04:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:04:46 crc kubenswrapper[4707]: I1127 16:04:46.175867 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:46 crc kubenswrapper[4707]: I1127 16:04:46.175942 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:46 crc kubenswrapper[4707]: I1127 16:04:46.175970 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:46 crc kubenswrapper[4707]: I1127 16:04:46.176001 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:46 crc kubenswrapper[4707]: I1127 16:04:46.176024 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:46Z","lastTransitionTime":"2025-11-27T16:04:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:04:46 crc kubenswrapper[4707]: I1127 16:04:46.281293 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:46 crc kubenswrapper[4707]: I1127 16:04:46.281959 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:46 crc kubenswrapper[4707]: I1127 16:04:46.282018 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:46 crc kubenswrapper[4707]: I1127 16:04:46.282090 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:46 crc kubenswrapper[4707]: I1127 16:04:46.282121 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:46Z","lastTransitionTime":"2025-11-27T16:04:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:04:46 crc kubenswrapper[4707]: I1127 16:04:46.385606 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:46 crc kubenswrapper[4707]: I1127 16:04:46.385666 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:46 crc kubenswrapper[4707]: I1127 16:04:46.385684 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:46 crc kubenswrapper[4707]: I1127 16:04:46.385711 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:46 crc kubenswrapper[4707]: I1127 16:04:46.385730 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:46Z","lastTransitionTime":"2025-11-27T16:04:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:04:46 crc kubenswrapper[4707]: I1127 16:04:46.488877 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:46 crc kubenswrapper[4707]: I1127 16:04:46.488920 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:46 crc kubenswrapper[4707]: I1127 16:04:46.488932 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:46 crc kubenswrapper[4707]: I1127 16:04:46.488952 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:46 crc kubenswrapper[4707]: I1127 16:04:46.488965 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:46Z","lastTransitionTime":"2025-11-27T16:04:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:04:46 crc kubenswrapper[4707]: I1127 16:04:46.593989 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:46 crc kubenswrapper[4707]: I1127 16:04:46.594073 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:46 crc kubenswrapper[4707]: I1127 16:04:46.594097 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:46 crc kubenswrapper[4707]: I1127 16:04:46.594140 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:46 crc kubenswrapper[4707]: I1127 16:04:46.594163 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:46Z","lastTransitionTime":"2025-11-27T16:04:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:04:46 crc kubenswrapper[4707]: I1127 16:04:46.698188 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:46 crc kubenswrapper[4707]: I1127 16:04:46.698248 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:46 crc kubenswrapper[4707]: I1127 16:04:46.698302 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:46 crc kubenswrapper[4707]: I1127 16:04:46.698328 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:46 crc kubenswrapper[4707]: I1127 16:04:46.698348 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:46Z","lastTransitionTime":"2025-11-27T16:04:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:04:46 crc kubenswrapper[4707]: I1127 16:04:46.801007 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:46 crc kubenswrapper[4707]: I1127 16:04:46.801080 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:46 crc kubenswrapper[4707]: I1127 16:04:46.801106 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:46 crc kubenswrapper[4707]: I1127 16:04:46.801136 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:46 crc kubenswrapper[4707]: I1127 16:04:46.801160 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:46Z","lastTransitionTime":"2025-11-27T16:04:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:04:46 crc kubenswrapper[4707]: I1127 16:04:46.904661 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:46 crc kubenswrapper[4707]: I1127 16:04:46.904738 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:46 crc kubenswrapper[4707]: I1127 16:04:46.904762 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:46 crc kubenswrapper[4707]: I1127 16:04:46.904795 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:46 crc kubenswrapper[4707]: I1127 16:04:46.904815 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:46Z","lastTransitionTime":"2025-11-27T16:04:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:04:47 crc kubenswrapper[4707]: I1127 16:04:47.009546 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:47 crc kubenswrapper[4707]: I1127 16:04:47.009603 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:47 crc kubenswrapper[4707]: I1127 16:04:47.009612 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:47 crc kubenswrapper[4707]: I1127 16:04:47.009631 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:47 crc kubenswrapper[4707]: I1127 16:04:47.009642 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:47Z","lastTransitionTime":"2025-11-27T16:04:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:04:47 crc kubenswrapper[4707]: I1127 16:04:47.112081 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:47 crc kubenswrapper[4707]: I1127 16:04:47.112128 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:47 crc kubenswrapper[4707]: I1127 16:04:47.112147 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:47 crc kubenswrapper[4707]: I1127 16:04:47.112171 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:47 crc kubenswrapper[4707]: I1127 16:04:47.112188 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:47Z","lastTransitionTime":"2025-11-27T16:04:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:04:47 crc kubenswrapper[4707]: I1127 16:04:47.194202 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 27 16:04:47 crc kubenswrapper[4707]: I1127 16:04:47.194250 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 27 16:04:47 crc kubenswrapper[4707]: I1127 16:04:47.194274 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 27 16:04:47 crc kubenswrapper[4707]: E1127 16:04:47.194427 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 27 16:04:47 crc kubenswrapper[4707]: I1127 16:04:47.194458 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qcl5k" Nov 27 16:04:47 crc kubenswrapper[4707]: E1127 16:04:47.194606 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 27 16:04:47 crc kubenswrapper[4707]: E1127 16:04:47.194811 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-qcl5k" podUID="7d382481-3c1e-49ed-8e27-265d495aa776" Nov 27 16:04:47 crc kubenswrapper[4707]: E1127 16:04:47.194936 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 27 16:04:47 crc kubenswrapper[4707]: I1127 16:04:47.215735 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:47 crc kubenswrapper[4707]: I1127 16:04:47.215831 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:47 crc kubenswrapper[4707]: I1127 16:04:47.215850 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:47 crc kubenswrapper[4707]: I1127 16:04:47.215914 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:47 crc kubenswrapper[4707]: I1127 16:04:47.215935 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:47Z","lastTransitionTime":"2025-11-27T16:04:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:04:47 crc kubenswrapper[4707]: I1127 16:04:47.319477 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:47 crc kubenswrapper[4707]: I1127 16:04:47.319539 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:47 crc kubenswrapper[4707]: I1127 16:04:47.319559 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:47 crc kubenswrapper[4707]: I1127 16:04:47.319585 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:47 crc kubenswrapper[4707]: I1127 16:04:47.319637 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:47Z","lastTransitionTime":"2025-11-27T16:04:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:04:47 crc kubenswrapper[4707]: I1127 16:04:47.423110 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:47 crc kubenswrapper[4707]: I1127 16:04:47.423194 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:47 crc kubenswrapper[4707]: I1127 16:04:47.423221 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:47 crc kubenswrapper[4707]: I1127 16:04:47.423253 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:47 crc kubenswrapper[4707]: I1127 16:04:47.423279 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:47Z","lastTransitionTime":"2025-11-27T16:04:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:04:47 crc kubenswrapper[4707]: I1127 16:04:47.526017 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:47 crc kubenswrapper[4707]: I1127 16:04:47.526073 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:47 crc kubenswrapper[4707]: I1127 16:04:47.526089 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:47 crc kubenswrapper[4707]: I1127 16:04:47.526118 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:47 crc kubenswrapper[4707]: I1127 16:04:47.526137 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:47Z","lastTransitionTime":"2025-11-27T16:04:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:04:47 crc kubenswrapper[4707]: I1127 16:04:47.628859 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:47 crc kubenswrapper[4707]: I1127 16:04:47.628930 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:47 crc kubenswrapper[4707]: I1127 16:04:47.628954 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:47 crc kubenswrapper[4707]: I1127 16:04:47.628985 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:47 crc kubenswrapper[4707]: I1127 16:04:47.629007 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:47Z","lastTransitionTime":"2025-11-27T16:04:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:04:47 crc kubenswrapper[4707]: I1127 16:04:47.731923 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:47 crc kubenswrapper[4707]: I1127 16:04:47.731967 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:47 crc kubenswrapper[4707]: I1127 16:04:47.731978 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:47 crc kubenswrapper[4707]: I1127 16:04:47.732001 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:47 crc kubenswrapper[4707]: I1127 16:04:47.732014 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:47Z","lastTransitionTime":"2025-11-27T16:04:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:04:47 crc kubenswrapper[4707]: I1127 16:04:47.834778 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:47 crc kubenswrapper[4707]: I1127 16:04:47.834844 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:47 crc kubenswrapper[4707]: I1127 16:04:47.834963 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:47 crc kubenswrapper[4707]: I1127 16:04:47.835000 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:47 crc kubenswrapper[4707]: I1127 16:04:47.835020 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:47Z","lastTransitionTime":"2025-11-27T16:04:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:04:47 crc kubenswrapper[4707]: I1127 16:04:47.939320 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:47 crc kubenswrapper[4707]: I1127 16:04:47.939383 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:47 crc kubenswrapper[4707]: I1127 16:04:47.939392 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:47 crc kubenswrapper[4707]: I1127 16:04:47.939406 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:47 crc kubenswrapper[4707]: I1127 16:04:47.939418 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:47Z","lastTransitionTime":"2025-11-27T16:04:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:04:48 crc kubenswrapper[4707]: I1127 16:04:48.042815 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:48 crc kubenswrapper[4707]: I1127 16:04:48.042866 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:48 crc kubenswrapper[4707]: I1127 16:04:48.042884 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:48 crc kubenswrapper[4707]: I1127 16:04:48.042911 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:48 crc kubenswrapper[4707]: I1127 16:04:48.042933 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:48Z","lastTransitionTime":"2025-11-27T16:04:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:04:48 crc kubenswrapper[4707]: I1127 16:04:48.145754 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:48 crc kubenswrapper[4707]: I1127 16:04:48.145806 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:48 crc kubenswrapper[4707]: I1127 16:04:48.145823 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:48 crc kubenswrapper[4707]: I1127 16:04:48.145847 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:48 crc kubenswrapper[4707]: I1127 16:04:48.145864 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:48Z","lastTransitionTime":"2025-11-27T16:04:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:04:48 crc kubenswrapper[4707]: I1127 16:04:48.249992 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:48 crc kubenswrapper[4707]: I1127 16:04:48.250070 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:48 crc kubenswrapper[4707]: I1127 16:04:48.250093 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:48 crc kubenswrapper[4707]: I1127 16:04:48.250125 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:48 crc kubenswrapper[4707]: I1127 16:04:48.250147 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:48Z","lastTransitionTime":"2025-11-27T16:04:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:04:48 crc kubenswrapper[4707]: I1127 16:04:48.353211 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:48 crc kubenswrapper[4707]: I1127 16:04:48.353292 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:48 crc kubenswrapper[4707]: I1127 16:04:48.353319 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:48 crc kubenswrapper[4707]: I1127 16:04:48.353350 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:48 crc kubenswrapper[4707]: I1127 16:04:48.353404 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:48Z","lastTransitionTime":"2025-11-27T16:04:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:04:48 crc kubenswrapper[4707]: I1127 16:04:48.456480 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:48 crc kubenswrapper[4707]: I1127 16:04:48.456577 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:48 crc kubenswrapper[4707]: I1127 16:04:48.456612 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:48 crc kubenswrapper[4707]: I1127 16:04:48.456644 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:48 crc kubenswrapper[4707]: I1127 16:04:48.456667 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:48Z","lastTransitionTime":"2025-11-27T16:04:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:04:48 crc kubenswrapper[4707]: I1127 16:04:48.565112 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:48 crc kubenswrapper[4707]: I1127 16:04:48.565176 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:48 crc kubenswrapper[4707]: I1127 16:04:48.565193 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:48 crc kubenswrapper[4707]: I1127 16:04:48.565221 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:48 crc kubenswrapper[4707]: I1127 16:04:48.565234 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:48Z","lastTransitionTime":"2025-11-27T16:04:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:04:48 crc kubenswrapper[4707]: I1127 16:04:48.668211 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:48 crc kubenswrapper[4707]: I1127 16:04:48.668277 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:48 crc kubenswrapper[4707]: I1127 16:04:48.668290 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:48 crc kubenswrapper[4707]: I1127 16:04:48.668314 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:48 crc kubenswrapper[4707]: I1127 16:04:48.668331 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:48Z","lastTransitionTime":"2025-11-27T16:04:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:04:48 crc kubenswrapper[4707]: I1127 16:04:48.771967 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:48 crc kubenswrapper[4707]: I1127 16:04:48.772072 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:48 crc kubenswrapper[4707]: I1127 16:04:48.772096 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:48 crc kubenswrapper[4707]: I1127 16:04:48.772122 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:48 crc kubenswrapper[4707]: I1127 16:04:48.772140 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:48Z","lastTransitionTime":"2025-11-27T16:04:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:04:48 crc kubenswrapper[4707]: I1127 16:04:48.875223 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:48 crc kubenswrapper[4707]: I1127 16:04:48.875264 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:48 crc kubenswrapper[4707]: I1127 16:04:48.875275 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:48 crc kubenswrapper[4707]: I1127 16:04:48.875295 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:48 crc kubenswrapper[4707]: I1127 16:04:48.875310 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:48Z","lastTransitionTime":"2025-11-27T16:04:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:04:48 crc kubenswrapper[4707]: I1127 16:04:48.978517 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:48 crc kubenswrapper[4707]: I1127 16:04:48.978563 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:48 crc kubenswrapper[4707]: I1127 16:04:48.978573 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:48 crc kubenswrapper[4707]: I1127 16:04:48.978590 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:48 crc kubenswrapper[4707]: I1127 16:04:48.978603 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:48Z","lastTransitionTime":"2025-11-27T16:04:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:04:49 crc kubenswrapper[4707]: I1127 16:04:49.081395 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:49 crc kubenswrapper[4707]: I1127 16:04:49.081437 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:49 crc kubenswrapper[4707]: I1127 16:04:49.081448 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:49 crc kubenswrapper[4707]: I1127 16:04:49.081469 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:49 crc kubenswrapper[4707]: I1127 16:04:49.081482 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:49Z","lastTransitionTime":"2025-11-27T16:04:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:04:49 crc kubenswrapper[4707]: I1127 16:04:49.193089 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:49 crc kubenswrapper[4707]: I1127 16:04:49.193131 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:49 crc kubenswrapper[4707]: I1127 16:04:49.193143 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:49 crc kubenswrapper[4707]: I1127 16:04:49.193164 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:49 crc kubenswrapper[4707]: I1127 16:04:49.193176 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:49Z","lastTransitionTime":"2025-11-27T16:04:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:04:49 crc kubenswrapper[4707]: I1127 16:04:49.194782 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 27 16:04:49 crc kubenswrapper[4707]: I1127 16:04:49.194820 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 27 16:04:49 crc kubenswrapper[4707]: E1127 16:04:49.194988 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 27 16:04:49 crc kubenswrapper[4707]: I1127 16:04:49.195173 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qcl5k" Nov 27 16:04:49 crc kubenswrapper[4707]: I1127 16:04:49.196968 4707 scope.go:117] "RemoveContainer" containerID="6b6a6c3ac88a829633b920185df5c454e35c8447a7b1a9af096ab26e4fcabba5" Nov 27 16:04:49 crc kubenswrapper[4707]: E1127 16:04:49.197460 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-xkmt7_openshift-ovn-kubernetes(55af9c67-18ce-46f1-a761-d11ce16f42d6)\"" pod="openshift-ovn-kubernetes/ovnkube-node-xkmt7" podUID="55af9c67-18ce-46f1-a761-d11ce16f42d6" Nov 27 16:04:49 crc kubenswrapper[4707]: E1127 16:04:49.197828 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 27 16:04:49 crc kubenswrapper[4707]: E1127 16:04:49.198075 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qcl5k" podUID="7d382481-3c1e-49ed-8e27-265d495aa776" Nov 27 16:04:49 crc kubenswrapper[4707]: I1127 16:04:49.200412 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 27 16:04:49 crc kubenswrapper[4707]: E1127 16:04:49.200620 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 27 16:04:49 crc kubenswrapper[4707]: I1127 16:04:49.295869 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:49 crc kubenswrapper[4707]: I1127 16:04:49.295921 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:49 crc kubenswrapper[4707]: I1127 16:04:49.295936 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:49 crc kubenswrapper[4707]: I1127 16:04:49.295957 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:49 crc kubenswrapper[4707]: I1127 16:04:49.295970 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:49Z","lastTransitionTime":"2025-11-27T16:04:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:04:49 crc kubenswrapper[4707]: I1127 16:04:49.398709 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:49 crc kubenswrapper[4707]: I1127 16:04:49.398964 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:49 crc kubenswrapper[4707]: I1127 16:04:49.399043 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:49 crc kubenswrapper[4707]: I1127 16:04:49.399114 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:49 crc kubenswrapper[4707]: I1127 16:04:49.399174 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:49Z","lastTransitionTime":"2025-11-27T16:04:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:04:49 crc kubenswrapper[4707]: I1127 16:04:49.502761 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:49 crc kubenswrapper[4707]: I1127 16:04:49.502801 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:49 crc kubenswrapper[4707]: I1127 16:04:49.502811 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:49 crc kubenswrapper[4707]: I1127 16:04:49.502835 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:49 crc kubenswrapper[4707]: I1127 16:04:49.502847 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:49Z","lastTransitionTime":"2025-11-27T16:04:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:04:49 crc kubenswrapper[4707]: I1127 16:04:49.606520 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:49 crc kubenswrapper[4707]: I1127 16:04:49.607510 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:49 crc kubenswrapper[4707]: I1127 16:04:49.607729 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:49 crc kubenswrapper[4707]: I1127 16:04:49.607964 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:49 crc kubenswrapper[4707]: I1127 16:04:49.608172 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:49Z","lastTransitionTime":"2025-11-27T16:04:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:04:49 crc kubenswrapper[4707]: I1127 16:04:49.712089 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:49 crc kubenswrapper[4707]: I1127 16:04:49.712139 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:49 crc kubenswrapper[4707]: I1127 16:04:49.712151 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:49 crc kubenswrapper[4707]: I1127 16:04:49.712172 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:49 crc kubenswrapper[4707]: I1127 16:04:49.712185 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:49Z","lastTransitionTime":"2025-11-27T16:04:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:04:49 crc kubenswrapper[4707]: I1127 16:04:49.814799 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:49 crc kubenswrapper[4707]: I1127 16:04:49.814866 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:49 crc kubenswrapper[4707]: I1127 16:04:49.814875 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:49 crc kubenswrapper[4707]: I1127 16:04:49.814894 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:49 crc kubenswrapper[4707]: I1127 16:04:49.814907 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:49Z","lastTransitionTime":"2025-11-27T16:04:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:04:49 crc kubenswrapper[4707]: I1127 16:04:49.918074 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:49 crc kubenswrapper[4707]: I1127 16:04:49.918138 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:49 crc kubenswrapper[4707]: I1127 16:04:49.918151 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:49 crc kubenswrapper[4707]: I1127 16:04:49.918176 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:49 crc kubenswrapper[4707]: I1127 16:04:49.918194 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:49Z","lastTransitionTime":"2025-11-27T16:04:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:04:50 crc kubenswrapper[4707]: I1127 16:04:50.020750 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:50 crc kubenswrapper[4707]: I1127 16:04:50.020802 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:50 crc kubenswrapper[4707]: I1127 16:04:50.020813 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:50 crc kubenswrapper[4707]: I1127 16:04:50.020831 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:50 crc kubenswrapper[4707]: I1127 16:04:50.020845 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:50Z","lastTransitionTime":"2025-11-27T16:04:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:04:50 crc kubenswrapper[4707]: I1127 16:04:50.124355 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:50 crc kubenswrapper[4707]: I1127 16:04:50.124445 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:50 crc kubenswrapper[4707]: I1127 16:04:50.124461 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:50 crc kubenswrapper[4707]: I1127 16:04:50.124488 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:50 crc kubenswrapper[4707]: I1127 16:04:50.124508 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:50Z","lastTransitionTime":"2025-11-27T16:04:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:04:50 crc kubenswrapper[4707]: I1127 16:04:50.229439 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:50 crc kubenswrapper[4707]: I1127 16:04:50.229495 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:50 crc kubenswrapper[4707]: I1127 16:04:50.229513 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:50 crc kubenswrapper[4707]: I1127 16:04:50.229537 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:50 crc kubenswrapper[4707]: I1127 16:04:50.229555 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:50Z","lastTransitionTime":"2025-11-27T16:04:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:04:50 crc kubenswrapper[4707]: I1127 16:04:50.332807 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:50 crc kubenswrapper[4707]: I1127 16:04:50.332879 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:50 crc kubenswrapper[4707]: I1127 16:04:50.332902 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:50 crc kubenswrapper[4707]: I1127 16:04:50.332930 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:50 crc kubenswrapper[4707]: I1127 16:04:50.332950 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:50Z","lastTransitionTime":"2025-11-27T16:04:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:04:50 crc kubenswrapper[4707]: I1127 16:04:50.435201 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:50 crc kubenswrapper[4707]: I1127 16:04:50.435241 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:50 crc kubenswrapper[4707]: I1127 16:04:50.435250 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:50 crc kubenswrapper[4707]: I1127 16:04:50.435264 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:50 crc kubenswrapper[4707]: I1127 16:04:50.435277 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:50Z","lastTransitionTime":"2025-11-27T16:04:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:04:50 crc kubenswrapper[4707]: I1127 16:04:50.537171 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:50 crc kubenswrapper[4707]: I1127 16:04:50.537230 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:50 crc kubenswrapper[4707]: I1127 16:04:50.537249 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:50 crc kubenswrapper[4707]: I1127 16:04:50.537273 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:50 crc kubenswrapper[4707]: I1127 16:04:50.537292 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:50Z","lastTransitionTime":"2025-11-27T16:04:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:04:50 crc kubenswrapper[4707]: I1127 16:04:50.640003 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:50 crc kubenswrapper[4707]: I1127 16:04:50.640063 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:50 crc kubenswrapper[4707]: I1127 16:04:50.640076 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:50 crc kubenswrapper[4707]: I1127 16:04:50.640094 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:50 crc kubenswrapper[4707]: I1127 16:04:50.640105 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:50Z","lastTransitionTime":"2025-11-27T16:04:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:04:50 crc kubenswrapper[4707]: I1127 16:04:50.742778 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:50 crc kubenswrapper[4707]: I1127 16:04:50.742861 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:50 crc kubenswrapper[4707]: I1127 16:04:50.742881 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:50 crc kubenswrapper[4707]: I1127 16:04:50.742905 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:50 crc kubenswrapper[4707]: I1127 16:04:50.742922 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:50Z","lastTransitionTime":"2025-11-27T16:04:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:04:50 crc kubenswrapper[4707]: I1127 16:04:50.845873 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:50 crc kubenswrapper[4707]: I1127 16:04:50.845918 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:50 crc kubenswrapper[4707]: I1127 16:04:50.845927 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:50 crc kubenswrapper[4707]: I1127 16:04:50.845943 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:50 crc kubenswrapper[4707]: I1127 16:04:50.845954 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:50Z","lastTransitionTime":"2025-11-27T16:04:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:04:50 crc kubenswrapper[4707]: I1127 16:04:50.949148 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:50 crc kubenswrapper[4707]: I1127 16:04:50.949193 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:50 crc kubenswrapper[4707]: I1127 16:04:50.949202 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:50 crc kubenswrapper[4707]: I1127 16:04:50.949218 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:50 crc kubenswrapper[4707]: I1127 16:04:50.949228 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:50Z","lastTransitionTime":"2025-11-27T16:04:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:04:51 crc kubenswrapper[4707]: I1127 16:04:51.051950 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:51 crc kubenswrapper[4707]: I1127 16:04:51.052036 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:51 crc kubenswrapper[4707]: I1127 16:04:51.052047 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:51 crc kubenswrapper[4707]: I1127 16:04:51.052063 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:51 crc kubenswrapper[4707]: I1127 16:04:51.052074 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:51Z","lastTransitionTime":"2025-11-27T16:04:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:04:51 crc kubenswrapper[4707]: I1127 16:04:51.155005 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:51 crc kubenswrapper[4707]: I1127 16:04:51.155059 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:51 crc kubenswrapper[4707]: I1127 16:04:51.155074 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:51 crc kubenswrapper[4707]: I1127 16:04:51.155095 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:51 crc kubenswrapper[4707]: I1127 16:04:51.155114 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:51Z","lastTransitionTime":"2025-11-27T16:04:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:04:51 crc kubenswrapper[4707]: I1127 16:04:51.195569 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qcl5k" Nov 27 16:04:51 crc kubenswrapper[4707]: E1127 16:04:51.195697 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qcl5k" podUID="7d382481-3c1e-49ed-8e27-265d495aa776" Nov 27 16:04:51 crc kubenswrapper[4707]: I1127 16:04:51.195760 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 27 16:04:51 crc kubenswrapper[4707]: I1127 16:04:51.195846 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 27 16:04:51 crc kubenswrapper[4707]: E1127 16:04:51.195996 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 27 16:04:51 crc kubenswrapper[4707]: E1127 16:04:51.196099 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 27 16:04:51 crc kubenswrapper[4707]: I1127 16:04:51.196216 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 27 16:04:51 crc kubenswrapper[4707]: E1127 16:04:51.196323 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 27 16:04:51 crc kubenswrapper[4707]: I1127 16:04:51.258433 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:51 crc kubenswrapper[4707]: I1127 16:04:51.258493 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:51 crc kubenswrapper[4707]: I1127 16:04:51.258506 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:51 crc kubenswrapper[4707]: I1127 16:04:51.258531 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:51 crc kubenswrapper[4707]: I1127 16:04:51.258548 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:51Z","lastTransitionTime":"2025-11-27T16:04:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:04:51 crc kubenswrapper[4707]: I1127 16:04:51.361814 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:51 crc kubenswrapper[4707]: I1127 16:04:51.361867 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:51 crc kubenswrapper[4707]: I1127 16:04:51.361884 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:51 crc kubenswrapper[4707]: I1127 16:04:51.361910 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:51 crc kubenswrapper[4707]: I1127 16:04:51.361928 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:51Z","lastTransitionTime":"2025-11-27T16:04:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:04:51 crc kubenswrapper[4707]: I1127 16:04:51.465584 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:51 crc kubenswrapper[4707]: I1127 16:04:51.465634 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:51 crc kubenswrapper[4707]: I1127 16:04:51.465642 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:51 crc kubenswrapper[4707]: I1127 16:04:51.465660 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:51 crc kubenswrapper[4707]: I1127 16:04:51.465669 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:51Z","lastTransitionTime":"2025-11-27T16:04:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:04:51 crc kubenswrapper[4707]: I1127 16:04:51.568561 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:51 crc kubenswrapper[4707]: I1127 16:04:51.568628 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:51 crc kubenswrapper[4707]: I1127 16:04:51.568640 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:51 crc kubenswrapper[4707]: I1127 16:04:51.568659 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:51 crc kubenswrapper[4707]: I1127 16:04:51.568676 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:51Z","lastTransitionTime":"2025-11-27T16:04:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:04:51 crc kubenswrapper[4707]: I1127 16:04:51.671866 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:51 crc kubenswrapper[4707]: I1127 16:04:51.671912 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:51 crc kubenswrapper[4707]: I1127 16:04:51.671927 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:51 crc kubenswrapper[4707]: I1127 16:04:51.671945 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:51 crc kubenswrapper[4707]: I1127 16:04:51.671958 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:51Z","lastTransitionTime":"2025-11-27T16:04:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:04:51 crc kubenswrapper[4707]: I1127 16:04:51.775612 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:51 crc kubenswrapper[4707]: I1127 16:04:51.775660 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:51 crc kubenswrapper[4707]: I1127 16:04:51.775670 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:51 crc kubenswrapper[4707]: I1127 16:04:51.775687 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:51 crc kubenswrapper[4707]: I1127 16:04:51.775699 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:51Z","lastTransitionTime":"2025-11-27T16:04:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:04:51 crc kubenswrapper[4707]: I1127 16:04:51.878653 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:51 crc kubenswrapper[4707]: I1127 16:04:51.878704 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:51 crc kubenswrapper[4707]: I1127 16:04:51.878716 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:51 crc kubenswrapper[4707]: I1127 16:04:51.878732 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:51 crc kubenswrapper[4707]: I1127 16:04:51.878743 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:51Z","lastTransitionTime":"2025-11-27T16:04:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:04:51 crc kubenswrapper[4707]: I1127 16:04:51.981329 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:51 crc kubenswrapper[4707]: I1127 16:04:51.981408 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:51 crc kubenswrapper[4707]: I1127 16:04:51.981419 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:51 crc kubenswrapper[4707]: I1127 16:04:51.981436 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:51 crc kubenswrapper[4707]: I1127 16:04:51.981447 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:51Z","lastTransitionTime":"2025-11-27T16:04:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:04:52 crc kubenswrapper[4707]: I1127 16:04:52.084513 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:52 crc kubenswrapper[4707]: I1127 16:04:52.084592 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:52 crc kubenswrapper[4707]: I1127 16:04:52.084609 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:52 crc kubenswrapper[4707]: I1127 16:04:52.084641 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:52 crc kubenswrapper[4707]: I1127 16:04:52.084664 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:52Z","lastTransitionTime":"2025-11-27T16:04:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:04:52 crc kubenswrapper[4707]: I1127 16:04:52.187122 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:52 crc kubenswrapper[4707]: I1127 16:04:52.187201 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:52 crc kubenswrapper[4707]: I1127 16:04:52.187219 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:52 crc kubenswrapper[4707]: I1127 16:04:52.187246 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:52 crc kubenswrapper[4707]: I1127 16:04:52.187266 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:52Z","lastTransitionTime":"2025-11-27T16:04:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:04:52 crc kubenswrapper[4707]: I1127 16:04:52.289588 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:52 crc kubenswrapper[4707]: I1127 16:04:52.289637 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:52 crc kubenswrapper[4707]: I1127 16:04:52.289646 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:52 crc kubenswrapper[4707]: I1127 16:04:52.289667 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:52 crc kubenswrapper[4707]: I1127 16:04:52.289677 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:52Z","lastTransitionTime":"2025-11-27T16:04:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:04:52 crc kubenswrapper[4707]: I1127 16:04:52.393336 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:52 crc kubenswrapper[4707]: I1127 16:04:52.393452 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:52 crc kubenswrapper[4707]: I1127 16:04:52.393509 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:52 crc kubenswrapper[4707]: I1127 16:04:52.393543 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:52 crc kubenswrapper[4707]: I1127 16:04:52.393566 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:52Z","lastTransitionTime":"2025-11-27T16:04:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:04:52 crc kubenswrapper[4707]: I1127 16:04:52.496152 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:52 crc kubenswrapper[4707]: I1127 16:04:52.496216 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:52 crc kubenswrapper[4707]: I1127 16:04:52.496244 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:52 crc kubenswrapper[4707]: I1127 16:04:52.496278 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:52 crc kubenswrapper[4707]: I1127 16:04:52.496301 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:52Z","lastTransitionTime":"2025-11-27T16:04:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:04:52 crc kubenswrapper[4707]: I1127 16:04:52.599203 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:52 crc kubenswrapper[4707]: I1127 16:04:52.599274 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:52 crc kubenswrapper[4707]: I1127 16:04:52.599297 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:52 crc kubenswrapper[4707]: I1127 16:04:52.599333 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:52 crc kubenswrapper[4707]: I1127 16:04:52.599360 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:52Z","lastTransitionTime":"2025-11-27T16:04:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:04:52 crc kubenswrapper[4707]: I1127 16:04:52.702268 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:52 crc kubenswrapper[4707]: I1127 16:04:52.702345 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:52 crc kubenswrapper[4707]: I1127 16:04:52.702410 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:52 crc kubenswrapper[4707]: I1127 16:04:52.702444 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:52 crc kubenswrapper[4707]: I1127 16:04:52.702468 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:52Z","lastTransitionTime":"2025-11-27T16:04:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:04:52 crc kubenswrapper[4707]: I1127 16:04:52.804597 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:52 crc kubenswrapper[4707]: I1127 16:04:52.804665 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:52 crc kubenswrapper[4707]: I1127 16:04:52.804684 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:52 crc kubenswrapper[4707]: I1127 16:04:52.804711 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:52 crc kubenswrapper[4707]: I1127 16:04:52.804731 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:52Z","lastTransitionTime":"2025-11-27T16:04:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:04:52 crc kubenswrapper[4707]: I1127 16:04:52.908059 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:52 crc kubenswrapper[4707]: I1127 16:04:52.908150 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:52 crc kubenswrapper[4707]: I1127 16:04:52.908176 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:52 crc kubenswrapper[4707]: I1127 16:04:52.908206 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:52 crc kubenswrapper[4707]: I1127 16:04:52.908225 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:52Z","lastTransitionTime":"2025-11-27T16:04:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:04:53 crc kubenswrapper[4707]: I1127 16:04:53.010963 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:53 crc kubenswrapper[4707]: I1127 16:04:53.011033 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:53 crc kubenswrapper[4707]: I1127 16:04:53.011059 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:53 crc kubenswrapper[4707]: I1127 16:04:53.011093 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:53 crc kubenswrapper[4707]: I1127 16:04:53.011117 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:53Z","lastTransitionTime":"2025-11-27T16:04:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:04:53 crc kubenswrapper[4707]: I1127 16:04:53.035495 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:53 crc kubenswrapper[4707]: I1127 16:04:53.035541 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:53 crc kubenswrapper[4707]: I1127 16:04:53.035560 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:53 crc kubenswrapper[4707]: I1127 16:04:53.035579 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:53 crc kubenswrapper[4707]: I1127 16:04:53.035595 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:53Z","lastTransitionTime":"2025-11-27T16:04:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:04:53 crc kubenswrapper[4707]: E1127 16:04:53.055482 4707 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-27T16:04:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-27T16:04:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:53Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-27T16:04:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-27T16:04:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:53Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"3f35bf4c-cf6f-4914-b0bb-f9cc6056fcaf\\\",\\\"systemUUID\\\":\\\"dd7e61b5-f6f4-4240-af78-d8fe5d6daad3\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:53Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:53 crc kubenswrapper[4707]: I1127 16:04:53.060586 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:53 crc kubenswrapper[4707]: I1127 16:04:53.060665 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:53 crc kubenswrapper[4707]: I1127 16:04:53.060691 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:53 crc kubenswrapper[4707]: I1127 16:04:53.060721 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:53 crc kubenswrapper[4707]: I1127 16:04:53.060745 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:53Z","lastTransitionTime":"2025-11-27T16:04:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:04:53 crc kubenswrapper[4707]: E1127 16:04:53.081658 4707 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-27T16:04:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-27T16:04:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:53Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-27T16:04:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-27T16:04:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:53Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"3f35bf4c-cf6f-4914-b0bb-f9cc6056fcaf\\\",\\\"systemUUID\\\":\\\"dd7e61b5-f6f4-4240-af78-d8fe5d6daad3\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:53Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:53 crc kubenswrapper[4707]: I1127 16:04:53.085830 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:53 crc kubenswrapper[4707]: I1127 16:04:53.085887 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:53 crc kubenswrapper[4707]: I1127 16:04:53.085906 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:53 crc kubenswrapper[4707]: I1127 16:04:53.085932 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:53 crc kubenswrapper[4707]: I1127 16:04:53.085952 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:53Z","lastTransitionTime":"2025-11-27T16:04:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:04:53 crc kubenswrapper[4707]: E1127 16:04:53.105603 4707 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-27T16:04:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-27T16:04:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:53Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-27T16:04:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-27T16:04:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:53Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"3f35bf4c-cf6f-4914-b0bb-f9cc6056fcaf\\\",\\\"systemUUID\\\":\\\"dd7e61b5-f6f4-4240-af78-d8fe5d6daad3\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:53Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:53 crc kubenswrapper[4707]: I1127 16:04:53.110160 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:53 crc kubenswrapper[4707]: I1127 16:04:53.110200 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:53 crc kubenswrapper[4707]: I1127 16:04:53.110211 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:53 crc kubenswrapper[4707]: I1127 16:04:53.110233 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:53 crc kubenswrapper[4707]: I1127 16:04:53.110248 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:53Z","lastTransitionTime":"2025-11-27T16:04:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:04:53 crc kubenswrapper[4707]: E1127 16:04:53.130845 4707 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-27T16:04:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-27T16:04:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:53Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-27T16:04:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-27T16:04:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:53Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"3f35bf4c-cf6f-4914-b0bb-f9cc6056fcaf\\\",\\\"systemUUID\\\":\\\"dd7e61b5-f6f4-4240-af78-d8fe5d6daad3\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:53Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:53 crc kubenswrapper[4707]: I1127 16:04:53.135169 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:53 crc kubenswrapper[4707]: I1127 16:04:53.135236 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:53 crc kubenswrapper[4707]: I1127 16:04:53.135255 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:53 crc kubenswrapper[4707]: I1127 16:04:53.135281 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:53 crc kubenswrapper[4707]: I1127 16:04:53.135301 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:53Z","lastTransitionTime":"2025-11-27T16:04:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:04:53 crc kubenswrapper[4707]: E1127 16:04:53.149826 4707 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-27T16:04:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-27T16:04:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:53Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-27T16:04:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-27T16:04:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:53Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"3f35bf4c-cf6f-4914-b0bb-f9cc6056fcaf\\\",\\\"systemUUID\\\":\\\"dd7e61b5-f6f4-4240-af78-d8fe5d6daad3\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:53Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:53 crc kubenswrapper[4707]: E1127 16:04:53.150060 4707 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Nov 27 16:04:53 crc kubenswrapper[4707]: I1127 16:04:53.151633 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:53 crc kubenswrapper[4707]: I1127 16:04:53.151676 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:53 crc kubenswrapper[4707]: I1127 16:04:53.151695 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:53 crc kubenswrapper[4707]: I1127 16:04:53.151718 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:53 crc kubenswrapper[4707]: I1127 16:04:53.151738 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:53Z","lastTransitionTime":"2025-11-27T16:04:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:04:53 crc kubenswrapper[4707]: I1127 16:04:53.194680 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 27 16:04:53 crc kubenswrapper[4707]: I1127 16:04:53.194786 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 27 16:04:53 crc kubenswrapper[4707]: I1127 16:04:53.194694 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 27 16:04:53 crc kubenswrapper[4707]: E1127 16:04:53.194873 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 27 16:04:53 crc kubenswrapper[4707]: E1127 16:04:53.194985 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 27 16:04:53 crc kubenswrapper[4707]: I1127 16:04:53.195060 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qcl5k" Nov 27 16:04:53 crc kubenswrapper[4707]: E1127 16:04:53.195118 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qcl5k" podUID="7d382481-3c1e-49ed-8e27-265d495aa776" Nov 27 16:04:53 crc kubenswrapper[4707]: E1127 16:04:53.195188 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 27 16:04:53 crc kubenswrapper[4707]: I1127 16:04:53.255706 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:53 crc kubenswrapper[4707]: I1127 16:04:53.255779 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:53 crc kubenswrapper[4707]: I1127 16:04:53.255797 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:53 crc kubenswrapper[4707]: I1127 16:04:53.255824 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:53 crc kubenswrapper[4707]: I1127 16:04:53.255841 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:53Z","lastTransitionTime":"2025-11-27T16:04:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:04:53 crc kubenswrapper[4707]: I1127 16:04:53.359263 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:53 crc kubenswrapper[4707]: I1127 16:04:53.359327 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:53 crc kubenswrapper[4707]: I1127 16:04:53.359347 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:53 crc kubenswrapper[4707]: I1127 16:04:53.359454 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:53 crc kubenswrapper[4707]: I1127 16:04:53.359481 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:53Z","lastTransitionTime":"2025-11-27T16:04:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:04:53 crc kubenswrapper[4707]: I1127 16:04:53.368047 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7d382481-3c1e-49ed-8e27-265d495aa776-metrics-certs\") pod \"network-metrics-daemon-qcl5k\" (UID: \"7d382481-3c1e-49ed-8e27-265d495aa776\") " pod="openshift-multus/network-metrics-daemon-qcl5k" Nov 27 16:04:53 crc kubenswrapper[4707]: E1127 16:04:53.368223 4707 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Nov 27 16:04:53 crc kubenswrapper[4707]: E1127 16:04:53.368312 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7d382481-3c1e-49ed-8e27-265d495aa776-metrics-certs podName:7d382481-3c1e-49ed-8e27-265d495aa776 nodeName:}" failed. No retries permitted until 2025-11-27 16:05:25.368286065 +0000 UTC m=+100.999734863 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/7d382481-3c1e-49ed-8e27-265d495aa776-metrics-certs") pod "network-metrics-daemon-qcl5k" (UID: "7d382481-3c1e-49ed-8e27-265d495aa776") : object "openshift-multus"/"metrics-daemon-secret" not registered Nov 27 16:04:53 crc kubenswrapper[4707]: I1127 16:04:53.461913 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:53 crc kubenswrapper[4707]: I1127 16:04:53.461981 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:53 crc kubenswrapper[4707]: I1127 16:04:53.461995 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:53 crc kubenswrapper[4707]: I1127 16:04:53.462017 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:53 crc kubenswrapper[4707]: I1127 16:04:53.462031 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:53Z","lastTransitionTime":"2025-11-27T16:04:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:04:53 crc kubenswrapper[4707]: I1127 16:04:53.564873 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:53 crc kubenswrapper[4707]: I1127 16:04:53.564924 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:53 crc kubenswrapper[4707]: I1127 16:04:53.564938 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:53 crc kubenswrapper[4707]: I1127 16:04:53.564955 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:53 crc kubenswrapper[4707]: I1127 16:04:53.564967 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:53Z","lastTransitionTime":"2025-11-27T16:04:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:04:53 crc kubenswrapper[4707]: I1127 16:04:53.667459 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:53 crc kubenswrapper[4707]: I1127 16:04:53.667544 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:53 crc kubenswrapper[4707]: I1127 16:04:53.667575 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:53 crc kubenswrapper[4707]: I1127 16:04:53.667609 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:53 crc kubenswrapper[4707]: I1127 16:04:53.667632 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:53Z","lastTransitionTime":"2025-11-27T16:04:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:04:53 crc kubenswrapper[4707]: I1127 16:04:53.771107 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:53 crc kubenswrapper[4707]: I1127 16:04:53.771162 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:53 crc kubenswrapper[4707]: I1127 16:04:53.771181 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:53 crc kubenswrapper[4707]: I1127 16:04:53.771209 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:53 crc kubenswrapper[4707]: I1127 16:04:53.771230 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:53Z","lastTransitionTime":"2025-11-27T16:04:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:04:53 crc kubenswrapper[4707]: I1127 16:04:53.874321 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:53 crc kubenswrapper[4707]: I1127 16:04:53.874448 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:53 crc kubenswrapper[4707]: I1127 16:04:53.874475 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:53 crc kubenswrapper[4707]: I1127 16:04:53.874506 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:53 crc kubenswrapper[4707]: I1127 16:04:53.874529 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:53Z","lastTransitionTime":"2025-11-27T16:04:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:04:53 crc kubenswrapper[4707]: I1127 16:04:53.977398 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:53 crc kubenswrapper[4707]: I1127 16:04:53.977469 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:53 crc kubenswrapper[4707]: I1127 16:04:53.977510 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:53 crc kubenswrapper[4707]: I1127 16:04:53.977537 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:53 crc kubenswrapper[4707]: I1127 16:04:53.977556 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:53Z","lastTransitionTime":"2025-11-27T16:04:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:04:54 crc kubenswrapper[4707]: I1127 16:04:54.079401 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:54 crc kubenswrapper[4707]: I1127 16:04:54.079459 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:54 crc kubenswrapper[4707]: I1127 16:04:54.079480 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:54 crc kubenswrapper[4707]: I1127 16:04:54.079504 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:54 crc kubenswrapper[4707]: I1127 16:04:54.079522 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:54Z","lastTransitionTime":"2025-11-27T16:04:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:04:54 crc kubenswrapper[4707]: I1127 16:04:54.183587 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:54 crc kubenswrapper[4707]: I1127 16:04:54.183651 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:54 crc kubenswrapper[4707]: I1127 16:04:54.183674 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:54 crc kubenswrapper[4707]: I1127 16:04:54.183696 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:54 crc kubenswrapper[4707]: I1127 16:04:54.183712 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:54Z","lastTransitionTime":"2025-11-27T16:04:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:04:54 crc kubenswrapper[4707]: I1127 16:04:54.286674 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:54 crc kubenswrapper[4707]: I1127 16:04:54.286734 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:54 crc kubenswrapper[4707]: I1127 16:04:54.286744 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:54 crc kubenswrapper[4707]: I1127 16:04:54.286764 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:54 crc kubenswrapper[4707]: I1127 16:04:54.286789 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:54Z","lastTransitionTime":"2025-11-27T16:04:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:04:54 crc kubenswrapper[4707]: I1127 16:04:54.390230 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:54 crc kubenswrapper[4707]: I1127 16:04:54.390359 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:54 crc kubenswrapper[4707]: I1127 16:04:54.390427 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:54 crc kubenswrapper[4707]: I1127 16:04:54.390454 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:54 crc kubenswrapper[4707]: I1127 16:04:54.390503 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:54Z","lastTransitionTime":"2025-11-27T16:04:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:04:54 crc kubenswrapper[4707]: I1127 16:04:54.494516 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:54 crc kubenswrapper[4707]: I1127 16:04:54.494621 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:54 crc kubenswrapper[4707]: I1127 16:04:54.494643 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:54 crc kubenswrapper[4707]: I1127 16:04:54.494679 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:54 crc kubenswrapper[4707]: I1127 16:04:54.494699 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:54Z","lastTransitionTime":"2025-11-27T16:04:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:04:54 crc kubenswrapper[4707]: I1127 16:04:54.598355 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:54 crc kubenswrapper[4707]: I1127 16:04:54.598467 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:54 crc kubenswrapper[4707]: I1127 16:04:54.598485 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:54 crc kubenswrapper[4707]: I1127 16:04:54.598511 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:54 crc kubenswrapper[4707]: I1127 16:04:54.598530 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:54Z","lastTransitionTime":"2025-11-27T16:04:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:04:54 crc kubenswrapper[4707]: I1127 16:04:54.702664 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:54 crc kubenswrapper[4707]: I1127 16:04:54.702734 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:54 crc kubenswrapper[4707]: I1127 16:04:54.702752 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:54 crc kubenswrapper[4707]: I1127 16:04:54.702777 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:54 crc kubenswrapper[4707]: I1127 16:04:54.702792 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:54Z","lastTransitionTime":"2025-11-27T16:04:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:04:54 crc kubenswrapper[4707]: I1127 16:04:54.806043 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:54 crc kubenswrapper[4707]: I1127 16:04:54.806086 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:54 crc kubenswrapper[4707]: I1127 16:04:54.806096 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:54 crc kubenswrapper[4707]: I1127 16:04:54.806113 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:54 crc kubenswrapper[4707]: I1127 16:04:54.806125 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:54Z","lastTransitionTime":"2025-11-27T16:04:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:04:54 crc kubenswrapper[4707]: I1127 16:04:54.825851 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-js6mm_9ca48c08-f39d-41a2-847a-c893a2111492/kube-multus/0.log" Nov 27 16:04:54 crc kubenswrapper[4707]: I1127 16:04:54.825910 4707 generic.go:334] "Generic (PLEG): container finished" podID="9ca48c08-f39d-41a2-847a-c893a2111492" containerID="be6a704ee152614d36d4831778b016c009b0aab2d6cf1c2ec9767f8ecdeb7af6" exitCode=1 Nov 27 16:04:54 crc kubenswrapper[4707]: I1127 16:04:54.825952 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-js6mm" event={"ID":"9ca48c08-f39d-41a2-847a-c893a2111492","Type":"ContainerDied","Data":"be6a704ee152614d36d4831778b016c009b0aab2d6cf1c2ec9767f8ecdeb7af6"} Nov 27 16:04:54 crc kubenswrapper[4707]: I1127 16:04:54.826491 4707 scope.go:117] "RemoveContainer" containerID="be6a704ee152614d36d4831778b016c009b0aab2d6cf1c2ec9767f8ecdeb7af6" Nov 27 16:04:54 crc kubenswrapper[4707]: I1127 16:04:54.846331 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b00fa8691d479d1db194704ff73bb85f1108c9541dd09ed4c1b02d0a3f0ab53d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:54Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:54 crc kubenswrapper[4707]: I1127 16:04:54.867144 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-js6mm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ca48c08-f39d-41a2-847a-c893a2111492\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:54Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:54Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://be6a704ee152614d36d4831778b016c009b0aab2d6cf1c2ec9767f8ecdeb7af6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://be6a704ee152614d36d4831778b016c009b0aab2d6cf1c2ec9767f8ecdeb7af6\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-27T16:04:54Z\\\",\\\"message\\\":\\\"2025-11-27T16:04:08+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_4a45ad71-8ab3-45c5-802b-cd135105e4f2\\\\n2025-11-27T16:04:08+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_4a45ad71-8ab3-45c5-802b-cd135105e4f2 to /host/opt/cni/bin/\\\\n2025-11-27T16:04:09Z [verbose] multus-daemon started\\\\n2025-11-27T16:04:09Z [verbose] Readiness Indicator file check\\\\n2025-11-27T16:04:54Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-27T16:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bjt58\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\
\":\\\"2025-11-27T16:04:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-js6mm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:54Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:54 crc kubenswrapper[4707]: I1127 16:04:54.892031 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"59e09a52-f64d-41da-a23b-7d3e34e8f109\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://489349489e52f4a113203c0c196cfe1ef96e97c6fc3c3d777a28e90f22aaee92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"
startedAt\\\":\\\"2025-11-27T16:03:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ea14f8eb17dbb350f7c0ad2285b4bb5ca2d8b2aae6840646d68a397ac2043d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://775fc674c0dab97e0c5a2548d467b973c592458dad1b88ce92344486a3a28eb5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/
kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f502a2429578531cb3ebbfde6bb4c049eed201a496ff3772c06ca28cace89b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://30b52b26bdca5b0705186872e6dd2eba3ece6436e866a4a5e30f4c2ecdbbaf20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b6be834802616cdf10f8b694bb0ce1b8828d0885e8801c5d890e
74837d672a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b6be834802616cdf10f8b694bb0ce1b8828d0885e8801c5d890e74837d672a8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:03:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://de1e5852273af33e41f21c4df29c3e32cbfd6b2eb200b64d9e5353a85718074c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://de1e5852273af33e41f21c4df29c3e32cbfd6b2eb200b64d9e5353a85718074c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:03:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:03:47Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://c6f7eb0e4b78b6d2e316bd2c8f2ad3081c8eb0ca3c55a7063ccf281572ba2992\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\
\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c6f7eb0e4b78b6d2e316bd2c8f2ad3081c8eb0ca3c55a7063ccf281572ba2992\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:03:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:03:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:03:45Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:54Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:54 crc kubenswrapper[4707]: I1127 16:04:54.908513 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:54 crc kubenswrapper[4707]: I1127 16:04:54.908565 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:54 crc kubenswrapper[4707]: I1127 16:04:54.908577 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:54 crc kubenswrapper[4707]: I1127 16:04:54.908599 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:54 crc kubenswrapper[4707]: I1127 16:04:54.908616 4707 setters.go:603] "Node became not ready" 
node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:54Z","lastTransitionTime":"2025-11-27T16:04:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:04:54 crc kubenswrapper[4707]: I1127 16:04:54.910178 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f300806f-b97e-4453-b011-19442ca1240d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a25770e0e3eebd7e807f3775101a1cd5d47073e4a86850da475fecd4a9c88ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state
\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7412dcdb5a249551f558a01a8fecef5f873e5e0a16e917d5cbc1ca74a917f228\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://791f332aec1698d02da5ec585ad2fbfcb77529f24abc0e3a77581be38d7ce277\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\
\"cri-o://4a47eb0592819f0bc514399e163c88e2c63111aa0ec4a28f00f809ce1bdf2aa8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f12d0d89698cd6f702c595882dc7ed97fbc585b8a4ef3a8b3755c984a823dd9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-27T16:04:05Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1127 16:03:58.832133 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1127 16:03:58.833970 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3721107874/tls.crt::/tmp/serving-cert-3721107874/tls.key\\\\\\\"\\\\nI1127 16:04:05.094602 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1127 16:04:05.099813 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1127 16:04:05.099850 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1127 16:04:05.099879 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1127 16:04:05.099886 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1127 16:04:05.108199 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1127 16:04:05.108225 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1127 16:04:05.108231 1 secure_serving.go:69] 
Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1127 16:04:05.108269 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1127 16:04:05.108276 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1127 16:04:05.108281 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1127 16:04:05.108284 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1127 16:04:05.108289 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1127 16:04:05.114674 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-27T16:03:48Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf16f5508845d69184573b51f3052f79123556667cbd3ddf4b2adfb135312a3a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:48Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\
":[{\\\"containerID\\\":\\\"cri-o://608ad63f8f5352dad650253df603590a1cc20bc05d86dd561d659295264ab4b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://608ad63f8f5352dad650253df603590a1cc20bc05d86dd561d659295264ab4b5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:03:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:03:45Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:54Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:54 crc kubenswrapper[4707]: I1127 16:04:54.927121 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:54Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:54 crc kubenswrapper[4707]: I1127 16:04:54.940611 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-c995m" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a83beb0d-8dd1-434a-ace2-933f98e3956f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://419d9b9f3f298028cf5f49afd438f1de90b8d1203cdfb09b924878b59e0fb723\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5xtzp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6227d8aa794cce73ef7607c3e6cf8f853c829bf5
6493acf04c0b555ddd264ba5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5xtzp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:04:06Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-c995m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:54Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:54 crc kubenswrapper[4707]: I1127 16:04:54.959128 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7f481770-a9a1-422a-a707-e841a24a1503\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f0e4a5fa520e0dc048bef42e48e14bd06eb72586ca2b8e145734aa5450aa869d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://36f6a5537d5031d21ea157fbcedea40756eac2c44aa1859c0cfbf3dc9a9d9a13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c7144aa332827972c3ff984eb05b835b88c4d9d05552ac631fbca49f13e193a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8211f74a0c0d16257330ebbb5727247d4229848595af3cb65f1cb752b05f3c37\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-11-27T16:03:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:03:45Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:54Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:54 crc kubenswrapper[4707]: I1127 16:04:54.974422 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:54Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:54 crc kubenswrapper[4707]: I1127 16:04:54.985351 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-bhmsc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d1ec526b-6fb0-4c87-bd87-6aaf843e0c78\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f3b764fd994edba08f5e909683bd065ecf09fb7c9906457e37bb91343aad14a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kvs85\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:04:06Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-bhmsc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:54Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:55 crc kubenswrapper[4707]: I1127 16:04:55.007632 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xkmt7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"55af9c67-18ce-46f1-a761-d11ce16f42d6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a4eadca4b464dec02ca7aba6bc23d95b9c3965294d4b0224299279c630f3718\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6xmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3718d2f738ef780c738566c163d52afe355de6b8eb6004aa8e1dd5f37aa84d68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6xmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://44699a190cf1065e537f588d7479a10aab8fd4ffd1df2348c616d999ffdf7077\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6xmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6432d53914cee9e016c2134d99e48fa9ff1c05c0b88033992c2e0c70fc40d2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:07Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6xmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://93825fe5d9cf2d63c3e723b063f417f7e92442b14365852f935a06d316f443fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6xmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ae3fab13b078702f87ca8c586c9592c4554c95ead662d81ca65b50f368b6dcb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6xmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b6a6c3ac88a829633b920185df5c454e35c8447a7b1a9af096ab26e4fcabba5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b6a6c3ac88a829633b920185df5c454e35c8447a7b1a9af096ab26e4fcabba5\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-27T16:04:33Z\\\",\\\"message\\\":\\\"oSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.167:8441: 10.217.4.167:8442: 10.217.4.167:8444:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {62af83f3-e0c8-4632-aaaa-17488566a9d8}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: 
UUIDName:}]\\\\nI1127 16:04:33.229791 6338 ovn.go:134] Ensuring zone local for Pod openshift-machine-config-operator/machine-config-daemon-c995m in node crc\\\\nI1127 16:04:33.233295 6338 obj_retry.go:386] Retry successful for *v1.Pod openshift-machine-config-operator/machine-config-daemon-c995m after 0 failed attempt(s)\\\\nI1127 16:04:33.233305 6338 default_network_controller.go:776] Recording success event on pod openshift-machine-config-operator/machine-config-daemon-c995m\\\\nI1127 16:04:33.229689 6338 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-diagnostics/network-check-source-55646444c4-trplf\\\\nI1127 16:04:33.233329 6338 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-diagnostics/network-check-source-55646444c4-trplf\\\\nF1127 16:04:33.229645 6338 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-27T16:04:32Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-xkmt7_openshift-ovn-kubernetes(55af9c67-18ce-46f1-a761-d11ce16f42d6)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6xmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://db3362aeb65b368d2d9181cc898270fffb60938d0e9d32a862130d06a9f572a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6xmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c327fee553e5575dec76bc4456bc59424c89a9eea360cd4b23b594eff4c67a8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c327fee553e5575dec
76bc4456bc59424c89a9eea360cd4b23b594eff4c67a8d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:04:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6xmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:04:06Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-xkmt7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:55Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:55 crc kubenswrapper[4707]: I1127 16:04:55.012080 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:55 crc kubenswrapper[4707]: I1127 16:04:55.012113 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:55 crc kubenswrapper[4707]: I1127 16:04:55.012120 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:55 crc kubenswrapper[4707]: I1127 16:04:55.012136 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:55 crc kubenswrapper[4707]: I1127 16:04:55.012150 4707 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:55Z","lastTransitionTime":"2025-11-27T16:04:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:04:55 crc kubenswrapper[4707]: I1127 16:04:55.022033 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-5x488" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"433a450e-2371-4893-b21e-19707b40e28f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://12b178a129ce5efbfdb77d023236ed9d674d0aba3c3a2903589aae995a5a753c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kub
e-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52jnc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://35888508a810deea27c61cbf1ec7362be9f61f99a63d4ab2b4c296a9ed47c65f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52jnc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:04:19Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-5x488\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:55Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:55 crc kubenswrapper[4707]: I1127 16:04:55.036029 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a94c3c43-c57d-4580-82d4-22eb0ac9387c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04f66a2313def4600426a075de4f77c75a876b3246344cec810b4cf25ea3505e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"
},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2df8580fadfd8a1a25979050a7dc698d4fc8177b121311817f40c26e73bf9371\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://25dae96b066cc3da8d6fefd13955cb9d1776c3f9fa93afa6be256bb904f523c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fda5b69770eee7245fc07549784cad1211d45ceb0d65276764e36021cde11777\\\",\\\"image\\\":\\\"quay.io/openshift-release-d
ev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fda5b69770eee7245fc07549784cad1211d45ceb0d65276764e36021cde11777\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:03:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:03:46Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:03:45Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:55Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:55 crc kubenswrapper[4707]: I1127 16:04:55.052155 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:05Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:55Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:55 crc kubenswrapper[4707]: I1127 16:04:55.068243 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://57225b64e7671f397832e9ba112febbde78ae4c24b37a8f097d4bdeda8931c0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6960cfa0c6d0dedc432f90bf267ce02fe8264924a43865f79eadd9affbe955f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:55Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:55 crc kubenswrapper[4707]: I1127 16:04:55.085514 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6cf743e8776aa2de4138704d9e163af8baf7b8e5db139c13775673cf32a1ee88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-11-27T16:04:55Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:55 crc kubenswrapper[4707]: I1127 16:04:55.104356 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-9c4xg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37bcfa90-e50b-4a63-96b2-cd8a0ce5bbf7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://66dcf851b2fb739c3dae214852a24405d2c8b4462dde998903b9dfe7c9e61e00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-a
ccess-7hwzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8cd7c5e0c832499cb9ddc7b18c7d260de06df34208f86284ebe77964bfcf1c12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8cd7c5e0c832499cb9ddc7b18c7d260de06df34208f86284ebe77964bfcf1c12\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:04:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hwzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4438fd6ab08cd53dff20617ccadaa81e4bddbd7ced310f190d02613079f96f64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"
started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4438fd6ab08cd53dff20617ccadaa81e4bddbd7ced310f190d02613079f96f64\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:04:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:04:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hwzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19a31a44d1ecfee004f2a705df22a188782cd9f235848ddf5f48bd740833c6a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://19a31a44d1ecfee004f2a705df22a188782cd9f235848ddf5f48bd740833c6a8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:04:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:04:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\
\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hwzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://db8a59276d18141805bc74721d625e2004f2cea1f47efc90b3290c68eba2a35b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://db8a59276d18141805bc74721d625e2004f2cea1f47efc90b3290c68eba2a35b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:04:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:04:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hwzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://21efa1504e98747329f8a9220126ea259edde61bd4d272ea55691e9952950252\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabout
s-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://21efa1504e98747329f8a9220126ea259edde61bd4d272ea55691e9952950252\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:04:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:04:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hwzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://851c05dbd0947990d464d1bd72d62577bdf12517aea808904aa01fc79fd2d7c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://851c05dbd0947990d464d1bd72d62577bdf12517aea808904aa01fc79fd2d7c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:04:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:04:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hwzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:04:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-9c4xg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:55Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:55 crc kubenswrapper[4707]: I1127 16:04:55.114390 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:55 crc kubenswrapper[4707]: I1127 16:04:55.114426 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:55 crc kubenswrapper[4707]: I1127 16:04:55.114435 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:55 crc kubenswrapper[4707]: I1127 16:04:55.114453 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:55 crc kubenswrapper[4707]: I1127 16:04:55.114464 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:55Z","lastTransitionTime":"2025-11-27T16:04:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:04:55 crc kubenswrapper[4707]: I1127 16:04:55.115180 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-pdngz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5a0b58a3-9ac2-4446-b507-88eba42aa060\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://feac173e7b0d6e3cb53bd8d407d3d08e59e099a63702a377b7515283f85703ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes
.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6sq6r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:04:20Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-pdngz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:55Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:55 crc kubenswrapper[4707]: I1127 16:04:55.125706 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-qcl5k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d382481-3c1e-49ed-8e27-265d495aa776\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:21Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:21Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j2krv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j2krv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:04:21Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-qcl5k\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:55Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:55 crc kubenswrapper[4707]: I1127 16:04:55.194932 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 27 16:04:55 crc kubenswrapper[4707]: E1127 16:04:55.195032 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 27 16:04:55 crc kubenswrapper[4707]: I1127 16:04:55.195217 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 27 16:04:55 crc kubenswrapper[4707]: E1127 16:04:55.195267 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 27 16:04:55 crc kubenswrapper[4707]: I1127 16:04:55.195410 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 27 16:04:55 crc kubenswrapper[4707]: E1127 16:04:55.195457 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 27 16:04:55 crc kubenswrapper[4707]: I1127 16:04:55.195566 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qcl5k" Nov 27 16:04:55 crc kubenswrapper[4707]: E1127 16:04:55.195626 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-qcl5k" podUID="7d382481-3c1e-49ed-8e27-265d495aa776" Nov 27 16:04:55 crc kubenswrapper[4707]: I1127 16:04:55.213075 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc"] Nov 27 16:04:55 crc kubenswrapper[4707]: I1127 16:04:55.216872 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:55Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:55 crc kubenswrapper[4707]: I1127 16:04:55.217313 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:55 crc kubenswrapper[4707]: I1127 16:04:55.217357 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:55 crc kubenswrapper[4707]: I1127 16:04:55.217400 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:55 crc kubenswrapper[4707]: I1127 16:04:55.217425 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:55 crc kubenswrapper[4707]: I1127 16:04:55.217448 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:55Z","lastTransitionTime":"2025-11-27T16:04:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:04:55 crc kubenswrapper[4707]: I1127 16:04:55.240002 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-c995m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a83beb0d-8dd1-434a-ace2-933f98e3956f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://419d9b9f3f298028cf5f49afd438f1de90b8d1203cdfb09b924878b59e0fb723\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\
\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5xtzp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6227d8aa794cce73ef7607c3e6cf8f853c829bf56493acf04c0b555ddd264ba5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5xtzp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:04:06Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-c995m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:55Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:55 crc kubenswrapper[4707]: I1127 16:04:55.275401 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"59e09a52-f64d-41da-a23b-7d3e34e8f109\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://489349489e52f4a113203c0c196cfe1ef96e97c6fc3c3d777a28e90f22aaee92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ea14f8eb17dbb350f7c0ad2285b4bb5ca2d8b2aae6840646d68a397ac2043d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://775fc674c0dab97e0c5a2548d467b973c592458dad1b88ce92344486a3a28eb5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f502a2429578531cb3ebbfde6bb4c049eed201a496ff3772c06ca28cace89b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://30b52b26bdca5b0705186872e6dd2eba3ece6436e866a4a5e30f4c2ecdbbaf20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b6be834802616cdf10f8b694bb0ce1b8828d0885e8801c5d890e74837d672a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b6be834802616cdf10f8b694bb0ce1b8828d0885e8801c5d890e74837d672a8\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-11-27T16:03:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://de1e5852273af33e41f21c4df29c3e32cbfd6b2eb200b64d9e5353a85718074c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://de1e5852273af33e41f21c4df29c3e32cbfd6b2eb200b64d9e5353a85718074c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:03:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:03:47Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://c6f7eb0e4b78b6d2e316bd2c8f2ad3081c8eb0ca3c55a7063ccf281572ba2992\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c6f7eb0e4b78b6d2e316bd2c8f2ad3081c8eb0ca3c55a7063ccf281572ba2992\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:03:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:03:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:03:45Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:55Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:55 crc kubenswrapper[4707]: I1127 16:04:55.289967 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f300806f-b97e-4453-b011-19442ca1240d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a25770e0e3eebd7e807f3775101a1cd5d47073e4a86850da475fecd4a9c88ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-rel
ease-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7412dcdb5a249551f558a01a8fecef5f873e5e0a16e917d5cbc1ca74a917f228\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://791f332aec1698d02da5ec585ad2fbfcb77529f24abc0e3a77581be38d7ce277\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:47Z\\\"}},\\\"vo
lumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a47eb0592819f0bc514399e163c88e2c63111aa0ec4a28f00f809ce1bdf2aa8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f12d0d89698cd6f702c595882dc7ed97fbc585b8a4ef3a8b3755c984a823dd9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-27T16:04:05Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1127 16:03:58.832133 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1127 16:03:58.833970 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3721107874/tls.crt::/tmp/serving-cert-3721107874/tls.key\\\\\\\"\\\\nI1127 16:04:05.094602 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1127 16:04:05.099813 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1127 16:04:05.099850 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1127 16:04:05.099879 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1127 16:04:05.099886 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1127 16:04:05.108199 1 
secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1127 16:04:05.108225 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1127 16:04:05.108231 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1127 16:04:05.108269 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1127 16:04:05.108276 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1127 16:04:05.108281 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1127 16:04:05.108284 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1127 16:04:05.108289 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1127 16:04:05.114674 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-27T16:03:48Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf16f5508845d69184573b51f3052f79123556667cbd3ddf4b2adfb135312a3a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:48Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://608ad63f8f5352dad650253df603590a1cc20bc05d86dd561d659295264ab4b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://608ad63f8f5352dad650253df603590a1cc20bc05d86dd561d659295264ab4b5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:03:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:03:45Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:55Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:55 crc kubenswrapper[4707]: I1127 16:04:55.307245 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-bhmsc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d1ec526b-6fb0-4c87-bd87-6aaf843e0c78\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f3b764fd994edba08f5e909683bd065ecf09fb7c9906457e37bb91343aad14a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kvs85\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:04:06Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-bhmsc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:55Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:55 crc kubenswrapper[4707]: I1127 16:04:55.320452 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:55 crc kubenswrapper[4707]: I1127 16:04:55.320517 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:55 crc kubenswrapper[4707]: I1127 16:04:55.320530 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:55 crc kubenswrapper[4707]: I1127 16:04:55.320555 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:55 crc kubenswrapper[4707]: I1127 16:04:55.320570 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:55Z","lastTransitionTime":"2025-11-27T16:04:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:04:55 crc kubenswrapper[4707]: I1127 16:04:55.328990 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7f481770-a9a1-422a-a707-e841a24a1503\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f0e4a5fa520e0dc048bef42e48e14bd06eb72586ca2b8e145734aa5450aa869d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://36f6a5537d5
031d21ea157fbcedea40756eac2c44aa1859c0cfbf3dc9a9d9a13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c7144aa332827972c3ff984eb05b835b88c4d9d05552ac631fbca49f13e193a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8211f74a0c0d16257330ebbb5727247d4229848595af3cb65f1cb752b05f3c37\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:03:45Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:55Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:55 crc kubenswrapper[4707]: I1127 16:04:55.352343 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:05Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:55Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:55 crc kubenswrapper[4707]: I1127 16:04:55.373863 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:05Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:55Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:55 crc kubenswrapper[4707]: I1127 16:04:55.394452 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://57225b64e7671f397832e9ba112febbde78ae4c24b37a8f097d4bdeda8931c0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6960cfa0c6d0dedc432f90bf267ce02fe8264924a43865f79eadd9affbe955f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:55Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:55 crc kubenswrapper[4707]: I1127 16:04:55.415720 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6cf743e8776aa2de4138704d9e163af8baf7b8e5db139c13775673cf32a1ee88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-11-27T16:04:55Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:55 crc kubenswrapper[4707]: I1127 16:04:55.423836 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:55 crc kubenswrapper[4707]: I1127 16:04:55.423902 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:55 crc kubenswrapper[4707]: I1127 16:04:55.423922 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:55 crc kubenswrapper[4707]: I1127 16:04:55.423946 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:55 crc kubenswrapper[4707]: I1127 16:04:55.423964 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:55Z","lastTransitionTime":"2025-11-27T16:04:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:04:55 crc kubenswrapper[4707]: I1127 16:04:55.436529 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-9c4xg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37bcfa90-e50b-4a63-96b2-cd8a0ce5bbf7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://66dcf851b2fb739c3dae214852a24405d2c8b4462dde998903b9dfe7c9e61e00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hwzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8cd7c5e0c832499cb9ddc7b18c7d260de06df34208f86284ebe77964bfcf1c12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8cd7c5e0c832499cb9ddc7b18c7d260de06df34208f86284ebe77964bfcf1c12\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:04:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hwzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4438fd6ab08cd53dff20617ccadaa81e4bddbd7ced310f190d02613079f96f64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"
containerID\\\":\\\"cri-o://4438fd6ab08cd53dff20617ccadaa81e4bddbd7ced310f190d02613079f96f64\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:04:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:04:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hwzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19a31a44d1ecfee004f2a705df22a188782cd9f235848ddf5f48bd740833c6a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://19a31a44d1ecfee004f2a705df22a188782cd9f235848ddf5f48bd740833c6a8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:04:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:04:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hwzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://db8a59276d18141805bc74721d625e2004f2cea1f47efc90b3290c68eba2a35b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://db8a59276d18141805bc74721d625e2004f2cea1f47efc90b3290c68eba2a35b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:04:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:04:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hwzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://21efa1504e98747329f8a9220126ea259edde61bd4d272ea55691e9952950252\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\"
:0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://21efa1504e98747329f8a9220126ea259edde61bd4d272ea55691e9952950252\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:04:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:04:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hwzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://851c05dbd0947990d464d1bd72d62577bdf12517aea808904aa01fc79fd2d7c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://851c05dbd0947990d464d1bd72d62577bdf12517aea808904aa01fc79fd2d7c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:04:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:04:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hwzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:04:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-9c4xg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:55Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:55 crc kubenswrapper[4707]: I1127 16:04:55.460253 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xkmt7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"55af9c67-18ce-46f1-a761-d11ce16f42d6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a4eadca4b464dec02ca7aba6bc23d95b9c3965294d4b0224299279c630f3718\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6xmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3718d2f738ef780c738566c163d52afe355de6b8eb6004aa8e1dd5f37aa84d68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6xmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://44699a190cf1065e537f588d7479a10aab8fd4ffd1df2348c616d999ffdf7077\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6xmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6432d53914cee9e016c2134d99e48fa9ff1c05c0b88033992c2e0c70fc40d2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:07Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6xmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://93825fe5d9cf2d63c3e723b063f417f7e92442b14365852f935a06d316f443fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6xmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ae3fab13b078702f87ca8c586c9592c4554c95ead662d81ca65b50f368b6dcb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6xmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b6a6c3ac88a829633b920185df5c454e35c8447a7b1a9af096ab26e4fcabba5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b6a6c3ac88a829633b920185df5c454e35c8447a7b1a9af096ab26e4fcabba5\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-27T16:04:33Z\\\",\\\"message\\\":\\\"oSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.167:8441: 10.217.4.167:8442: 10.217.4.167:8444:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {62af83f3-e0c8-4632-aaaa-17488566a9d8}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: 
UUIDName:}]\\\\nI1127 16:04:33.229791 6338 ovn.go:134] Ensuring zone local for Pod openshift-machine-config-operator/machine-config-daemon-c995m in node crc\\\\nI1127 16:04:33.233295 6338 obj_retry.go:386] Retry successful for *v1.Pod openshift-machine-config-operator/machine-config-daemon-c995m after 0 failed attempt(s)\\\\nI1127 16:04:33.233305 6338 default_network_controller.go:776] Recording success event on pod openshift-machine-config-operator/machine-config-daemon-c995m\\\\nI1127 16:04:33.229689 6338 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-diagnostics/network-check-source-55646444c4-trplf\\\\nI1127 16:04:33.233329 6338 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-diagnostics/network-check-source-55646444c4-trplf\\\\nF1127 16:04:33.229645 6338 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-27T16:04:32Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-xkmt7_openshift-ovn-kubernetes(55af9c67-18ce-46f1-a761-d11ce16f42d6)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6xmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://db3362aeb65b368d2d9181cc898270fffb60938d0e9d32a862130d06a9f572a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6xmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c327fee553e5575dec76bc4456bc59424c89a9eea360cd4b23b594eff4c67a8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c327fee553e5575dec
76bc4456bc59424c89a9eea360cd4b23b594eff4c67a8d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:04:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6xmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:04:06Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-xkmt7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:55Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:55 crc kubenswrapper[4707]: I1127 16:04:55.475412 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-5x488" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"433a450e-2371-4893-b21e-19707b40e28f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://12b178a129ce5efbfdb77d023236ed9d674d0aba3c3a2903589aae995a5a753c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52jnc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://35888508a810deea27c61cbf1ec7362be9f61
f99a63d4ab2b4c296a9ed47c65f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52jnc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:04:19Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-5x488\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:55Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:55 crc kubenswrapper[4707]: I1127 16:04:55.495110 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a94c3c43-c57d-4580-82d4-22eb0ac9387c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04f66a2313def4600426a075de4f77c75a876b3246344cec810b4cf25ea3505e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2df8580fadfd8a1a25979050a7dc698d4fc8177b121311817f40c26e73bf9371\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://25dae96b066cc3da8d6fefd13955cb9d1776c3f9fa93afa6be256bb904f523c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fda5b69770eee7245fc07549784cad1211d45ceb0d65276764e36021cde11777\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://fda5b69770eee7245fc07549784cad1211d45ceb0d65276764e36021cde11777\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:03:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:03:46Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:03:45Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:55Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:55 crc kubenswrapper[4707]: I1127 16:04:55.510179 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-pdngz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5a0b58a3-9ac2-4446-b507-88eba42aa060\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fea
c173e7b0d6e3cb53bd8d407d3d08e59e099a63702a377b7515283f85703ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6sq6r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:04:20Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-pdngz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:55Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:55 crc kubenswrapper[4707]: I1127 16:04:55.523635 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-qcl5k" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d382481-3c1e-49ed-8e27-265d495aa776\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:21Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:21Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j2krv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j2krv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:04:21Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-qcl5k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:55Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:55 crc 
kubenswrapper[4707]: I1127 16:04:55.526956 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:55 crc kubenswrapper[4707]: I1127 16:04:55.527030 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:55 crc kubenswrapper[4707]: I1127 16:04:55.527054 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:55 crc kubenswrapper[4707]: I1127 16:04:55.527090 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:55 crc kubenswrapper[4707]: I1127 16:04:55.527118 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:55Z","lastTransitionTime":"2025-11-27T16:04:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:04:55 crc kubenswrapper[4707]: I1127 16:04:55.540068 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b00fa8691d479d1db194704ff73bb85f1108c9541dd09ed4c1b02d0a3f0ab53d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:55Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:55 crc kubenswrapper[4707]: I1127 16:04:55.558249 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-js6mm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ca48c08-f39d-41a2-847a-c893a2111492\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:54Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:54Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://be6a704ee152614d36d4831778b016c009b0aab2d6cf1c2ec9767f8ecdeb7af6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://be6a704ee152614d36d4831778b016c009b0aab2d6cf1c2ec9767f8ecdeb7af6\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-27T16:04:54Z\\\",\\\"message\\\":\\\"2025-11-27T16:04:08+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_4a45ad71-8ab3-45c5-802b-cd135105e4f2\\\\n2025-11-27T16:04:08+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_4a45ad71-8ab3-45c5-802b-cd135105e4f2 to /host/opt/cni/bin/\\\\n2025-11-27T16:04:09Z [verbose] multus-daemon started\\\\n2025-11-27T16:04:09Z [verbose] Readiness Indicator file check\\\\n2025-11-27T16:04:54Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-27T16:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bjt58\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\
\":\\\"2025-11-27T16:04:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-js6mm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:55Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:55 crc kubenswrapper[4707]: I1127 16:04:55.629703 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:55 crc kubenswrapper[4707]: I1127 16:04:55.629771 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:55 crc kubenswrapper[4707]: I1127 16:04:55.629781 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:55 crc kubenswrapper[4707]: I1127 16:04:55.629818 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:55 crc kubenswrapper[4707]: I1127 16:04:55.629835 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:55Z","lastTransitionTime":"2025-11-27T16:04:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:04:55 crc kubenswrapper[4707]: I1127 16:04:55.733567 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:55 crc kubenswrapper[4707]: I1127 16:04:55.733622 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:55 crc kubenswrapper[4707]: I1127 16:04:55.733634 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:55 crc kubenswrapper[4707]: I1127 16:04:55.733655 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:55 crc kubenswrapper[4707]: I1127 16:04:55.733667 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:55Z","lastTransitionTime":"2025-11-27T16:04:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:04:55 crc kubenswrapper[4707]: I1127 16:04:55.832914 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-js6mm_9ca48c08-f39d-41a2-847a-c893a2111492/kube-multus/0.log" Nov 27 16:04:55 crc kubenswrapper[4707]: I1127 16:04:55.833014 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-js6mm" event={"ID":"9ca48c08-f39d-41a2-847a-c893a2111492","Type":"ContainerStarted","Data":"094325a9e2d642c065e7c06cf63ba0fcad78c5220368b42504c2e87735d4a542"} Nov 27 16:04:55 crc kubenswrapper[4707]: I1127 16:04:55.835693 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:55 crc kubenswrapper[4707]: I1127 16:04:55.835760 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:55 crc kubenswrapper[4707]: I1127 16:04:55.835780 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:55 crc kubenswrapper[4707]: I1127 16:04:55.835813 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:55 crc kubenswrapper[4707]: I1127 16:04:55.835835 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:55Z","lastTransitionTime":"2025-11-27T16:04:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:04:55 crc kubenswrapper[4707]: I1127 16:04:55.851150 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-c995m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a83beb0d-8dd1-434a-ace2-933f98e3956f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://419d9b9f3f298028cf5f49afd438f1de90b8d1203cdfb09b924878b59e0fb723\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5xtzp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6227d8aa794cce73ef7607c3e6cf8f853c829bf56493acf04c0b555ddd264ba5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5xtzp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:04:06Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-c995m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:55Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:55 crc kubenswrapper[4707]: I1127 16:04:55.879999 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"59e09a52-f64d-41da-a23b-7d3e34e8f109\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://489349489e52f4a113203c0c196cfe1ef96e97c6fc3c3d777a28e90f22aaee92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ea14f8eb17dbb350f7c0ad2285b4bb5ca2d8b2aae6840646d68a397ac2043d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://775fc674c0dab97e0c5a2548d467b973c592458dad1b88ce92344486a3a28eb5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f502a2429578531cb3ebbfde6bb4c049eed201a496ff3772c06ca28cace89b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://30b52b26bdca5b0705186872e6dd2eba3ece6436e866a4a5e30f4c2ecdbbaf20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b6be834802616cdf10f8b694bb0ce1b8828d0885e8801c5d890e74837d672a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b6be834802616cdf10f8b694bb0ce1b8828d0885e8801c5d890e74837d672a8\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-11-27T16:03:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://de1e5852273af33e41f21c4df29c3e32cbfd6b2eb200b64d9e5353a85718074c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://de1e5852273af33e41f21c4df29c3e32cbfd6b2eb200b64d9e5353a85718074c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:03:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:03:47Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://c6f7eb0e4b78b6d2e316bd2c8f2ad3081c8eb0ca3c55a7063ccf281572ba2992\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c6f7eb0e4b78b6d2e316bd2c8f2ad3081c8eb0ca3c55a7063ccf281572ba2992\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:03:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:03:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:03:45Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:55Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:55 crc kubenswrapper[4707]: I1127 16:04:55.903427 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f300806f-b97e-4453-b011-19442ca1240d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a25770e0e3eebd7e807f3775101a1cd5d47073e4a86850da475fecd4a9c88ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-rel
ease-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7412dcdb5a249551f558a01a8fecef5f873e5e0a16e917d5cbc1ca74a917f228\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://791f332aec1698d02da5ec585ad2fbfcb77529f24abc0e3a77581be38d7ce277\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:47Z\\\"}},\\\"vo
lumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a47eb0592819f0bc514399e163c88e2c63111aa0ec4a28f00f809ce1bdf2aa8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f12d0d89698cd6f702c595882dc7ed97fbc585b8a4ef3a8b3755c984a823dd9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-27T16:04:05Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1127 16:03:58.832133 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1127 16:03:58.833970 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3721107874/tls.crt::/tmp/serving-cert-3721107874/tls.key\\\\\\\"\\\\nI1127 16:04:05.094602 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1127 16:04:05.099813 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1127 16:04:05.099850 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1127 16:04:05.099879 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1127 16:04:05.099886 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1127 16:04:05.108199 1 
secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1127 16:04:05.108225 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1127 16:04:05.108231 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1127 16:04:05.108269 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1127 16:04:05.108276 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1127 16:04:05.108281 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1127 16:04:05.108284 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1127 16:04:05.108289 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1127 16:04:05.114674 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-27T16:03:48Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf16f5508845d69184573b51f3052f79123556667cbd3ddf4b2adfb135312a3a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:48Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://608ad63f8f5352dad650253df603590a1cc20bc05d86dd561d659295264ab4b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://608ad63f8f5352dad650253df603590a1cc20bc05d86dd561d659295264ab4b5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:03:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:03:45Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:55Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:55 crc kubenswrapper[4707]: I1127 16:04:55.919137 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:55Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:55 crc kubenswrapper[4707]: I1127 16:04:55.940816 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:55 crc kubenswrapper[4707]: I1127 16:04:55.940782 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7f481770-a9a1-422a-a707-e841a24a1503\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f0e4a5fa520e0dc048bef42e48e14bd06eb72586ca2b8e145734aa5450aa869d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://36f6a5537d5031d21ea157fbcedea40756eac2c44aa1859c0cfbf3dc9a9d9a13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c7144aa332827972c3ff984eb05b835b88c4d9d05552ac631fbca49f13e193a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8211f74a0c0d16257330ebbb5727247d4229848595af3cb65f1cb752b05f3c37\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-11-27T16:03:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:03:45Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:55Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:55 crc kubenswrapper[4707]: I1127 16:04:55.940883 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:55 crc kubenswrapper[4707]: I1127 16:04:55.941039 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:55 crc kubenswrapper[4707]: I1127 16:04:55.941074 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:55 crc kubenswrapper[4707]: I1127 16:04:55.941089 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:55Z","lastTransitionTime":"2025-11-27T16:04:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:04:55 crc kubenswrapper[4707]: I1127 16:04:55.957846 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:05Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:55Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:55 crc kubenswrapper[4707]: I1127 16:04:55.970988 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-bhmsc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d1ec526b-6fb0-4c87-bd87-6aaf843e0c78\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f3b764fd994edba08f5e909683bd065ecf09fb7c9906457e37bb91343aad14a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kvs85\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:04:06Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-bhmsc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:55Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:55 crc kubenswrapper[4707]: I1127 16:04:55.985172 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://57225b64e7671f397832e9ba112febbde78ae4c24b37a8f097d4bdeda8931c0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\
\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6960cfa0c6d0dedc432f90bf267ce02fe8264924a43865f79eadd9affbe955f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:55Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:55 crc kubenswrapper[4707]: I1127 16:04:55.996971 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6cf743e8776aa2de4138704d9e163af8baf7b8e5db139c13775673cf32a1ee88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-11-27T16:04:55Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:56 crc kubenswrapper[4707]: I1127 16:04:56.013458 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-9c4xg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37bcfa90-e50b-4a63-96b2-cd8a0ce5bbf7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://66dcf851b2fb739c3dae214852a24405d2c8b4462dde998903b9dfe7c9e61e00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-a
ccess-7hwzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8cd7c5e0c832499cb9ddc7b18c7d260de06df34208f86284ebe77964bfcf1c12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8cd7c5e0c832499cb9ddc7b18c7d260de06df34208f86284ebe77964bfcf1c12\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:04:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hwzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4438fd6ab08cd53dff20617ccadaa81e4bddbd7ced310f190d02613079f96f64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"
started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4438fd6ab08cd53dff20617ccadaa81e4bddbd7ced310f190d02613079f96f64\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:04:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:04:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hwzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19a31a44d1ecfee004f2a705df22a188782cd9f235848ddf5f48bd740833c6a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://19a31a44d1ecfee004f2a705df22a188782cd9f235848ddf5f48bd740833c6a8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:04:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:04:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\
\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hwzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://db8a59276d18141805bc74721d625e2004f2cea1f47efc90b3290c68eba2a35b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://db8a59276d18141805bc74721d625e2004f2cea1f47efc90b3290c68eba2a35b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:04:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:04:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hwzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://21efa1504e98747329f8a9220126ea259edde61bd4d272ea55691e9952950252\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabout
s-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://21efa1504e98747329f8a9220126ea259edde61bd4d272ea55691e9952950252\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:04:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:04:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hwzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://851c05dbd0947990d464d1bd72d62577bdf12517aea808904aa01fc79fd2d7c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://851c05dbd0947990d464d1bd72d62577bdf12517aea808904aa01fc79fd2d7c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:04:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:04:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hwzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:04:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-9c4xg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:56Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:56 crc kubenswrapper[4707]: I1127 16:04:56.044240 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:56 crc kubenswrapper[4707]: I1127 16:04:56.044307 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:56 crc kubenswrapper[4707]: I1127 16:04:56.044317 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:56 crc kubenswrapper[4707]: I1127 16:04:56.044341 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:56 crc kubenswrapper[4707]: I1127 16:04:56.044354 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:56Z","lastTransitionTime":"2025-11-27T16:04:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:04:56 crc kubenswrapper[4707]: I1127 16:04:56.052823 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xkmt7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"55af9c67-18ce-46f1-a761-d11ce16f42d6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a4eadca4b464dec02ca7aba6bc23d95b9c3965294d4b0224299279c630f3718\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6xmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3718d2f738ef780c738566c163d52afe355de6b8eb6004aa8e1dd5f37aa84d68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6xmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://44699a190cf1065e537f588d7479a10aab8fd4ffd1df2348c616d999ffdf7077\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6xmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6432d53914cee9e016c2134d99e48fa9ff1c05c0b88033992c2e0c70fc40d2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:07Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6xmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://93825fe5d9cf2d63c3e723b063f417f7e92442b14365852f935a06d316f443fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6xmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ae3fab13b078702f87ca8c586c9592c4554c95ead662d81ca65b50f368b6dcb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6xmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b6a6c3ac88a829633b920185df5c454e35c8447a7b1a9af096ab26e4fcabba5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b6a6c3ac88a829633b920185df5c454e35c8447a7b1a9af096ab26e4fcabba5\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-27T16:04:33Z\\\",\\\"message\\\":\\\"oSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.167:8441: 10.217.4.167:8442: 10.217.4.167:8444:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {62af83f3-e0c8-4632-aaaa-17488566a9d8}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: 
UUIDName:}]\\\\nI1127 16:04:33.229791 6338 ovn.go:134] Ensuring zone local for Pod openshift-machine-config-operator/machine-config-daemon-c995m in node crc\\\\nI1127 16:04:33.233295 6338 obj_retry.go:386] Retry successful for *v1.Pod openshift-machine-config-operator/machine-config-daemon-c995m after 0 failed attempt(s)\\\\nI1127 16:04:33.233305 6338 default_network_controller.go:776] Recording success event on pod openshift-machine-config-operator/machine-config-daemon-c995m\\\\nI1127 16:04:33.229689 6338 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-diagnostics/network-check-source-55646444c4-trplf\\\\nI1127 16:04:33.233329 6338 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-diagnostics/network-check-source-55646444c4-trplf\\\\nF1127 16:04:33.229645 6338 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-27T16:04:32Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-xkmt7_openshift-ovn-kubernetes(55af9c67-18ce-46f1-a761-d11ce16f42d6)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6xmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://db3362aeb65b368d2d9181cc898270fffb60938d0e9d32a862130d06a9f572a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6xmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c327fee553e5575dec76bc4456bc59424c89a9eea360cd4b23b594eff4c67a8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c327fee553e5575dec
76bc4456bc59424c89a9eea360cd4b23b594eff4c67a8d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:04:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6xmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:04:06Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-xkmt7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:56Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:56 crc kubenswrapper[4707]: I1127 16:04:56.069679 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-5x488" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"433a450e-2371-4893-b21e-19707b40e28f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://12b178a129ce5efbfdb77d023236ed9d674d0aba3c3a2903589aae995a5a753c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52jnc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://35888508a810deea27c61cbf1ec7362be9f61
f99a63d4ab2b4c296a9ed47c65f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52jnc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:04:19Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-5x488\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:56Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:56 crc kubenswrapper[4707]: I1127 16:04:56.086521 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a94c3c43-c57d-4580-82d4-22eb0ac9387c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04f66a2313def4600426a075de4f77c75a876b3246344cec810b4cf25ea3505e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2df8580fadfd8a1a25979050a7dc698d4fc8177b121311817f40c26e73bf9371\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://25dae96b066cc3da8d6fefd13955cb9d1776c3f9fa93afa6be256bb904f523c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fda5b69770eee7245fc07549784cad1211d45ceb0d65276764e36021cde11777\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://fda5b69770eee7245fc07549784cad1211d45ceb0d65276764e36021cde11777\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:03:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:03:46Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:03:45Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:56Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:56 crc kubenswrapper[4707]: I1127 16:04:56.098841 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"991f0a41-0245-4bfa-afa1-84f5bde15111\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5738c0efc1bf2bd3189a81861d6548f0599bbbef152df871c689378185304b7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://803038378abbaeb0c775de20a85ccb96a9e420f5e0434a41ad1d8cda81bbd217\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://803038378abbaeb0c775de20a85ccb96a9e420f5e0434a41ad1d8cda81bbd217\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:03:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:03:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:56Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:56 crc kubenswrapper[4707]: I1127 16:04:56.113759 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:05Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:56Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:56 crc kubenswrapper[4707]: I1127 16:04:56.124018 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-pdngz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5a0b58a3-9ac2-4446-b507-88eba42aa060\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://feac173e7b0d6e3cb53bd8d407d3d08e59e099a63702a377b7515283f85703ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6sq6r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:04:20Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-pdngz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:56Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:56 crc kubenswrapper[4707]: I1127 16:04:56.134620 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-qcl5k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d382481-3c1e-49ed-8e27-265d495aa776\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:21Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:21Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j2krv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j2krv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:04:21Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-qcl5k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:56Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:56 crc 
kubenswrapper[4707]: I1127 16:04:56.151095 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:56 crc kubenswrapper[4707]: I1127 16:04:56.151447 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:56 crc kubenswrapper[4707]: I1127 16:04:56.151530 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:56 crc kubenswrapper[4707]: I1127 16:04:56.151660 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:56 crc kubenswrapper[4707]: I1127 16:04:56.151338 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b00fa8691d479d1db194704ff73bb85f1108c9541dd09ed4c1b02d0a3f0ab53d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operat
or\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:56Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:56 crc kubenswrapper[4707]: I1127 16:04:56.151749 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:56Z","lastTransitionTime":"2025-11-27T16:04:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:04:56 crc kubenswrapper[4707]: I1127 16:04:56.171725 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-js6mm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ca48c08-f39d-41a2-847a-c893a2111492\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://094325a9e2d642c065e7c06cf63ba0fcad78c5220368b42504c2e87735d4a542\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://be6a704ee152614d36d4831778b016c009b0aab2d6cf1c2ec9767f8ecdeb7af6\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-27T16:04:54Z\\\",\\\"message\\\":\\\"2025-11-27T16:04:08+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_4a45ad71-8ab3-45c5-802b-cd135105e4f2\\\\n2025-11-27T16:04:08+00:00 
[cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_4a45ad71-8ab3-45c5-802b-cd135105e4f2 to /host/opt/cni/bin/\\\\n2025-11-27T16:04:09Z [verbose] multus-daemon started\\\\n2025-11-27T16:04:09Z [verbose] Readiness Indicator file check\\\\n2025-11-27T16:04:54Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-27T16:04:07Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\
\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bjt58\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:04:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-js6mm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:04:56Z is after 2025-08-24T17:21:41Z" Nov 27 16:04:56 crc kubenswrapper[4707]: I1127 16:04:56.255346 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:56 crc kubenswrapper[4707]: I1127 16:04:56.255426 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:56 crc kubenswrapper[4707]: I1127 16:04:56.255439 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:56 crc kubenswrapper[4707]: I1127 16:04:56.255459 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:56 crc kubenswrapper[4707]: I1127 16:04:56.255470 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:56Z","lastTransitionTime":"2025-11-27T16:04:56Z","reason":"KubeletNotReady","message":"container runtime network 
not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:04:56 crc kubenswrapper[4707]: I1127 16:04:56.358077 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:56 crc kubenswrapper[4707]: I1127 16:04:56.358123 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:56 crc kubenswrapper[4707]: I1127 16:04:56.358134 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:56 crc kubenswrapper[4707]: I1127 16:04:56.358156 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:56 crc kubenswrapper[4707]: I1127 16:04:56.358169 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:56Z","lastTransitionTime":"2025-11-27T16:04:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:04:56 crc kubenswrapper[4707]: I1127 16:04:56.461203 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:56 crc kubenswrapper[4707]: I1127 16:04:56.461264 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:56 crc kubenswrapper[4707]: I1127 16:04:56.461276 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:56 crc kubenswrapper[4707]: I1127 16:04:56.461297 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:56 crc kubenswrapper[4707]: I1127 16:04:56.461314 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:56Z","lastTransitionTime":"2025-11-27T16:04:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:04:56 crc kubenswrapper[4707]: I1127 16:04:56.564670 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:56 crc kubenswrapper[4707]: I1127 16:04:56.565364 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:56 crc kubenswrapper[4707]: I1127 16:04:56.565821 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:56 crc kubenswrapper[4707]: I1127 16:04:56.566104 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:56 crc kubenswrapper[4707]: I1127 16:04:56.566311 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:56Z","lastTransitionTime":"2025-11-27T16:04:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:04:56 crc kubenswrapper[4707]: I1127 16:04:56.670176 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:56 crc kubenswrapper[4707]: I1127 16:04:56.670251 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:56 crc kubenswrapper[4707]: I1127 16:04:56.670264 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:56 crc kubenswrapper[4707]: I1127 16:04:56.670286 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:56 crc kubenswrapper[4707]: I1127 16:04:56.670301 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:56Z","lastTransitionTime":"2025-11-27T16:04:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:04:56 crc kubenswrapper[4707]: I1127 16:04:56.773306 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:56 crc kubenswrapper[4707]: I1127 16:04:56.773606 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:56 crc kubenswrapper[4707]: I1127 16:04:56.773627 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:56 crc kubenswrapper[4707]: I1127 16:04:56.773659 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:56 crc kubenswrapper[4707]: I1127 16:04:56.773679 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:56Z","lastTransitionTime":"2025-11-27T16:04:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:04:56 crc kubenswrapper[4707]: I1127 16:04:56.876802 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:56 crc kubenswrapper[4707]: I1127 16:04:56.877425 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:56 crc kubenswrapper[4707]: I1127 16:04:56.877528 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:56 crc kubenswrapper[4707]: I1127 16:04:56.877644 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:56 crc kubenswrapper[4707]: I1127 16:04:56.878013 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:56Z","lastTransitionTime":"2025-11-27T16:04:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:04:56 crc kubenswrapper[4707]: I1127 16:04:56.981065 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:56 crc kubenswrapper[4707]: I1127 16:04:56.981143 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:56 crc kubenswrapper[4707]: I1127 16:04:56.981158 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:56 crc kubenswrapper[4707]: I1127 16:04:56.981181 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:56 crc kubenswrapper[4707]: I1127 16:04:56.981197 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:56Z","lastTransitionTime":"2025-11-27T16:04:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:04:57 crc kubenswrapper[4707]: I1127 16:04:57.084243 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:57 crc kubenswrapper[4707]: I1127 16:04:57.084308 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:57 crc kubenswrapper[4707]: I1127 16:04:57.084324 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:57 crc kubenswrapper[4707]: I1127 16:04:57.084356 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:57 crc kubenswrapper[4707]: I1127 16:04:57.084389 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:57Z","lastTransitionTime":"2025-11-27T16:04:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:04:57 crc kubenswrapper[4707]: I1127 16:04:57.188175 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:57 crc kubenswrapper[4707]: I1127 16:04:57.188211 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:57 crc kubenswrapper[4707]: I1127 16:04:57.188221 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:57 crc kubenswrapper[4707]: I1127 16:04:57.188238 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:57 crc kubenswrapper[4707]: I1127 16:04:57.188247 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:57Z","lastTransitionTime":"2025-11-27T16:04:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:04:57 crc kubenswrapper[4707]: I1127 16:04:57.194613 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 27 16:04:57 crc kubenswrapper[4707]: I1127 16:04:57.194672 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 27 16:04:57 crc kubenswrapper[4707]: E1127 16:04:57.194746 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 27 16:04:57 crc kubenswrapper[4707]: E1127 16:04:57.194882 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 27 16:04:57 crc kubenswrapper[4707]: I1127 16:04:57.195042 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qcl5k" Nov 27 16:04:57 crc kubenswrapper[4707]: I1127 16:04:57.195064 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 27 16:04:57 crc kubenswrapper[4707]: E1127 16:04:57.195431 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qcl5k" podUID="7d382481-3c1e-49ed-8e27-265d495aa776" Nov 27 16:04:57 crc kubenswrapper[4707]: E1127 16:04:57.195650 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 27 16:04:57 crc kubenswrapper[4707]: I1127 16:04:57.290642 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:57 crc kubenswrapper[4707]: I1127 16:04:57.290681 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:57 crc kubenswrapper[4707]: I1127 16:04:57.290695 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:57 crc kubenswrapper[4707]: I1127 16:04:57.290712 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:57 crc kubenswrapper[4707]: I1127 16:04:57.290724 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:57Z","lastTransitionTime":"2025-11-27T16:04:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:04:57 crc kubenswrapper[4707]: I1127 16:04:57.394509 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:57 crc kubenswrapper[4707]: I1127 16:04:57.394556 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:57 crc kubenswrapper[4707]: I1127 16:04:57.394566 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:57 crc kubenswrapper[4707]: I1127 16:04:57.394582 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:57 crc kubenswrapper[4707]: I1127 16:04:57.394592 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:57Z","lastTransitionTime":"2025-11-27T16:04:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:04:57 crc kubenswrapper[4707]: I1127 16:04:57.497801 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:57 crc kubenswrapper[4707]: I1127 16:04:57.497870 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:57 crc kubenswrapper[4707]: I1127 16:04:57.497889 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:57 crc kubenswrapper[4707]: I1127 16:04:57.497917 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:57 crc kubenswrapper[4707]: I1127 16:04:57.497936 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:57Z","lastTransitionTime":"2025-11-27T16:04:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:04:57 crc kubenswrapper[4707]: I1127 16:04:57.601522 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:57 crc kubenswrapper[4707]: I1127 16:04:57.601592 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:57 crc kubenswrapper[4707]: I1127 16:04:57.601611 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:57 crc kubenswrapper[4707]: I1127 16:04:57.601638 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:57 crc kubenswrapper[4707]: I1127 16:04:57.601658 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:57Z","lastTransitionTime":"2025-11-27T16:04:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:04:57 crc kubenswrapper[4707]: I1127 16:04:57.706001 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:57 crc kubenswrapper[4707]: I1127 16:04:57.706066 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:57 crc kubenswrapper[4707]: I1127 16:04:57.706093 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:57 crc kubenswrapper[4707]: I1127 16:04:57.706123 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:57 crc kubenswrapper[4707]: I1127 16:04:57.706147 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:57Z","lastTransitionTime":"2025-11-27T16:04:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:04:57 crc kubenswrapper[4707]: I1127 16:04:57.809150 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:57 crc kubenswrapper[4707]: I1127 16:04:57.809271 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:57 crc kubenswrapper[4707]: I1127 16:04:57.809332 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:57 crc kubenswrapper[4707]: I1127 16:04:57.809365 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:57 crc kubenswrapper[4707]: I1127 16:04:57.809544 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:57Z","lastTransitionTime":"2025-11-27T16:04:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:04:57 crc kubenswrapper[4707]: I1127 16:04:57.913082 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:57 crc kubenswrapper[4707]: I1127 16:04:57.913144 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:57 crc kubenswrapper[4707]: I1127 16:04:57.913165 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:57 crc kubenswrapper[4707]: I1127 16:04:57.913196 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:57 crc kubenswrapper[4707]: I1127 16:04:57.913217 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:57Z","lastTransitionTime":"2025-11-27T16:04:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:04:58 crc kubenswrapper[4707]: I1127 16:04:58.016228 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:58 crc kubenswrapper[4707]: I1127 16:04:58.016275 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:58 crc kubenswrapper[4707]: I1127 16:04:58.016291 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:58 crc kubenswrapper[4707]: I1127 16:04:58.016312 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:58 crc kubenswrapper[4707]: I1127 16:04:58.016330 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:58Z","lastTransitionTime":"2025-11-27T16:04:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:04:58 crc kubenswrapper[4707]: I1127 16:04:58.119455 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:58 crc kubenswrapper[4707]: I1127 16:04:58.119526 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:58 crc kubenswrapper[4707]: I1127 16:04:58.119541 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:58 crc kubenswrapper[4707]: I1127 16:04:58.119563 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:58 crc kubenswrapper[4707]: I1127 16:04:58.119580 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:58Z","lastTransitionTime":"2025-11-27T16:04:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:04:58 crc kubenswrapper[4707]: I1127 16:04:58.222718 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:58 crc kubenswrapper[4707]: I1127 16:04:58.222819 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:58 crc kubenswrapper[4707]: I1127 16:04:58.222839 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:58 crc kubenswrapper[4707]: I1127 16:04:58.222865 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:58 crc kubenswrapper[4707]: I1127 16:04:58.222883 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:58Z","lastTransitionTime":"2025-11-27T16:04:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:04:58 crc kubenswrapper[4707]: I1127 16:04:58.325732 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:58 crc kubenswrapper[4707]: I1127 16:04:58.325828 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:58 crc kubenswrapper[4707]: I1127 16:04:58.325849 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:58 crc kubenswrapper[4707]: I1127 16:04:58.325881 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:58 crc kubenswrapper[4707]: I1127 16:04:58.325901 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:58Z","lastTransitionTime":"2025-11-27T16:04:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:04:58 crc kubenswrapper[4707]: I1127 16:04:58.431213 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:58 crc kubenswrapper[4707]: I1127 16:04:58.431308 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:58 crc kubenswrapper[4707]: I1127 16:04:58.431334 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:58 crc kubenswrapper[4707]: I1127 16:04:58.431413 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:58 crc kubenswrapper[4707]: I1127 16:04:58.431447 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:58Z","lastTransitionTime":"2025-11-27T16:04:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:04:58 crc kubenswrapper[4707]: I1127 16:04:58.544000 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:58 crc kubenswrapper[4707]: I1127 16:04:58.544069 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:58 crc kubenswrapper[4707]: I1127 16:04:58.544089 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:58 crc kubenswrapper[4707]: I1127 16:04:58.544115 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:58 crc kubenswrapper[4707]: I1127 16:04:58.544135 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:58Z","lastTransitionTime":"2025-11-27T16:04:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:04:58 crc kubenswrapper[4707]: I1127 16:04:58.647556 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:58 crc kubenswrapper[4707]: I1127 16:04:58.647619 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:58 crc kubenswrapper[4707]: I1127 16:04:58.647644 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:58 crc kubenswrapper[4707]: I1127 16:04:58.647676 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:58 crc kubenswrapper[4707]: I1127 16:04:58.647702 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:58Z","lastTransitionTime":"2025-11-27T16:04:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:04:58 crc kubenswrapper[4707]: I1127 16:04:58.750504 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:58 crc kubenswrapper[4707]: I1127 16:04:58.750537 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:58 crc kubenswrapper[4707]: I1127 16:04:58.750547 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:58 crc kubenswrapper[4707]: I1127 16:04:58.750561 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:58 crc kubenswrapper[4707]: I1127 16:04:58.750571 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:58Z","lastTransitionTime":"2025-11-27T16:04:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:04:58 crc kubenswrapper[4707]: I1127 16:04:58.853091 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:58 crc kubenswrapper[4707]: I1127 16:04:58.853134 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:58 crc kubenswrapper[4707]: I1127 16:04:58.853143 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:58 crc kubenswrapper[4707]: I1127 16:04:58.853162 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:58 crc kubenswrapper[4707]: I1127 16:04:58.853173 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:58Z","lastTransitionTime":"2025-11-27T16:04:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:04:58 crc kubenswrapper[4707]: I1127 16:04:58.955755 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:58 crc kubenswrapper[4707]: I1127 16:04:58.956526 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:58 crc kubenswrapper[4707]: I1127 16:04:58.956605 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:58 crc kubenswrapper[4707]: I1127 16:04:58.956717 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:58 crc kubenswrapper[4707]: I1127 16:04:58.956810 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:58Z","lastTransitionTime":"2025-11-27T16:04:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:04:59 crc kubenswrapper[4707]: I1127 16:04:59.060146 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:59 crc kubenswrapper[4707]: I1127 16:04:59.060191 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:59 crc kubenswrapper[4707]: I1127 16:04:59.060208 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:59 crc kubenswrapper[4707]: I1127 16:04:59.060230 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:59 crc kubenswrapper[4707]: I1127 16:04:59.060246 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:59Z","lastTransitionTime":"2025-11-27T16:04:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:04:59 crc kubenswrapper[4707]: I1127 16:04:59.163156 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:59 crc kubenswrapper[4707]: I1127 16:04:59.163200 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:59 crc kubenswrapper[4707]: I1127 16:04:59.163216 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:59 crc kubenswrapper[4707]: I1127 16:04:59.163239 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:59 crc kubenswrapper[4707]: I1127 16:04:59.163258 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:59Z","lastTransitionTime":"2025-11-27T16:04:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:04:59 crc kubenswrapper[4707]: I1127 16:04:59.198450 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 27 16:04:59 crc kubenswrapper[4707]: E1127 16:04:59.198601 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 27 16:04:59 crc kubenswrapper[4707]: I1127 16:04:59.198852 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 27 16:04:59 crc kubenswrapper[4707]: E1127 16:04:59.198940 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 27 16:04:59 crc kubenswrapper[4707]: I1127 16:04:59.198994 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qcl5k" Nov 27 16:04:59 crc kubenswrapper[4707]: I1127 16:04:59.199001 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 27 16:04:59 crc kubenswrapper[4707]: E1127 16:04:59.199163 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qcl5k" podUID="7d382481-3c1e-49ed-8e27-265d495aa776" Nov 27 16:04:59 crc kubenswrapper[4707]: E1127 16:04:59.199244 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 27 16:04:59 crc kubenswrapper[4707]: I1127 16:04:59.266215 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:59 crc kubenswrapper[4707]: I1127 16:04:59.266266 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:59 crc kubenswrapper[4707]: I1127 16:04:59.266286 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:59 crc kubenswrapper[4707]: I1127 16:04:59.266313 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:59 crc kubenswrapper[4707]: I1127 16:04:59.266330 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:59Z","lastTransitionTime":"2025-11-27T16:04:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:04:59 crc kubenswrapper[4707]: I1127 16:04:59.370547 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:59 crc kubenswrapper[4707]: I1127 16:04:59.370618 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:59 crc kubenswrapper[4707]: I1127 16:04:59.370640 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:59 crc kubenswrapper[4707]: I1127 16:04:59.370669 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:59 crc kubenswrapper[4707]: I1127 16:04:59.370690 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:59Z","lastTransitionTime":"2025-11-27T16:04:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:04:59 crc kubenswrapper[4707]: I1127 16:04:59.474201 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:59 crc kubenswrapper[4707]: I1127 16:04:59.474255 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:59 crc kubenswrapper[4707]: I1127 16:04:59.474271 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:59 crc kubenswrapper[4707]: I1127 16:04:59.474291 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:59 crc kubenswrapper[4707]: I1127 16:04:59.474302 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:59Z","lastTransitionTime":"2025-11-27T16:04:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:04:59 crc kubenswrapper[4707]: I1127 16:04:59.578240 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:59 crc kubenswrapper[4707]: I1127 16:04:59.578274 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:59 crc kubenswrapper[4707]: I1127 16:04:59.578283 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:59 crc kubenswrapper[4707]: I1127 16:04:59.578297 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:59 crc kubenswrapper[4707]: I1127 16:04:59.578304 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:59Z","lastTransitionTime":"2025-11-27T16:04:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:04:59 crc kubenswrapper[4707]: I1127 16:04:59.681597 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:59 crc kubenswrapper[4707]: I1127 16:04:59.681651 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:59 crc kubenswrapper[4707]: I1127 16:04:59.681672 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:59 crc kubenswrapper[4707]: I1127 16:04:59.681696 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:59 crc kubenswrapper[4707]: I1127 16:04:59.681713 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:59Z","lastTransitionTime":"2025-11-27T16:04:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:04:59 crc kubenswrapper[4707]: I1127 16:04:59.785593 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:59 crc kubenswrapper[4707]: I1127 16:04:59.785671 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:59 crc kubenswrapper[4707]: I1127 16:04:59.785690 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:59 crc kubenswrapper[4707]: I1127 16:04:59.786542 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:59 crc kubenswrapper[4707]: I1127 16:04:59.786596 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:59Z","lastTransitionTime":"2025-11-27T16:04:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:04:59 crc kubenswrapper[4707]: I1127 16:04:59.889091 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:59 crc kubenswrapper[4707]: I1127 16:04:59.889128 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:59 crc kubenswrapper[4707]: I1127 16:04:59.889139 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:59 crc kubenswrapper[4707]: I1127 16:04:59.889156 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:59 crc kubenswrapper[4707]: I1127 16:04:59.889169 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:59Z","lastTransitionTime":"2025-11-27T16:04:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:04:59 crc kubenswrapper[4707]: I1127 16:04:59.991410 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:04:59 crc kubenswrapper[4707]: I1127 16:04:59.991447 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:04:59 crc kubenswrapper[4707]: I1127 16:04:59.991458 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:04:59 crc kubenswrapper[4707]: I1127 16:04:59.991475 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:04:59 crc kubenswrapper[4707]: I1127 16:04:59.991486 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:04:59Z","lastTransitionTime":"2025-11-27T16:04:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:05:00 crc kubenswrapper[4707]: I1127 16:05:00.094346 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:05:00 crc kubenswrapper[4707]: I1127 16:05:00.094410 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:05:00 crc kubenswrapper[4707]: I1127 16:05:00.094422 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:05:00 crc kubenswrapper[4707]: I1127 16:05:00.094439 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:05:00 crc kubenswrapper[4707]: I1127 16:05:00.094451 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:05:00Z","lastTransitionTime":"2025-11-27T16:05:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:05:00 crc kubenswrapper[4707]: I1127 16:05:00.198225 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:05:00 crc kubenswrapper[4707]: I1127 16:05:00.198291 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:05:00 crc kubenswrapper[4707]: I1127 16:05:00.198316 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:05:00 crc kubenswrapper[4707]: I1127 16:05:00.198349 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:05:00 crc kubenswrapper[4707]: I1127 16:05:00.198417 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:05:00Z","lastTransitionTime":"2025-11-27T16:05:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:05:00 crc kubenswrapper[4707]: I1127 16:05:00.300637 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:05:00 crc kubenswrapper[4707]: I1127 16:05:00.300702 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:05:00 crc kubenswrapper[4707]: I1127 16:05:00.300739 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:05:00 crc kubenswrapper[4707]: I1127 16:05:00.300775 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:05:00 crc kubenswrapper[4707]: I1127 16:05:00.300798 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:05:00Z","lastTransitionTime":"2025-11-27T16:05:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:05:00 crc kubenswrapper[4707]: I1127 16:05:00.404497 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:05:00 crc kubenswrapper[4707]: I1127 16:05:00.404569 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:05:00 crc kubenswrapper[4707]: I1127 16:05:00.404580 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:05:00 crc kubenswrapper[4707]: I1127 16:05:00.404606 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:05:00 crc kubenswrapper[4707]: I1127 16:05:00.404620 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:05:00Z","lastTransitionTime":"2025-11-27T16:05:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:05:00 crc kubenswrapper[4707]: I1127 16:05:00.507927 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:05:00 crc kubenswrapper[4707]: I1127 16:05:00.507991 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:05:00 crc kubenswrapper[4707]: I1127 16:05:00.508009 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:05:00 crc kubenswrapper[4707]: I1127 16:05:00.508040 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:05:00 crc kubenswrapper[4707]: I1127 16:05:00.508058 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:05:00Z","lastTransitionTime":"2025-11-27T16:05:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:05:00 crc kubenswrapper[4707]: I1127 16:05:00.611061 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:05:00 crc kubenswrapper[4707]: I1127 16:05:00.611129 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:05:00 crc kubenswrapper[4707]: I1127 16:05:00.611145 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:05:00 crc kubenswrapper[4707]: I1127 16:05:00.611175 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:05:00 crc kubenswrapper[4707]: I1127 16:05:00.611196 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:05:00Z","lastTransitionTime":"2025-11-27T16:05:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:05:00 crc kubenswrapper[4707]: I1127 16:05:00.731531 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:05:00 crc kubenswrapper[4707]: I1127 16:05:00.731615 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:05:00 crc kubenswrapper[4707]: I1127 16:05:00.731636 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:05:00 crc kubenswrapper[4707]: I1127 16:05:00.731666 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:05:00 crc kubenswrapper[4707]: I1127 16:05:00.732527 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:05:00Z","lastTransitionTime":"2025-11-27T16:05:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:05:00 crc kubenswrapper[4707]: I1127 16:05:00.836214 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:05:00 crc kubenswrapper[4707]: I1127 16:05:00.836258 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:05:00 crc kubenswrapper[4707]: I1127 16:05:00.836267 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:05:00 crc kubenswrapper[4707]: I1127 16:05:00.836285 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:05:00 crc kubenswrapper[4707]: I1127 16:05:00.836297 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:05:00Z","lastTransitionTime":"2025-11-27T16:05:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:05:00 crc kubenswrapper[4707]: I1127 16:05:00.939086 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:05:00 crc kubenswrapper[4707]: I1127 16:05:00.939475 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:05:00 crc kubenswrapper[4707]: I1127 16:05:00.939731 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:05:00 crc kubenswrapper[4707]: I1127 16:05:00.940115 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:05:00 crc kubenswrapper[4707]: I1127 16:05:00.940274 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:05:00Z","lastTransitionTime":"2025-11-27T16:05:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:05:01 crc kubenswrapper[4707]: I1127 16:05:01.043969 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:05:01 crc kubenswrapper[4707]: I1127 16:05:01.045498 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:05:01 crc kubenswrapper[4707]: I1127 16:05:01.045685 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:05:01 crc kubenswrapper[4707]: I1127 16:05:01.045886 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:05:01 crc kubenswrapper[4707]: I1127 16:05:01.046050 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:05:01Z","lastTransitionTime":"2025-11-27T16:05:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:05:01 crc kubenswrapper[4707]: I1127 16:05:01.149945 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:05:01 crc kubenswrapper[4707]: I1127 16:05:01.150023 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:05:01 crc kubenswrapper[4707]: I1127 16:05:01.150047 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:05:01 crc kubenswrapper[4707]: I1127 16:05:01.150079 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:05:01 crc kubenswrapper[4707]: I1127 16:05:01.150102 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:05:01Z","lastTransitionTime":"2025-11-27T16:05:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:05:01 crc kubenswrapper[4707]: I1127 16:05:01.195050 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 27 16:05:01 crc kubenswrapper[4707]: I1127 16:05:01.195116 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 27 16:05:01 crc kubenswrapper[4707]: I1127 16:05:01.195071 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qcl5k" Nov 27 16:05:01 crc kubenswrapper[4707]: I1127 16:05:01.195223 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 27 16:05:01 crc kubenswrapper[4707]: E1127 16:05:01.195244 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 27 16:05:01 crc kubenswrapper[4707]: E1127 16:05:01.195349 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 27 16:05:01 crc kubenswrapper[4707]: E1127 16:05:01.195613 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qcl5k" podUID="7d382481-3c1e-49ed-8e27-265d495aa776" Nov 27 16:05:01 crc kubenswrapper[4707]: E1127 16:05:01.195684 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 27 16:05:01 crc kubenswrapper[4707]: I1127 16:05:01.253316 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:05:01 crc kubenswrapper[4707]: I1127 16:05:01.253417 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:05:01 crc kubenswrapper[4707]: I1127 16:05:01.253438 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:05:01 crc kubenswrapper[4707]: I1127 16:05:01.253468 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:05:01 crc kubenswrapper[4707]: I1127 16:05:01.253487 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:05:01Z","lastTransitionTime":"2025-11-27T16:05:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:05:01 crc kubenswrapper[4707]: I1127 16:05:01.356867 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:05:01 crc kubenswrapper[4707]: I1127 16:05:01.356935 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:05:01 crc kubenswrapper[4707]: I1127 16:05:01.356947 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:05:01 crc kubenswrapper[4707]: I1127 16:05:01.356970 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:05:01 crc kubenswrapper[4707]: I1127 16:05:01.356985 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:05:01Z","lastTransitionTime":"2025-11-27T16:05:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:05:01 crc kubenswrapper[4707]: I1127 16:05:01.459788 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:05:01 crc kubenswrapper[4707]: I1127 16:05:01.459928 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:05:01 crc kubenswrapper[4707]: I1127 16:05:01.460001 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:05:01 crc kubenswrapper[4707]: I1127 16:05:01.460027 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:05:01 crc kubenswrapper[4707]: I1127 16:05:01.460083 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:05:01Z","lastTransitionTime":"2025-11-27T16:05:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:05:01 crc kubenswrapper[4707]: I1127 16:05:01.563946 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:05:01 crc kubenswrapper[4707]: I1127 16:05:01.563998 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:05:01 crc kubenswrapper[4707]: I1127 16:05:01.564011 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:05:01 crc kubenswrapper[4707]: I1127 16:05:01.564038 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:05:01 crc kubenswrapper[4707]: I1127 16:05:01.564051 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:05:01Z","lastTransitionTime":"2025-11-27T16:05:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:05:01 crc kubenswrapper[4707]: I1127 16:05:01.668045 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:05:01 crc kubenswrapper[4707]: I1127 16:05:01.668110 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:05:01 crc kubenswrapper[4707]: I1127 16:05:01.668123 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:05:01 crc kubenswrapper[4707]: I1127 16:05:01.668148 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:05:01 crc kubenswrapper[4707]: I1127 16:05:01.668163 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:05:01Z","lastTransitionTime":"2025-11-27T16:05:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:05:01 crc kubenswrapper[4707]: I1127 16:05:01.771652 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:05:01 crc kubenswrapper[4707]: I1127 16:05:01.771719 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:05:01 crc kubenswrapper[4707]: I1127 16:05:01.771734 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:05:01 crc kubenswrapper[4707]: I1127 16:05:01.771753 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:05:01 crc kubenswrapper[4707]: I1127 16:05:01.771766 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:05:01Z","lastTransitionTime":"2025-11-27T16:05:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:05:01 crc kubenswrapper[4707]: I1127 16:05:01.874510 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:05:01 crc kubenswrapper[4707]: I1127 16:05:01.874590 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:05:01 crc kubenswrapper[4707]: I1127 16:05:01.874608 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:05:01 crc kubenswrapper[4707]: I1127 16:05:01.874631 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:05:01 crc kubenswrapper[4707]: I1127 16:05:01.874646 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:05:01Z","lastTransitionTime":"2025-11-27T16:05:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:05:01 crc kubenswrapper[4707]: I1127 16:05:01.978048 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:05:01 crc kubenswrapper[4707]: I1127 16:05:01.978609 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:05:01 crc kubenswrapper[4707]: I1127 16:05:01.978797 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:05:01 crc kubenswrapper[4707]: I1127 16:05:01.978954 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:05:01 crc kubenswrapper[4707]: I1127 16:05:01.979118 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:05:01Z","lastTransitionTime":"2025-11-27T16:05:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:05:02 crc kubenswrapper[4707]: I1127 16:05:02.083000 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:05:02 crc kubenswrapper[4707]: I1127 16:05:02.083669 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:05:02 crc kubenswrapper[4707]: I1127 16:05:02.083713 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:05:02 crc kubenswrapper[4707]: I1127 16:05:02.083749 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:05:02 crc kubenswrapper[4707]: I1127 16:05:02.083782 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:05:02Z","lastTransitionTime":"2025-11-27T16:05:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:05:02 crc kubenswrapper[4707]: I1127 16:05:02.187272 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:05:02 crc kubenswrapper[4707]: I1127 16:05:02.187334 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:05:02 crc kubenswrapper[4707]: I1127 16:05:02.187346 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:05:02 crc kubenswrapper[4707]: I1127 16:05:02.187385 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:05:02 crc kubenswrapper[4707]: I1127 16:05:02.187397 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:05:02Z","lastTransitionTime":"2025-11-27T16:05:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:05:02 crc kubenswrapper[4707]: I1127 16:05:02.290196 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:05:02 crc kubenswrapper[4707]: I1127 16:05:02.290260 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:05:02 crc kubenswrapper[4707]: I1127 16:05:02.290278 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:05:02 crc kubenswrapper[4707]: I1127 16:05:02.290305 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:05:02 crc kubenswrapper[4707]: I1127 16:05:02.290326 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:05:02Z","lastTransitionTime":"2025-11-27T16:05:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:05:02 crc kubenswrapper[4707]: I1127 16:05:02.393901 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:05:02 crc kubenswrapper[4707]: I1127 16:05:02.393982 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:05:02 crc kubenswrapper[4707]: I1127 16:05:02.394003 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:05:02 crc kubenswrapper[4707]: I1127 16:05:02.394034 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:05:02 crc kubenswrapper[4707]: I1127 16:05:02.394054 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:05:02Z","lastTransitionTime":"2025-11-27T16:05:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:05:02 crc kubenswrapper[4707]: I1127 16:05:02.497479 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:05:02 crc kubenswrapper[4707]: I1127 16:05:02.497554 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:05:02 crc kubenswrapper[4707]: I1127 16:05:02.497579 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:05:02 crc kubenswrapper[4707]: I1127 16:05:02.497608 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:05:02 crc kubenswrapper[4707]: I1127 16:05:02.497632 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:05:02Z","lastTransitionTime":"2025-11-27T16:05:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:05:02 crc kubenswrapper[4707]: I1127 16:05:02.600233 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:05:02 crc kubenswrapper[4707]: I1127 16:05:02.600292 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:05:02 crc kubenswrapper[4707]: I1127 16:05:02.600308 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:05:02 crc kubenswrapper[4707]: I1127 16:05:02.600331 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:05:02 crc kubenswrapper[4707]: I1127 16:05:02.600351 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:05:02Z","lastTransitionTime":"2025-11-27T16:05:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:05:02 crc kubenswrapper[4707]: I1127 16:05:02.703896 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:05:02 crc kubenswrapper[4707]: I1127 16:05:02.703969 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:05:02 crc kubenswrapper[4707]: I1127 16:05:02.703993 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:05:02 crc kubenswrapper[4707]: I1127 16:05:02.704023 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:05:02 crc kubenswrapper[4707]: I1127 16:05:02.704044 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:05:02Z","lastTransitionTime":"2025-11-27T16:05:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:05:02 crc kubenswrapper[4707]: I1127 16:05:02.807319 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:05:02 crc kubenswrapper[4707]: I1127 16:05:02.807427 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:05:02 crc kubenswrapper[4707]: I1127 16:05:02.807462 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:05:02 crc kubenswrapper[4707]: I1127 16:05:02.807493 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:05:02 crc kubenswrapper[4707]: I1127 16:05:02.807517 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:05:02Z","lastTransitionTime":"2025-11-27T16:05:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:05:02 crc kubenswrapper[4707]: I1127 16:05:02.911080 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:05:02 crc kubenswrapper[4707]: I1127 16:05:02.911143 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:05:02 crc kubenswrapper[4707]: I1127 16:05:02.911161 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:05:02 crc kubenswrapper[4707]: I1127 16:05:02.911191 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:05:02 crc kubenswrapper[4707]: I1127 16:05:02.911211 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:05:02Z","lastTransitionTime":"2025-11-27T16:05:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:05:03 crc kubenswrapper[4707]: I1127 16:05:03.015211 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:05:03 crc kubenswrapper[4707]: I1127 16:05:03.015298 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:05:03 crc kubenswrapper[4707]: I1127 16:05:03.015320 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:05:03 crc kubenswrapper[4707]: I1127 16:05:03.015350 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:05:03 crc kubenswrapper[4707]: I1127 16:05:03.015401 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:05:03Z","lastTransitionTime":"2025-11-27T16:05:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:05:03 crc kubenswrapper[4707]: I1127 16:05:03.120554 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:05:03 crc kubenswrapper[4707]: I1127 16:05:03.120610 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:05:03 crc kubenswrapper[4707]: I1127 16:05:03.120626 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:05:03 crc kubenswrapper[4707]: I1127 16:05:03.120649 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:05:03 crc kubenswrapper[4707]: I1127 16:05:03.120662 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:05:03Z","lastTransitionTime":"2025-11-27T16:05:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:05:03 crc kubenswrapper[4707]: I1127 16:05:03.195693 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 27 16:05:03 crc kubenswrapper[4707]: I1127 16:05:03.195826 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 27 16:05:03 crc kubenswrapper[4707]: I1127 16:05:03.195746 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-qcl5k" Nov 27 16:05:03 crc kubenswrapper[4707]: E1127 16:05:03.195911 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 27 16:05:03 crc kubenswrapper[4707]: E1127 16:05:03.196048 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 27 16:05:03 crc kubenswrapper[4707]: I1127 16:05:03.196152 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 27 16:05:03 crc kubenswrapper[4707]: E1127 16:05:03.196302 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-qcl5k" podUID="7d382481-3c1e-49ed-8e27-265d495aa776" Nov 27 16:05:03 crc kubenswrapper[4707]: E1127 16:05:03.196432 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 27 16:05:03 crc kubenswrapper[4707]: I1127 16:05:03.223777 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:05:03 crc kubenswrapper[4707]: I1127 16:05:03.223850 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:05:03 crc kubenswrapper[4707]: I1127 16:05:03.223873 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:05:03 crc kubenswrapper[4707]: I1127 16:05:03.223907 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:05:03 crc kubenswrapper[4707]: I1127 16:05:03.223935 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:05:03Z","lastTransitionTime":"2025-11-27T16:05:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:05:03 crc kubenswrapper[4707]: I1127 16:05:03.326476 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:05:03 crc kubenswrapper[4707]: I1127 16:05:03.326530 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:05:03 crc kubenswrapper[4707]: I1127 16:05:03.326548 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:05:03 crc kubenswrapper[4707]: I1127 16:05:03.326570 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:05:03 crc kubenswrapper[4707]: I1127 16:05:03.326585 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:05:03Z","lastTransitionTime":"2025-11-27T16:05:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:05:03 crc kubenswrapper[4707]: I1127 16:05:03.416127 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:05:03 crc kubenswrapper[4707]: I1127 16:05:03.416166 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:05:03 crc kubenswrapper[4707]: I1127 16:05:03.416176 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:05:03 crc kubenswrapper[4707]: I1127 16:05:03.416195 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:05:03 crc kubenswrapper[4707]: I1127 16:05:03.416208 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:05:03Z","lastTransitionTime":"2025-11-27T16:05:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:05:03 crc kubenswrapper[4707]: E1127 16:05:03.431805 4707 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-27T16:05:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-27T16:05:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-27T16:05:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-27T16:05:03Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-27T16:05:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-27T16:05:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-27T16:05:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-27T16:05:03Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"3f35bf4c-cf6f-4914-b0bb-f9cc6056fcaf\\\",\\\"systemUUID\\\":\\\"dd7e61b5-f6f4-4240-af78-d8fe5d6daad3\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:05:03Z is after 2025-08-24T17:21:41Z" Nov 27 16:05:03 crc kubenswrapper[4707]: I1127 16:05:03.436708 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:05:03 crc kubenswrapper[4707]: I1127 16:05:03.436761 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:05:03 crc kubenswrapper[4707]: I1127 16:05:03.436772 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:05:03 crc kubenswrapper[4707]: I1127 16:05:03.436791 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:05:03 crc kubenswrapper[4707]: I1127 16:05:03.436802 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:05:03Z","lastTransitionTime":"2025-11-27T16:05:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:05:03 crc kubenswrapper[4707]: E1127 16:05:03.456520 4707 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-27T16:05:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-27T16:05:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-27T16:05:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-27T16:05:03Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-27T16:05:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-27T16:05:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-27T16:05:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-27T16:05:03Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"3f35bf4c-cf6f-4914-b0bb-f9cc6056fcaf\\\",\\\"systemUUID\\\":\\\"dd7e61b5-f6f4-4240-af78-d8fe5d6daad3\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:05:03Z is after 2025-08-24T17:21:41Z" Nov 27 16:05:03 crc kubenswrapper[4707]: I1127 16:05:03.461509 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:05:03 crc kubenswrapper[4707]: I1127 16:05:03.461550 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:05:03 crc kubenswrapper[4707]: I1127 16:05:03.461566 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:05:03 crc kubenswrapper[4707]: I1127 16:05:03.461590 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:05:03 crc kubenswrapper[4707]: I1127 16:05:03.461608 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:05:03Z","lastTransitionTime":"2025-11-27T16:05:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:05:03 crc kubenswrapper[4707]: E1127 16:05:03.477283 4707 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-27T16:05:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-27T16:05:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-27T16:05:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-27T16:05:03Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-27T16:05:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-27T16:05:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-27T16:05:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-27T16:05:03Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"3f35bf4c-cf6f-4914-b0bb-f9cc6056fcaf\\\",\\\"systemUUID\\\":\\\"dd7e61b5-f6f4-4240-af78-d8fe5d6daad3\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:05:03Z is after 2025-08-24T17:21:41Z" Nov 27 16:05:03 crc kubenswrapper[4707]: I1127 16:05:03.486079 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:05:03 crc kubenswrapper[4707]: I1127 16:05:03.486161 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:05:03 crc kubenswrapper[4707]: I1127 16:05:03.486193 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:05:03 crc kubenswrapper[4707]: I1127 16:05:03.486219 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:05:03 crc kubenswrapper[4707]: I1127 16:05:03.486255 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:05:03Z","lastTransitionTime":"2025-11-27T16:05:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:05:03 crc kubenswrapper[4707]: E1127 16:05:03.507314 4707 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-27T16:05:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-27T16:05:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-27T16:05:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-27T16:05:03Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-27T16:05:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-27T16:05:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-27T16:05:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-27T16:05:03Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"3f35bf4c-cf6f-4914-b0bb-f9cc6056fcaf\\\",\\\"systemUUID\\\":\\\"dd7e61b5-f6f4-4240-af78-d8fe5d6daad3\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:05:03Z is after 2025-08-24T17:21:41Z" Nov 27 16:05:03 crc kubenswrapper[4707]: I1127 16:05:03.512135 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:05:03 crc kubenswrapper[4707]: I1127 16:05:03.512321 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:05:03 crc kubenswrapper[4707]: I1127 16:05:03.512526 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:05:03 crc kubenswrapper[4707]: I1127 16:05:03.512677 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:05:03 crc kubenswrapper[4707]: I1127 16:05:03.512825 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:05:03Z","lastTransitionTime":"2025-11-27T16:05:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:05:03 crc kubenswrapper[4707]: E1127 16:05:03.531612 4707 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-27T16:05:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-27T16:05:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-27T16:05:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-27T16:05:03Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-27T16:05:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-27T16:05:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-27T16:05:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-27T16:05:03Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"3f35bf4c-cf6f-4914-b0bb-f9cc6056fcaf\\\",\\\"systemUUID\\\":\\\"dd7e61b5-f6f4-4240-af78-d8fe5d6daad3\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:05:03Z is after 2025-08-24T17:21:41Z" Nov 27 16:05:03 crc kubenswrapper[4707]: E1127 16:05:03.531940 4707 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Nov 27 16:05:03 crc kubenswrapper[4707]: I1127 16:05:03.533630 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:05:03 crc kubenswrapper[4707]: I1127 16:05:03.533679 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:05:03 crc kubenswrapper[4707]: I1127 16:05:03.533691 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:05:03 crc kubenswrapper[4707]: I1127 16:05:03.533709 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:05:03 crc kubenswrapper[4707]: I1127 16:05:03.533722 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:05:03Z","lastTransitionTime":"2025-11-27T16:05:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:05:03 crc kubenswrapper[4707]: I1127 16:05:03.635755 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:05:03 crc kubenswrapper[4707]: I1127 16:05:03.635823 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:05:03 crc kubenswrapper[4707]: I1127 16:05:03.635842 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:05:03 crc kubenswrapper[4707]: I1127 16:05:03.635867 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:05:03 crc kubenswrapper[4707]: I1127 16:05:03.635890 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:05:03Z","lastTransitionTime":"2025-11-27T16:05:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:05:03 crc kubenswrapper[4707]: I1127 16:05:03.738203 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:05:03 crc kubenswrapper[4707]: I1127 16:05:03.738274 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:05:03 crc kubenswrapper[4707]: I1127 16:05:03.738295 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:05:03 crc kubenswrapper[4707]: I1127 16:05:03.738327 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:05:03 crc kubenswrapper[4707]: I1127 16:05:03.738350 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:05:03Z","lastTransitionTime":"2025-11-27T16:05:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:05:03 crc kubenswrapper[4707]: I1127 16:05:03.840838 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:05:03 crc kubenswrapper[4707]: I1127 16:05:03.840903 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:05:03 crc kubenswrapper[4707]: I1127 16:05:03.840926 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:05:03 crc kubenswrapper[4707]: I1127 16:05:03.840953 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:05:03 crc kubenswrapper[4707]: I1127 16:05:03.840975 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:05:03Z","lastTransitionTime":"2025-11-27T16:05:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:05:03 crc kubenswrapper[4707]: I1127 16:05:03.944175 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:05:03 crc kubenswrapper[4707]: I1127 16:05:03.944233 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:05:03 crc kubenswrapper[4707]: I1127 16:05:03.944251 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:05:03 crc kubenswrapper[4707]: I1127 16:05:03.944277 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:05:03 crc kubenswrapper[4707]: I1127 16:05:03.944298 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:05:03Z","lastTransitionTime":"2025-11-27T16:05:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:05:04 crc kubenswrapper[4707]: I1127 16:05:04.046779 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:05:04 crc kubenswrapper[4707]: I1127 16:05:04.046815 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:05:04 crc kubenswrapper[4707]: I1127 16:05:04.046827 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:05:04 crc kubenswrapper[4707]: I1127 16:05:04.046844 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:05:04 crc kubenswrapper[4707]: I1127 16:05:04.046855 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:05:04Z","lastTransitionTime":"2025-11-27T16:05:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:05:04 crc kubenswrapper[4707]: I1127 16:05:04.149762 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:05:04 crc kubenswrapper[4707]: I1127 16:05:04.150551 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:05:04 crc kubenswrapper[4707]: I1127 16:05:04.150749 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:05:04 crc kubenswrapper[4707]: I1127 16:05:04.150913 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:05:04 crc kubenswrapper[4707]: I1127 16:05:04.151036 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:05:04Z","lastTransitionTime":"2025-11-27T16:05:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:05:04 crc kubenswrapper[4707]: I1127 16:05:04.196140 4707 scope.go:117] "RemoveContainer" containerID="6b6a6c3ac88a829633b920185df5c454e35c8447a7b1a9af096ab26e4fcabba5" Nov 27 16:05:04 crc kubenswrapper[4707]: I1127 16:05:04.253584 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:05:04 crc kubenswrapper[4707]: I1127 16:05:04.253741 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:05:04 crc kubenswrapper[4707]: I1127 16:05:04.253760 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:05:04 crc kubenswrapper[4707]: I1127 16:05:04.253784 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:05:04 crc kubenswrapper[4707]: I1127 16:05:04.253804 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:05:04Z","lastTransitionTime":"2025-11-27T16:05:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:05:04 crc kubenswrapper[4707]: I1127 16:05:04.356810 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:05:04 crc kubenswrapper[4707]: I1127 16:05:04.356839 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:05:04 crc kubenswrapper[4707]: I1127 16:05:04.356851 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:05:04 crc kubenswrapper[4707]: I1127 16:05:04.356868 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:05:04 crc kubenswrapper[4707]: I1127 16:05:04.356879 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:05:04Z","lastTransitionTime":"2025-11-27T16:05:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:05:04 crc kubenswrapper[4707]: I1127 16:05:04.459102 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:05:04 crc kubenswrapper[4707]: I1127 16:05:04.459153 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:05:04 crc kubenswrapper[4707]: I1127 16:05:04.459169 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:05:04 crc kubenswrapper[4707]: I1127 16:05:04.459196 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:05:04 crc kubenswrapper[4707]: I1127 16:05:04.459219 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:05:04Z","lastTransitionTime":"2025-11-27T16:05:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:05:04 crc kubenswrapper[4707]: I1127 16:05:04.563059 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:05:04 crc kubenswrapper[4707]: I1127 16:05:04.563104 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:05:04 crc kubenswrapper[4707]: I1127 16:05:04.563116 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:05:04 crc kubenswrapper[4707]: I1127 16:05:04.563132 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:05:04 crc kubenswrapper[4707]: I1127 16:05:04.563142 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:05:04Z","lastTransitionTime":"2025-11-27T16:05:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:05:04 crc kubenswrapper[4707]: I1127 16:05:04.666123 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:05:04 crc kubenswrapper[4707]: I1127 16:05:04.666196 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:05:04 crc kubenswrapper[4707]: I1127 16:05:04.666212 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:05:04 crc kubenswrapper[4707]: I1127 16:05:04.666235 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:05:04 crc kubenswrapper[4707]: I1127 16:05:04.666249 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:05:04Z","lastTransitionTime":"2025-11-27T16:05:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:05:04 crc kubenswrapper[4707]: I1127 16:05:04.769430 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:05:04 crc kubenswrapper[4707]: I1127 16:05:04.769491 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:05:04 crc kubenswrapper[4707]: I1127 16:05:04.769505 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:05:04 crc kubenswrapper[4707]: I1127 16:05:04.769526 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:05:04 crc kubenswrapper[4707]: I1127 16:05:04.769542 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:05:04Z","lastTransitionTime":"2025-11-27T16:05:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:05:04 crc kubenswrapper[4707]: I1127 16:05:04.870220 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xkmt7_55af9c67-18ce-46f1-a761-d11ce16f42d6/ovnkube-controller/2.log" Nov 27 16:05:04 crc kubenswrapper[4707]: I1127 16:05:04.872980 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:05:04 crc kubenswrapper[4707]: I1127 16:05:04.873008 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:05:04 crc kubenswrapper[4707]: I1127 16:05:04.873019 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:05:04 crc kubenswrapper[4707]: I1127 16:05:04.873035 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:05:04 crc kubenswrapper[4707]: I1127 16:05:04.873046 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:05:04Z","lastTransitionTime":"2025-11-27T16:05:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:05:04 crc kubenswrapper[4707]: I1127 16:05:04.873967 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xkmt7" event={"ID":"55af9c67-18ce-46f1-a761-d11ce16f42d6","Type":"ContainerStarted","Data":"eec0d7ba21b34dfd1c2f196273fe7ab74d1f1d2ce8c0a31e345dc47ce1e25d27"} Nov 27 16:05:04 crc kubenswrapper[4707]: I1127 16:05:04.875151 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-xkmt7" Nov 27 16:05:04 crc kubenswrapper[4707]: I1127 16:05:04.894168 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-9c4xg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37bcfa90-e50b-4a63-96b2-cd8a0ce5bbf7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://66dcf851b2fb739c3dae214852a24405d2c8b4462dde998903b9dfe7c9e61e00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-rel
ease-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hwzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8cd7c5e0c832499cb9ddc7b18c7d260de06df34208f86284ebe77964bfcf1c12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8cd7c5e0c832499cb9ddc7b18c7d260de06df34208f86284ebe77964bfcf1c12\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:04:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hwzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4438fd6a
b08cd53dff20617ccadaa81e4bddbd7ced310f190d02613079f96f64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4438fd6ab08cd53dff20617ccadaa81e4bddbd7ced310f190d02613079f96f64\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:04:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:04:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hwzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19a31a44d1ecfee004f2a705df22a188782cd9f235848ddf5f48bd740833c6a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://19a31a44
d1ecfee004f2a705df22a188782cd9f235848ddf5f48bd740833c6a8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:04:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:04:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hwzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://db8a59276d18141805bc74721d625e2004f2cea1f47efc90b3290c68eba2a35b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://db8a59276d18141805bc74721d625e2004f2cea1f47efc90b3290c68eba2a35b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:04:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:04:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hwzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disab
led\\\"}]},{\\\"containerID\\\":\\\"cri-o://21efa1504e98747329f8a9220126ea259edde61bd4d272ea55691e9952950252\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://21efa1504e98747329f8a9220126ea259edde61bd4d272ea55691e9952950252\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:04:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:04:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hwzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://851c05dbd0947990d464d1bd72d62577bdf12517aea808904aa01fc79fd2d7c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://851c05dbd0947990d464d1bd72d62577bdf12517aea808904aa01fc79fd2d7c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\
"2025-11-27T16:04:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:04:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hwzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:04:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-9c4xg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:05:04Z is after 2025-08-24T17:21:41Z" Nov 27 16:05:04 crc kubenswrapper[4707]: I1127 16:05:04.958633 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xkmt7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"55af9c67-18ce-46f1-a761-d11ce16f42d6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"message\\\":\\\"containers with 
unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a4eadca4b464dec02ca7aba6bc23d95b9c3965294d4b0224299279c630f3718\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6xmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3718d2f738ef780c738566c163d52afe355de6b8eb6004aa8e1dd5f37aa84d68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\
\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6xmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://44699a190cf1065e537f588d7479a10aab8fd4ffd1df2348c616d999ffdf7077\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6xmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6432d53914cee9e016c2134d99e48fa9ff1c05c0b88033992c2e0c70fc40d2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd
47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6xmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://93825fe5d9cf2d63c3e723b063f417f7e92442b14365852f935a06d316f443fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6xmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ae3fab13b078702f87ca8c586c9592c4554c95ead662d81ca65b50f368b6dcb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev
@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6xmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eec0d7ba21b34dfd1c2f196273fe7ab74d1f1d2ce8c0a31e345dc47ce1e25d27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b6a6c3ac88a829633b920185df5c454e35c8447a7b1a9af096ab26e4fcabba5\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-27T16:04:33Z\\\",\\\"message\\\":\\\"oSet:[tcp]} selection_fields:{GoSet:[]} 
vips:{GoMap:map[10.217.4.167:8441: 10.217.4.167:8442: 10.217.4.167:8444:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {62af83f3-e0c8-4632-aaaa-17488566a9d8}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1127 16:04:33.229791 6338 ovn.go:134] Ensuring zone local for Pod openshift-machine-config-operator/machine-config-daemon-c995m in node crc\\\\nI1127 16:04:33.233295 6338 obj_retry.go:386] Retry successful for *v1.Pod openshift-machine-config-operator/machine-config-daemon-c995m after 0 failed attempt(s)\\\\nI1127 16:04:33.233305 6338 default_network_controller.go:776] Recording success event on pod openshift-machine-config-operator/machine-config-daemon-c995m\\\\nI1127 16:04:33.229689 6338 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-diagnostics/network-check-source-55646444c4-trplf\\\\nI1127 16:04:33.233329 6338 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-diagnostics/network-check-source-55646444c4-trplf\\\\nF1127 16:04:33.229645 6338 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler 
{0x1\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-27T16:04:32Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:05:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\
\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6xmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://db3362aeb65b368d2d9181cc898270fffb60938d0e9d32a862130d06a9f572a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6xmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c327fee553e5575dec76bc4456bc59424c89a9eea360cd4b23b594eff4c67a8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready
\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c327fee553e5575dec76bc4456bc59424c89a9eea360cd4b23b594eff4c67a8d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:04:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6xmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:04:06Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-xkmt7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:05:04Z is after 2025-08-24T17:21:41Z" Nov 27 16:05:04 crc kubenswrapper[4707]: I1127 16:05:04.975471 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:05:04 crc kubenswrapper[4707]: I1127 16:05:04.975480 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-5x488" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"433a450e-2371-4893-b21e-19707b40e28f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://12b178a129ce5efbfdb77d023236ed9d674d0aba3c3a2903589aae995a5a753c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52jnc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://35888508a810deea27c61cbf1ec7362be9f61
f99a63d4ab2b4c296a9ed47c65f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52jnc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:04:19Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-5x488\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:05:04Z is after 2025-08-24T17:21:41Z" Nov 27 16:05:04 crc kubenswrapper[4707]: I1127 16:05:04.975521 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:05:04 crc kubenswrapper[4707]: I1127 16:05:04.975774 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:05:04 crc kubenswrapper[4707]: I1127 16:05:04.975824 4707 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeNotReady" Nov 27 16:05:04 crc kubenswrapper[4707]: I1127 16:05:04.975871 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:05:04Z","lastTransitionTime":"2025-11-27T16:05:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:05:04 crc kubenswrapper[4707]: I1127 16:05:04.994483 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a94c3c43-c57d-4580-82d4-22eb0ac9387c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04f66a2313def4600426a075de4f77c75a876b3246344cec810b4cf25ea3505e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d3472
0243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2df8580fadfd8a1a25979050a7dc698d4fc8177b121311817f40c26e73bf9371\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://25dae96b066cc3da8d6fefd13955cb9d1776c3f9fa93afa6be256bb904f523c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}
,{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fda5b69770eee7245fc07549784cad1211d45ceb0d65276764e36021cde11777\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fda5b69770eee7245fc07549784cad1211d45ceb0d65276764e36021cde11777\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:03:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:03:46Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:03:45Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:05:04Z is after 2025-08-24T17:21:41Z" Nov 27 16:05:05 crc kubenswrapper[4707]: I1127 16:05:05.011572 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"991f0a41-0245-4bfa-afa1-84f5bde15111\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5738c0efc1bf2bd3189a81861d6548f0599bbbef152df871c689378185304b7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://803038378abbaeb0c775de20a85ccb96a9e420f5e0434a41ad1d8cda81bbd217\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://803038378abbaeb0c775de20a85ccb96a9e420f5e0434a41ad1d8cda81bbd217\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:03:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:03:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:05:05Z is after 2025-08-24T17:21:41Z" Nov 27 16:05:05 crc kubenswrapper[4707]: I1127 16:05:05.028812 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:05Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:05:05Z is after 2025-08-24T17:21:41Z" Nov 27 16:05:05 crc kubenswrapper[4707]: I1127 16:05:05.045975 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://57225b64e7671f397832e9ba112febbde78ae4c24b37a8f097d4bdeda8931c0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6960cfa0c6d0dedc432f90bf267ce02fe8264924a43865f79eadd9affbe955f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:05:05Z is after 2025-08-24T17:21:41Z" Nov 27 16:05:05 crc kubenswrapper[4707]: I1127 16:05:05.061472 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6cf743e8776aa2de4138704d9e163af8baf7b8e5db139c13775673cf32a1ee88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-11-27T16:05:05Z is after 2025-08-24T17:21:41Z" Nov 27 16:05:05 crc kubenswrapper[4707]: I1127 16:05:05.080777 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:05:05 crc kubenswrapper[4707]: I1127 16:05:05.080854 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:05:05 crc kubenswrapper[4707]: I1127 16:05:05.080871 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:05:05 crc kubenswrapper[4707]: I1127 16:05:05.080895 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:05:05 crc kubenswrapper[4707]: I1127 16:05:05.080916 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:05:05Z","lastTransitionTime":"2025-11-27T16:05:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:05:05 crc kubenswrapper[4707]: I1127 16:05:05.092400 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-pdngz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5a0b58a3-9ac2-4446-b507-88eba42aa060\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://feac173e7b0d6e3cb53bd8d407d3d08e59e099a63702a377b7515283f85703ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes
.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6sq6r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:04:20Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-pdngz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:05:05Z is after 2025-08-24T17:21:41Z" Nov 27 16:05:05 crc kubenswrapper[4707]: I1127 16:05:05.105021 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-qcl5k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d382481-3c1e-49ed-8e27-265d495aa776\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:21Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:21Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j2krv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j2krv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:04:21Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-qcl5k\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:05:05Z is after 2025-08-24T17:21:41Z" Nov 27 16:05:05 crc kubenswrapper[4707]: I1127 16:05:05.122143 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b00fa8691d479d1db194704ff73bb85f1108c9541dd09ed4c1b02d0a3f0ab53d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"m
etrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:05:05Z is after 2025-08-24T17:21:41Z" Nov 27 16:05:05 crc kubenswrapper[4707]: I1127 16:05:05.136587 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-js6mm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ca48c08-f39d-41a2-847a-c893a2111492\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://094325a9e2d642c065e7c06cf63ba0fcad78c5220368b42504c2e87735d4a542\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"
imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://be6a704ee152614d36d4831778b016c009b0aab2d6cf1c2ec9767f8ecdeb7af6\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-27T16:04:54Z\\\",\\\"message\\\":\\\"2025-11-27T16:04:08+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_4a45ad71-8ab3-45c5-802b-cd135105e4f2\\\\n2025-11-27T16:04:08+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_4a45ad71-8ab3-45c5-802b-cd135105e4f2 to /host/opt/cni/bin/\\\\n2025-11-27T16:04:09Z [verbose] multus-daemon started\\\\n2025-11-27T16:04:09Z [verbose] Readiness Indicator file check\\\\n2025-11-27T16:04:54Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the 
condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-27T16:04:07Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bjt58\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"p
hase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:04:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-js6mm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:05:05Z is after 2025-08-24T17:21:41Z" Nov 27 16:05:05 crc kubenswrapper[4707]: I1127 16:05:05.162868 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"59e09a52-f64d-41da-a23b-7d3e34e8f109\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://489349489e52f4a113203c0c196cfe1ef96e97c6fc3c3d777a28e90f22aaee92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\
":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ea14f8eb17dbb350f7c0ad2285b4bb5ca2d8b2aae6840646d68a397ac2043d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://775fc674c0dab97e0c5a2548d467b973c592458dad1b88ce92344486a3a28eb5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:50
Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f502a2429578531cb3ebbfde6bb4c049eed201a496ff3772c06ca28cace89b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://30b52b26bdca5b0705186872e6dd2eba3ece6436e866a4a5e30f4c2ecdbbaf20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.
126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b6be834802616cdf10f8b694bb0ce1b8828d0885e8801c5d890e74837d672a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b6be834802616cdf10f8b694bb0ce1b8828d0885e8801c5d890e74837d672a8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:03:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://de1e5852273af33e41f21c4df29c3e32cbfd6b2eb200b64d9e5353a85718074c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://de1e5852273af33e41f21c4df29c3e32cbfd6b2eb200b64d9e5353a85718074c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:03:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:03:47Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://c6f7eb0e4b78b6d2e316bd2c8f2ad3081c8eb0ca3c55a7063ccf281572ba2992\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshif
t-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c6f7eb0e4b78b6d2e316bd2c8f2ad3081c8eb0ca3c55a7063ccf281572ba2992\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:03:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:03:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:03:45Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:05:05Z is after 2025-08-24T17:21:41Z" Nov 27 16:05:05 crc kubenswrapper[4707]: I1127 16:05:05.178526 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f300806f-b97e-4453-b011-19442ca1240d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a25770e0e3eebd7e807f3775101a1cd5d47073e4a86850da475fecd4a9c88ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7412dcdb5a249551f558a01a8fecef5f873e5e0a16e917d5cbc1ca74a917f228\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://791f332aec1698d02da5ec585ad2fbfcb77529f24abc0e3a77581be38d7ce277\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a47eb0592819f0bc514399e163c88e2c63111aa0ec4a28f00f809ce1bdf2aa8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f12d0d89698cd6f702c595882dc7ed97fbc585b8a4ef3a8b3755c984a823dd9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-27T16:04:05Z\\\"
,\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1127 16:03:58.832133 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1127 16:03:58.833970 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3721107874/tls.crt::/tmp/serving-cert-3721107874/tls.key\\\\\\\"\\\\nI1127 16:04:05.094602 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1127 16:04:05.099813 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1127 16:04:05.099850 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1127 16:04:05.099879 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1127 16:04:05.099886 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1127 16:04:05.108199 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1127 16:04:05.108225 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1127 16:04:05.108231 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1127 16:04:05.108269 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1127 16:04:05.108276 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1127 16:04:05.108281 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1127 16:04:05.108284 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1127 16:04:05.108289 1 secure_serving.go:69] Use 
of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1127 16:04:05.114674 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-27T16:03:48Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf16f5508845d69184573b51f3052f79123556667cbd3ddf4b2adfb135312a3a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:48Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://608ad63f8f5352dad650253df603590a1cc20bc05d86dd561d659295264ab4b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://608ad63f8f5352dad650253df603590a
1cc20bc05d86dd561d659295264ab4b5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:03:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:03:45Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:05:05Z is after 2025-08-24T17:21:41Z" Nov 27 16:05:05 crc kubenswrapper[4707]: I1127 16:05:05.182991 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:05:05 crc kubenswrapper[4707]: I1127 16:05:05.183139 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:05:05 crc kubenswrapper[4707]: I1127 16:05:05.183225 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:05:05 crc kubenswrapper[4707]: I1127 16:05:05.183286 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:05:05 crc kubenswrapper[4707]: I1127 16:05:05.183341 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:05:05Z","lastTransitionTime":"2025-11-27T16:05:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:05:05 crc kubenswrapper[4707]: I1127 16:05:05.193825 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:05:05Z is after 2025-08-24T17:21:41Z" Nov 27 16:05:05 crc kubenswrapper[4707]: I1127 16:05:05.194881 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 27 16:05:05 crc kubenswrapper[4707]: E1127 16:05:05.195048 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 27 16:05:05 crc kubenswrapper[4707]: I1127 16:05:05.195269 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 27 16:05:05 crc kubenswrapper[4707]: E1127 16:05:05.195395 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 27 16:05:05 crc kubenswrapper[4707]: I1127 16:05:05.195587 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 27 16:05:05 crc kubenswrapper[4707]: E1127 16:05:05.195694 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 27 16:05:05 crc kubenswrapper[4707]: I1127 16:05:05.196060 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qcl5k" Nov 27 16:05:05 crc kubenswrapper[4707]: E1127 16:05:05.196226 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-qcl5k" podUID="7d382481-3c1e-49ed-8e27-265d495aa776" Nov 27 16:05:05 crc kubenswrapper[4707]: I1127 16:05:05.208264 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-c995m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a83beb0d-8dd1-434a-ace2-933f98e3956f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://419d9b9f3f298028cf5f49afd438f1de90b8d1203cdfb09b924878b59e0fb723\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/ku
be-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5xtzp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6227d8aa794cce73ef7607c3e6cf8f853c829bf56493acf04c0b555ddd264ba5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5xtzp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:04:06Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-c995m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:05:05Z is after 2025-08-24T17:21:41Z" Nov 27 16:05:05 crc kubenswrapper[4707]: I1127 16:05:05.227292 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7f481770-a9a1-422a-a707-e841a24a1503\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f0e4a5fa520e0dc048bef42e48e14bd06eb72586ca2b8e145734aa5450aa869d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://36f6a5537d5031d21ea157fbcedea40756eac2c44aa1859c0cfbf3dc9a9d9a13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c7144aa332827972c3ff984eb05b835b88c4d9d05552ac631fbca49f13e193a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8211f74a0c0d16257330ebbb5727247d4229848595af3cb65f1cb752b05f3c37\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-11-27T16:03:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:03:45Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:05:05Z is after 2025-08-24T17:21:41Z" Nov 27 16:05:05 crc kubenswrapper[4707]: I1127 16:05:05.239843 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:05:05Z is after 2025-08-24T17:21:41Z" Nov 27 16:05:05 crc kubenswrapper[4707]: I1127 16:05:05.250435 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-bhmsc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d1ec526b-6fb0-4c87-bd87-6aaf843e0c78\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f3b764fd994edba08f5e909683bd065ecf09fb7c9906457e37bb91343aad14a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kvs85\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:04:06Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-bhmsc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:05:05Z is after 2025-08-24T17:21:41Z" Nov 27 16:05:05 crc kubenswrapper[4707]: I1127 16:05:05.262980 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6cf743e8776aa2de4138704d9e163af8baf7b8e5db139c13775673cf32a1ee88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-al
erter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:05:05Z is after 2025-08-24T17:21:41Z" Nov 27 16:05:05 crc kubenswrapper[4707]: I1127 16:05:05.285044 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-9c4xg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37bcfa90-e50b-4a63-96b2-cd8a0ce5bbf7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://66dcf851b2fb739c3dae214852a2440
5d2c8b4462dde998903b9dfe7c9e61e00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hwzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8cd7c5e0c832499cb9ddc7b18c7d260de06df34208f86284ebe77964bfcf1c12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8cd7c5e0c832499cb9ddc7b18c7d260de06df34208f86284ebe77964bfcf1c12\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:04:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"
mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hwzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4438fd6ab08cd53dff20617ccadaa81e4bddbd7ced310f190d02613079f96f64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4438fd6ab08cd53dff20617ccadaa81e4bddbd7ced310f190d02613079f96f64\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:04:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:04:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hwzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19a31a44d1ecfee004f2a705df22a188782cd9f235848ddf5f48bd740833c6a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa930
89f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://19a31a44d1ecfee004f2a705df22a188782cd9f235848ddf5f48bd740833c6a8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:04:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:04:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hwzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://db8a59276d18141805bc74721d625e2004f2cea1f47efc90b3290c68eba2a35b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://db8a59276d18141805bc74721d625e2004f2cea1f47efc90b3290c68eba2a35b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:04:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:04:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":tr
ue,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hwzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://21efa1504e98747329f8a9220126ea259edde61bd4d272ea55691e9952950252\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://21efa1504e98747329f8a9220126ea259edde61bd4d272ea55691e9952950252\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:04:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:04:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hwzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://851c05dbd0947990d464d1bd72d62577bdf12517aea808904aa01fc79fd2d7c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://851c05dbd0947990d464d1bd72d62577bdf12517aea808904aa01fc79fd2d7c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:04:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:04:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hwzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:04:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-9c4xg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:05:05Z is after 2025-08-24T17:21:41Z" Nov 27 16:05:05 crc kubenswrapper[4707]: I1127 16:05:05.285555 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:05:05 crc kubenswrapper[4707]: I1127 16:05:05.285893 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:05:05 crc kubenswrapper[4707]: I1127 16:05:05.285909 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:05:05 crc kubenswrapper[4707]: I1127 16:05:05.285926 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:05:05 crc kubenswrapper[4707]: I1127 16:05:05.285938 4707 
setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:05:05Z","lastTransitionTime":"2025-11-27T16:05:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:05:05 crc kubenswrapper[4707]: I1127 16:05:05.310820 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xkmt7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"55af9c67-18ce-46f1-a761-d11ce16f42d6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a4eadca4b464dec02ca7aba6bc23d95b9c3965294d4b0224299279c630f3718\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6xmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3718d2f738ef780c738566c163d52afe355de6b8eb6004aa8e1dd5f37aa84d68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6xmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://44699a190cf1065e537f588d7479a10aab8fd4ffd1df2348c616d999ffdf7077\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6xmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6432d53914cee9e016c2134d99e48fa9ff1c05c0b88033992c2e0c70fc40d2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:07Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6xmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://93825fe5d9cf2d63c3e723b063f417f7e92442b14365852f935a06d316f443fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6xmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ae3fab13b078702f87ca8c586c9592c4554c95ead662d81ca65b50f368b6dcb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6xmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eec0d7ba21b34dfd1c2f196273fe7ab74d1f1d2ce8c0a31e345dc47ce1e25d27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b6a6c3ac88a829633b920185df5c454e35c8447a7b1a9af096ab26e4fcabba5\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-27T16:04:33Z\\\",\\\"message\\\":\\\"oSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.167:8441: 10.217.4.167:8442: 10.217.4.167:8444:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {62af83f3-e0c8-4632-aaaa-17488566a9d8}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: 
UUIDName:}]\\\\nI1127 16:04:33.229791 6338 ovn.go:134] Ensuring zone local for Pod openshift-machine-config-operator/machine-config-daemon-c995m in node crc\\\\nI1127 16:04:33.233295 6338 obj_retry.go:386] Retry successful for *v1.Pod openshift-machine-config-operator/machine-config-daemon-c995m after 0 failed attempt(s)\\\\nI1127 16:04:33.233305 6338 default_network_controller.go:776] Recording success event on pod openshift-machine-config-operator/machine-config-daemon-c995m\\\\nI1127 16:04:33.229689 6338 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-diagnostics/network-check-source-55646444c4-trplf\\\\nI1127 16:04:33.233329 6338 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-diagnostics/network-check-source-55646444c4-trplf\\\\nF1127 16:04:33.229645 6338 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler 
{0x1\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-27T16:04:32Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:05:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\
\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6xmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://db3362aeb65b368d2d9181cc898270fffb60938d0e9d32a862130d06a9f572a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6xmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c327fee553e5575dec76bc4456bc59424c89a9eea360cd4b23b594eff4c67a8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready
\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c327fee553e5575dec76bc4456bc59424c89a9eea360cd4b23b594eff4c67a8d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:04:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6xmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:04:06Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-xkmt7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:05:05Z is after 2025-08-24T17:21:41Z" Nov 27 16:05:05 crc kubenswrapper[4707]: I1127 16:05:05.327705 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-5x488" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"433a450e-2371-4893-b21e-19707b40e28f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://12b178a129ce5efbfdb77d023236ed9d674d0aba3c3a2903589aae995a5a753c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52jnc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://35888508a810deea27c61cbf1ec7362be9f61
f99a63d4ab2b4c296a9ed47c65f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52jnc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:04:19Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-5x488\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:05:05Z is after 2025-08-24T17:21:41Z" Nov 27 16:05:05 crc kubenswrapper[4707]: I1127 16:05:05.350576 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a94c3c43-c57d-4580-82d4-22eb0ac9387c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04f66a2313def4600426a075de4f77c75a876b3246344cec810b4cf25ea3505e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2df8580fadfd8a1a25979050a7dc698d4fc8177b121311817f40c26e73bf9371\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://25dae96b066cc3da8d6fefd13955cb9d1776c3f9fa93afa6be256bb904f523c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fda5b69770eee7245fc07549784cad1211d45ceb0d65276764e36021cde11777\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://fda5b69770eee7245fc07549784cad1211d45ceb0d65276764e36021cde11777\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:03:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:03:46Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:03:45Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:05:05Z is after 2025-08-24T17:21:41Z" Nov 27 16:05:05 crc kubenswrapper[4707]: I1127 16:05:05.365067 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"991f0a41-0245-4bfa-afa1-84f5bde15111\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5738c0efc1bf2bd3189a81861d6548f0599bbbef152df871c689378185304b7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://803038378abbaeb0c775de20a85ccb96a9e420f5e0434a41ad1d8cda81bbd217\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://803038378abbaeb0c775de20a85ccb96a9e420f5e0434a41ad1d8cda81bbd217\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:03:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:03:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:05:05Z is after 2025-08-24T17:21:41Z" Nov 27 16:05:05 crc kubenswrapper[4707]: I1127 16:05:05.384797 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:05Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:05:05Z is after 2025-08-24T17:21:41Z" Nov 27 16:05:05 crc kubenswrapper[4707]: I1127 16:05:05.389291 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:05:05 crc kubenswrapper[4707]: I1127 16:05:05.389348 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:05:05 crc 
kubenswrapper[4707]: I1127 16:05:05.389361 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:05:05 crc kubenswrapper[4707]: I1127 16:05:05.389406 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:05:05 crc kubenswrapper[4707]: I1127 16:05:05.389425 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:05:05Z","lastTransitionTime":"2025-11-27T16:05:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:05:05 crc kubenswrapper[4707]: I1127 16:05:05.407328 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://57225b64e7671f397832e9ba112febbde78ae4c24b37a8f097d4bdeda8931c0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6960cfa0c6d0dedc432f90bf267ce02fe8264924a43865f79eadd9affbe955f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:05:05Z is after 2025-08-24T17:21:41Z" Nov 27 16:05:05 crc kubenswrapper[4707]: I1127 16:05:05.421635 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-pdngz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5a0b58a3-9ac2-4446-b507-88eba42aa060\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://feac173e7b0d6e3cb53bd8d407d3d08e59e099a63702a377b7515283f85703ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6sq6r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:04:20Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-pdngz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:05:05Z is after 2025-08-24T17:21:41Z" Nov 27 16:05:05 crc kubenswrapper[4707]: I1127 16:05:05.437391 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-qcl5k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d382481-3c1e-49ed-8e27-265d495aa776\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:21Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:21Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j2krv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j2krv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:04:21Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-qcl5k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:05:05Z is after 2025-08-24T17:21:41Z" Nov 27 16:05:05 crc 
kubenswrapper[4707]: I1127 16:05:05.454696 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b00fa8691d479d1db194704ff73bb85f1108c9541dd09ed4c1b02d0a3f0ab53d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:05:05Z is after 2025-08-24T17:21:41Z" Nov 27 16:05:05 crc kubenswrapper[4707]: I1127 16:05:05.491766 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:05:05 crc kubenswrapper[4707]: I1127 16:05:05.491842 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:05:05 crc kubenswrapper[4707]: I1127 16:05:05.491861 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:05:05 crc kubenswrapper[4707]: I1127 16:05:05.491950 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:05:05 crc kubenswrapper[4707]: I1127 16:05:05.492062 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:05:05Z","lastTransitionTime":"2025-11-27T16:05:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:05:05 crc kubenswrapper[4707]: I1127 16:05:05.494217 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-js6mm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ca48c08-f39d-41a2-847a-c893a2111492\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://094325a9e2d642c065e7c06cf63ba0fcad78c5220368b42504c2e87735d4a542\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://be6a704ee152614d36d4831778b016c009b0aab2d6cf1c2ec9767f8ecdeb7af6\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-27T16:04:54Z\\\",\\\"message\\\":\\\"2025-11-27T16:04:08+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_4a45ad71-8ab3-45c5-802b-cd135105e4f2\\\\n2025-11-27T16:04:08+00:00 
[cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_4a45ad71-8ab3-45c5-802b-cd135105e4f2 to /host/opt/cni/bin/\\\\n2025-11-27T16:04:09Z [verbose] multus-daemon started\\\\n2025-11-27T16:04:09Z [verbose] Readiness Indicator file check\\\\n2025-11-27T16:04:54Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-27T16:04:07Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\
\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bjt58\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:04:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-js6mm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:05:05Z is after 2025-08-24T17:21:41Z" Nov 27 16:05:05 crc kubenswrapper[4707]: I1127 16:05:05.522604 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"59e09a52-f64d-41da-a23b-7d3e34e8f109\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://489349489e52f4a113203c0c196cfe1ef96e97c6fc3c3d777a28e90f22aaee92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ea14f8eb17dbb350f7c0ad2285b4bb5ca2d8b2aae6840646d68a397ac2043d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://775fc674c0dab97e0c5a2548d467b973c592458dad1b88ce92344486a3a28eb5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f502a2429578531cb3ebbfde6bb4c049eed201a496ff3772c06ca28cace89b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://30b52b26bdca5b0705186872e6dd2eba3ece6436e866a4a5e30f4c2ecdbbaf20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b6be834802616cdf10f8b694bb0ce1b8828d0885e8801c5d890e74837d672a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b6be834802616cdf10f8b694bb0ce1b8828d0885e8801c5d890e74837d672a8\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-11-27T16:03:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://de1e5852273af33e41f21c4df29c3e32cbfd6b2eb200b64d9e5353a85718074c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://de1e5852273af33e41f21c4df29c3e32cbfd6b2eb200b64d9e5353a85718074c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:03:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:03:47Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://c6f7eb0e4b78b6d2e316bd2c8f2ad3081c8eb0ca3c55a7063ccf281572ba2992\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c6f7eb0e4b78b6d2e316bd2c8f2ad3081c8eb0ca3c55a7063ccf281572ba2992\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:03:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:03:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:03:45Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:05:05Z is after 2025-08-24T17:21:41Z" Nov 27 16:05:05 crc kubenswrapper[4707]: I1127 16:05:05.545808 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f300806f-b97e-4453-b011-19442ca1240d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a25770e0e3eebd7e807f3775101a1cd5d47073e4a86850da475fecd4a9c88ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-rel
ease-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7412dcdb5a249551f558a01a8fecef5f873e5e0a16e917d5cbc1ca74a917f228\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://791f332aec1698d02da5ec585ad2fbfcb77529f24abc0e3a77581be38d7ce277\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:47Z\\\"}},\\\"vo
lumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a47eb0592819f0bc514399e163c88e2c63111aa0ec4a28f00f809ce1bdf2aa8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f12d0d89698cd6f702c595882dc7ed97fbc585b8a4ef3a8b3755c984a823dd9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-27T16:04:05Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1127 16:03:58.832133 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1127 16:03:58.833970 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3721107874/tls.crt::/tmp/serving-cert-3721107874/tls.key\\\\\\\"\\\\nI1127 16:04:05.094602 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1127 16:04:05.099813 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1127 16:04:05.099850 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1127 16:04:05.099879 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1127 16:04:05.099886 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1127 16:04:05.108199 1 
secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1127 16:04:05.108225 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1127 16:04:05.108231 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1127 16:04:05.108269 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1127 16:04:05.108276 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1127 16:04:05.108281 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1127 16:04:05.108284 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1127 16:04:05.108289 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1127 16:04:05.114674 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-27T16:03:48Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf16f5508845d69184573b51f3052f79123556667cbd3ddf4b2adfb135312a3a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:48Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://608ad63f8f5352dad650253df603590a1cc20bc05d86dd561d659295264ab4b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://608ad63f8f5352dad650253df603590a1cc20bc05d86dd561d659295264ab4b5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:03:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:03:45Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:05:05Z is after 2025-08-24T17:21:41Z" Nov 27 16:05:05 crc kubenswrapper[4707]: I1127 16:05:05.565437 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:05:05Z is after 2025-08-24T17:21:41Z" Nov 27 16:05:05 crc kubenswrapper[4707]: I1127 16:05:05.584215 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-c995m" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a83beb0d-8dd1-434a-ace2-933f98e3956f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://419d9b9f3f298028cf5f49afd438f1de90b8d1203cdfb09b924878b59e0fb723\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5xtzp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6227d8aa794cce73ef7607c3e6cf8f853c829bf5
6493acf04c0b555ddd264ba5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5xtzp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:04:06Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-c995m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:05:05Z is after 2025-08-24T17:21:41Z" Nov 27 16:05:05 crc kubenswrapper[4707]: I1127 16:05:05.595050 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:05:05 crc kubenswrapper[4707]: I1127 16:05:05.595105 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:05:05 crc kubenswrapper[4707]: I1127 16:05:05.595117 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:05:05 crc 
kubenswrapper[4707]: I1127 16:05:05.595137 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:05:05 crc kubenswrapper[4707]: I1127 16:05:05.595151 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:05:05Z","lastTransitionTime":"2025-11-27T16:05:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:05:05 crc kubenswrapper[4707]: I1127 16:05:05.605692 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7f481770-a9a1-422a-a707-e841a24a1503\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f0e4a5fa520e0dc048bef42e48e14bd06eb72586ca2b8e145734aa5450aa869d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\
"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://36f6a5537d5031d21ea157fbcedea40756eac2c44aa1859c0cfbf3dc9a9d9a13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c7144aa332827972c3ff984eb05b835b88c4d9d05552ac631fbca49f13e193a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\
\\"2025-11-27T16:03:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8211f74a0c0d16257330ebbb5727247d4229848595af3cb65f1cb752b05f3c37\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:03:45Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:05:05Z is after 2025-08-24T17:21:41Z" Nov 27 16:05:05 crc kubenswrapper[4707]: I1127 16:05:05.627314 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:05Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:05:05Z is after 2025-08-24T17:21:41Z" Nov 27 16:05:05 crc kubenswrapper[4707]: I1127 16:05:05.645036 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-bhmsc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d1ec526b-6fb0-4c87-bd87-6aaf843e0c78\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f3b764fd994edba08f5e909683bd065ecf09fb7c9906457e37bb91343aad14a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kvs85\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:04:06Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-bhmsc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:05:05Z is after 2025-08-24T17:21:41Z" Nov 27 16:05:05 crc kubenswrapper[4707]: I1127 16:05:05.722815 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:05:05 crc kubenswrapper[4707]: I1127 16:05:05.722860 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:05:05 crc kubenswrapper[4707]: I1127 16:05:05.722871 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:05:05 crc kubenswrapper[4707]: I1127 16:05:05.722892 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:05:05 crc kubenswrapper[4707]: I1127 16:05:05.722905 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:05:05Z","lastTransitionTime":"2025-11-27T16:05:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:05:05 crc kubenswrapper[4707]: I1127 16:05:05.825692 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:05:05 crc kubenswrapper[4707]: I1127 16:05:05.826067 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:05:05 crc kubenswrapper[4707]: I1127 16:05:05.826085 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:05:05 crc kubenswrapper[4707]: I1127 16:05:05.826111 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:05:05 crc kubenswrapper[4707]: I1127 16:05:05.826129 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:05:05Z","lastTransitionTime":"2025-11-27T16:05:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:05:05 crc kubenswrapper[4707]: I1127 16:05:05.880307 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xkmt7_55af9c67-18ce-46f1-a761-d11ce16f42d6/ovnkube-controller/3.log" Nov 27 16:05:05 crc kubenswrapper[4707]: I1127 16:05:05.881451 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xkmt7_55af9c67-18ce-46f1-a761-d11ce16f42d6/ovnkube-controller/2.log" Nov 27 16:05:05 crc kubenswrapper[4707]: I1127 16:05:05.885700 4707 generic.go:334] "Generic (PLEG): container finished" podID="55af9c67-18ce-46f1-a761-d11ce16f42d6" containerID="eec0d7ba21b34dfd1c2f196273fe7ab74d1f1d2ce8c0a31e345dc47ce1e25d27" exitCode=1 Nov 27 16:05:05 crc kubenswrapper[4707]: I1127 16:05:05.885776 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xkmt7" event={"ID":"55af9c67-18ce-46f1-a761-d11ce16f42d6","Type":"ContainerDied","Data":"eec0d7ba21b34dfd1c2f196273fe7ab74d1f1d2ce8c0a31e345dc47ce1e25d27"} Nov 27 16:05:05 crc kubenswrapper[4707]: I1127 16:05:05.885887 4707 scope.go:117] "RemoveContainer" containerID="6b6a6c3ac88a829633b920185df5c454e35c8447a7b1a9af096ab26e4fcabba5" Nov 27 16:05:05 crc kubenswrapper[4707]: I1127 16:05:05.886990 4707 scope.go:117] "RemoveContainer" containerID="eec0d7ba21b34dfd1c2f196273fe7ab74d1f1d2ce8c0a31e345dc47ce1e25d27" Nov 27 16:05:05 crc kubenswrapper[4707]: E1127 16:05:05.887310 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-xkmt7_openshift-ovn-kubernetes(55af9c67-18ce-46f1-a761-d11ce16f42d6)\"" pod="openshift-ovn-kubernetes/ovnkube-node-xkmt7" podUID="55af9c67-18ce-46f1-a761-d11ce16f42d6" Nov 27 16:05:05 crc kubenswrapper[4707]: I1127 16:05:05.914769 4707 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a94c3c43-c57d-4580-82d4-22eb0ac9387c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04f66a2313def4600426a075de4f77c75a876b3246344cec810b4cf25ea3505e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2df8580fadfd8a1a25979050a7dc698d4fc8177b121311817f40c26e73bf9371\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a938
0066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://25dae96b066cc3da8d6fefd13955cb9d1776c3f9fa93afa6be256bb904f523c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fda5b69770eee7245fc07549784cad1211d45ceb0d65276764e36021cde11777\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fda5b69770eee7245fc07549784cad1211d45ceb0d65276764e36021cde11777\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:03:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:03:46Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:03:45Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:05:05Z is after 2025-08-24T17:21:41Z" Nov 27 16:05:05 crc kubenswrapper[4707]: I1127 16:05:05.930585 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"991f0a41-0245-4bfa-afa1-84f5bde15111\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5738c0efc1bf2bd3189a81861d6548f0599bbbef152df871c689378185304b7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://803038378abbaeb0c775de20a85ccb96a9e420f5e0434a41ad1d8cda81bbd217\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://803038378abbaeb0c775de20a85ccb96a9e420f5e0434a41ad1d8cda81bbd217\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:03:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:03:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:05:05Z is after 2025-08-24T17:21:41Z" Nov 27 16:05:05 crc kubenswrapper[4707]: I1127 16:05:05.931047 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:05:05 crc kubenswrapper[4707]: I1127 16:05:05.931117 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:05:05 crc kubenswrapper[4707]: I1127 16:05:05.931141 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:05:05 crc kubenswrapper[4707]: I1127 16:05:05.931175 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:05:05 crc kubenswrapper[4707]: I1127 16:05:05.931199 4707 setters.go:603] "Node 
became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:05:05Z","lastTransitionTime":"2025-11-27T16:05:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:05:05 crc kubenswrapper[4707]: I1127 16:05:05.951688 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:05Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:05:05Z is after 2025-08-24T17:21:41Z" Nov 27 16:05:05 crc kubenswrapper[4707]: I1127 16:05:05.970360 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://57225b64e7671f397832e9ba112febbde78ae4c24b37a8f097d4bdeda8931c0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6960cfa0c6d0dedc432f90bf267ce02fe8264924a43865f79eadd9affbe955f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:05:05Z is after 2025-08-24T17:21:41Z" Nov 27 16:05:05 crc kubenswrapper[4707]: I1127 16:05:05.988631 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6cf743e8776aa2de4138704d9e163af8baf7b8e5db139c13775673cf32a1ee88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-11-27T16:05:05Z is after 2025-08-24T17:21:41Z" Nov 27 16:05:06 crc kubenswrapper[4707]: I1127 16:05:06.012352 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-9c4xg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37bcfa90-e50b-4a63-96b2-cd8a0ce5bbf7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://66dcf851b2fb739c3dae214852a24405d2c8b4462dde998903b9dfe7c9e61e00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-a
ccess-7hwzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8cd7c5e0c832499cb9ddc7b18c7d260de06df34208f86284ebe77964bfcf1c12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8cd7c5e0c832499cb9ddc7b18c7d260de06df34208f86284ebe77964bfcf1c12\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:04:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hwzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4438fd6ab08cd53dff20617ccadaa81e4bddbd7ced310f190d02613079f96f64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"
started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4438fd6ab08cd53dff20617ccadaa81e4bddbd7ced310f190d02613079f96f64\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:04:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:04:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hwzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19a31a44d1ecfee004f2a705df22a188782cd9f235848ddf5f48bd740833c6a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://19a31a44d1ecfee004f2a705df22a188782cd9f235848ddf5f48bd740833c6a8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:04:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:04:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\
\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hwzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://db8a59276d18141805bc74721d625e2004f2cea1f47efc90b3290c68eba2a35b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://db8a59276d18141805bc74721d625e2004f2cea1f47efc90b3290c68eba2a35b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:04:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:04:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hwzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://21efa1504e98747329f8a9220126ea259edde61bd4d272ea55691e9952950252\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabout
s-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://21efa1504e98747329f8a9220126ea259edde61bd4d272ea55691e9952950252\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:04:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:04:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hwzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://851c05dbd0947990d464d1bd72d62577bdf12517aea808904aa01fc79fd2d7c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://851c05dbd0947990d464d1bd72d62577bdf12517aea808904aa01fc79fd2d7c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:04:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:04:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hwzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:04:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-9c4xg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:05:06Z is after 2025-08-24T17:21:41Z" Nov 27 16:05:06 crc kubenswrapper[4707]: I1127 16:05:06.034709 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:05:06 crc kubenswrapper[4707]: I1127 16:05:06.034759 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:05:06 crc kubenswrapper[4707]: I1127 16:05:06.034781 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:05:06 crc kubenswrapper[4707]: I1127 16:05:06.034808 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:05:06 crc kubenswrapper[4707]: I1127 16:05:06.034829 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:05:06Z","lastTransitionTime":"2025-11-27T16:05:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:05:06 crc kubenswrapper[4707]: I1127 16:05:06.045244 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xkmt7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"55af9c67-18ce-46f1-a761-d11ce16f42d6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a4eadca4b464dec02ca7aba6bc23d95b9c3965294d4b0224299279c630f3718\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6xmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3718d2f738ef780c738566c163d52afe355de6b8eb6004aa8e1dd5f37aa84d68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6xmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://44699a190cf1065e537f588d7479a10aab8fd4ffd1df2348c616d999ffdf7077\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6xmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6432d53914cee9e016c2134d99e48fa9ff1c05c0b88033992c2e0c70fc40d2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:07Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6xmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://93825fe5d9cf2d63c3e723b063f417f7e92442b14365852f935a06d316f443fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6xmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ae3fab13b078702f87ca8c586c9592c4554c95ead662d81ca65b50f368b6dcb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6xmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eec0d7ba21b34dfd1c2f196273fe7ab74d1f1d2ce8c0a31e345dc47ce1e25d27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b6a6c3ac88a829633b920185df5c454e35c8447a7b1a9af096ab26e4fcabba5\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-27T16:04:33Z\\\",\\\"message\\\":\\\"oSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.167:8441: 10.217.4.167:8442: 10.217.4.167:8444:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {62af83f3-e0c8-4632-aaaa-17488566a9d8}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: 
UUIDName:}]\\\\nI1127 16:04:33.229791 6338 ovn.go:134] Ensuring zone local for Pod openshift-machine-config-operator/machine-config-daemon-c995m in node crc\\\\nI1127 16:04:33.233295 6338 obj_retry.go:386] Retry successful for *v1.Pod openshift-machine-config-operator/machine-config-daemon-c995m after 0 failed attempt(s)\\\\nI1127 16:04:33.233305 6338 default_network_controller.go:776] Recording success event on pod openshift-machine-config-operator/machine-config-daemon-c995m\\\\nI1127 16:04:33.229689 6338 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-diagnostics/network-check-source-55646444c4-trplf\\\\nI1127 16:04:33.233329 6338 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-diagnostics/network-check-source-55646444c4-trplf\\\\nF1127 16:04:33.229645 6338 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-27T16:04:32Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eec0d7ba21b34dfd1c2f196273fe7ab74d1f1d2ce8c0a31e345dc47ce1e25d27\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-27T16:05:05Z\\\",\\\"message\\\":\\\"ns-operator/metrics LB per-node configs for network=default: []services.lbConfig(nil)\\\\nI1127 16:05:05.221951 6686 default_network_controller.go:776] Recording success event on pod openshift-machine-config-operator/kube-rbac-proxy-crio-crc\\\\nI1127 16:05:05.222871 6686 services_controller.go:445] Built service openshift-dns-operator/metrics LB template configs for network=default: []services.lbConfig(nil)\\\\nI1127 16:05:05.222889 6686 loadbalancer.go:304] Deleted 0 stale LBs for 
map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-console/downloads\\\\\\\"}\\\\nI1127 16:05:05.222908 6686 services_controller.go:360] Finished syncing service downloads on namespace openshift-console for network=default : 1.67666ms\\\\nI1127 16:05:05.222903 6686 services_controller.go:451] Built service openshift-dns-operator/metrics cluster-wide LB for network=default: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-dns-operator/metrics_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-dns-operator/metrics\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.174\\\\\\\", Port:9393, Template:(*services.Template)(nil)}, 
Target\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-27T16:05:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\
\\"kube-api-access-p6xmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://db3362aeb65b368d2d9181cc898270fffb60938d0e9d32a862130d06a9f572a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6xmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c327fee553e5575dec76bc4456bc59424c89a9eea360cd4b23b594eff4c67a8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c327fee553e5575dec76bc4456bc59424c89a9eea360cd4b23b594eff4c67
a8d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:04:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6xmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:04:06Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-xkmt7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:05:06Z is after 2025-08-24T17:21:41Z" Nov 27 16:05:06 crc kubenswrapper[4707]: I1127 16:05:06.060941 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-5x488" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"433a450e-2371-4893-b21e-19707b40e28f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://12b178a129ce5efbfdb77d023236ed9d674d0aba3c3a2903589aae995a5a753c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52jnc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://35888508a810deea27c61cbf1ec7362be9f61
f99a63d4ab2b4c296a9ed47c65f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52jnc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:04:19Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-5x488\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:05:06Z is after 2025-08-24T17:21:41Z" Nov 27 16:05:06 crc kubenswrapper[4707]: I1127 16:05:06.073657 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-pdngz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5a0b58a3-9ac2-4446-b507-88eba42aa060\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://feac173e7b0d6e3cb53bd8d407d3d08e59e099a63702a377b7515283f85703ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6sq6r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:04:20Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-pdngz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:05:06Z is after 2025-08-24T17:21:41Z" Nov 27 16:05:06 crc kubenswrapper[4707]: I1127 16:05:06.085101 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-qcl5k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d382481-3c1e-49ed-8e27-265d495aa776\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:21Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:21Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j2krv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j2krv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:04:21Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-qcl5k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:05:06Z is after 2025-08-24T17:21:41Z" Nov 27 16:05:06 crc 
kubenswrapper[4707]: I1127 16:05:06.100352 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b00fa8691d479d1db194704ff73bb85f1108c9541dd09ed4c1b02d0a3f0ab53d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:05:06Z is after 2025-08-24T17:21:41Z" Nov 27 16:05:06 crc kubenswrapper[4707]: I1127 16:05:06.119336 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-js6mm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ca48c08-f39d-41a2-847a-c893a2111492\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://094325a9e2d642c065e7c06cf63ba0fcad78c5220368b42504c2e87735d4a542\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://be6a704ee152614d36d4831778b016c009b0aab2d6cf1c2ec9767f8ecdeb7af6\\\",\\\"
exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-27T16:04:54Z\\\",\\\"message\\\":\\\"2025-11-27T16:04:08+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_4a45ad71-8ab3-45c5-802b-cd135105e4f2\\\\n2025-11-27T16:04:08+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_4a45ad71-8ab3-45c5-802b-cd135105e4f2 to /host/opt/cni/bin/\\\\n2025-11-27T16:04:09Z [verbose] multus-daemon started\\\\n2025-11-27T16:04:09Z [verbose] Readiness Indicator file check\\\\n2025-11-27T16:04:54Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-27T16:04:07Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\
\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bjt58\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:04:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-js6mm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:05:06Z is after 2025-08-24T17:21:41Z" Nov 27 16:05:06 crc kubenswrapper[4707]: I1127 16:05:06.138231 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:05:06 crc kubenswrapper[4707]: I1127 16:05:06.138291 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:05:06 crc kubenswrapper[4707]: I1127 16:05:06.138310 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:05:06 crc kubenswrapper[4707]: I1127 16:05:06.138336 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:05:06 crc 
kubenswrapper[4707]: I1127 16:05:06.138357 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:05:06Z","lastTransitionTime":"2025-11-27T16:05:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:05:06 crc kubenswrapper[4707]: I1127 16:05:06.152714 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"59e09a52-f64d-41da-a23b-7d3e34e8f109\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://489349489e52f4a113203c0c196cfe1ef96e97c6fc3c3d777a28e90f22aaee92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ea14f8eb17dbb350f7c0ad2285b4bb5ca2d8b2aae6840646d68a397ac2043d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://775fc674c0dab97e0c5a2548d467b973c592458dad1b88ce92344486a3a28eb5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\
\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f502a2429578531cb3ebbfde6bb4c049eed201a496ff3772c06ca28cace89b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://30b52b26bdca5b0705186872e6dd2eba3ece6436e866a4a5e30f4c2ecdbbaf20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{
\\\"containerID\\\":\\\"cri-o://6b6be834802616cdf10f8b694bb0ce1b8828d0885e8801c5d890e74837d672a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b6be834802616cdf10f8b694bb0ce1b8828d0885e8801c5d890e74837d672a8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:03:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://de1e5852273af33e41f21c4df29c3e32cbfd6b2eb200b64d9e5353a85718074c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://de1e5852273af33e41f21c4df29c3e32cbfd6b2eb200b64d9e5353a85718074c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:03:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:03:47Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://c6f7eb0e4b78b6d2e316bd2c8f2ad3081c8eb0ca3c55a7063ccf281572ba2992\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c68
77441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c6f7eb0e4b78b6d2e316bd2c8f2ad3081c8eb0ca3c55a7063ccf281572ba2992\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:03:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:03:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:03:45Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:05:06Z is after 2025-08-24T17:21:41Z" Nov 27 16:05:06 crc kubenswrapper[4707]: I1127 16:05:06.174130 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f300806f-b97e-4453-b011-19442ca1240d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a25770e0e3eebd7e807f3775101a1cd5d47073e4a86850da475fecd4a9c88ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7412dcdb5a249551f558a01a8fecef5f873e5e0a16e917d5cbc1ca74a917f228\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://791f332aec1698d02da5ec585ad2fbfcb77529f24abc0e3a77581be38d7ce277\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a47eb0592819f0bc514399e163c88e2c63111aa0ec4a28f00f809ce1bdf2aa8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f12d0d89698cd6f702c595882dc7ed97fbc585b8a4ef3a8b3755c984a823dd9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-27T16:04:05Z\\\"
,\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1127 16:03:58.832133 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1127 16:03:58.833970 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3721107874/tls.crt::/tmp/serving-cert-3721107874/tls.key\\\\\\\"\\\\nI1127 16:04:05.094602 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1127 16:04:05.099813 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1127 16:04:05.099850 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1127 16:04:05.099879 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1127 16:04:05.099886 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1127 16:04:05.108199 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1127 16:04:05.108225 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1127 16:04:05.108231 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1127 16:04:05.108269 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1127 16:04:05.108276 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1127 16:04:05.108281 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1127 16:04:05.108284 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1127 16:04:05.108289 1 secure_serving.go:69] Use 
of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1127 16:04:05.114674 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-27T16:03:48Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf16f5508845d69184573b51f3052f79123556667cbd3ddf4b2adfb135312a3a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:48Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://608ad63f8f5352dad650253df603590a1cc20bc05d86dd561d659295264ab4b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://608ad63f8f5352dad650253df603590a
1cc20bc05d86dd561d659295264ab4b5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:03:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:03:45Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:05:06Z is after 2025-08-24T17:21:41Z" Nov 27 16:05:06 crc kubenswrapper[4707]: I1127 16:05:06.194177 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:05:06Z is after 2025-08-24T17:21:41Z" Nov 27 16:05:06 crc kubenswrapper[4707]: I1127 16:05:06.210834 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-c995m" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a83beb0d-8dd1-434a-ace2-933f98e3956f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://419d9b9f3f298028cf5f49afd438f1de90b8d1203cdfb09b924878b59e0fb723\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5xtzp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6227d8aa794cce73ef7607c3e6cf8f853c829bf5
6493acf04c0b555ddd264ba5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5xtzp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:04:06Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-c995m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:05:06Z is after 2025-08-24T17:21:41Z" Nov 27 16:05:06 crc kubenswrapper[4707]: I1127 16:05:06.233731 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7f481770-a9a1-422a-a707-e841a24a1503\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f0e4a5fa520e0dc048bef42e48e14bd06eb72586ca2b8e145734aa5450aa869d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://36f6a5537d5031d21ea157fbcedea40756eac2c44aa1859c0cfbf3dc9a9d9a13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c7144aa332827972c3ff984eb05b835b88c4d9d05552ac631fbca49f13e193a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8211f74a0c0d16257330ebbb5727247d4229848595af3cb65f1cb752b05f3c37\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-11-27T16:03:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:03:45Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:05:06Z is after 2025-08-24T17:21:41Z" Nov 27 16:05:06 crc kubenswrapper[4707]: I1127 16:05:06.241151 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:05:06 crc kubenswrapper[4707]: I1127 16:05:06.241197 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:05:06 crc kubenswrapper[4707]: I1127 16:05:06.241214 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:05:06 crc kubenswrapper[4707]: I1127 16:05:06.241240 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:05:06 crc kubenswrapper[4707]: I1127 16:05:06.241259 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:05:06Z","lastTransitionTime":"2025-11-27T16:05:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:05:06 crc kubenswrapper[4707]: I1127 16:05:06.252808 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:05Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:05:06Z is after 2025-08-24T17:21:41Z" Nov 27 16:05:06 crc kubenswrapper[4707]: I1127 16:05:06.268778 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-bhmsc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d1ec526b-6fb0-4c87-bd87-6aaf843e0c78\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f3b764fd994edba08f5e909683bd065ecf09fb7c9906457e37bb91343aad14a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kvs85\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:04:06Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-bhmsc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:05:06Z is after 2025-08-24T17:21:41Z" Nov 27 16:05:06 crc kubenswrapper[4707]: I1127 16:05:06.344989 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:05:06 crc kubenswrapper[4707]: I1127 16:05:06.345048 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:05:06 crc kubenswrapper[4707]: I1127 16:05:06.345065 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:05:06 crc kubenswrapper[4707]: I1127 16:05:06.345090 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:05:06 crc kubenswrapper[4707]: I1127 16:05:06.345107 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:05:06Z","lastTransitionTime":"2025-11-27T16:05:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:05:06 crc kubenswrapper[4707]: I1127 16:05:06.448218 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:05:06 crc kubenswrapper[4707]: I1127 16:05:06.448271 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:05:06 crc kubenswrapper[4707]: I1127 16:05:06.448286 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:05:06 crc kubenswrapper[4707]: I1127 16:05:06.448310 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:05:06 crc kubenswrapper[4707]: I1127 16:05:06.448330 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:05:06Z","lastTransitionTime":"2025-11-27T16:05:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:05:06 crc kubenswrapper[4707]: I1127 16:05:06.550794 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:05:06 crc kubenswrapper[4707]: I1127 16:05:06.550853 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:05:06 crc kubenswrapper[4707]: I1127 16:05:06.550874 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:05:06 crc kubenswrapper[4707]: I1127 16:05:06.550898 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:05:06 crc kubenswrapper[4707]: I1127 16:05:06.550916 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:05:06Z","lastTransitionTime":"2025-11-27T16:05:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:05:06 crc kubenswrapper[4707]: I1127 16:05:06.654117 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:05:06 crc kubenswrapper[4707]: I1127 16:05:06.654167 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:05:06 crc kubenswrapper[4707]: I1127 16:05:06.654179 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:05:06 crc kubenswrapper[4707]: I1127 16:05:06.654202 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:05:06 crc kubenswrapper[4707]: I1127 16:05:06.654213 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:05:06Z","lastTransitionTime":"2025-11-27T16:05:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:05:06 crc kubenswrapper[4707]: I1127 16:05:06.758054 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:05:06 crc kubenswrapper[4707]: I1127 16:05:06.758122 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:05:06 crc kubenswrapper[4707]: I1127 16:05:06.758139 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:05:06 crc kubenswrapper[4707]: I1127 16:05:06.758163 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:05:06 crc kubenswrapper[4707]: I1127 16:05:06.758184 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:05:06Z","lastTransitionTime":"2025-11-27T16:05:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:05:06 crc kubenswrapper[4707]: I1127 16:05:06.861387 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:05:06 crc kubenswrapper[4707]: I1127 16:05:06.861423 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:05:06 crc kubenswrapper[4707]: I1127 16:05:06.861431 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:05:06 crc kubenswrapper[4707]: I1127 16:05:06.861446 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:05:06 crc kubenswrapper[4707]: I1127 16:05:06.861455 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:05:06Z","lastTransitionTime":"2025-11-27T16:05:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:05:06 crc kubenswrapper[4707]: I1127 16:05:06.892974 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xkmt7_55af9c67-18ce-46f1-a761-d11ce16f42d6/ovnkube-controller/3.log" Nov 27 16:05:06 crc kubenswrapper[4707]: I1127 16:05:06.899612 4707 scope.go:117] "RemoveContainer" containerID="eec0d7ba21b34dfd1c2f196273fe7ab74d1f1d2ce8c0a31e345dc47ce1e25d27" Nov 27 16:05:06 crc kubenswrapper[4707]: E1127 16:05:06.899867 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-xkmt7_openshift-ovn-kubernetes(55af9c67-18ce-46f1-a761-d11ce16f42d6)\"" pod="openshift-ovn-kubernetes/ovnkube-node-xkmt7" podUID="55af9c67-18ce-46f1-a761-d11ce16f42d6" Nov 27 16:05:06 crc kubenswrapper[4707]: I1127 16:05:06.955727 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7f481770-a9a1-422a-a707-e841a24a1503\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f0e4a5fa520e0dc048bef42e48e14bd06eb72586ca2b8e145734aa5450aa869d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://36f6a5537d5031d21ea157fbcedea40756eac2c44aa1859c0cfbf3dc9a9d9a13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c7144aa332827972c3ff984eb05b835b88c4d9d05552ac631fbca49f13e193a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8211f74a0c0d16257330ebbb5727247d4229848595af3cb65f1cb752b05f3c37\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-11-27T16:03:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:03:45Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:05:06Z is after 2025-08-24T17:21:41Z" Nov 27 16:05:06 crc kubenswrapper[4707]: I1127 16:05:06.964708 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:05:06 crc kubenswrapper[4707]: I1127 16:05:06.964763 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:05:06 crc kubenswrapper[4707]: I1127 16:05:06.964781 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:05:06 crc kubenswrapper[4707]: I1127 16:05:06.964810 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:05:06 crc kubenswrapper[4707]: I1127 16:05:06.964829 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:05:06Z","lastTransitionTime":"2025-11-27T16:05:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:05:06 crc kubenswrapper[4707]: I1127 16:05:06.974359 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:05Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:05:06Z is after 2025-08-24T17:21:41Z" Nov 27 16:05:06 crc kubenswrapper[4707]: I1127 16:05:06.989653 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-bhmsc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d1ec526b-6fb0-4c87-bd87-6aaf843e0c78\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f3b764fd994edba08f5e909683bd065ecf09fb7c9906457e37bb91343aad14a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kvs85\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:04:06Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-bhmsc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:05:06Z is after 2025-08-24T17:21:41Z" Nov 27 16:05:07 crc kubenswrapper[4707]: I1127 16:05:07.011798 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-9c4xg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37bcfa90-e50b-4a63-96b2-cd8a0ce5bbf7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://66dcf851b2fb739c3dae214852a24405d2c8b4462dde998903b9dfe7c9e61e00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97
f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hwzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8cd7c5e0c832499cb9ddc7b18c7d260de06df34208f86284ebe77964bfcf1c12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8cd7c5e0c832499cb9ddc7b18c7d260de06df34208f86284ebe77964bfcf1c12\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:04:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hwzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4438fd6ab08cd53dff20617ccadaa81e4bddbd7ced310f190d02613079f96f64\\\",\\\"image\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4438fd6ab08cd53dff20617ccadaa81e4bddbd7ced310f190d02613079f96f64\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:04:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:04:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hwzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19a31a44d1ecfee004f2a705df22a188782cd9f235848ddf5f48bd740833c6a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://19a31a44d1ecfee004f2a705df22a188782cd9f235848ddf5f48bd740833c6a8\\\",\\\"exitCode\\\
":0,\\\"finishedAt\\\":\\\"2025-11-27T16:04:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:04:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hwzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://db8a59276d18141805bc74721d625e2004f2cea1f47efc90b3290c68eba2a35b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://db8a59276d18141805bc74721d625e2004f2cea1f47efc90b3290c68eba2a35b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:04:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:04:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hwzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://21efa1504e98747329f8a9220126ea25
9edde61bd4d272ea55691e9952950252\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://21efa1504e98747329f8a9220126ea259edde61bd4d272ea55691e9952950252\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:04:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:04:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hwzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://851c05dbd0947990d464d1bd72d62577bdf12517aea808904aa01fc79fd2d7c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://851c05dbd0947990d464d1bd72d62577bdf12517aea808904aa01fc79fd2d7c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:04:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\"
:\\\"2025-11-27T16:04:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hwzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:04:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-9c4xg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:05:07Z is after 2025-08-24T17:21:41Z" Nov 27 16:05:07 crc kubenswrapper[4707]: I1127 16:05:07.046910 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xkmt7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"55af9c67-18ce-46f1-a761-d11ce16f42d6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a4eadca4b464dec02ca7aba6bc23d95b9c3965294d4b0224299279c630f3718\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6xmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3718d2f738ef780c738566c163d52afe355de6b8eb6004aa8e1dd5f37aa84d68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6xmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://44699a190cf1065e537f588d7479a10aab8fd4ffd1df2348c616d999ffdf7077\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6xmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6432d53914cee9e016c2134d99e48fa9ff1c05c0b88033992c2e0c70fc40d2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6xmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://93825fe5d9cf2d63c3e723b063f417f7e92442b14365852f935a06d316f443fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6xmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ae3fab13b078702f87ca8c586c9592c4554c95ead662d81ca65b50f368b6dcb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6xmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eec0d7ba21b34dfd1c2f196273fe7ab74d1f1d2ce8c0a31e345dc47ce1e25d27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eec0d7ba21b34dfd1c2f196273fe7ab74d1f1d2ce8c0a31e345dc47ce1e25d27\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-27T16:05:05Z\\\",\\\"message\\\":\\\"ns-operator/metrics LB per-node configs for network=default: 
[]services.lbConfig(nil)\\\\nI1127 16:05:05.221951 6686 default_network_controller.go:776] Recording success event on pod openshift-machine-config-operator/kube-rbac-proxy-crio-crc\\\\nI1127 16:05:05.222871 6686 services_controller.go:445] Built service openshift-dns-operator/metrics LB template configs for network=default: []services.lbConfig(nil)\\\\nI1127 16:05:05.222889 6686 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-console/downloads\\\\\\\"}\\\\nI1127 16:05:05.222908 6686 services_controller.go:360] Finished syncing service downloads on namespace openshift-console for network=default : 1.67666ms\\\\nI1127 16:05:05.222903 6686 services_controller.go:451] Built service openshift-dns-operator/metrics cluster-wide LB for network=default: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-dns-operator/metrics_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-dns-operator/metrics\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.174\\\\\\\", Port:9393, Template:(*services.Template)(nil)}, Target\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-27T16:05:04Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-xkmt7_openshift-ovn-kubernetes(55af9c67-18ce-46f1-a761-d11ce16f42d6)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6xmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://db3362aeb65b368d2d9181cc898270fffb60938d0e9d32a862130d06a9f572a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6xmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c327fee553e5575dec76bc4456bc59424c89a9eea360cd4b23b594eff4c67a8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c327fee553e5575dec
76bc4456bc59424c89a9eea360cd4b23b594eff4c67a8d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:04:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6xmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:04:06Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-xkmt7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:05:07Z is after 2025-08-24T17:21:41Z" Nov 27 16:05:07 crc kubenswrapper[4707]: I1127 16:05:07.067740 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-5x488" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"433a450e-2371-4893-b21e-19707b40e28f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://12b178a129ce5efbfdb77d023236ed9d674d0aba3c3a2903589aae995a5a753c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52jnc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://35888508a810deea27c61cbf1ec7362be9f61
f99a63d4ab2b4c296a9ed47c65f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52jnc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:04:19Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-5x488\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:05:07Z is after 2025-08-24T17:21:41Z" Nov 27 16:05:07 crc kubenswrapper[4707]: I1127 16:05:07.068860 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:05:07 crc kubenswrapper[4707]: I1127 16:05:07.068914 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:05:07 crc kubenswrapper[4707]: I1127 16:05:07.068931 4707 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:05:07 crc kubenswrapper[4707]: I1127 16:05:07.068987 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:05:07 crc kubenswrapper[4707]: I1127 16:05:07.069006 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:05:07Z","lastTransitionTime":"2025-11-27T16:05:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:05:07 crc kubenswrapper[4707]: I1127 16:05:07.088332 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a94c3c43-c57d-4580-82d4-22eb0ac9387c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04f66a2313def4600426a075de4f77c75a876b3246344cec810b4cf25ea3505e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2df8580fadfd8a1a25979050a7dc698d4fc8177b121311817f40c26e73bf9371\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://25dae96b066cc3da8d6fefd13955cb9d1776c3f9fa93afa6be256bb904f523c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\"
:{\\\"startedAt\\\":\\\"2025-11-27T16:03:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fda5b69770eee7245fc07549784cad1211d45ceb0d65276764e36021cde11777\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fda5b69770eee7245fc07549784cad1211d45ceb0d65276764e36021cde11777\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:03:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:03:46Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:03:45Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:05:07Z is after 2025-08-24T17:21:41Z" Nov 27 16:05:07 crc kubenswrapper[4707]: I1127 16:05:07.106427 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"991f0a41-0245-4bfa-afa1-84f5bde15111\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5738c0efc1bf2bd3189a81861d6548f0599bbbef152df871c689378185304b7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://803038378abbaeb0c775de20a85ccb96a9e420f5e0434a41ad1d8cda81bbd217\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://803038378abbaeb0c775de20a85ccb96a9e420f5e0434a41ad1d8cda81bbd217\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:03:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:03:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:05:07Z is after 2025-08-24T17:21:41Z" Nov 27 16:05:07 crc kubenswrapper[4707]: I1127 16:05:07.126587 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:05Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:05:07Z is after 2025-08-24T17:21:41Z" Nov 27 16:05:07 crc kubenswrapper[4707]: I1127 16:05:07.147448 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://57225b64e7671f397832e9ba112febbde78ae4c24b37a8f097d4bdeda8931c0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6960cfa0c6d0dedc432f90bf267ce02fe8264924a43865f79eadd9affbe955f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:05:07Z is after 2025-08-24T17:21:41Z" Nov 27 16:05:07 crc kubenswrapper[4707]: I1127 16:05:07.165722 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6cf743e8776aa2de4138704d9e163af8baf7b8e5db139c13775673cf32a1ee88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-11-27T16:05:07Z is after 2025-08-24T17:21:41Z" Nov 27 16:05:07 crc kubenswrapper[4707]: I1127 16:05:07.171361 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:05:07 crc kubenswrapper[4707]: I1127 16:05:07.171447 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:05:07 crc kubenswrapper[4707]: I1127 16:05:07.171465 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:05:07 crc kubenswrapper[4707]: I1127 16:05:07.171493 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:05:07 crc kubenswrapper[4707]: I1127 16:05:07.171513 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:05:07Z","lastTransitionTime":"2025-11-27T16:05:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:05:07 crc kubenswrapper[4707]: I1127 16:05:07.183033 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-pdngz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5a0b58a3-9ac2-4446-b507-88eba42aa060\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://feac173e7b0d6e3cb53bd8d407d3d08e59e099a63702a377b7515283f85703ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes
.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6sq6r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:04:20Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-pdngz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:05:07Z is after 2025-08-24T17:21:41Z" Nov 27 16:05:07 crc kubenswrapper[4707]: I1127 16:05:07.194737 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qcl5k" Nov 27 16:05:07 crc kubenswrapper[4707]: I1127 16:05:07.194788 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 27 16:05:07 crc kubenswrapper[4707]: I1127 16:05:07.194797 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 27 16:05:07 crc kubenswrapper[4707]: E1127 16:05:07.194963 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qcl5k" podUID="7d382481-3c1e-49ed-8e27-265d495aa776" Nov 27 16:05:07 crc kubenswrapper[4707]: I1127 16:05:07.195002 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 27 16:05:07 crc kubenswrapper[4707]: E1127 16:05:07.195126 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 27 16:05:07 crc kubenswrapper[4707]: E1127 16:05:07.195220 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 27 16:05:07 crc kubenswrapper[4707]: E1127 16:05:07.195365 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 27 16:05:07 crc kubenswrapper[4707]: I1127 16:05:07.203022 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-qcl5k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d382481-3c1e-49ed-8e27-265d495aa776\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:21Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:21Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j2krv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j2krv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:04:21Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-qcl5k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:05:07Z is after 2025-08-24T17:21:41Z" Nov 27 16:05:07 crc 
kubenswrapper[4707]: I1127 16:05:07.225602 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b00fa8691d479d1db194704ff73bb85f1108c9541dd09ed4c1b02d0a3f0ab53d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:05:07Z is after 2025-08-24T17:21:41Z" Nov 27 16:05:07 crc kubenswrapper[4707]: I1127 16:05:07.248426 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-js6mm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ca48c08-f39d-41a2-847a-c893a2111492\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://094325a9e2d642c065e7c06cf63ba0fcad78c5220368b42504c2e87735d4a542\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://be6a704ee152614d36d4831778b016c009b0aab2d6cf1c2ec9767f8ecdeb7af6\\\",\\\"
exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-27T16:04:54Z\\\",\\\"message\\\":\\\"2025-11-27T16:04:08+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_4a45ad71-8ab3-45c5-802b-cd135105e4f2\\\\n2025-11-27T16:04:08+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_4a45ad71-8ab3-45c5-802b-cd135105e4f2 to /host/opt/cni/bin/\\\\n2025-11-27T16:04:09Z [verbose] multus-daemon started\\\\n2025-11-27T16:04:09Z [verbose] Readiness Indicator file check\\\\n2025-11-27T16:04:54Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-27T16:04:07Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\
\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bjt58\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:04:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-js6mm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:05:07Z is after 2025-08-24T17:21:41Z" Nov 27 16:05:07 crc kubenswrapper[4707]: I1127 16:05:07.274784 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:05:07 crc kubenswrapper[4707]: I1127 16:05:07.274846 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:05:07 crc kubenswrapper[4707]: I1127 16:05:07.274864 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:05:07 crc kubenswrapper[4707]: I1127 16:05:07.274891 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:05:07 crc 
kubenswrapper[4707]: I1127 16:05:07.274910 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:05:07Z","lastTransitionTime":"2025-11-27T16:05:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:05:07 crc kubenswrapper[4707]: I1127 16:05:07.283818 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"59e09a52-f64d-41da-a23b-7d3e34e8f109\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://489349489e52f4a113203c0c196cfe1ef96e97c6fc3c3d777a28e90f22aaee92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ea14f8eb17dbb350f7c0ad2285b4bb5ca2d8b2aae6840646d68a397ac2043d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://775fc674c0dab97e0c5a2548d467b973c592458dad1b88ce92344486a3a28eb5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\
\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f502a2429578531cb3ebbfde6bb4c049eed201a496ff3772c06ca28cace89b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://30b52b26bdca5b0705186872e6dd2eba3ece6436e866a4a5e30f4c2ecdbbaf20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{
\\\"containerID\\\":\\\"cri-o://6b6be834802616cdf10f8b694bb0ce1b8828d0885e8801c5d890e74837d672a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b6be834802616cdf10f8b694bb0ce1b8828d0885e8801c5d890e74837d672a8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:03:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://de1e5852273af33e41f21c4df29c3e32cbfd6b2eb200b64d9e5353a85718074c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://de1e5852273af33e41f21c4df29c3e32cbfd6b2eb200b64d9e5353a85718074c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:03:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:03:47Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://c6f7eb0e4b78b6d2e316bd2c8f2ad3081c8eb0ca3c55a7063ccf281572ba2992\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c68
77441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c6f7eb0e4b78b6d2e316bd2c8f2ad3081c8eb0ca3c55a7063ccf281572ba2992\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:03:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:03:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:03:45Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:05:07Z is after 2025-08-24T17:21:41Z" Nov 27 16:05:07 crc kubenswrapper[4707]: I1127 16:05:07.307767 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f300806f-b97e-4453-b011-19442ca1240d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a25770e0e3eebd7e807f3775101a1cd5d47073e4a86850da475fecd4a9c88ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7412dcdb5a249551f558a01a8fecef5f873e5e0a16e917d5cbc1ca74a917f228\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://791f332aec1698d02da5ec585ad2fbfcb77529f24abc0e3a77581be38d7ce277\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a47eb0592819f0bc514399e163c88e2c63111aa0ec4a28f00f809ce1bdf2aa8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f12d0d89698cd6f702c595882dc7ed97fbc585b8a4ef3a8b3755c984a823dd9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-27T16:04:05Z\\\"
,\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1127 16:03:58.832133 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1127 16:03:58.833970 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3721107874/tls.crt::/tmp/serving-cert-3721107874/tls.key\\\\\\\"\\\\nI1127 16:04:05.094602 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1127 16:04:05.099813 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1127 16:04:05.099850 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1127 16:04:05.099879 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1127 16:04:05.099886 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1127 16:04:05.108199 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1127 16:04:05.108225 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1127 16:04:05.108231 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1127 16:04:05.108269 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1127 16:04:05.108276 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1127 16:04:05.108281 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1127 16:04:05.108284 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1127 16:04:05.108289 1 secure_serving.go:69] Use 
of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1127 16:04:05.114674 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-27T16:03:48Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf16f5508845d69184573b51f3052f79123556667cbd3ddf4b2adfb135312a3a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:48Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://608ad63f8f5352dad650253df603590a1cc20bc05d86dd561d659295264ab4b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://608ad63f8f5352dad650253df603590a
1cc20bc05d86dd561d659295264ab4b5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:03:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:03:45Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:05:07Z is after 2025-08-24T17:21:41Z" Nov 27 16:05:07 crc kubenswrapper[4707]: I1127 16:05:07.329563 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:05:07Z is after 2025-08-24T17:21:41Z" Nov 27 16:05:07 crc kubenswrapper[4707]: I1127 16:05:07.347412 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-c995m" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a83beb0d-8dd1-434a-ace2-933f98e3956f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://419d9b9f3f298028cf5f49afd438f1de90b8d1203cdfb09b924878b59e0fb723\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5xtzp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6227d8aa794cce73ef7607c3e6cf8f853c829bf5
6493acf04c0b555ddd264ba5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5xtzp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:04:06Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-c995m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:05:07Z is after 2025-08-24T17:21:41Z" Nov 27 16:05:07 crc kubenswrapper[4707]: I1127 16:05:07.378754 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:05:07 crc kubenswrapper[4707]: I1127 16:05:07.378823 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:05:07 crc kubenswrapper[4707]: I1127 16:05:07.378844 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:05:07 crc 
kubenswrapper[4707]: I1127 16:05:07.378879 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:05:07 crc kubenswrapper[4707]: I1127 16:05:07.378903 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:05:07Z","lastTransitionTime":"2025-11-27T16:05:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:05:07 crc kubenswrapper[4707]: I1127 16:05:07.482731 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:05:07 crc kubenswrapper[4707]: I1127 16:05:07.482782 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:05:07 crc kubenswrapper[4707]: I1127 16:05:07.482793 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:05:07 crc kubenswrapper[4707]: I1127 16:05:07.482811 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:05:07 crc kubenswrapper[4707]: I1127 16:05:07.482823 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:05:07Z","lastTransitionTime":"2025-11-27T16:05:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:05:07 crc kubenswrapper[4707]: I1127 16:05:07.586286 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:05:07 crc kubenswrapper[4707]: I1127 16:05:07.586347 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:05:07 crc kubenswrapper[4707]: I1127 16:05:07.586366 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:05:07 crc kubenswrapper[4707]: I1127 16:05:07.586419 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:05:07 crc kubenswrapper[4707]: I1127 16:05:07.586437 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:05:07Z","lastTransitionTime":"2025-11-27T16:05:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:05:07 crc kubenswrapper[4707]: I1127 16:05:07.689260 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:05:07 crc kubenswrapper[4707]: I1127 16:05:07.689357 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:05:07 crc kubenswrapper[4707]: I1127 16:05:07.689397 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:05:07 crc kubenswrapper[4707]: I1127 16:05:07.689423 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:05:07 crc kubenswrapper[4707]: I1127 16:05:07.689440 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:05:07Z","lastTransitionTime":"2025-11-27T16:05:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:05:07 crc kubenswrapper[4707]: I1127 16:05:07.792775 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:05:07 crc kubenswrapper[4707]: I1127 16:05:07.792853 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:05:07 crc kubenswrapper[4707]: I1127 16:05:07.792870 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:05:07 crc kubenswrapper[4707]: I1127 16:05:07.792898 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:05:07 crc kubenswrapper[4707]: I1127 16:05:07.792915 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:05:07Z","lastTransitionTime":"2025-11-27T16:05:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:05:07 crc kubenswrapper[4707]: I1127 16:05:07.896220 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:05:07 crc kubenswrapper[4707]: I1127 16:05:07.896286 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:05:07 crc kubenswrapper[4707]: I1127 16:05:07.896303 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:05:07 crc kubenswrapper[4707]: I1127 16:05:07.896328 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:05:07 crc kubenswrapper[4707]: I1127 16:05:07.896346 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:05:07Z","lastTransitionTime":"2025-11-27T16:05:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:05:08 crc kubenswrapper[4707]: I1127 16:05:08.000283 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:05:08 crc kubenswrapper[4707]: I1127 16:05:08.000346 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:05:08 crc kubenswrapper[4707]: I1127 16:05:08.000394 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:05:08 crc kubenswrapper[4707]: I1127 16:05:08.000422 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:05:08 crc kubenswrapper[4707]: I1127 16:05:08.000442 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:05:08Z","lastTransitionTime":"2025-11-27T16:05:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:05:08 crc kubenswrapper[4707]: I1127 16:05:08.103340 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:05:08 crc kubenswrapper[4707]: I1127 16:05:08.103411 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:05:08 crc kubenswrapper[4707]: I1127 16:05:08.103430 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:05:08 crc kubenswrapper[4707]: I1127 16:05:08.103454 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:05:08 crc kubenswrapper[4707]: I1127 16:05:08.103472 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:05:08Z","lastTransitionTime":"2025-11-27T16:05:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:05:08 crc kubenswrapper[4707]: I1127 16:05:08.205985 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:05:08 crc kubenswrapper[4707]: I1127 16:05:08.206045 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:05:08 crc kubenswrapper[4707]: I1127 16:05:08.206062 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:05:08 crc kubenswrapper[4707]: I1127 16:05:08.206085 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:05:08 crc kubenswrapper[4707]: I1127 16:05:08.206101 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:05:08Z","lastTransitionTime":"2025-11-27T16:05:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:05:08 crc kubenswrapper[4707]: I1127 16:05:08.309665 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:05:08 crc kubenswrapper[4707]: I1127 16:05:08.309723 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:05:08 crc kubenswrapper[4707]: I1127 16:05:08.309741 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:05:08 crc kubenswrapper[4707]: I1127 16:05:08.309767 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:05:08 crc kubenswrapper[4707]: I1127 16:05:08.309785 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:05:08Z","lastTransitionTime":"2025-11-27T16:05:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:05:08 crc kubenswrapper[4707]: I1127 16:05:08.413418 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:05:08 crc kubenswrapper[4707]: I1127 16:05:08.413493 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:05:08 crc kubenswrapper[4707]: I1127 16:05:08.413511 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:05:08 crc kubenswrapper[4707]: I1127 16:05:08.413534 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:05:08 crc kubenswrapper[4707]: I1127 16:05:08.413551 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:05:08Z","lastTransitionTime":"2025-11-27T16:05:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:05:08 crc kubenswrapper[4707]: I1127 16:05:08.516545 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:05:08 crc kubenswrapper[4707]: I1127 16:05:08.516606 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:05:08 crc kubenswrapper[4707]: I1127 16:05:08.516624 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:05:08 crc kubenswrapper[4707]: I1127 16:05:08.516648 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:05:08 crc kubenswrapper[4707]: I1127 16:05:08.516718 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:05:08Z","lastTransitionTime":"2025-11-27T16:05:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:05:08 crc kubenswrapper[4707]: I1127 16:05:08.619543 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:05:08 crc kubenswrapper[4707]: I1127 16:05:08.619611 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:05:08 crc kubenswrapper[4707]: I1127 16:05:08.619632 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:05:08 crc kubenswrapper[4707]: I1127 16:05:08.619658 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:05:08 crc kubenswrapper[4707]: I1127 16:05:08.619675 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:05:08Z","lastTransitionTime":"2025-11-27T16:05:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:05:08 crc kubenswrapper[4707]: I1127 16:05:08.723543 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:05:08 crc kubenswrapper[4707]: I1127 16:05:08.723608 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:05:08 crc kubenswrapper[4707]: I1127 16:05:08.723627 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:05:08 crc kubenswrapper[4707]: I1127 16:05:08.723654 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:05:08 crc kubenswrapper[4707]: I1127 16:05:08.723677 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:05:08Z","lastTransitionTime":"2025-11-27T16:05:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:05:08 crc kubenswrapper[4707]: I1127 16:05:08.827920 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:05:08 crc kubenswrapper[4707]: I1127 16:05:08.827980 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:05:08 crc kubenswrapper[4707]: I1127 16:05:08.827997 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:05:08 crc kubenswrapper[4707]: I1127 16:05:08.828024 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:05:08 crc kubenswrapper[4707]: I1127 16:05:08.828041 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:05:08Z","lastTransitionTime":"2025-11-27T16:05:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:05:08 crc kubenswrapper[4707]: I1127 16:05:08.930537 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:05:08 crc kubenswrapper[4707]: I1127 16:05:08.930594 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:05:08 crc kubenswrapper[4707]: I1127 16:05:08.930611 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:05:08 crc kubenswrapper[4707]: I1127 16:05:08.930637 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:05:08 crc kubenswrapper[4707]: I1127 16:05:08.930654 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:05:08Z","lastTransitionTime":"2025-11-27T16:05:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:05:08 crc kubenswrapper[4707]: I1127 16:05:08.970953 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 27 16:05:08 crc kubenswrapper[4707]: E1127 16:05:08.971293 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-11-27 16:06:12.971265637 +0000 UTC m=+148.602714445 (durationBeforeRetry 1m4s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 27 16:05:09 crc kubenswrapper[4707]: I1127 16:05:09.033589 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:05:09 crc kubenswrapper[4707]: I1127 16:05:09.033793 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:05:09 crc kubenswrapper[4707]: I1127 16:05:09.033932 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:05:09 crc kubenswrapper[4707]: I1127 16:05:09.033963 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:05:09 crc kubenswrapper[4707]: I1127 16:05:09.034017 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:05:09Z","lastTransitionTime":"2025-11-27T16:05:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:05:09 crc kubenswrapper[4707]: I1127 16:05:09.072224 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 27 16:05:09 crc kubenswrapper[4707]: I1127 16:05:09.072291 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 27 16:05:09 crc kubenswrapper[4707]: I1127 16:05:09.072348 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 27 16:05:09 crc kubenswrapper[4707]: I1127 16:05:09.072456 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 27 16:05:09 crc kubenswrapper[4707]: E1127 16:05:09.072611 4707 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object 
"openshift-network-console"/"networking-console-plugin-cert" not registered Nov 27 16:05:09 crc kubenswrapper[4707]: E1127 16:05:09.072686 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-11-27 16:06:13.072661603 +0000 UTC m=+148.704110401 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Nov 27 16:05:09 crc kubenswrapper[4707]: E1127 16:05:09.072682 4707 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Nov 27 16:05:09 crc kubenswrapper[4707]: E1127 16:05:09.072736 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-11-27 16:06:13.072724414 +0000 UTC m=+148.704173212 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Nov 27 16:05:09 crc kubenswrapper[4707]: E1127 16:05:09.072802 4707 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Nov 27 16:05:09 crc kubenswrapper[4707]: E1127 16:05:09.072833 4707 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Nov 27 16:05:09 crc kubenswrapper[4707]: E1127 16:05:09.072840 4707 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Nov 27 16:05:09 crc kubenswrapper[4707]: E1127 16:05:09.072904 4707 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Nov 27 16:05:09 crc kubenswrapper[4707]: E1127 16:05:09.072927 4707 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 27 16:05:09 crc kubenswrapper[4707]: E1127 16:05:09.072853 4707 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object 
"openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 27 16:05:09 crc kubenswrapper[4707]: E1127 16:05:09.073013 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-11-27 16:06:13.072981051 +0000 UTC m=+148.704429859 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 27 16:05:09 crc kubenswrapper[4707]: E1127 16:05:09.073089 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-11-27 16:06:13.073061103 +0000 UTC m=+148.704509901 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 27 16:05:09 crc kubenswrapper[4707]: I1127 16:05:09.136984 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:05:09 crc kubenswrapper[4707]: I1127 16:05:09.137059 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:05:09 crc kubenswrapper[4707]: I1127 16:05:09.137083 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:05:09 crc kubenswrapper[4707]: I1127 16:05:09.137114 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:05:09 crc kubenswrapper[4707]: I1127 16:05:09.137140 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:05:09Z","lastTransitionTime":"2025-11-27T16:05:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:05:09 crc kubenswrapper[4707]: I1127 16:05:09.194474 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 27 16:05:09 crc kubenswrapper[4707]: I1127 16:05:09.194572 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 27 16:05:09 crc kubenswrapper[4707]: E1127 16:05:09.194708 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 27 16:05:09 crc kubenswrapper[4707]: I1127 16:05:09.195069 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qcl5k" Nov 27 16:05:09 crc kubenswrapper[4707]: E1127 16:05:09.195184 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 27 16:05:09 crc kubenswrapper[4707]: I1127 16:05:09.195470 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 27 16:05:09 crc kubenswrapper[4707]: E1127 16:05:09.195612 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-qcl5k" podUID="7d382481-3c1e-49ed-8e27-265d495aa776" Nov 27 16:05:09 crc kubenswrapper[4707]: E1127 16:05:09.195990 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 27 16:05:09 crc kubenswrapper[4707]: I1127 16:05:09.240278 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:05:09 crc kubenswrapper[4707]: I1127 16:05:09.240344 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:05:09 crc kubenswrapper[4707]: I1127 16:05:09.240363 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:05:09 crc kubenswrapper[4707]: I1127 16:05:09.240735 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:05:09 crc kubenswrapper[4707]: I1127 16:05:09.241017 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:05:09Z","lastTransitionTime":"2025-11-27T16:05:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:05:09 crc kubenswrapper[4707]: I1127 16:05:09.344838 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:05:09 crc kubenswrapper[4707]: I1127 16:05:09.344896 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:05:09 crc kubenswrapper[4707]: I1127 16:05:09.344911 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:05:09 crc kubenswrapper[4707]: I1127 16:05:09.344937 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:05:09 crc kubenswrapper[4707]: I1127 16:05:09.344958 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:05:09Z","lastTransitionTime":"2025-11-27T16:05:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:05:09 crc kubenswrapper[4707]: I1127 16:05:09.448201 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:05:09 crc kubenswrapper[4707]: I1127 16:05:09.448283 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:05:09 crc kubenswrapper[4707]: I1127 16:05:09.448308 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:05:09 crc kubenswrapper[4707]: I1127 16:05:09.448343 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:05:09 crc kubenswrapper[4707]: I1127 16:05:09.448399 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:05:09Z","lastTransitionTime":"2025-11-27T16:05:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:05:09 crc kubenswrapper[4707]: I1127 16:05:09.552289 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:05:09 crc kubenswrapper[4707]: I1127 16:05:09.552630 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:05:09 crc kubenswrapper[4707]: I1127 16:05:09.552683 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:05:09 crc kubenswrapper[4707]: I1127 16:05:09.552708 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:05:09 crc kubenswrapper[4707]: I1127 16:05:09.553238 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:05:09Z","lastTransitionTime":"2025-11-27T16:05:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:05:09 crc kubenswrapper[4707]: I1127 16:05:09.657045 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:05:09 crc kubenswrapper[4707]: I1127 16:05:09.657104 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:05:09 crc kubenswrapper[4707]: I1127 16:05:09.657120 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:05:09 crc kubenswrapper[4707]: I1127 16:05:09.657143 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:05:09 crc kubenswrapper[4707]: I1127 16:05:09.657160 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:05:09Z","lastTransitionTime":"2025-11-27T16:05:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:05:09 crc kubenswrapper[4707]: I1127 16:05:09.760996 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:05:09 crc kubenswrapper[4707]: I1127 16:05:09.761094 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:05:09 crc kubenswrapper[4707]: I1127 16:05:09.761110 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:05:09 crc kubenswrapper[4707]: I1127 16:05:09.761134 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:05:09 crc kubenswrapper[4707]: I1127 16:05:09.761148 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:05:09Z","lastTransitionTime":"2025-11-27T16:05:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:05:09 crc kubenswrapper[4707]: I1127 16:05:09.864463 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:05:09 crc kubenswrapper[4707]: I1127 16:05:09.864535 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:05:09 crc kubenswrapper[4707]: I1127 16:05:09.864559 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:05:09 crc kubenswrapper[4707]: I1127 16:05:09.864594 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:05:09 crc kubenswrapper[4707]: I1127 16:05:09.864614 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:05:09Z","lastTransitionTime":"2025-11-27T16:05:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:05:09 crc kubenswrapper[4707]: I1127 16:05:09.968160 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:05:09 crc kubenswrapper[4707]: I1127 16:05:09.968231 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:05:09 crc kubenswrapper[4707]: I1127 16:05:09.968247 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:05:09 crc kubenswrapper[4707]: I1127 16:05:09.968278 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:05:09 crc kubenswrapper[4707]: I1127 16:05:09.968296 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:05:09Z","lastTransitionTime":"2025-11-27T16:05:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:05:10 crc kubenswrapper[4707]: I1127 16:05:10.071868 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:05:10 crc kubenswrapper[4707]: I1127 16:05:10.071945 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:05:10 crc kubenswrapper[4707]: I1127 16:05:10.071963 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:05:10 crc kubenswrapper[4707]: I1127 16:05:10.072005 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:05:10 crc kubenswrapper[4707]: I1127 16:05:10.072034 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:05:10Z","lastTransitionTime":"2025-11-27T16:05:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:05:10 crc kubenswrapper[4707]: I1127 16:05:10.175913 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:05:10 crc kubenswrapper[4707]: I1127 16:05:10.175973 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:05:10 crc kubenswrapper[4707]: I1127 16:05:10.175996 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:05:10 crc kubenswrapper[4707]: I1127 16:05:10.176027 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:05:10 crc kubenswrapper[4707]: I1127 16:05:10.176049 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:05:10Z","lastTransitionTime":"2025-11-27T16:05:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:05:10 crc kubenswrapper[4707]: I1127 16:05:10.279859 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:05:10 crc kubenswrapper[4707]: I1127 16:05:10.279926 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:05:10 crc kubenswrapper[4707]: I1127 16:05:10.279948 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:05:10 crc kubenswrapper[4707]: I1127 16:05:10.279969 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:05:10 crc kubenswrapper[4707]: I1127 16:05:10.279986 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:05:10Z","lastTransitionTime":"2025-11-27T16:05:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:05:10 crc kubenswrapper[4707]: I1127 16:05:10.383420 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:05:10 crc kubenswrapper[4707]: I1127 16:05:10.383527 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:05:10 crc kubenswrapper[4707]: I1127 16:05:10.383595 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:05:10 crc kubenswrapper[4707]: I1127 16:05:10.383626 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:05:10 crc kubenswrapper[4707]: I1127 16:05:10.383649 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:05:10Z","lastTransitionTime":"2025-11-27T16:05:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:05:10 crc kubenswrapper[4707]: I1127 16:05:10.486245 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:05:10 crc kubenswrapper[4707]: I1127 16:05:10.486328 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:05:10 crc kubenswrapper[4707]: I1127 16:05:10.486354 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:05:10 crc kubenswrapper[4707]: I1127 16:05:10.486444 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:05:10 crc kubenswrapper[4707]: I1127 16:05:10.486473 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:05:10Z","lastTransitionTime":"2025-11-27T16:05:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:05:10 crc kubenswrapper[4707]: I1127 16:05:10.589444 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:05:10 crc kubenswrapper[4707]: I1127 16:05:10.589503 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:05:10 crc kubenswrapper[4707]: I1127 16:05:10.589524 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:05:10 crc kubenswrapper[4707]: I1127 16:05:10.589549 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:05:10 crc kubenswrapper[4707]: I1127 16:05:10.589568 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:05:10Z","lastTransitionTime":"2025-11-27T16:05:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:05:10 crc kubenswrapper[4707]: I1127 16:05:10.692704 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:05:10 crc kubenswrapper[4707]: I1127 16:05:10.692772 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:05:10 crc kubenswrapper[4707]: I1127 16:05:10.692792 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:05:10 crc kubenswrapper[4707]: I1127 16:05:10.692816 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:05:10 crc kubenswrapper[4707]: I1127 16:05:10.692833 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:05:10Z","lastTransitionTime":"2025-11-27T16:05:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:05:10 crc kubenswrapper[4707]: I1127 16:05:10.797438 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:05:10 crc kubenswrapper[4707]: I1127 16:05:10.797511 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:05:10 crc kubenswrapper[4707]: I1127 16:05:10.797528 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:05:10 crc kubenswrapper[4707]: I1127 16:05:10.797555 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:05:10 crc kubenswrapper[4707]: I1127 16:05:10.797572 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:05:10Z","lastTransitionTime":"2025-11-27T16:05:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:05:10 crc kubenswrapper[4707]: I1127 16:05:10.900517 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:05:10 crc kubenswrapper[4707]: I1127 16:05:10.900573 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:05:10 crc kubenswrapper[4707]: I1127 16:05:10.900591 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:05:10 crc kubenswrapper[4707]: I1127 16:05:10.900618 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:05:10 crc kubenswrapper[4707]: I1127 16:05:10.900638 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:05:10Z","lastTransitionTime":"2025-11-27T16:05:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:05:11 crc kubenswrapper[4707]: I1127 16:05:11.003334 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:05:11 crc kubenswrapper[4707]: I1127 16:05:11.003796 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:05:11 crc kubenswrapper[4707]: I1127 16:05:11.003831 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:05:11 crc kubenswrapper[4707]: I1127 16:05:11.003857 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:05:11 crc kubenswrapper[4707]: I1127 16:05:11.003888 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:05:11Z","lastTransitionTime":"2025-11-27T16:05:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:05:11 crc kubenswrapper[4707]: I1127 16:05:11.108047 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:05:11 crc kubenswrapper[4707]: I1127 16:05:11.108091 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:05:11 crc kubenswrapper[4707]: I1127 16:05:11.108101 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:05:11 crc kubenswrapper[4707]: I1127 16:05:11.108120 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:05:11 crc kubenswrapper[4707]: I1127 16:05:11.108131 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:05:11Z","lastTransitionTime":"2025-11-27T16:05:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:05:11 crc kubenswrapper[4707]: I1127 16:05:11.194709 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 27 16:05:11 crc kubenswrapper[4707]: I1127 16:05:11.194782 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 27 16:05:11 crc kubenswrapper[4707]: I1127 16:05:11.194782 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 27 16:05:11 crc kubenswrapper[4707]: I1127 16:05:11.194968 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-qcl5k" Nov 27 16:05:11 crc kubenswrapper[4707]: E1127 16:05:11.194965 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 27 16:05:11 crc kubenswrapper[4707]: E1127 16:05:11.195229 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qcl5k" podUID="7d382481-3c1e-49ed-8e27-265d495aa776" Nov 27 16:05:11 crc kubenswrapper[4707]: E1127 16:05:11.195340 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 27 16:05:11 crc kubenswrapper[4707]: E1127 16:05:11.195459 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 27 16:05:11 crc kubenswrapper[4707]: I1127 16:05:11.210910 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:05:11 crc kubenswrapper[4707]: I1127 16:05:11.210961 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:05:11 crc kubenswrapper[4707]: I1127 16:05:11.210977 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:05:11 crc kubenswrapper[4707]: I1127 16:05:11.211000 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:05:11 crc kubenswrapper[4707]: I1127 16:05:11.211015 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:05:11Z","lastTransitionTime":"2025-11-27T16:05:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:05:11 crc kubenswrapper[4707]: I1127 16:05:11.316258 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:05:11 crc kubenswrapper[4707]: I1127 16:05:11.316302 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:05:11 crc kubenswrapper[4707]: I1127 16:05:11.316314 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:05:11 crc kubenswrapper[4707]: I1127 16:05:11.316331 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:05:11 crc kubenswrapper[4707]: I1127 16:05:11.316342 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:05:11Z","lastTransitionTime":"2025-11-27T16:05:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:05:11 crc kubenswrapper[4707]: I1127 16:05:11.420192 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:05:11 crc kubenswrapper[4707]: I1127 16:05:11.420255 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:05:11 crc kubenswrapper[4707]: I1127 16:05:11.420274 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:05:11 crc kubenswrapper[4707]: I1127 16:05:11.420302 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:05:11 crc kubenswrapper[4707]: I1127 16:05:11.420319 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:05:11Z","lastTransitionTime":"2025-11-27T16:05:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:05:11 crc kubenswrapper[4707]: I1127 16:05:11.524453 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:05:11 crc kubenswrapper[4707]: I1127 16:05:11.524490 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:05:11 crc kubenswrapper[4707]: I1127 16:05:11.524499 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:05:11 crc kubenswrapper[4707]: I1127 16:05:11.524517 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:05:11 crc kubenswrapper[4707]: I1127 16:05:11.524527 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:05:11Z","lastTransitionTime":"2025-11-27T16:05:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:05:11 crc kubenswrapper[4707]: I1127 16:05:11.628020 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:05:11 crc kubenswrapper[4707]: I1127 16:05:11.628081 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:05:11 crc kubenswrapper[4707]: I1127 16:05:11.628100 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:05:11 crc kubenswrapper[4707]: I1127 16:05:11.628125 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:05:11 crc kubenswrapper[4707]: I1127 16:05:11.628146 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:05:11Z","lastTransitionTime":"2025-11-27T16:05:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:05:11 crc kubenswrapper[4707]: I1127 16:05:11.731664 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:05:11 crc kubenswrapper[4707]: I1127 16:05:11.731715 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:05:11 crc kubenswrapper[4707]: I1127 16:05:11.731735 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:05:11 crc kubenswrapper[4707]: I1127 16:05:11.731762 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:05:11 crc kubenswrapper[4707]: I1127 16:05:11.731781 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:05:11Z","lastTransitionTime":"2025-11-27T16:05:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:05:11 crc kubenswrapper[4707]: I1127 16:05:11.835083 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:05:11 crc kubenswrapper[4707]: I1127 16:05:11.835147 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:05:11 crc kubenswrapper[4707]: I1127 16:05:11.835165 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:05:11 crc kubenswrapper[4707]: I1127 16:05:11.835189 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:05:11 crc kubenswrapper[4707]: I1127 16:05:11.835209 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:05:11Z","lastTransitionTime":"2025-11-27T16:05:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:05:11 crc kubenswrapper[4707]: I1127 16:05:11.938799 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:05:11 crc kubenswrapper[4707]: I1127 16:05:11.938848 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:05:11 crc kubenswrapper[4707]: I1127 16:05:11.938866 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:05:11 crc kubenswrapper[4707]: I1127 16:05:11.938892 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:05:11 crc kubenswrapper[4707]: I1127 16:05:11.938910 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:05:11Z","lastTransitionTime":"2025-11-27T16:05:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:05:12 crc kubenswrapper[4707]: I1127 16:05:12.042966 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:05:12 crc kubenswrapper[4707]: I1127 16:05:12.043036 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:05:12 crc kubenswrapper[4707]: I1127 16:05:12.043065 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:05:12 crc kubenswrapper[4707]: I1127 16:05:12.043094 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:05:12 crc kubenswrapper[4707]: I1127 16:05:12.043116 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:05:12Z","lastTransitionTime":"2025-11-27T16:05:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:05:12 crc kubenswrapper[4707]: I1127 16:05:12.147209 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:05:12 crc kubenswrapper[4707]: I1127 16:05:12.148123 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:05:12 crc kubenswrapper[4707]: I1127 16:05:12.148274 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:05:12 crc kubenswrapper[4707]: I1127 16:05:12.148463 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:05:12 crc kubenswrapper[4707]: I1127 16:05:12.148616 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:05:12Z","lastTransitionTime":"2025-11-27T16:05:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:05:12 crc kubenswrapper[4707]: I1127 16:05:12.251549 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:05:12 crc kubenswrapper[4707]: I1127 16:05:12.251608 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:05:12 crc kubenswrapper[4707]: I1127 16:05:12.251626 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:05:12 crc kubenswrapper[4707]: I1127 16:05:12.251651 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:05:12 crc kubenswrapper[4707]: I1127 16:05:12.251704 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:05:12Z","lastTransitionTime":"2025-11-27T16:05:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:05:12 crc kubenswrapper[4707]: I1127 16:05:12.355418 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:05:12 crc kubenswrapper[4707]: I1127 16:05:12.356601 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:05:12 crc kubenswrapper[4707]: I1127 16:05:12.356642 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:05:12 crc kubenswrapper[4707]: I1127 16:05:12.356672 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:05:12 crc kubenswrapper[4707]: I1127 16:05:12.356690 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:05:12Z","lastTransitionTime":"2025-11-27T16:05:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:05:12 crc kubenswrapper[4707]: I1127 16:05:12.460170 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:05:12 crc kubenswrapper[4707]: I1127 16:05:12.460264 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:05:12 crc kubenswrapper[4707]: I1127 16:05:12.460293 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:05:12 crc kubenswrapper[4707]: I1127 16:05:12.460328 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:05:12 crc kubenswrapper[4707]: I1127 16:05:12.460354 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:05:12Z","lastTransitionTime":"2025-11-27T16:05:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:05:12 crc kubenswrapper[4707]: I1127 16:05:12.563943 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:05:12 crc kubenswrapper[4707]: I1127 16:05:12.564247 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:05:12 crc kubenswrapper[4707]: I1127 16:05:12.564420 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:05:12 crc kubenswrapper[4707]: I1127 16:05:12.564567 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:05:12 crc kubenswrapper[4707]: I1127 16:05:12.564707 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:05:12Z","lastTransitionTime":"2025-11-27T16:05:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:05:12 crc kubenswrapper[4707]: I1127 16:05:12.668151 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:05:12 crc kubenswrapper[4707]: I1127 16:05:12.668216 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:05:12 crc kubenswrapper[4707]: I1127 16:05:12.668234 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:05:12 crc kubenswrapper[4707]: I1127 16:05:12.668260 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:05:12 crc kubenswrapper[4707]: I1127 16:05:12.668278 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:05:12Z","lastTransitionTime":"2025-11-27T16:05:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:05:12 crc kubenswrapper[4707]: I1127 16:05:12.771787 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:05:12 crc kubenswrapper[4707]: I1127 16:05:12.771849 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:05:12 crc kubenswrapper[4707]: I1127 16:05:12.771866 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:05:12 crc kubenswrapper[4707]: I1127 16:05:12.771892 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:05:12 crc kubenswrapper[4707]: I1127 16:05:12.771910 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:05:12Z","lastTransitionTime":"2025-11-27T16:05:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:05:12 crc kubenswrapper[4707]: I1127 16:05:12.875188 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:05:12 crc kubenswrapper[4707]: I1127 16:05:12.875279 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:05:12 crc kubenswrapper[4707]: I1127 16:05:12.875326 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:05:12 crc kubenswrapper[4707]: I1127 16:05:12.875353 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:05:12 crc kubenswrapper[4707]: I1127 16:05:12.875428 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:05:12Z","lastTransitionTime":"2025-11-27T16:05:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:05:12 crc kubenswrapper[4707]: I1127 16:05:12.978768 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:05:12 crc kubenswrapper[4707]: I1127 16:05:12.978804 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:05:12 crc kubenswrapper[4707]: I1127 16:05:12.978820 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:05:12 crc kubenswrapper[4707]: I1127 16:05:12.978841 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:05:12 crc kubenswrapper[4707]: I1127 16:05:12.978857 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:05:12Z","lastTransitionTime":"2025-11-27T16:05:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:05:13 crc kubenswrapper[4707]: I1127 16:05:13.081580 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:05:13 crc kubenswrapper[4707]: I1127 16:05:13.081961 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:05:13 crc kubenswrapper[4707]: I1127 16:05:13.082115 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:05:13 crc kubenswrapper[4707]: I1127 16:05:13.082265 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:05:13 crc kubenswrapper[4707]: I1127 16:05:13.082434 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:05:13Z","lastTransitionTime":"2025-11-27T16:05:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:05:13 crc kubenswrapper[4707]: I1127 16:05:13.186059 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:05:13 crc kubenswrapper[4707]: I1127 16:05:13.186138 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:05:13 crc kubenswrapper[4707]: I1127 16:05:13.186157 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:05:13 crc kubenswrapper[4707]: I1127 16:05:13.186184 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:05:13 crc kubenswrapper[4707]: I1127 16:05:13.186203 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:05:13Z","lastTransitionTime":"2025-11-27T16:05:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:05:13 crc kubenswrapper[4707]: I1127 16:05:13.194496 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qcl5k" Nov 27 16:05:13 crc kubenswrapper[4707]: I1127 16:05:13.194558 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 27 16:05:13 crc kubenswrapper[4707]: E1127 16:05:13.194727 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-qcl5k" podUID="7d382481-3c1e-49ed-8e27-265d495aa776" Nov 27 16:05:13 crc kubenswrapper[4707]: I1127 16:05:13.194849 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 27 16:05:13 crc kubenswrapper[4707]: I1127 16:05:13.194847 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 27 16:05:13 crc kubenswrapper[4707]: E1127 16:05:13.195023 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 27 16:05:13 crc kubenswrapper[4707]: E1127 16:05:13.195183 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 27 16:05:13 crc kubenswrapper[4707]: E1127 16:05:13.195356 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 27 16:05:13 crc kubenswrapper[4707]: I1127 16:05:13.289173 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:05:13 crc kubenswrapper[4707]: I1127 16:05:13.289235 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:05:13 crc kubenswrapper[4707]: I1127 16:05:13.289253 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:05:13 crc kubenswrapper[4707]: I1127 16:05:13.289275 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:05:13 crc kubenswrapper[4707]: I1127 16:05:13.289296 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:05:13Z","lastTransitionTime":"2025-11-27T16:05:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:05:13 crc kubenswrapper[4707]: I1127 16:05:13.393249 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:05:13 crc kubenswrapper[4707]: I1127 16:05:13.393297 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:05:13 crc kubenswrapper[4707]: I1127 16:05:13.393313 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:05:13 crc kubenswrapper[4707]: I1127 16:05:13.393337 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:05:13 crc kubenswrapper[4707]: I1127 16:05:13.393354 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:05:13Z","lastTransitionTime":"2025-11-27T16:05:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:05:13 crc kubenswrapper[4707]: I1127 16:05:13.497352 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:05:13 crc kubenswrapper[4707]: I1127 16:05:13.497465 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:05:13 crc kubenswrapper[4707]: I1127 16:05:13.497482 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:05:13 crc kubenswrapper[4707]: I1127 16:05:13.497508 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:05:13 crc kubenswrapper[4707]: I1127 16:05:13.497526 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:05:13Z","lastTransitionTime":"2025-11-27T16:05:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:05:13 crc kubenswrapper[4707]: I1127 16:05:13.601199 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:05:13 crc kubenswrapper[4707]: I1127 16:05:13.601272 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:05:13 crc kubenswrapper[4707]: I1127 16:05:13.601294 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:05:13 crc kubenswrapper[4707]: I1127 16:05:13.601325 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:05:13 crc kubenswrapper[4707]: I1127 16:05:13.601348 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:05:13Z","lastTransitionTime":"2025-11-27T16:05:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:05:13 crc kubenswrapper[4707]: I1127 16:05:13.705232 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:05:13 crc kubenswrapper[4707]: I1127 16:05:13.705318 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:05:13 crc kubenswrapper[4707]: I1127 16:05:13.705331 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:05:13 crc kubenswrapper[4707]: I1127 16:05:13.705351 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:05:13 crc kubenswrapper[4707]: I1127 16:05:13.705412 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:05:13Z","lastTransitionTime":"2025-11-27T16:05:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:05:13 crc kubenswrapper[4707]: I1127 16:05:13.800676 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:05:13 crc kubenswrapper[4707]: I1127 16:05:13.800744 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:05:13 crc kubenswrapper[4707]: I1127 16:05:13.800761 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:05:13 crc kubenswrapper[4707]: I1127 16:05:13.800787 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:05:13 crc kubenswrapper[4707]: I1127 16:05:13.800803 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:05:13Z","lastTransitionTime":"2025-11-27T16:05:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:05:13 crc kubenswrapper[4707]: E1127 16:05:13.825006 4707 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-27T16:05:13Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-27T16:05:13Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-27T16:05:13Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-27T16:05:13Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-27T16:05:13Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-27T16:05:13Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-27T16:05:13Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-27T16:05:13Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"3f35bf4c-cf6f-4914-b0bb-f9cc6056fcaf\\\",\\\"systemUUID\\\":\\\"dd7e61b5-f6f4-4240-af78-d8fe5d6daad3\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:05:13Z is after 2025-08-24T17:21:41Z" Nov 27 16:05:13 crc kubenswrapper[4707]: I1127 16:05:13.832012 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:05:13 crc kubenswrapper[4707]: I1127 16:05:13.832078 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:05:13 crc kubenswrapper[4707]: I1127 16:05:13.832097 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:05:13 crc kubenswrapper[4707]: I1127 16:05:13.832124 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:05:13 crc kubenswrapper[4707]: I1127 16:05:13.832141 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:05:13Z","lastTransitionTime":"2025-11-27T16:05:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:05:13 crc kubenswrapper[4707]: I1127 16:05:13.859054 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:05:13 crc kubenswrapper[4707]: I1127 16:05:13.859345 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:05:13 crc kubenswrapper[4707]: I1127 16:05:13.859565 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:05:13 crc kubenswrapper[4707]: I1127 16:05:13.859774 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:05:13 crc kubenswrapper[4707]: I1127 16:05:13.860040 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:05:13Z","lastTransitionTime":"2025-11-27T16:05:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:05:13Z is after 2025-08-24T17:21:41Z" Nov 27 16:05:13 crc kubenswrapper[4707]: I1127 16:05:13.889483 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:05:13 crc kubenswrapper[4707]: I1127 16:05:13.889678 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:05:13 crc kubenswrapper[4707]: I1127 16:05:13.889809 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:05:13 crc kubenswrapper[4707]: I1127 16:05:13.889928 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:05:13 crc kubenswrapper[4707]: I1127 16:05:13.890040 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:05:13Z","lastTransitionTime":"2025-11-27T16:05:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:05:13 crc kubenswrapper[4707]: E1127 16:05:13.910090 4707 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-27T16:05:13Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-27T16:05:13Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-27T16:05:13Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-27T16:05:13Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-27T16:05:13Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-27T16:05:13Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-27T16:05:13Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-27T16:05:13Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"3f35bf4c-cf6f-4914-b0bb-f9cc6056fcaf\\\",\\\"systemUUID\\\":\\\"dd7e61b5-f6f4-4240-af78-d8fe5d6daad3\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:05:13Z is after 2025-08-24T17:21:41Z" Nov 27 16:05:13 crc kubenswrapper[4707]: I1127 16:05:13.915045 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:05:13 crc kubenswrapper[4707]: I1127 16:05:13.915259 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:05:13 crc kubenswrapper[4707]: I1127 16:05:13.915468 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:05:13 crc kubenswrapper[4707]: I1127 16:05:13.915666 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:05:13 crc kubenswrapper[4707]: I1127 16:05:13.915848 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:05:13Z","lastTransitionTime":"2025-11-27T16:05:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:05:13 crc kubenswrapper[4707]: E1127 16:05:13.938229 4707 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-27T16:05:13Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-27T16:05:13Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-27T16:05:13Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-27T16:05:13Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-27T16:05:13Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-27T16:05:13Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-27T16:05:13Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-27T16:05:13Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"3f35bf4c-cf6f-4914-b0bb-f9cc6056fcaf\\\",\\\"systemUUID\\\":\\\"dd7e61b5-f6f4-4240-af78-d8fe5d6daad3\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:05:13Z is after 2025-08-24T17:21:41Z" Nov 27 16:05:13 crc kubenswrapper[4707]: E1127 16:05:13.938479 4707 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Nov 27 16:05:13 crc kubenswrapper[4707]: I1127 16:05:13.940527 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:05:13 crc kubenswrapper[4707]: I1127 16:05:13.940779 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:05:13 crc kubenswrapper[4707]: I1127 16:05:13.940972 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:05:13 crc kubenswrapper[4707]: I1127 16:05:13.941155 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:05:13 crc kubenswrapper[4707]: I1127 16:05:13.941331 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:05:13Z","lastTransitionTime":"2025-11-27T16:05:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:05:14 crc kubenswrapper[4707]: I1127 16:05:14.045671 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:05:14 crc kubenswrapper[4707]: I1127 16:05:14.045742 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:05:14 crc kubenswrapper[4707]: I1127 16:05:14.045760 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:05:14 crc kubenswrapper[4707]: I1127 16:05:14.045788 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:05:14 crc kubenswrapper[4707]: I1127 16:05:14.045809 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:05:14Z","lastTransitionTime":"2025-11-27T16:05:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:05:14 crc kubenswrapper[4707]: I1127 16:05:14.149292 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:05:14 crc kubenswrapper[4707]: I1127 16:05:14.149392 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:05:14 crc kubenswrapper[4707]: I1127 16:05:14.149416 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:05:14 crc kubenswrapper[4707]: I1127 16:05:14.149444 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:05:14 crc kubenswrapper[4707]: I1127 16:05:14.149467 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:05:14Z","lastTransitionTime":"2025-11-27T16:05:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:05:14 crc kubenswrapper[4707]: I1127 16:05:14.252039 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:05:14 crc kubenswrapper[4707]: I1127 16:05:14.252096 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:05:14 crc kubenswrapper[4707]: I1127 16:05:14.252121 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:05:14 crc kubenswrapper[4707]: I1127 16:05:14.252148 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:05:14 crc kubenswrapper[4707]: I1127 16:05:14.252172 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:05:14Z","lastTransitionTime":"2025-11-27T16:05:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:05:14 crc kubenswrapper[4707]: I1127 16:05:14.355730 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:05:14 crc kubenswrapper[4707]: I1127 16:05:14.355788 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:05:14 crc kubenswrapper[4707]: I1127 16:05:14.355814 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:05:14 crc kubenswrapper[4707]: I1127 16:05:14.355848 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:05:14 crc kubenswrapper[4707]: I1127 16:05:14.355872 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:05:14Z","lastTransitionTime":"2025-11-27T16:05:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:05:14 crc kubenswrapper[4707]: I1127 16:05:14.459294 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:05:14 crc kubenswrapper[4707]: I1127 16:05:14.460253 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:05:14 crc kubenswrapper[4707]: I1127 16:05:14.460424 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:05:14 crc kubenswrapper[4707]: I1127 16:05:14.460567 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:05:14 crc kubenswrapper[4707]: I1127 16:05:14.460718 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:05:14Z","lastTransitionTime":"2025-11-27T16:05:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:05:14 crc kubenswrapper[4707]: I1127 16:05:14.563746 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:05:14 crc kubenswrapper[4707]: I1127 16:05:14.563808 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:05:14 crc kubenswrapper[4707]: I1127 16:05:14.563825 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:05:14 crc kubenswrapper[4707]: I1127 16:05:14.563858 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:05:14 crc kubenswrapper[4707]: I1127 16:05:14.563877 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:05:14Z","lastTransitionTime":"2025-11-27T16:05:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:05:14 crc kubenswrapper[4707]: I1127 16:05:14.667754 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:05:14 crc kubenswrapper[4707]: I1127 16:05:14.667820 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:05:14 crc kubenswrapper[4707]: I1127 16:05:14.667843 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:05:14 crc kubenswrapper[4707]: I1127 16:05:14.667876 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:05:14 crc kubenswrapper[4707]: I1127 16:05:14.667898 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:05:14Z","lastTransitionTime":"2025-11-27T16:05:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:05:14 crc kubenswrapper[4707]: I1127 16:05:14.770464 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:05:14 crc kubenswrapper[4707]: I1127 16:05:14.770543 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:05:14 crc kubenswrapper[4707]: I1127 16:05:14.770556 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:05:14 crc kubenswrapper[4707]: I1127 16:05:14.770580 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:05:14 crc kubenswrapper[4707]: I1127 16:05:14.770594 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:05:14Z","lastTransitionTime":"2025-11-27T16:05:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:05:14 crc kubenswrapper[4707]: I1127 16:05:14.874647 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:05:14 crc kubenswrapper[4707]: I1127 16:05:14.875013 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:05:14 crc kubenswrapper[4707]: I1127 16:05:14.875042 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:05:14 crc kubenswrapper[4707]: I1127 16:05:14.875084 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:05:14 crc kubenswrapper[4707]: I1127 16:05:14.875107 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:05:14Z","lastTransitionTime":"2025-11-27T16:05:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:05:14 crc kubenswrapper[4707]: I1127 16:05:14.977922 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:05:14 crc kubenswrapper[4707]: I1127 16:05:14.978023 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:05:14 crc kubenswrapper[4707]: I1127 16:05:14.978044 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:05:14 crc kubenswrapper[4707]: I1127 16:05:14.978072 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:05:14 crc kubenswrapper[4707]: I1127 16:05:14.978093 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:05:14Z","lastTransitionTime":"2025-11-27T16:05:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:05:15 crc kubenswrapper[4707]: I1127 16:05:15.081032 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:05:15 crc kubenswrapper[4707]: I1127 16:05:15.081106 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:05:15 crc kubenswrapper[4707]: I1127 16:05:15.081128 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:05:15 crc kubenswrapper[4707]: I1127 16:05:15.081201 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:05:15 crc kubenswrapper[4707]: I1127 16:05:15.081232 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:05:15Z","lastTransitionTime":"2025-11-27T16:05:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:05:15 crc kubenswrapper[4707]: I1127 16:05:15.185445 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:05:15 crc kubenswrapper[4707]: I1127 16:05:15.185515 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:05:15 crc kubenswrapper[4707]: I1127 16:05:15.185533 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:05:15 crc kubenswrapper[4707]: I1127 16:05:15.185562 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:05:15 crc kubenswrapper[4707]: I1127 16:05:15.185579 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:05:15Z","lastTransitionTime":"2025-11-27T16:05:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:05:15 crc kubenswrapper[4707]: I1127 16:05:15.194933 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qcl5k" Nov 27 16:05:15 crc kubenswrapper[4707]: I1127 16:05:15.194952 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 27 16:05:15 crc kubenswrapper[4707]: I1127 16:05:15.195080 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 27 16:05:15 crc kubenswrapper[4707]: I1127 16:05:15.195536 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 27 16:05:15 crc kubenswrapper[4707]: E1127 16:05:15.195532 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 27 16:05:15 crc kubenswrapper[4707]: E1127 16:05:15.195316 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qcl5k" podUID="7d382481-3c1e-49ed-8e27-265d495aa776" Nov 27 16:05:15 crc kubenswrapper[4707]: E1127 16:05:15.195630 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 27 16:05:15 crc kubenswrapper[4707]: E1127 16:05:15.195730 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 27 16:05:15 crc kubenswrapper[4707]: I1127 16:05:15.216540 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a94c3c43-c57d-4580-82d4-22eb0ac9387c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04f66a2313def4600426a075de4f77c75a876b3246344cec810b4cf25ea3505e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\"
:\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2df8580fadfd8a1a25979050a7dc698d4fc8177b121311817f40c26e73bf9371\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://25dae96b066cc3da8d6fefd13955cb9d1776c3f9fa93afa6be256bb904f523c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fda5b69770eee7245fc07549784cad1211d45ceb0d65276764e36021cde11777\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2
597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fda5b69770eee7245fc07549784cad1211d45ceb0d65276764e36021cde11777\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:03:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:03:46Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:03:45Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:05:15Z is after 2025-08-24T17:21:41Z" Nov 27 16:05:15 crc kubenswrapper[4707]: I1127 16:05:15.235076 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"991f0a41-0245-4bfa-afa1-84f5bde15111\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5738c0efc1bf2bd3189a81861d6548f0599bbbef152df871c689378185304b7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://803038378abbaeb0c775de20a85ccb96a9e420f5e0434a41ad1d8cda81bbd217\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://803038378abbaeb0c775de20a85ccb96a9e420f5e0434a41ad1d8cda81bbd217\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:03:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:03:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:05:15Z is after 2025-08-24T17:21:41Z" Nov 27 16:05:15 crc kubenswrapper[4707]: I1127 16:05:15.258762 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:05Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:05:15Z is after 2025-08-24T17:21:41Z" Nov 27 16:05:15 crc kubenswrapper[4707]: I1127 16:05:15.284639 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://57225b64e7671f397832e9ba112febbde78ae4c24b37a8f097d4bdeda8931c0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6960cfa0c6d0dedc432f90bf267ce02fe8264924a43865f79eadd9affbe955f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:05:15Z is after 2025-08-24T17:21:41Z" Nov 27 16:05:15 crc kubenswrapper[4707]: I1127 16:05:15.289826 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:05:15 crc kubenswrapper[4707]: I1127 16:05:15.289883 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:05:15 crc kubenswrapper[4707]: I1127 16:05:15.289903 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:05:15 crc kubenswrapper[4707]: I1127 16:05:15.289931 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:05:15 crc kubenswrapper[4707]: I1127 16:05:15.289949 4707 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:05:15Z","lastTransitionTime":"2025-11-27T16:05:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:05:15 crc kubenswrapper[4707]: I1127 16:05:15.310327 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6cf743e8776aa2de4138704d9e163af8baf7b8e5db139c13775673cf32a1ee88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\
"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:05:15Z is after 2025-08-24T17:21:41Z" Nov 27 16:05:15 crc kubenswrapper[4707]: I1127 16:05:15.339149 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-9c4xg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37bcfa90-e50b-4a63-96b2-cd8a0ce5bbf7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://66dcf851b2fb739c3dae214852a24405d2c8b4462dde998903b9dfe7c9e61e00\\\",\\\"image\\\":\\\"quay.io/openshift-rele
ase-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hwzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8cd7c5e0c832499cb9ddc7b18c7d260de06df34208f86284ebe77964bfcf1c12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8cd7c5e0c832499cb9ddc7b18c7d260de06df34208f86284ebe77964bfcf1c12\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:04:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\
\":\\\"kube-api-access-7hwzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4438fd6ab08cd53dff20617ccadaa81e4bddbd7ced310f190d02613079f96f64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4438fd6ab08cd53dff20617ccadaa81e4bddbd7ced310f190d02613079f96f64\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:04:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:04:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hwzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19a31a44d1ecfee004f2a705df22a188782cd9f235848ddf5f48bd740833c6a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\
\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://19a31a44d1ecfee004f2a705df22a188782cd9f235848ddf5f48bd740833c6a8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:04:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:04:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hwzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://db8a59276d18141805bc74721d625e2004f2cea1f47efc90b3290c68eba2a35b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://db8a59276d18141805bc74721d625e2004f2cea1f47efc90b3290c68eba2a35b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:04:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:04:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run
/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hwzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://21efa1504e98747329f8a9220126ea259edde61bd4d272ea55691e9952950252\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://21efa1504e98747329f8a9220126ea259edde61bd4d272ea55691e9952950252\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:04:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:04:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hwzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://851c05dbd0947990d464d1bd72d62577bdf12517aea808904aa01fc79fd2d7c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{
\\\"containerID\\\":\\\"cri-o://851c05dbd0947990d464d1bd72d62577bdf12517aea808904aa01fc79fd2d7c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:04:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:04:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hwzm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:04:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-9c4xg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:05:15Z is after 2025-08-24T17:21:41Z" Nov 27 16:05:15 crc kubenswrapper[4707]: I1127 16:05:15.373040 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xkmt7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"55af9c67-18ce-46f1-a761-d11ce16f42d6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a4eadca4b464dec02ca7aba6bc23d95b9c3965294d4b0224299279c630f3718\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6xmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3718d2f738ef780c738566c163d52afe355de6b8eb6004aa8e1dd5f37aa84d68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6xmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://44699a190cf1065e537f588d7479a10aab8fd4ffd1df2348c616d999ffdf7077\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6xmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6432d53914cee9e016c2134d99e48fa9ff1c05c0b88033992c2e0c70fc40d2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:07Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6xmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://93825fe5d9cf2d63c3e723b063f417f7e92442b14365852f935a06d316f443fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6xmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ae3fab13b078702f87ca8c586c9592c4554c95ead662d81ca65b50f368b6dcb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6xmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eec0d7ba21b34dfd1c2f196273fe7ab74d1f1d2ce8c0a31e345dc47ce1e25d27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eec0d7ba21b34dfd1c2f196273fe7ab74d1f1d2ce8c0a31e345dc47ce1e25d27\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-27T16:05:05Z\\\",\\\"message\\\":\\\"ns-operator/metrics LB per-node configs for network=default: []services.lbConfig(nil)\\\\nI1127 16:05:05.221951 6686 default_network_controller.go:776] Recording success event on pod openshift-machine-config-operator/kube-rbac-proxy-crio-crc\\\\nI1127 16:05:05.222871 6686 services_controller.go:445] Built service openshift-dns-operator/metrics LB 
template configs for network=default: []services.lbConfig(nil)\\\\nI1127 16:05:05.222889 6686 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-console/downloads\\\\\\\"}\\\\nI1127 16:05:05.222908 6686 services_controller.go:360] Finished syncing service downloads on namespace openshift-console for network=default : 1.67666ms\\\\nI1127 16:05:05.222903 6686 services_controller.go:451] Built service openshift-dns-operator/metrics cluster-wide LB for network=default: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-dns-operator/metrics_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-dns-operator/metrics\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.174\\\\\\\", Port:9393, Template:(*services.Template)(nil)}, Target\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-27T16:05:04Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-xkmt7_openshift-ovn-kubernetes(55af9c67-18ce-46f1-a761-d11ce16f42d6)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6xmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://db3362aeb65b368d2d9181cc898270fffb60938d0e9d32a862130d06a9f572a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6xmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c327fee553e5575dec76bc4456bc59424c89a9eea360cd4b23b594eff4c67a8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c327fee553e5575dec
76bc4456bc59424c89a9eea360cd4b23b594eff4c67a8d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:04:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6xmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:04:06Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-xkmt7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:05:15Z is after 2025-08-24T17:21:41Z" Nov 27 16:05:15 crc kubenswrapper[4707]: I1127 16:05:15.392960 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:05:15 crc kubenswrapper[4707]: I1127 16:05:15.393242 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:05:15 crc kubenswrapper[4707]: I1127 16:05:15.393388 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:05:15 crc kubenswrapper[4707]: I1127 16:05:15.393503 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:05:15 crc kubenswrapper[4707]: I1127 16:05:15.393592 4707 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:05:15Z","lastTransitionTime":"2025-11-27T16:05:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:05:15 crc kubenswrapper[4707]: I1127 16:05:15.393058 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-5x488" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"433a450e-2371-4893-b21e-19707b40e28f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://12b178a129ce5efbfdb77d023236ed9d674d0aba3c3a2903589aae995a5a753c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kub
e-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52jnc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://35888508a810deea27c61cbf1ec7362be9f61f99a63d4ab2b4c296a9ed47c65f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52jnc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:04:19Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-5x488\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:05:15Z is after 2025-08-24T17:21:41Z" Nov 27 16:05:15 crc kubenswrapper[4707]: I1127 16:05:15.408221 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-pdngz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5a0b58a3-9ac2-4446-b507-88eba42aa060\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://feac173e7b0d6e3cb53bd8d407d3d08e59e099a63702a377b7515283f85703ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\
",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6sq6r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:04:20Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-pdngz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:05:15Z is after 2025-08-24T17:21:41Z" Nov 27 16:05:15 crc kubenswrapper[4707]: I1127 16:05:15.425225 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-qcl5k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d382481-3c1e-49ed-8e27-265d495aa776\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:21Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:21Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j2krv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j2krv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:04:21Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-qcl5k\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:05:15Z is after 2025-08-24T17:21:41Z" Nov 27 16:05:15 crc kubenswrapper[4707]: I1127 16:05:15.448940 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b00fa8691d479d1db194704ff73bb85f1108c9541dd09ed4c1b02d0a3f0ab53d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"m
etrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:05:15Z is after 2025-08-24T17:21:41Z" Nov 27 16:05:15 crc kubenswrapper[4707]: I1127 16:05:15.471818 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-js6mm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ca48c08-f39d-41a2-847a-c893a2111492\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://094325a9e2d642c065e7c06cf63ba0fcad78c5220368b42504c2e87735d4a542\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"
imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://be6a704ee152614d36d4831778b016c009b0aab2d6cf1c2ec9767f8ecdeb7af6\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-27T16:04:54Z\\\",\\\"message\\\":\\\"2025-11-27T16:04:08+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_4a45ad71-8ab3-45c5-802b-cd135105e4f2\\\\n2025-11-27T16:04:08+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_4a45ad71-8ab3-45c5-802b-cd135105e4f2 to /host/opt/cni/bin/\\\\n2025-11-27T16:04:09Z [verbose] multus-daemon started\\\\n2025-11-27T16:04:09Z [verbose] Readiness Indicator file check\\\\n2025-11-27T16:04:54Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the 
condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-27T16:04:07Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bjt58\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"p
hase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:04:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-js6mm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:05:15Z is after 2025-08-24T17:21:41Z" Nov 27 16:05:15 crc kubenswrapper[4707]: I1127 16:05:15.497296 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:05:15 crc kubenswrapper[4707]: I1127 16:05:15.497422 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:05:15 crc kubenswrapper[4707]: I1127 16:05:15.497442 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:05:15 crc kubenswrapper[4707]: I1127 16:05:15.497469 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:05:15 crc kubenswrapper[4707]: I1127 16:05:15.497486 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:05:15Z","lastTransitionTime":"2025-11-27T16:05:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:05:15 crc kubenswrapper[4707]: I1127 16:05:15.508994 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"59e09a52-f64d-41da-a23b-7d3e34e8f109\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://489349489e52f4a113203c0c196cfe1ef96e97c6fc3c3d777a28e90f22aaee92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ea14f8eb17dbb350f7c0ad2285b4bb5ca2d8b2aae6840646d68a397ac2043d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://775fc674c0dab97e0c5a2548d467b973c592458dad1b88ce92344486a3a28eb5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f502a2429578531cb3ebbfde6bb4c049eed201a496ff3772c06ca28cace89b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://30b52b26bdca5b0705186872e6dd2eba3ece6436e866a4a5e30f4c2ecdbbaf20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b6be834802616cdf10f8b694bb0ce1b8828d0885e8801c5d890e74837d672a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b6be834802616cdf10f8b694bb0ce1b8828d0885e8801c5d890e74837d672a8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:03:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://de1e5852273af33e41f21c4df29c3e32cbfd6b2eb200b64d9e5353a85718074c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://de1e5852273af33e41f21c4df29c3e32cbfd6b2eb200b64d9e5353a85718074c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:03:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:03:47Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://c6f7eb0e4b78b6d2e316bd2c8f2ad3081c8eb0ca3c55a7063ccf281572ba2992\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c6f7eb0e4b78b6d2e316bd2c8f2ad3081c8eb0ca3c55a7063ccf281572ba2992\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:03:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
025-11-27T16:03:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:03:45Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:05:15Z is after 2025-08-24T17:21:41Z" Nov 27 16:05:15 crc kubenswrapper[4707]: I1127 16:05:15.536213 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f300806f-b97e-4453-b011-19442ca1240d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a25770e0e3eebd7e807f3775101a1cd5d47073e4a86850da475fecd4a9c88ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7412dcdb5a249551f558a01a8fecef5f873e5e0a16e917d5cbc1ca74a917f228\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://791f332aec1698d02da5ec585ad2fbfcb77529f24abc0e3a77581be38d7ce277\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a47eb0592819f0bc514399e163c88e2c63111aa0ec4a28f00f809ce1bdf2aa8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f12d0d89698cd6f702c595882dc7ed97fbc585b8a4ef3a8b3755c984a823dd9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-27T16:04:05Z\\\"
,\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1127 16:03:58.832133 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1127 16:03:58.833970 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3721107874/tls.crt::/tmp/serving-cert-3721107874/tls.key\\\\\\\"\\\\nI1127 16:04:05.094602 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1127 16:04:05.099813 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1127 16:04:05.099850 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1127 16:04:05.099879 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1127 16:04:05.099886 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1127 16:04:05.108199 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1127 16:04:05.108225 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1127 16:04:05.108231 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1127 16:04:05.108269 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1127 16:04:05.108276 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1127 16:04:05.108281 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1127 16:04:05.108284 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1127 16:04:05.108289 1 secure_serving.go:69] Use 
of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1127 16:04:05.114674 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-27T16:03:48Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf16f5508845d69184573b51f3052f79123556667cbd3ddf4b2adfb135312a3a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:48Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://608ad63f8f5352dad650253df603590a1cc20bc05d86dd561d659295264ab4b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://608ad63f8f5352dad650253df603590a
1cc20bc05d86dd561d659295264ab4b5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T16:03:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T16:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:03:45Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:05:15Z is after 2025-08-24T17:21:41Z" Nov 27 16:05:15 crc kubenswrapper[4707]: I1127 16:05:15.560340 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:05:15Z is after 2025-08-24T17:21:41Z" Nov 27 16:05:15 crc kubenswrapper[4707]: I1127 16:05:15.585275 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-c995m" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a83beb0d-8dd1-434a-ace2-933f98e3956f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://419d9b9f3f298028cf5f49afd438f1de90b8d1203cdfb09b924878b59e0fb723\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5xtzp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6227d8aa794cce73ef7607c3e6cf8f853c829bf5
6493acf04c0b555ddd264ba5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5xtzp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:04:06Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-c995m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:05:15Z is after 2025-08-24T17:21:41Z" Nov 27 16:05:15 crc kubenswrapper[4707]: I1127 16:05:15.601195 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:05:15 crc kubenswrapper[4707]: I1127 16:05:15.601260 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:05:15 crc kubenswrapper[4707]: I1127 16:05:15.601279 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:05:15 crc 
kubenswrapper[4707]: I1127 16:05:15.601308 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:05:15 crc kubenswrapper[4707]: I1127 16:05:15.601327 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:05:15Z","lastTransitionTime":"2025-11-27T16:05:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:05:15 crc kubenswrapper[4707]: I1127 16:05:15.608809 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7f481770-a9a1-422a-a707-e841a24a1503\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:03:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f0e4a5fa520e0dc048bef42e48e14bd06eb72586ca2b8e145734aa5450aa869d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\
"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://36f6a5537d5031d21ea157fbcedea40756eac2c44aa1859c0cfbf3dc9a9d9a13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c7144aa332827972c3ff984eb05b835b88c4d9d05552ac631fbca49f13e193a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\
\\"2025-11-27T16:03:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8211f74a0c0d16257330ebbb5727247d4229848595af3cb65f1cb752b05f3c37\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:03:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:03:45Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:05:15Z is after 2025-08-24T17:21:41Z" Nov 27 16:05:15 crc kubenswrapper[4707]: I1127 16:05:15.631551 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:05Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:05:15Z is after 2025-08-24T17:21:41Z" Nov 27 16:05:15 crc kubenswrapper[4707]: I1127 16:05:15.648595 4707 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-bhmsc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d1ec526b-6fb0-4c87-bd87-6aaf843e0c78\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T16:04:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f3b764fd994edba08f5e909683bd065ecf09fb7c9906457e37bb91343aad14a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T16:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kvs85\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T16:04:06Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-bhmsc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T16:05:15Z is after 2025-08-24T17:21:41Z" Nov 27 16:05:15 crc kubenswrapper[4707]: I1127 16:05:15.704513 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:05:15 crc kubenswrapper[4707]: I1127 16:05:15.704577 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:05:15 crc kubenswrapper[4707]: I1127 16:05:15.704596 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:05:15 crc kubenswrapper[4707]: I1127 16:05:15.704619 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:05:15 crc kubenswrapper[4707]: I1127 16:05:15.704636 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:05:15Z","lastTransitionTime":"2025-11-27T16:05:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:05:15 crc kubenswrapper[4707]: I1127 16:05:15.807835 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:05:15 crc kubenswrapper[4707]: I1127 16:05:15.807907 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:05:15 crc kubenswrapper[4707]: I1127 16:05:15.807929 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:05:15 crc kubenswrapper[4707]: I1127 16:05:15.807959 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:05:15 crc kubenswrapper[4707]: I1127 16:05:15.807989 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:05:15Z","lastTransitionTime":"2025-11-27T16:05:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:05:15 crc kubenswrapper[4707]: I1127 16:05:15.911609 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:05:15 crc kubenswrapper[4707]: I1127 16:05:15.912040 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:05:15 crc kubenswrapper[4707]: I1127 16:05:15.912233 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:05:15 crc kubenswrapper[4707]: I1127 16:05:15.912479 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:05:15 crc kubenswrapper[4707]: I1127 16:05:15.912679 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:05:15Z","lastTransitionTime":"2025-11-27T16:05:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:05:16 crc kubenswrapper[4707]: I1127 16:05:16.016111 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:05:16 crc kubenswrapper[4707]: I1127 16:05:16.016173 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:05:16 crc kubenswrapper[4707]: I1127 16:05:16.016202 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:05:16 crc kubenswrapper[4707]: I1127 16:05:16.016239 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:05:16 crc kubenswrapper[4707]: I1127 16:05:16.016261 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:05:16Z","lastTransitionTime":"2025-11-27T16:05:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:05:16 crc kubenswrapper[4707]: I1127 16:05:16.120489 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:05:16 crc kubenswrapper[4707]: I1127 16:05:16.120541 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:05:16 crc kubenswrapper[4707]: I1127 16:05:16.120562 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:05:16 crc kubenswrapper[4707]: I1127 16:05:16.120585 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:05:16 crc kubenswrapper[4707]: I1127 16:05:16.120603 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:05:16Z","lastTransitionTime":"2025-11-27T16:05:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:05:16 crc kubenswrapper[4707]: I1127 16:05:16.224219 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:05:16 crc kubenswrapper[4707]: I1127 16:05:16.224285 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:05:16 crc kubenswrapper[4707]: I1127 16:05:16.224307 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:05:16 crc kubenswrapper[4707]: I1127 16:05:16.224332 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:05:16 crc kubenswrapper[4707]: I1127 16:05:16.224352 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:05:16Z","lastTransitionTime":"2025-11-27T16:05:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:05:16 crc kubenswrapper[4707]: I1127 16:05:16.327795 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:05:16 crc kubenswrapper[4707]: I1127 16:05:16.327875 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:05:16 crc kubenswrapper[4707]: I1127 16:05:16.327896 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:05:16 crc kubenswrapper[4707]: I1127 16:05:16.327924 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:05:16 crc kubenswrapper[4707]: I1127 16:05:16.327943 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:05:16Z","lastTransitionTime":"2025-11-27T16:05:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:05:16 crc kubenswrapper[4707]: I1127 16:05:16.430689 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:05:16 crc kubenswrapper[4707]: I1127 16:05:16.430747 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:05:16 crc kubenswrapper[4707]: I1127 16:05:16.430763 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:05:16 crc kubenswrapper[4707]: I1127 16:05:16.430788 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:05:16 crc kubenswrapper[4707]: I1127 16:05:16.430808 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:05:16Z","lastTransitionTime":"2025-11-27T16:05:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:05:16 crc kubenswrapper[4707]: I1127 16:05:16.533821 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:05:16 crc kubenswrapper[4707]: I1127 16:05:16.533870 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:05:16 crc kubenswrapper[4707]: I1127 16:05:16.533889 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:05:16 crc kubenswrapper[4707]: I1127 16:05:16.533938 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:05:16 crc kubenswrapper[4707]: I1127 16:05:16.533955 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:05:16Z","lastTransitionTime":"2025-11-27T16:05:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:05:16 crc kubenswrapper[4707]: I1127 16:05:16.636814 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:05:16 crc kubenswrapper[4707]: I1127 16:05:16.636865 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:05:16 crc kubenswrapper[4707]: I1127 16:05:16.636881 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:05:16 crc kubenswrapper[4707]: I1127 16:05:16.636901 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:05:16 crc kubenswrapper[4707]: I1127 16:05:16.636918 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:05:16Z","lastTransitionTime":"2025-11-27T16:05:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:05:16 crc kubenswrapper[4707]: I1127 16:05:16.740189 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:05:16 crc kubenswrapper[4707]: I1127 16:05:16.740259 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:05:16 crc kubenswrapper[4707]: I1127 16:05:16.740296 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:05:16 crc kubenswrapper[4707]: I1127 16:05:16.740341 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:05:16 crc kubenswrapper[4707]: I1127 16:05:16.740364 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:05:16Z","lastTransitionTime":"2025-11-27T16:05:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:05:16 crc kubenswrapper[4707]: I1127 16:05:16.843783 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:05:16 crc kubenswrapper[4707]: I1127 16:05:16.843860 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:05:16 crc kubenswrapper[4707]: I1127 16:05:16.843883 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:05:16 crc kubenswrapper[4707]: I1127 16:05:16.843915 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:05:16 crc kubenswrapper[4707]: I1127 16:05:16.843937 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:05:16Z","lastTransitionTime":"2025-11-27T16:05:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:05:16 crc kubenswrapper[4707]: I1127 16:05:16.947250 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:05:16 crc kubenswrapper[4707]: I1127 16:05:16.947343 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:05:16 crc kubenswrapper[4707]: I1127 16:05:16.947408 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:05:16 crc kubenswrapper[4707]: I1127 16:05:16.947444 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:05:16 crc kubenswrapper[4707]: I1127 16:05:16.947467 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:05:16Z","lastTransitionTime":"2025-11-27T16:05:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:05:17 crc kubenswrapper[4707]: I1127 16:05:17.050765 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:05:17 crc kubenswrapper[4707]: I1127 16:05:17.050825 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:05:17 crc kubenswrapper[4707]: I1127 16:05:17.051152 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:05:17 crc kubenswrapper[4707]: I1127 16:05:17.051192 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:05:17 crc kubenswrapper[4707]: I1127 16:05:17.051218 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:05:17Z","lastTransitionTime":"2025-11-27T16:05:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:05:17 crc kubenswrapper[4707]: I1127 16:05:17.154625 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:05:17 crc kubenswrapper[4707]: I1127 16:05:17.154667 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:05:17 crc kubenswrapper[4707]: I1127 16:05:17.154685 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:05:17 crc kubenswrapper[4707]: I1127 16:05:17.154709 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:05:17 crc kubenswrapper[4707]: I1127 16:05:17.154728 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:05:17Z","lastTransitionTime":"2025-11-27T16:05:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:05:17 crc kubenswrapper[4707]: I1127 16:05:17.194821 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 27 16:05:17 crc kubenswrapper[4707]: I1127 16:05:17.194862 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 27 16:05:17 crc kubenswrapper[4707]: I1127 16:05:17.194972 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 27 16:05:17 crc kubenswrapper[4707]: E1127 16:05:17.195059 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 27 16:05:17 crc kubenswrapper[4707]: I1127 16:05:17.195086 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qcl5k" Nov 27 16:05:17 crc kubenswrapper[4707]: E1127 16:05:17.195268 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 27 16:05:17 crc kubenswrapper[4707]: E1127 16:05:17.195592 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-qcl5k" podUID="7d382481-3c1e-49ed-8e27-265d495aa776" Nov 27 16:05:17 crc kubenswrapper[4707]: E1127 16:05:17.195668 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 27 16:05:17 crc kubenswrapper[4707]: I1127 16:05:17.257895 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:05:17 crc kubenswrapper[4707]: I1127 16:05:17.257957 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:05:17 crc kubenswrapper[4707]: I1127 16:05:17.257976 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:05:17 crc kubenswrapper[4707]: I1127 16:05:17.258003 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:05:17 crc kubenswrapper[4707]: I1127 16:05:17.258022 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:05:17Z","lastTransitionTime":"2025-11-27T16:05:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:05:17 crc kubenswrapper[4707]: I1127 16:05:17.361171 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:05:17 crc kubenswrapper[4707]: I1127 16:05:17.361236 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:05:17 crc kubenswrapper[4707]: I1127 16:05:17.361257 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:05:17 crc kubenswrapper[4707]: I1127 16:05:17.361285 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:05:17 crc kubenswrapper[4707]: I1127 16:05:17.361306 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:05:17Z","lastTransitionTime":"2025-11-27T16:05:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:05:17 crc kubenswrapper[4707]: I1127 16:05:17.464512 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:05:17 crc kubenswrapper[4707]: I1127 16:05:17.464569 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:05:17 crc kubenswrapper[4707]: I1127 16:05:17.464587 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:05:17 crc kubenswrapper[4707]: I1127 16:05:17.464613 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:05:17 crc kubenswrapper[4707]: I1127 16:05:17.464631 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:05:17Z","lastTransitionTime":"2025-11-27T16:05:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:05:17 crc kubenswrapper[4707]: I1127 16:05:17.567155 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:05:17 crc kubenswrapper[4707]: I1127 16:05:17.567874 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:05:17 crc kubenswrapper[4707]: I1127 16:05:17.568028 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:05:17 crc kubenswrapper[4707]: I1127 16:05:17.568177 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:05:17 crc kubenswrapper[4707]: I1127 16:05:17.568307 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:05:17Z","lastTransitionTime":"2025-11-27T16:05:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:05:17 crc kubenswrapper[4707]: I1127 16:05:17.670295 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:05:17 crc kubenswrapper[4707]: I1127 16:05:17.670346 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:05:17 crc kubenswrapper[4707]: I1127 16:05:17.670364 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:05:17 crc kubenswrapper[4707]: I1127 16:05:17.670409 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:05:17 crc kubenswrapper[4707]: I1127 16:05:17.670425 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:05:17Z","lastTransitionTime":"2025-11-27T16:05:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:05:17 crc kubenswrapper[4707]: I1127 16:05:17.773854 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:05:17 crc kubenswrapper[4707]: I1127 16:05:17.773902 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:05:17 crc kubenswrapper[4707]: I1127 16:05:17.773918 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:05:17 crc kubenswrapper[4707]: I1127 16:05:17.773940 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:05:17 crc kubenswrapper[4707]: I1127 16:05:17.773956 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:05:17Z","lastTransitionTime":"2025-11-27T16:05:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:05:17 crc kubenswrapper[4707]: I1127 16:05:17.876667 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:05:17 crc kubenswrapper[4707]: I1127 16:05:17.876720 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:05:17 crc kubenswrapper[4707]: I1127 16:05:17.876737 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:05:17 crc kubenswrapper[4707]: I1127 16:05:17.876762 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:05:17 crc kubenswrapper[4707]: I1127 16:05:17.876781 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:05:17Z","lastTransitionTime":"2025-11-27T16:05:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:05:17 crc kubenswrapper[4707]: I1127 16:05:17.980112 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:05:17 crc kubenswrapper[4707]: I1127 16:05:17.980166 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:05:17 crc kubenswrapper[4707]: I1127 16:05:17.980185 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:05:17 crc kubenswrapper[4707]: I1127 16:05:17.980210 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:05:17 crc kubenswrapper[4707]: I1127 16:05:17.980226 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:05:17Z","lastTransitionTime":"2025-11-27T16:05:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:05:18 crc kubenswrapper[4707]: I1127 16:05:18.083567 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:05:18 crc kubenswrapper[4707]: I1127 16:05:18.083612 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:05:18 crc kubenswrapper[4707]: I1127 16:05:18.083630 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:05:18 crc kubenswrapper[4707]: I1127 16:05:18.083653 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:05:18 crc kubenswrapper[4707]: I1127 16:05:18.083670 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:05:18Z","lastTransitionTime":"2025-11-27T16:05:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:05:18 crc kubenswrapper[4707]: I1127 16:05:18.187293 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:05:18 crc kubenswrapper[4707]: I1127 16:05:18.187357 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:05:18 crc kubenswrapper[4707]: I1127 16:05:18.187424 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:05:18 crc kubenswrapper[4707]: I1127 16:05:18.187458 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:05:18 crc kubenswrapper[4707]: I1127 16:05:18.187483 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:05:18Z","lastTransitionTime":"2025-11-27T16:05:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:05:18 crc kubenswrapper[4707]: I1127 16:05:18.196153 4707 scope.go:117] "RemoveContainer" containerID="eec0d7ba21b34dfd1c2f196273fe7ab74d1f1d2ce8c0a31e345dc47ce1e25d27" Nov 27 16:05:18 crc kubenswrapper[4707]: E1127 16:05:18.196502 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-xkmt7_openshift-ovn-kubernetes(55af9c67-18ce-46f1-a761-d11ce16f42d6)\"" pod="openshift-ovn-kubernetes/ovnkube-node-xkmt7" podUID="55af9c67-18ce-46f1-a761-d11ce16f42d6" Nov 27 16:05:18 crc kubenswrapper[4707]: I1127 16:05:18.290133 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:05:18 crc kubenswrapper[4707]: I1127 16:05:18.290188 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:05:18 crc kubenswrapper[4707]: I1127 16:05:18.290208 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:05:18 crc kubenswrapper[4707]: I1127 16:05:18.290241 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:05:18 crc kubenswrapper[4707]: I1127 16:05:18.290301 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:05:18Z","lastTransitionTime":"2025-11-27T16:05:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:05:18 crc kubenswrapper[4707]: I1127 16:05:18.393223 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:05:18 crc kubenswrapper[4707]: I1127 16:05:18.393314 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:05:18 crc kubenswrapper[4707]: I1127 16:05:18.393342 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:05:18 crc kubenswrapper[4707]: I1127 16:05:18.393404 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:05:18 crc kubenswrapper[4707]: I1127 16:05:18.393430 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:05:18Z","lastTransitionTime":"2025-11-27T16:05:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:05:18 crc kubenswrapper[4707]: I1127 16:05:18.496899 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:05:18 crc kubenswrapper[4707]: I1127 16:05:18.496979 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:05:18 crc kubenswrapper[4707]: I1127 16:05:18.496999 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:05:18 crc kubenswrapper[4707]: I1127 16:05:18.497031 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:05:18 crc kubenswrapper[4707]: I1127 16:05:18.497053 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:05:18Z","lastTransitionTime":"2025-11-27T16:05:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:05:18 crc kubenswrapper[4707]: I1127 16:05:18.599658 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:05:18 crc kubenswrapper[4707]: I1127 16:05:18.599709 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:05:18 crc kubenswrapper[4707]: I1127 16:05:18.599727 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:05:18 crc kubenswrapper[4707]: I1127 16:05:18.599749 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:05:18 crc kubenswrapper[4707]: I1127 16:05:18.599769 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:05:18Z","lastTransitionTime":"2025-11-27T16:05:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:05:18 crc kubenswrapper[4707]: I1127 16:05:18.702513 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:05:18 crc kubenswrapper[4707]: I1127 16:05:18.702576 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:05:18 crc kubenswrapper[4707]: I1127 16:05:18.702594 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:05:18 crc kubenswrapper[4707]: I1127 16:05:18.702617 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:05:18 crc kubenswrapper[4707]: I1127 16:05:18.702633 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:05:18Z","lastTransitionTime":"2025-11-27T16:05:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:05:18 crc kubenswrapper[4707]: I1127 16:05:18.805582 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:05:18 crc kubenswrapper[4707]: I1127 16:05:18.805633 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:05:18 crc kubenswrapper[4707]: I1127 16:05:18.805650 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:05:18 crc kubenswrapper[4707]: I1127 16:05:18.805672 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:05:18 crc kubenswrapper[4707]: I1127 16:05:18.805690 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:05:18Z","lastTransitionTime":"2025-11-27T16:05:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:05:18 crc kubenswrapper[4707]: I1127 16:05:18.909291 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:05:18 crc kubenswrapper[4707]: I1127 16:05:18.909365 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:05:18 crc kubenswrapper[4707]: I1127 16:05:18.909410 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:05:18 crc kubenswrapper[4707]: I1127 16:05:18.909439 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:05:18 crc kubenswrapper[4707]: I1127 16:05:18.909458 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:05:18Z","lastTransitionTime":"2025-11-27T16:05:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:05:19 crc kubenswrapper[4707]: I1127 16:05:19.012140 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:05:19 crc kubenswrapper[4707]: I1127 16:05:19.012192 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:05:19 crc kubenswrapper[4707]: I1127 16:05:19.012209 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:05:19 crc kubenswrapper[4707]: I1127 16:05:19.012230 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:05:19 crc kubenswrapper[4707]: I1127 16:05:19.012247 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:05:19Z","lastTransitionTime":"2025-11-27T16:05:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:05:19 crc kubenswrapper[4707]: I1127 16:05:19.115703 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:05:19 crc kubenswrapper[4707]: I1127 16:05:19.115751 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:05:19 crc kubenswrapper[4707]: I1127 16:05:19.115770 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:05:19 crc kubenswrapper[4707]: I1127 16:05:19.115794 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:05:19 crc kubenswrapper[4707]: I1127 16:05:19.115815 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:05:19Z","lastTransitionTime":"2025-11-27T16:05:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:05:19 crc kubenswrapper[4707]: I1127 16:05:19.195094 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qcl5k" Nov 27 16:05:19 crc kubenswrapper[4707]: I1127 16:05:19.195213 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 27 16:05:19 crc kubenswrapper[4707]: I1127 16:05:19.195238 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 27 16:05:19 crc kubenswrapper[4707]: E1127 16:05:19.195299 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qcl5k" podUID="7d382481-3c1e-49ed-8e27-265d495aa776" Nov 27 16:05:19 crc kubenswrapper[4707]: I1127 16:05:19.195355 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 27 16:05:19 crc kubenswrapper[4707]: E1127 16:05:19.195603 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 27 16:05:19 crc kubenswrapper[4707]: E1127 16:05:19.195899 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 27 16:05:19 crc kubenswrapper[4707]: E1127 16:05:19.196006 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 27 16:05:19 crc kubenswrapper[4707]: I1127 16:05:19.218733 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:05:19 crc kubenswrapper[4707]: I1127 16:05:19.218790 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:05:19 crc kubenswrapper[4707]: I1127 16:05:19.218810 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:05:19 crc kubenswrapper[4707]: I1127 16:05:19.218832 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:05:19 crc kubenswrapper[4707]: I1127 16:05:19.218850 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:05:19Z","lastTransitionTime":"2025-11-27T16:05:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:05:19 crc kubenswrapper[4707]: I1127 16:05:19.322136 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:05:19 crc kubenswrapper[4707]: I1127 16:05:19.322739 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:05:19 crc kubenswrapper[4707]: I1127 16:05:19.323104 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:05:19 crc kubenswrapper[4707]: I1127 16:05:19.323341 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:05:19 crc kubenswrapper[4707]: I1127 16:05:19.323865 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:05:19Z","lastTransitionTime":"2025-11-27T16:05:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:05:19 crc kubenswrapper[4707]: I1127 16:05:19.427402 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:05:19 crc kubenswrapper[4707]: I1127 16:05:19.427475 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:05:19 crc kubenswrapper[4707]: I1127 16:05:19.427493 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:05:19 crc kubenswrapper[4707]: I1127 16:05:19.427518 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:05:19 crc kubenswrapper[4707]: I1127 16:05:19.427535 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:05:19Z","lastTransitionTime":"2025-11-27T16:05:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:05:19 crc kubenswrapper[4707]: I1127 16:05:19.542577 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:05:19 crc kubenswrapper[4707]: I1127 16:05:19.542651 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:05:19 crc kubenswrapper[4707]: I1127 16:05:19.542674 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:05:19 crc kubenswrapper[4707]: I1127 16:05:19.542707 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:05:19 crc kubenswrapper[4707]: I1127 16:05:19.542729 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:05:19Z","lastTransitionTime":"2025-11-27T16:05:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:05:19 crc kubenswrapper[4707]: I1127 16:05:19.645726 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:05:19 crc kubenswrapper[4707]: I1127 16:05:19.645772 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:05:19 crc kubenswrapper[4707]: I1127 16:05:19.645791 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:05:19 crc kubenswrapper[4707]: I1127 16:05:19.645816 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:05:19 crc kubenswrapper[4707]: I1127 16:05:19.645836 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:05:19Z","lastTransitionTime":"2025-11-27T16:05:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:05:19 crc kubenswrapper[4707]: I1127 16:05:19.749730 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:05:19 crc kubenswrapper[4707]: I1127 16:05:19.749829 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:05:19 crc kubenswrapper[4707]: I1127 16:05:19.749847 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:05:19 crc kubenswrapper[4707]: I1127 16:05:19.749877 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:05:19 crc kubenswrapper[4707]: I1127 16:05:19.749897 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:05:19Z","lastTransitionTime":"2025-11-27T16:05:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:05:19 crc kubenswrapper[4707]: I1127 16:05:19.852984 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:05:19 crc kubenswrapper[4707]: I1127 16:05:19.853048 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:05:19 crc kubenswrapper[4707]: I1127 16:05:19.853072 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:05:19 crc kubenswrapper[4707]: I1127 16:05:19.853102 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:05:19 crc kubenswrapper[4707]: I1127 16:05:19.853125 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:05:19Z","lastTransitionTime":"2025-11-27T16:05:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:05:19 crc kubenswrapper[4707]: I1127 16:05:19.955558 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:05:19 crc kubenswrapper[4707]: I1127 16:05:19.955628 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:05:19 crc kubenswrapper[4707]: I1127 16:05:19.955646 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:05:19 crc kubenswrapper[4707]: I1127 16:05:19.955674 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:05:19 crc kubenswrapper[4707]: I1127 16:05:19.955696 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:05:19Z","lastTransitionTime":"2025-11-27T16:05:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:05:20 crc kubenswrapper[4707]: I1127 16:05:20.059478 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:05:20 crc kubenswrapper[4707]: I1127 16:05:20.059565 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:05:20 crc kubenswrapper[4707]: I1127 16:05:20.059584 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:05:20 crc kubenswrapper[4707]: I1127 16:05:20.059614 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:05:20 crc kubenswrapper[4707]: I1127 16:05:20.059637 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:05:20Z","lastTransitionTime":"2025-11-27T16:05:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:05:20 crc kubenswrapper[4707]: I1127 16:05:20.163509 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:05:20 crc kubenswrapper[4707]: I1127 16:05:20.163601 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:05:20 crc kubenswrapper[4707]: I1127 16:05:20.163626 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:05:20 crc kubenswrapper[4707]: I1127 16:05:20.163663 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:05:20 crc kubenswrapper[4707]: I1127 16:05:20.163692 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:05:20Z","lastTransitionTime":"2025-11-27T16:05:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:05:20 crc kubenswrapper[4707]: I1127 16:05:20.267007 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:05:20 crc kubenswrapper[4707]: I1127 16:05:20.267068 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:05:20 crc kubenswrapper[4707]: I1127 16:05:20.267085 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:05:20 crc kubenswrapper[4707]: I1127 16:05:20.267109 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:05:20 crc kubenswrapper[4707]: I1127 16:05:20.267129 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:05:20Z","lastTransitionTime":"2025-11-27T16:05:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:05:20 crc kubenswrapper[4707]: I1127 16:05:20.370605 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:05:20 crc kubenswrapper[4707]: I1127 16:05:20.370665 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:05:20 crc kubenswrapper[4707]: I1127 16:05:20.370683 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:05:20 crc kubenswrapper[4707]: I1127 16:05:20.370725 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:05:20 crc kubenswrapper[4707]: I1127 16:05:20.370763 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:05:20Z","lastTransitionTime":"2025-11-27T16:05:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:05:20 crc kubenswrapper[4707]: I1127 16:05:20.474968 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:05:20 crc kubenswrapper[4707]: I1127 16:05:20.475068 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:05:20 crc kubenswrapper[4707]: I1127 16:05:20.475091 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:05:20 crc kubenswrapper[4707]: I1127 16:05:20.475117 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:05:20 crc kubenswrapper[4707]: I1127 16:05:20.475135 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:05:20Z","lastTransitionTime":"2025-11-27T16:05:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:05:20 crc kubenswrapper[4707]: I1127 16:05:20.578274 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:05:20 crc kubenswrapper[4707]: I1127 16:05:20.578339 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:05:20 crc kubenswrapper[4707]: I1127 16:05:20.578353 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:05:20 crc kubenswrapper[4707]: I1127 16:05:20.578425 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:05:20 crc kubenswrapper[4707]: I1127 16:05:20.578442 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:05:20Z","lastTransitionTime":"2025-11-27T16:05:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:05:20 crc kubenswrapper[4707]: I1127 16:05:20.681304 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:05:20 crc kubenswrapper[4707]: I1127 16:05:20.681396 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:05:20 crc kubenswrapper[4707]: I1127 16:05:20.681416 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:05:20 crc kubenswrapper[4707]: I1127 16:05:20.681443 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:05:20 crc kubenswrapper[4707]: I1127 16:05:20.681481 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:05:20Z","lastTransitionTime":"2025-11-27T16:05:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:05:20 crc kubenswrapper[4707]: I1127 16:05:20.784858 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:05:20 crc kubenswrapper[4707]: I1127 16:05:20.784930 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:05:20 crc kubenswrapper[4707]: I1127 16:05:20.784944 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:05:20 crc kubenswrapper[4707]: I1127 16:05:20.784969 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:05:20 crc kubenswrapper[4707]: I1127 16:05:20.784984 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:05:20Z","lastTransitionTime":"2025-11-27T16:05:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:05:20 crc kubenswrapper[4707]: I1127 16:05:20.911084 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:05:20 crc kubenswrapper[4707]: I1127 16:05:20.911140 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:05:20 crc kubenswrapper[4707]: I1127 16:05:20.911162 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:05:20 crc kubenswrapper[4707]: I1127 16:05:20.911186 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:05:20 crc kubenswrapper[4707]: I1127 16:05:20.911205 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:05:20Z","lastTransitionTime":"2025-11-27T16:05:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:05:21 crc kubenswrapper[4707]: I1127 16:05:21.014765 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:05:21 crc kubenswrapper[4707]: I1127 16:05:21.014826 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:05:21 crc kubenswrapper[4707]: I1127 16:05:21.014843 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:05:21 crc kubenswrapper[4707]: I1127 16:05:21.014870 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:05:21 crc kubenswrapper[4707]: I1127 16:05:21.014891 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:05:21Z","lastTransitionTime":"2025-11-27T16:05:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:05:21 crc kubenswrapper[4707]: I1127 16:05:21.118289 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:05:21 crc kubenswrapper[4707]: I1127 16:05:21.118344 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:05:21 crc kubenswrapper[4707]: I1127 16:05:21.118363 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:05:21 crc kubenswrapper[4707]: I1127 16:05:21.118428 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:05:21 crc kubenswrapper[4707]: I1127 16:05:21.118448 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:05:21Z","lastTransitionTime":"2025-11-27T16:05:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:05:21 crc kubenswrapper[4707]: I1127 16:05:21.194559 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 27 16:05:21 crc kubenswrapper[4707]: I1127 16:05:21.194593 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 27 16:05:21 crc kubenswrapper[4707]: I1127 16:05:21.194770 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-qcl5k" Nov 27 16:05:21 crc kubenswrapper[4707]: E1127 16:05:21.194772 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 27 16:05:21 crc kubenswrapper[4707]: I1127 16:05:21.194827 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 27 16:05:21 crc kubenswrapper[4707]: E1127 16:05:21.195346 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 27 16:05:21 crc kubenswrapper[4707]: E1127 16:05:21.195442 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 27 16:05:21 crc kubenswrapper[4707]: E1127 16:05:21.195220 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qcl5k" podUID="7d382481-3c1e-49ed-8e27-265d495aa776" Nov 27 16:05:21 crc kubenswrapper[4707]: I1127 16:05:21.221889 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:05:21 crc kubenswrapper[4707]: I1127 16:05:21.221956 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:05:21 crc kubenswrapper[4707]: I1127 16:05:21.221977 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:05:21 crc kubenswrapper[4707]: I1127 16:05:21.222004 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:05:21 crc kubenswrapper[4707]: I1127 16:05:21.222023 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:05:21Z","lastTransitionTime":"2025-11-27T16:05:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:05:21 crc kubenswrapper[4707]: I1127 16:05:21.325047 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:05:21 crc kubenswrapper[4707]: I1127 16:05:21.325107 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:05:21 crc kubenswrapper[4707]: I1127 16:05:21.325126 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:05:21 crc kubenswrapper[4707]: I1127 16:05:21.325152 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:05:21 crc kubenswrapper[4707]: I1127 16:05:21.325169 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:05:21Z","lastTransitionTime":"2025-11-27T16:05:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:05:21 crc kubenswrapper[4707]: I1127 16:05:21.428349 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:05:21 crc kubenswrapper[4707]: I1127 16:05:21.428439 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:05:21 crc kubenswrapper[4707]: I1127 16:05:21.428457 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:05:21 crc kubenswrapper[4707]: I1127 16:05:21.428505 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:05:21 crc kubenswrapper[4707]: I1127 16:05:21.428524 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:05:21Z","lastTransitionTime":"2025-11-27T16:05:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:05:21 crc kubenswrapper[4707]: I1127 16:05:21.531108 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:05:21 crc kubenswrapper[4707]: I1127 16:05:21.531169 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:05:21 crc kubenswrapper[4707]: I1127 16:05:21.531188 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:05:21 crc kubenswrapper[4707]: I1127 16:05:21.531216 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:05:21 crc kubenswrapper[4707]: I1127 16:05:21.531237 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:05:21Z","lastTransitionTime":"2025-11-27T16:05:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:05:21 crc kubenswrapper[4707]: I1127 16:05:21.634102 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:05:21 crc kubenswrapper[4707]: I1127 16:05:21.634171 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:05:21 crc kubenswrapper[4707]: I1127 16:05:21.634195 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:05:21 crc kubenswrapper[4707]: I1127 16:05:21.634225 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:05:21 crc kubenswrapper[4707]: I1127 16:05:21.634246 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:05:21Z","lastTransitionTime":"2025-11-27T16:05:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:05:21 crc kubenswrapper[4707]: I1127 16:05:21.737207 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:05:21 crc kubenswrapper[4707]: I1127 16:05:21.737264 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:05:21 crc kubenswrapper[4707]: I1127 16:05:21.737286 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:05:21 crc kubenswrapper[4707]: I1127 16:05:21.737313 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:05:21 crc kubenswrapper[4707]: I1127 16:05:21.737337 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:05:21Z","lastTransitionTime":"2025-11-27T16:05:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:05:21 crc kubenswrapper[4707]: I1127 16:05:21.840793 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:05:21 crc kubenswrapper[4707]: I1127 16:05:21.840859 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:05:21 crc kubenswrapper[4707]: I1127 16:05:21.840881 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:05:21 crc kubenswrapper[4707]: I1127 16:05:21.840909 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:05:21 crc kubenswrapper[4707]: I1127 16:05:21.840929 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:05:21Z","lastTransitionTime":"2025-11-27T16:05:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:05:21 crc kubenswrapper[4707]: I1127 16:05:21.944487 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:05:21 crc kubenswrapper[4707]: I1127 16:05:21.944556 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:05:21 crc kubenswrapper[4707]: I1127 16:05:21.944580 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:05:21 crc kubenswrapper[4707]: I1127 16:05:21.944635 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:05:21 crc kubenswrapper[4707]: I1127 16:05:21.944663 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:05:21Z","lastTransitionTime":"2025-11-27T16:05:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:05:22 crc kubenswrapper[4707]: I1127 16:05:22.098739 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:05:22 crc kubenswrapper[4707]: I1127 16:05:22.098807 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:05:22 crc kubenswrapper[4707]: I1127 16:05:22.098824 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:05:22 crc kubenswrapper[4707]: I1127 16:05:22.098851 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:05:22 crc kubenswrapper[4707]: I1127 16:05:22.098868 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:05:22Z","lastTransitionTime":"2025-11-27T16:05:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:05:22 crc kubenswrapper[4707]: I1127 16:05:22.201830 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:05:22 crc kubenswrapper[4707]: I1127 16:05:22.201885 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:05:22 crc kubenswrapper[4707]: I1127 16:05:22.201911 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:05:22 crc kubenswrapper[4707]: I1127 16:05:22.201936 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:05:22 crc kubenswrapper[4707]: I1127 16:05:22.201953 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:05:22Z","lastTransitionTime":"2025-11-27T16:05:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:05:22 crc kubenswrapper[4707]: I1127 16:05:22.304970 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:05:22 crc kubenswrapper[4707]: I1127 16:05:22.305051 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:05:22 crc kubenswrapper[4707]: I1127 16:05:22.305077 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:05:22 crc kubenswrapper[4707]: I1127 16:05:22.305109 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:05:22 crc kubenswrapper[4707]: I1127 16:05:22.305131 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:05:22Z","lastTransitionTime":"2025-11-27T16:05:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:05:22 crc kubenswrapper[4707]: I1127 16:05:22.407834 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:05:22 crc kubenswrapper[4707]: I1127 16:05:22.407900 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:05:22 crc kubenswrapper[4707]: I1127 16:05:22.407921 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:05:22 crc kubenswrapper[4707]: I1127 16:05:22.407954 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:05:22 crc kubenswrapper[4707]: I1127 16:05:22.407975 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:05:22Z","lastTransitionTime":"2025-11-27T16:05:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:05:22 crc kubenswrapper[4707]: I1127 16:05:22.511660 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:05:22 crc kubenswrapper[4707]: I1127 16:05:22.511730 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:05:22 crc kubenswrapper[4707]: I1127 16:05:22.511748 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:05:22 crc kubenswrapper[4707]: I1127 16:05:22.511775 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:05:22 crc kubenswrapper[4707]: I1127 16:05:22.511795 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:05:22Z","lastTransitionTime":"2025-11-27T16:05:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:05:22 crc kubenswrapper[4707]: I1127 16:05:22.615135 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:05:22 crc kubenswrapper[4707]: I1127 16:05:22.615198 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:05:22 crc kubenswrapper[4707]: I1127 16:05:22.615217 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:05:22 crc kubenswrapper[4707]: I1127 16:05:22.615241 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:05:22 crc kubenswrapper[4707]: I1127 16:05:22.615257 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:05:22Z","lastTransitionTime":"2025-11-27T16:05:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:05:22 crc kubenswrapper[4707]: I1127 16:05:22.717909 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:05:22 crc kubenswrapper[4707]: I1127 16:05:22.717973 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:05:22 crc kubenswrapper[4707]: I1127 16:05:22.717990 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:05:22 crc kubenswrapper[4707]: I1127 16:05:22.718015 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:05:22 crc kubenswrapper[4707]: I1127 16:05:22.718031 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:05:22Z","lastTransitionTime":"2025-11-27T16:05:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:05:22 crc kubenswrapper[4707]: I1127 16:05:22.821143 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:05:22 crc kubenswrapper[4707]: I1127 16:05:22.821199 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:05:22 crc kubenswrapper[4707]: I1127 16:05:22.821248 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:05:22 crc kubenswrapper[4707]: I1127 16:05:22.821273 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:05:22 crc kubenswrapper[4707]: I1127 16:05:22.821291 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:05:22Z","lastTransitionTime":"2025-11-27T16:05:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:05:22 crc kubenswrapper[4707]: I1127 16:05:22.924613 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:05:22 crc kubenswrapper[4707]: I1127 16:05:22.924674 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:05:22 crc kubenswrapper[4707]: I1127 16:05:22.924690 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:05:22 crc kubenswrapper[4707]: I1127 16:05:22.924714 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:05:22 crc kubenswrapper[4707]: I1127 16:05:22.924731 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:05:22Z","lastTransitionTime":"2025-11-27T16:05:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:05:23 crc kubenswrapper[4707]: I1127 16:05:23.028194 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:05:23 crc kubenswrapper[4707]: I1127 16:05:23.028263 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:05:23 crc kubenswrapper[4707]: I1127 16:05:23.028280 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:05:23 crc kubenswrapper[4707]: I1127 16:05:23.028306 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:05:23 crc kubenswrapper[4707]: I1127 16:05:23.028323 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:05:23Z","lastTransitionTime":"2025-11-27T16:05:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:05:23 crc kubenswrapper[4707]: I1127 16:05:23.132076 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:05:23 crc kubenswrapper[4707]: I1127 16:05:23.132137 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:05:23 crc kubenswrapper[4707]: I1127 16:05:23.132155 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:05:23 crc kubenswrapper[4707]: I1127 16:05:23.132235 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:05:23 crc kubenswrapper[4707]: I1127 16:05:23.132259 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:05:23Z","lastTransitionTime":"2025-11-27T16:05:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:05:23 crc kubenswrapper[4707]: I1127 16:05:23.194576 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qcl5k" Nov 27 16:05:23 crc kubenswrapper[4707]: I1127 16:05:23.194644 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 27 16:05:23 crc kubenswrapper[4707]: I1127 16:05:23.194598 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 27 16:05:23 crc kubenswrapper[4707]: E1127 16:05:23.194797 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qcl5k" podUID="7d382481-3c1e-49ed-8e27-265d495aa776" Nov 27 16:05:23 crc kubenswrapper[4707]: I1127 16:05:23.194819 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 27 16:05:23 crc kubenswrapper[4707]: E1127 16:05:23.194943 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 27 16:05:23 crc kubenswrapper[4707]: E1127 16:05:23.195106 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 27 16:05:23 crc kubenswrapper[4707]: E1127 16:05:23.195192 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 27 16:05:23 crc kubenswrapper[4707]: I1127 16:05:23.235443 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:05:23 crc kubenswrapper[4707]: I1127 16:05:23.235507 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:05:23 crc kubenswrapper[4707]: I1127 16:05:23.235525 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:05:23 crc kubenswrapper[4707]: I1127 16:05:23.235548 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:05:23 crc kubenswrapper[4707]: I1127 16:05:23.235567 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:05:23Z","lastTransitionTime":"2025-11-27T16:05:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:05:23 crc kubenswrapper[4707]: I1127 16:05:23.339226 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:05:23 crc kubenswrapper[4707]: I1127 16:05:23.339302 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:05:23 crc kubenswrapper[4707]: I1127 16:05:23.339320 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:05:23 crc kubenswrapper[4707]: I1127 16:05:23.339353 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:05:23 crc kubenswrapper[4707]: I1127 16:05:23.339410 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:05:23Z","lastTransitionTime":"2025-11-27T16:05:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:05:23 crc kubenswrapper[4707]: I1127 16:05:23.442745 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:05:23 crc kubenswrapper[4707]: I1127 16:05:23.442807 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:05:23 crc kubenswrapper[4707]: I1127 16:05:23.442823 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:05:23 crc kubenswrapper[4707]: I1127 16:05:23.442848 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:05:23 crc kubenswrapper[4707]: I1127 16:05:23.442865 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:05:23Z","lastTransitionTime":"2025-11-27T16:05:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:05:23 crc kubenswrapper[4707]: I1127 16:05:23.545927 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:05:23 crc kubenswrapper[4707]: I1127 16:05:23.546666 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:05:23 crc kubenswrapper[4707]: I1127 16:05:23.546723 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:05:23 crc kubenswrapper[4707]: I1127 16:05:23.546761 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:05:23 crc kubenswrapper[4707]: I1127 16:05:23.546783 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:05:23Z","lastTransitionTime":"2025-11-27T16:05:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:05:23 crc kubenswrapper[4707]: I1127 16:05:23.651249 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:05:23 crc kubenswrapper[4707]: I1127 16:05:23.651318 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:05:23 crc kubenswrapper[4707]: I1127 16:05:23.651342 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:05:23 crc kubenswrapper[4707]: I1127 16:05:23.651411 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:05:23 crc kubenswrapper[4707]: I1127 16:05:23.651441 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:05:23Z","lastTransitionTime":"2025-11-27T16:05:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:05:23 crc kubenswrapper[4707]: I1127 16:05:23.755189 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:05:23 crc kubenswrapper[4707]: I1127 16:05:23.755264 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:05:23 crc kubenswrapper[4707]: I1127 16:05:23.755421 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:05:23 crc kubenswrapper[4707]: I1127 16:05:23.755470 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:05:23 crc kubenswrapper[4707]: I1127 16:05:23.755499 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:05:23Z","lastTransitionTime":"2025-11-27T16:05:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:05:23 crc kubenswrapper[4707]: I1127 16:05:23.858767 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:05:23 crc kubenswrapper[4707]: I1127 16:05:23.858825 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:05:23 crc kubenswrapper[4707]: I1127 16:05:23.858839 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:05:23 crc kubenswrapper[4707]: I1127 16:05:23.858860 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:05:23 crc kubenswrapper[4707]: I1127 16:05:23.858877 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:05:23Z","lastTransitionTime":"2025-11-27T16:05:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:05:23 crc kubenswrapper[4707]: I1127 16:05:23.962261 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:05:23 crc kubenswrapper[4707]: I1127 16:05:23.962318 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:05:23 crc kubenswrapper[4707]: I1127 16:05:23.962333 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:05:23 crc kubenswrapper[4707]: I1127 16:05:23.962355 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:05:23 crc kubenswrapper[4707]: I1127 16:05:23.962394 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:05:23Z","lastTransitionTime":"2025-11-27T16:05:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:05:24 crc kubenswrapper[4707]: I1127 16:05:24.065758 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:05:24 crc kubenswrapper[4707]: I1127 16:05:24.065827 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:05:24 crc kubenswrapper[4707]: I1127 16:05:24.065846 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:05:24 crc kubenswrapper[4707]: I1127 16:05:24.065907 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:05:24 crc kubenswrapper[4707]: I1127 16:05:24.065927 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:05:24Z","lastTransitionTime":"2025-11-27T16:05:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:05:24 crc kubenswrapper[4707]: I1127 16:05:24.170071 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:05:24 crc kubenswrapper[4707]: I1127 16:05:24.170135 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:05:24 crc kubenswrapper[4707]: I1127 16:05:24.170151 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:05:24 crc kubenswrapper[4707]: I1127 16:05:24.170178 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:05:24 crc kubenswrapper[4707]: I1127 16:05:24.170201 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:05:24Z","lastTransitionTime":"2025-11-27T16:05:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 16:05:24 crc kubenswrapper[4707]: I1127 16:05:24.217829 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 16:05:24 crc kubenswrapper[4707]: I1127 16:05:24.217892 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 16:05:24 crc kubenswrapper[4707]: I1127 16:05:24.217912 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 16:05:24 crc kubenswrapper[4707]: I1127 16:05:24.217974 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 16:05:24 crc kubenswrapper[4707]: I1127 16:05:24.217995 4707 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T16:05:24Z","lastTransitionTime":"2025-11-27T16:05:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 16:05:24 crc kubenswrapper[4707]: I1127 16:05:24.301400 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-version/cluster-version-operator-5c965bbfc6-jnwqp"] Nov 27 16:05:24 crc kubenswrapper[4707]: I1127 16:05:24.302170 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-jnwqp" Nov 27 16:05:24 crc kubenswrapper[4707]: I1127 16:05:24.305826 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Nov 27 16:05:24 crc kubenswrapper[4707]: I1127 16:05:24.306029 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Nov 27 16:05:24 crc kubenswrapper[4707]: I1127 16:05:24.306627 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Nov 27 16:05:24 crc kubenswrapper[4707]: I1127 16:05:24.309936 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Nov 27 16:05:24 crc kubenswrapper[4707]: I1127 16:05:24.354707 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/733145d3-e1b2-45fc-9507-d9e60b22c1ce-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-jnwqp\" (UID: \"733145d3-e1b2-45fc-9507-d9e60b22c1ce\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-jnwqp" Nov 27 16:05:24 crc kubenswrapper[4707]: I1127 16:05:24.354789 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/733145d3-e1b2-45fc-9507-d9e60b22c1ce-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-jnwqp\" (UID: \"733145d3-e1b2-45fc-9507-d9e60b22c1ce\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-jnwqp" Nov 27 16:05:24 crc kubenswrapper[4707]: I1127 16:05:24.354809 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: 
\"kubernetes.io/configmap/733145d3-e1b2-45fc-9507-d9e60b22c1ce-service-ca\") pod \"cluster-version-operator-5c965bbfc6-jnwqp\" (UID: \"733145d3-e1b2-45fc-9507-d9e60b22c1ce\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-jnwqp" Nov 27 16:05:24 crc kubenswrapper[4707]: I1127 16:05:24.354851 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/733145d3-e1b2-45fc-9507-d9e60b22c1ce-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-jnwqp\" (UID: \"733145d3-e1b2-45fc-9507-d9e60b22c1ce\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-jnwqp" Nov 27 16:05:24 crc kubenswrapper[4707]: I1127 16:05:24.354883 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/733145d3-e1b2-45fc-9507-d9e60b22c1ce-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-jnwqp\" (UID: \"733145d3-e1b2-45fc-9507-d9e60b22c1ce\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-jnwqp" Nov 27 16:05:24 crc kubenswrapper[4707]: I1127 16:05:24.374752 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-crc" podStartSLOduration=76.374730109 podStartE2EDuration="1m16.374730109s" podCreationTimestamp="2025-11-27 16:04:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-27 16:05:24.372139626 +0000 UTC m=+100.003588394" watchObservedRunningTime="2025-11-27 16:05:24.374730109 +0000 UTC m=+100.006178877" Nov 27 16:05:24 crc kubenswrapper[4707]: I1127 16:05:24.425302 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=79.425272428 podStartE2EDuration="1m19.425272428s" 
podCreationTimestamp="2025-11-27 16:04:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-27 16:05:24.401660374 +0000 UTC m=+100.033109202" watchObservedRunningTime="2025-11-27 16:05:24.425272428 +0000 UTC m=+100.056721196" Nov 27 16:05:24 crc kubenswrapper[4707]: I1127 16:05:24.455741 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/733145d3-e1b2-45fc-9507-d9e60b22c1ce-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-jnwqp\" (UID: \"733145d3-e1b2-45fc-9507-d9e60b22c1ce\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-jnwqp" Nov 27 16:05:24 crc kubenswrapper[4707]: I1127 16:05:24.455808 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/733145d3-e1b2-45fc-9507-d9e60b22c1ce-service-ca\") pod \"cluster-version-operator-5c965bbfc6-jnwqp\" (UID: \"733145d3-e1b2-45fc-9507-d9e60b22c1ce\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-jnwqp" Nov 27 16:05:24 crc kubenswrapper[4707]: I1127 16:05:24.455893 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/733145d3-e1b2-45fc-9507-d9e60b22c1ce-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-jnwqp\" (UID: \"733145d3-e1b2-45fc-9507-d9e60b22c1ce\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-jnwqp" Nov 27 16:05:24 crc kubenswrapper[4707]: I1127 16:05:24.455928 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/733145d3-e1b2-45fc-9507-d9e60b22c1ce-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-jnwqp\" (UID: \"733145d3-e1b2-45fc-9507-d9e60b22c1ce\") " 
pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-jnwqp" Nov 27 16:05:24 crc kubenswrapper[4707]: I1127 16:05:24.455970 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/733145d3-e1b2-45fc-9507-d9e60b22c1ce-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-jnwqp\" (UID: \"733145d3-e1b2-45fc-9507-d9e60b22c1ce\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-jnwqp" Nov 27 16:05:24 crc kubenswrapper[4707]: I1127 16:05:24.456101 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/733145d3-e1b2-45fc-9507-d9e60b22c1ce-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-jnwqp\" (UID: \"733145d3-e1b2-45fc-9507-d9e60b22c1ce\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-jnwqp" Nov 27 16:05:24 crc kubenswrapper[4707]: I1127 16:05:24.455923 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/733145d3-e1b2-45fc-9507-d9e60b22c1ce-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-jnwqp\" (UID: \"733145d3-e1b2-45fc-9507-d9e60b22c1ce\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-jnwqp" Nov 27 16:05:24 crc kubenswrapper[4707]: I1127 16:05:24.459485 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/733145d3-e1b2-45fc-9507-d9e60b22c1ce-service-ca\") pod \"cluster-version-operator-5c965bbfc6-jnwqp\" (UID: \"733145d3-e1b2-45fc-9507-d9e60b22c1ce\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-jnwqp" Nov 27 16:05:24 crc kubenswrapper[4707]: I1127 16:05:24.464462 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/733145d3-e1b2-45fc-9507-d9e60b22c1ce-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-jnwqp\" (UID: \"733145d3-e1b2-45fc-9507-d9e60b22c1ce\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-jnwqp" Nov 27 16:05:24 crc kubenswrapper[4707]: I1127 16:05:24.473652 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-daemon-c995m" podStartSLOduration=79.473629284 podStartE2EDuration="1m19.473629284s" podCreationTimestamp="2025-11-27 16:04:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-27 16:05:24.446687729 +0000 UTC m=+100.078136527" watchObservedRunningTime="2025-11-27 16:05:24.473629284 +0000 UTC m=+100.105078082" Nov 27 16:05:24 crc kubenswrapper[4707]: I1127 16:05:24.474449 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podStartSLOduration=78.474438844 podStartE2EDuration="1m18.474438844s" podCreationTimestamp="2025-11-27 16:04:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-27 16:05:24.473553182 +0000 UTC m=+100.105001950" watchObservedRunningTime="2025-11-27 16:05:24.474438844 +0000 UTC m=+100.105887642" Nov 27 16:05:24 crc kubenswrapper[4707]: I1127 16:05:24.497621 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/733145d3-e1b2-45fc-9507-d9e60b22c1ce-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-jnwqp\" (UID: \"733145d3-e1b2-45fc-9507-d9e60b22c1ce\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-jnwqp" Nov 27 16:05:24 crc kubenswrapper[4707]: I1127 16:05:24.520817 4707 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openshift-dns/node-resolver-bhmsc" podStartSLOduration=79.520788821 podStartE2EDuration="1m19.520788821s" podCreationTimestamp="2025-11-27 16:04:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-27 16:05:24.52074047 +0000 UTC m=+100.152189248" watchObservedRunningTime="2025-11-27 16:05:24.520788821 +0000 UTC m=+100.152237599" Nov 27 16:05:24 crc kubenswrapper[4707]: I1127 16:05:24.582551 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-9c4xg" podStartSLOduration=79.582523622 podStartE2EDuration="1m19.582523622s" podCreationTimestamp="2025-11-27 16:04:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-27 16:05:24.582198524 +0000 UTC m=+100.213647322" watchObservedRunningTime="2025-11-27 16:05:24.582523622 +0000 UTC m=+100.213972400" Nov 27 16:05:24 crc kubenswrapper[4707]: I1127 16:05:24.623416 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-jnwqp" Nov 27 16:05:24 crc kubenswrapper[4707]: W1127 16:05:24.648107 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod733145d3_e1b2_45fc_9507_d9e60b22c1ce.slice/crio-f2e196b5ed1e07f96da9b7cadab0f86e60e5fe711cdb76464006d02a148bcc21 WatchSource:0}: Error finding container f2e196b5ed1e07f96da9b7cadab0f86e60e5fe711cdb76464006d02a148bcc21: Status 404 returned error can't find the container with id f2e196b5ed1e07f96da9b7cadab0f86e60e5fe711cdb76464006d02a148bcc21 Nov 27 16:05:24 crc kubenswrapper[4707]: I1127 16:05:24.649722 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-5x488" podStartSLOduration=79.649683475 podStartE2EDuration="1m19.649683475s" podCreationTimestamp="2025-11-27 16:04:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-27 16:05:24.649259914 +0000 UTC m=+100.280708692" watchObservedRunningTime="2025-11-27 16:05:24.649683475 +0000 UTC m=+100.281132283" Nov 27 16:05:24 crc kubenswrapper[4707]: I1127 16:05:24.697635 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" podStartSLOduration=29.69761152 podStartE2EDuration="29.69761152s" podCreationTimestamp="2025-11-27 16:04:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-27 16:05:24.697419055 +0000 UTC m=+100.328867853" watchObservedRunningTime="2025-11-27 16:05:24.69761152 +0000 UTC m=+100.329060298" Nov 27 16:05:24 crc kubenswrapper[4707]: I1127 16:05:24.698304 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podStartSLOduration=49.698296597 podStartE2EDuration="49.698296597s" podCreationTimestamp="2025-11-27 16:04:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-27 16:05:24.679318635 +0000 UTC m=+100.310767413" watchObservedRunningTime="2025-11-27 16:05:24.698296597 +0000 UTC m=+100.329745375" Nov 27 16:05:24 crc kubenswrapper[4707]: I1127 16:05:24.776070 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-pdngz" podStartSLOduration=78.776033747 podStartE2EDuration="1m18.776033747s" podCreationTimestamp="2025-11-27 16:04:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-27 16:05:24.760891579 +0000 UTC m=+100.392340357" watchObservedRunningTime="2025-11-27 16:05:24.776033747 +0000 UTC m=+100.407482525" Nov 27 16:05:24 crc kubenswrapper[4707]: I1127 16:05:24.823103 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-js6mm" podStartSLOduration=79.823074231 podStartE2EDuration="1m19.823074231s" podCreationTimestamp="2025-11-27 16:04:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-27 16:05:24.822573918 +0000 UTC m=+100.454022696" watchObservedRunningTime="2025-11-27 16:05:24.823074231 +0000 UTC m=+100.454523019" Nov 27 16:05:24 crc kubenswrapper[4707]: I1127 16:05:24.970535 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-jnwqp" event={"ID":"733145d3-e1b2-45fc-9507-d9e60b22c1ce","Type":"ContainerStarted","Data":"0fc6a52ba1c47cc3907fdcdbfb030104c90fab0bc4defc2f97145fa42a1cb0bf"} Nov 27 16:05:24 crc kubenswrapper[4707]: I1127 16:05:24.970622 4707 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-jnwqp" event={"ID":"733145d3-e1b2-45fc-9507-d9e60b22c1ce","Type":"ContainerStarted","Data":"f2e196b5ed1e07f96da9b7cadab0f86e60e5fe711cdb76464006d02a148bcc21"} Nov 27 16:05:24 crc kubenswrapper[4707]: I1127 16:05:24.988654 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-jnwqp" podStartSLOduration=79.988623105 podStartE2EDuration="1m19.988623105s" podCreationTimestamp="2025-11-27 16:04:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-27 16:05:24.988573564 +0000 UTC m=+100.620022402" watchObservedRunningTime="2025-11-27 16:05:24.988623105 +0000 UTC m=+100.620071913" Nov 27 16:05:25 crc kubenswrapper[4707]: I1127 16:05:25.194715 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 27 16:05:25 crc kubenswrapper[4707]: I1127 16:05:25.194777 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 27 16:05:25 crc kubenswrapper[4707]: I1127 16:05:25.195243 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qcl5k" Nov 27 16:05:25 crc kubenswrapper[4707]: E1127 16:05:25.196080 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 27 16:05:25 crc kubenswrapper[4707]: I1127 16:05:25.196118 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 27 16:05:25 crc kubenswrapper[4707]: E1127 16:05:25.196247 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 27 16:05:25 crc kubenswrapper[4707]: E1127 16:05:25.196685 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 27 16:05:25 crc kubenswrapper[4707]: E1127 16:05:25.196755 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-qcl5k" podUID="7d382481-3c1e-49ed-8e27-265d495aa776" Nov 27 16:05:25 crc kubenswrapper[4707]: I1127 16:05:25.369645 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7d382481-3c1e-49ed-8e27-265d495aa776-metrics-certs\") pod \"network-metrics-daemon-qcl5k\" (UID: \"7d382481-3c1e-49ed-8e27-265d495aa776\") " pod="openshift-multus/network-metrics-daemon-qcl5k" Nov 27 16:05:25 crc kubenswrapper[4707]: E1127 16:05:25.369973 4707 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Nov 27 16:05:25 crc kubenswrapper[4707]: E1127 16:05:25.370328 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7d382481-3c1e-49ed-8e27-265d495aa776-metrics-certs podName:7d382481-3c1e-49ed-8e27-265d495aa776 nodeName:}" failed. No retries permitted until 2025-11-27 16:06:29.370294745 +0000 UTC m=+165.001743543 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/7d382481-3c1e-49ed-8e27-265d495aa776-metrics-certs") pod "network-metrics-daemon-qcl5k" (UID: "7d382481-3c1e-49ed-8e27-265d495aa776") : object "openshift-multus"/"metrics-daemon-secret" not registered Nov 27 16:05:27 crc kubenswrapper[4707]: I1127 16:05:27.194485 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 27 16:05:27 crc kubenswrapper[4707]: I1127 16:05:27.194555 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-qcl5k" Nov 27 16:05:27 crc kubenswrapper[4707]: E1127 16:05:27.194641 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 27 16:05:27 crc kubenswrapper[4707]: I1127 16:05:27.194491 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 27 16:05:27 crc kubenswrapper[4707]: I1127 16:05:27.194769 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 27 16:05:27 crc kubenswrapper[4707]: E1127 16:05:27.194892 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qcl5k" podUID="7d382481-3c1e-49ed-8e27-265d495aa776" Nov 27 16:05:27 crc kubenswrapper[4707]: E1127 16:05:27.195047 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 27 16:05:27 crc kubenswrapper[4707]: E1127 16:05:27.195197 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 27 16:05:29 crc kubenswrapper[4707]: I1127 16:05:29.194237 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 27 16:05:29 crc kubenswrapper[4707]: E1127 16:05:29.194459 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 27 16:05:29 crc kubenswrapper[4707]: I1127 16:05:29.194747 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 27 16:05:29 crc kubenswrapper[4707]: E1127 16:05:29.194836 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 27 16:05:29 crc kubenswrapper[4707]: I1127 16:05:29.196144 4707 scope.go:117] "RemoveContainer" containerID="eec0d7ba21b34dfd1c2f196273fe7ab74d1f1d2ce8c0a31e345dc47ce1e25d27" Nov 27 16:05:29 crc kubenswrapper[4707]: E1127 16:05:29.196415 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-xkmt7_openshift-ovn-kubernetes(55af9c67-18ce-46f1-a761-d11ce16f42d6)\"" pod="openshift-ovn-kubernetes/ovnkube-node-xkmt7" podUID="55af9c67-18ce-46f1-a761-d11ce16f42d6" Nov 27 16:05:29 crc kubenswrapper[4707]: I1127 16:05:29.196673 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qcl5k" Nov 27 16:05:29 crc kubenswrapper[4707]: E1127 16:05:29.196788 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qcl5k" podUID="7d382481-3c1e-49ed-8e27-265d495aa776" Nov 27 16:05:29 crc kubenswrapper[4707]: I1127 16:05:29.196977 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 27 16:05:29 crc kubenswrapper[4707]: E1127 16:05:29.197075 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 27 16:05:31 crc kubenswrapper[4707]: I1127 16:05:31.194294 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 27 16:05:31 crc kubenswrapper[4707]: I1127 16:05:31.194289 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qcl5k" Nov 27 16:05:31 crc kubenswrapper[4707]: I1127 16:05:31.194492 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 27 16:05:31 crc kubenswrapper[4707]: E1127 16:05:31.194666 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 27 16:05:31 crc kubenswrapper[4707]: I1127 16:05:31.194689 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 27 16:05:31 crc kubenswrapper[4707]: E1127 16:05:31.194842 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-qcl5k" podUID="7d382481-3c1e-49ed-8e27-265d495aa776" Nov 27 16:05:31 crc kubenswrapper[4707]: E1127 16:05:31.194978 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 27 16:05:31 crc kubenswrapper[4707]: E1127 16:05:31.195191 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 27 16:05:33 crc kubenswrapper[4707]: I1127 16:05:33.194612 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 27 16:05:33 crc kubenswrapper[4707]: I1127 16:05:33.194748 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 27 16:05:33 crc kubenswrapper[4707]: I1127 16:05:33.194803 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qcl5k" Nov 27 16:05:33 crc kubenswrapper[4707]: E1127 16:05:33.194891 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 27 16:05:33 crc kubenswrapper[4707]: E1127 16:05:33.195018 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qcl5k" podUID="7d382481-3c1e-49ed-8e27-265d495aa776" Nov 27 16:05:33 crc kubenswrapper[4707]: I1127 16:05:33.195228 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 27 16:05:33 crc kubenswrapper[4707]: E1127 16:05:33.195402 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 27 16:05:33 crc kubenswrapper[4707]: E1127 16:05:33.195615 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 27 16:05:35 crc kubenswrapper[4707]: I1127 16:05:35.195078 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 27 16:05:35 crc kubenswrapper[4707]: I1127 16:05:35.195210 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 27 16:05:35 crc kubenswrapper[4707]: I1127 16:05:35.195308 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 27 16:05:35 crc kubenswrapper[4707]: E1127 16:05:35.197044 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 27 16:05:35 crc kubenswrapper[4707]: I1127 16:05:35.197087 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qcl5k" Nov 27 16:05:35 crc kubenswrapper[4707]: E1127 16:05:35.197232 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 27 16:05:35 crc kubenswrapper[4707]: E1127 16:05:35.197478 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qcl5k" podUID="7d382481-3c1e-49ed-8e27-265d495aa776" Nov 27 16:05:35 crc kubenswrapper[4707]: E1127 16:05:35.197743 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 27 16:05:37 crc kubenswrapper[4707]: I1127 16:05:37.194162 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qcl5k" Nov 27 16:05:37 crc kubenswrapper[4707]: I1127 16:05:37.194225 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 27 16:05:37 crc kubenswrapper[4707]: I1127 16:05:37.194228 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 27 16:05:37 crc kubenswrapper[4707]: E1127 16:05:37.194330 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qcl5k" podUID="7d382481-3c1e-49ed-8e27-265d495aa776" Nov 27 16:05:37 crc kubenswrapper[4707]: I1127 16:05:37.194414 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 27 16:05:37 crc kubenswrapper[4707]: E1127 16:05:37.194554 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 27 16:05:37 crc kubenswrapper[4707]: E1127 16:05:37.194716 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 27 16:05:37 crc kubenswrapper[4707]: E1127 16:05:37.194870 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 27 16:05:39 crc kubenswrapper[4707]: I1127 16:05:39.194635 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 27 16:05:39 crc kubenswrapper[4707]: I1127 16:05:39.194696 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 27 16:05:39 crc kubenswrapper[4707]: I1127 16:05:39.194661 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 27 16:05:39 crc kubenswrapper[4707]: E1127 16:05:39.194884 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 27 16:05:39 crc kubenswrapper[4707]: I1127 16:05:39.194935 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qcl5k" Nov 27 16:05:39 crc kubenswrapper[4707]: E1127 16:05:39.195093 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 27 16:05:39 crc kubenswrapper[4707]: E1127 16:05:39.195188 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 27 16:05:39 crc kubenswrapper[4707]: E1127 16:05:39.195314 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qcl5k" podUID="7d382481-3c1e-49ed-8e27-265d495aa776" Nov 27 16:05:41 crc kubenswrapper[4707]: I1127 16:05:41.032514 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-js6mm_9ca48c08-f39d-41a2-847a-c893a2111492/kube-multus/1.log" Nov 27 16:05:41 crc kubenswrapper[4707]: I1127 16:05:41.033041 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-js6mm_9ca48c08-f39d-41a2-847a-c893a2111492/kube-multus/0.log" Nov 27 16:05:41 crc kubenswrapper[4707]: I1127 16:05:41.033085 4707 generic.go:334] "Generic (PLEG): container finished" podID="9ca48c08-f39d-41a2-847a-c893a2111492" containerID="094325a9e2d642c065e7c06cf63ba0fcad78c5220368b42504c2e87735d4a542" exitCode=1 Nov 27 16:05:41 crc kubenswrapper[4707]: I1127 16:05:41.033116 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-js6mm" event={"ID":"9ca48c08-f39d-41a2-847a-c893a2111492","Type":"ContainerDied","Data":"094325a9e2d642c065e7c06cf63ba0fcad78c5220368b42504c2e87735d4a542"} Nov 27 16:05:41 crc kubenswrapper[4707]: I1127 16:05:41.033151 4707 scope.go:117] "RemoveContainer" containerID="be6a704ee152614d36d4831778b016c009b0aab2d6cf1c2ec9767f8ecdeb7af6" Nov 27 16:05:41 crc kubenswrapper[4707]: I1127 16:05:41.033927 4707 scope.go:117] "RemoveContainer" containerID="094325a9e2d642c065e7c06cf63ba0fcad78c5220368b42504c2e87735d4a542" Nov 27 16:05:41 crc kubenswrapper[4707]: E1127 16:05:41.034307 4707 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-multus pod=multus-js6mm_openshift-multus(9ca48c08-f39d-41a2-847a-c893a2111492)\"" pod="openshift-multus/multus-js6mm" podUID="9ca48c08-f39d-41a2-847a-c893a2111492" Nov 27 16:05:41 crc kubenswrapper[4707]: I1127 16:05:41.194678 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 27 16:05:41 crc kubenswrapper[4707]: E1127 16:05:41.194895 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 27 16:05:41 crc kubenswrapper[4707]: I1127 16:05:41.194687 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 27 16:05:41 crc kubenswrapper[4707]: E1127 16:05:41.195018 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 27 16:05:41 crc kubenswrapper[4707]: I1127 16:05:41.195018 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-qcl5k" Nov 27 16:05:41 crc kubenswrapper[4707]: I1127 16:05:41.195724 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 27 16:05:41 crc kubenswrapper[4707]: E1127 16:05:41.195796 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qcl5k" podUID="7d382481-3c1e-49ed-8e27-265d495aa776" Nov 27 16:05:41 crc kubenswrapper[4707]: E1127 16:05:41.195982 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 27 16:05:41 crc kubenswrapper[4707]: I1127 16:05:41.196425 4707 scope.go:117] "RemoveContainer" containerID="eec0d7ba21b34dfd1c2f196273fe7ab74d1f1d2ce8c0a31e345dc47ce1e25d27" Nov 27 16:05:41 crc kubenswrapper[4707]: E1127 16:05:41.196719 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-xkmt7_openshift-ovn-kubernetes(55af9c67-18ce-46f1-a761-d11ce16f42d6)\"" pod="openshift-ovn-kubernetes/ovnkube-node-xkmt7" podUID="55af9c67-18ce-46f1-a761-d11ce16f42d6" Nov 27 16:05:42 crc kubenswrapper[4707]: I1127 16:05:42.038814 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-js6mm_9ca48c08-f39d-41a2-847a-c893a2111492/kube-multus/1.log" Nov 27 16:05:43 crc kubenswrapper[4707]: I1127 16:05:43.194201 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qcl5k" Nov 27 16:05:43 crc kubenswrapper[4707]: I1127 16:05:43.194325 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 27 16:05:43 crc kubenswrapper[4707]: I1127 16:05:43.194422 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 27 16:05:43 crc kubenswrapper[4707]: E1127 16:05:43.194624 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-qcl5k" podUID="7d382481-3c1e-49ed-8e27-265d495aa776" Nov 27 16:05:43 crc kubenswrapper[4707]: I1127 16:05:43.194670 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 27 16:05:43 crc kubenswrapper[4707]: E1127 16:05:43.194810 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 27 16:05:43 crc kubenswrapper[4707]: E1127 16:05:43.195965 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 27 16:05:43 crc kubenswrapper[4707]: E1127 16:05:43.196156 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 27 16:05:45 crc kubenswrapper[4707]: E1127 16:05:45.188770 4707 kubelet_node_status.go:497] "Node not becoming ready in time after startup" Nov 27 16:05:45 crc kubenswrapper[4707]: I1127 16:05:45.197598 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 27 16:05:45 crc kubenswrapper[4707]: I1127 16:05:45.197674 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 27 16:05:45 crc kubenswrapper[4707]: I1127 16:05:45.197634 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 27 16:05:45 crc kubenswrapper[4707]: I1127 16:05:45.197848 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qcl5k" Nov 27 16:05:45 crc kubenswrapper[4707]: E1127 16:05:45.197997 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 27 16:05:45 crc kubenswrapper[4707]: E1127 16:05:45.198129 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 27 16:05:45 crc kubenswrapper[4707]: E1127 16:05:45.198422 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 27 16:05:45 crc kubenswrapper[4707]: E1127 16:05:45.198545 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qcl5k" podUID="7d382481-3c1e-49ed-8e27-265d495aa776" Nov 27 16:05:45 crc kubenswrapper[4707]: E1127 16:05:45.346667 4707 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Nov 27 16:05:47 crc kubenswrapper[4707]: I1127 16:05:47.194974 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qcl5k" Nov 27 16:05:47 crc kubenswrapper[4707]: I1127 16:05:47.195086 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 27 16:05:47 crc kubenswrapper[4707]: I1127 16:05:47.195105 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 27 16:05:47 crc kubenswrapper[4707]: E1127 16:05:47.195213 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-qcl5k" podUID="7d382481-3c1e-49ed-8e27-265d495aa776" Nov 27 16:05:47 crc kubenswrapper[4707]: E1127 16:05:47.195498 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 27 16:05:47 crc kubenswrapper[4707]: I1127 16:05:47.195701 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 27 16:05:47 crc kubenswrapper[4707]: E1127 16:05:47.195825 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 27 16:05:47 crc kubenswrapper[4707]: E1127 16:05:47.195989 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 27 16:05:49 crc kubenswrapper[4707]: I1127 16:05:49.194422 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 27 16:05:49 crc kubenswrapper[4707]: E1127 16:05:49.194689 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 27 16:05:49 crc kubenswrapper[4707]: I1127 16:05:49.194814 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 27 16:05:49 crc kubenswrapper[4707]: I1127 16:05:49.194814 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qcl5k" Nov 27 16:05:49 crc kubenswrapper[4707]: E1127 16:05:49.195009 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 27 16:05:49 crc kubenswrapper[4707]: E1127 16:05:49.195154 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-qcl5k" podUID="7d382481-3c1e-49ed-8e27-265d495aa776" Nov 27 16:05:49 crc kubenswrapper[4707]: I1127 16:05:49.194852 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 27 16:05:49 crc kubenswrapper[4707]: E1127 16:05:49.195277 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 27 16:05:50 crc kubenswrapper[4707]: E1127 16:05:50.348602 4707 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Nov 27 16:05:51 crc kubenswrapper[4707]: I1127 16:05:51.194430 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qcl5k" Nov 27 16:05:51 crc kubenswrapper[4707]: I1127 16:05:51.194573 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 27 16:05:51 crc kubenswrapper[4707]: I1127 16:05:51.194477 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 27 16:05:51 crc kubenswrapper[4707]: I1127 16:05:51.194727 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 27 16:05:51 crc kubenswrapper[4707]: E1127 16:05:51.194727 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qcl5k" podUID="7d382481-3c1e-49ed-8e27-265d495aa776" Nov 27 16:05:51 crc kubenswrapper[4707]: E1127 16:05:51.194879 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 27 16:05:51 crc kubenswrapper[4707]: E1127 16:05:51.195005 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 27 16:05:51 crc kubenswrapper[4707]: E1127 16:05:51.195106 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 27 16:05:52 crc kubenswrapper[4707]: I1127 16:05:52.196116 4707 scope.go:117] "RemoveContainer" containerID="094325a9e2d642c065e7c06cf63ba0fcad78c5220368b42504c2e87735d4a542" Nov 27 16:05:52 crc kubenswrapper[4707]: I1127 16:05:52.196945 4707 scope.go:117] "RemoveContainer" containerID="eec0d7ba21b34dfd1c2f196273fe7ab74d1f1d2ce8c0a31e345dc47ce1e25d27" Nov 27 16:05:53 crc kubenswrapper[4707]: I1127 16:05:53.089067 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-js6mm_9ca48c08-f39d-41a2-847a-c893a2111492/kube-multus/1.log" Nov 27 16:05:53 crc kubenswrapper[4707]: I1127 16:05:53.089731 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-js6mm" event={"ID":"9ca48c08-f39d-41a2-847a-c893a2111492","Type":"ContainerStarted","Data":"3296b907d541dc79acbf2d75abe4ced1851608091496a03fc4a85f1879a836c6"} Nov 27 16:05:53 crc kubenswrapper[4707]: I1127 16:05:53.093465 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xkmt7_55af9c67-18ce-46f1-a761-d11ce16f42d6/ovnkube-controller/3.log" Nov 27 16:05:53 crc kubenswrapper[4707]: I1127 16:05:53.097316 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xkmt7" event={"ID":"55af9c67-18ce-46f1-a761-d11ce16f42d6","Type":"ContainerStarted","Data":"333494a77c14f7de080c8b9a6d70e630af6761c0406b730640c08ffc7785c54b"} Nov 27 16:05:53 crc kubenswrapper[4707]: I1127 16:05:53.098228 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-xkmt7" Nov 27 16:05:53 crc kubenswrapper[4707]: I1127 16:05:53.192157 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-xkmt7" podStartSLOduration=108.192125147 podStartE2EDuration="1m48.192125147s" 
podCreationTimestamp="2025-11-27 16:04:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-27 16:05:53.150503536 +0000 UTC m=+128.781952344" watchObservedRunningTime="2025-11-27 16:05:53.192125147 +0000 UTC m=+128.823573945" Nov 27 16:05:53 crc kubenswrapper[4707]: I1127 16:05:53.193253 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-qcl5k"] Nov 27 16:05:53 crc kubenswrapper[4707]: I1127 16:05:53.193441 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qcl5k" Nov 27 16:05:53 crc kubenswrapper[4707]: E1127 16:05:53.193603 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qcl5k" podUID="7d382481-3c1e-49ed-8e27-265d495aa776" Nov 27 16:05:53 crc kubenswrapper[4707]: I1127 16:05:53.194132 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 27 16:05:53 crc kubenswrapper[4707]: I1127 16:05:53.194272 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 27 16:05:53 crc kubenswrapper[4707]: I1127 16:05:53.194163 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 27 16:05:53 crc kubenswrapper[4707]: E1127 16:05:53.194670 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 27 16:05:53 crc kubenswrapper[4707]: E1127 16:05:53.195229 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 27 16:05:53 crc kubenswrapper[4707]: E1127 16:05:53.195475 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 27 16:05:55 crc kubenswrapper[4707]: I1127 16:05:55.194362 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 27 16:05:55 crc kubenswrapper[4707]: I1127 16:05:55.194362 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 27 16:05:55 crc kubenswrapper[4707]: I1127 16:05:55.194487 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 27 16:05:55 crc kubenswrapper[4707]: I1127 16:05:55.194533 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qcl5k" Nov 27 16:05:55 crc kubenswrapper[4707]: E1127 16:05:55.196183 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 27 16:05:55 crc kubenswrapper[4707]: E1127 16:05:55.196334 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 27 16:05:55 crc kubenswrapper[4707]: E1127 16:05:55.196521 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 27 16:05:55 crc kubenswrapper[4707]: E1127 16:05:55.196673 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qcl5k" podUID="7d382481-3c1e-49ed-8e27-265d495aa776" Nov 27 16:05:55 crc kubenswrapper[4707]: E1127 16:05:55.349277 4707 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Nov 27 16:05:57 crc kubenswrapper[4707]: I1127 16:05:57.195032 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 27 16:05:57 crc kubenswrapper[4707]: I1127 16:05:57.195132 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 27 16:05:57 crc kubenswrapper[4707]: E1127 16:05:57.195347 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 27 16:05:57 crc kubenswrapper[4707]: I1127 16:05:57.195064 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 27 16:05:57 crc kubenswrapper[4707]: E1127 16:05:57.195730 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 27 16:05:57 crc kubenswrapper[4707]: E1127 16:05:57.195833 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 27 16:05:57 crc kubenswrapper[4707]: I1127 16:05:57.196083 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qcl5k" Nov 27 16:05:57 crc kubenswrapper[4707]: E1127 16:05:57.196208 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qcl5k" podUID="7d382481-3c1e-49ed-8e27-265d495aa776" Nov 27 16:05:59 crc kubenswrapper[4707]: I1127 16:05:59.194604 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 27 16:05:59 crc kubenswrapper[4707]: E1127 16:05:59.194782 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 27 16:05:59 crc kubenswrapper[4707]: I1127 16:05:59.195000 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 27 16:05:59 crc kubenswrapper[4707]: I1127 16:05:59.195088 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qcl5k" Nov 27 16:05:59 crc kubenswrapper[4707]: I1127 16:05:59.195006 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 27 16:05:59 crc kubenswrapper[4707]: E1127 16:05:59.195272 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 27 16:05:59 crc kubenswrapper[4707]: E1127 16:05:59.195446 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 27 16:05:59 crc kubenswrapper[4707]: E1127 16:05:59.195604 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qcl5k" podUID="7d382481-3c1e-49ed-8e27-265d495aa776" Nov 27 16:06:01 crc kubenswrapper[4707]: I1127 16:06:01.194477 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 27 16:06:01 crc kubenswrapper[4707]: I1127 16:06:01.194477 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 27 16:06:01 crc kubenswrapper[4707]: I1127 16:06:01.194547 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 27 16:06:01 crc kubenswrapper[4707]: I1127 16:06:01.194559 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-qcl5k" Nov 27 16:06:01 crc kubenswrapper[4707]: I1127 16:06:01.201885 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Nov 27 16:06:01 crc kubenswrapper[4707]: I1127 16:06:01.202008 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Nov 27 16:06:01 crc kubenswrapper[4707]: I1127 16:06:01.202274 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Nov 27 16:06:01 crc kubenswrapper[4707]: I1127 16:06:01.202494 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Nov 27 16:06:01 crc kubenswrapper[4707]: I1127 16:06:01.203022 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Nov 27 16:06:01 crc kubenswrapper[4707]: I1127 16:06:01.203484 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Nov 27 16:06:04 crc kubenswrapper[4707]: I1127 16:06:04.912805 4707 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeReady" Nov 27 16:06:04 crc kubenswrapper[4707]: I1127 16:06:04.957160 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-thf5c"] Nov 27 16:06:04 crc kubenswrapper[4707]: I1127 16:06:04.957913 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-thf5c" Nov 27 16:06:04 crc kubenswrapper[4707]: I1127 16:06:04.959558 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-58897d9998-qzltz"] Nov 27 16:06:04 crc kubenswrapper[4707]: I1127 16:06:04.960279 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-qzltz" Nov 27 16:06:04 crc kubenswrapper[4707]: I1127 16:06:04.961958 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-7954f5f757-zdxsk"] Nov 27 16:06:04 crc kubenswrapper[4707]: I1127 16:06:04.962389 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-zdxsk" Nov 27 16:06:04 crc kubenswrapper[4707]: I1127 16:06:04.969121 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Nov 27 16:06:04 crc kubenswrapper[4707]: I1127 16:06:04.969634 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Nov 27 16:06:04 crc kubenswrapper[4707]: I1127 16:06:04.969859 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Nov 27 16:06:04 crc kubenswrapper[4707]: I1127 16:06:04.969981 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Nov 27 16:06:04 crc kubenswrapper[4707]: I1127 16:06:04.970098 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Nov 27 16:06:04 crc kubenswrapper[4707]: I1127 16:06:04.970285 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Nov 27 16:06:04 crc kubenswrapper[4707]: I1127 16:06:04.970356 4707 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Nov 27 16:06:04 crc kubenswrapper[4707]: I1127 16:06:04.971439 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-gg89j"] Nov 27 16:06:04 crc kubenswrapper[4707]: I1127 16:06:04.972353 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-4h99l"] Nov 27 16:06:04 crc kubenswrapper[4707]: I1127 16:06:04.972664 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-gg89j" Nov 27 16:06:04 crc kubenswrapper[4707]: I1127 16:06:04.972909 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-machine-approver/machine-approver-56656f9798-xfj8l"] Nov 27 16:06:04 crc kubenswrapper[4707]: I1127 16:06:04.973712 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-4sj7l"] Nov 27 16:06:04 crc kubenswrapper[4707]: I1127 16:06:04.973890 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-4h99l" Nov 27 16:06:04 crc kubenswrapper[4707]: I1127 16:06:04.974700 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-4sj7l" Nov 27 16:06:04 crc kubenswrapper[4707]: I1127 16:06:04.975073 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-jqntr"] Nov 27 16:06:04 crc kubenswrapper[4707]: I1127 16:06:04.975149 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-xfj8l" Nov 27 16:06:04 crc kubenswrapper[4707]: I1127 16:06:04.975849 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-jqntr" Nov 27 16:06:04 crc kubenswrapper[4707]: I1127 16:06:04.982272 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Nov 27 16:06:04 crc kubenswrapper[4707]: I1127 16:06:04.984819 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-hg2x5"] Nov 27 16:06:04 crc kubenswrapper[4707]: I1127 16:06:04.985550 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-hg2x5" Nov 27 16:06:04 crc kubenswrapper[4707]: I1127 16:06:04.986563 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-jxgm7"] Nov 27 16:06:04 crc kubenswrapper[4707]: I1127 16:06:04.987422 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-jxgm7" Nov 27 16:06:04 crc kubenswrapper[4707]: I1127 16:06:04.991602 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Nov 27 16:06:04 crc kubenswrapper[4707]: I1127 16:06:04.991932 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Nov 27 16:06:04 crc kubenswrapper[4707]: I1127 16:06:04.992222 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Nov 27 16:06:04 crc kubenswrapper[4707]: I1127 16:06:04.992506 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Nov 27 16:06:04 crc kubenswrapper[4707]: I1127 16:06:04.993808 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Nov 27 16:06:04 crc kubenswrapper[4707]: I1127 16:06:04.993920 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Nov 27 16:06:04 crc kubenswrapper[4707]: I1127 16:06:04.993824 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.003038 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.004651 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.006142 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Nov 27 16:06:05 crc 
kubenswrapper[4707]: I1127 16:06:05.006464 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.024738 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.026527 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.026905 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.026925 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.026910 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.027060 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-ghzc4"] Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.027116 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.027173 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.027334 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.027546 4707 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.027656 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.027756 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-ghzc4" Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.027430 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.027680 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.028132 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.028263 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.028172 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.028418 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.028470 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.028767 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 
16:06:05.028856 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.028906 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.028806 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.029006 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.029087 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.029038 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.029362 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.029490 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.029577 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-kl4m7"] Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.030171 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.030233 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-kl4m7" Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.030296 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.030601 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.030682 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.030817 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.030873 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.030941 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.030994 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.031059 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.031071 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.031290 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Nov 27 16:06:05 crc 
kubenswrapper[4707]: I1127 16:06:05.031407 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.031444 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.031457 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.033740 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.036170 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.036290 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.039052 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.044186 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.046840 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.047577 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.047787 4707 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-dns-operator"/"metrics-tls" Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.049032 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.049158 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.053451 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-nhqwm"] Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.054103 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-pfqjd"] Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.054337 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-nhqwm" Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.054497 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-7k4zd"] Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.054766 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-nhz7q"] Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.055000 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-pfqjd" Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.055129 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-nhz7q" Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.055602 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-7k4zd" Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.056164 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-kwvst"] Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.062120 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-kwvst" Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.073263 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.073632 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.073922 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.074123 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.074329 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.075077 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-5444994796-tggrz"] Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.076482 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-rck5n"] Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.077570 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/router-default-5444994796-tggrz" Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.077854 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.078293 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.079484 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.079704 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.089323 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.089336 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.103539 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-dlchm"] Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.104094 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.104310 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-h746w"] Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.104514 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Nov 27 
16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.104605 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.104680 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.104705 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.104720 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-dmdnz"] Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.104839 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-h746w" Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.104967 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-rck5n" Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.105046 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.105240 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-dlchm" Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.105263 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.105390 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.105790 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.105793 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-dmdnz" Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.110454 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.111299 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.111497 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.111890 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.112078 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.112531 4707 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.112826 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.113012 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.113119 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.113849 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.114432 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.115821 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.115993 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-cj6jg"] Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.116788 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-cj6jg" Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.116931 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.117123 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.117296 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.119660 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-6t7cw"] Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.120179 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-6t7cw" Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.126339 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.126910 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/b735e79c-0093-4887-b27a-6e333d9d80a5-encryption-config\") pod \"apiserver-7bbb656c7d-hg2x5\" (UID: \"b735e79c-0093-4887-b27a-6e333d9d80a5\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-hg2x5" Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.127015 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8a947df3-154e-488f-9e1c-4a41cc94553c-serving-cert\") pod 
\"openshift-config-operator-7777fb866f-jqntr\" (UID: \"8a947df3-154e-488f-9e1c-4a41cc94553c\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-jqntr" Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.127131 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a46480b6-6c0e-4502-81b7-e1c461ba7fa4-trusted-ca\") pod \"console-operator-58897d9998-qzltz\" (UID: \"a46480b6-6c0e-4502-81b7-e1c461ba7fa4\") " pod="openshift-console-operator/console-operator-58897d9998-qzltz" Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.127353 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-78cn9\" (UniqueName: \"kubernetes.io/projected/8a947df3-154e-488f-9e1c-4a41cc94553c-kube-api-access-78cn9\") pod \"openshift-config-operator-7777fb866f-jqntr\" (UID: \"8a947df3-154e-488f-9e1c-4a41cc94553c\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-jqntr" Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.127509 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/b735e79c-0093-4887-b27a-6e333d9d80a5-audit-policies\") pod \"apiserver-7bbb656c7d-hg2x5\" (UID: \"b735e79c-0093-4887-b27a-6e333d9d80a5\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-hg2x5" Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.127591 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5zgc6\" (UniqueName: \"kubernetes.io/projected/3fdf4f4a-233e-409a-8cd7-dcd6efd5b2cf-kube-api-access-5zgc6\") pod \"oauth-openshift-558db77b4-jxgm7\" (UID: \"3fdf4f4a-233e-409a-8cd7-dcd6efd5b2cf\") " pod="openshift-authentication/oauth-openshift-558db77b4-jxgm7" Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.127666 4707 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/3fdf4f4a-233e-409a-8cd7-dcd6efd5b2cf-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-jxgm7\" (UID: \"3fdf4f4a-233e-409a-8cd7-dcd6efd5b2cf\") " pod="openshift-authentication/oauth-openshift-558db77b4-jxgm7" Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.127740 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nrzcj\" (UniqueName: \"kubernetes.io/projected/30208d57-8abd-4956-92a1-1b1aa21b754a-kube-api-access-nrzcj\") pod \"controller-manager-879f6c89f-gg89j\" (UID: \"30208d57-8abd-4956-92a1-1b1aa21b754a\") " pod="openshift-controller-manager/controller-manager-879f6c89f-gg89j" Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.127815 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3fdf4f4a-233e-409a-8cd7-dcd6efd5b2cf-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-jxgm7\" (UID: \"3fdf4f4a-233e-409a-8cd7-dcd6efd5b2cf\") " pod="openshift-authentication/oauth-openshift-558db77b4-jxgm7" Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.127893 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lldmp\" (UniqueName: \"kubernetes.io/projected/dad82366-6967-4fba-9d3d-763ba28e9a73-kube-api-access-lldmp\") pod \"machine-approver-56656f9798-xfj8l\" (UID: \"dad82366-6967-4fba-9d3d-763ba28e9a73\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-xfj8l" Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.127963 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-brk88\" (UniqueName: 
\"kubernetes.io/projected/ecc19675-2c51-4a55-b870-7906093e3de2-kube-api-access-brk88\") pod \"cluster-samples-operator-665b6dd947-4sj7l\" (UID: \"ecc19675-2c51-4a55-b870-7906093e3de2\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-4sj7l" Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.128038 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8f007d8e-df31-4bfd-8879-54a84ed4b62d-serving-cert\") pod \"authentication-operator-69f744f599-kl4m7\" (UID: \"8f007d8e-df31-4bfd-8879-54a84ed4b62d\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-kl4m7" Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.128112 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/b735e79c-0093-4887-b27a-6e333d9d80a5-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-hg2x5\" (UID: \"b735e79c-0093-4887-b27a-6e333d9d80a5\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-hg2x5" Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.128189 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h2lh4\" (UniqueName: \"kubernetes.io/projected/a46480b6-6c0e-4502-81b7-e1c461ba7fa4-kube-api-access-h2lh4\") pod \"console-operator-58897d9998-qzltz\" (UID: \"a46480b6-6c0e-4502-81b7-e1c461ba7fa4\") " pod="openshift-console-operator/console-operator-58897d9998-qzltz" Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.128411 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/3fdf4f4a-233e-409a-8cd7-dcd6efd5b2cf-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-jxgm7\" (UID: \"3fdf4f4a-233e-409a-8cd7-dcd6efd5b2cf\") " 
pod="openshift-authentication/oauth-openshift-558db77b4-jxgm7" Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.128541 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8f007d8e-df31-4bfd-8879-54a84ed4b62d-service-ca-bundle\") pod \"authentication-operator-69f744f599-kl4m7\" (UID: \"8f007d8e-df31-4bfd-8879-54a84ed4b62d\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-kl4m7" Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.128614 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/b735e79c-0093-4887-b27a-6e333d9d80a5-audit-dir\") pod \"apiserver-7bbb656c7d-hg2x5\" (UID: \"b735e79c-0093-4887-b27a-6e333d9d80a5\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-hg2x5" Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.128687 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/dad82366-6967-4fba-9d3d-763ba28e9a73-auth-proxy-config\") pod \"machine-approver-56656f9798-xfj8l\" (UID: \"dad82366-6967-4fba-9d3d-763ba28e9a73\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-xfj8l" Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.128761 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/3fdf4f4a-233e-409a-8cd7-dcd6efd5b2cf-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-jxgm7\" (UID: \"3fdf4f4a-233e-409a-8cd7-dcd6efd5b2cf\") " pod="openshift-authentication/oauth-openshift-558db77b4-jxgm7" Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.128829 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"kube-api-access-tx9hq\" (UniqueName: \"kubernetes.io/projected/1a3a3427-44f4-4518-861e-f11f5cb76d90-kube-api-access-tx9hq\") pod \"dns-operator-744455d44c-ghzc4\" (UID: \"1a3a3427-44f4-4518-861e-f11f5cb76d90\") " pod="openshift-dns-operator/dns-operator-744455d44c-ghzc4" Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.128913 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/30208d57-8abd-4956-92a1-1b1aa21b754a-serving-cert\") pod \"controller-manager-879f6c89f-gg89j\" (UID: \"30208d57-8abd-4956-92a1-1b1aa21b754a\") " pod="openshift-controller-manager/controller-manager-879f6c89f-gg89j" Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.129007 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/8a947df3-154e-488f-9e1c-4a41cc94553c-available-featuregates\") pod \"openshift-config-operator-7777fb866f-jqntr\" (UID: \"8a947df3-154e-488f-9e1c-4a41cc94553c\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-jqntr" Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.129152 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8f007d8e-df31-4bfd-8879-54a84ed4b62d-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-kl4m7\" (UID: \"8f007d8e-df31-4bfd-8879-54a84ed4b62d\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-kl4m7" Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.129198 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/3fdf4f4a-233e-409a-8cd7-dcd6efd5b2cf-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-jxgm7\" (UID: 
\"3fdf4f4a-233e-409a-8cd7-dcd6efd5b2cf\") " pod="openshift-authentication/oauth-openshift-558db77b4-jxgm7" Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.129229 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/3fdf4f4a-233e-409a-8cd7-dcd6efd5b2cf-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-jxgm7\" (UID: \"3fdf4f4a-233e-409a-8cd7-dcd6efd5b2cf\") " pod="openshift-authentication/oauth-openshift-558db77b4-jxgm7" Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.129252 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xlb9g\" (UniqueName: \"kubernetes.io/projected/8f007d8e-df31-4bfd-8879-54a84ed4b62d-kube-api-access-xlb9g\") pod \"authentication-operator-69f744f599-kl4m7\" (UID: \"8f007d8e-df31-4bfd-8879-54a84ed4b62d\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-kl4m7" Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.129275 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b735e79c-0093-4887-b27a-6e333d9d80a5-serving-cert\") pod \"apiserver-7bbb656c7d-hg2x5\" (UID: \"b735e79c-0093-4887-b27a-6e333d9d80a5\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-hg2x5" Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.129331 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/3fdf4f4a-233e-409a-8cd7-dcd6efd5b2cf-audit-policies\") pod \"oauth-openshift-558db77b4-jxgm7\" (UID: \"3fdf4f4a-233e-409a-8cd7-dcd6efd5b2cf\") " pod="openshift-authentication/oauth-openshift-558db77b4-jxgm7" Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.129354 4707 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qkt6m\" (UniqueName: \"kubernetes.io/projected/b43ea736-a4b5-473d-a3a2-3d779a856a86-kube-api-access-qkt6m\") pod \"route-controller-manager-6576b87f9c-4h99l\" (UID: \"b43ea736-a4b5-473d-a3a2-3d779a856a86\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-4h99l" Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.129402 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/3fdf4f4a-233e-409a-8cd7-dcd6efd5b2cf-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-jxgm7\" (UID: \"3fdf4f4a-233e-409a-8cd7-dcd6efd5b2cf\") " pod="openshift-authentication/oauth-openshift-558db77b4-jxgm7" Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.129455 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b43ea736-a4b5-473d-a3a2-3d779a856a86-config\") pod \"route-controller-manager-6576b87f9c-4h99l\" (UID: \"b43ea736-a4b5-473d-a3a2-3d779a856a86\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-4h99l" Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.129470 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b43ea736-a4b5-473d-a3a2-3d779a856a86-client-ca\") pod \"route-controller-manager-6576b87f9c-4h99l\" (UID: \"b43ea736-a4b5-473d-a3a2-3d779a856a86\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-4h99l" Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.129493 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b43ea736-a4b5-473d-a3a2-3d779a856a86-serving-cert\") 
pod \"route-controller-manager-6576b87f9c-4h99l\" (UID: \"b43ea736-a4b5-473d-a3a2-3d779a856a86\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-4h99l" Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.129532 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s5tvx\" (UniqueName: \"kubernetes.io/projected/78b8999d-9535-4584-baa0-5fd38838ac29-kube-api-access-s5tvx\") pod \"machine-api-operator-5694c8668f-thf5c\" (UID: \"78b8999d-9535-4584-baa0-5fd38838ac29\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-thf5c" Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.129570 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/dad82366-6967-4fba-9d3d-763ba28e9a73-machine-approver-tls\") pod \"machine-approver-56656f9798-xfj8l\" (UID: \"dad82366-6967-4fba-9d3d-763ba28e9a73\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-xfj8l" Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.129587 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/78b8999d-9535-4584-baa0-5fd38838ac29-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-thf5c\" (UID: \"78b8999d-9535-4584-baa0-5fd38838ac29\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-thf5c" Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.129613 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/30208d57-8abd-4956-92a1-1b1aa21b754a-config\") pod \"controller-manager-879f6c89f-gg89j\" (UID: \"30208d57-8abd-4956-92a1-1b1aa21b754a\") " pod="openshift-controller-manager/controller-manager-879f6c89f-gg89j" Nov 27 16:06:05 crc 
kubenswrapper[4707]: I1127 16:06:05.129662 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a46480b6-6c0e-4502-81b7-e1c461ba7fa4-serving-cert\") pod \"console-operator-58897d9998-qzltz\" (UID: \"a46480b6-6c0e-4502-81b7-e1c461ba7fa4\") " pod="openshift-console-operator/console-operator-58897d9998-qzltz" Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.129680 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/3fdf4f4a-233e-409a-8cd7-dcd6efd5b2cf-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-jxgm7\" (UID: \"3fdf4f4a-233e-409a-8cd7-dcd6efd5b2cf\") " pod="openshift-authentication/oauth-openshift-558db77b4-jxgm7" Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.129705 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/30208d57-8abd-4956-92a1-1b1aa21b754a-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-gg89j\" (UID: \"30208d57-8abd-4956-92a1-1b1aa21b754a\") " pod="openshift-controller-manager/controller-manager-879f6c89f-gg89j" Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.129724 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-986fb\" (UniqueName: \"kubernetes.io/projected/50e5ce2e-3776-4890-8b73-b0ae2b2d0237-kube-api-access-986fb\") pod \"downloads-7954f5f757-zdxsk\" (UID: \"50e5ce2e-3776-4890-8b73-b0ae2b2d0237\") " pod="openshift-console/downloads-7954f5f757-zdxsk" Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.129742 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/78b8999d-9535-4584-baa0-5fd38838ac29-config\") pod \"machine-api-operator-5694c8668f-thf5c\" (UID: \"78b8999d-9535-4584-baa0-5fd38838ac29\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-thf5c" Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.129760 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/b735e79c-0093-4887-b27a-6e333d9d80a5-etcd-client\") pod \"apiserver-7bbb656c7d-hg2x5\" (UID: \"b735e79c-0093-4887-b27a-6e333d9d80a5\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-hg2x5" Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.129781 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a46480b6-6c0e-4502-81b7-e1c461ba7fa4-config\") pod \"console-operator-58897d9998-qzltz\" (UID: \"a46480b6-6c0e-4502-81b7-e1c461ba7fa4\") " pod="openshift-console-operator/console-operator-58897d9998-qzltz" Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.129799 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/ecc19675-2c51-4a55-b870-7906093e3de2-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-4sj7l\" (UID: \"ecc19675-2c51-4a55-b870-7906093e3de2\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-4sj7l" Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.129817 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/3fdf4f4a-233e-409a-8cd7-dcd6efd5b2cf-audit-dir\") pod \"oauth-openshift-558db77b4-jxgm7\" (UID: \"3fdf4f4a-233e-409a-8cd7-dcd6efd5b2cf\") " pod="openshift-authentication/oauth-openshift-558db77b4-jxgm7" Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 
16:06:05.129835 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/3fdf4f4a-233e-409a-8cd7-dcd6efd5b2cf-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-jxgm7\" (UID: \"3fdf4f4a-233e-409a-8cd7-dcd6efd5b2cf\") " pod="openshift-authentication/oauth-openshift-558db77b4-jxgm7" Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.129850 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8f007d8e-df31-4bfd-8879-54a84ed4b62d-config\") pod \"authentication-operator-69f744f599-kl4m7\" (UID: \"8f007d8e-df31-4bfd-8879-54a84ed4b62d\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-kl4m7" Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.129920 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/30208d57-8abd-4956-92a1-1b1aa21b754a-client-ca\") pod \"controller-manager-879f6c89f-gg89j\" (UID: \"30208d57-8abd-4956-92a1-1b1aa21b754a\") " pod="openshift-controller-manager/controller-manager-879f6c89f-gg89j" Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.129949 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/3fdf4f4a-233e-409a-8cd7-dcd6efd5b2cf-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-jxgm7\" (UID: \"3fdf4f4a-233e-409a-8cd7-dcd6efd5b2cf\") " pod="openshift-authentication/oauth-openshift-558db77b4-jxgm7" Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.129975 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wndwp\" (UniqueName: 
\"kubernetes.io/projected/b735e79c-0093-4887-b27a-6e333d9d80a5-kube-api-access-wndwp\") pod \"apiserver-7bbb656c7d-hg2x5\" (UID: \"b735e79c-0093-4887-b27a-6e333d9d80a5\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-hg2x5" Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.129996 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/1a3a3427-44f4-4518-861e-f11f5cb76d90-metrics-tls\") pod \"dns-operator-744455d44c-ghzc4\" (UID: \"1a3a3427-44f4-4518-861e-f11f5cb76d90\") " pod="openshift-dns-operator/dns-operator-744455d44c-ghzc4" Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.130105 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dad82366-6967-4fba-9d3d-763ba28e9a73-config\") pod \"machine-approver-56656f9798-xfj8l\" (UID: \"dad82366-6967-4fba-9d3d-763ba28e9a73\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-xfj8l" Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.130159 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/3fdf4f4a-233e-409a-8cd7-dcd6efd5b2cf-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-jxgm7\" (UID: \"3fdf4f4a-233e-409a-8cd7-dcd6efd5b2cf\") " pod="openshift-authentication/oauth-openshift-558db77b4-jxgm7" Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.130181 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/78b8999d-9535-4584-baa0-5fd38838ac29-images\") pod \"machine-api-operator-5694c8668f-thf5c\" (UID: \"78b8999d-9535-4584-baa0-5fd38838ac29\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-thf5c" Nov 27 16:06:05 crc kubenswrapper[4707]: 
I1127 16:06:05.130206 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b735e79c-0093-4887-b27a-6e333d9d80a5-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-hg2x5\" (UID: \"b735e79c-0093-4887-b27a-6e333d9d80a5\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-hg2x5" Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.134455 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-khh5f"] Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.135168 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-khh5f" Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.139527 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-khskd"] Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.144416 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.144954 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-p28h9"] Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.145646 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-khskd" Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.148291 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-6m8wz"] Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.149384 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-f9d7485db-5r8tf"] Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.150119 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-p28h9" Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.152015 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.154583 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-6m8wz" Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.156601 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-72lr4"] Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.168441 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-5r8tf" Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.174271 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-lrj4r"] Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.175418 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-49hfs"] Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.176038 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-qzltz"] Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.176065 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29404320-sxj55"] Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.176722 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-d8nfn"] Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.177490 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-jtqhs"] Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.178235 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-jtqhs" Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.178543 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29404320-sxj55" Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.178580 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-49hfs" Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.180251 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-zdxsk"] Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.180742 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-d8nfn" Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.180992 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-lrj4r" Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.181180 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-72lr4" Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.185021 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.196220 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.203906 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.207037 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-74mzf"] Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.207754 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-xbpq4"] Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.208125 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-controller-manager/controller-manager-879f6c89f-gg89j"] Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.208147 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-thf5c"] Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.208245 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-xbpq4" Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.208598 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-74mzf" Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.212293 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-h4m82"] Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.213053 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-4sj7l"] Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.213181 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-h4m82"
Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.213215 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-jxgm7"]
Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.217419 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-hg2x5"]
Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.217648 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-ghzc4"]
Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.219012 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-slj9l"]
Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.220111 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-slj9l"
Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.220148 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-server-85rvh"]
Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.220651 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-85rvh"
Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.221417 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-pfqjd"]
Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.222129 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-dlchm"]
Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.224309 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-jqntr"]
Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.225540 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls"
Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.227104 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-4h99l"]
Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.229795 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-nhqwm"]
Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.231197 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-986fb\" (UniqueName: \"kubernetes.io/projected/50e5ce2e-3776-4890-8b73-b0ae2b2d0237-kube-api-access-986fb\") pod \"downloads-7954f5f757-zdxsk\" (UID: \"50e5ce2e-3776-4890-8b73-b0ae2b2d0237\") " pod="openshift-console/downloads-7954f5f757-zdxsk"
Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.231224 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/78b8999d-9535-4584-baa0-5fd38838ac29-config\") pod \"machine-api-operator-5694c8668f-thf5c\" (UID: \"78b8999d-9535-4584-baa0-5fd38838ac29\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-thf5c"
Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.231244 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/b735e79c-0093-4887-b27a-6e333d9d80a5-etcd-client\") pod \"apiserver-7bbb656c7d-hg2x5\" (UID: \"b735e79c-0093-4887-b27a-6e333d9d80a5\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-hg2x5"
Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.231265 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a46480b6-6c0e-4502-81b7-e1c461ba7fa4-config\") pod \"console-operator-58897d9998-qzltz\" (UID: \"a46480b6-6c0e-4502-81b7-e1c461ba7fa4\") " pod="openshift-console-operator/console-operator-58897d9998-qzltz"
Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.231285 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9a65caa3-afbb-4401-916f-fd7a4a3ea46e-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-rck5n\" (UID: \"9a65caa3-afbb-4401-916f-fd7a4a3ea46e\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-rck5n"
Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.231303 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/ecc19675-2c51-4a55-b870-7906093e3de2-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-4sj7l\" (UID: \"ecc19675-2c51-4a55-b870-7906093e3de2\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-4sj7l"
Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.231322 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/3fdf4f4a-233e-409a-8cd7-dcd6efd5b2cf-audit-dir\") pod \"oauth-openshift-558db77b4-jxgm7\" (UID: \"3fdf4f4a-233e-409a-8cd7-dcd6efd5b2cf\") " pod="openshift-authentication/oauth-openshift-558db77b4-jxgm7"
Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.231339 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/3fdf4f4a-233e-409a-8cd7-dcd6efd5b2cf-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-jxgm7\" (UID: \"3fdf4f4a-233e-409a-8cd7-dcd6efd5b2cf\") " pod="openshift-authentication/oauth-openshift-558db77b4-jxgm7"
Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.231354 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8f007d8e-df31-4bfd-8879-54a84ed4b62d-config\") pod \"authentication-operator-69f744f599-kl4m7\" (UID: \"8f007d8e-df31-4bfd-8879-54a84ed4b62d\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-kl4m7"
Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.231385 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/9744e361-2c35-436c-a453-984cff9d923f-node-pullsecrets\") pod \"apiserver-76f77b778f-khskd\" (UID: \"9744e361-2c35-436c-a453-984cff9d923f\") " pod="openshift-apiserver/apiserver-76f77b778f-khskd"
Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.231400 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9744e361-2c35-436c-a453-984cff9d923f-config\") pod \"apiserver-76f77b778f-khskd\" (UID: \"9744e361-2c35-436c-a453-984cff9d923f\") " pod="openshift-apiserver/apiserver-76f77b778f-khskd"
Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.231416 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/9744e361-2c35-436c-a453-984cff9d923f-image-import-ca\") pod \"apiserver-76f77b778f-khskd\" (UID: \"9744e361-2c35-436c-a453-984cff9d923f\") " pod="openshift-apiserver/apiserver-76f77b778f-khskd"
Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.231433 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/30208d57-8abd-4956-92a1-1b1aa21b754a-client-ca\") pod \"controller-manager-879f6c89f-gg89j\" (UID: \"30208d57-8abd-4956-92a1-1b1aa21b754a\") " pod="openshift-controller-manager/controller-manager-879f6c89f-gg89j"
Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.231449 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/9744e361-2c35-436c-a453-984cff9d923f-audit\") pod \"apiserver-76f77b778f-khskd\" (UID: \"9744e361-2c35-436c-a453-984cff9d923f\") " pod="openshift-apiserver/apiserver-76f77b778f-khskd"
Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.231479 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9744e361-2c35-436c-a453-984cff9d923f-serving-cert\") pod \"apiserver-76f77b778f-khskd\" (UID: \"9744e361-2c35-436c-a453-984cff9d923f\") " pod="openshift-apiserver/apiserver-76f77b778f-khskd"
Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.231500 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/3fdf4f4a-233e-409a-8cd7-dcd6efd5b2cf-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-jxgm7\" (UID: \"3fdf4f4a-233e-409a-8cd7-dcd6efd5b2cf\") " pod="openshift-authentication/oauth-openshift-558db77b4-jxgm7"
Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.231517 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wndwp\" (UniqueName: \"kubernetes.io/projected/b735e79c-0093-4887-b27a-6e333d9d80a5-kube-api-access-wndwp\") pod \"apiserver-7bbb656c7d-hg2x5\" (UID: \"b735e79c-0093-4887-b27a-6e333d9d80a5\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-hg2x5"
Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.231534 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/1a3a3427-44f4-4518-861e-f11f5cb76d90-metrics-tls\") pod \"dns-operator-744455d44c-ghzc4\" (UID: \"1a3a3427-44f4-4518-861e-f11f5cb76d90\") " pod="openshift-dns-operator/dns-operator-744455d44c-ghzc4"
Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.231560 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b37b9b2e-05d2-434f-bd01-93cda5a05b52-metrics-certs\") pod \"router-default-5444994796-tggrz\" (UID: \"b37b9b2e-05d2-434f-bd01-93cda5a05b52\") " pod="openshift-ingress/router-default-5444994796-tggrz"
Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.231580 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ba92c9fa-a7ba-4ae5-87be-4d0ded13bba1-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-dmdnz\" (UID: \"ba92c9fa-a7ba-4ae5-87be-4d0ded13bba1\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-dmdnz"
Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.231602 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dad82366-6967-4fba-9d3d-763ba28e9a73-config\") pod \"machine-approver-56656f9798-xfj8l\" (UID: \"dad82366-6967-4fba-9d3d-763ba28e9a73\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-xfj8l"
Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.231619 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/3fdf4f4a-233e-409a-8cd7-dcd6efd5b2cf-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-jxgm7\" (UID: \"3fdf4f4a-233e-409a-8cd7-dcd6efd5b2cf\") " pod="openshift-authentication/oauth-openshift-558db77b4-jxgm7"
Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.231638 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/78b8999d-9535-4584-baa0-5fd38838ac29-images\") pod \"machine-api-operator-5694c8668f-thf5c\" (UID: \"78b8999d-9535-4584-baa0-5fd38838ac29\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-thf5c"
Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.231655 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b735e79c-0093-4887-b27a-6e333d9d80a5-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-hg2x5\" (UID: \"b735e79c-0093-4887-b27a-6e333d9d80a5\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-hg2x5"
Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.231674 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/9744e361-2c35-436c-a453-984cff9d923f-encryption-config\") pod \"apiserver-76f77b778f-khskd\" (UID: \"9744e361-2c35-436c-a453-984cff9d923f\") " pod="openshift-apiserver/apiserver-76f77b778f-khskd"
Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.231693 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8a947df3-154e-488f-9e1c-4a41cc94553c-serving-cert\") pod \"openshift-config-operator-7777fb866f-jqntr\" (UID: \"8a947df3-154e-488f-9e1c-4a41cc94553c\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-jqntr"
Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.231711 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/b735e79c-0093-4887-b27a-6e333d9d80a5-encryption-config\") pod \"apiserver-7bbb656c7d-hg2x5\" (UID: \"b735e79c-0093-4887-b27a-6e333d9d80a5\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-hg2x5"
Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.231729 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b37b9b2e-05d2-434f-bd01-93cda5a05b52-service-ca-bundle\") pod \"router-default-5444994796-tggrz\" (UID: \"b37b9b2e-05d2-434f-bd01-93cda5a05b52\") " pod="openshift-ingress/router-default-5444994796-tggrz"
Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.231748 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9c44f508-25ec-493c-ba35-8c6d4f0cf7ae-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-khh5f\" (UID: \"9c44f508-25ec-493c-ba35-8c6d4f0cf7ae\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-khh5f"
Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.231764 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a46480b6-6c0e-4502-81b7-e1c461ba7fa4-trusted-ca\") pod \"console-operator-58897d9998-qzltz\" (UID: \"a46480b6-6c0e-4502-81b7-e1c461ba7fa4\") " pod="openshift-console-operator/console-operator-58897d9998-qzltz"
Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.231876 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-78cn9\" (UniqueName: \"kubernetes.io/projected/8a947df3-154e-488f-9e1c-4a41cc94553c-kube-api-access-78cn9\") pod \"openshift-config-operator-7777fb866f-jqntr\" (UID: \"8a947df3-154e-488f-9e1c-4a41cc94553c\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-jqntr"
Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.231897 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/b735e79c-0093-4887-b27a-6e333d9d80a5-audit-policies\") pod \"apiserver-7bbb656c7d-hg2x5\" (UID: \"b735e79c-0093-4887-b27a-6e333d9d80a5\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-hg2x5"
Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.231914 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/9a65caa3-afbb-4401-916f-fd7a4a3ea46e-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-rck5n\" (UID: \"9a65caa3-afbb-4401-916f-fd7a4a3ea46e\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-rck5n"
Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.231934 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5zgc6\" (UniqueName: \"kubernetes.io/projected/3fdf4f4a-233e-409a-8cd7-dcd6efd5b2cf-kube-api-access-5zgc6\") pod \"oauth-openshift-558db77b4-jxgm7\" (UID: \"3fdf4f4a-233e-409a-8cd7-dcd6efd5b2cf\") " pod="openshift-authentication/oauth-openshift-558db77b4-jxgm7"
Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.231970 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ba92c9fa-a7ba-4ae5-87be-4d0ded13bba1-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-dmdnz\" (UID: \"ba92c9fa-a7ba-4ae5-87be-4d0ded13bba1\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-dmdnz"
Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.231992 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/3fdf4f4a-233e-409a-8cd7-dcd6efd5b2cf-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-jxgm7\" (UID: \"3fdf4f4a-233e-409a-8cd7-dcd6efd5b2cf\") " pod="openshift-authentication/oauth-openshift-558db77b4-jxgm7"
Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.232009 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/eae3cd03-ce75-4cc2-95d2-32a63d34ba10-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-cj6jg\" (UID: \"eae3cd03-ce75-4cc2-95d2-32a63d34ba10\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-cj6jg"
Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.232024 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9c44f508-25ec-493c-ba35-8c6d4f0cf7ae-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-khh5f\" (UID: \"9c44f508-25ec-493c-ba35-8c6d4f0cf7ae\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-khh5f"
Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.232043 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nrzcj\" (UniqueName: \"kubernetes.io/projected/30208d57-8abd-4956-92a1-1b1aa21b754a-kube-api-access-nrzcj\") pod \"controller-manager-879f6c89f-gg89j\" (UID: \"30208d57-8abd-4956-92a1-1b1aa21b754a\") " pod="openshift-controller-manager/controller-manager-879f6c89f-gg89j"
Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.232061 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3fdf4f4a-233e-409a-8cd7-dcd6efd5b2cf-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-jxgm7\" (UID: \"3fdf4f4a-233e-409a-8cd7-dcd6efd5b2cf\") " pod="openshift-authentication/oauth-openshift-558db77b4-jxgm7"
Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.232079 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9744e361-2c35-436c-a453-984cff9d923f-trusted-ca-bundle\") pod \"apiserver-76f77b778f-khskd\" (UID: \"9744e361-2c35-436c-a453-984cff9d923f\") " pod="openshift-apiserver/apiserver-76f77b778f-khskd"
Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.232097 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8f007d8e-df31-4bfd-8879-54a84ed4b62d-serving-cert\") pod \"authentication-operator-69f744f599-kl4m7\" (UID: \"8f007d8e-df31-4bfd-8879-54a84ed4b62d\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-kl4m7"
Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.232131 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lldmp\" (UniqueName: \"kubernetes.io/projected/dad82366-6967-4fba-9d3d-763ba28e9a73-kube-api-access-lldmp\") pod \"machine-approver-56656f9798-xfj8l\" (UID: \"dad82366-6967-4fba-9d3d-763ba28e9a73\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-xfj8l"
Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.232189 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-brk88\" (UniqueName: \"kubernetes.io/projected/ecc19675-2c51-4a55-b870-7906093e3de2-kube-api-access-brk88\") pod \"cluster-samples-operator-665b6dd947-4sj7l\" (UID: \"ecc19675-2c51-4a55-b870-7906093e3de2\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-4sj7l"
Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.232206 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/b735e79c-0093-4887-b27a-6e333d9d80a5-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-hg2x5\" (UID: \"b735e79c-0093-4887-b27a-6e333d9d80a5\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-hg2x5"
Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.232244 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/21940f54-9029-467c-aaac-7580e44761a4-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-nhqwm\" (UID: \"21940f54-9029-467c-aaac-7580e44761a4\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-nhqwm"
Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.232262 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/b37b9b2e-05d2-434f-bd01-93cda5a05b52-stats-auth\") pod \"router-default-5444994796-tggrz\" (UID: \"b37b9b2e-05d2-434f-bd01-93cda5a05b52\") " pod="openshift-ingress/router-default-5444994796-tggrz"
Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.232278 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/21940f54-9029-467c-aaac-7580e44761a4-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-nhqwm\" (UID: \"21940f54-9029-467c-aaac-7580e44761a4\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-nhqwm"
Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.232313 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/bae1e9aa-742e-4e00-ba4a-2edad6d45e95-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-p28h9\" (UID: \"bae1e9aa-742e-4e00-ba4a-2edad6d45e95\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-p28h9"
Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.232360 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h2lh4\" (UniqueName: \"kubernetes.io/projected/a46480b6-6c0e-4502-81b7-e1c461ba7fa4-kube-api-access-h2lh4\") pod \"console-operator-58897d9998-qzltz\" (UID: \"a46480b6-6c0e-4502-81b7-e1c461ba7fa4\") " pod="openshift-console-operator/console-operator-58897d9998-qzltz"
Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.232391 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/3fdf4f4a-233e-409a-8cd7-dcd6efd5b2cf-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-jxgm7\" (UID: \"3fdf4f4a-233e-409a-8cd7-dcd6efd5b2cf\") " pod="openshift-authentication/oauth-openshift-558db77b4-jxgm7"
Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.232411 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8f007d8e-df31-4bfd-8879-54a84ed4b62d-service-ca-bundle\") pod \"authentication-operator-69f744f599-kl4m7\" (UID: \"8f007d8e-df31-4bfd-8879-54a84ed4b62d\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-kl4m7"
Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.232481 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/b735e79c-0093-4887-b27a-6e333d9d80a5-audit-dir\") pod \"apiserver-7bbb656c7d-hg2x5\" (UID: \"b735e79c-0093-4887-b27a-6e333d9d80a5\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-hg2x5"
Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.232501 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/dad82366-6967-4fba-9d3d-763ba28e9a73-auth-proxy-config\") pod \"machine-approver-56656f9798-xfj8l\" (UID: \"dad82366-6967-4fba-9d3d-763ba28e9a73\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-xfj8l"
Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.232536 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/3fdf4f4a-233e-409a-8cd7-dcd6efd5b2cf-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-jxgm7\" (UID: \"3fdf4f4a-233e-409a-8cd7-dcd6efd5b2cf\") " pod="openshift-authentication/oauth-openshift-558db77b4-jxgm7"
Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.232553 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/9744e361-2c35-436c-a453-984cff9d923f-etcd-serving-ca\") pod \"apiserver-76f77b778f-khskd\" (UID: \"9744e361-2c35-436c-a453-984cff9d923f\") " pod="openshift-apiserver/apiserver-76f77b778f-khskd"
Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.232569 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-749dv\" (UniqueName: \"kubernetes.io/projected/b37b9b2e-05d2-434f-bd01-93cda5a05b52-kube-api-access-749dv\") pod \"router-default-5444994796-tggrz\" (UID: \"b37b9b2e-05d2-434f-bd01-93cda5a05b52\") " pod="openshift-ingress/router-default-5444994796-tggrz"
Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.232617 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eae3cd03-ce75-4cc2-95d2-32a63d34ba10-config\") pod \"kube-controller-manager-operator-78b949d7b-cj6jg\" (UID: \"eae3cd03-ce75-4cc2-95d2-32a63d34ba10\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-cj6jg"
Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.232652 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vbcfb\" (UniqueName: \"kubernetes.io/projected/9a65caa3-afbb-4401-916f-fd7a4a3ea46e-kube-api-access-vbcfb\") pod \"cluster-image-registry-operator-dc59b4c8b-rck5n\" (UID: \"9a65caa3-afbb-4401-916f-fd7a4a3ea46e\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-rck5n"
Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.232673 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tx9hq\" (UniqueName: \"kubernetes.io/projected/1a3a3427-44f4-4518-861e-f11f5cb76d90-kube-api-access-tx9hq\") pod \"dns-operator-744455d44c-ghzc4\" (UID: \"1a3a3427-44f4-4518-861e-f11f5cb76d90\") " pod="openshift-dns-operator/dns-operator-744455d44c-ghzc4"
Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.232691 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xkclw\" (UniqueName: \"kubernetes.io/projected/21940f54-9029-467c-aaac-7580e44761a4-kube-api-access-xkclw\") pod \"openshift-controller-manager-operator-756b6f6bc6-nhqwm\" (UID: \"21940f54-9029-467c-aaac-7580e44761a4\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-nhqwm"
Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.232709 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/8a947df3-154e-488f-9e1c-4a41cc94553c-available-featuregates\") pod \"openshift-config-operator-7777fb866f-jqntr\" (UID: \"8a947df3-154e-488f-9e1c-4a41cc94553c\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-jqntr"
Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.232743 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/30208d57-8abd-4956-92a1-1b1aa21b754a-serving-cert\") pod \"controller-manager-879f6c89f-gg89j\" (UID: \"30208d57-8abd-4956-92a1-1b1aa21b754a\") " pod="openshift-controller-manager/controller-manager-879f6c89f-gg89j"
Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.232762 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/3fdf4f4a-233e-409a-8cd7-dcd6efd5b2cf-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-jxgm7\" (UID: \"3fdf4f4a-233e-409a-8cd7-dcd6efd5b2cf\") " pod="openshift-authentication/oauth-openshift-558db77b4-jxgm7"
Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.232778 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/3fdf4f4a-233e-409a-8cd7-dcd6efd5b2cf-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-jxgm7\" (UID: \"3fdf4f4a-233e-409a-8cd7-dcd6efd5b2cf\") " pod="openshift-authentication/oauth-openshift-558db77b4-jxgm7"
Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.232795 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8f007d8e-df31-4bfd-8879-54a84ed4b62d-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-kl4m7\" (UID: \"8f007d8e-df31-4bfd-8879-54a84ed4b62d\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-kl4m7"
Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.232790 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/78b8999d-9535-4584-baa0-5fd38838ac29-config\") pod \"machine-api-operator-5694c8668f-thf5c\" (UID: \"78b8999d-9535-4584-baa0-5fd38838ac29\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-thf5c"
Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.232812 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/3fdf4f4a-233e-409a-8cd7-dcd6efd5b2cf-audit-policies\") pod \"oauth-openshift-558db77b4-jxgm7\" (UID: \"3fdf4f4a-233e-409a-8cd7-dcd6efd5b2cf\") " pod="openshift-authentication/oauth-openshift-558db77b4-jxgm7"
Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.232916 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xlb9g\" (UniqueName: \"kubernetes.io/projected/8f007d8e-df31-4bfd-8879-54a84ed4b62d-kube-api-access-xlb9g\") pod \"authentication-operator-69f744f599-kl4m7\" (UID: \"8f007d8e-df31-4bfd-8879-54a84ed4b62d\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-kl4m7"
Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.232947 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b735e79c-0093-4887-b27a-6e333d9d80a5-serving-cert\") pod \"apiserver-7bbb656c7d-hg2x5\" (UID: \"b735e79c-0093-4887-b27a-6e333d9d80a5\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-hg2x5"
Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.232977 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4dv7g\" (UniqueName: \"kubernetes.io/projected/9744e361-2c35-436c-a453-984cff9d923f-kube-api-access-4dv7g\") pod \"apiserver-76f77b778f-khskd\" (UID: \"9744e361-2c35-436c-a453-984cff9d923f\") " pod="openshift-apiserver/apiserver-76f77b778f-khskd"
Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.233009 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qkt6m\" (UniqueName: \"kubernetes.io/projected/b43ea736-a4b5-473d-a3a2-3d779a856a86-kube-api-access-qkt6m\") pod \"route-controller-manager-6576b87f9c-4h99l\" (UID: \"b43ea736-a4b5-473d-a3a2-3d779a856a86\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-4h99l"
Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.233036 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/3fdf4f4a-233e-409a-8cd7-dcd6efd5b2cf-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-jxgm7\" (UID: \"3fdf4f4a-233e-409a-8cd7-dcd6efd5b2cf\") " pod="openshift-authentication/oauth-openshift-558db77b4-jxgm7"
Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.233058 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b43ea736-a4b5-473d-a3a2-3d779a856a86-config\") pod \"route-controller-manager-6576b87f9c-4h99l\" (UID: \"b43ea736-a4b5-473d-a3a2-3d779a856a86\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-4h99l"
Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.233081 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b43ea736-a4b5-473d-a3a2-3d779a856a86-client-ca\") pod \"route-controller-manager-6576b87f9c-4h99l\" (UID: \"b43ea736-a4b5-473d-a3a2-3d779a856a86\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-4h99l"
Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.233099 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b43ea736-a4b5-473d-a3a2-3d779a856a86-serving-cert\") pod \"route-controller-manager-6576b87f9c-4h99l\" (UID: \"b43ea736-a4b5-473d-a3a2-3d779a856a86\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-4h99l"
Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.233122 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/dad82366-6967-4fba-9d3d-763ba28e9a73-machine-approver-tls\") pod \"machine-approver-56656f9798-xfj8l\" (UID: \"dad82366-6967-4fba-9d3d-763ba28e9a73\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-xfj8l"
Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.233141 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s5tvx\" (UniqueName: \"kubernetes.io/projected/78b8999d-9535-4584-baa0-5fd38838ac29-kube-api-access-s5tvx\") pod \"machine-api-operator-5694c8668f-thf5c\" (UID: \"78b8999d-9535-4584-baa0-5fd38838ac29\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-thf5c"
Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.233168 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c6nph\" (UniqueName: \"kubernetes.io/projected/9c44f508-25ec-493c-ba35-8c6d4f0cf7ae-kube-api-access-c6nph\") pod \"kube-storage-version-migrator-operator-b67b599dd-khh5f\" (UID: \"9c44f508-25ec-493c-ba35-8c6d4f0cf7ae\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-khh5f"
Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.233192 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rk4hb\" (UniqueName: \"kubernetes.io/projected/bae1e9aa-742e-4e00-ba4a-2edad6d45e95-kube-api-access-rk4hb\") pod \"package-server-manager-789f6589d5-p28h9\" (UID: \"bae1e9aa-742e-4e00-ba4a-2edad6d45e95\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-p28h9"
Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.233217 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/78b8999d-9535-4584-baa0-5fd38838ac29-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-thf5c\" (UID: \"78b8999d-9535-4584-baa0-5fd38838ac29\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-thf5c"
Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.233239 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/9744e361-2c35-436c-a453-984cff9d923f-audit-dir\") pod \"apiserver-76f77b778f-khskd\" (UID: \"9744e361-2c35-436c-a453-984cff9d923f\") " pod="openshift-apiserver/apiserver-76f77b778f-khskd"
Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.233262 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/eae3cd03-ce75-4cc2-95d2-32a63d34ba10-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-cj6jg\" (UID: \"eae3cd03-ce75-4cc2-95d2-32a63d34ba10\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-cj6jg"
Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.233283 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/9a65caa3-afbb-4401-916f-fd7a4a3ea46e-image-registry-operator-tls\")
pod \"cluster-image-registry-operator-dc59b4c8b-rck5n\" (UID: \"9a65caa3-afbb-4401-916f-fd7a4a3ea46e\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-rck5n" Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.233301 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/3fdf4f4a-233e-409a-8cd7-dcd6efd5b2cf-audit-policies\") pod \"oauth-openshift-558db77b4-jxgm7\" (UID: \"3fdf4f4a-233e-409a-8cd7-dcd6efd5b2cf\") " pod="openshift-authentication/oauth-openshift-558db77b4-jxgm7" Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.233308 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/30208d57-8abd-4956-92a1-1b1aa21b754a-config\") pod \"controller-manager-879f6c89f-gg89j\" (UID: \"30208d57-8abd-4956-92a1-1b1aa21b754a\") " pod="openshift-controller-manager/controller-manager-879f6c89f-gg89j" Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.233331 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ba92c9fa-a7ba-4ae5-87be-4d0ded13bba1-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-dmdnz\" (UID: \"ba92c9fa-a7ba-4ae5-87be-4d0ded13bba1\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-dmdnz" Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.233355 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/b37b9b2e-05d2-434f-bd01-93cda5a05b52-default-certificate\") pod \"router-default-5444994796-tggrz\" (UID: \"b37b9b2e-05d2-434f-bd01-93cda5a05b52\") " pod="openshift-ingress/router-default-5444994796-tggrz" Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.233399 4707 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/9744e361-2c35-436c-a453-984cff9d923f-etcd-client\") pod \"apiserver-76f77b778f-khskd\" (UID: \"9744e361-2c35-436c-a453-984cff9d923f\") " pod="openshift-apiserver/apiserver-76f77b778f-khskd" Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.233452 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a46480b6-6c0e-4502-81b7-e1c461ba7fa4-serving-cert\") pod \"console-operator-58897d9998-qzltz\" (UID: \"a46480b6-6c0e-4502-81b7-e1c461ba7fa4\") " pod="openshift-console-operator/console-operator-58897d9998-qzltz" Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.233481 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/3fdf4f4a-233e-409a-8cd7-dcd6efd5b2cf-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-jxgm7\" (UID: \"3fdf4f4a-233e-409a-8cd7-dcd6efd5b2cf\") " pod="openshift-authentication/oauth-openshift-558db77b4-jxgm7" Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.233507 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/30208d57-8abd-4956-92a1-1b1aa21b754a-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-gg89j\" (UID: \"30208d57-8abd-4956-92a1-1b1aa21b754a\") " pod="openshift-controller-manager/controller-manager-879f6c89f-gg89j" Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.233909 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8f007d8e-df31-4bfd-8879-54a84ed4b62d-config\") pod \"authentication-operator-69f744f599-kl4m7\" (UID: \"8f007d8e-df31-4bfd-8879-54a84ed4b62d\") " 
pod="openshift-authentication-operator/authentication-operator-69f744f599-kl4m7" Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.234513 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/30208d57-8abd-4956-92a1-1b1aa21b754a-client-ca\") pod \"controller-manager-879f6c89f-gg89j\" (UID: \"30208d57-8abd-4956-92a1-1b1aa21b754a\") " pod="openshift-controller-manager/controller-manager-879f6c89f-gg89j" Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.234960 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/3fdf4f4a-233e-409a-8cd7-dcd6efd5b2cf-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-jxgm7\" (UID: \"3fdf4f4a-233e-409a-8cd7-dcd6efd5b2cf\") " pod="openshift-authentication/oauth-openshift-558db77b4-jxgm7" Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.235469 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-6t7cw"] Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.235447 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b43ea736-a4b5-473d-a3a2-3d779a856a86-config\") pod \"route-controller-manager-6576b87f9c-4h99l\" (UID: \"b43ea736-a4b5-473d-a3a2-3d779a856a86\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-4h99l" Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.235505 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-slj9l"] Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.236281 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b43ea736-a4b5-473d-a3a2-3d779a856a86-client-ca\") pod \"route-controller-manager-6576b87f9c-4h99l\" (UID: 
\"b43ea736-a4b5-473d-a3a2-3d779a856a86\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-4h99l" Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.236776 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/30208d57-8abd-4956-92a1-1b1aa21b754a-config\") pod \"controller-manager-879f6c89f-gg89j\" (UID: \"30208d57-8abd-4956-92a1-1b1aa21b754a\") " pod="openshift-controller-manager/controller-manager-879f6c89f-gg89j" Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.239738 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8f007d8e-df31-4bfd-8879-54a84ed4b62d-serving-cert\") pod \"authentication-operator-69f744f599-kl4m7\" (UID: \"8f007d8e-df31-4bfd-8879-54a84ed4b62d\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-kl4m7" Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.240093 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-cj6jg"] Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.240190 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/3fdf4f4a-233e-409a-8cd7-dcd6efd5b2cf-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-jxgm7\" (UID: \"3fdf4f4a-233e-409a-8cd7-dcd6efd5b2cf\") " pod="openshift-authentication/oauth-openshift-558db77b4-jxgm7" Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.240200 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-h746w"] Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.240222 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: 
\"kubernetes.io/secret/1a3a3427-44f4-4518-861e-f11f5cb76d90-metrics-tls\") pod \"dns-operator-744455d44c-ghzc4\" (UID: \"1a3a3427-44f4-4518-861e-f11f5cb76d90\") " pod="openshift-dns-operator/dns-operator-744455d44c-ghzc4" Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.240245 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-lrj4r"] Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.240362 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/b735e79c-0093-4887-b27a-6e333d9d80a5-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-hg2x5\" (UID: \"b735e79c-0093-4887-b27a-6e333d9d80a5\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-hg2x5" Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.240792 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dad82366-6967-4fba-9d3d-763ba28e9a73-config\") pod \"machine-approver-56656f9798-xfj8l\" (UID: \"dad82366-6967-4fba-9d3d-763ba28e9a73\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-xfj8l" Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.241648 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/78b8999d-9535-4584-baa0-5fd38838ac29-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-thf5c\" (UID: \"78b8999d-9535-4584-baa0-5fd38838ac29\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-thf5c" Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.242029 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/b735e79c-0093-4887-b27a-6e333d9d80a5-etcd-client\") pod \"apiserver-7bbb656c7d-hg2x5\" (UID: \"b735e79c-0093-4887-b27a-6e333d9d80a5\") " 
pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-hg2x5" Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.242084 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/3fdf4f4a-233e-409a-8cd7-dcd6efd5b2cf-audit-dir\") pod \"oauth-openshift-558db77b4-jxgm7\" (UID: \"3fdf4f4a-233e-409a-8cd7-dcd6efd5b2cf\") " pod="openshift-authentication/oauth-openshift-558db77b4-jxgm7" Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.242334 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/b735e79c-0093-4887-b27a-6e333d9d80a5-audit-dir\") pod \"apiserver-7bbb656c7d-hg2x5\" (UID: \"b735e79c-0093-4887-b27a-6e333d9d80a5\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-hg2x5" Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.244088 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/dad82366-6967-4fba-9d3d-763ba28e9a73-auth-proxy-config\") pod \"machine-approver-56656f9798-xfj8l\" (UID: \"dad82366-6967-4fba-9d3d-763ba28e9a73\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-xfj8l" Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.244580 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/3fdf4f4a-233e-409a-8cd7-dcd6efd5b2cf-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-jxgm7\" (UID: \"3fdf4f4a-233e-409a-8cd7-dcd6efd5b2cf\") " pod="openshift-authentication/oauth-openshift-558db77b4-jxgm7" Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.245128 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.245345 4707 kubelet.go:2428] "SyncLoop UPDATE" 
source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29404320-sxj55"] Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.245459 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-p28h9"] Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.245525 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-nhz7q"] Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.246444 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8f007d8e-df31-4bfd-8879-54a84ed4b62d-service-ca-bundle\") pod \"authentication-operator-69f744f599-kl4m7\" (UID: \"8f007d8e-df31-4bfd-8879-54a84ed4b62d\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-kl4m7" Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.247929 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/3fdf4f4a-233e-409a-8cd7-dcd6efd5b2cf-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-jxgm7\" (UID: \"3fdf4f4a-233e-409a-8cd7-dcd6efd5b2cf\") " pod="openshift-authentication/oauth-openshift-558db77b4-jxgm7" Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.248093 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b735e79c-0093-4887-b27a-6e333d9d80a5-serving-cert\") pod \"apiserver-7bbb656c7d-hg2x5\" (UID: \"b735e79c-0093-4887-b27a-6e333d9d80a5\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-hg2x5" Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.248212 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/3fdf4f4a-233e-409a-8cd7-dcd6efd5b2cf-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-jxgm7\" (UID: \"3fdf4f4a-233e-409a-8cd7-dcd6efd5b2cf\") " pod="openshift-authentication/oauth-openshift-558db77b4-jxgm7" Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.248847 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/3fdf4f4a-233e-409a-8cd7-dcd6efd5b2cf-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-jxgm7\" (UID: \"3fdf4f4a-233e-409a-8cd7-dcd6efd5b2cf\") " pod="openshift-authentication/oauth-openshift-558db77b4-jxgm7" Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.248971 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/78b8999d-9535-4584-baa0-5fd38838ac29-images\") pod \"machine-api-operator-5694c8668f-thf5c\" (UID: \"78b8999d-9535-4584-baa0-5fd38838ac29\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-thf5c" Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.249609 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/b735e79c-0093-4887-b27a-6e333d9d80a5-audit-policies\") pod \"apiserver-7bbb656c7d-hg2x5\" (UID: \"b735e79c-0093-4887-b27a-6e333d9d80a5\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-hg2x5" Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.250132 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-rck5n"] Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.250186 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a46480b6-6c0e-4502-81b7-e1c461ba7fa4-serving-cert\") pod \"console-operator-58897d9998-qzltz\" (UID: 
\"a46480b6-6c0e-4502-81b7-e1c461ba7fa4\") " pod="openshift-console-operator/console-operator-58897d9998-qzltz" Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.250356 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/ecc19675-2c51-4a55-b870-7906093e3de2-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-4sj7l\" (UID: \"ecc19675-2c51-4a55-b870-7906093e3de2\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-4sj7l" Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.250507 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/dad82366-6967-4fba-9d3d-763ba28e9a73-machine-approver-tls\") pod \"machine-approver-56656f9798-xfj8l\" (UID: \"dad82366-6967-4fba-9d3d-763ba28e9a73\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-xfj8l" Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.250812 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/3fdf4f4a-233e-409a-8cd7-dcd6efd5b2cf-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-jxgm7\" (UID: \"3fdf4f4a-233e-409a-8cd7-dcd6efd5b2cf\") " pod="openshift-authentication/oauth-openshift-558db77b4-jxgm7" Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.250977 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b735e79c-0093-4887-b27a-6e333d9d80a5-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-hg2x5\" (UID: \"b735e79c-0093-4887-b27a-6e333d9d80a5\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-hg2x5" Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.251126 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: 
\"kubernetes.io/empty-dir/8a947df3-154e-488f-9e1c-4a41cc94553c-available-featuregates\") pod \"openshift-config-operator-7777fb866f-jqntr\" (UID: \"8a947df3-154e-488f-9e1c-4a41cc94553c\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-jqntr" Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.251429 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a46480b6-6c0e-4502-81b7-e1c461ba7fa4-trusted-ca\") pod \"console-operator-58897d9998-qzltz\" (UID: \"a46480b6-6c0e-4502-81b7-e1c461ba7fa4\") " pod="openshift-console-operator/console-operator-58897d9998-qzltz" Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.251775 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8f007d8e-df31-4bfd-8879-54a84ed4b62d-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-kl4m7\" (UID: \"8f007d8e-df31-4bfd-8879-54a84ed4b62d\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-kl4m7" Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.252418 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b43ea736-a4b5-473d-a3a2-3d779a856a86-serving-cert\") pod \"route-controller-manager-6576b87f9c-4h99l\" (UID: \"b43ea736-a4b5-473d-a3a2-3d779a856a86\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-4h99l" Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.252682 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a46480b6-6c0e-4502-81b7-e1c461ba7fa4-config\") pod \"console-operator-58897d9998-qzltz\" (UID: \"a46480b6-6c0e-4502-81b7-e1c461ba7fa4\") " pod="openshift-console-operator/console-operator-58897d9998-qzltz" Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.252975 4707 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-5r8tf"] Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.254273 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-dmdnz"] Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.254954 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/3fdf4f4a-233e-409a-8cd7-dcd6efd5b2cf-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-jxgm7\" (UID: \"3fdf4f4a-233e-409a-8cd7-dcd6efd5b2cf\") " pod="openshift-authentication/oauth-openshift-558db77b4-jxgm7" Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.255027 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/3fdf4f4a-233e-409a-8cd7-dcd6efd5b2cf-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-jxgm7\" (UID: \"3fdf4f4a-233e-409a-8cd7-dcd6efd5b2cf\") " pod="openshift-authentication/oauth-openshift-558db77b4-jxgm7" Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.255535 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/3fdf4f4a-233e-409a-8cd7-dcd6efd5b2cf-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-jxgm7\" (UID: \"3fdf4f4a-233e-409a-8cd7-dcd6efd5b2cf\") " pod="openshift-authentication/oauth-openshift-558db77b4-jxgm7" Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.255572 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/30208d57-8abd-4956-92a1-1b1aa21b754a-serving-cert\") pod \"controller-manager-879f6c89f-gg89j\" (UID: \"30208d57-8abd-4956-92a1-1b1aa21b754a\") " 
pod="openshift-controller-manager/controller-manager-879f6c89f-gg89j" Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.255726 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/b735e79c-0093-4887-b27a-6e333d9d80a5-encryption-config\") pod \"apiserver-7bbb656c7d-hg2x5\" (UID: \"b735e79c-0093-4887-b27a-6e333d9d80a5\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-hg2x5" Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.256019 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/30208d57-8abd-4956-92a1-1b1aa21b754a-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-gg89j\" (UID: \"30208d57-8abd-4956-92a1-1b1aa21b754a\") " pod="openshift-controller-manager/controller-manager-879f6c89f-gg89j" Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.256578 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-72lr4"] Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.256777 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/3fdf4f4a-233e-409a-8cd7-dcd6efd5b2cf-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-jxgm7\" (UID: \"3fdf4f4a-233e-409a-8cd7-dcd6efd5b2cf\") " pod="openshift-authentication/oauth-openshift-558db77b4-jxgm7" Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.257563 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8a947df3-154e-488f-9e1c-4a41cc94553c-serving-cert\") pod \"openshift-config-operator-7777fb866f-jqntr\" (UID: \"8a947df3-154e-488f-9e1c-4a41cc94553c\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-jqntr" Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 
16:06:05.258410 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-kwvst"] Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.259675 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.260682 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-zgfxc"] Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.261435 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-zgfxc" Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.263360 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-kl4m7"] Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.264636 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-khh5f"] Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.265779 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-7k4zd"] Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.267347 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-zg2ss"] Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.268138 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-zg2ss" Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.268746 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-khskd"] Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.269995 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-6m8wz"] Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.271274 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-49hfs"] Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.272800 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-d8nfn"] Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.274304 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-74mzf"] Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.275554 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-zg2ss"] Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.276704 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-h4m82"] Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.279330 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-jtqhs"] Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.281138 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-zgfxc"] Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.282463 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-xbpq4"] Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.282912 4707 reflector.go:368] Caches populated for 
*v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.300842 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.320521 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.334107 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ba92c9fa-a7ba-4ae5-87be-4d0ded13bba1-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-dmdnz\" (UID: \"ba92c9fa-a7ba-4ae5-87be-4d0ded13bba1\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-dmdnz" Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.334229 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/9744e361-2c35-436c-a453-984cff9d923f-encryption-config\") pod \"apiserver-76f77b778f-khskd\" (UID: \"9744e361-2c35-436c-a453-984cff9d923f\") " pod="openshift-apiserver/apiserver-76f77b778f-khskd" Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.334302 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b37b9b2e-05d2-434f-bd01-93cda5a05b52-service-ca-bundle\") pod \"router-default-5444994796-tggrz\" (UID: \"b37b9b2e-05d2-434f-bd01-93cda5a05b52\") " pod="openshift-ingress/router-default-5444994796-tggrz" Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.334406 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/9a65caa3-afbb-4401-916f-fd7a4a3ea46e-bound-sa-token\") pod 
\"cluster-image-registry-operator-dc59b4c8b-rck5n\" (UID: \"9a65caa3-afbb-4401-916f-fd7a4a3ea46e\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-rck5n" Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.334524 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9c44f508-25ec-493c-ba35-8c6d4f0cf7ae-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-khh5f\" (UID: \"9c44f508-25ec-493c-ba35-8c6d4f0cf7ae\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-khh5f" Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.334694 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ba92c9fa-a7ba-4ae5-87be-4d0ded13bba1-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-dmdnz\" (UID: \"ba92c9fa-a7ba-4ae5-87be-4d0ded13bba1\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-dmdnz" Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.334884 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/eae3cd03-ce75-4cc2-95d2-32a63d34ba10-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-cj6jg\" (UID: \"eae3cd03-ce75-4cc2-95d2-32a63d34ba10\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-cj6jg" Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.335008 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9c44f508-25ec-493c-ba35-8c6d4f0cf7ae-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-khh5f\" (UID: \"9c44f508-25ec-493c-ba35-8c6d4f0cf7ae\") " 
pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-khh5f" Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.335086 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b37b9b2e-05d2-434f-bd01-93cda5a05b52-service-ca-bundle\") pod \"router-default-5444994796-tggrz\" (UID: \"b37b9b2e-05d2-434f-bd01-93cda5a05b52\") " pod="openshift-ingress/router-default-5444994796-tggrz" Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.335191 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9744e361-2c35-436c-a453-984cff9d923f-trusted-ca-bundle\") pod \"apiserver-76f77b778f-khskd\" (UID: \"9744e361-2c35-436c-a453-984cff9d923f\") " pod="openshift-apiserver/apiserver-76f77b778f-khskd" Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.335300 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/21940f54-9029-467c-aaac-7580e44761a4-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-nhqwm\" (UID: \"21940f54-9029-467c-aaac-7580e44761a4\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-nhqwm" Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.335411 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/b37b9b2e-05d2-434f-bd01-93cda5a05b52-stats-auth\") pod \"router-default-5444994796-tggrz\" (UID: \"b37b9b2e-05d2-434f-bd01-93cda5a05b52\") " pod="openshift-ingress/router-default-5444994796-tggrz" Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.335602 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/21940f54-9029-467c-aaac-7580e44761a4-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-nhqwm\" (UID: \"21940f54-9029-467c-aaac-7580e44761a4\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-nhqwm" Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.337426 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/bae1e9aa-742e-4e00-ba4a-2edad6d45e95-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-p28h9\" (UID: \"bae1e9aa-742e-4e00-ba4a-2edad6d45e95\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-p28h9" Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.337538 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/9744e361-2c35-436c-a453-984cff9d923f-etcd-serving-ca\") pod \"apiserver-76f77b778f-khskd\" (UID: \"9744e361-2c35-436c-a453-984cff9d923f\") " pod="openshift-apiserver/apiserver-76f77b778f-khskd" Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.336174 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/21940f54-9029-467c-aaac-7580e44761a4-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-nhqwm\" (UID: \"21940f54-9029-467c-aaac-7580e44761a4\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-nhqwm" Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.337629 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-749dv\" (UniqueName: \"kubernetes.io/projected/b37b9b2e-05d2-434f-bd01-93cda5a05b52-kube-api-access-749dv\") pod \"router-default-5444994796-tggrz\" (UID: \"b37b9b2e-05d2-434f-bd01-93cda5a05b52\") " 
pod="openshift-ingress/router-default-5444994796-tggrz" Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.337755 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vbcfb\" (UniqueName: \"kubernetes.io/projected/9a65caa3-afbb-4401-916f-fd7a4a3ea46e-kube-api-access-vbcfb\") pod \"cluster-image-registry-operator-dc59b4c8b-rck5n\" (UID: \"9a65caa3-afbb-4401-916f-fd7a4a3ea46e\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-rck5n" Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.337786 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eae3cd03-ce75-4cc2-95d2-32a63d34ba10-config\") pod \"kube-controller-manager-operator-78b949d7b-cj6jg\" (UID: \"eae3cd03-ce75-4cc2-95d2-32a63d34ba10\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-cj6jg" Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.337816 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xkclw\" (UniqueName: \"kubernetes.io/projected/21940f54-9029-467c-aaac-7580e44761a4-kube-api-access-xkclw\") pod \"openshift-controller-manager-operator-756b6f6bc6-nhqwm\" (UID: \"21940f54-9029-467c-aaac-7580e44761a4\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-nhqwm" Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.337871 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4dv7g\" (UniqueName: \"kubernetes.io/projected/9744e361-2c35-436c-a453-984cff9d923f-kube-api-access-4dv7g\") pod \"apiserver-76f77b778f-khskd\" (UID: \"9744e361-2c35-436c-a453-984cff9d923f\") " pod="openshift-apiserver/apiserver-76f77b778f-khskd" Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.337908 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-c6nph\" (UniqueName: \"kubernetes.io/projected/9c44f508-25ec-493c-ba35-8c6d4f0cf7ae-kube-api-access-c6nph\") pod \"kube-storage-version-migrator-operator-b67b599dd-khh5f\" (UID: \"9c44f508-25ec-493c-ba35-8c6d4f0cf7ae\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-khh5f" Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.337927 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rk4hb\" (UniqueName: \"kubernetes.io/projected/bae1e9aa-742e-4e00-ba4a-2edad6d45e95-kube-api-access-rk4hb\") pod \"package-server-manager-789f6589d5-p28h9\" (UID: \"bae1e9aa-742e-4e00-ba4a-2edad6d45e95\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-p28h9" Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.337950 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/9744e361-2c35-436c-a453-984cff9d923f-audit-dir\") pod \"apiserver-76f77b778f-khskd\" (UID: \"9744e361-2c35-436c-a453-984cff9d923f\") " pod="openshift-apiserver/apiserver-76f77b778f-khskd" Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.337967 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/eae3cd03-ce75-4cc2-95d2-32a63d34ba10-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-cj6jg\" (UID: \"eae3cd03-ce75-4cc2-95d2-32a63d34ba10\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-cj6jg" Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.337985 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/9a65caa3-afbb-4401-916f-fd7a4a3ea46e-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-rck5n\" (UID: 
\"9a65caa3-afbb-4401-916f-fd7a4a3ea46e\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-rck5n" Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.338008 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ba92c9fa-a7ba-4ae5-87be-4d0ded13bba1-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-dmdnz\" (UID: \"ba92c9fa-a7ba-4ae5-87be-4d0ded13bba1\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-dmdnz" Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.338017 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/9744e361-2c35-436c-a453-984cff9d923f-audit-dir\") pod \"apiserver-76f77b778f-khskd\" (UID: \"9744e361-2c35-436c-a453-984cff9d923f\") " pod="openshift-apiserver/apiserver-76f77b778f-khskd" Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.338030 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/b37b9b2e-05d2-434f-bd01-93cda5a05b52-default-certificate\") pod \"router-default-5444994796-tggrz\" (UID: \"b37b9b2e-05d2-434f-bd01-93cda5a05b52\") " pod="openshift-ingress/router-default-5444994796-tggrz" Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.338103 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/9744e361-2c35-436c-a453-984cff9d923f-etcd-client\") pod \"apiserver-76f77b778f-khskd\" (UID: \"9744e361-2c35-436c-a453-984cff9d923f\") " pod="openshift-apiserver/apiserver-76f77b778f-khskd" Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.338151 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/9a65caa3-afbb-4401-916f-fd7a4a3ea46e-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-rck5n\" (UID: \"9a65caa3-afbb-4401-916f-fd7a4a3ea46e\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-rck5n" Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.338187 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/9744e361-2c35-436c-a453-984cff9d923f-node-pullsecrets\") pod \"apiserver-76f77b778f-khskd\" (UID: \"9744e361-2c35-436c-a453-984cff9d923f\") " pod="openshift-apiserver/apiserver-76f77b778f-khskd" Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.338219 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9744e361-2c35-436c-a453-984cff9d923f-config\") pod \"apiserver-76f77b778f-khskd\" (UID: \"9744e361-2c35-436c-a453-984cff9d923f\") " pod="openshift-apiserver/apiserver-76f77b778f-khskd" Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.338242 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/9744e361-2c35-436c-a453-984cff9d923f-image-import-ca\") pod \"apiserver-76f77b778f-khskd\" (UID: \"9744e361-2c35-436c-a453-984cff9d923f\") " pod="openshift-apiserver/apiserver-76f77b778f-khskd" Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.338274 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/9744e361-2c35-436c-a453-984cff9d923f-audit\") pod \"apiserver-76f77b778f-khskd\" (UID: \"9744e361-2c35-436c-a453-984cff9d923f\") " pod="openshift-apiserver/apiserver-76f77b778f-khskd" Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.338295 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/9744e361-2c35-436c-a453-984cff9d923f-serving-cert\") pod \"apiserver-76f77b778f-khskd\" (UID: \"9744e361-2c35-436c-a453-984cff9d923f\") " pod="openshift-apiserver/apiserver-76f77b778f-khskd" Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.338329 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b37b9b2e-05d2-434f-bd01-93cda5a05b52-metrics-certs\") pod \"router-default-5444994796-tggrz\" (UID: \"b37b9b2e-05d2-434f-bd01-93cda5a05b52\") " pod="openshift-ingress/router-default-5444994796-tggrz" Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.338467 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/9744e361-2c35-436c-a453-984cff9d923f-node-pullsecrets\") pod \"apiserver-76f77b778f-khskd\" (UID: \"9744e361-2c35-436c-a453-984cff9d923f\") " pod="openshift-apiserver/apiserver-76f77b778f-khskd" Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.338891 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/21940f54-9029-467c-aaac-7580e44761a4-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-nhqwm\" (UID: \"21940f54-9029-467c-aaac-7580e44761a4\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-nhqwm" Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.339153 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.339596 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9a65caa3-afbb-4401-916f-fd7a4a3ea46e-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-rck5n\" (UID: 
\"9a65caa3-afbb-4401-916f-fd7a4a3ea46e\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-rck5n" Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.340826 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/b37b9b2e-05d2-434f-bd01-93cda5a05b52-default-certificate\") pod \"router-default-5444994796-tggrz\" (UID: \"b37b9b2e-05d2-434f-bd01-93cda5a05b52\") " pod="openshift-ingress/router-default-5444994796-tggrz" Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.341184 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b37b9b2e-05d2-434f-bd01-93cda5a05b52-metrics-certs\") pod \"router-default-5444994796-tggrz\" (UID: \"b37b9b2e-05d2-434f-bd01-93cda5a05b52\") " pod="openshift-ingress/router-default-5444994796-tggrz" Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.342708 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/9a65caa3-afbb-4401-916f-fd7a4a3ea46e-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-rck5n\" (UID: \"9a65caa3-afbb-4401-916f-fd7a4a3ea46e\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-rck5n" Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.347049 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/b37b9b2e-05d2-434f-bd01-93cda5a05b52-stats-auth\") pod \"router-default-5444994796-tggrz\" (UID: \"b37b9b2e-05d2-434f-bd01-93cda5a05b52\") " pod="openshift-ingress/router-default-5444994796-tggrz" Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.359431 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 
16:06:05.380233 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.400320 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.409822 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ba92c9fa-a7ba-4ae5-87be-4d0ded13bba1-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-dmdnz\" (UID: \"ba92c9fa-a7ba-4ae5-87be-4d0ded13bba1\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-dmdnz" Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.420747 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.426191 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ba92c9fa-a7ba-4ae5-87be-4d0ded13bba1-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-dmdnz\" (UID: \"ba92c9fa-a7ba-4ae5-87be-4d0ded13bba1\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-dmdnz" Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.440528 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.460199 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.468308 4707 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/eae3cd03-ce75-4cc2-95d2-32a63d34ba10-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-cj6jg\" (UID: \"eae3cd03-ce75-4cc2-95d2-32a63d34ba10\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-cj6jg" Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.481194 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.499725 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.509498 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eae3cd03-ce75-4cc2-95d2-32a63d34ba10-config\") pod \"kube-controller-manager-operator-78b949d7b-cj6jg\" (UID: \"eae3cd03-ce75-4cc2-95d2-32a63d34ba10\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-cj6jg" Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.523444 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.542393 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.566891 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.579029 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/9c44f508-25ec-493c-ba35-8c6d4f0cf7ae-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-khh5f\" (UID: \"9c44f508-25ec-493c-ba35-8c6d4f0cf7ae\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-khh5f" Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.581011 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.585523 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9c44f508-25ec-493c-ba35-8c6d4f0cf7ae-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-khh5f\" (UID: \"9c44f508-25ec-493c-ba35-8c6d4f0cf7ae\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-khh5f" Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.600582 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.621517 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.640987 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.681579 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.689233 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: 
\"kubernetes.io/configmap/9744e361-2c35-436c-a453-984cff9d923f-etcd-serving-ca\") pod \"apiserver-76f77b778f-khskd\" (UID: \"9744e361-2c35-436c-a453-984cff9d923f\") " pod="openshift-apiserver/apiserver-76f77b778f-khskd" Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.701235 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.721234 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.733044 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/9744e361-2c35-436c-a453-984cff9d923f-etcd-client\") pod \"apiserver-76f77b778f-khskd\" (UID: \"9744e361-2c35-436c-a453-984cff9d923f\") " pod="openshift-apiserver/apiserver-76f77b778f-khskd" Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.741229 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.753068 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9744e361-2c35-436c-a453-984cff9d923f-serving-cert\") pod \"apiserver-76f77b778f-khskd\" (UID: \"9744e361-2c35-436c-a453-984cff9d923f\") " pod="openshift-apiserver/apiserver-76f77b778f-khskd" Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.761471 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.769138 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/9744e361-2c35-436c-a453-984cff9d923f-encryption-config\") pod \"apiserver-76f77b778f-khskd\" (UID: 
\"9744e361-2c35-436c-a453-984cff9d923f\") " pod="openshift-apiserver/apiserver-76f77b778f-khskd" Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.781460 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.789705 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/9744e361-2c35-436c-a453-984cff9d923f-image-import-ca\") pod \"apiserver-76f77b778f-khskd\" (UID: \"9744e361-2c35-436c-a453-984cff9d923f\") " pod="openshift-apiserver/apiserver-76f77b778f-khskd" Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.812693 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.817265 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9744e361-2c35-436c-a453-984cff9d923f-trusted-ca-bundle\") pod \"apiserver-76f77b778f-khskd\" (UID: \"9744e361-2c35-436c-a453-984cff9d923f\") " pod="openshift-apiserver/apiserver-76f77b778f-khskd" Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.820709 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.840856 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.862004 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.870841 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9744e361-2c35-436c-a453-984cff9d923f-config\") pod 
\"apiserver-76f77b778f-khskd\" (UID: \"9744e361-2c35-436c-a453-984cff9d923f\") " pod="openshift-apiserver/apiserver-76f77b778f-khskd" Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.880621 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.890098 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/9744e361-2c35-436c-a453-984cff9d923f-audit\") pod \"apiserver-76f77b778f-khskd\" (UID: \"9744e361-2c35-436c-a453-984cff9d923f\") " pod="openshift-apiserver/apiserver-76f77b778f-khskd" Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.921720 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.934496 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/bae1e9aa-742e-4e00-ba4a-2edad6d45e95-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-p28h9\" (UID: \"bae1e9aa-742e-4e00-ba4a-2edad6d45e95\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-p28h9" Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.940683 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.960773 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Nov 27 16:06:05 crc kubenswrapper[4707]: I1127 16:06:05.989909 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Nov 27 16:06:06 crc kubenswrapper[4707]: I1127 
16:06:06.001825 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Nov 27 16:06:06 crc kubenswrapper[4707]: I1127 16:06:06.021515 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Nov 27 16:06:06 crc kubenswrapper[4707]: I1127 16:06:06.040992 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Nov 27 16:06:06 crc kubenswrapper[4707]: I1127 16:06:06.061588 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Nov 27 16:06:06 crc kubenswrapper[4707]: I1127 16:06:06.081960 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Nov 27 16:06:06 crc kubenswrapper[4707]: I1127 16:06:06.100466 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Nov 27 16:06:06 crc kubenswrapper[4707]: I1127 16:06:06.121513 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Nov 27 16:06:06 crc kubenswrapper[4707]: I1127 16:06:06.150877 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Nov 27 16:06:06 crc kubenswrapper[4707]: I1127 16:06:06.160634 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Nov 27 16:06:06 crc kubenswrapper[4707]: I1127 16:06:06.179426 4707 request.go:700] Waited for 1.000834116s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/configmaps?fieldSelector=metadata.name%3Dmarketplace-trusted-ca&limit=500&resourceVersion=0 Nov 27 16:06:06 crc kubenswrapper[4707]: I1127 16:06:06.192050 4707 reflector.go:368] Caches populated for 
*v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Nov 27 16:06:06 crc kubenswrapper[4707]: I1127 16:06:06.201187 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Nov 27 16:06:06 crc kubenswrapper[4707]: I1127 16:06:06.220358 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Nov 27 16:06:06 crc kubenswrapper[4707]: I1127 16:06:06.241340 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Nov 27 16:06:06 crc kubenswrapper[4707]: I1127 16:06:06.262962 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Nov 27 16:06:06 crc kubenswrapper[4707]: I1127 16:06:06.281602 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Nov 27 16:06:06 crc kubenswrapper[4707]: I1127 16:06:06.300931 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Nov 27 16:06:06 crc kubenswrapper[4707]: I1127 16:06:06.321563 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Nov 27 16:06:06 crc kubenswrapper[4707]: I1127 16:06:06.342458 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Nov 27 16:06:06 crc kubenswrapper[4707]: I1127 16:06:06.361335 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Nov 27 16:06:06 crc kubenswrapper[4707]: I1127 16:06:06.381033 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Nov 27 16:06:06 crc 
kubenswrapper[4707]: I1127 16:06:06.400319 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Nov 27 16:06:06 crc kubenswrapper[4707]: I1127 16:06:06.423207 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Nov 27 16:06:06 crc kubenswrapper[4707]: I1127 16:06:06.440629 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Nov 27 16:06:06 crc kubenswrapper[4707]: I1127 16:06:06.462224 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Nov 27 16:06:06 crc kubenswrapper[4707]: I1127 16:06:06.481707 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Nov 27 16:06:06 crc kubenswrapper[4707]: I1127 16:06:06.502417 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Nov 27 16:06:06 crc kubenswrapper[4707]: I1127 16:06:06.522688 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Nov 27 16:06:06 crc kubenswrapper[4707]: I1127 16:06:06.541052 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Nov 27 16:06:06 crc kubenswrapper[4707]: I1127 16:06:06.561608 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Nov 27 16:06:06 crc kubenswrapper[4707]: I1127 16:06:06.581266 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Nov 27 16:06:06 crc kubenswrapper[4707]: I1127 16:06:06.601551 4707 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-service-ca-operator"/"serving-cert" Nov 27 16:06:06 crc kubenswrapper[4707]: I1127 16:06:06.620773 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Nov 27 16:06:06 crc kubenswrapper[4707]: I1127 16:06:06.641541 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Nov 27 16:06:06 crc kubenswrapper[4707]: I1127 16:06:06.662312 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Nov 27 16:06:06 crc kubenswrapper[4707]: I1127 16:06:06.682251 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Nov 27 16:06:06 crc kubenswrapper[4707]: I1127 16:06:06.702443 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Nov 27 16:06:06 crc kubenswrapper[4707]: I1127 16:06:06.720305 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Nov 27 16:06:06 crc kubenswrapper[4707]: I1127 16:06:06.741795 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Nov 27 16:06:06 crc kubenswrapper[4707]: I1127 16:06:06.764007 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Nov 27 16:06:06 crc kubenswrapper[4707]: I1127 16:06:06.781205 4707 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Nov 27 16:06:06 crc kubenswrapper[4707]: I1127 16:06:06.801111 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Nov 27 
16:06:06 crc kubenswrapper[4707]: I1127 16:06:06.821969 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Nov 27 16:06:06 crc kubenswrapper[4707]: I1127 16:06:06.841908 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Nov 27 16:06:06 crc kubenswrapper[4707]: I1127 16:06:06.860857 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Nov 27 16:06:06 crc kubenswrapper[4707]: I1127 16:06:06.913925 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-986fb\" (UniqueName: \"kubernetes.io/projected/50e5ce2e-3776-4890-8b73-b0ae2b2d0237-kube-api-access-986fb\") pod \"downloads-7954f5f757-zdxsk\" (UID: \"50e5ce2e-3776-4890-8b73-b0ae2b2d0237\") " pod="openshift-console/downloads-7954f5f757-zdxsk" Nov 27 16:06:06 crc kubenswrapper[4707]: I1127 16:06:06.929288 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wndwp\" (UniqueName: \"kubernetes.io/projected/b735e79c-0093-4887-b27a-6e333d9d80a5-kube-api-access-wndwp\") pod \"apiserver-7bbb656c7d-hg2x5\" (UID: \"b735e79c-0093-4887-b27a-6e333d9d80a5\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-hg2x5" Nov 27 16:06:06 crc kubenswrapper[4707]: I1127 16:06:06.945185 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lldmp\" (UniqueName: \"kubernetes.io/projected/dad82366-6967-4fba-9d3d-763ba28e9a73-kube-api-access-lldmp\") pod \"machine-approver-56656f9798-xfj8l\" (UID: \"dad82366-6967-4fba-9d3d-763ba28e9a73\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-xfj8l" Nov 27 16:06:06 crc kubenswrapper[4707]: I1127 16:06:06.945616 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-xfj8l" Nov 27 16:06:06 crc kubenswrapper[4707]: I1127 16:06:06.955319 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-hg2x5" Nov 27 16:06:06 crc kubenswrapper[4707]: I1127 16:06:06.969624 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-brk88\" (UniqueName: \"kubernetes.io/projected/ecc19675-2c51-4a55-b870-7906093e3de2-kube-api-access-brk88\") pod \"cluster-samples-operator-665b6dd947-4sj7l\" (UID: \"ecc19675-2c51-4a55-b870-7906093e3de2\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-4sj7l" Nov 27 16:06:06 crc kubenswrapper[4707]: W1127 16:06:06.976576 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddad82366_6967_4fba_9d3d_763ba28e9a73.slice/crio-7df7a178961a47bc88271aaf9b97d5631a1293e1e0861d3857f6e94308a462b6 WatchSource:0}: Error finding container 7df7a178961a47bc88271aaf9b97d5631a1293e1e0861d3857f6e94308a462b6: Status 404 returned error can't find the container with id 7df7a178961a47bc88271aaf9b97d5631a1293e1e0861d3857f6e94308a462b6 Nov 27 16:06:06 crc kubenswrapper[4707]: I1127 16:06:06.980173 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xlb9g\" (UniqueName: \"kubernetes.io/projected/8f007d8e-df31-4bfd-8879-54a84ed4b62d-kube-api-access-xlb9g\") pod \"authentication-operator-69f744f599-kl4m7\" (UID: \"8f007d8e-df31-4bfd-8879-54a84ed4b62d\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-kl4m7" Nov 27 16:06:07 crc kubenswrapper[4707]: I1127 16:06:07.000846 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h2lh4\" (UniqueName: \"kubernetes.io/projected/a46480b6-6c0e-4502-81b7-e1c461ba7fa4-kube-api-access-h2lh4\") pod 
\"console-operator-58897d9998-qzltz\" (UID: \"a46480b6-6c0e-4502-81b7-e1c461ba7fa4\") " pod="openshift-console-operator/console-operator-58897d9998-qzltz" Nov 27 16:06:07 crc kubenswrapper[4707]: I1127 16:06:07.025584 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qkt6m\" (UniqueName: \"kubernetes.io/projected/b43ea736-a4b5-473d-a3a2-3d779a856a86-kube-api-access-qkt6m\") pod \"route-controller-manager-6576b87f9c-4h99l\" (UID: \"b43ea736-a4b5-473d-a3a2-3d779a856a86\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-4h99l" Nov 27 16:06:07 crc kubenswrapper[4707]: I1127 16:06:07.038407 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-78cn9\" (UniqueName: \"kubernetes.io/projected/8a947df3-154e-488f-9e1c-4a41cc94553c-kube-api-access-78cn9\") pod \"openshift-config-operator-7777fb866f-jqntr\" (UID: \"8a947df3-154e-488f-9e1c-4a41cc94553c\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-jqntr" Nov 27 16:06:07 crc kubenswrapper[4707]: I1127 16:06:07.043973 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-kl4m7" Nov 27 16:06:07 crc kubenswrapper[4707]: I1127 16:06:07.067843 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s5tvx\" (UniqueName: \"kubernetes.io/projected/78b8999d-9535-4584-baa0-5fd38838ac29-kube-api-access-s5tvx\") pod \"machine-api-operator-5694c8668f-thf5c\" (UID: \"78b8999d-9535-4584-baa0-5fd38838ac29\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-thf5c" Nov 27 16:06:07 crc kubenswrapper[4707]: I1127 16:06:07.087244 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nrzcj\" (UniqueName: \"kubernetes.io/projected/30208d57-8abd-4956-92a1-1b1aa21b754a-kube-api-access-nrzcj\") pod \"controller-manager-879f6c89f-gg89j\" (UID: \"30208d57-8abd-4956-92a1-1b1aa21b754a\") " pod="openshift-controller-manager/controller-manager-879f6c89f-gg89j" Nov 27 16:06:07 crc kubenswrapper[4707]: I1127 16:06:07.090944 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-thf5c" Nov 27 16:06:07 crc kubenswrapper[4707]: I1127 16:06:07.103156 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5zgc6\" (UniqueName: \"kubernetes.io/projected/3fdf4f4a-233e-409a-8cd7-dcd6efd5b2cf-kube-api-access-5zgc6\") pod \"oauth-openshift-558db77b4-jxgm7\" (UID: \"3fdf4f4a-233e-409a-8cd7-dcd6efd5b2cf\") " pod="openshift-authentication/oauth-openshift-558db77b4-jxgm7" Nov 27 16:06:07 crc kubenswrapper[4707]: I1127 16:06:07.117066 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tx9hq\" (UniqueName: \"kubernetes.io/projected/1a3a3427-44f4-4518-861e-f11f5cb76d90-kube-api-access-tx9hq\") pod \"dns-operator-744455d44c-ghzc4\" (UID: \"1a3a3427-44f4-4518-861e-f11f5cb76d90\") " pod="openshift-dns-operator/dns-operator-744455d44c-ghzc4" Nov 27 16:06:07 crc kubenswrapper[4707]: I1127 16:06:07.120818 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Nov 27 16:06:07 crc kubenswrapper[4707]: I1127 16:06:07.129181 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-qzltz" Nov 27 16:06:07 crc kubenswrapper[4707]: I1127 16:06:07.138915 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-zdxsk" Nov 27 16:06:07 crc kubenswrapper[4707]: I1127 16:06:07.141323 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Nov 27 16:06:07 crc kubenswrapper[4707]: I1127 16:06:07.156104 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-gg89j" Nov 27 16:06:07 crc kubenswrapper[4707]: I1127 16:06:07.160948 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Nov 27 16:06:07 crc kubenswrapper[4707]: I1127 16:06:07.181275 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Nov 27 16:06:07 crc kubenswrapper[4707]: I1127 16:06:07.194613 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-4h99l" Nov 27 16:06:07 crc kubenswrapper[4707]: I1127 16:06:07.198288 4707 request.go:700] Waited for 1.929958619s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-dns/configmaps?fieldSelector=metadata.name%3Ddns-default&limit=500&resourceVersion=0 Nov 27 16:06:07 crc kubenswrapper[4707]: I1127 16:06:07.201693 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Nov 27 16:06:07 crc kubenswrapper[4707]: I1127 16:06:07.207119 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-xfj8l" event={"ID":"dad82366-6967-4fba-9d3d-763ba28e9a73","Type":"ContainerStarted","Data":"7df7a178961a47bc88271aaf9b97d5631a1293e1e0861d3857f6e94308a462b6"} Nov 27 16:06:07 crc kubenswrapper[4707]: I1127 16:06:07.224066 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Nov 27 16:06:07 crc kubenswrapper[4707]: I1127 16:06:07.224839 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-hg2x5"] Nov 27 16:06:07 crc kubenswrapper[4707]: I1127 16:06:07.235616 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-4sj7l" Nov 27 16:06:07 crc kubenswrapper[4707]: I1127 16:06:07.240307 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Nov 27 16:06:07 crc kubenswrapper[4707]: I1127 16:06:07.248544 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-jqntr" Nov 27 16:06:07 crc kubenswrapper[4707]: I1127 16:06:07.273627 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-jxgm7" Nov 27 16:06:07 crc kubenswrapper[4707]: I1127 16:06:07.286357 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/9a65caa3-afbb-4401-916f-fd7a4a3ea46e-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-rck5n\" (UID: \"9a65caa3-afbb-4401-916f-fd7a4a3ea46e\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-rck5n" Nov 27 16:06:07 crc kubenswrapper[4707]: I1127 16:06:07.303098 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vbcfb\" (UniqueName: \"kubernetes.io/projected/9a65caa3-afbb-4401-916f-fd7a4a3ea46e-kube-api-access-vbcfb\") pod \"cluster-image-registry-operator-dc59b4c8b-rck5n\" (UID: \"9a65caa3-afbb-4401-916f-fd7a4a3ea46e\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-rck5n" Nov 27 16:06:07 crc kubenswrapper[4707]: I1127 16:06:07.322743 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rk4hb\" (UniqueName: \"kubernetes.io/projected/bae1e9aa-742e-4e00-ba4a-2edad6d45e95-kube-api-access-rk4hb\") pod \"package-server-manager-789f6589d5-p28h9\" (UID: \"bae1e9aa-742e-4e00-ba4a-2edad6d45e95\") " 
pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-p28h9" Nov 27 16:06:07 crc kubenswrapper[4707]: I1127 16:06:07.326835 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-ghzc4" Nov 27 16:06:07 crc kubenswrapper[4707]: I1127 16:06:07.329874 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-kl4m7"] Nov 27 16:06:07 crc kubenswrapper[4707]: I1127 16:06:07.335960 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4dv7g\" (UniqueName: \"kubernetes.io/projected/9744e361-2c35-436c-a453-984cff9d923f-kube-api-access-4dv7g\") pod \"apiserver-76f77b778f-khskd\" (UID: \"9744e361-2c35-436c-a453-984cff9d923f\") " pod="openshift-apiserver/apiserver-76f77b778f-khskd" Nov 27 16:06:07 crc kubenswrapper[4707]: I1127 16:06:07.356332 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c6nph\" (UniqueName: \"kubernetes.io/projected/9c44f508-25ec-493c-ba35-8c6d4f0cf7ae-kube-api-access-c6nph\") pod \"kube-storage-version-migrator-operator-b67b599dd-khh5f\" (UID: \"9c44f508-25ec-493c-ba35-8c6d4f0cf7ae\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-khh5f" Nov 27 16:06:07 crc kubenswrapper[4707]: I1127 16:06:07.360957 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-thf5c"] Nov 27 16:06:07 crc kubenswrapper[4707]: I1127 16:06:07.380559 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/eae3cd03-ce75-4cc2-95d2-32a63d34ba10-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-cj6jg\" (UID: \"eae3cd03-ce75-4cc2-95d2-32a63d34ba10\") " 
pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-cj6jg" Nov 27 16:06:07 crc kubenswrapper[4707]: I1127 16:06:07.404532 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xkclw\" (UniqueName: \"kubernetes.io/projected/21940f54-9029-467c-aaac-7580e44761a4-kube-api-access-xkclw\") pod \"openshift-controller-manager-operator-756b6f6bc6-nhqwm\" (UID: \"21940f54-9029-467c-aaac-7580e44761a4\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-nhqwm" Nov 27 16:06:07 crc kubenswrapper[4707]: I1127 16:06:07.417987 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ba92c9fa-a7ba-4ae5-87be-4d0ded13bba1-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-dmdnz\" (UID: \"ba92c9fa-a7ba-4ae5-87be-4d0ded13bba1\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-dmdnz" Nov 27 16:06:07 crc kubenswrapper[4707]: I1127 16:06:07.439418 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-rck5n" Nov 27 16:06:07 crc kubenswrapper[4707]: I1127 16:06:07.444561 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-749dv\" (UniqueName: \"kubernetes.io/projected/b37b9b2e-05d2-434f-bd01-93cda5a05b52-kube-api-access-749dv\") pod \"router-default-5444994796-tggrz\" (UID: \"b37b9b2e-05d2-434f-bd01-93cda5a05b52\") " pod="openshift-ingress/router-default-5444994796-tggrz" Nov 27 16:06:07 crc kubenswrapper[4707]: I1127 16:06:07.473254 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/b5e0fd66-2bb6-4b8a-aaf0-e24490a2f1a0-registry-certificates\") pod \"image-registry-697d97f7c8-kwvst\" (UID: \"b5e0fd66-2bb6-4b8a-aaf0-e24490a2f1a0\") " pod="openshift-image-registry/image-registry-697d97f7c8-kwvst" Nov 27 16:06:07 crc kubenswrapper[4707]: I1127 16:06:07.473670 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sqjhz\" (UniqueName: \"kubernetes.io/projected/36bf60c9-93cb-431f-9df1-1d3e245c49ef-kube-api-access-sqjhz\") pod \"console-f9d7485db-5r8tf\" (UID: \"36bf60c9-93cb-431f-9df1-1d3e245c49ef\") " pod="openshift-console/console-f9d7485db-5r8tf" Nov 27 16:06:07 crc kubenswrapper[4707]: I1127 16:06:07.473710 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/5e7a06d3-2413-4179-83b7-db23583f1c6d-etcd-client\") pod \"etcd-operator-b45778765-nhz7q\" (UID: \"5e7a06d3-2413-4179-83b7-db23583f1c6d\") " pod="openshift-etcd-operator/etcd-operator-b45778765-nhz7q" Nov 27 16:06:07 crc kubenswrapper[4707]: I1127 16:06:07.473728 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: 
\"kubernetes.io/secret/36bf60c9-93cb-431f-9df1-1d3e245c49ef-console-oauth-config\") pod \"console-f9d7485db-5r8tf\" (UID: \"36bf60c9-93cb-431f-9df1-1d3e245c49ef\") " pod="openshift-console/console-f9d7485db-5r8tf" Nov 27 16:06:07 crc kubenswrapper[4707]: I1127 16:06:07.473752 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kwvst\" (UID: \"b5e0fd66-2bb6-4b8a-aaf0-e24490a2f1a0\") " pod="openshift-image-registry/image-registry-697d97f7c8-kwvst" Nov 27 16:06:07 crc kubenswrapper[4707]: I1127 16:06:07.473768 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/b5e0fd66-2bb6-4b8a-aaf0-e24490a2f1a0-installation-pull-secrets\") pod \"image-registry-697d97f7c8-kwvst\" (UID: \"b5e0fd66-2bb6-4b8a-aaf0-e24490a2f1a0\") " pod="openshift-image-registry/image-registry-697d97f7c8-kwvst" Nov 27 16:06:07 crc kubenswrapper[4707]: I1127 16:06:07.473786 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5e7a06d3-2413-4179-83b7-db23583f1c6d-serving-cert\") pod \"etcd-operator-b45778765-nhz7q\" (UID: \"5e7a06d3-2413-4179-83b7-db23583f1c6d\") " pod="openshift-etcd-operator/etcd-operator-b45778765-nhz7q" Nov 27 16:06:07 crc kubenswrapper[4707]: I1127 16:06:07.473816 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7af50556-1505-45f2-b080-f1484a42f2cd-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-7k4zd\" (UID: \"7af50556-1505-45f2-b080-f1484a42f2cd\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-7k4zd" Nov 27 16:06:07 crc 
kubenswrapper[4707]: I1127 16:06:07.473836 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/5e7a06d3-2413-4179-83b7-db23583f1c6d-etcd-ca\") pod \"etcd-operator-b45778765-nhz7q\" (UID: \"5e7a06d3-2413-4179-83b7-db23583f1c6d\") " pod="openshift-etcd-operator/etcd-operator-b45778765-nhz7q" Nov 27 16:06:07 crc kubenswrapper[4707]: I1127 16:06:07.473853 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vqbc6\" (UniqueName: \"kubernetes.io/projected/9c2fdb4e-b99d-4dc6-b5a0-5f5c38954e7d-kube-api-access-vqbc6\") pod \"catalog-operator-68c6474976-6m8wz\" (UID: \"9c2fdb4e-b99d-4dc6-b5a0-5f5c38954e7d\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-6m8wz" Nov 27 16:06:07 crc kubenswrapper[4707]: I1127 16:06:07.473883 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/36bf60c9-93cb-431f-9df1-1d3e245c49ef-trusted-ca-bundle\") pod \"console-f9d7485db-5r8tf\" (UID: \"36bf60c9-93cb-431f-9df1-1d3e245c49ef\") " pod="openshift-console/console-f9d7485db-5r8tf" Nov 27 16:06:07 crc kubenswrapper[4707]: I1127 16:06:07.473901 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7af50556-1505-45f2-b080-f1484a42f2cd-config\") pod \"kube-apiserver-operator-766d6c64bb-7k4zd\" (UID: \"7af50556-1505-45f2-b080-f1484a42f2cd\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-7k4zd" Nov 27 16:06:07 crc kubenswrapper[4707]: I1127 16:06:07.473917 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/b5e0fd66-2bb6-4b8a-aaf0-e24490a2f1a0-registry-tls\") pod 
\"image-registry-697d97f7c8-kwvst\" (UID: \"b5e0fd66-2bb6-4b8a-aaf0-e24490a2f1a0\") " pod="openshift-image-registry/image-registry-697d97f7c8-kwvst" Nov 27 16:06:07 crc kubenswrapper[4707]: I1127 16:06:07.473934 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5e7a06d3-2413-4179-83b7-db23583f1c6d-config\") pod \"etcd-operator-b45778765-nhz7q\" (UID: \"5e7a06d3-2413-4179-83b7-db23583f1c6d\") " pod="openshift-etcd-operator/etcd-operator-b45778765-nhz7q" Nov 27 16:06:07 crc kubenswrapper[4707]: I1127 16:06:07.473949 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/9c2fdb4e-b99d-4dc6-b5a0-5f5c38954e7d-profile-collector-cert\") pod \"catalog-operator-68c6474976-6m8wz\" (UID: \"9c2fdb4e-b99d-4dc6-b5a0-5f5c38954e7d\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-6m8wz" Nov 27 16:06:07 crc kubenswrapper[4707]: I1127 16:06:07.473976 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5t8dm\" (UniqueName: \"kubernetes.io/projected/e8027394-2524-45df-8cdc-967024215d25-kube-api-access-5t8dm\") pod \"control-plane-machine-set-operator-78cbb6b69f-h746w\" (UID: \"e8027394-2524-45df-8cdc-967024215d25\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-h746w" Nov 27 16:06:07 crc kubenswrapper[4707]: I1127 16:06:07.474005 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hp5rn\" (UniqueName: \"kubernetes.io/projected/4bd96a64-1d3d-464d-a386-26a39642ee24-kube-api-access-hp5rn\") pod \"ingress-operator-5b745b69d9-pfqjd\" (UID: \"4bd96a64-1d3d-464d-a386-26a39642ee24\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-pfqjd" Nov 27 16:06:07 crc kubenswrapper[4707]: I1127 
16:06:07.474033 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b5e0fd66-2bb6-4b8a-aaf0-e24490a2f1a0-trusted-ca\") pod \"image-registry-697d97f7c8-kwvst\" (UID: \"b5e0fd66-2bb6-4b8a-aaf0-e24490a2f1a0\") " pod="openshift-image-registry/image-registry-697d97f7c8-kwvst" Nov 27 16:06:07 crc kubenswrapper[4707]: I1127 16:06:07.474049 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/4bd96a64-1d3d-464d-a386-26a39642ee24-metrics-tls\") pod \"ingress-operator-5b745b69d9-pfqjd\" (UID: \"4bd96a64-1d3d-464d-a386-26a39642ee24\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-pfqjd" Nov 27 16:06:07 crc kubenswrapper[4707]: I1127 16:06:07.474065 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/36bf60c9-93cb-431f-9df1-1d3e245c49ef-console-serving-cert\") pod \"console-f9d7485db-5r8tf\" (UID: \"36bf60c9-93cb-431f-9df1-1d3e245c49ef\") " pod="openshift-console/console-f9d7485db-5r8tf" Nov 27 16:06:07 crc kubenswrapper[4707]: I1127 16:06:07.474093 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/5e7a06d3-2413-4179-83b7-db23583f1c6d-etcd-service-ca\") pod \"etcd-operator-b45778765-nhz7q\" (UID: \"5e7a06d3-2413-4179-83b7-db23583f1c6d\") " pod="openshift-etcd-operator/etcd-operator-b45778765-nhz7q" Nov 27 16:06:07 crc kubenswrapper[4707]: E1127 16:06:07.474118 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2025-11-27 16:06:07.974099048 +0000 UTC m=+143.605547816 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kwvst" (UID: "b5e0fd66-2bb6-4b8a-aaf0-e24490a2f1a0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 27 16:06:07 crc kubenswrapper[4707]: I1127 16:06:07.474211 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dpqnb\" (UniqueName: \"kubernetes.io/projected/b5e0fd66-2bb6-4b8a-aaf0-e24490a2f1a0-kube-api-access-dpqnb\") pod \"image-registry-697d97f7c8-kwvst\" (UID: \"b5e0fd66-2bb6-4b8a-aaf0-e24490a2f1a0\") " pod="openshift-image-registry/image-registry-697d97f7c8-kwvst" Nov 27 16:06:07 crc kubenswrapper[4707]: I1127 16:06:07.474244 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4bd96a64-1d3d-464d-a386-26a39642ee24-trusted-ca\") pod \"ingress-operator-5b745b69d9-pfqjd\" (UID: \"4bd96a64-1d3d-464d-a386-26a39642ee24\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-pfqjd" Nov 27 16:06:07 crc kubenswrapper[4707]: I1127 16:06:07.474285 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/7af50556-1505-45f2-b080-f1484a42f2cd-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-7k4zd\" (UID: \"7af50556-1505-45f2-b080-f1484a42f2cd\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-7k4zd" Nov 27 16:06:07 crc kubenswrapper[4707]: I1127 16:06:07.474307 4707 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-46gb4\" (UniqueName: \"kubernetes.io/projected/e084e8d6-5275-426c-9be4-c5f4ee49abef-kube-api-access-46gb4\") pod \"migrator-59844c95c7-dlchm\" (UID: \"e084e8d6-5275-426c-9be4-c5f4ee49abef\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-dlchm" Nov 27 16:06:07 crc kubenswrapper[4707]: I1127 16:06:07.474323 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/b5e0fd66-2bb6-4b8a-aaf0-e24490a2f1a0-ca-trust-extracted\") pod \"image-registry-697d97f7c8-kwvst\" (UID: \"b5e0fd66-2bb6-4b8a-aaf0-e24490a2f1a0\") " pod="openshift-image-registry/image-registry-697d97f7c8-kwvst" Nov 27 16:06:07 crc kubenswrapper[4707]: I1127 16:06:07.474404 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5f4261e0-d038-4c38-8a9f-27c22bb95d1c-proxy-tls\") pod \"machine-config-controller-84d6567774-6t7cw\" (UID: \"5f4261e0-d038-4c38-8a9f-27c22bb95d1c\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-6t7cw" Nov 27 16:06:07 crc kubenswrapper[4707]: I1127 16:06:07.474427 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/e8027394-2524-45df-8cdc-967024215d25-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-h746w\" (UID: \"e8027394-2524-45df-8cdc-967024215d25\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-h746w" Nov 27 16:06:07 crc kubenswrapper[4707]: I1127 16:06:07.474445 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: 
\"kubernetes.io/projected/b5e0fd66-2bb6-4b8a-aaf0-e24490a2f1a0-bound-sa-token\") pod \"image-registry-697d97f7c8-kwvst\" (UID: \"b5e0fd66-2bb6-4b8a-aaf0-e24490a2f1a0\") " pod="openshift-image-registry/image-registry-697d97f7c8-kwvst" Nov 27 16:06:07 crc kubenswrapper[4707]: I1127 16:06:07.474468 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/36bf60c9-93cb-431f-9df1-1d3e245c49ef-service-ca\") pod \"console-f9d7485db-5r8tf\" (UID: \"36bf60c9-93cb-431f-9df1-1d3e245c49ef\") " pod="openshift-console/console-f9d7485db-5r8tf" Nov 27 16:06:07 crc kubenswrapper[4707]: I1127 16:06:07.474996 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/36bf60c9-93cb-431f-9df1-1d3e245c49ef-oauth-serving-cert\") pod \"console-f9d7485db-5r8tf\" (UID: \"36bf60c9-93cb-431f-9df1-1d3e245c49ef\") " pod="openshift-console/console-f9d7485db-5r8tf" Nov 27 16:06:07 crc kubenswrapper[4707]: I1127 16:06:07.475033 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ss92v\" (UniqueName: \"kubernetes.io/projected/5f4261e0-d038-4c38-8a9f-27c22bb95d1c-kube-api-access-ss92v\") pod \"machine-config-controller-84d6567774-6t7cw\" (UID: \"5f4261e0-d038-4c38-8a9f-27c22bb95d1c\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-6t7cw" Nov 27 16:06:07 crc kubenswrapper[4707]: I1127 16:06:07.475058 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/4bd96a64-1d3d-464d-a386-26a39642ee24-bound-sa-token\") pod \"ingress-operator-5b745b69d9-pfqjd\" (UID: \"4bd96a64-1d3d-464d-a386-26a39642ee24\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-pfqjd" Nov 27 16:06:07 crc kubenswrapper[4707]: 
I1127 16:06:07.475146 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/5f4261e0-d038-4c38-8a9f-27c22bb95d1c-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-6t7cw\" (UID: \"5f4261e0-d038-4c38-8a9f-27c22bb95d1c\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-6t7cw" Nov 27 16:06:07 crc kubenswrapper[4707]: I1127 16:06:07.475762 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/36bf60c9-93cb-431f-9df1-1d3e245c49ef-console-config\") pod \"console-f9d7485db-5r8tf\" (UID: \"36bf60c9-93cb-431f-9df1-1d3e245c49ef\") " pod="openshift-console/console-f9d7485db-5r8tf" Nov 27 16:06:07 crc kubenswrapper[4707]: I1127 16:06:07.475802 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/9c2fdb4e-b99d-4dc6-b5a0-5f5c38954e7d-srv-cert\") pod \"catalog-operator-68c6474976-6m8wz\" (UID: \"9c2fdb4e-b99d-4dc6-b5a0-5f5c38954e7d\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-6m8wz" Nov 27 16:06:07 crc kubenswrapper[4707]: I1127 16:06:07.475844 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x679f\" (UniqueName: \"kubernetes.io/projected/5e7a06d3-2413-4179-83b7-db23583f1c6d-kube-api-access-x679f\") pod \"etcd-operator-b45778765-nhz7q\" (UID: \"5e7a06d3-2413-4179-83b7-db23583f1c6d\") " pod="openshift-etcd-operator/etcd-operator-b45778765-nhz7q" Nov 27 16:06:07 crc kubenswrapper[4707]: I1127 16:06:07.476010 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-dmdnz" Nov 27 16:06:07 crc kubenswrapper[4707]: I1127 16:06:07.486142 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-cj6jg" Nov 27 16:06:07 crc kubenswrapper[4707]: I1127 16:06:07.500119 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-khh5f" Nov 27 16:06:07 crc kubenswrapper[4707]: I1127 16:06:07.510410 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-khskd" Nov 27 16:06:07 crc kubenswrapper[4707]: I1127 16:06:07.523681 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-p28h9" Nov 27 16:06:07 crc kubenswrapper[4707]: I1127 16:06:07.576755 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 27 16:06:07 crc kubenswrapper[4707]: E1127 16:06:07.576939 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-27 16:06:08.076908386 +0000 UTC m=+143.708357154 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 27 16:06:07 crc kubenswrapper[4707]: I1127 16:06:07.576989 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ss92v\" (UniqueName: \"kubernetes.io/projected/5f4261e0-d038-4c38-8a9f-27c22bb95d1c-kube-api-access-ss92v\") pod \"machine-config-controller-84d6567774-6t7cw\" (UID: \"5f4261e0-d038-4c38-8a9f-27c22bb95d1c\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-6t7cw" Nov 27 16:06:07 crc kubenswrapper[4707]: I1127 16:06:07.577038 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/eab64752-efb6-4d5d-89b2-43da02ae599f-proxy-tls\") pod \"machine-config-operator-74547568cd-lrj4r\" (UID: \"eab64752-efb6-4d5d-89b2-43da02ae599f\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-lrj4r" Nov 27 16:06:07 crc kubenswrapper[4707]: I1127 16:06:07.577083 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qn8sg\" (UniqueName: \"kubernetes.io/projected/ce9dffbd-e000-476b-b89b-6208d0506f26-kube-api-access-qn8sg\") pod \"service-ca-9c57cc56f-d8nfn\" (UID: \"ce9dffbd-e000-476b-b89b-6208d0506f26\") " pod="openshift-service-ca/service-ca-9c57cc56f-d8nfn" Nov 27 16:06:07 crc kubenswrapper[4707]: I1127 16:06:07.577116 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: 
\"kubernetes.io/projected/4bd96a64-1d3d-464d-a386-26a39642ee24-bound-sa-token\") pod \"ingress-operator-5b745b69d9-pfqjd\" (UID: \"4bd96a64-1d3d-464d-a386-26a39642ee24\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-pfqjd" Nov 27 16:06:07 crc kubenswrapper[4707]: I1127 16:06:07.577182 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/5f4261e0-d038-4c38-8a9f-27c22bb95d1c-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-6t7cw\" (UID: \"5f4261e0-d038-4c38-8a9f-27c22bb95d1c\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-6t7cw" Nov 27 16:06:07 crc kubenswrapper[4707]: I1127 16:06:07.577206 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kpn4n\" (UniqueName: \"kubernetes.io/projected/eab64752-efb6-4d5d-89b2-43da02ae599f-kube-api-access-kpn4n\") pod \"machine-config-operator-74547568cd-lrj4r\" (UID: \"eab64752-efb6-4d5d-89b2-43da02ae599f\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-lrj4r" Nov 27 16:06:07 crc kubenswrapper[4707]: I1127 16:06:07.577238 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/36bf60c9-93cb-431f-9df1-1d3e245c49ef-console-config\") pod \"console-f9d7485db-5r8tf\" (UID: \"36bf60c9-93cb-431f-9df1-1d3e245c49ef\") " pod="openshift-console/console-f9d7485db-5r8tf" Nov 27 16:06:07 crc kubenswrapper[4707]: I1127 16:06:07.577258 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/9c2fdb4e-b99d-4dc6-b5a0-5f5c38954e7d-srv-cert\") pod \"catalog-operator-68c6474976-6m8wz\" (UID: \"9c2fdb4e-b99d-4dc6-b5a0-5f5c38954e7d\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-6m8wz" Nov 27 16:06:07 crc 
kubenswrapper[4707]: I1127 16:06:07.577297 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x679f\" (UniqueName: \"kubernetes.io/projected/5e7a06d3-2413-4179-83b7-db23583f1c6d-kube-api-access-x679f\") pod \"etcd-operator-b45778765-nhz7q\" (UID: \"5e7a06d3-2413-4179-83b7-db23583f1c6d\") " pod="openshift-etcd-operator/etcd-operator-b45778765-nhz7q" Nov 27 16:06:07 crc kubenswrapper[4707]: I1127 16:06:07.577337 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/24bf08cf-63a0-47ca-be2a-0f38a51109c9-socket-dir\") pod \"csi-hostpathplugin-slj9l\" (UID: \"24bf08cf-63a0-47ca-be2a-0f38a51109c9\") " pod="hostpath-provisioner/csi-hostpathplugin-slj9l" Nov 27 16:06:07 crc kubenswrapper[4707]: I1127 16:06:07.577394 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/b5e0fd66-2bb6-4b8a-aaf0-e24490a2f1a0-registry-certificates\") pod \"image-registry-697d97f7c8-kwvst\" (UID: \"b5e0fd66-2bb6-4b8a-aaf0-e24490a2f1a0\") " pod="openshift-image-registry/image-registry-697d97f7c8-kwvst" Nov 27 16:06:07 crc kubenswrapper[4707]: I1127 16:06:07.577414 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/90d768c2-2c32-41a1-b661-763e7c027a94-config-volume\") pod \"dns-default-zg2ss\" (UID: \"90d768c2-2c32-41a1-b661-763e7c027a94\") " pod="openshift-dns/dns-default-zg2ss" Nov 27 16:06:07 crc kubenswrapper[4707]: I1127 16:06:07.577433 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sqjhz\" (UniqueName: \"kubernetes.io/projected/36bf60c9-93cb-431f-9df1-1d3e245c49ef-kube-api-access-sqjhz\") pod \"console-f9d7485db-5r8tf\" (UID: \"36bf60c9-93cb-431f-9df1-1d3e245c49ef\") " 
pod="openshift-console/console-f9d7485db-5r8tf" Nov 27 16:06:07 crc kubenswrapper[4707]: I1127 16:06:07.577451 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f4z89\" (UniqueName: \"kubernetes.io/projected/f55dc6de-bb5d-4221-a670-4b65c3992031-kube-api-access-f4z89\") pod \"collect-profiles-29404320-sxj55\" (UID: \"f55dc6de-bb5d-4221-a670-4b65c3992031\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29404320-sxj55" Nov 27 16:06:07 crc kubenswrapper[4707]: I1127 16:06:07.578086 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/5e7a06d3-2413-4179-83b7-db23583f1c6d-etcd-client\") pod \"etcd-operator-b45778765-nhz7q\" (UID: \"5e7a06d3-2413-4179-83b7-db23583f1c6d\") " pod="openshift-etcd-operator/etcd-operator-b45778765-nhz7q" Nov 27 16:06:07 crc kubenswrapper[4707]: I1127 16:06:07.578118 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/36bf60c9-93cb-431f-9df1-1d3e245c49ef-console-oauth-config\") pod \"console-f9d7485db-5r8tf\" (UID: \"36bf60c9-93cb-431f-9df1-1d3e245c49ef\") " pod="openshift-console/console-f9d7485db-5r8tf" Nov 27 16:06:07 crc kubenswrapper[4707]: I1127 16:06:07.578214 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/5f4261e0-d038-4c38-8a9f-27c22bb95d1c-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-6t7cw\" (UID: \"5f4261e0-d038-4c38-8a9f-27c22bb95d1c\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-6t7cw" Nov 27 16:06:07 crc kubenswrapper[4707]: I1127 16:06:07.578714 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: 
\"kubernetes.io/configmap/b5e0fd66-2bb6-4b8a-aaf0-e24490a2f1a0-registry-certificates\") pod \"image-registry-697d97f7c8-kwvst\" (UID: \"b5e0fd66-2bb6-4b8a-aaf0-e24490a2f1a0\") " pod="openshift-image-registry/image-registry-697d97f7c8-kwvst" Nov 27 16:06:07 crc kubenswrapper[4707]: I1127 16:06:07.579930 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kwvst\" (UID: \"b5e0fd66-2bb6-4b8a-aaf0-e24490a2f1a0\") " pod="openshift-image-registry/image-registry-697d97f7c8-kwvst" Nov 27 16:06:07 crc kubenswrapper[4707]: I1127 16:06:07.580027 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/b5e0fd66-2bb6-4b8a-aaf0-e24490a2f1a0-installation-pull-secrets\") pod \"image-registry-697d97f7c8-kwvst\" (UID: \"b5e0fd66-2bb6-4b8a-aaf0-e24490a2f1a0\") " pod="openshift-image-registry/image-registry-697d97f7c8-kwvst" Nov 27 16:06:07 crc kubenswrapper[4707]: I1127 16:06:07.580137 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5e7a06d3-2413-4179-83b7-db23583f1c6d-serving-cert\") pod \"etcd-operator-b45778765-nhz7q\" (UID: \"5e7a06d3-2413-4179-83b7-db23583f1c6d\") " pod="openshift-etcd-operator/etcd-operator-b45778765-nhz7q" Nov 27 16:06:07 crc kubenswrapper[4707]: E1127 16:06:07.580339 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-27 16:06:08.080325113 +0000 UTC m=+143.711773881 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kwvst" (UID: "b5e0fd66-2bb6-4b8a-aaf0-e24490a2f1a0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 27 16:06:07 crc kubenswrapper[4707]: I1127 16:06:07.580695 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-28h8m\" (UniqueName: \"kubernetes.io/projected/789119b0-3180-49e9-8d16-c60f968bf6cf-kube-api-access-28h8m\") pod \"openshift-apiserver-operator-796bbdcf4f-74mzf\" (UID: \"789119b0-3180-49e9-8d16-c60f968bf6cf\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-74mzf" Nov 27 16:06:07 crc kubenswrapper[4707]: I1127 16:06:07.580813 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7af50556-1505-45f2-b080-f1484a42f2cd-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-7k4zd\" (UID: \"7af50556-1505-45f2-b080-f1484a42f2cd\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-7k4zd" Nov 27 16:06:07 crc kubenswrapper[4707]: I1127 16:06:07.581250 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/efcebfb2-d130-4f47-a514-01d1bf5eb567-apiservice-cert\") pod \"packageserver-d55dfcdfc-h4m82\" (UID: \"efcebfb2-d130-4f47-a514-01d1bf5eb567\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-h4m82" Nov 27 16:06:07 crc kubenswrapper[4707]: I1127 16:06:07.581338 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: 
\"kubernetes.io/secret/efcebfb2-d130-4f47-a514-01d1bf5eb567-webhook-cert\") pod \"packageserver-d55dfcdfc-h4m82\" (UID: \"efcebfb2-d130-4f47-a514-01d1bf5eb567\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-h4m82" Nov 27 16:06:07 crc kubenswrapper[4707]: I1127 16:06:07.581739 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/5e7a06d3-2413-4179-83b7-db23583f1c6d-etcd-ca\") pod \"etcd-operator-b45778765-nhz7q\" (UID: \"5e7a06d3-2413-4179-83b7-db23583f1c6d\") " pod="openshift-etcd-operator/etcd-operator-b45778765-nhz7q" Nov 27 16:06:07 crc kubenswrapper[4707]: I1127 16:06:07.581841 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/789119b0-3180-49e9-8d16-c60f968bf6cf-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-74mzf\" (UID: \"789119b0-3180-49e9-8d16-c60f968bf6cf\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-74mzf" Nov 27 16:06:07 crc kubenswrapper[4707]: I1127 16:06:07.581942 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vqbc6\" (UniqueName: \"kubernetes.io/projected/9c2fdb4e-b99d-4dc6-b5a0-5f5c38954e7d-kube-api-access-vqbc6\") pod \"catalog-operator-68c6474976-6m8wz\" (UID: \"9c2fdb4e-b99d-4dc6-b5a0-5f5c38954e7d\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-6m8wz" Nov 27 16:06:07 crc kubenswrapper[4707]: I1127 16:06:07.582004 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5fkvx\" (UniqueName: \"kubernetes.io/projected/2d3126ab-3db9-4fa7-95ef-673a79b2178a-kube-api-access-5fkvx\") pod \"olm-operator-6b444d44fb-49hfs\" (UID: \"2d3126ab-3db9-4fa7-95ef-673a79b2178a\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-49hfs" Nov 27 16:06:07 crc 
kubenswrapper[4707]: I1127 16:06:07.582353 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/36bf60c9-93cb-431f-9df1-1d3e245c49ef-trusted-ca-bundle\") pod \"console-f9d7485db-5r8tf\" (UID: \"36bf60c9-93cb-431f-9df1-1d3e245c49ef\") " pod="openshift-console/console-f9d7485db-5r8tf" Nov 27 16:06:07 crc kubenswrapper[4707]: I1127 16:06:07.582510 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7af50556-1505-45f2-b080-f1484a42f2cd-config\") pod \"kube-apiserver-operator-766d6c64bb-7k4zd\" (UID: \"7af50556-1505-45f2-b080-f1484a42f2cd\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-7k4zd" Nov 27 16:06:07 crc kubenswrapper[4707]: I1127 16:06:07.582566 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/4fb5c374-18c5-433f-bb2c-f03d8eea8de6-node-bootstrap-token\") pod \"machine-config-server-85rvh\" (UID: \"4fb5c374-18c5-433f-bb2c-f03d8eea8de6\") " pod="openshift-machine-config-operator/machine-config-server-85rvh" Nov 27 16:06:07 crc kubenswrapper[4707]: I1127 16:06:07.582588 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/5450b16d-7dca-4ba0-8184-bb0b4a19dc71-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-72lr4\" (UID: \"5450b16d-7dca-4ba0-8184-bb0b4a19dc71\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-72lr4" Nov 27 16:06:07 crc kubenswrapper[4707]: I1127 16:06:07.582609 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f2tbm\" (UniqueName: \"kubernetes.io/projected/efcebfb2-d130-4f47-a514-01d1bf5eb567-kube-api-access-f2tbm\") pod 
\"packageserver-d55dfcdfc-h4m82\" (UID: \"efcebfb2-d130-4f47-a514-01d1bf5eb567\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-h4m82" Nov 27 16:06:07 crc kubenswrapper[4707]: I1127 16:06:07.582807 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/5e7a06d3-2413-4179-83b7-db23583f1c6d-etcd-ca\") pod \"etcd-operator-b45778765-nhz7q\" (UID: \"5e7a06d3-2413-4179-83b7-db23583f1c6d\") " pod="openshift-etcd-operator/etcd-operator-b45778765-nhz7q" Nov 27 16:06:07 crc kubenswrapper[4707]: I1127 16:06:07.583285 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/b5e0fd66-2bb6-4b8a-aaf0-e24490a2f1a0-registry-tls\") pod \"image-registry-697d97f7c8-kwvst\" (UID: \"b5e0fd66-2bb6-4b8a-aaf0-e24490a2f1a0\") " pod="openshift-image-registry/image-registry-697d97f7c8-kwvst" Nov 27 16:06:07 crc kubenswrapper[4707]: I1127 16:06:07.583547 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7af50556-1505-45f2-b080-f1484a42f2cd-config\") pod \"kube-apiserver-operator-766d6c64bb-7k4zd\" (UID: \"7af50556-1505-45f2-b080-f1484a42f2cd\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-7k4zd" Nov 27 16:06:07 crc kubenswrapper[4707]: I1127 16:06:07.583316 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5e7a06d3-2413-4179-83b7-db23583f1c6d-config\") pod \"etcd-operator-b45778765-nhz7q\" (UID: \"5e7a06d3-2413-4179-83b7-db23583f1c6d\") " pod="openshift-etcd-operator/etcd-operator-b45778765-nhz7q" Nov 27 16:06:07 crc kubenswrapper[4707]: I1127 16:06:07.584015 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: 
\"kubernetes.io/secret/9c2fdb4e-b99d-4dc6-b5a0-5f5c38954e7d-profile-collector-cert\") pod \"catalog-operator-68c6474976-6m8wz\" (UID: \"9c2fdb4e-b99d-4dc6-b5a0-5f5c38954e7d\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-6m8wz" Nov 27 16:06:07 crc kubenswrapper[4707]: I1127 16:06:07.584074 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wjbwn\" (UniqueName: \"kubernetes.io/projected/5450b16d-7dca-4ba0-8184-bb0b4a19dc71-kube-api-access-wjbwn\") pod \"multus-admission-controller-857f4d67dd-72lr4\" (UID: \"5450b16d-7dca-4ba0-8184-bb0b4a19dc71\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-72lr4" Nov 27 16:06:07 crc kubenswrapper[4707]: I1127 16:06:07.584116 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zgcht\" (UniqueName: \"kubernetes.io/projected/33d637be-496f-43c5-bb76-b742f8cc97ac-kube-api-access-zgcht\") pod \"ingress-canary-zgfxc\" (UID: \"33d637be-496f-43c5-bb76-b742f8cc97ac\") " pod="openshift-ingress-canary/ingress-canary-zgfxc" Nov 27 16:06:07 crc kubenswrapper[4707]: I1127 16:06:07.584147 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/d9ba519a-a9a8-4d1a-96c1-e66ee87d5c8f-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-jtqhs\" (UID: \"d9ba519a-a9a8-4d1a-96c1-e66ee87d5c8f\") " pod="openshift-marketplace/marketplace-operator-79b997595-jtqhs" Nov 27 16:06:07 crc kubenswrapper[4707]: I1127 16:06:07.584180 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5t8dm\" (UniqueName: \"kubernetes.io/projected/e8027394-2524-45df-8cdc-967024215d25-kube-api-access-5t8dm\") pod \"control-plane-machine-set-operator-78cbb6b69f-h746w\" (UID: \"e8027394-2524-45df-8cdc-967024215d25\") " 
pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-h746w" Nov 27 16:06:07 crc kubenswrapper[4707]: I1127 16:06:07.584213 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/24bf08cf-63a0-47ca-be2a-0f38a51109c9-mountpoint-dir\") pod \"csi-hostpathplugin-slj9l\" (UID: \"24bf08cf-63a0-47ca-be2a-0f38a51109c9\") " pod="hostpath-provisioner/csi-hostpathplugin-slj9l" Nov 27 16:06:07 crc kubenswrapper[4707]: I1127 16:06:07.584241 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eb6211ec-62fd-4a7a-b220-87ce050b996e-config\") pod \"service-ca-operator-777779d784-xbpq4\" (UID: \"eb6211ec-62fd-4a7a-b220-87ce050b996e\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-xbpq4" Nov 27 16:06:07 crc kubenswrapper[4707]: I1127 16:06:07.584289 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/eab64752-efb6-4d5d-89b2-43da02ae599f-images\") pod \"machine-config-operator-74547568cd-lrj4r\" (UID: \"eab64752-efb6-4d5d-89b2-43da02ae599f\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-lrj4r" Nov 27 16:06:07 crc kubenswrapper[4707]: I1127 16:06:07.584312 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/eab64752-efb6-4d5d-89b2-43da02ae599f-auth-proxy-config\") pod \"machine-config-operator-74547568cd-lrj4r\" (UID: \"eab64752-efb6-4d5d-89b2-43da02ae599f\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-lrj4r" Nov 27 16:06:07 crc kubenswrapper[4707]: I1127 16:06:07.584332 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" 
(UniqueName: \"kubernetes.io/secret/4fb5c374-18c5-433f-bb2c-f03d8eea8de6-certs\") pod \"machine-config-server-85rvh\" (UID: \"4fb5c374-18c5-433f-bb2c-f03d8eea8de6\") " pod="openshift-machine-config-operator/machine-config-server-85rvh" Nov 27 16:06:07 crc kubenswrapper[4707]: I1127 16:06:07.584354 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hp5rn\" (UniqueName: \"kubernetes.io/projected/4bd96a64-1d3d-464d-a386-26a39642ee24-kube-api-access-hp5rn\") pod \"ingress-operator-5b745b69d9-pfqjd\" (UID: \"4bd96a64-1d3d-464d-a386-26a39642ee24\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-pfqjd" Nov 27 16:06:07 crc kubenswrapper[4707]: I1127 16:06:07.584411 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f55dc6de-bb5d-4221-a670-4b65c3992031-secret-volume\") pod \"collect-profiles-29404320-sxj55\" (UID: \"f55dc6de-bb5d-4221-a670-4b65c3992031\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29404320-sxj55" Nov 27 16:06:07 crc kubenswrapper[4707]: I1127 16:06:07.584453 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b5e0fd66-2bb6-4b8a-aaf0-e24490a2f1a0-trusted-ca\") pod \"image-registry-697d97f7c8-kwvst\" (UID: \"b5e0fd66-2bb6-4b8a-aaf0-e24490a2f1a0\") " pod="openshift-image-registry/image-registry-697d97f7c8-kwvst" Nov 27 16:06:07 crc kubenswrapper[4707]: I1127 16:06:07.584470 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/4bd96a64-1d3d-464d-a386-26a39642ee24-metrics-tls\") pod \"ingress-operator-5b745b69d9-pfqjd\" (UID: \"4bd96a64-1d3d-464d-a386-26a39642ee24\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-pfqjd" Nov 27 16:06:07 crc kubenswrapper[4707]: I1127 16:06:07.584487 4707 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/36bf60c9-93cb-431f-9df1-1d3e245c49ef-console-serving-cert\") pod \"console-f9d7485db-5r8tf\" (UID: \"36bf60c9-93cb-431f-9df1-1d3e245c49ef\") " pod="openshift-console/console-f9d7485db-5r8tf" Nov 27 16:06:07 crc kubenswrapper[4707]: I1127 16:06:07.584517 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/eb6211ec-62fd-4a7a-b220-87ce050b996e-serving-cert\") pod \"service-ca-operator-777779d784-xbpq4\" (UID: \"eb6211ec-62fd-4a7a-b220-87ce050b996e\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-xbpq4" Nov 27 16:06:07 crc kubenswrapper[4707]: I1127 16:06:07.584538 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/90d768c2-2c32-41a1-b661-763e7c027a94-metrics-tls\") pod \"dns-default-zg2ss\" (UID: \"90d768c2-2c32-41a1-b661-763e7c027a94\") " pod="openshift-dns/dns-default-zg2ss" Nov 27 16:06:07 crc kubenswrapper[4707]: I1127 16:06:07.584608 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/5e7a06d3-2413-4179-83b7-db23583f1c6d-etcd-service-ca\") pod \"etcd-operator-b45778765-nhz7q\" (UID: \"5e7a06d3-2413-4179-83b7-db23583f1c6d\") " pod="openshift-etcd-operator/etcd-operator-b45778765-nhz7q" Nov 27 16:06:07 crc kubenswrapper[4707]: I1127 16:06:07.584712 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hz9rc\" (UniqueName: \"kubernetes.io/projected/eb6211ec-62fd-4a7a-b220-87ce050b996e-kube-api-access-hz9rc\") pod \"service-ca-operator-777779d784-xbpq4\" (UID: \"eb6211ec-62fd-4a7a-b220-87ce050b996e\") " 
pod="openshift-service-ca-operator/service-ca-operator-777779d784-xbpq4" Nov 27 16:06:07 crc kubenswrapper[4707]: I1127 16:06:07.584758 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dpqnb\" (UniqueName: \"kubernetes.io/projected/b5e0fd66-2bb6-4b8a-aaf0-e24490a2f1a0-kube-api-access-dpqnb\") pod \"image-registry-697d97f7c8-kwvst\" (UID: \"b5e0fd66-2bb6-4b8a-aaf0-e24490a2f1a0\") " pod="openshift-image-registry/image-registry-697d97f7c8-kwvst" Nov 27 16:06:07 crc kubenswrapper[4707]: I1127 16:06:07.584790 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4bd96a64-1d3d-464d-a386-26a39642ee24-trusted-ca\") pod \"ingress-operator-5b745b69d9-pfqjd\" (UID: \"4bd96a64-1d3d-464d-a386-26a39642ee24\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-pfqjd" Nov 27 16:06:07 crc kubenswrapper[4707]: I1127 16:06:07.584830 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/ce9dffbd-e000-476b-b89b-6208d0506f26-signing-key\") pod \"service-ca-9c57cc56f-d8nfn\" (UID: \"ce9dffbd-e000-476b-b89b-6208d0506f26\") " pod="openshift-service-ca/service-ca-9c57cc56f-d8nfn" Nov 27 16:06:07 crc kubenswrapper[4707]: I1127 16:06:07.584846 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/ce9dffbd-e000-476b-b89b-6208d0506f26-signing-cabundle\") pod \"service-ca-9c57cc56f-d8nfn\" (UID: \"ce9dffbd-e000-476b-b89b-6208d0506f26\") " pod="openshift-service-ca/service-ca-9c57cc56f-d8nfn" Nov 27 16:06:07 crc kubenswrapper[4707]: I1127 16:06:07.584905 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/7af50556-1505-45f2-b080-f1484a42f2cd-kube-api-access\") pod 
\"kube-apiserver-operator-766d6c64bb-7k4zd\" (UID: \"7af50556-1505-45f2-b080-f1484a42f2cd\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-7k4zd" Nov 27 16:06:07 crc kubenswrapper[4707]: I1127 16:06:07.584922 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vw6s5\" (UniqueName: \"kubernetes.io/projected/d9ba519a-a9a8-4d1a-96c1-e66ee87d5c8f-kube-api-access-vw6s5\") pod \"marketplace-operator-79b997595-jtqhs\" (UID: \"d9ba519a-a9a8-4d1a-96c1-e66ee87d5c8f\") " pod="openshift-marketplace/marketplace-operator-79b997595-jtqhs" Nov 27 16:06:07 crc kubenswrapper[4707]: I1127 16:06:07.584953 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/789119b0-3180-49e9-8d16-c60f968bf6cf-config\") pod \"openshift-apiserver-operator-796bbdcf4f-74mzf\" (UID: \"789119b0-3180-49e9-8d16-c60f968bf6cf\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-74mzf" Nov 27 16:06:07 crc kubenswrapper[4707]: I1127 16:06:07.584991 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d9ba519a-a9a8-4d1a-96c1-e66ee87d5c8f-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-jtqhs\" (UID: \"d9ba519a-a9a8-4d1a-96c1-e66ee87d5c8f\") " pod="openshift-marketplace/marketplace-operator-79b997595-jtqhs" Nov 27 16:06:07 crc kubenswrapper[4707]: I1127 16:06:07.585009 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-46gb4\" (UniqueName: \"kubernetes.io/projected/e084e8d6-5275-426c-9be4-c5f4ee49abef-kube-api-access-46gb4\") pod \"migrator-59844c95c7-dlchm\" (UID: \"e084e8d6-5275-426c-9be4-c5f4ee49abef\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-dlchm" Nov 27 16:06:07 crc kubenswrapper[4707]: I1127 
16:06:07.585026 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/b5e0fd66-2bb6-4b8a-aaf0-e24490a2f1a0-ca-trust-extracted\") pod \"image-registry-697d97f7c8-kwvst\" (UID: \"b5e0fd66-2bb6-4b8a-aaf0-e24490a2f1a0\") " pod="openshift-image-registry/image-registry-697d97f7c8-kwvst" Nov 27 16:06:07 crc kubenswrapper[4707]: I1127 16:06:07.585042 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/24bf08cf-63a0-47ca-be2a-0f38a51109c9-registration-dir\") pod \"csi-hostpathplugin-slj9l\" (UID: \"24bf08cf-63a0-47ca-be2a-0f38a51109c9\") " pod="hostpath-provisioner/csi-hostpathplugin-slj9l" Nov 27 16:06:07 crc kubenswrapper[4707]: I1127 16:06:07.585062 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nv22p\" (UniqueName: \"kubernetes.io/projected/24bf08cf-63a0-47ca-be2a-0f38a51109c9-kube-api-access-nv22p\") pod \"csi-hostpathplugin-slj9l\" (UID: \"24bf08cf-63a0-47ca-be2a-0f38a51109c9\") " pod="hostpath-provisioner/csi-hostpathplugin-slj9l" Nov 27 16:06:07 crc kubenswrapper[4707]: I1127 16:06:07.585103 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/2d3126ab-3db9-4fa7-95ef-673a79b2178a-profile-collector-cert\") pod \"olm-operator-6b444d44fb-49hfs\" (UID: \"2d3126ab-3db9-4fa7-95ef-673a79b2178a\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-49hfs" Nov 27 16:06:07 crc kubenswrapper[4707]: I1127 16:06:07.585145 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f55dc6de-bb5d-4221-a670-4b65c3992031-config-volume\") pod \"collect-profiles-29404320-sxj55\" (UID: 
\"f55dc6de-bb5d-4221-a670-4b65c3992031\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29404320-sxj55" Nov 27 16:06:07 crc kubenswrapper[4707]: I1127 16:06:07.585170 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5f4261e0-d038-4c38-8a9f-27c22bb95d1c-proxy-tls\") pod \"machine-config-controller-84d6567774-6t7cw\" (UID: \"5f4261e0-d038-4c38-8a9f-27c22bb95d1c\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-6t7cw" Nov 27 16:06:07 crc kubenswrapper[4707]: I1127 16:06:07.585187 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/24bf08cf-63a0-47ca-be2a-0f38a51109c9-csi-data-dir\") pod \"csi-hostpathplugin-slj9l\" (UID: \"24bf08cf-63a0-47ca-be2a-0f38a51109c9\") " pod="hostpath-provisioner/csi-hostpathplugin-slj9l" Nov 27 16:06:07 crc kubenswrapper[4707]: I1127 16:06:07.585562 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/e8027394-2524-45df-8cdc-967024215d25-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-h746w\" (UID: \"e8027394-2524-45df-8cdc-967024215d25\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-h746w" Nov 27 16:06:07 crc kubenswrapper[4707]: I1127 16:06:07.585596 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/24bf08cf-63a0-47ca-be2a-0f38a51109c9-plugins-dir\") pod \"csi-hostpathplugin-slj9l\" (UID: \"24bf08cf-63a0-47ca-be2a-0f38a51109c9\") " pod="hostpath-provisioner/csi-hostpathplugin-slj9l" Nov 27 16:06:07 crc kubenswrapper[4707]: I1127 16:06:07.585619 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/b5e0fd66-2bb6-4b8a-aaf0-e24490a2f1a0-bound-sa-token\") pod \"image-registry-697d97f7c8-kwvst\" (UID: \"b5e0fd66-2bb6-4b8a-aaf0-e24490a2f1a0\") " pod="openshift-image-registry/image-registry-697d97f7c8-kwvst" Nov 27 16:06:07 crc kubenswrapper[4707]: I1127 16:06:07.585657 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2ptg2\" (UniqueName: \"kubernetes.io/projected/4fb5c374-18c5-433f-bb2c-f03d8eea8de6-kube-api-access-2ptg2\") pod \"machine-config-server-85rvh\" (UID: \"4fb5c374-18c5-433f-bb2c-f03d8eea8de6\") " pod="openshift-machine-config-operator/machine-config-server-85rvh" Nov 27 16:06:07 crc kubenswrapper[4707]: I1127 16:06:07.585674 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/33d637be-496f-43c5-bb76-b742f8cc97ac-cert\") pod \"ingress-canary-zgfxc\" (UID: \"33d637be-496f-43c5-bb76-b742f8cc97ac\") " pod="openshift-ingress-canary/ingress-canary-zgfxc" Nov 27 16:06:07 crc kubenswrapper[4707]: I1127 16:06:07.585716 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7af50556-1505-45f2-b080-f1484a42f2cd-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-7k4zd\" (UID: \"7af50556-1505-45f2-b080-f1484a42f2cd\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-7k4zd" Nov 27 16:06:07 crc kubenswrapper[4707]: I1127 16:06:07.585725 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/36bf60c9-93cb-431f-9df1-1d3e245c49ef-service-ca\") pod \"console-f9d7485db-5r8tf\" (UID: \"36bf60c9-93cb-431f-9df1-1d3e245c49ef\") " pod="openshift-console/console-f9d7485db-5r8tf" Nov 27 16:06:07 crc kubenswrapper[4707]: I1127 16:06:07.585802 4707 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/2d3126ab-3db9-4fa7-95ef-673a79b2178a-srv-cert\") pod \"olm-operator-6b444d44fb-49hfs\" (UID: \"2d3126ab-3db9-4fa7-95ef-673a79b2178a\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-49hfs" Nov 27 16:06:07 crc kubenswrapper[4707]: I1127 16:06:07.585853 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/36bf60c9-93cb-431f-9df1-1d3e245c49ef-oauth-serving-cert\") pod \"console-f9d7485db-5r8tf\" (UID: \"36bf60c9-93cb-431f-9df1-1d3e245c49ef\") " pod="openshift-console/console-f9d7485db-5r8tf" Nov 27 16:06:07 crc kubenswrapper[4707]: I1127 16:06:07.585895 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xmjg8\" (UniqueName: \"kubernetes.io/projected/90d768c2-2c32-41a1-b661-763e7c027a94-kube-api-access-xmjg8\") pod \"dns-default-zg2ss\" (UID: \"90d768c2-2c32-41a1-b661-763e7c027a94\") " pod="openshift-dns/dns-default-zg2ss" Nov 27 16:06:07 crc kubenswrapper[4707]: I1127 16:06:07.585969 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/efcebfb2-d130-4f47-a514-01d1bf5eb567-tmpfs\") pod \"packageserver-d55dfcdfc-h4m82\" (UID: \"efcebfb2-d130-4f47-a514-01d1bf5eb567\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-h4m82" Nov 27 16:06:07 crc kubenswrapper[4707]: I1127 16:06:07.586582 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/b5e0fd66-2bb6-4b8a-aaf0-e24490a2f1a0-ca-trust-extracted\") pod \"image-registry-697d97f7c8-kwvst\" (UID: \"b5e0fd66-2bb6-4b8a-aaf0-e24490a2f1a0\") " pod="openshift-image-registry/image-registry-697d97f7c8-kwvst" Nov 27 16:06:07 crc kubenswrapper[4707]: 
I1127 16:06:07.587997 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5e7a06d3-2413-4179-83b7-db23583f1c6d-config\") pod \"etcd-operator-b45778765-nhz7q\" (UID: \"5e7a06d3-2413-4179-83b7-db23583f1c6d\") " pod="openshift-etcd-operator/etcd-operator-b45778765-nhz7q" Nov 27 16:06:07 crc kubenswrapper[4707]: I1127 16:06:07.595217 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/b5e0fd66-2bb6-4b8a-aaf0-e24490a2f1a0-installation-pull-secrets\") pod \"image-registry-697d97f7c8-kwvst\" (UID: \"b5e0fd66-2bb6-4b8a-aaf0-e24490a2f1a0\") " pod="openshift-image-registry/image-registry-697d97f7c8-kwvst" Nov 27 16:06:07 crc kubenswrapper[4707]: I1127 16:06:07.598572 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/b5e0fd66-2bb6-4b8a-aaf0-e24490a2f1a0-registry-tls\") pod \"image-registry-697d97f7c8-kwvst\" (UID: \"b5e0fd66-2bb6-4b8a-aaf0-e24490a2f1a0\") " pod="openshift-image-registry/image-registry-697d97f7c8-kwvst" Nov 27 16:06:07 crc kubenswrapper[4707]: I1127 16:06:07.598953 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b5e0fd66-2bb6-4b8a-aaf0-e24490a2f1a0-trusted-ca\") pod \"image-registry-697d97f7c8-kwvst\" (UID: \"b5e0fd66-2bb6-4b8a-aaf0-e24490a2f1a0\") " pod="openshift-image-registry/image-registry-697d97f7c8-kwvst" Nov 27 16:06:07 crc kubenswrapper[4707]: I1127 16:06:07.599669 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/36bf60c9-93cb-431f-9df1-1d3e245c49ef-service-ca\") pod \"console-f9d7485db-5r8tf\" (UID: \"36bf60c9-93cb-431f-9df1-1d3e245c49ef\") " pod="openshift-console/console-f9d7485db-5r8tf" Nov 27 16:06:07 crc kubenswrapper[4707]: I1127 16:06:07.600125 4707 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/5e7a06d3-2413-4179-83b7-db23583f1c6d-etcd-client\") pod \"etcd-operator-b45778765-nhz7q\" (UID: \"5e7a06d3-2413-4179-83b7-db23583f1c6d\") " pod="openshift-etcd-operator/etcd-operator-b45778765-nhz7q" Nov 27 16:06:07 crc kubenswrapper[4707]: I1127 16:06:07.600966 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/36bf60c9-93cb-431f-9df1-1d3e245c49ef-oauth-serving-cert\") pod \"console-f9d7485db-5r8tf\" (UID: \"36bf60c9-93cb-431f-9df1-1d3e245c49ef\") " pod="openshift-console/console-f9d7485db-5r8tf" Nov 27 16:06:07 crc kubenswrapper[4707]: I1127 16:06:07.601727 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/9c2fdb4e-b99d-4dc6-b5a0-5f5c38954e7d-srv-cert\") pod \"catalog-operator-68c6474976-6m8wz\" (UID: \"9c2fdb4e-b99d-4dc6-b5a0-5f5c38954e7d\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-6m8wz" Nov 27 16:06:07 crc kubenswrapper[4707]: I1127 16:06:07.602729 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5f4261e0-d038-4c38-8a9f-27c22bb95d1c-proxy-tls\") pod \"machine-config-controller-84d6567774-6t7cw\" (UID: \"5f4261e0-d038-4c38-8a9f-27c22bb95d1c\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-6t7cw" Nov 27 16:06:07 crc kubenswrapper[4707]: I1127 16:06:07.602982 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/9c2fdb4e-b99d-4dc6-b5a0-5f5c38954e7d-profile-collector-cert\") pod \"catalog-operator-68c6474976-6m8wz\" (UID: \"9c2fdb4e-b99d-4dc6-b5a0-5f5c38954e7d\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-6m8wz" Nov 27 16:06:07 crc 
kubenswrapper[4707]: I1127 16:06:07.603403 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5e7a06d3-2413-4179-83b7-db23583f1c6d-serving-cert\") pod \"etcd-operator-b45778765-nhz7q\" (UID: \"5e7a06d3-2413-4179-83b7-db23583f1c6d\") " pod="openshift-etcd-operator/etcd-operator-b45778765-nhz7q" Nov 27 16:06:07 crc kubenswrapper[4707]: I1127 16:06:07.605638 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/5e7a06d3-2413-4179-83b7-db23583f1c6d-etcd-service-ca\") pod \"etcd-operator-b45778765-nhz7q\" (UID: \"5e7a06d3-2413-4179-83b7-db23583f1c6d\") " pod="openshift-etcd-operator/etcd-operator-b45778765-nhz7q" Nov 27 16:06:07 crc kubenswrapper[4707]: I1127 16:06:07.607236 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/36bf60c9-93cb-431f-9df1-1d3e245c49ef-trusted-ca-bundle\") pod \"console-f9d7485db-5r8tf\" (UID: \"36bf60c9-93cb-431f-9df1-1d3e245c49ef\") " pod="openshift-console/console-f9d7485db-5r8tf" Nov 27 16:06:07 crc kubenswrapper[4707]: I1127 16:06:07.610887 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/36bf60c9-93cb-431f-9df1-1d3e245c49ef-console-config\") pod \"console-f9d7485db-5r8tf\" (UID: \"36bf60c9-93cb-431f-9df1-1d3e245c49ef\") " pod="openshift-console/console-f9d7485db-5r8tf" Nov 27 16:06:07 crc kubenswrapper[4707]: I1127 16:06:07.615956 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4bd96a64-1d3d-464d-a386-26a39642ee24-trusted-ca\") pod \"ingress-operator-5b745b69d9-pfqjd\" (UID: \"4bd96a64-1d3d-464d-a386-26a39642ee24\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-pfqjd" Nov 27 16:06:07 crc kubenswrapper[4707]: I1127 16:06:07.620238 
4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/4bd96a64-1d3d-464d-a386-26a39642ee24-metrics-tls\") pod \"ingress-operator-5b745b69d9-pfqjd\" (UID: \"4bd96a64-1d3d-464d-a386-26a39642ee24\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-pfqjd" Nov 27 16:06:07 crc kubenswrapper[4707]: I1127 16:06:07.623441 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/e8027394-2524-45df-8cdc-967024215d25-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-h746w\" (UID: \"e8027394-2524-45df-8cdc-967024215d25\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-h746w" Nov 27 16:06:07 crc kubenswrapper[4707]: I1127 16:06:07.624743 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ss92v\" (UniqueName: \"kubernetes.io/projected/5f4261e0-d038-4c38-8a9f-27c22bb95d1c-kube-api-access-ss92v\") pod \"machine-config-controller-84d6567774-6t7cw\" (UID: \"5f4261e0-d038-4c38-8a9f-27c22bb95d1c\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-6t7cw" Nov 27 16:06:07 crc kubenswrapper[4707]: I1127 16:06:07.630933 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/36bf60c9-93cb-431f-9df1-1d3e245c49ef-console-serving-cert\") pod \"console-f9d7485db-5r8tf\" (UID: \"36bf60c9-93cb-431f-9df1-1d3e245c49ef\") " pod="openshift-console/console-f9d7485db-5r8tf" Nov 27 16:06:07 crc kubenswrapper[4707]: I1127 16:06:07.637781 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/36bf60c9-93cb-431f-9df1-1d3e245c49ef-console-oauth-config\") pod \"console-f9d7485db-5r8tf\" (UID: \"36bf60c9-93cb-431f-9df1-1d3e245c49ef\") " 
pod="openshift-console/console-f9d7485db-5r8tf" Nov 27 16:06:07 crc kubenswrapper[4707]: I1127 16:06:07.639892 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x679f\" (UniqueName: \"kubernetes.io/projected/5e7a06d3-2413-4179-83b7-db23583f1c6d-kube-api-access-x679f\") pod \"etcd-operator-b45778765-nhz7q\" (UID: \"5e7a06d3-2413-4179-83b7-db23583f1c6d\") " pod="openshift-etcd-operator/etcd-operator-b45778765-nhz7q" Nov 27 16:06:07 crc kubenswrapper[4707]: I1127 16:06:07.659058 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sqjhz\" (UniqueName: \"kubernetes.io/projected/36bf60c9-93cb-431f-9df1-1d3e245c49ef-kube-api-access-sqjhz\") pod \"console-f9d7485db-5r8tf\" (UID: \"36bf60c9-93cb-431f-9df1-1d3e245c49ef\") " pod="openshift-console/console-f9d7485db-5r8tf" Nov 27 16:06:07 crc kubenswrapper[4707]: I1127 16:06:07.686466 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/4bd96a64-1d3d-464d-a386-26a39642ee24-bound-sa-token\") pod \"ingress-operator-5b745b69d9-pfqjd\" (UID: \"4bd96a64-1d3d-464d-a386-26a39642ee24\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-pfqjd" Nov 27 16:06:07 crc kubenswrapper[4707]: I1127 16:06:07.686714 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 27 16:06:07 crc kubenswrapper[4707]: I1127 16:06:07.687508 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f4z89\" (UniqueName: \"kubernetes.io/projected/f55dc6de-bb5d-4221-a670-4b65c3992031-kube-api-access-f4z89\") pod \"collect-profiles-29404320-sxj55\" (UID: 
\"f55dc6de-bb5d-4221-a670-4b65c3992031\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29404320-sxj55" Nov 27 16:06:07 crc kubenswrapper[4707]: E1127 16:06:07.687538 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-27 16:06:08.187500372 +0000 UTC m=+143.818949140 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 27 16:06:07 crc kubenswrapper[4707]: I1127 16:06:07.687673 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kwvst\" (UID: \"b5e0fd66-2bb6-4b8a-aaf0-e24490a2f1a0\") " pod="openshift-image-registry/image-registry-697d97f7c8-kwvst" Nov 27 16:06:07 crc kubenswrapper[4707]: I1127 16:06:07.687717 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-28h8m\" (UniqueName: \"kubernetes.io/projected/789119b0-3180-49e9-8d16-c60f968bf6cf-kube-api-access-28h8m\") pod \"openshift-apiserver-operator-796bbdcf4f-74mzf\" (UID: \"789119b0-3180-49e9-8d16-c60f968bf6cf\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-74mzf" Nov 27 16:06:07 crc kubenswrapper[4707]: I1127 16:06:07.687757 4707 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/efcebfb2-d130-4f47-a514-01d1bf5eb567-apiservice-cert\") pod \"packageserver-d55dfcdfc-h4m82\" (UID: \"efcebfb2-d130-4f47-a514-01d1bf5eb567\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-h4m82" Nov 27 16:06:07 crc kubenswrapper[4707]: I1127 16:06:07.687778 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/efcebfb2-d130-4f47-a514-01d1bf5eb567-webhook-cert\") pod \"packageserver-d55dfcdfc-h4m82\" (UID: \"efcebfb2-d130-4f47-a514-01d1bf5eb567\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-h4m82" Nov 27 16:06:07 crc kubenswrapper[4707]: I1127 16:06:07.687807 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/789119b0-3180-49e9-8d16-c60f968bf6cf-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-74mzf\" (UID: \"789119b0-3180-49e9-8d16-c60f968bf6cf\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-74mzf" Nov 27 16:06:07 crc kubenswrapper[4707]: I1127 16:06:07.687839 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5fkvx\" (UniqueName: \"kubernetes.io/projected/2d3126ab-3db9-4fa7-95ef-673a79b2178a-kube-api-access-5fkvx\") pod \"olm-operator-6b444d44fb-49hfs\" (UID: \"2d3126ab-3db9-4fa7-95ef-673a79b2178a\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-49hfs" Nov 27 16:06:07 crc kubenswrapper[4707]: I1127 16:06:07.687868 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/4fb5c374-18c5-433f-bb2c-f03d8eea8de6-node-bootstrap-token\") pod \"machine-config-server-85rvh\" (UID: \"4fb5c374-18c5-433f-bb2c-f03d8eea8de6\") " pod="openshift-machine-config-operator/machine-config-server-85rvh" Nov 27 
16:06:07 crc kubenswrapper[4707]: I1127 16:06:07.687887 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/5450b16d-7dca-4ba0-8184-bb0b4a19dc71-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-72lr4\" (UID: \"5450b16d-7dca-4ba0-8184-bb0b4a19dc71\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-72lr4" Nov 27 16:06:07 crc kubenswrapper[4707]: I1127 16:06:07.687906 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f2tbm\" (UniqueName: \"kubernetes.io/projected/efcebfb2-d130-4f47-a514-01d1bf5eb567-kube-api-access-f2tbm\") pod \"packageserver-d55dfcdfc-h4m82\" (UID: \"efcebfb2-d130-4f47-a514-01d1bf5eb567\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-h4m82" Nov 27 16:06:07 crc kubenswrapper[4707]: I1127 16:06:07.687927 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wjbwn\" (UniqueName: \"kubernetes.io/projected/5450b16d-7dca-4ba0-8184-bb0b4a19dc71-kube-api-access-wjbwn\") pod \"multus-admission-controller-857f4d67dd-72lr4\" (UID: \"5450b16d-7dca-4ba0-8184-bb0b4a19dc71\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-72lr4" Nov 27 16:06:07 crc kubenswrapper[4707]: I1127 16:06:07.687952 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zgcht\" (UniqueName: \"kubernetes.io/projected/33d637be-496f-43c5-bb76-b742f8cc97ac-kube-api-access-zgcht\") pod \"ingress-canary-zgfxc\" (UID: \"33d637be-496f-43c5-bb76-b742f8cc97ac\") " pod="openshift-ingress-canary/ingress-canary-zgfxc" Nov 27 16:06:07 crc kubenswrapper[4707]: E1127 16:06:07.687985 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2025-11-27 16:06:08.187965694 +0000 UTC m=+143.819414462 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kwvst" (UID: "b5e0fd66-2bb6-4b8a-aaf0-e24490a2f1a0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 27 16:06:07 crc kubenswrapper[4707]: I1127 16:06:07.688014 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/d9ba519a-a9a8-4d1a-96c1-e66ee87d5c8f-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-jtqhs\" (UID: \"d9ba519a-a9a8-4d1a-96c1-e66ee87d5c8f\") " pod="openshift-marketplace/marketplace-operator-79b997595-jtqhs" Nov 27 16:06:07 crc kubenswrapper[4707]: I1127 16:06:07.688047 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/24bf08cf-63a0-47ca-be2a-0f38a51109c9-mountpoint-dir\") pod \"csi-hostpathplugin-slj9l\" (UID: \"24bf08cf-63a0-47ca-be2a-0f38a51109c9\") " pod="hostpath-provisioner/csi-hostpathplugin-slj9l" Nov 27 16:06:07 crc kubenswrapper[4707]: I1127 16:06:07.688065 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eb6211ec-62fd-4a7a-b220-87ce050b996e-config\") pod \"service-ca-operator-777779d784-xbpq4\" (UID: \"eb6211ec-62fd-4a7a-b220-87ce050b996e\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-xbpq4" Nov 27 16:06:07 crc kubenswrapper[4707]: I1127 16:06:07.688090 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: 
\"kubernetes.io/configmap/eab64752-efb6-4d5d-89b2-43da02ae599f-images\") pod \"machine-config-operator-74547568cd-lrj4r\" (UID: \"eab64752-efb6-4d5d-89b2-43da02ae599f\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-lrj4r" Nov 27 16:06:07 crc kubenswrapper[4707]: I1127 16:06:07.688112 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/eab64752-efb6-4d5d-89b2-43da02ae599f-auth-proxy-config\") pod \"machine-config-operator-74547568cd-lrj4r\" (UID: \"eab64752-efb6-4d5d-89b2-43da02ae599f\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-lrj4r" Nov 27 16:06:07 crc kubenswrapper[4707]: I1127 16:06:07.688138 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/4fb5c374-18c5-433f-bb2c-f03d8eea8de6-certs\") pod \"machine-config-server-85rvh\" (UID: \"4fb5c374-18c5-433f-bb2c-f03d8eea8de6\") " pod="openshift-machine-config-operator/machine-config-server-85rvh" Nov 27 16:06:07 crc kubenswrapper[4707]: I1127 16:06:07.688201 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f55dc6de-bb5d-4221-a670-4b65c3992031-secret-volume\") pod \"collect-profiles-29404320-sxj55\" (UID: \"f55dc6de-bb5d-4221-a670-4b65c3992031\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29404320-sxj55" Nov 27 16:06:07 crc kubenswrapper[4707]: I1127 16:06:07.688239 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/eb6211ec-62fd-4a7a-b220-87ce050b996e-serving-cert\") pod \"service-ca-operator-777779d784-xbpq4\" (UID: \"eb6211ec-62fd-4a7a-b220-87ce050b996e\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-xbpq4" Nov 27 16:06:07 crc kubenswrapper[4707]: I1127 16:06:07.688263 4707 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/90d768c2-2c32-41a1-b661-763e7c027a94-metrics-tls\") pod \"dns-default-zg2ss\" (UID: \"90d768c2-2c32-41a1-b661-763e7c027a94\") " pod="openshift-dns/dns-default-zg2ss" Nov 27 16:06:07 crc kubenswrapper[4707]: I1127 16:06:07.688325 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hz9rc\" (UniqueName: \"kubernetes.io/projected/eb6211ec-62fd-4a7a-b220-87ce050b996e-kube-api-access-hz9rc\") pod \"service-ca-operator-777779d784-xbpq4\" (UID: \"eb6211ec-62fd-4a7a-b220-87ce050b996e\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-xbpq4" Nov 27 16:06:07 crc kubenswrapper[4707]: I1127 16:06:07.688376 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/ce9dffbd-e000-476b-b89b-6208d0506f26-signing-key\") pod \"service-ca-9c57cc56f-d8nfn\" (UID: \"ce9dffbd-e000-476b-b89b-6208d0506f26\") " pod="openshift-service-ca/service-ca-9c57cc56f-d8nfn" Nov 27 16:06:07 crc kubenswrapper[4707]: I1127 16:06:07.688443 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/ce9dffbd-e000-476b-b89b-6208d0506f26-signing-cabundle\") pod \"service-ca-9c57cc56f-d8nfn\" (UID: \"ce9dffbd-e000-476b-b89b-6208d0506f26\") " pod="openshift-service-ca/service-ca-9c57cc56f-d8nfn" Nov 27 16:06:07 crc kubenswrapper[4707]: I1127 16:06:07.688478 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vw6s5\" (UniqueName: \"kubernetes.io/projected/d9ba519a-a9a8-4d1a-96c1-e66ee87d5c8f-kube-api-access-vw6s5\") pod \"marketplace-operator-79b997595-jtqhs\" (UID: \"d9ba519a-a9a8-4d1a-96c1-e66ee87d5c8f\") " pod="openshift-marketplace/marketplace-operator-79b997595-jtqhs" Nov 27 16:06:07 crc kubenswrapper[4707]: I1127 
16:06:07.688506 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/789119b0-3180-49e9-8d16-c60f968bf6cf-config\") pod \"openshift-apiserver-operator-796bbdcf4f-74mzf\" (UID: \"789119b0-3180-49e9-8d16-c60f968bf6cf\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-74mzf" Nov 27 16:06:07 crc kubenswrapper[4707]: I1127 16:06:07.688530 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d9ba519a-a9a8-4d1a-96c1-e66ee87d5c8f-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-jtqhs\" (UID: \"d9ba519a-a9a8-4d1a-96c1-e66ee87d5c8f\") " pod="openshift-marketplace/marketplace-operator-79b997595-jtqhs" Nov 27 16:06:07 crc kubenswrapper[4707]: I1127 16:06:07.688561 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/24bf08cf-63a0-47ca-be2a-0f38a51109c9-registration-dir\") pod \"csi-hostpathplugin-slj9l\" (UID: \"24bf08cf-63a0-47ca-be2a-0f38a51109c9\") " pod="hostpath-provisioner/csi-hostpathplugin-slj9l" Nov 27 16:06:07 crc kubenswrapper[4707]: I1127 16:06:07.688584 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nv22p\" (UniqueName: \"kubernetes.io/projected/24bf08cf-63a0-47ca-be2a-0f38a51109c9-kube-api-access-nv22p\") pod \"csi-hostpathplugin-slj9l\" (UID: \"24bf08cf-63a0-47ca-be2a-0f38a51109c9\") " pod="hostpath-provisioner/csi-hostpathplugin-slj9l" Nov 27 16:06:07 crc kubenswrapper[4707]: I1127 16:06:07.688605 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/2d3126ab-3db9-4fa7-95ef-673a79b2178a-profile-collector-cert\") pod \"olm-operator-6b444d44fb-49hfs\" (UID: \"2d3126ab-3db9-4fa7-95ef-673a79b2178a\") " 
pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-49hfs" Nov 27 16:06:07 crc kubenswrapper[4707]: I1127 16:06:07.688644 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/24bf08cf-63a0-47ca-be2a-0f38a51109c9-csi-data-dir\") pod \"csi-hostpathplugin-slj9l\" (UID: \"24bf08cf-63a0-47ca-be2a-0f38a51109c9\") " pod="hostpath-provisioner/csi-hostpathplugin-slj9l" Nov 27 16:06:07 crc kubenswrapper[4707]: I1127 16:06:07.688665 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f55dc6de-bb5d-4221-a670-4b65c3992031-config-volume\") pod \"collect-profiles-29404320-sxj55\" (UID: \"f55dc6de-bb5d-4221-a670-4b65c3992031\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29404320-sxj55" Nov 27 16:06:07 crc kubenswrapper[4707]: I1127 16:06:07.688685 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/24bf08cf-63a0-47ca-be2a-0f38a51109c9-plugins-dir\") pod \"csi-hostpathplugin-slj9l\" (UID: \"24bf08cf-63a0-47ca-be2a-0f38a51109c9\") " pod="hostpath-provisioner/csi-hostpathplugin-slj9l" Nov 27 16:06:07 crc kubenswrapper[4707]: I1127 16:06:07.688709 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2ptg2\" (UniqueName: \"kubernetes.io/projected/4fb5c374-18c5-433f-bb2c-f03d8eea8de6-kube-api-access-2ptg2\") pod \"machine-config-server-85rvh\" (UID: \"4fb5c374-18c5-433f-bb2c-f03d8eea8de6\") " pod="openshift-machine-config-operator/machine-config-server-85rvh" Nov 27 16:06:07 crc kubenswrapper[4707]: I1127 16:06:07.688726 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/33d637be-496f-43c5-bb76-b742f8cc97ac-cert\") pod \"ingress-canary-zgfxc\" (UID: 
\"33d637be-496f-43c5-bb76-b742f8cc97ac\") " pod="openshift-ingress-canary/ingress-canary-zgfxc" Nov 27 16:06:07 crc kubenswrapper[4707]: I1127 16:06:07.688742 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/2d3126ab-3db9-4fa7-95ef-673a79b2178a-srv-cert\") pod \"olm-operator-6b444d44fb-49hfs\" (UID: \"2d3126ab-3db9-4fa7-95ef-673a79b2178a\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-49hfs" Nov 27 16:06:07 crc kubenswrapper[4707]: I1127 16:06:07.688764 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xmjg8\" (UniqueName: \"kubernetes.io/projected/90d768c2-2c32-41a1-b661-763e7c027a94-kube-api-access-xmjg8\") pod \"dns-default-zg2ss\" (UID: \"90d768c2-2c32-41a1-b661-763e7c027a94\") " pod="openshift-dns/dns-default-zg2ss" Nov 27 16:06:07 crc kubenswrapper[4707]: I1127 16:06:07.688787 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/efcebfb2-d130-4f47-a514-01d1bf5eb567-tmpfs\") pod \"packageserver-d55dfcdfc-h4m82\" (UID: \"efcebfb2-d130-4f47-a514-01d1bf5eb567\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-h4m82" Nov 27 16:06:07 crc kubenswrapper[4707]: I1127 16:06:07.688810 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/eab64752-efb6-4d5d-89b2-43da02ae599f-proxy-tls\") pod \"machine-config-operator-74547568cd-lrj4r\" (UID: \"eab64752-efb6-4d5d-89b2-43da02ae599f\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-lrj4r" Nov 27 16:06:07 crc kubenswrapper[4707]: I1127 16:06:07.688836 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qn8sg\" (UniqueName: \"kubernetes.io/projected/ce9dffbd-e000-476b-b89b-6208d0506f26-kube-api-access-qn8sg\") pod 
\"service-ca-9c57cc56f-d8nfn\" (UID: \"ce9dffbd-e000-476b-b89b-6208d0506f26\") " pod="openshift-service-ca/service-ca-9c57cc56f-d8nfn" Nov 27 16:06:07 crc kubenswrapper[4707]: I1127 16:06:07.688875 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kpn4n\" (UniqueName: \"kubernetes.io/projected/eab64752-efb6-4d5d-89b2-43da02ae599f-kube-api-access-kpn4n\") pod \"machine-config-operator-74547568cd-lrj4r\" (UID: \"eab64752-efb6-4d5d-89b2-43da02ae599f\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-lrj4r" Nov 27 16:06:07 crc kubenswrapper[4707]: I1127 16:06:07.688912 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/24bf08cf-63a0-47ca-be2a-0f38a51109c9-socket-dir\") pod \"csi-hostpathplugin-slj9l\" (UID: \"24bf08cf-63a0-47ca-be2a-0f38a51109c9\") " pod="hostpath-provisioner/csi-hostpathplugin-slj9l" Nov 27 16:06:07 crc kubenswrapper[4707]: I1127 16:06:07.688942 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/90d768c2-2c32-41a1-b661-763e7c027a94-config-volume\") pod \"dns-default-zg2ss\" (UID: \"90d768c2-2c32-41a1-b661-763e7c027a94\") " pod="openshift-dns/dns-default-zg2ss" Nov 27 16:06:07 crc kubenswrapper[4707]: I1127 16:06:07.690472 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d9ba519a-a9a8-4d1a-96c1-e66ee87d5c8f-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-jtqhs\" (UID: \"d9ba519a-a9a8-4d1a-96c1-e66ee87d5c8f\") " pod="openshift-marketplace/marketplace-operator-79b997595-jtqhs" Nov 27 16:06:07 crc kubenswrapper[4707]: I1127 16:06:07.694524 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/configmap/90d768c2-2c32-41a1-b661-763e7c027a94-config-volume\") pod \"dns-default-zg2ss\" (UID: \"90d768c2-2c32-41a1-b661-763e7c027a94\") " pod="openshift-dns/dns-default-zg2ss" Nov 27 16:06:07 crc kubenswrapper[4707]: I1127 16:06:07.694756 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/24bf08cf-63a0-47ca-be2a-0f38a51109c9-registration-dir\") pod \"csi-hostpathplugin-slj9l\" (UID: \"24bf08cf-63a0-47ca-be2a-0f38a51109c9\") " pod="hostpath-provisioner/csi-hostpathplugin-slj9l" Nov 27 16:06:07 crc kubenswrapper[4707]: I1127 16:06:07.699242 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/24bf08cf-63a0-47ca-be2a-0f38a51109c9-csi-data-dir\") pod \"csi-hostpathplugin-slj9l\" (UID: \"24bf08cf-63a0-47ca-be2a-0f38a51109c9\") " pod="hostpath-provisioner/csi-hostpathplugin-slj9l" Nov 27 16:06:07 crc kubenswrapper[4707]: I1127 16:06:07.700190 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f55dc6de-bb5d-4221-a670-4b65c3992031-config-volume\") pod \"collect-profiles-29404320-sxj55\" (UID: \"f55dc6de-bb5d-4221-a670-4b65c3992031\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29404320-sxj55" Nov 27 16:06:07 crc kubenswrapper[4707]: I1127 16:06:07.700225 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-zdxsk"] Nov 27 16:06:07 crc kubenswrapper[4707]: I1127 16:06:07.700284 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/24bf08cf-63a0-47ca-be2a-0f38a51109c9-plugins-dir\") pod \"csi-hostpathplugin-slj9l\" (UID: \"24bf08cf-63a0-47ca-be2a-0f38a51109c9\") " pod="hostpath-provisioner/csi-hostpathplugin-slj9l" Nov 27 16:06:07 crc kubenswrapper[4707]: I1127 16:06:07.700398 4707 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/2d3126ab-3db9-4fa7-95ef-673a79b2178a-profile-collector-cert\") pod \"olm-operator-6b444d44fb-49hfs\" (UID: \"2d3126ab-3db9-4fa7-95ef-673a79b2178a\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-49hfs" Nov 27 16:06:07 crc kubenswrapper[4707]: I1127 16:06:07.700473 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/efcebfb2-d130-4f47-a514-01d1bf5eb567-tmpfs\") pod \"packageserver-d55dfcdfc-h4m82\" (UID: \"efcebfb2-d130-4f47-a514-01d1bf5eb567\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-h4m82" Nov 27 16:06:07 crc kubenswrapper[4707]: I1127 16:06:07.702121 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-nhqwm" Nov 27 16:06:07 crc kubenswrapper[4707]: I1127 16:06:07.702634 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/24bf08cf-63a0-47ca-be2a-0f38a51109c9-socket-dir\") pod \"csi-hostpathplugin-slj9l\" (UID: \"24bf08cf-63a0-47ca-be2a-0f38a51109c9\") " pod="hostpath-provisioner/csi-hostpathplugin-slj9l" Nov 27 16:06:07 crc kubenswrapper[4707]: I1127 16:06:07.704137 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/24bf08cf-63a0-47ca-be2a-0f38a51109c9-mountpoint-dir\") pod \"csi-hostpathplugin-slj9l\" (UID: \"24bf08cf-63a0-47ca-be2a-0f38a51109c9\") " pod="hostpath-provisioner/csi-hostpathplugin-slj9l" Nov 27 16:06:07 crc kubenswrapper[4707]: I1127 16:06:07.710430 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/2d3126ab-3db9-4fa7-95ef-673a79b2178a-srv-cert\") pod 
\"olm-operator-6b444d44fb-49hfs\" (UID: \"2d3126ab-3db9-4fa7-95ef-673a79b2178a\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-49hfs" Nov 27 16:06:07 crc kubenswrapper[4707]: I1127 16:06:07.711548 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eb6211ec-62fd-4a7a-b220-87ce050b996e-config\") pod \"service-ca-operator-777779d784-xbpq4\" (UID: \"eb6211ec-62fd-4a7a-b220-87ce050b996e\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-xbpq4" Nov 27 16:06:07 crc kubenswrapper[4707]: I1127 16:06:07.711946 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/789119b0-3180-49e9-8d16-c60f968bf6cf-config\") pod \"openshift-apiserver-operator-796bbdcf4f-74mzf\" (UID: \"789119b0-3180-49e9-8d16-c60f968bf6cf\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-74mzf" Nov 27 16:06:07 crc kubenswrapper[4707]: I1127 16:06:07.712369 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/efcebfb2-d130-4f47-a514-01d1bf5eb567-apiservice-cert\") pod \"packageserver-d55dfcdfc-h4m82\" (UID: \"efcebfb2-d130-4f47-a514-01d1bf5eb567\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-h4m82" Nov 27 16:06:07 crc kubenswrapper[4707]: I1127 16:06:07.712440 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/33d637be-496f-43c5-bb76-b742f8cc97ac-cert\") pod \"ingress-canary-zgfxc\" (UID: \"33d637be-496f-43c5-bb76-b742f8cc97ac\") " pod="openshift-ingress-canary/ingress-canary-zgfxc" Nov 27 16:06:07 crc kubenswrapper[4707]: I1127 16:06:07.712468 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/ce9dffbd-e000-476b-b89b-6208d0506f26-signing-cabundle\") pod 
\"service-ca-9c57cc56f-d8nfn\" (UID: \"ce9dffbd-e000-476b-b89b-6208d0506f26\") " pod="openshift-service-ca/service-ca-9c57cc56f-d8nfn" Nov 27 16:06:07 crc kubenswrapper[4707]: I1127 16:06:07.712871 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-nhz7q" Nov 27 16:06:07 crc kubenswrapper[4707]: I1127 16:06:07.712943 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/ce9dffbd-e000-476b-b89b-6208d0506f26-signing-key\") pod \"service-ca-9c57cc56f-d8nfn\" (UID: \"ce9dffbd-e000-476b-b89b-6208d0506f26\") " pod="openshift-service-ca/service-ca-9c57cc56f-d8nfn" Nov 27 16:06:07 crc kubenswrapper[4707]: I1127 16:06:07.713874 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/eab64752-efb6-4d5d-89b2-43da02ae599f-auth-proxy-config\") pod \"machine-config-operator-74547568cd-lrj4r\" (UID: \"eab64752-efb6-4d5d-89b2-43da02ae599f\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-lrj4r" Nov 27 16:06:07 crc kubenswrapper[4707]: I1127 16:06:07.714256 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/eab64752-efb6-4d5d-89b2-43da02ae599f-images\") pod \"machine-config-operator-74547568cd-lrj4r\" (UID: \"eab64752-efb6-4d5d-89b2-43da02ae599f\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-lrj4r" Nov 27 16:06:07 crc kubenswrapper[4707]: I1127 16:06:07.714562 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/d9ba519a-a9a8-4d1a-96c1-e66ee87d5c8f-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-jtqhs\" (UID: \"d9ba519a-a9a8-4d1a-96c1-e66ee87d5c8f\") " pod="openshift-marketplace/marketplace-operator-79b997595-jtqhs" Nov 
27 16:06:07 crc kubenswrapper[4707]: I1127 16:06:07.716211 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vqbc6\" (UniqueName: \"kubernetes.io/projected/9c2fdb4e-b99d-4dc6-b5a0-5f5c38954e7d-kube-api-access-vqbc6\") pod \"catalog-operator-68c6474976-6m8wz\" (UID: \"9c2fdb4e-b99d-4dc6-b5a0-5f5c38954e7d\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-6m8wz" Nov 27 16:06:07 crc kubenswrapper[4707]: I1127 16:06:07.716432 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/5450b16d-7dca-4ba0-8184-bb0b4a19dc71-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-72lr4\" (UID: \"5450b16d-7dca-4ba0-8184-bb0b4a19dc71\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-72lr4" Nov 27 16:06:07 crc kubenswrapper[4707]: I1127 16:06:07.716564 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/eb6211ec-62fd-4a7a-b220-87ce050b996e-serving-cert\") pod \"service-ca-operator-777779d784-xbpq4\" (UID: \"eb6211ec-62fd-4a7a-b220-87ce050b996e\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-xbpq4" Nov 27 16:06:07 crc kubenswrapper[4707]: I1127 16:06:07.716773 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/4fb5c374-18c5-433f-bb2c-f03d8eea8de6-node-bootstrap-token\") pod \"machine-config-server-85rvh\" (UID: \"4fb5c374-18c5-433f-bb2c-f03d8eea8de6\") " pod="openshift-machine-config-operator/machine-config-server-85rvh" Nov 27 16:06:07 crc kubenswrapper[4707]: I1127 16:06:07.717627 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/secret/4fb5c374-18c5-433f-bb2c-f03d8eea8de6-certs\") pod \"machine-config-server-85rvh\" (UID: \"4fb5c374-18c5-433f-bb2c-f03d8eea8de6\") " 
pod="openshift-machine-config-operator/machine-config-server-85rvh" Nov 27 16:06:07 crc kubenswrapper[4707]: I1127 16:06:07.723257 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/789119b0-3180-49e9-8d16-c60f968bf6cf-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-74mzf\" (UID: \"789119b0-3180-49e9-8d16-c60f968bf6cf\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-74mzf" Nov 27 16:06:07 crc kubenswrapper[4707]: I1127 16:06:07.723984 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/90d768c2-2c32-41a1-b661-763e7c027a94-metrics-tls\") pod \"dns-default-zg2ss\" (UID: \"90d768c2-2c32-41a1-b661-763e7c027a94\") " pod="openshift-dns/dns-default-zg2ss" Nov 27 16:06:07 crc kubenswrapper[4707]: I1127 16:06:07.724417 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/eab64752-efb6-4d5d-89b2-43da02ae599f-proxy-tls\") pod \"machine-config-operator-74547568cd-lrj4r\" (UID: \"eab64752-efb6-4d5d-89b2-43da02ae599f\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-lrj4r" Nov 27 16:06:07 crc kubenswrapper[4707]: I1127 16:06:07.724501 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/efcebfb2-d130-4f47-a514-01d1bf5eb567-webhook-cert\") pod \"packageserver-d55dfcdfc-h4m82\" (UID: \"efcebfb2-d130-4f47-a514-01d1bf5eb567\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-h4m82" Nov 27 16:06:07 crc kubenswrapper[4707]: I1127 16:06:07.725110 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/router-default-5444994796-tggrz" Nov 27 16:06:07 crc kubenswrapper[4707]: I1127 16:06:07.728004 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hp5rn\" (UniqueName: \"kubernetes.io/projected/4bd96a64-1d3d-464d-a386-26a39642ee24-kube-api-access-hp5rn\") pod \"ingress-operator-5b745b69d9-pfqjd\" (UID: \"4bd96a64-1d3d-464d-a386-26a39642ee24\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-pfqjd" Nov 27 16:06:07 crc kubenswrapper[4707]: I1127 16:06:07.735946 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f55dc6de-bb5d-4221-a670-4b65c3992031-secret-volume\") pod \"collect-profiles-29404320-sxj55\" (UID: \"f55dc6de-bb5d-4221-a670-4b65c3992031\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29404320-sxj55" Nov 27 16:06:07 crc kubenswrapper[4707]: I1127 16:06:07.740568 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-jxgm7"] Nov 27 16:06:07 crc kubenswrapper[4707]: I1127 16:06:07.749641 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5t8dm\" (UniqueName: \"kubernetes.io/projected/e8027394-2524-45df-8cdc-967024215d25-kube-api-access-5t8dm\") pod \"control-plane-machine-set-operator-78cbb6b69f-h746w\" (UID: \"e8027394-2524-45df-8cdc-967024215d25\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-h746w" Nov 27 16:06:07 crc kubenswrapper[4707]: I1127 16:06:07.760480 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-46gb4\" (UniqueName: \"kubernetes.io/projected/e084e8d6-5275-426c-9be4-c5f4ee49abef-kube-api-access-46gb4\") pod \"migrator-59844c95c7-dlchm\" (UID: \"e084e8d6-5275-426c-9be4-c5f4ee49abef\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-dlchm" Nov 27 16:06:07 crc 
kubenswrapper[4707]: I1127 16:06:07.770686 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-dlchm" Nov 27 16:06:07 crc kubenswrapper[4707]: W1127 16:06:07.772960 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb37b9b2e_05d2_434f_bd01_93cda5a05b52.slice/crio-9c3732d2fafbdc49aaf136740d05dd8606378bb523165303968fddf2b091678a WatchSource:0}: Error finding container 9c3732d2fafbdc49aaf136740d05dd8606378bb523165303968fddf2b091678a: Status 404 returned error can't find the container with id 9c3732d2fafbdc49aaf136740d05dd8606378bb523165303968fddf2b091678a Nov 27 16:06:07 crc kubenswrapper[4707]: I1127 16:06:07.778971 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/b5e0fd66-2bb6-4b8a-aaf0-e24490a2f1a0-bound-sa-token\") pod \"image-registry-697d97f7c8-kwvst\" (UID: \"b5e0fd66-2bb6-4b8a-aaf0-e24490a2f1a0\") " pod="openshift-image-registry/image-registry-697d97f7c8-kwvst" Nov 27 16:06:07 crc kubenswrapper[4707]: I1127 16:06:07.788694 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-qzltz"] Nov 27 16:06:07 crc kubenswrapper[4707]: I1127 16:06:07.790349 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 27 16:06:07 crc kubenswrapper[4707]: E1127 16:06:07.790967 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-11-27 16:06:08.290948316 +0000 UTC m=+143.922397084 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 27 16:06:07 crc kubenswrapper[4707]: I1127 16:06:07.791289 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-gg89j"] Nov 27 16:06:07 crc kubenswrapper[4707]: I1127 16:06:07.792281 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-6t7cw" Nov 27 16:06:07 crc kubenswrapper[4707]: I1127 16:06:07.795156 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-4h99l"] Nov 27 16:06:07 crc kubenswrapper[4707]: I1127 16:06:07.817585 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/7af50556-1505-45f2-b080-f1484a42f2cd-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-7k4zd\" (UID: \"7af50556-1505-45f2-b080-f1484a42f2cd\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-7k4zd" Nov 27 16:06:07 crc kubenswrapper[4707]: I1127 16:06:07.830842 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-6m8wz" Nov 27 16:06:07 crc kubenswrapper[4707]: I1127 16:06:07.838581 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-5r8tf" Nov 27 16:06:07 crc kubenswrapper[4707]: I1127 16:06:07.840973 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dpqnb\" (UniqueName: \"kubernetes.io/projected/b5e0fd66-2bb6-4b8a-aaf0-e24490a2f1a0-kube-api-access-dpqnb\") pod \"image-registry-697d97f7c8-kwvst\" (UID: \"b5e0fd66-2bb6-4b8a-aaf0-e24490a2f1a0\") " pod="openshift-image-registry/image-registry-697d97f7c8-kwvst" Nov 27 16:06:07 crc kubenswrapper[4707]: W1127 16:06:07.851153 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod50e5ce2e_3776_4890_8b73_b0ae2b2d0237.slice/crio-92e4ea5f1c92e6968382d40db9c50b98048a0e48b25d715a536009880a0b827b WatchSource:0}: Error finding container 92e4ea5f1c92e6968382d40db9c50b98048a0e48b25d715a536009880a0b827b: Status 404 returned error can't find the container with id 92e4ea5f1c92e6968382d40db9c50b98048a0e48b25d715a536009880a0b827b Nov 27 16:06:07 crc kubenswrapper[4707]: I1127 16:06:07.861903 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f4z89\" (UniqueName: \"kubernetes.io/projected/f55dc6de-bb5d-4221-a670-4b65c3992031-kube-api-access-f4z89\") pod \"collect-profiles-29404320-sxj55\" (UID: \"f55dc6de-bb5d-4221-a670-4b65c3992031\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29404320-sxj55" Nov 27 16:06:07 crc kubenswrapper[4707]: I1127 16:06:07.882728 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zgcht\" (UniqueName: \"kubernetes.io/projected/33d637be-496f-43c5-bb76-b742f8cc97ac-kube-api-access-zgcht\") pod \"ingress-canary-zgfxc\" (UID: \"33d637be-496f-43c5-bb76-b742f8cc97ac\") " pod="openshift-ingress-canary/ingress-canary-zgfxc" Nov 27 16:06:07 crc kubenswrapper[4707]: I1127 16:06:07.895536 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kwvst\" (UID: \"b5e0fd66-2bb6-4b8a-aaf0-e24490a2f1a0\") " pod="openshift-image-registry/image-registry-697d97f7c8-kwvst" Nov 27 16:06:07 crc kubenswrapper[4707]: E1127 16:06:07.895963 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-27 16:06:08.395947439 +0000 UTC m=+144.027396217 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kwvst" (UID: "b5e0fd66-2bb6-4b8a-aaf0-e24490a2f1a0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 27 16:06:07 crc kubenswrapper[4707]: I1127 16:06:07.905185 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-28h8m\" (UniqueName: \"kubernetes.io/projected/789119b0-3180-49e9-8d16-c60f968bf6cf-kube-api-access-28h8m\") pod \"openshift-apiserver-operator-796bbdcf4f-74mzf\" (UID: \"789119b0-3180-49e9-8d16-c60f968bf6cf\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-74mzf" Nov 27 16:06:07 crc kubenswrapper[4707]: I1127 16:06:07.916237 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-jqntr"] Nov 27 16:06:07 crc kubenswrapper[4707]: I1127 16:06:07.940465 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nv22p\" (UniqueName: 
\"kubernetes.io/projected/24bf08cf-63a0-47ca-be2a-0f38a51109c9-kube-api-access-nv22p\") pod \"csi-hostpathplugin-slj9l\" (UID: \"24bf08cf-63a0-47ca-be2a-0f38a51109c9\") " pod="hostpath-provisioner/csi-hostpathplugin-slj9l" Nov 27 16:06:07 crc kubenswrapper[4707]: I1127 16:06:07.940934 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-zgfxc" Nov 27 16:06:07 crc kubenswrapper[4707]: I1127 16:06:07.942621 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-rck5n"] Nov 27 16:06:07 crc kubenswrapper[4707]: I1127 16:06:07.945106 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-ghzc4"] Nov 27 16:06:07 crc kubenswrapper[4707]: I1127 16:06:07.948124 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xmjg8\" (UniqueName: \"kubernetes.io/projected/90d768c2-2c32-41a1-b661-763e7c027a94-kube-api-access-xmjg8\") pod \"dns-default-zg2ss\" (UID: \"90d768c2-2c32-41a1-b661-763e7c027a94\") " pod="openshift-dns/dns-default-zg2ss" Nov 27 16:06:07 crc kubenswrapper[4707]: I1127 16:06:07.960883 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-cj6jg"] Nov 27 16:06:07 crc kubenswrapper[4707]: I1127 16:06:07.961449 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-4sj7l"] Nov 27 16:06:07 crc kubenswrapper[4707]: I1127 16:06:07.967346 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2ptg2\" (UniqueName: \"kubernetes.io/projected/4fb5c374-18c5-433f-bb2c-f03d8eea8de6-kube-api-access-2ptg2\") pod \"machine-config-server-85rvh\" (UID: \"4fb5c374-18c5-433f-bb2c-f03d8eea8de6\") " 
pod="openshift-machine-config-operator/machine-config-server-85rvh" Nov 27 16:06:07 crc kubenswrapper[4707]: I1127 16:06:07.977311 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qn8sg\" (UniqueName: \"kubernetes.io/projected/ce9dffbd-e000-476b-b89b-6208d0506f26-kube-api-access-qn8sg\") pod \"service-ca-9c57cc56f-d8nfn\" (UID: \"ce9dffbd-e000-476b-b89b-6208d0506f26\") " pod="openshift-service-ca/service-ca-9c57cc56f-d8nfn" Nov 27 16:06:07 crc kubenswrapper[4707]: I1127 16:06:07.996333 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 27 16:06:07 crc kubenswrapper[4707]: E1127 16:06:07.997084 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-27 16:06:08.496873797 +0000 UTC m=+144.128322565 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 27 16:06:07 crc kubenswrapper[4707]: I1127 16:06:07.998646 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5fkvx\" (UniqueName: \"kubernetes.io/projected/2d3126ab-3db9-4fa7-95ef-673a79b2178a-kube-api-access-5fkvx\") pod \"olm-operator-6b444d44fb-49hfs\" (UID: \"2d3126ab-3db9-4fa7-95ef-673a79b2178a\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-49hfs" Nov 27 16:06:08 crc kubenswrapper[4707]: I1127 16:06:08.002343 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-pfqjd" Nov 27 16:06:08 crc kubenswrapper[4707]: I1127 16:06:08.018395 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kpn4n\" (UniqueName: \"kubernetes.io/projected/eab64752-efb6-4d5d-89b2-43da02ae599f-kube-api-access-kpn4n\") pod \"machine-config-operator-74547568cd-lrj4r\" (UID: \"eab64752-efb6-4d5d-89b2-43da02ae599f\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-lrj4r" Nov 27 16:06:08 crc kubenswrapper[4707]: I1127 16:06:08.032633 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-h746w" Nov 27 16:06:08 crc kubenswrapper[4707]: I1127 16:06:08.040009 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f2tbm\" (UniqueName: \"kubernetes.io/projected/efcebfb2-d130-4f47-a514-01d1bf5eb567-kube-api-access-f2tbm\") pod \"packageserver-d55dfcdfc-h4m82\" (UID: \"efcebfb2-d130-4f47-a514-01d1bf5eb567\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-h4m82" Nov 27 16:06:08 crc kubenswrapper[4707]: I1127 16:06:08.067240 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wjbwn\" (UniqueName: \"kubernetes.io/projected/5450b16d-7dca-4ba0-8184-bb0b4a19dc71-kube-api-access-wjbwn\") pod \"multus-admission-controller-857f4d67dd-72lr4\" (UID: \"5450b16d-7dca-4ba0-8184-bb0b4a19dc71\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-72lr4" Nov 27 16:06:08 crc kubenswrapper[4707]: I1127 16:06:08.083770 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hz9rc\" (UniqueName: \"kubernetes.io/projected/eb6211ec-62fd-4a7a-b220-87ce050b996e-kube-api-access-hz9rc\") pod \"service-ca-operator-777779d784-xbpq4\" (UID: \"eb6211ec-62fd-4a7a-b220-87ce050b996e\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-xbpq4" Nov 27 16:06:08 crc kubenswrapper[4707]: I1127 16:06:08.098394 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kwvst\" (UID: \"b5e0fd66-2bb6-4b8a-aaf0-e24490a2f1a0\") " pod="openshift-image-registry/image-registry-697d97f7c8-kwvst" Nov 27 16:06:08 crc kubenswrapper[4707]: E1127 16:06:08.098859 4707 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-27 16:06:08.598843213 +0000 UTC m=+144.230291981 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kwvst" (UID: "b5e0fd66-2bb6-4b8a-aaf0-e24490a2f1a0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 27 16:06:08 crc kubenswrapper[4707]: I1127 16:06:08.101563 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vw6s5\" (UniqueName: \"kubernetes.io/projected/d9ba519a-a9a8-4d1a-96c1-e66ee87d5c8f-kube-api-access-vw6s5\") pod \"marketplace-operator-79b997595-jtqhs\" (UID: \"d9ba519a-a9a8-4d1a-96c1-e66ee87d5c8f\") " pod="openshift-marketplace/marketplace-operator-79b997595-jtqhs" Nov 27 16:06:08 crc kubenswrapper[4707]: I1127 16:06:08.103116 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-p28h9"] Nov 27 16:06:08 crc kubenswrapper[4707]: I1127 16:06:08.109380 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-nhqwm"] Nov 27 16:06:08 crc kubenswrapper[4707]: I1127 16:06:08.115861 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-7k4zd" Nov 27 16:06:08 crc kubenswrapper[4707]: I1127 16:06:08.128194 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-nhz7q"] Nov 27 16:06:08 crc kubenswrapper[4707]: I1127 16:06:08.147094 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29404320-sxj55" Nov 27 16:06:08 crc kubenswrapper[4707]: I1127 16:06:08.149294 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-49hfs" Nov 27 16:06:08 crc kubenswrapper[4707]: I1127 16:06:08.149612 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-khh5f"] Nov 27 16:06:08 crc kubenswrapper[4707]: I1127 16:06:08.157936 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-jtqhs" Nov 27 16:06:08 crc kubenswrapper[4707]: I1127 16:06:08.164675 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-d8nfn" Nov 27 16:06:08 crc kubenswrapper[4707]: I1127 16:06:08.169766 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-lrj4r" Nov 27 16:06:08 crc kubenswrapper[4707]: I1127 16:06:08.176091 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-72lr4" Nov 27 16:06:08 crc kubenswrapper[4707]: I1127 16:06:08.182183 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-xbpq4" Nov 27 16:06:08 crc kubenswrapper[4707]: I1127 16:06:08.193841 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-74mzf" Nov 27 16:06:08 crc kubenswrapper[4707]: I1127 16:06:08.194287 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-h4m82" Nov 27 16:06:08 crc kubenswrapper[4707]: I1127 16:06:08.194610 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-dlchm"] Nov 27 16:06:08 crc kubenswrapper[4707]: I1127 16:06:08.201993 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 27 16:06:08 crc kubenswrapper[4707]: E1127 16:06:08.202352 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-27 16:06:08.702335817 +0000 UTC m=+144.333784585 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 27 16:06:08 crc kubenswrapper[4707]: W1127 16:06:08.205738 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9c44f508_25ec_493c_ba35_8c6d4f0cf7ae.slice/crio-140886cc3a9ea2170d4ed6ed33520827314549a786fdb9fb6468ae15b304b589 WatchSource:0}: Error finding container 140886cc3a9ea2170d4ed6ed33520827314549a786fdb9fb6468ae15b304b589: Status 404 returned error can't find the container with id 140886cc3a9ea2170d4ed6ed33520827314549a786fdb9fb6468ae15b304b589 Nov 27 16:06:08 crc kubenswrapper[4707]: I1127 16:06:08.216859 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-slj9l" Nov 27 16:06:08 crc kubenswrapper[4707]: I1127 16:06:08.222399 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-85rvh" Nov 27 16:06:08 crc kubenswrapper[4707]: I1127 16:06:08.248064 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-zg2ss" Nov 27 16:06:08 crc kubenswrapper[4707]: I1127 16:06:08.270409 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-khskd"] Nov 27 16:06:08 crc kubenswrapper[4707]: I1127 16:06:08.271921 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-qzltz" event={"ID":"a46480b6-6c0e-4502-81b7-e1c461ba7fa4","Type":"ContainerStarted","Data":"3f259aef3249e514e019c2e964c75b2529b6d8d470006ab933d67819ec0fa2d3"} Nov 27 16:06:08 crc kubenswrapper[4707]: I1127 16:06:08.271988 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-qzltz" event={"ID":"a46480b6-6c0e-4502-81b7-e1c461ba7fa4","Type":"ContainerStarted","Data":"00192efd83a2c7bef7152e64fb51d144a62f224baa4a3c0c8b58504b77e0e447"} Nov 27 16:06:08 crc kubenswrapper[4707]: I1127 16:06:08.273653 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console-operator/console-operator-58897d9998-qzltz" Nov 27 16:06:08 crc kubenswrapper[4707]: I1127 16:06:08.275884 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-ghzc4" event={"ID":"1a3a3427-44f4-4518-861e-f11f5cb76d90","Type":"ContainerStarted","Data":"42b29c422f136823ce347fba2e0f14f599772da4fbaca728f06c220984c877b8"} Nov 27 16:06:08 crc kubenswrapper[4707]: I1127 16:06:08.283275 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-tggrz" event={"ID":"b37b9b2e-05d2-434f-bd01-93cda5a05b52","Type":"ContainerStarted","Data":"edd5a7c6ba93a08cf0d6f661111a8f0ca5b96ee7d09bd3c37b961047becfd19d"} Nov 27 16:06:08 crc kubenswrapper[4707]: I1127 16:06:08.283319 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-tggrz" 
event={"ID":"b37b9b2e-05d2-434f-bd01-93cda5a05b52","Type":"ContainerStarted","Data":"9c3732d2fafbdc49aaf136740d05dd8606378bb523165303968fddf2b091678a"} Nov 27 16:06:08 crc kubenswrapper[4707]: I1127 16:06:08.293491 4707 patch_prober.go:28] interesting pod/console-operator-58897d9998-qzltz container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.6:8443/readyz\": dial tcp 10.217.0.6:8443: connect: connection refused" start-of-body= Nov 27 16:06:08 crc kubenswrapper[4707]: I1127 16:06:08.293594 4707 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-qzltz" podUID="a46480b6-6c0e-4502-81b7-e1c461ba7fa4" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.6:8443/readyz\": dial tcp 10.217.0.6:8443: connect: connection refused" Nov 27 16:06:08 crc kubenswrapper[4707]: W1127 16:06:08.297269 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode084e8d6_5275_426c_9be4_c5f4ee49abef.slice/crio-520c3e321bb26c5c57289bbb3d760e2a354ef59ace81eda68eff37e0995e79c6 WatchSource:0}: Error finding container 520c3e321bb26c5c57289bbb3d760e2a354ef59ace81eda68eff37e0995e79c6: Status 404 returned error can't find the container with id 520c3e321bb26c5c57289bbb3d760e2a354ef59ace81eda68eff37e0995e79c6 Nov 27 16:06:08 crc kubenswrapper[4707]: I1127 16:06:08.303962 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kwvst\" (UID: \"b5e0fd66-2bb6-4b8a-aaf0-e24490a2f1a0\") " pod="openshift-image-registry/image-registry-697d97f7c8-kwvst" Nov 27 16:06:08 crc kubenswrapper[4707]: E1127 16:06:08.305899 4707 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-27 16:06:08.805878543 +0000 UTC m=+144.437327301 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kwvst" (UID: "b5e0fd66-2bb6-4b8a-aaf0-e24490a2f1a0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 27 16:06:08 crc kubenswrapper[4707]: I1127 16:06:08.309257 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-6m8wz"] Nov 27 16:06:08 crc kubenswrapper[4707]: I1127 16:06:08.310184 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-zdxsk" event={"ID":"50e5ce2e-3776-4890-8b73-b0ae2b2d0237","Type":"ContainerStarted","Data":"92e4ea5f1c92e6968382d40db9c50b98048a0e48b25d715a536009880a0b827b"} Nov 27 16:06:08 crc kubenswrapper[4707]: I1127 16:06:08.312611 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-6t7cw"] Nov 27 16:06:08 crc kubenswrapper[4707]: I1127 16:06:08.319889 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-dmdnz"] Nov 27 16:06:08 crc kubenswrapper[4707]: I1127 16:06:08.326659 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-xfj8l" event={"ID":"dad82366-6967-4fba-9d3d-763ba28e9a73","Type":"ContainerStarted","Data":"9a30b7c7f634eb7f36fe6e8d01cfcff7105f302f634ddd934b3c6e0ea6ae3387"} Nov 27 16:06:08 crc 
kubenswrapper[4707]: I1127 16:06:08.326691 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-xfj8l" event={"ID":"dad82366-6967-4fba-9d3d-763ba28e9a73","Type":"ContainerStarted","Data":"300a0980b03c78fd05f993b22579fbe0a2e50e231854c03e3b4531c7ba0b56ae"} Nov 27 16:06:08 crc kubenswrapper[4707]: I1127 16:06:08.333741 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-cj6jg" event={"ID":"eae3cd03-ce75-4cc2-95d2-32a63d34ba10","Type":"ContainerStarted","Data":"b320470309bf1ebf453f9c496e424c7f6dbb70b0c3d0d2669a94ce43a88f6f99"} Nov 27 16:06:08 crc kubenswrapper[4707]: I1127 16:06:08.343316 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-jxgm7" event={"ID":"3fdf4f4a-233e-409a-8cd7-dcd6efd5b2cf","Type":"ContainerStarted","Data":"8d98173d610ce56df91ded86fe8e461d5e44b6d95c467d0f07607215081f4dcd"} Nov 27 16:06:08 crc kubenswrapper[4707]: I1127 16:06:08.344783 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-khh5f" event={"ID":"9c44f508-25ec-493c-ba35-8c6d4f0cf7ae","Type":"ContainerStarted","Data":"140886cc3a9ea2170d4ed6ed33520827314549a786fdb9fb6468ae15b304b589"} Nov 27 16:06:08 crc kubenswrapper[4707]: W1127 16:06:08.347585 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9744e361_2c35_436c_a453_984cff9d923f.slice/crio-d6d341afd35797f479aad73ebcd2244fa7b6e30a54c78b6f928c997d41d50e91 WatchSource:0}: Error finding container d6d341afd35797f479aad73ebcd2244fa7b6e30a54c78b6f928c997d41d50e91: Status 404 returned error can't find the container with id d6d341afd35797f479aad73ebcd2244fa7b6e30a54c78b6f928c997d41d50e91 Nov 27 16:06:08 crc kubenswrapper[4707]: W1127 
16:06:08.352001 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podba92c9fa_a7ba_4ae5_87be_4d0ded13bba1.slice/crio-b8ac8509a924af0948e8ca3e774f4a625ee5ff3646398f629633def0b7be6ec6 WatchSource:0}: Error finding container b8ac8509a924af0948e8ca3e774f4a625ee5ff3646398f629633def0b7be6ec6: Status 404 returned error can't find the container with id b8ac8509a924af0948e8ca3e774f4a625ee5ff3646398f629633def0b7be6ec6 Nov 27 16:06:08 crc kubenswrapper[4707]: I1127 16:06:08.361790 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-thf5c" event={"ID":"78b8999d-9535-4584-baa0-5fd38838ac29","Type":"ContainerStarted","Data":"c7b34cdc03e94808f59c5ec4ffea0087550e759a5efc40b872813d887f7041d2"} Nov 27 16:06:08 crc kubenswrapper[4707]: I1127 16:06:08.362012 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-thf5c" event={"ID":"78b8999d-9535-4584-baa0-5fd38838ac29","Type":"ContainerStarted","Data":"cf4163b27dfe243494f88d51540a34235ede16a4fd35d2d0c10c3f5dff74e207"} Nov 27 16:06:08 crc kubenswrapper[4707]: I1127 16:06:08.362025 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-thf5c" event={"ID":"78b8999d-9535-4584-baa0-5fd38838ac29","Type":"ContainerStarted","Data":"bbc9e90cb2840a0a4b00f3a7d120927b34af2ec92fe3a66aebbb92634ef18abc"} Nov 27 16:06:08 crc kubenswrapper[4707]: I1127 16:06:08.364174 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-jqntr" event={"ID":"8a947df3-154e-488f-9e1c-4a41cc94553c","Type":"ContainerStarted","Data":"03e97a1852939ee46134017fcb771666e82957cdb6945880b467c35dc578abca"} Nov 27 16:06:08 crc kubenswrapper[4707]: I1127 16:06:08.368851 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-controller-manager/controller-manager-879f6c89f-gg89j" event={"ID":"30208d57-8abd-4956-92a1-1b1aa21b754a","Type":"ContainerStarted","Data":"941d944101a47580f4cdbac8a54a1f585ed7037f86417f59ebe45067f09ab75a"} Nov 27 16:06:08 crc kubenswrapper[4707]: I1127 16:06:08.370116 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-kl4m7" event={"ID":"8f007d8e-df31-4bfd-8879-54a84ed4b62d","Type":"ContainerStarted","Data":"e3b71936af39c206f4762aa7549f732bc3d6d20d71cf137529eedaa546b2181f"} Nov 27 16:06:08 crc kubenswrapper[4707]: I1127 16:06:08.370144 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-kl4m7" event={"ID":"8f007d8e-df31-4bfd-8879-54a84ed4b62d","Type":"ContainerStarted","Data":"de4c758325413e2dbc47d7a2fb6fa7800751151838a90347d8d4d7a3abc54753"} Nov 27 16:06:08 crc kubenswrapper[4707]: I1127 16:06:08.389506 4707 generic.go:334] "Generic (PLEG): container finished" podID="b735e79c-0093-4887-b27a-6e333d9d80a5" containerID="a5bbd833ed9b672960ce50279ecaac439d9de86a8a75b21d8de451b7a88866b4" exitCode=0 Nov 27 16:06:08 crc kubenswrapper[4707]: I1127 16:06:08.389651 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-hg2x5" event={"ID":"b735e79c-0093-4887-b27a-6e333d9d80a5","Type":"ContainerDied","Data":"a5bbd833ed9b672960ce50279ecaac439d9de86a8a75b21d8de451b7a88866b4"} Nov 27 16:06:08 crc kubenswrapper[4707]: I1127 16:06:08.389693 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-hg2x5" event={"ID":"b735e79c-0093-4887-b27a-6e333d9d80a5","Type":"ContainerStarted","Data":"916515b8182265c179836b7e0ba273078570bb1248b435d0271a8a6866c2c585"} Nov 27 16:06:08 crc kubenswrapper[4707]: I1127 16:06:08.398072 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-4sj7l" event={"ID":"ecc19675-2c51-4a55-b870-7906093e3de2","Type":"ContainerStarted","Data":"ce391bc9d882b0bf1fcf0ca16c139233f15b4c64e8db93a8ef1d6b84c4373fdc"} Nov 27 16:06:08 crc kubenswrapper[4707]: I1127 16:06:08.401301 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-rck5n" event={"ID":"9a65caa3-afbb-4401-916f-fd7a4a3ea46e","Type":"ContainerStarted","Data":"4f895adf9a35e772c050d6a8696ccfbc8e2504b2c5026d538ce6c28f436070c7"} Nov 27 16:06:08 crc kubenswrapper[4707]: I1127 16:06:08.404684 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 27 16:06:08 crc kubenswrapper[4707]: I1127 16:06:08.407172 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-4h99l" event={"ID":"b43ea736-a4b5-473d-a3a2-3d779a856a86","Type":"ContainerStarted","Data":"31feca582c3e58d3e4a3e9c4f008b67c76ef182fd875abb92ea66e013e5a9a7d"} Nov 27 16:06:08 crc kubenswrapper[4707]: I1127 16:06:08.407219 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-4h99l" event={"ID":"b43ea736-a4b5-473d-a3a2-3d779a856a86","Type":"ContainerStarted","Data":"d5df3df523ed7d1b1141340fee7f04d386e882c4300f8d0af10dc3d004c7e88e"} Nov 27 16:06:08 crc kubenswrapper[4707]: E1127 16:06:08.408208 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-11-27 16:06:08.908189308 +0000 UTC m=+144.539638076 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 27 16:06:08 crc kubenswrapper[4707]: I1127 16:06:08.409423 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-4h99l" Nov 27 16:06:08 crc kubenswrapper[4707]: I1127 16:06:08.411412 4707 patch_prober.go:28] interesting pod/route-controller-manager-6576b87f9c-4h99l container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.16:8443/healthz\": dial tcp 10.217.0.16:8443: connect: connection refused" start-of-body= Nov 27 16:06:08 crc kubenswrapper[4707]: I1127 16:06:08.411452 4707 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-4h99l" podUID="b43ea736-a4b5-473d-a3a2-3d779a856a86" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.16:8443/healthz\": dial tcp 10.217.0.16:8443: connect: connection refused" Nov 27 16:06:08 crc kubenswrapper[4707]: I1127 16:06:08.474434 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-zgfxc"] Nov 27 16:06:08 crc kubenswrapper[4707]: I1127 16:06:08.486131 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-pfqjd"] Nov 27 16:06:08 crc kubenswrapper[4707]: I1127 16:06:08.506337 4707 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kwvst\" (UID: \"b5e0fd66-2bb6-4b8a-aaf0-e24490a2f1a0\") " pod="openshift-image-registry/image-registry-697d97f7c8-kwvst" Nov 27 16:06:08 crc kubenswrapper[4707]: E1127 16:06:08.509651 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-27 16:06:09.00962984 +0000 UTC m=+144.641078608 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kwvst" (UID: "b5e0fd66-2bb6-4b8a-aaf0-e24490a2f1a0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 27 16:06:08 crc kubenswrapper[4707]: I1127 16:06:08.556957 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-5r8tf"] Nov 27 16:06:08 crc kubenswrapper[4707]: I1127 16:06:08.607085 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 27 16:06:08 crc kubenswrapper[4707]: E1127 16:06:08.607513 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 
podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-27 16:06:09.107493341 +0000 UTC m=+144.738942109 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 27 16:06:08 crc kubenswrapper[4707]: I1127 16:06:08.620043 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-h746w"] Nov 27 16:06:08 crc kubenswrapper[4707]: W1127 16:06:08.649250 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod33d637be_496f_43c5_bb76_b742f8cc97ac.slice/crio-881ed53b0db3b391d9ed684cb8636ca3e26b4f5496c159d41ad395bba6bd5be7 WatchSource:0}: Error finding container 881ed53b0db3b391d9ed684cb8636ca3e26b4f5496c159d41ad395bba6bd5be7: Status 404 returned error can't find the container with id 881ed53b0db3b391d9ed684cb8636ca3e26b4f5496c159d41ad395bba6bd5be7 Nov 27 16:06:08 crc kubenswrapper[4707]: I1127 16:06:08.709207 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kwvst\" (UID: \"b5e0fd66-2bb6-4b8a-aaf0-e24490a2f1a0\") " pod="openshift-image-registry/image-registry-697d97f7c8-kwvst" Nov 27 16:06:08 crc kubenswrapper[4707]: E1127 16:06:08.709650 4707 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-27 16:06:09.209633341 +0000 UTC m=+144.841082109 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kwvst" (UID: "b5e0fd66-2bb6-4b8a-aaf0-e24490a2f1a0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 27 16:06:08 crc kubenswrapper[4707]: I1127 16:06:08.727844 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5444994796-tggrz" Nov 27 16:06:08 crc kubenswrapper[4707]: I1127 16:06:08.730300 4707 patch_prober.go:28] interesting pod/router-default-5444994796-tggrz container/router namespace/openshift-ingress: Startup probe status=failure output="Get \"http://localhost:1936/healthz/ready\": dial tcp [::1]:1936: connect: connection refused" start-of-body= Nov 27 16:06:08 crc kubenswrapper[4707]: I1127 16:06:08.730332 4707 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-tggrz" podUID="b37b9b2e-05d2-434f-bd01-93cda5a05b52" containerName="router" probeResult="failure" output="Get \"http://localhost:1936/healthz/ready\": dial tcp [::1]:1936: connect: connection refused" Nov 27 16:06:08 crc kubenswrapper[4707]: I1127 16:06:08.769890 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-7k4zd"] Nov 27 16:06:08 crc kubenswrapper[4707]: W1127 16:06:08.776567 4707 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode8027394_2524_45df_8cdc_967024215d25.slice/crio-51e800723b2168240fb6078eedbc5303915e25186634d4bc1e34cc9f68d6e7cf WatchSource:0}: Error finding container 51e800723b2168240fb6078eedbc5303915e25186634d4bc1e34cc9f68d6e7cf: Status 404 returned error can't find the container with id 51e800723b2168240fb6078eedbc5303915e25186634d4bc1e34cc9f68d6e7cf Nov 27 16:06:08 crc kubenswrapper[4707]: I1127 16:06:08.816017 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 27 16:06:08 crc kubenswrapper[4707]: E1127 16:06:08.816442 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-27 16:06:09.31642468 +0000 UTC m=+144.947873448 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 27 16:06:08 crc kubenswrapper[4707]: I1127 16:06:08.844546 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-zg2ss"] Nov 27 16:06:08 crc kubenswrapper[4707]: I1127 16:06:08.875833 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-xbpq4"] Nov 27 16:06:08 crc kubenswrapper[4707]: I1127 16:06:08.920695 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kwvst\" (UID: \"b5e0fd66-2bb6-4b8a-aaf0-e24490a2f1a0\") " pod="openshift-image-registry/image-registry-697d97f7c8-kwvst" Nov 27 16:06:08 crc kubenswrapper[4707]: E1127 16:06:08.921094 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-27 16:06:09.421079364 +0000 UTC m=+145.052528132 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kwvst" (UID: "b5e0fd66-2bb6-4b8a-aaf0-e24490a2f1a0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 27 16:06:08 crc kubenswrapper[4707]: W1127 16:06:08.924597 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7af50556_1505_45f2_b080_f1484a42f2cd.slice/crio-4c85d6949064cb98d33f0a875068a024a0085fa191789e83b0e2e2503227feaf WatchSource:0}: Error finding container 4c85d6949064cb98d33f0a875068a024a0085fa191789e83b0e2e2503227feaf: Status 404 returned error can't find the container with id 4c85d6949064cb98d33f0a875068a024a0085fa191789e83b0e2e2503227feaf Nov 27 16:06:09 crc kubenswrapper[4707]: I1127 16:06:09.025882 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 27 16:06:09 crc kubenswrapper[4707]: E1127 16:06:09.027740 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-27 16:06:09.527712179 +0000 UTC m=+145.159160947 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 27 16:06:09 crc kubenswrapper[4707]: I1127 16:06:09.130418 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kwvst\" (UID: \"b5e0fd66-2bb6-4b8a-aaf0-e24490a2f1a0\") " pod="openshift-image-registry/image-registry-697d97f7c8-kwvst" Nov 27 16:06:09 crc kubenswrapper[4707]: E1127 16:06:09.130765 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-27 16:06:09.630750782 +0000 UTC m=+145.262199550 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kwvst" (UID: "b5e0fd66-2bb6-4b8a-aaf0-e24490a2f1a0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 27 16:06:09 crc kubenswrapper[4707]: I1127 16:06:09.145943 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-slj9l"] Nov 27 16:06:09 crc kubenswrapper[4707]: I1127 16:06:09.162720 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-xfj8l" podStartSLOduration=124.162680058 podStartE2EDuration="2m4.162680058s" podCreationTimestamp="2025-11-27 16:04:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-27 16:06:09.154986862 +0000 UTC m=+144.786435630" watchObservedRunningTime="2025-11-27 16:06:09.162680058 +0000 UTC m=+144.794128826" Nov 27 16:06:09 crc kubenswrapper[4707]: W1127 16:06:09.180878 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod90d768c2_2c32_41a1_b661_763e7c027a94.slice/crio-e26806f6d8d993acc218866249cca74a4153c351649e8e66ec191bee94e6db44 WatchSource:0}: Error finding container e26806f6d8d993acc218866249cca74a4153c351649e8e66ec191bee94e6db44: Status 404 returned error can't find the container with id e26806f6d8d993acc218866249cca74a4153c351649e8e66ec191bee94e6db44 Nov 27 16:06:09 crc kubenswrapper[4707]: I1127 16:06:09.233676 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 27 16:06:09 crc kubenswrapper[4707]: E1127 16:06:09.234558 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-27 16:06:09.734537625 +0000 UTC m=+145.365986393 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 27 16:06:09 crc kubenswrapper[4707]: I1127 16:06:09.325636 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication-operator/authentication-operator-69f744f599-kl4m7" podStartSLOduration=124.325608492 podStartE2EDuration="2m4.325608492s" podCreationTimestamp="2025-11-27 16:04:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-27 16:06:09.318240544 +0000 UTC m=+144.949689312" watchObservedRunningTime="2025-11-27 16:06:09.325608492 +0000 UTC m=+144.957057260" Nov 27 16:06:09 crc kubenswrapper[4707]: I1127 16:06:09.336865 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kwvst\" (UID: 
\"b5e0fd66-2bb6-4b8a-aaf0-e24490a2f1a0\") " pod="openshift-image-registry/image-registry-697d97f7c8-kwvst" Nov 27 16:06:09 crc kubenswrapper[4707]: E1127 16:06:09.337354 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-27 16:06:09.837339392 +0000 UTC m=+145.468788160 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kwvst" (UID: "b5e0fd66-2bb6-4b8a-aaf0-e24490a2f1a0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 27 16:06:09 crc kubenswrapper[4707]: W1127 16:06:09.353869 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod24bf08cf_63a0_47ca_be2a_0f38a51109c9.slice/crio-a463d2c857347815e19679b7e260978f740fb157c5f0ccc63bf2b9b41280ed53 WatchSource:0}: Error finding container a463d2c857347815e19679b7e260978f740fb157c5f0ccc63bf2b9b41280ed53: Status 404 returned error can't find the container with id a463d2c857347815e19679b7e260978f740fb157c5f0ccc63bf2b9b41280ed53 Nov 27 16:06:09 crc kubenswrapper[4707]: I1127 16:06:09.398987 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-49hfs"] Nov 27 16:06:09 crc kubenswrapper[4707]: I1127 16:06:09.440621 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 27 16:06:09 crc kubenswrapper[4707]: E1127 16:06:09.440944 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-27 16:06:09.940922739 +0000 UTC m=+145.572371507 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 27 16:06:09 crc kubenswrapper[4707]: I1127 16:06:09.466252 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-zgfxc" event={"ID":"33d637be-496f-43c5-bb76-b742f8cc97ac","Type":"ContainerStarted","Data":"881ed53b0db3b391d9ed684cb8636ca3e26b4f5496c159d41ad395bba6bd5be7"} Nov 27 16:06:09 crc kubenswrapper[4707]: I1127 16:06:09.490127 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-6m8wz" event={"ID":"9c2fdb4e-b99d-4dc6-b5a0-5f5c38954e7d","Type":"ContainerStarted","Data":"86ee3a960383a3d658a02aa381bf86b247fef672f878a6d7558f1d5868f2cbfe"} Nov 27 16:06:09 crc kubenswrapper[4707]: I1127 16:06:09.490178 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-6m8wz" event={"ID":"9c2fdb4e-b99d-4dc6-b5a0-5f5c38954e7d","Type":"ContainerStarted","Data":"15eb6b01feb61bcb6c203920ebcbbd45726265fb4f46ba6ccfe9bd3c1a2afe9f"} Nov 27 16:06:09 crc kubenswrapper[4707]: I1127 
16:06:09.504119 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-74mzf"] Nov 27 16:06:09 crc kubenswrapper[4707]: I1127 16:06:09.505523 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-6m8wz" Nov 27 16:06:09 crc kubenswrapper[4707]: I1127 16:06:09.523416 4707 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-6m8wz container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.37:8443/healthz\": dial tcp 10.217.0.37:8443: connect: connection refused" start-of-body= Nov 27 16:06:09 crc kubenswrapper[4707]: I1127 16:06:09.524002 4707 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-6m8wz" podUID="9c2fdb4e-b99d-4dc6-b5a0-5f5c38954e7d" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.37:8443/healthz\": dial tcp 10.217.0.37:8443: connect: connection refused" Nov 27 16:06:09 crc kubenswrapper[4707]: I1127 16:06:09.542747 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-p28h9" event={"ID":"bae1e9aa-742e-4e00-ba4a-2edad6d45e95","Type":"ContainerStarted","Data":"acc0d8552da626daa67b4ef0e5dcdba96fffd65265783b9aba0c1871c05b9daa"} Nov 27 16:06:09 crc kubenswrapper[4707]: I1127 16:06:09.542817 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-p28h9" event={"ID":"bae1e9aa-742e-4e00-ba4a-2edad6d45e95","Type":"ContainerStarted","Data":"591de5b291c5d8ebace731c2466c75f859ffa0cf494b84abc43eafaec297ef21"} Nov 27 16:06:09 crc kubenswrapper[4707]: I1127 16:06:09.544053 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-marketplace/marketplace-operator-79b997595-jtqhs"] Nov 27 16:06:09 crc kubenswrapper[4707]: I1127 16:06:09.544753 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kwvst\" (UID: \"b5e0fd66-2bb6-4b8a-aaf0-e24490a2f1a0\") " pod="openshift-image-registry/image-registry-697d97f7c8-kwvst" Nov 27 16:06:09 crc kubenswrapper[4707]: E1127 16:06:09.545337 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-27 16:06:10.045321197 +0000 UTC m=+145.676769965 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kwvst" (UID: "b5e0fd66-2bb6-4b8a-aaf0-e24490a2f1a0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 27 16:06:09 crc kubenswrapper[4707]: I1127 16:06:09.549419 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-slj9l" event={"ID":"24bf08cf-63a0-47ca-be2a-0f38a51109c9","Type":"ContainerStarted","Data":"a463d2c857347815e19679b7e260978f740fb157c5f0ccc63bf2b9b41280ed53"} Nov 27 16:06:09 crc kubenswrapper[4707]: I1127 16:06:09.560058 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-7k4zd" 
event={"ID":"7af50556-1505-45f2-b080-f1484a42f2cd","Type":"ContainerStarted","Data":"4c85d6949064cb98d33f0a875068a024a0085fa191789e83b0e2e2503227feaf"} Nov 27 16:06:09 crc kubenswrapper[4707]: I1127 16:06:09.590002 4707 generic.go:334] "Generic (PLEG): container finished" podID="8a947df3-154e-488f-9e1c-4a41cc94553c" containerID="13b949b5c9a2bd0c735eeb2fcfd2514cc6b0077d7ac96d72bdeea2586b2a4454" exitCode=0 Nov 27 16:06:09 crc kubenswrapper[4707]: I1127 16:06:09.590355 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-jqntr" event={"ID":"8a947df3-154e-488f-9e1c-4a41cc94553c","Type":"ContainerDied","Data":"13b949b5c9a2bd0c735eeb2fcfd2514cc6b0077d7ac96d72bdeea2586b2a4454"} Nov 27 16:06:09 crc kubenswrapper[4707]: I1127 16:06:09.614760 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-nhz7q" event={"ID":"5e7a06d3-2413-4179-83b7-db23583f1c6d","Type":"ContainerStarted","Data":"38a6257be69794e7f983f1780c7ed2ede8c7ea951ec5a0fcf8d36c8edaafd92b"} Nov 27 16:06:09 crc kubenswrapper[4707]: I1127 16:06:09.622672 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-4sj7l" event={"ID":"ecc19675-2c51-4a55-b870-7906093e3de2","Type":"ContainerStarted","Data":"25095def84a776af9a4f804eca41e7301d8695f16ab9272179ce8699376027c3"} Nov 27 16:06:09 crc kubenswrapper[4707]: I1127 16:06:09.626908 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-zdxsk" event={"ID":"50e5ce2e-3776-4890-8b73-b0ae2b2d0237","Type":"ContainerStarted","Data":"565ab4a469bad63d4cc0956133a594ee6e911344f127054d6e16b9735f03ec1e"} Nov 27 16:06:09 crc kubenswrapper[4707]: I1127 16:06:09.628216 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-zdxsk" Nov 27 16:06:09 crc kubenswrapper[4707]: I1127 16:06:09.642409 
4707 patch_prober.go:28] interesting pod/downloads-7954f5f757-zdxsk container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.7:8080/\": dial tcp 10.217.0.7:8080: connect: connection refused" start-of-body= Nov 27 16:06:09 crc kubenswrapper[4707]: I1127 16:06:09.642466 4707 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-zdxsk" podUID="50e5ce2e-3776-4890-8b73-b0ae2b2d0237" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.7:8080/\": dial tcp 10.217.0.7:8080: connect: connection refused" Nov 27 16:06:09 crc kubenswrapper[4707]: I1127 16:06:09.642531 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-h4m82"] Nov 27 16:06:09 crc kubenswrapper[4707]: I1127 16:06:09.649024 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 27 16:06:09 crc kubenswrapper[4707]: E1127 16:06:09.650622 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-27 16:06:10.150603967 +0000 UTC m=+145.782052735 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 27 16:06:09 crc kubenswrapper[4707]: I1127 16:06:09.651133 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29404320-sxj55"] Nov 27 16:06:09 crc kubenswrapper[4707]: I1127 16:06:09.659377 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-cj6jg" event={"ID":"eae3cd03-ce75-4cc2-95d2-32a63d34ba10","Type":"ContainerStarted","Data":"c39dd83210f01e599cab0d997c1f4248c61fc9996a105fbc761b1548b8a361df"} Nov 27 16:06:09 crc kubenswrapper[4707]: I1127 16:06:09.660508 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-58897d9998-qzltz" podStartSLOduration=124.66049504 podStartE2EDuration="2m4.66049504s" podCreationTimestamp="2025-11-27 16:04:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-27 16:06:09.643194548 +0000 UTC m=+145.274643316" watchObservedRunningTime="2025-11-27 16:06:09.66049504 +0000 UTC m=+145.291943808" Nov 27 16:06:09 crc kubenswrapper[4707]: I1127 16:06:09.661165 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-85rvh" event={"ID":"4fb5c374-18c5-433f-bb2c-f03d8eea8de6","Type":"ContainerStarted","Data":"220fb3987f456577b6d63d64f539827fe3f6ce04a19e8d524e40d499354f984c"} Nov 27 16:06:09 crc kubenswrapper[4707]: I1127 
16:06:09.670287 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-pfqjd" event={"ID":"4bd96a64-1d3d-464d-a386-26a39642ee24","Type":"ContainerStarted","Data":"180548e8c80152034ecd8d619b4d03e70e4ffcc6a8d5cfe2e58aa7f12b7730e0"} Nov 27 16:06:09 crc kubenswrapper[4707]: I1127 16:06:09.684407 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-xbpq4" event={"ID":"eb6211ec-62fd-4a7a-b220-87ce050b996e","Type":"ContainerStarted","Data":"b0a2d93d5a855aae94ba9855925eacd1de727a049fc1b2eb502e4fb867cf5768"} Nov 27 16:06:09 crc kubenswrapper[4707]: I1127 16:06:09.689710 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-5444994796-tggrz" podStartSLOduration=124.689685776 podStartE2EDuration="2m4.689685776s" podCreationTimestamp="2025-11-27 16:04:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-27 16:06:09.679407993 +0000 UTC m=+145.310856761" watchObservedRunningTime="2025-11-27 16:06:09.689685776 +0000 UTC m=+145.321134544" Nov 27 16:06:09 crc kubenswrapper[4707]: I1127 16:06:09.690211 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-d8nfn"] Nov 27 16:06:09 crc kubenswrapper[4707]: I1127 16:06:09.690244 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-lrj4r"] Nov 27 16:06:09 crc kubenswrapper[4707]: I1127 16:06:09.700028 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-zg2ss" event={"ID":"90d768c2-2c32-41a1-b661-763e7c027a94","Type":"ContainerStarted","Data":"e26806f6d8d993acc218866249cca74a4153c351649e8e66ec191bee94e6db44"} Nov 27 16:06:09 crc kubenswrapper[4707]: I1127 16:06:09.712236 4707 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-apiserver/apiserver-76f77b778f-khskd" event={"ID":"9744e361-2c35-436c-a453-984cff9d923f","Type":"ContainerStarted","Data":"d6d341afd35797f479aad73ebcd2244fa7b6e30a54c78b6f928c997d41d50e91"} Nov 27 16:06:09 crc kubenswrapper[4707]: I1127 16:06:09.714289 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-dmdnz" event={"ID":"ba92c9fa-a7ba-4ae5-87be-4d0ded13bba1","Type":"ContainerStarted","Data":"b8ac8509a924af0948e8ca3e774f4a625ee5ff3646398f629633def0b7be6ec6"} Nov 27 16:06:09 crc kubenswrapper[4707]: I1127 16:06:09.718032 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-6t7cw" event={"ID":"5f4261e0-d038-4c38-8a9f-27c22bb95d1c","Type":"ContainerStarted","Data":"ddbdc4948f59c9881666738e754ed3e0d0a7c251f0674d99bf14e431a5de2a9c"} Nov 27 16:06:09 crc kubenswrapper[4707]: I1127 16:06:09.724717 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-72lr4"] Nov 27 16:06:09 crc kubenswrapper[4707]: I1127 16:06:09.736611 4707 patch_prober.go:28] interesting pod/router-default-5444994796-tggrz container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Nov 27 16:06:09 crc kubenswrapper[4707]: [-]has-synced failed: reason withheld Nov 27 16:06:09 crc kubenswrapper[4707]: [+]process-running ok Nov 27 16:06:09 crc kubenswrapper[4707]: healthz check failed Nov 27 16:06:09 crc kubenswrapper[4707]: I1127 16:06:09.737009 4707 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-tggrz" podUID="b37b9b2e-05d2-434f-bd01-93cda5a05b52" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Nov 27 16:06:09 crc kubenswrapper[4707]: I1127 
16:06:09.750471 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kwvst\" (UID: \"b5e0fd66-2bb6-4b8a-aaf0-e24490a2f1a0\") " pod="openshift-image-registry/image-registry-697d97f7c8-kwvst" Nov 27 16:06:09 crc kubenswrapper[4707]: E1127 16:06:09.752638 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-27 16:06:10.252625644 +0000 UTC m=+145.884074402 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kwvst" (UID: "b5e0fd66-2bb6-4b8a-aaf0-e24490a2f1a0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 27 16:06:09 crc kubenswrapper[4707]: I1127 16:06:09.759810 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-jxgm7" event={"ID":"3fdf4f4a-233e-409a-8cd7-dcd6efd5b2cf","Type":"ContainerStarted","Data":"1b2502aea9b75734859c51be22c13dff311965d64b0a82b9d83678094b25fb90"} Nov 27 16:06:09 crc kubenswrapper[4707]: I1127 16:06:09.760868 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-558db77b4-jxgm7" Nov 27 16:06:09 crc kubenswrapper[4707]: W1127 16:06:09.765547 4707 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podeab64752_efb6_4d5d_89b2_43da02ae599f.slice/crio-bb932d0ccf0fefe4aac0129f76e5a01f2f1a37532ecb70d063894395058470fd WatchSource:0}: Error finding container bb932d0ccf0fefe4aac0129f76e5a01f2f1a37532ecb70d063894395058470fd: Status 404 returned error can't find the container with id bb932d0ccf0fefe4aac0129f76e5a01f2f1a37532ecb70d063894395058470fd Nov 27 16:06:09 crc kubenswrapper[4707]: I1127 16:06:09.769862 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/machine-api-operator-5694c8668f-thf5c" podStartSLOduration=124.769849044 podStartE2EDuration="2m4.769849044s" podCreationTimestamp="2025-11-27 16:04:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-27 16:06:09.7669391 +0000 UTC m=+145.398387868" watchObservedRunningTime="2025-11-27 16:06:09.769849044 +0000 UTC m=+145.401297812" Nov 27 16:06:09 crc kubenswrapper[4707]: I1127 16:06:09.774933 4707 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-jxgm7 container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.11:6443/healthz\": dial tcp 10.217.0.11:6443: connect: connection refused" start-of-body= Nov 27 16:06:09 crc kubenswrapper[4707]: I1127 16:06:09.774975 4707 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-jxgm7" podUID="3fdf4f4a-233e-409a-8cd7-dcd6efd5b2cf" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.11:6443/healthz\": dial tcp 10.217.0.11:6443: connect: connection refused" Nov 27 16:06:09 crc kubenswrapper[4707]: I1127 16:06:09.778199 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-nhqwm" 
event={"ID":"21940f54-9029-467c-aaac-7580e44761a4","Type":"ContainerStarted","Data":"46911c8384b1cccdc3c377470a9467f5108465bb39ce60fff9a3d0deaf31058c"} Nov 27 16:06:09 crc kubenswrapper[4707]: I1127 16:06:09.778229 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-nhqwm" event={"ID":"21940f54-9029-467c-aaac-7580e44761a4","Type":"ContainerStarted","Data":"0fda51924e422063eb6b118226fa429cf27f8cee02d4fac5f9d79b127458722d"} Nov 27 16:06:09 crc kubenswrapper[4707]: I1127 16:06:09.786329 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-gg89j" event={"ID":"30208d57-8abd-4956-92a1-1b1aa21b754a","Type":"ContainerStarted","Data":"5684188d28916ccd6015ae03a44bfd1f8c50f19d434bb6e2d457c3766232f1df"} Nov 27 16:06:09 crc kubenswrapper[4707]: I1127 16:06:09.787409 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-879f6c89f-gg89j" Nov 27 16:06:09 crc kubenswrapper[4707]: I1127 16:06:09.808706 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-h746w" event={"ID":"e8027394-2524-45df-8cdc-967024215d25","Type":"ContainerStarted","Data":"51e800723b2168240fb6078eedbc5303915e25186634d4bc1e34cc9f68d6e7cf"} Nov 27 16:06:09 crc kubenswrapper[4707]: I1127 16:06:09.812152 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-4h99l" podStartSLOduration=123.812103334 podStartE2EDuration="2m3.812103334s" podCreationTimestamp="2025-11-27 16:04:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-27 16:06:09.809487907 +0000 UTC m=+145.440936675" watchObservedRunningTime="2025-11-27 16:06:09.812103334 
+0000 UTC m=+145.443552092" Nov 27 16:06:09 crc kubenswrapper[4707]: I1127 16:06:09.813828 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-879f6c89f-gg89j" Nov 27 16:06:09 crc kubenswrapper[4707]: I1127 16:06:09.827642 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-rck5n" event={"ID":"9a65caa3-afbb-4401-916f-fd7a4a3ea46e","Type":"ContainerStarted","Data":"859e9054a953212509a20702a6c65a7e0104c0d3782fcae88307e8c73f921f73"} Nov 27 16:06:09 crc kubenswrapper[4707]: I1127 16:06:09.830417 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-ghzc4" event={"ID":"1a3a3427-44f4-4518-861e-f11f5cb76d90","Type":"ContainerStarted","Data":"89d9252138f27533444d16179459c1f705ec9756394f37615ab29d9a790fa1d1"} Nov 27 16:06:09 crc kubenswrapper[4707]: I1127 16:06:09.851639 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 27 16:06:09 crc kubenswrapper[4707]: E1127 16:06:09.852656 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-27 16:06:10.35263727 +0000 UTC m=+145.984086028 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 27 16:06:09 crc kubenswrapper[4707]: I1127 16:06:09.854645 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-5r8tf" event={"ID":"36bf60c9-93cb-431f-9df1-1d3e245c49ef","Type":"ContainerStarted","Data":"dfe0d7511f231419b5c61009b684627e9e719a6dbddfe2c4f90121d653871f84"} Nov 27 16:06:09 crc kubenswrapper[4707]: I1127 16:06:09.866221 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-dlchm" event={"ID":"e084e8d6-5275-426c-9be4-c5f4ee49abef","Type":"ContainerStarted","Data":"520c3e321bb26c5c57289bbb3d760e2a354ef59ace81eda68eff37e0995e79c6"} Nov 27 16:06:09 crc kubenswrapper[4707]: I1127 16:06:09.873714 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-4h99l" Nov 27 16:06:09 crc kubenswrapper[4707]: I1127 16:06:09.956431 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kwvst\" (UID: \"b5e0fd66-2bb6-4b8a-aaf0-e24490a2f1a0\") " pod="openshift-image-registry/image-registry-697d97f7c8-kwvst" Nov 27 16:06:09 crc kubenswrapper[4707]: E1127 16:06:09.959832 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 
podName: nodeName:}" failed. No retries permitted until 2025-11-27 16:06:10.459818139 +0000 UTC m=+146.091266907 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kwvst" (UID: "b5e0fd66-2bb6-4b8a-aaf0-e24490a2f1a0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 27 16:06:10 crc kubenswrapper[4707]: I1127 16:06:10.062454 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 27 16:06:10 crc kubenswrapper[4707]: I1127 16:06:10.086332 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-58897d9998-qzltz" Nov 27 16:06:10 crc kubenswrapper[4707]: E1127 16:06:10.087181 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-27 16:06:10.587155633 +0000 UTC m=+146.218604401 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 27 16:06:10 crc kubenswrapper[4707]: I1127 16:06:10.087669 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kwvst\" (UID: \"b5e0fd66-2bb6-4b8a-aaf0-e24490a2f1a0\") " pod="openshift-image-registry/image-registry-697d97f7c8-kwvst" Nov 27 16:06:10 crc kubenswrapper[4707]: E1127 16:06:10.088006 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-27 16:06:10.587995504 +0000 UTC m=+146.219444262 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kwvst" (UID: "b5e0fd66-2bb6-4b8a-aaf0-e24490a2f1a0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 27 16:06:10 crc kubenswrapper[4707]: I1127 16:06:10.156981 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-558db77b4-jxgm7" podStartSLOduration=125.156961737 podStartE2EDuration="2m5.156961737s" podCreationTimestamp="2025-11-27 16:04:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-27 16:06:10.126446767 +0000 UTC m=+145.757895535" watchObservedRunningTime="2025-11-27 16:06:10.156961737 +0000 UTC m=+145.788410505" Nov 27 16:06:10 crc kubenswrapper[4707]: I1127 16:06:10.195474 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 27 16:06:10 crc kubenswrapper[4707]: E1127 16:06:10.196269 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-27 16:06:10.696245051 +0000 UTC m=+146.327693819 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 27 16:06:10 crc kubenswrapper[4707]: I1127 16:06:10.197018 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kwvst\" (UID: \"b5e0fd66-2bb6-4b8a-aaf0-e24490a2f1a0\") " pod="openshift-image-registry/image-registry-697d97f7c8-kwvst" Nov 27 16:06:10 crc kubenswrapper[4707]: E1127 16:06:10.197451 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-27 16:06:10.697441811 +0000 UTC m=+146.328890579 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kwvst" (UID: "b5e0fd66-2bb6-4b8a-aaf0-e24490a2f1a0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 27 16:06:10 crc kubenswrapper[4707]: I1127 16:06:10.237510 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-rck5n" podStartSLOduration=125.237494365 podStartE2EDuration="2m5.237494365s" podCreationTimestamp="2025-11-27 16:04:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-27 16:06:10.236220442 +0000 UTC m=+145.867669210" watchObservedRunningTime="2025-11-27 16:06:10.237494365 +0000 UTC m=+145.868943133" Nov 27 16:06:10 crc kubenswrapper[4707]: I1127 16:06:10.241576 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-cj6jg" podStartSLOduration=125.241559279 podStartE2EDuration="2m5.241559279s" podCreationTimestamp="2025-11-27 16:04:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-27 16:06:10.207320664 +0000 UTC m=+145.838769432" watchObservedRunningTime="2025-11-27 16:06:10.241559279 +0000 UTC m=+145.873008057" Nov 27 16:06:10 crc kubenswrapper[4707]: I1127 16:06:10.298090 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 27 16:06:10 crc kubenswrapper[4707]: E1127 16:06:10.298673 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-27 16:06:10.798650607 +0000 UTC m=+146.430099375 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 27 16:06:10 crc kubenswrapper[4707]: I1127 16:06:10.399875 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kwvst\" (UID: \"b5e0fd66-2bb6-4b8a-aaf0-e24490a2f1a0\") " pod="openshift-image-registry/image-registry-697d97f7c8-kwvst" Nov 27 16:06:10 crc kubenswrapper[4707]: E1127 16:06:10.401550 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-27 16:06:10.901532547 +0000 UTC m=+146.532981305 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kwvst" (UID: "b5e0fd66-2bb6-4b8a-aaf0-e24490a2f1a0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 27 16:06:10 crc kubenswrapper[4707]: I1127 16:06:10.473409 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-879f6c89f-gg89j" podStartSLOduration=125.473346782 podStartE2EDuration="2m5.473346782s" podCreationTimestamp="2025-11-27 16:04:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-27 16:06:10.457282931 +0000 UTC m=+146.088731709" watchObservedRunningTime="2025-11-27 16:06:10.473346782 +0000 UTC m=+146.104795550" Nov 27 16:06:10 crc kubenswrapper[4707]: I1127 16:06:10.505977 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 27 16:06:10 crc kubenswrapper[4707]: E1127 16:06:10.506424 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-27 16:06:11.006405227 +0000 UTC m=+146.637853995 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 27 16:06:10 crc kubenswrapper[4707]: I1127 16:06:10.558447 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-nhqwm" podStartSLOduration=125.558428736 podStartE2EDuration="2m5.558428736s" podCreationTimestamp="2025-11-27 16:04:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-27 16:06:10.512080342 +0000 UTC m=+146.143529110" watchObservedRunningTime="2025-11-27 16:06:10.558428736 +0000 UTC m=+146.189877504" Nov 27 16:06:10 crc kubenswrapper[4707]: I1127 16:06:10.582850 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-7954f5f757-zdxsk" podStartSLOduration=125.58283572 podStartE2EDuration="2m5.58283572s" podCreationTimestamp="2025-11-27 16:04:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-27 16:06:10.58205334 +0000 UTC m=+146.213502098" watchObservedRunningTime="2025-11-27 16:06:10.58283572 +0000 UTC m=+146.214284478" Nov 27 16:06:10 crc kubenswrapper[4707]: I1127 16:06:10.584853 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-6m8wz" podStartSLOduration=125.584845451 podStartE2EDuration="2m5.584845451s" podCreationTimestamp="2025-11-27 16:04:05 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-27 16:06:10.562620593 +0000 UTC m=+146.194069371" watchObservedRunningTime="2025-11-27 16:06:10.584845451 +0000 UTC m=+146.216294219" Nov 27 16:06:10 crc kubenswrapper[4707]: I1127 16:06:10.607496 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kwvst\" (UID: \"b5e0fd66-2bb6-4b8a-aaf0-e24490a2f1a0\") " pod="openshift-image-registry/image-registry-697d97f7c8-kwvst" Nov 27 16:06:10 crc kubenswrapper[4707]: E1127 16:06:10.607977 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-27 16:06:11.107951171 +0000 UTC m=+146.739400060 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kwvst" (UID: "b5e0fd66-2bb6-4b8a-aaf0-e24490a2f1a0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 27 16:06:10 crc kubenswrapper[4707]: I1127 16:06:10.713755 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 27 16:06:10 crc kubenswrapper[4707]: E1127 16:06:10.714466 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-27 16:06:11.214447063 +0000 UTC m=+146.845895821 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 27 16:06:10 crc kubenswrapper[4707]: I1127 16:06:10.738390 4707 patch_prober.go:28] interesting pod/router-default-5444994796-tggrz container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Nov 27 16:06:10 crc kubenswrapper[4707]: [-]has-synced failed: reason withheld Nov 27 16:06:10 crc kubenswrapper[4707]: [+]process-running ok Nov 27 16:06:10 crc kubenswrapper[4707]: healthz check failed Nov 27 16:06:10 crc kubenswrapper[4707]: I1127 16:06:10.738509 4707 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-tggrz" podUID="b37b9b2e-05d2-434f-bd01-93cda5a05b52" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Nov 27 16:06:10 crc kubenswrapper[4707]: I1127 16:06:10.826156 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kwvst\" (UID: \"b5e0fd66-2bb6-4b8a-aaf0-e24490a2f1a0\") " pod="openshift-image-registry/image-registry-697d97f7c8-kwvst" Nov 27 16:06:10 crc kubenswrapper[4707]: E1127 16:06:10.826560 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2025-11-27 16:06:11.326542957 +0000 UTC m=+146.957991725 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kwvst" (UID: "b5e0fd66-2bb6-4b8a-aaf0-e24490a2f1a0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 27 16:06:10 crc kubenswrapper[4707]: I1127 16:06:10.927665 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-xbpq4" event={"ID":"eb6211ec-62fd-4a7a-b220-87ce050b996e","Type":"ContainerStarted","Data":"176f5035934118f11f77edf91aa67c489c1ccb45273237c49f0ea3b3e5528764"} Nov 27 16:06:10 crc kubenswrapper[4707]: I1127 16:06:10.928827 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 27 16:06:10 crc kubenswrapper[4707]: E1127 16:06:10.929325 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-27 16:06:11.429303253 +0000 UTC m=+147.060752021 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 27 16:06:10 crc kubenswrapper[4707]: I1127 16:06:10.948590 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-hg2x5" event={"ID":"b735e79c-0093-4887-b27a-6e333d9d80a5","Type":"ContainerStarted","Data":"343e24871b6c62354e149e485773cb2e7184501725e523d34f69eaccf2153923"} Nov 27 16:06:10 crc kubenswrapper[4707]: I1127 16:06:10.967787 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-777779d784-xbpq4" podStartSLOduration=124.967768346 podStartE2EDuration="2m4.967768346s" podCreationTimestamp="2025-11-27 16:04:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-27 16:06:10.965991761 +0000 UTC m=+146.597440529" watchObservedRunningTime="2025-11-27 16:06:10.967768346 +0000 UTC m=+146.599217114" Nov 27 16:06:10 crc kubenswrapper[4707]: I1127 16:06:10.968817 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-lrj4r" event={"ID":"eab64752-efb6-4d5d-89b2-43da02ae599f","Type":"ContainerStarted","Data":"bb932d0ccf0fefe4aac0129f76e5a01f2f1a37532ecb70d063894395058470fd"} Nov 27 16:06:10 crc kubenswrapper[4707]: I1127 16:06:10.985019 4707 generic.go:334] "Generic (PLEG): container finished" podID="9744e361-2c35-436c-a453-984cff9d923f" containerID="cfd4694aaaa6900a77761344853e2b7f072f6c7a2b89215738377aa777e9d36b" exitCode=0 Nov 27 
16:06:10 crc kubenswrapper[4707]: I1127 16:06:10.985438 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-khskd" event={"ID":"9744e361-2c35-436c-a453-984cff9d923f","Type":"ContainerDied","Data":"cfd4694aaaa6900a77761344853e2b7f072f6c7a2b89215738377aa777e9d36b"} Nov 27 16:06:11 crc kubenswrapper[4707]: I1127 16:06:11.022642 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-dmdnz" event={"ID":"ba92c9fa-a7ba-4ae5-87be-4d0ded13bba1","Type":"ContainerStarted","Data":"cdaa91ee7cfede21c46798188706d9d42d1fe2ba8ec1b7603a829bb5cb784d5e"} Nov 27 16:06:11 crc kubenswrapper[4707]: I1127 16:06:11.032392 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kwvst\" (UID: \"b5e0fd66-2bb6-4b8a-aaf0-e24490a2f1a0\") " pod="openshift-image-registry/image-registry-697d97f7c8-kwvst" Nov 27 16:06:11 crc kubenswrapper[4707]: E1127 16:06:11.034141 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-27 16:06:11.534124612 +0000 UTC m=+147.165573380 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kwvst" (UID: "b5e0fd66-2bb6-4b8a-aaf0-e24490a2f1a0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 27 16:06:11 crc kubenswrapper[4707]: I1127 16:06:11.036094 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-khh5f" event={"ID":"9c44f508-25ec-493c-ba35-8c6d4f0cf7ae","Type":"ContainerStarted","Data":"ea9498830279376cce7e4a09af17888faf06fffefe0ebeb29c67188237d4107a"} Nov 27 16:06:11 crc kubenswrapper[4707]: I1127 16:06:11.054895 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-dlchm" event={"ID":"e084e8d6-5275-426c-9be4-c5f4ee49abef","Type":"ContainerStarted","Data":"767f90fcc44d819ff96906c179dbf315a883bf176b3fba7ca5e514ae27f93219"} Nov 27 16:06:11 crc kubenswrapper[4707]: I1127 16:06:11.058591 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-nhz7q" event={"ID":"5e7a06d3-2413-4179-83b7-db23583f1c6d","Type":"ContainerStarted","Data":"a143be15008658bba7c98baf0691267cb61db00c67f11f42410b4773372a5b56"} Nov 27 16:06:11 crc kubenswrapper[4707]: I1127 16:06:11.068212 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-hg2x5" podStartSLOduration=126.068190493 podStartE2EDuration="2m6.068190493s" podCreationTimestamp="2025-11-27 16:04:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-27 16:06:11.015037394 +0000 UTC 
m=+146.646486162" watchObservedRunningTime="2025-11-27 16:06:11.068190493 +0000 UTC m=+146.699639261" Nov 27 16:06:11 crc kubenswrapper[4707]: I1127 16:06:11.113784 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-dlchm" podStartSLOduration=126.113767647 podStartE2EDuration="2m6.113767647s" podCreationTimestamp="2025-11-27 16:04:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-27 16:06:11.108224256 +0000 UTC m=+146.739673024" watchObservedRunningTime="2025-11-27 16:06:11.113767647 +0000 UTC m=+146.745216415" Nov 27 16:06:11 crc kubenswrapper[4707]: I1127 16:06:11.126767 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-jtqhs" event={"ID":"d9ba519a-a9a8-4d1a-96c1-e66ee87d5c8f","Type":"ContainerStarted","Data":"3cd8b5f527947be370d0566254e890545d846bef58ddad0e5936ea65f25417bc"} Nov 27 16:06:11 crc kubenswrapper[4707]: I1127 16:06:11.126811 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-jtqhs" event={"ID":"d9ba519a-a9a8-4d1a-96c1-e66ee87d5c8f","Type":"ContainerStarted","Data":"b812b66ced3f98fc1ee09340abd229271a2068f3a2dd9a8821ceac0dbbbeb19c"} Nov 27 16:06:11 crc kubenswrapper[4707]: I1127 16:06:11.127815 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-jtqhs" Nov 27 16:06:11 crc kubenswrapper[4707]: I1127 16:06:11.133698 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 27 16:06:11 crc 
kubenswrapper[4707]: E1127 16:06:11.134733 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-27 16:06:11.634716783 +0000 UTC m=+147.266165551 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 27 16:06:11 crc kubenswrapper[4707]: I1127 16:06:11.165583 4707 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-jtqhs container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.38:8080/healthz\": dial tcp 10.217.0.38:8080: connect: connection refused" start-of-body= Nov 27 16:06:11 crc kubenswrapper[4707]: I1127 16:06:11.166157 4707 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-jtqhs" podUID="d9ba519a-a9a8-4d1a-96c1-e66ee87d5c8f" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.38:8080/healthz\": dial tcp 10.217.0.38:8080: connect: connection refused" Nov 27 16:06:11 crc kubenswrapper[4707]: I1127 16:06:11.236556 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kwvst\" (UID: \"b5e0fd66-2bb6-4b8a-aaf0-e24490a2f1a0\") " 
pod="openshift-image-registry/image-registry-697d97f7c8-kwvst" Nov 27 16:06:11 crc kubenswrapper[4707]: E1127 16:06:11.244475 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-27 16:06:11.744452047 +0000 UTC m=+147.375900815 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kwvst" (UID: "b5e0fd66-2bb6-4b8a-aaf0-e24490a2f1a0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 27 16:06:11 crc kubenswrapper[4707]: I1127 16:06:11.253734 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd-operator/etcd-operator-b45778765-nhz7q" podStartSLOduration=126.253709494 podStartE2EDuration="2m6.253709494s" podCreationTimestamp="2025-11-27 16:04:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-27 16:06:11.176194023 +0000 UTC m=+146.807642791" watchObservedRunningTime="2025-11-27 16:06:11.253709494 +0000 UTC m=+146.885158262" Nov 27 16:06:11 crc kubenswrapper[4707]: I1127 16:06:11.256519 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-dmdnz" podStartSLOduration=126.256502015 podStartE2EDuration="2m6.256502015s" podCreationTimestamp="2025-11-27 16:04:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-27 16:06:11.237362986 +0000 UTC 
m=+146.868811754" watchObservedRunningTime="2025-11-27 16:06:11.256502015 +0000 UTC m=+146.887950783" Nov 27 16:06:11 crc kubenswrapper[4707]: I1127 16:06:11.276067 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-p28h9" Nov 27 16:06:11 crc kubenswrapper[4707]: I1127 16:06:11.276142 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-4sj7l" event={"ID":"ecc19675-2c51-4a55-b870-7906093e3de2","Type":"ContainerStarted","Data":"9e5af58c20292492f3ad9d68c8fbbd085137092455b146bcb7c01dcf4ec19606"} Nov 27 16:06:11 crc kubenswrapper[4707]: I1127 16:06:11.276186 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-p28h9" event={"ID":"bae1e9aa-742e-4e00-ba4a-2edad6d45e95","Type":"ContainerStarted","Data":"380a4499f23b4019b65fb53202ea818e3cfbc308ff726d2933e7ec8554dd5f38"} Nov 27 16:06:11 crc kubenswrapper[4707]: I1127 16:06:11.276763 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-pfqjd" event={"ID":"4bd96a64-1d3d-464d-a386-26a39642ee24","Type":"ContainerStarted","Data":"9ed259ba3a8f5519e24fbf0b24a0146a0b3ea597e420694d22a0c7f10780e44c"} Nov 27 16:06:11 crc kubenswrapper[4707]: I1127 16:06:11.306491 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-72lr4" event={"ID":"5450b16d-7dca-4ba0-8184-bb0b4a19dc71","Type":"ContainerStarted","Data":"10c3e9bece3a7179a715cb44d14b605564c1b65d7872a22ff67ed4fca462bb3c"} Nov 27 16:06:11 crc kubenswrapper[4707]: I1127 16:06:11.309992 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-h746w" 
event={"ID":"e8027394-2524-45df-8cdc-967024215d25","Type":"ContainerStarted","Data":"8b7f8695b5bb015a3fdf8551c306086f12fc8b41456352db93468c57de88e380"} Nov 27 16:06:11 crc kubenswrapper[4707]: I1127 16:06:11.325643 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-khh5f" podStartSLOduration=126.325623161 podStartE2EDuration="2m6.325623161s" podCreationTimestamp="2025-11-27 16:04:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-27 16:06:11.324737559 +0000 UTC m=+146.956186327" watchObservedRunningTime="2025-11-27 16:06:11.325623161 +0000 UTC m=+146.957071919" Nov 27 16:06:11 crc kubenswrapper[4707]: I1127 16:06:11.334024 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-85rvh" event={"ID":"4fb5c374-18c5-433f-bb2c-f03d8eea8de6","Type":"ContainerStarted","Data":"3a3a0160a763a48897384ba4fbb6976cec763f7220721979fb5f821e1ff0c37a"} Nov 27 16:06:11 crc kubenswrapper[4707]: I1127 16:06:11.343000 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 27 16:06:11 crc kubenswrapper[4707]: E1127 16:06:11.343272 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-27 16:06:11.843224071 +0000 UTC m=+147.474672829 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 27 16:06:11 crc kubenswrapper[4707]: I1127 16:06:11.343695 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kwvst\" (UID: \"b5e0fd66-2bb6-4b8a-aaf0-e24490a2f1a0\") " pod="openshift-image-registry/image-registry-697d97f7c8-kwvst" Nov 27 16:06:11 crc kubenswrapper[4707]: E1127 16:06:11.344226 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-27 16:06:11.844218947 +0000 UTC m=+147.475667715 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kwvst" (UID: "b5e0fd66-2bb6-4b8a-aaf0-e24490a2f1a0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 27 16:06:11 crc kubenswrapper[4707]: I1127 16:06:11.345582 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-49hfs" event={"ID":"2d3126ab-3db9-4fa7-95ef-673a79b2178a","Type":"ContainerStarted","Data":"51aec28f3be3203cd436ebb307a3d6fb1ea59ba8cf6eed1ce4542cf6e0a4631e"} Nov 27 16:06:11 crc kubenswrapper[4707]: I1127 16:06:11.345653 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-49hfs" event={"ID":"2d3126ab-3db9-4fa7-95ef-673a79b2178a","Type":"ContainerStarted","Data":"17f711e1230e8914cbc86f84808d7c74c7c6420dbcb9832f841d83eeca29eb1c"} Nov 27 16:06:11 crc kubenswrapper[4707]: I1127 16:06:11.346175 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-49hfs" Nov 27 16:06:11 crc kubenswrapper[4707]: I1127 16:06:11.361206 4707 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-49hfs container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.35:8443/healthz\": dial tcp 10.217.0.35:8443: connect: connection refused" start-of-body= Nov 27 16:06:11 crc kubenswrapper[4707]: I1127 16:06:11.361314 4707 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-49hfs" podUID="2d3126ab-3db9-4fa7-95ef-673a79b2178a" containerName="olm-operator" probeResult="failure" 
output="Get \"https://10.217.0.35:8443/healthz\": dial tcp 10.217.0.35:8443: connect: connection refused" Nov 27 16:06:11 crc kubenswrapper[4707]: I1127 16:06:11.405683 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-5r8tf" event={"ID":"36bf60c9-93cb-431f-9df1-1d3e245c49ef","Type":"ContainerStarted","Data":"383edc5f3edba7d08269ed1ee8aeeacb7a71662692e94293f1ddaf4723a723d3"} Nov 27 16:06:11 crc kubenswrapper[4707]: I1127 16:06:11.437951 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-d8nfn" event={"ID":"ce9dffbd-e000-476b-b89b-6208d0506f26","Type":"ContainerStarted","Data":"83e88904b9981c5c0a4ff1f1c7a0cdf071ddce7f9866f92b561fdcd2fc5301fd"} Nov 27 16:06:11 crc kubenswrapper[4707]: I1127 16:06:11.462886 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 27 16:06:11 crc kubenswrapper[4707]: E1127 16:06:11.464276 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-27 16:06:11.964246144 +0000 UTC m=+147.595694912 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 27 16:06:11 crc kubenswrapper[4707]: I1127 16:06:11.517174 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-6t7cw" event={"ID":"5f4261e0-d038-4c38-8a9f-27c22bb95d1c","Type":"ContainerStarted","Data":"fea9f973c6405c3135c1120acf3e649a396b108685e2323ec33ba7a49dd79654"} Nov 27 16:06:11 crc kubenswrapper[4707]: I1127 16:06:11.551627 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-h4m82" event={"ID":"efcebfb2-d130-4f47-a514-01d1bf5eb567","Type":"ContainerStarted","Data":"b324bd061fc7010c1673378360bc7dc25200fd2c53c9b2cd941e457f36d96f41"} Nov 27 16:06:11 crc kubenswrapper[4707]: I1127 16:06:11.551679 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-h4m82" event={"ID":"efcebfb2-d130-4f47-a514-01d1bf5eb567","Type":"ContainerStarted","Data":"f97eff0eaa8559a42a9b3352ffaf0c834353e68998796a01933b9d7b995a119b"} Nov 27 16:06:11 crc kubenswrapper[4707]: I1127 16:06:11.552694 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-h4m82" Nov 27 16:06:11 crc kubenswrapper[4707]: I1127 16:06:11.569690 4707 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-h4m82 container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.36:5443/healthz\": dial tcp 
10.217.0.36:5443: connect: connection refused" start-of-body= Nov 27 16:06:11 crc kubenswrapper[4707]: I1127 16:06:11.569747 4707 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-h4m82" podUID="efcebfb2-d130-4f47-a514-01d1bf5eb567" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.36:5443/healthz\": dial tcp 10.217.0.36:5443: connect: connection refused" Nov 27 16:06:11 crc kubenswrapper[4707]: I1127 16:06:11.571033 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kwvst\" (UID: \"b5e0fd66-2bb6-4b8a-aaf0-e24490a2f1a0\") " pod="openshift-image-registry/image-registry-697d97f7c8-kwvst" Nov 27 16:06:11 crc kubenswrapper[4707]: E1127 16:06:11.571328 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-27 16:06:12.071317769 +0000 UTC m=+147.702766537 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kwvst" (UID: "b5e0fd66-2bb6-4b8a-aaf0-e24490a2f1a0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 27 16:06:11 crc kubenswrapper[4707]: I1127 16:06:11.590070 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-4sj7l" podStartSLOduration=126.590054328 podStartE2EDuration="2m6.590054328s" podCreationTimestamp="2025-11-27 16:04:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-27 16:06:11.422969129 +0000 UTC m=+147.054417897" watchObservedRunningTime="2025-11-27 16:06:11.590054328 +0000 UTC m=+147.221503096" Nov 27 16:06:11 crc kubenswrapper[4707]: I1127 16:06:11.592831 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29404320-sxj55" event={"ID":"f55dc6de-bb5d-4221-a670-4b65c3992031","Type":"ContainerStarted","Data":"1fd6d1666a2a28e854197a69e8ff0663d5e84dfc1816406495ad454e11a77296"} Nov 27 16:06:11 crc kubenswrapper[4707]: I1127 16:06:11.594934 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-zgfxc" event={"ID":"33d637be-496f-43c5-bb76-b742f8cc97ac","Type":"ContainerStarted","Data":"f3a01ff71a5d6341a6beb283f55b36f1bc4645b2e4d391d3d7ae20378dabf3f1"} Nov 27 16:06:11 crc kubenswrapper[4707]: I1127 16:06:11.621628 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-74mzf" 
event={"ID":"789119b0-3180-49e9-8d16-c60f968bf6cf","Type":"ContainerStarted","Data":"da979fb0efa10d408837c3fd2b811f98eade50b5f9f8cbcd7743f94e4b3a1677"} Nov 27 16:06:11 crc kubenswrapper[4707]: I1127 16:06:11.630276 4707 patch_prober.go:28] interesting pod/downloads-7954f5f757-zdxsk container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.7:8080/\": dial tcp 10.217.0.7:8080: connect: connection refused" start-of-body= Nov 27 16:06:11 crc kubenswrapper[4707]: I1127 16:06:11.630649 4707 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-zdxsk" podUID="50e5ce2e-3776-4890-8b73-b0ae2b2d0237" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.7:8080/\": dial tcp 10.217.0.7:8080: connect: connection refused" Nov 27 16:06:11 crc kubenswrapper[4707]: I1127 16:06:11.649258 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-558db77b4-jxgm7" Nov 27 16:06:11 crc kubenswrapper[4707]: I1127 16:06:11.657265 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-h746w" podStartSLOduration=126.657249775 podStartE2EDuration="2m6.657249775s" podCreationTimestamp="2025-11-27 16:04:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-27 16:06:11.656069065 +0000 UTC m=+147.287517833" watchObservedRunningTime="2025-11-27 16:06:11.657249775 +0000 UTC m=+147.288698543" Nov 27 16:06:11 crc kubenswrapper[4707]: I1127 16:06:11.658352 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-jtqhs" podStartSLOduration=126.658347013 podStartE2EDuration="2m6.658347013s" podCreationTimestamp="2025-11-27 16:04:05 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-27 16:06:11.591635898 +0000 UTC m=+147.223084666" watchObservedRunningTime="2025-11-27 16:06:11.658347013 +0000 UTC m=+147.289795781" Nov 27 16:06:11 crc kubenswrapper[4707]: I1127 16:06:11.669792 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-6m8wz" Nov 27 16:06:11 crc kubenswrapper[4707]: I1127 16:06:11.672157 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 27 16:06:11 crc kubenswrapper[4707]: E1127 16:06:11.674682 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-27 16:06:12.17465954 +0000 UTC m=+147.806108308 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 27 16:06:11 crc kubenswrapper[4707]: I1127 16:06:11.750552 4707 patch_prober.go:28] interesting pod/router-default-5444994796-tggrz container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Nov 27 16:06:11 crc kubenswrapper[4707]: [-]has-synced failed: reason withheld Nov 27 16:06:11 crc kubenswrapper[4707]: [+]process-running ok Nov 27 16:06:11 crc kubenswrapper[4707]: healthz check failed Nov 27 16:06:11 crc kubenswrapper[4707]: I1127 16:06:11.750601 4707 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-tggrz" podUID="b37b9b2e-05d2-434f-bd01-93cda5a05b52" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Nov 27 16:06:11 crc kubenswrapper[4707]: I1127 16:06:11.786082 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kwvst\" (UID: \"b5e0fd66-2bb6-4b8a-aaf0-e24490a2f1a0\") " pod="openshift-image-registry/image-registry-697d97f7c8-kwvst" Nov 27 16:06:11 crc kubenswrapper[4707]: E1127 16:06:11.786346 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2025-11-27 16:06:12.286332683 +0000 UTC m=+147.917781451 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kwvst" (UID: "b5e0fd66-2bb6-4b8a-aaf0-e24490a2f1a0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 27 16:06:11 crc kubenswrapper[4707]: I1127 16:06:11.790888 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-xkmt7" Nov 27 16:06:11 crc kubenswrapper[4707]: I1127 16:06:11.826320 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-p28h9" podStartSLOduration=126.826301055 podStartE2EDuration="2m6.826301055s" podCreationTimestamp="2025-11-27 16:04:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-27 16:06:11.75567769 +0000 UTC m=+147.387126458" watchObservedRunningTime="2025-11-27 16:06:11.826301055 +0000 UTC m=+147.457749823" Nov 27 16:06:11 crc kubenswrapper[4707]: I1127 16:06:11.860189 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-pfqjd" podStartSLOduration=126.8601693 podStartE2EDuration="2m6.8601693s" podCreationTimestamp="2025-11-27 16:04:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-27 16:06:11.829407034 +0000 UTC m=+147.460855802" watchObservedRunningTime="2025-11-27 16:06:11.8601693 +0000 UTC m=+147.491618068" Nov 27 16:06:11 crc kubenswrapper[4707]: I1127 
16:06:11.862420 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-server-85rvh" podStartSLOduration=7.862413848 podStartE2EDuration="7.862413848s" podCreationTimestamp="2025-11-27 16:06:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-27 16:06:11.861749121 +0000 UTC m=+147.493197889" watchObservedRunningTime="2025-11-27 16:06:11.862413848 +0000 UTC m=+147.493862616" Nov 27 16:06:11 crc kubenswrapper[4707]: I1127 16:06:11.891673 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 27 16:06:11 crc kubenswrapper[4707]: E1127 16:06:11.892811 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-27 16:06:12.392780664 +0000 UTC m=+148.024229432 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 27 16:06:11 crc kubenswrapper[4707]: I1127 16:06:11.899303 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-74mzf" podStartSLOduration=125.89927336 podStartE2EDuration="2m5.89927336s" podCreationTimestamp="2025-11-27 16:04:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-27 16:06:11.893631615 +0000 UTC m=+147.525080373" watchObservedRunningTime="2025-11-27 16:06:11.89927336 +0000 UTC m=+147.530722128" Nov 27 16:06:11 crc kubenswrapper[4707]: I1127 16:06:11.955644 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-hg2x5" Nov 27 16:06:11 crc kubenswrapper[4707]: I1127 16:06:11.955717 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-hg2x5" Nov 27 16:06:11 crc kubenswrapper[4707]: I1127 16:06:11.995937 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kwvst\" (UID: \"b5e0fd66-2bb6-4b8a-aaf0-e24490a2f1a0\") " pod="openshift-image-registry/image-registry-697d97f7c8-kwvst" Nov 27 16:06:11 crc kubenswrapper[4707]: E1127 16:06:11.996911 4707 nestedpendingoperations.go:348] 
Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-27 16:06:12.496891494 +0000 UTC m=+148.128340262 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kwvst" (UID: "b5e0fd66-2bb6-4b8a-aaf0-e24490a2f1a0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 27 16:06:12 crc kubenswrapper[4707]: I1127 16:06:12.041351 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-zgfxc" podStartSLOduration=7.04132905 podStartE2EDuration="7.04132905s" podCreationTimestamp="2025-11-27 16:06:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-27 16:06:12.033864759 +0000 UTC m=+147.665313527" watchObservedRunningTime="2025-11-27 16:06:12.04132905 +0000 UTC m=+147.672777818" Nov 27 16:06:12 crc kubenswrapper[4707]: I1127 16:06:12.074705 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-6t7cw" podStartSLOduration=127.074688172 podStartE2EDuration="2m7.074688172s" podCreationTimestamp="2025-11-27 16:04:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-27 16:06:12.07265842 +0000 UTC m=+147.704107188" watchObservedRunningTime="2025-11-27 16:06:12.074688172 +0000 UTC m=+147.706136940" Nov 27 16:06:12 crc kubenswrapper[4707]: I1127 16:06:12.098485 4707 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 27 16:06:12 crc kubenswrapper[4707]: E1127 16:06:12.099043 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-27 16:06:12.599006414 +0000 UTC m=+148.230455172 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 27 16:06:12 crc kubenswrapper[4707]: I1127 16:06:12.125832 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29404320-sxj55" podStartSLOduration=127.125803868 podStartE2EDuration="2m7.125803868s" podCreationTimestamp="2025-11-27 16:04:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-27 16:06:12.112781996 +0000 UTC m=+147.744230754" watchObservedRunningTime="2025-11-27 16:06:12.125803868 +0000 UTC m=+147.757252636" Nov 27 16:06:12 crc kubenswrapper[4707]: I1127 16:06:12.202703 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kwvst\" (UID: \"b5e0fd66-2bb6-4b8a-aaf0-e24490a2f1a0\") " pod="openshift-image-registry/image-registry-697d97f7c8-kwvst" Nov 27 16:06:12 crc kubenswrapper[4707]: E1127 16:06:12.203093 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-27 16:06:12.703078663 +0000 UTC m=+148.334527431 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kwvst" (UID: "b5e0fd66-2bb6-4b8a-aaf0-e24490a2f1a0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 27 16:06:12 crc kubenswrapper[4707]: I1127 16:06:12.220017 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-49hfs" podStartSLOduration=127.219992195 podStartE2EDuration="2m7.219992195s" podCreationTimestamp="2025-11-27 16:04:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-27 16:06:12.163896432 +0000 UTC m=+147.795345200" watchObservedRunningTime="2025-11-27 16:06:12.219992195 +0000 UTC m=+147.851440963" Nov 27 16:06:12 crc kubenswrapper[4707]: I1127 16:06:12.300094 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-9c57cc56f-d8nfn" podStartSLOduration=126.300069932 podStartE2EDuration="2m6.300069932s" podCreationTimestamp="2025-11-27 16:04:06 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-27 16:06:12.293286678 +0000 UTC m=+147.924735446" watchObservedRunningTime="2025-11-27 16:06:12.300069932 +0000 UTC m=+147.931518720" Nov 27 16:06:12 crc kubenswrapper[4707]: I1127 16:06:12.306101 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 27 16:06:12 crc kubenswrapper[4707]: E1127 16:06:12.306536 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-27 16:06:12.806515746 +0000 UTC m=+148.437964514 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 27 16:06:12 crc kubenswrapper[4707]: I1127 16:06:12.306608 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kwvst\" (UID: \"b5e0fd66-2bb6-4b8a-aaf0-e24490a2f1a0\") " pod="openshift-image-registry/image-registry-697d97f7c8-kwvst" Nov 27 16:06:12 crc kubenswrapper[4707]: E1127 16:06:12.306969 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-27 16:06:12.806962718 +0000 UTC m=+148.438411486 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kwvst" (UID: "b5e0fd66-2bb6-4b8a-aaf0-e24490a2f1a0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 27 16:06:12 crc kubenswrapper[4707]: I1127 16:06:12.310510 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-dnk6z"] Nov 27 16:06:12 crc kubenswrapper[4707]: I1127 16:06:12.311699 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-dnk6z" Nov 27 16:06:12 crc kubenswrapper[4707]: I1127 16:06:12.320881 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-dnk6z"] Nov 27 16:06:12 crc kubenswrapper[4707]: I1127 16:06:12.321709 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Nov 27 16:06:12 crc kubenswrapper[4707]: I1127 16:06:12.345914 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-f9d7485db-5r8tf" podStartSLOduration=127.345872712 podStartE2EDuration="2m7.345872712s" podCreationTimestamp="2025-11-27 16:04:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-27 16:06:12.335464406 +0000 UTC m=+147.966913164" watchObservedRunningTime="2025-11-27 16:06:12.345872712 +0000 UTC m=+147.977321480" Nov 27 16:06:12 crc kubenswrapper[4707]: I1127 16:06:12.374155 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-h4m82" 
podStartSLOduration=127.374130444 podStartE2EDuration="2m7.374130444s" podCreationTimestamp="2025-11-27 16:04:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-27 16:06:12.372738939 +0000 UTC m=+148.004187707" watchObservedRunningTime="2025-11-27 16:06:12.374130444 +0000 UTC m=+148.005579212" Nov 27 16:06:12 crc kubenswrapper[4707]: I1127 16:06:12.409930 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 27 16:06:12 crc kubenswrapper[4707]: I1127 16:06:12.410155 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5vm85\" (UniqueName: \"kubernetes.io/projected/bafb40a6-a701-4082-a791-65fce73e8669-kube-api-access-5vm85\") pod \"community-operators-dnk6z\" (UID: \"bafb40a6-a701-4082-a791-65fce73e8669\") " pod="openshift-marketplace/community-operators-dnk6z" Nov 27 16:06:12 crc kubenswrapper[4707]: I1127 16:06:12.410180 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bafb40a6-a701-4082-a791-65fce73e8669-catalog-content\") pod \"community-operators-dnk6z\" (UID: \"bafb40a6-a701-4082-a791-65fce73e8669\") " pod="openshift-marketplace/community-operators-dnk6z" Nov 27 16:06:12 crc kubenswrapper[4707]: I1127 16:06:12.410243 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bafb40a6-a701-4082-a791-65fce73e8669-utilities\") pod \"community-operators-dnk6z\" (UID: \"bafb40a6-a701-4082-a791-65fce73e8669\") " 
pod="openshift-marketplace/community-operators-dnk6z" Nov 27 16:06:12 crc kubenswrapper[4707]: E1127 16:06:12.410415 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-27 16:06:12.910394011 +0000 UTC m=+148.541842779 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 27 16:06:12 crc kubenswrapper[4707]: I1127 16:06:12.440751 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-d97n2"] Nov 27 16:06:12 crc kubenswrapper[4707]: I1127 16:06:12.441883 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-d97n2" Nov 27 16:06:12 crc kubenswrapper[4707]: I1127 16:06:12.446569 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Nov 27 16:06:12 crc kubenswrapper[4707]: I1127 16:06:12.472155 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-d97n2"] Nov 27 16:06:12 crc kubenswrapper[4707]: I1127 16:06:12.516319 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5vm85\" (UniqueName: \"kubernetes.io/projected/bafb40a6-a701-4082-a791-65fce73e8669-kube-api-access-5vm85\") pod \"community-operators-dnk6z\" (UID: \"bafb40a6-a701-4082-a791-65fce73e8669\") " pod="openshift-marketplace/community-operators-dnk6z" Nov 27 16:06:12 crc kubenswrapper[4707]: I1127 16:06:12.516366 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bafb40a6-a701-4082-a791-65fce73e8669-catalog-content\") pod \"community-operators-dnk6z\" (UID: \"bafb40a6-a701-4082-a791-65fce73e8669\") " pod="openshift-marketplace/community-operators-dnk6z" Nov 27 16:06:12 crc kubenswrapper[4707]: I1127 16:06:12.516458 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f0d7e53b-3b6f-469d-b550-e61dbea7724f-utilities\") pod \"certified-operators-d97n2\" (UID: \"f0d7e53b-3b6f-469d-b550-e61dbea7724f\") " pod="openshift-marketplace/certified-operators-d97n2" Nov 27 16:06:12 crc kubenswrapper[4707]: I1127 16:06:12.516484 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f0d7e53b-3b6f-469d-b550-e61dbea7724f-catalog-content\") pod \"certified-operators-d97n2\" (UID: 
\"f0d7e53b-3b6f-469d-b550-e61dbea7724f\") " pod="openshift-marketplace/certified-operators-d97n2" Nov 27 16:06:12 crc kubenswrapper[4707]: I1127 16:06:12.516507 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bafb40a6-a701-4082-a791-65fce73e8669-utilities\") pod \"community-operators-dnk6z\" (UID: \"bafb40a6-a701-4082-a791-65fce73e8669\") " pod="openshift-marketplace/community-operators-dnk6z" Nov 27 16:06:12 crc kubenswrapper[4707]: I1127 16:06:12.516536 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kwvst\" (UID: \"b5e0fd66-2bb6-4b8a-aaf0-e24490a2f1a0\") " pod="openshift-image-registry/image-registry-697d97f7c8-kwvst" Nov 27 16:06:12 crc kubenswrapper[4707]: I1127 16:06:12.516563 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wx5bp\" (UniqueName: \"kubernetes.io/projected/f0d7e53b-3b6f-469d-b550-e61dbea7724f-kube-api-access-wx5bp\") pod \"certified-operators-d97n2\" (UID: \"f0d7e53b-3b6f-469d-b550-e61dbea7724f\") " pod="openshift-marketplace/certified-operators-d97n2" Nov 27 16:06:12 crc kubenswrapper[4707]: I1127 16:06:12.517238 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bafb40a6-a701-4082-a791-65fce73e8669-catalog-content\") pod \"community-operators-dnk6z\" (UID: \"bafb40a6-a701-4082-a791-65fce73e8669\") " pod="openshift-marketplace/community-operators-dnk6z" Nov 27 16:06:12 crc kubenswrapper[4707]: I1127 16:06:12.517496 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bafb40a6-a701-4082-a791-65fce73e8669-utilities\") pod 
\"community-operators-dnk6z\" (UID: \"bafb40a6-a701-4082-a791-65fce73e8669\") " pod="openshift-marketplace/community-operators-dnk6z" Nov 27 16:06:12 crc kubenswrapper[4707]: E1127 16:06:12.517723 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-27 16:06:13.017712943 +0000 UTC m=+148.649161711 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kwvst" (UID: "b5e0fd66-2bb6-4b8a-aaf0-e24490a2f1a0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 27 16:06:12 crc kubenswrapper[4707]: I1127 16:06:12.553401 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5vm85\" (UniqueName: \"kubernetes.io/projected/bafb40a6-a701-4082-a791-65fce73e8669-kube-api-access-5vm85\") pod \"community-operators-dnk6z\" (UID: \"bafb40a6-a701-4082-a791-65fce73e8669\") " pod="openshift-marketplace/community-operators-dnk6z" Nov 27 16:06:12 crc kubenswrapper[4707]: I1127 16:06:12.618042 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 27 16:06:12 crc kubenswrapper[4707]: E1127 16:06:12.618425 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 
podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-27 16:06:13.118390716 +0000 UTC m=+148.749839474 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 27 16:06:12 crc kubenswrapper[4707]: I1127 16:06:12.618644 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f0d7e53b-3b6f-469d-b550-e61dbea7724f-catalog-content\") pod \"certified-operators-d97n2\" (UID: \"f0d7e53b-3b6f-469d-b550-e61dbea7724f\") " pod="openshift-marketplace/certified-operators-d97n2" Nov 27 16:06:12 crc kubenswrapper[4707]: I1127 16:06:12.618722 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kwvst\" (UID: \"b5e0fd66-2bb6-4b8a-aaf0-e24490a2f1a0\") " pod="openshift-image-registry/image-registry-697d97f7c8-kwvst" Nov 27 16:06:12 crc kubenswrapper[4707]: I1127 16:06:12.618776 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wx5bp\" (UniqueName: \"kubernetes.io/projected/f0d7e53b-3b6f-469d-b550-e61dbea7724f-kube-api-access-wx5bp\") pod \"certified-operators-d97n2\" (UID: \"f0d7e53b-3b6f-469d-b550-e61dbea7724f\") " pod="openshift-marketplace/certified-operators-d97n2" Nov 27 16:06:12 crc kubenswrapper[4707]: I1127 16:06:12.618899 4707 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f0d7e53b-3b6f-469d-b550-e61dbea7724f-utilities\") pod \"certified-operators-d97n2\" (UID: \"f0d7e53b-3b6f-469d-b550-e61dbea7724f\") " pod="openshift-marketplace/certified-operators-d97n2" Nov 27 16:06:12 crc kubenswrapper[4707]: E1127 16:06:12.619406 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-27 16:06:13.119397112 +0000 UTC m=+148.750845880 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kwvst" (UID: "b5e0fd66-2bb6-4b8a-aaf0-e24490a2f1a0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 27 16:06:12 crc kubenswrapper[4707]: I1127 16:06:12.620169 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f0d7e53b-3b6f-469d-b550-e61dbea7724f-catalog-content\") pod \"certified-operators-d97n2\" (UID: \"f0d7e53b-3b6f-469d-b550-e61dbea7724f\") " pod="openshift-marketplace/certified-operators-d97n2" Nov 27 16:06:12 crc kubenswrapper[4707]: I1127 16:06:12.620458 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f0d7e53b-3b6f-469d-b550-e61dbea7724f-utilities\") pod \"certified-operators-d97n2\" (UID: \"f0d7e53b-3b6f-469d-b550-e61dbea7724f\") " pod="openshift-marketplace/certified-operators-d97n2" Nov 27 16:06:12 crc kubenswrapper[4707]: I1127 16:06:12.633859 4707 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-marketplace/community-operators-nvkng"] Nov 27 16:06:12 crc kubenswrapper[4707]: I1127 16:06:12.634919 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-nvkng" Nov 27 16:06:12 crc kubenswrapper[4707]: I1127 16:06:12.648525 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-dnk6z" Nov 27 16:06:12 crc kubenswrapper[4707]: I1127 16:06:12.680434 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wx5bp\" (UniqueName: \"kubernetes.io/projected/f0d7e53b-3b6f-469d-b550-e61dbea7724f-kube-api-access-wx5bp\") pod \"certified-operators-d97n2\" (UID: \"f0d7e53b-3b6f-469d-b550-e61dbea7724f\") " pod="openshift-marketplace/certified-operators-d97n2" Nov 27 16:06:12 crc kubenswrapper[4707]: I1127 16:06:12.689578 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-hg2x5" Nov 27 16:06:12 crc kubenswrapper[4707]: I1127 16:06:12.720126 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 27 16:06:12 crc kubenswrapper[4707]: I1127 16:06:12.720354 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2832c567-f82e-487a-8e20-f5d5b10168f3-catalog-content\") pod \"community-operators-nvkng\" (UID: \"2832c567-f82e-487a-8e20-f5d5b10168f3\") " pod="openshift-marketplace/community-operators-nvkng" Nov 27 16:06:12 crc kubenswrapper[4707]: I1127 16:06:12.720413 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2832c567-f82e-487a-8e20-f5d5b10168f3-utilities\") pod \"community-operators-nvkng\" (UID: \"2832c567-f82e-487a-8e20-f5d5b10168f3\") " pod="openshift-marketplace/community-operators-nvkng" Nov 27 16:06:12 crc kubenswrapper[4707]: I1127 16:06:12.720443 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l85hg\" (UniqueName: \"kubernetes.io/projected/2832c567-f82e-487a-8e20-f5d5b10168f3-kube-api-access-l85hg\") pod \"community-operators-nvkng\" (UID: \"2832c567-f82e-487a-8e20-f5d5b10168f3\") " pod="openshift-marketplace/community-operators-nvkng" Nov 27 16:06:12 crc kubenswrapper[4707]: E1127 16:06:12.720509 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-27 16:06:13.220482645 +0000 UTC m=+148.851931413 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 27 16:06:12 crc kubenswrapper[4707]: I1127 16:06:12.721275 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-6t7cw" event={"ID":"5f4261e0-d038-4c38-8a9f-27c22bb95d1c","Type":"ContainerStarted","Data":"1e1393db020e9586390e09f238dfee23e92210c2237635c915c1a7bec049c4d3"} Nov 27 16:06:12 crc kubenswrapper[4707]: I1127 16:06:12.741561 4707 patch_prober.go:28] interesting pod/router-default-5444994796-tggrz container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Nov 27 16:06:12 crc kubenswrapper[4707]: [-]has-synced failed: reason withheld Nov 27 16:06:12 crc kubenswrapper[4707]: [+]process-running ok Nov 27 16:06:12 crc kubenswrapper[4707]: healthz check failed Nov 27 16:06:12 crc kubenswrapper[4707]: I1127 16:06:12.741619 4707 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-tggrz" podUID="b37b9b2e-05d2-434f-bd01-93cda5a05b52" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Nov 27 16:06:12 crc kubenswrapper[4707]: I1127 16:06:12.752680 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-7k4zd" event={"ID":"7af50556-1505-45f2-b080-f1484a42f2cd","Type":"ContainerStarted","Data":"4450a4ba34015208757df0d7587150bdd11a4bdaa4dc3c81faf840c17362b129"} Nov 27 16:06:12 
crc kubenswrapper[4707]: I1127 16:06:12.761298 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-nvkng"] Nov 27 16:06:12 crc kubenswrapper[4707]: I1127 16:06:12.772985 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-d97n2" Nov 27 16:06:12 crc kubenswrapper[4707]: I1127 16:06:12.777059 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-pfqjd" event={"ID":"4bd96a64-1d3d-464d-a386-26a39642ee24","Type":"ContainerStarted","Data":"63c11abc6560dcd3a3df232c1d77948f3ab15bdee57faa57e7132b32fa1e66f8"} Nov 27 16:06:12 crc kubenswrapper[4707]: I1127 16:06:12.804637 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-jqntr" event={"ID":"8a947df3-154e-488f-9e1c-4a41cc94553c","Type":"ContainerStarted","Data":"c79c7c4e53d93f0d6553fc0c1649464584235049dd63b4f72aef3213db89d75c"} Nov 27 16:06:12 crc kubenswrapper[4707]: I1127 16:06:12.805305 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-7777fb866f-jqntr" Nov 27 16:06:12 crc kubenswrapper[4707]: I1127 16:06:12.823890 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kwvst\" (UID: \"b5e0fd66-2bb6-4b8a-aaf0-e24490a2f1a0\") " pod="openshift-image-registry/image-registry-697d97f7c8-kwvst" Nov 27 16:06:12 crc kubenswrapper[4707]: I1127 16:06:12.823938 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2832c567-f82e-487a-8e20-f5d5b10168f3-catalog-content\") pod \"community-operators-nvkng\" (UID: 
\"2832c567-f82e-487a-8e20-f5d5b10168f3\") " pod="openshift-marketplace/community-operators-nvkng" Nov 27 16:06:12 crc kubenswrapper[4707]: I1127 16:06:12.823976 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2832c567-f82e-487a-8e20-f5d5b10168f3-utilities\") pod \"community-operators-nvkng\" (UID: \"2832c567-f82e-487a-8e20-f5d5b10168f3\") " pod="openshift-marketplace/community-operators-nvkng" Nov 27 16:06:12 crc kubenswrapper[4707]: I1127 16:06:12.823997 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l85hg\" (UniqueName: \"kubernetes.io/projected/2832c567-f82e-487a-8e20-f5d5b10168f3-kube-api-access-l85hg\") pod \"community-operators-nvkng\" (UID: \"2832c567-f82e-487a-8e20-f5d5b10168f3\") " pod="openshift-marketplace/community-operators-nvkng" Nov 27 16:06:12 crc kubenswrapper[4707]: E1127 16:06:12.824932 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-27 16:06:13.324920594 +0000 UTC m=+148.956369362 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kwvst" (UID: "b5e0fd66-2bb6-4b8a-aaf0-e24490a2f1a0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 27 16:06:12 crc kubenswrapper[4707]: I1127 16:06:12.825681 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2832c567-f82e-487a-8e20-f5d5b10168f3-catalog-content\") pod \"community-operators-nvkng\" (UID: \"2832c567-f82e-487a-8e20-f5d5b10168f3\") " pod="openshift-marketplace/community-operators-nvkng" Nov 27 16:06:12 crc kubenswrapper[4707]: I1127 16:06:12.826005 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2832c567-f82e-487a-8e20-f5d5b10168f3-utilities\") pod \"community-operators-nvkng\" (UID: \"2832c567-f82e-487a-8e20-f5d5b10168f3\") " pod="openshift-marketplace/community-operators-nvkng" Nov 27 16:06:12 crc kubenswrapper[4707]: I1127 16:06:12.859702 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-dlchm" event={"ID":"e084e8d6-5275-426c-9be4-c5f4ee49abef","Type":"ContainerStarted","Data":"f38fc9c1e47a93bc0349e55b934d6adc68d33de6ec58a4e0684861e1e0381ae1"} Nov 27 16:06:12 crc kubenswrapper[4707]: I1127 16:06:12.882572 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-zg2ss" event={"ID":"90d768c2-2c32-41a1-b661-763e7c027a94","Type":"ContainerStarted","Data":"09d13d18c3b1bf13387c66539d0853824acd01b2da6d34142c72ae59338fbae0"} Nov 27 16:06:12 crc kubenswrapper[4707]: I1127 16:06:12.882623 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-dns/dns-default-zg2ss" event={"ID":"90d768c2-2c32-41a1-b661-763e7c027a94","Type":"ContainerStarted","Data":"89353986dc5dc31d22eaa5c5455c917f3f17c8892606dc818c55235fa18b3ceb"} Nov 27 16:06:12 crc kubenswrapper[4707]: I1127 16:06:12.883395 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-dns/dns-default-zg2ss" Nov 27 16:06:12 crc kubenswrapper[4707]: I1127 16:06:12.906713 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l85hg\" (UniqueName: \"kubernetes.io/projected/2832c567-f82e-487a-8e20-f5d5b10168f3-kube-api-access-l85hg\") pod \"community-operators-nvkng\" (UID: \"2832c567-f82e-487a-8e20-f5d5b10168f3\") " pod="openshift-marketplace/community-operators-nvkng" Nov 27 16:06:12 crc kubenswrapper[4707]: I1127 16:06:12.906902 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29404320-sxj55" event={"ID":"f55dc6de-bb5d-4221-a670-4b65c3992031","Type":"ContainerStarted","Data":"6f9e1859fb3d4cc72392a9151b239b734bac5c17488cfb399958b7f1f94a7520"} Nov 27 16:06:12 crc kubenswrapper[4707]: I1127 16:06:12.929058 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 27 16:06:12 crc kubenswrapper[4707]: I1127 16:06:12.945568 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-7k4zd" podStartSLOduration=127.945548897 podStartE2EDuration="2m7.945548897s" podCreationTimestamp="2025-11-27 16:04:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-27 16:06:12.921560284 
+0000 UTC m=+148.553009052" watchObservedRunningTime="2025-11-27 16:06:12.945548897 +0000 UTC m=+148.576997665" Nov 27 16:06:12 crc kubenswrapper[4707]: E1127 16:06:12.946673 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-27 16:06:13.446648665 +0000 UTC m=+149.078097433 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 27 16:06:12 crc kubenswrapper[4707]: I1127 16:06:12.949981 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-57fvk"] Nov 27 16:06:12 crc kubenswrapper[4707]: I1127 16:06:12.956105 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-nvkng" Nov 27 16:06:12 crc kubenswrapper[4707]: I1127 16:06:12.963137 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-72lr4" event={"ID":"5450b16d-7dca-4ba0-8184-bb0b4a19dc71","Type":"ContainerStarted","Data":"7b64c8bc8737a62d1dd5c4fac33a14fbd4fc124019c39aac68a2b8e99cbd76d7"} Nov 27 16:06:12 crc kubenswrapper[4707]: I1127 16:06:12.963176 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-72lr4" event={"ID":"5450b16d-7dca-4ba0-8184-bb0b4a19dc71","Type":"ContainerStarted","Data":"e01915a51dfcfb9ea0f3b93364203a40101716dc9b74ae3827e8f8b51bc01d19"} Nov 27 16:06:12 crc kubenswrapper[4707]: I1127 16:06:12.963289 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-57fvk" Nov 27 16:06:13 crc kubenswrapper[4707]: I1127 16:06:13.015188 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-lrj4r" event={"ID":"eab64752-efb6-4d5d-89b2-43da02ae599f","Type":"ContainerStarted","Data":"e22bd594548586f9f65b3fabc5a487637d702fb8373749dd7ca93630ac8868ef"} Nov 27 16:06:13 crc kubenswrapper[4707]: I1127 16:06:13.015733 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-lrj4r" event={"ID":"eab64752-efb6-4d5d-89b2-43da02ae599f","Type":"ContainerStarted","Data":"c7c2d6c1ce6cdcd4bd1d6d41584bf7faf16143d9271a79834b0ac9240c47adca"} Nov 27 16:06:13 crc kubenswrapper[4707]: I1127 16:06:13.031279 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-57fvk"] Nov 27 16:06:13 crc kubenswrapper[4707]: I1127 16:06:13.032343 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/10e9e536-d557-44af-ae4d-7472fc20ee37-catalog-content\") pod \"certified-operators-57fvk\" (UID: \"10e9e536-d557-44af-ae4d-7472fc20ee37\") " pod="openshift-marketplace/certified-operators-57fvk" Nov 27 16:06:13 crc kubenswrapper[4707]: I1127 16:06:13.032620 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/10e9e536-d557-44af-ae4d-7472fc20ee37-utilities\") pod \"certified-operators-57fvk\" (UID: \"10e9e536-d557-44af-ae4d-7472fc20ee37\") " pod="openshift-marketplace/certified-operators-57fvk" Nov 27 16:06:13 crc kubenswrapper[4707]: I1127 16:06:13.032690 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s9g5w\" (UniqueName: \"kubernetes.io/projected/10e9e536-d557-44af-ae4d-7472fc20ee37-kube-api-access-s9g5w\") pod \"certified-operators-57fvk\" (UID: \"10e9e536-d557-44af-ae4d-7472fc20ee37\") " pod="openshift-marketplace/certified-operators-57fvk" Nov 27 16:06:13 crc kubenswrapper[4707]: I1127 16:06:13.032734 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kwvst\" (UID: \"b5e0fd66-2bb6-4b8a-aaf0-e24490a2f1a0\") " pod="openshift-image-registry/image-registry-697d97f7c8-kwvst" Nov 27 16:06:13 crc kubenswrapper[4707]: E1127 16:06:13.034698 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-27 16:06:13.534685264 +0000 UTC m=+149.166134032 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kwvst" (UID: "b5e0fd66-2bb6-4b8a-aaf0-e24490a2f1a0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 27 16:06:13 crc kubenswrapper[4707]: I1127 16:06:13.086495 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-config-operator/openshift-config-operator-7777fb866f-jqntr" podStartSLOduration=128.086471138 podStartE2EDuration="2m8.086471138s" podCreationTimestamp="2025-11-27 16:04:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-27 16:06:13.047747008 +0000 UTC m=+148.679195776" watchObservedRunningTime="2025-11-27 16:06:13.086471138 +0000 UTC m=+148.717919906" Nov 27 16:06:13 crc kubenswrapper[4707]: I1127 16:06:13.104294 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-zg2ss" podStartSLOduration=8.104275073 podStartE2EDuration="8.104275073s" podCreationTimestamp="2025-11-27 16:06:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-27 16:06:13.103435501 +0000 UTC m=+148.734884269" watchObservedRunningTime="2025-11-27 16:06:13.104275073 +0000 UTC m=+148.735723841" Nov 27 16:06:13 crc kubenswrapper[4707]: I1127 16:06:13.112879 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-khskd" event={"ID":"9744e361-2c35-436c-a453-984cff9d923f","Type":"ContainerStarted","Data":"fe909c8176356d75a21b3c99394b9b478808633f97cb4b0ec194b862c3bd3d5d"} Nov 27 16:06:13 crc kubenswrapper[4707]: I1127 
16:06:13.112922 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-khskd" event={"ID":"9744e361-2c35-436c-a453-984cff9d923f","Type":"ContainerStarted","Data":"3a6e88caa1151d8e584757b49ed0a85f71e8051ca1138289e4df0e93a3a7f551"} Nov 27 16:06:13 crc kubenswrapper[4707]: I1127 16:06:13.145396 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 27 16:06:13 crc kubenswrapper[4707]: I1127 16:06:13.145735 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 27 16:06:13 crc kubenswrapper[4707]: I1127 16:06:13.145771 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 27 16:06:13 crc kubenswrapper[4707]: I1127 16:06:13.145795 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/10e9e536-d557-44af-ae4d-7472fc20ee37-catalog-content\") pod \"certified-operators-57fvk\" (UID: \"10e9e536-d557-44af-ae4d-7472fc20ee37\") " pod="openshift-marketplace/certified-operators-57fvk" Nov 27 16:06:13 crc kubenswrapper[4707]: 
I1127 16:06:13.145817 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 27 16:06:13 crc kubenswrapper[4707]: I1127 16:06:13.145883 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/10e9e536-d557-44af-ae4d-7472fc20ee37-utilities\") pod \"certified-operators-57fvk\" (UID: \"10e9e536-d557-44af-ae4d-7472fc20ee37\") " pod="openshift-marketplace/certified-operators-57fvk" Nov 27 16:06:13 crc kubenswrapper[4707]: I1127 16:06:13.145902 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s9g5w\" (UniqueName: \"kubernetes.io/projected/10e9e536-d557-44af-ae4d-7472fc20ee37-kube-api-access-s9g5w\") pod \"certified-operators-57fvk\" (UID: \"10e9e536-d557-44af-ae4d-7472fc20ee37\") " pod="openshift-marketplace/certified-operators-57fvk" Nov 27 16:06:13 crc kubenswrapper[4707]: I1127 16:06:13.145921 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 27 16:06:13 crc kubenswrapper[4707]: E1127 16:06:13.146557 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-11-27 16:06:13.646525772 +0000 UTC m=+149.277974540 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 27 16:06:13 crc kubenswrapper[4707]: I1127 16:06:13.146905 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/10e9e536-d557-44af-ae4d-7472fc20ee37-catalog-content\") pod \"certified-operators-57fvk\" (UID: \"10e9e536-d557-44af-ae4d-7472fc20ee37\") " pod="openshift-marketplace/certified-operators-57fvk" Nov 27 16:06:13 crc kubenswrapper[4707]: I1127 16:06:13.148082 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-74mzf" event={"ID":"789119b0-3180-49e9-8d16-c60f968bf6cf","Type":"ContainerStarted","Data":"9aa06ba7a7591f973e615d1470a57460665aaabdad3cab1bf4b352353c1a8ed0"} Nov 27 16:06:13 crc kubenswrapper[4707]: I1127 16:06:13.149183 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/10e9e536-d557-44af-ae4d-7472fc20ee37-utilities\") pod \"certified-operators-57fvk\" (UID: \"10e9e536-d557-44af-ae4d-7472fc20ee37\") " pod="openshift-marketplace/certified-operators-57fvk" Nov 27 16:06:13 crc kubenswrapper[4707]: I1127 16:06:13.150819 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod 
\"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 27 16:06:13 crc kubenswrapper[4707]: I1127 16:06:13.157523 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 27 16:06:13 crc kubenswrapper[4707]: I1127 16:06:13.160099 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 27 16:06:13 crc kubenswrapper[4707]: I1127 16:06:13.164364 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 27 16:06:13 crc kubenswrapper[4707]: I1127 16:06:13.176533 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-slj9l" event={"ID":"24bf08cf-63a0-47ca-be2a-0f38a51109c9","Type":"ContainerStarted","Data":"e9660435669ed9cf09b7a1a07b960e44d6ed555557b641d35fc9b38cb3b4486b"} Nov 27 16:06:13 crc kubenswrapper[4707]: I1127 16:06:13.178087 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-d8nfn" 
event={"ID":"ce9dffbd-e000-476b-b89b-6208d0506f26","Type":"ContainerStarted","Data":"ededc5983d81ad089beeb4854acc05cac34f5a4b876b5052c0f1eb559b750bda"} Nov 27 16:06:13 crc kubenswrapper[4707]: I1127 16:06:13.185521 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s9g5w\" (UniqueName: \"kubernetes.io/projected/10e9e536-d557-44af-ae4d-7472fc20ee37-kube-api-access-s9g5w\") pod \"certified-operators-57fvk\" (UID: \"10e9e536-d557-44af-ae4d-7472fc20ee37\") " pod="openshift-marketplace/certified-operators-57fvk" Nov 27 16:06:13 crc kubenswrapper[4707]: I1127 16:06:13.186396 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-ghzc4" event={"ID":"1a3a3427-44f4-4518-861e-f11f5cb76d90","Type":"ContainerStarted","Data":"7334103362e1aee9be40b085e100a2152014009db79b845e2578e54be1ecc1e2"} Nov 27 16:06:13 crc kubenswrapper[4707]: I1127 16:06:13.190596 4707 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-jtqhs container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.38:8080/healthz\": dial tcp 10.217.0.38:8080: connect: connection refused" start-of-body= Nov 27 16:06:13 crc kubenswrapper[4707]: I1127 16:06:13.190635 4707 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-jtqhs" podUID="d9ba519a-a9a8-4d1a-96c1-e66ee87d5c8f" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.38:8080/healthz\": dial tcp 10.217.0.38:8080: connect: connection refused" Nov 27 16:06:13 crc kubenswrapper[4707]: I1127 16:06:13.213794 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 27 16:06:13 crc kubenswrapper[4707]: I1127 16:06:13.227714 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 27 16:06:13 crc kubenswrapper[4707]: I1127 16:06:13.231865 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-hg2x5" Nov 27 16:06:13 crc kubenswrapper[4707]: I1127 16:06:13.231920 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-49hfs" Nov 27 16:06:13 crc kubenswrapper[4707]: I1127 16:06:13.235473 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 27 16:06:13 crc kubenswrapper[4707]: I1127 16:06:13.252885 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kwvst\" (UID: \"b5e0fd66-2bb6-4b8a-aaf0-e24490a2f1a0\") " pod="openshift-image-registry/image-registry-697d97f7c8-kwvst" Nov 27 16:06:13 crc kubenswrapper[4707]: E1127 16:06:13.253189 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-27 16:06:13.753177028 +0000 UTC m=+149.384625786 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kwvst" (UID: "b5e0fd66-2bb6-4b8a-aaf0-e24490a2f1a0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 27 16:06:13 crc kubenswrapper[4707]: I1127 16:06:13.285459 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-lrj4r" podStartSLOduration=128.285433212 podStartE2EDuration="2m8.285433212s" podCreationTimestamp="2025-11-27 16:04:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-27 16:06:13.211823251 +0000 UTC m=+148.843272039" watchObservedRunningTime="2025-11-27 16:06:13.285433212 +0000 UTC m=+148.916881980" Nov 27 16:06:13 crc kubenswrapper[4707]: I1127 16:06:13.331362 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-admission-controller-857f4d67dd-72lr4" podStartSLOduration=128.331346455 podStartE2EDuration="2m8.331346455s" podCreationTimestamp="2025-11-27 16:04:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-27 16:06:13.31313014 +0000 UTC m=+148.944578918" watchObservedRunningTime="2025-11-27 16:06:13.331346455 +0000 UTC m=+148.962795223" Nov 27 16:06:13 crc kubenswrapper[4707]: I1127 16:06:13.356058 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 27 16:06:13 crc kubenswrapper[4707]: I1127 16:06:13.357982 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-57fvk" Nov 27 16:06:13 crc kubenswrapper[4707]: I1127 16:06:13.381964 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns-operator/dns-operator-744455d44c-ghzc4" podStartSLOduration=128.381941028 podStartE2EDuration="2m8.381941028s" podCreationTimestamp="2025-11-27 16:04:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-27 16:06:13.380214884 +0000 UTC m=+149.011663652" watchObservedRunningTime="2025-11-27 16:06:13.381941028 +0000 UTC m=+149.013389796" Nov 27 16:06:13 crc kubenswrapper[4707]: E1127 16:06:13.385060 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-27 16:06:13.885032877 +0000 UTC m=+149.516481645 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 27 16:06:13 crc kubenswrapper[4707]: I1127 16:06:13.461094 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kwvst\" (UID: \"b5e0fd66-2bb6-4b8a-aaf0-e24490a2f1a0\") " pod="openshift-image-registry/image-registry-697d97f7c8-kwvst" Nov 27 16:06:13 crc kubenswrapper[4707]: E1127 16:06:13.461461 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-27 16:06:13.96144715 +0000 UTC m=+149.592895918 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kwvst" (UID: "b5e0fd66-2bb6-4b8a-aaf0-e24490a2f1a0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 27 16:06:13 crc kubenswrapper[4707]: I1127 16:06:13.530411 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver/apiserver-76f77b778f-khskd" podStartSLOduration=128.530393322 podStartE2EDuration="2m8.530393322s" podCreationTimestamp="2025-11-27 16:04:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-27 16:06:13.528436712 +0000 UTC m=+149.159885490" watchObservedRunningTime="2025-11-27 16:06:13.530393322 +0000 UTC m=+149.161842090" Nov 27 16:06:13 crc kubenswrapper[4707]: I1127 16:06:13.563460 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 27 16:06:13 crc kubenswrapper[4707]: E1127 16:06:13.563938 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-27 16:06:14.063915869 +0000 UTC m=+149.695364627 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 27 16:06:13 crc kubenswrapper[4707]: I1127 16:06:13.666224 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kwvst\" (UID: \"b5e0fd66-2bb6-4b8a-aaf0-e24490a2f1a0\") " pod="openshift-image-registry/image-registry-697d97f7c8-kwvst" Nov 27 16:06:13 crc kubenswrapper[4707]: E1127 16:06:13.666596 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-27 16:06:14.166581762 +0000 UTC m=+149.798030520 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kwvst" (UID: "b5e0fd66-2bb6-4b8a-aaf0-e24490a2f1a0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 27 16:06:13 crc kubenswrapper[4707]: I1127 16:06:13.736540 4707 patch_prober.go:28] interesting pod/router-default-5444994796-tggrz container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Nov 27 16:06:13 crc kubenswrapper[4707]: [-]has-synced failed: reason withheld Nov 27 16:06:13 crc kubenswrapper[4707]: [+]process-running ok Nov 27 16:06:13 crc kubenswrapper[4707]: healthz check failed Nov 27 16:06:13 crc kubenswrapper[4707]: I1127 16:06:13.736645 4707 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-tggrz" podUID="b37b9b2e-05d2-434f-bd01-93cda5a05b52" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Nov 27 16:06:13 crc kubenswrapper[4707]: I1127 16:06:13.770163 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 27 16:06:13 crc kubenswrapper[4707]: E1127 16:06:13.770661 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-11-27 16:06:14.270630531 +0000 UTC m=+149.902079299 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 27 16:06:13 crc kubenswrapper[4707]: I1127 16:06:13.770831 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kwvst\" (UID: \"b5e0fd66-2bb6-4b8a-aaf0-e24490a2f1a0\") " pod="openshift-image-registry/image-registry-697d97f7c8-kwvst" Nov 27 16:06:13 crc kubenswrapper[4707]: E1127 16:06:13.771195 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-27 16:06:14.271181335 +0000 UTC m=+149.902630103 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kwvst" (UID: "b5e0fd66-2bb6-4b8a-aaf0-e24490a2f1a0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 27 16:06:13 crc kubenswrapper[4707]: I1127 16:06:13.874171 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 27 16:06:13 crc kubenswrapper[4707]: E1127 16:06:13.874593 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-27 16:06:14.374571947 +0000 UTC m=+150.006020715 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 27 16:06:13 crc kubenswrapper[4707]: I1127 16:06:13.976287 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kwvst\" (UID: \"b5e0fd66-2bb6-4b8a-aaf0-e24490a2f1a0\") " pod="openshift-image-registry/image-registry-697d97f7c8-kwvst" Nov 27 16:06:13 crc kubenswrapper[4707]: E1127 16:06:13.978220 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-27 16:06:14.478203015 +0000 UTC m=+150.109651783 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kwvst" (UID: "b5e0fd66-2bb6-4b8a-aaf0-e24490a2f1a0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 27 16:06:14 crc kubenswrapper[4707]: I1127 16:06:14.042513 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-dnk6z"] Nov 27 16:06:14 crc kubenswrapper[4707]: I1127 16:06:14.076477 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-d97n2"] Nov 27 16:06:14 crc kubenswrapper[4707]: I1127 16:06:14.077827 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 27 16:06:14 crc kubenswrapper[4707]: E1127 16:06:14.078234 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-27 16:06:14.578214561 +0000 UTC m=+150.209663329 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 27 16:06:14 crc kubenswrapper[4707]: I1127 16:06:14.089225 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-nvkng"] Nov 27 16:06:14 crc kubenswrapper[4707]: I1127 16:06:14.115553 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Nov 27 16:06:14 crc kubenswrapper[4707]: I1127 16:06:14.116267 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Nov 27 16:06:14 crc kubenswrapper[4707]: I1127 16:06:14.126586 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-kjl2n" Nov 27 16:06:14 crc kubenswrapper[4707]: I1127 16:06:14.126889 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt" Nov 27 16:06:14 crc kubenswrapper[4707]: I1127 16:06:14.180587 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/4599ff2b-0390-49da-a99e-f003afa58144-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"4599ff2b-0390-49da-a99e-f003afa58144\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Nov 27 16:06:14 crc kubenswrapper[4707]: I1127 16:06:14.180646 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/4599ff2b-0390-49da-a99e-f003afa58144-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"4599ff2b-0390-49da-a99e-f003afa58144\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Nov 27 16:06:14 crc kubenswrapper[4707]: I1127 16:06:14.180694 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kwvst\" (UID: \"b5e0fd66-2bb6-4b8a-aaf0-e24490a2f1a0\") " pod="openshift-image-registry/image-registry-697d97f7c8-kwvst" Nov 27 16:06:14 crc kubenswrapper[4707]: E1127 16:06:14.181022 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-27 16:06:14.681008868 +0000 UTC m=+150.312457636 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kwvst" (UID: "b5e0fd66-2bb6-4b8a-aaf0-e24490a2f1a0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 27 16:06:14 crc kubenswrapper[4707]: I1127 16:06:14.187009 4707 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-h4m82 container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.36:5443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Nov 27 16:06:14 crc kubenswrapper[4707]: I1127 16:06:14.187079 4707 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-h4m82" podUID="efcebfb2-d130-4f47-a514-01d1bf5eb567" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.36:5443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Nov 27 16:06:14 crc kubenswrapper[4707]: I1127 16:06:14.188466 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Nov 27 16:06:14 crc kubenswrapper[4707]: I1127 16:06:14.203728 4707 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-jtqhs container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.38:8080/healthz\": dial tcp 10.217.0.38:8080: connect: connection refused" start-of-body= Nov 27 16:06:14 crc kubenswrapper[4707]: I1127 16:06:14.203770 4707 prober.go:107] "Probe failed" probeType="Readiness" 
pod="openshift-marketplace/marketplace-operator-79b997595-jtqhs" podUID="d9ba519a-a9a8-4d1a-96c1-e66ee87d5c8f" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.38:8080/healthz\": dial tcp 10.217.0.38:8080: connect: connection refused" Nov 27 16:06:14 crc kubenswrapper[4707]: I1127 16:06:14.290710 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 27 16:06:14 crc kubenswrapper[4707]: E1127 16:06:14.306694 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-27 16:06:14.806670389 +0000 UTC m=+150.438119157 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 27 16:06:14 crc kubenswrapper[4707]: I1127 16:06:14.308641 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/4599ff2b-0390-49da-a99e-f003afa58144-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"4599ff2b-0390-49da-a99e-f003afa58144\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Nov 27 16:06:14 crc kubenswrapper[4707]: I1127 16:06:14.308899 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4599ff2b-0390-49da-a99e-f003afa58144-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"4599ff2b-0390-49da-a99e-f003afa58144\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Nov 27 16:06:14 crc kubenswrapper[4707]: I1127 16:06:14.309232 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kwvst\" (UID: \"b5e0fd66-2bb6-4b8a-aaf0-e24490a2f1a0\") " pod="openshift-image-registry/image-registry-697d97f7c8-kwvst" Nov 27 16:06:14 crc kubenswrapper[4707]: I1127 16:06:14.331045 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/4599ff2b-0390-49da-a99e-f003afa58144-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: 
\"4599ff2b-0390-49da-a99e-f003afa58144\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Nov 27 16:06:14 crc kubenswrapper[4707]: E1127 16:06:14.332448 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-27 16:06:14.832434628 +0000 UTC m=+150.463883396 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kwvst" (UID: "b5e0fd66-2bb6-4b8a-aaf0-e24490a2f1a0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 27 16:06:14 crc kubenswrapper[4707]: I1127 16:06:14.394561 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4599ff2b-0390-49da-a99e-f003afa58144-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"4599ff2b-0390-49da-a99e-f003afa58144\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Nov 27 16:06:14 crc kubenswrapper[4707]: I1127 16:06:14.438669 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 27 16:06:14 crc kubenswrapper[4707]: E1127 16:06:14.439117 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-11-27 16:06:14.939082863 +0000 UTC m=+150.570531631 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 27 16:06:14 crc kubenswrapper[4707]: I1127 16:06:14.439561 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kwvst\" (UID: \"b5e0fd66-2bb6-4b8a-aaf0-e24490a2f1a0\") " pod="openshift-image-registry/image-registry-697d97f7c8-kwvst" Nov 27 16:06:14 crc kubenswrapper[4707]: E1127 16:06:14.439892 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-27 16:06:14.939885234 +0000 UTC m=+150.571334002 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kwvst" (UID: "b5e0fd66-2bb6-4b8a-aaf0-e24490a2f1a0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 27 16:06:14 crc kubenswrapper[4707]: I1127 16:06:14.499951 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-xhshk"] Nov 27 16:06:14 crc kubenswrapper[4707]: I1127 16:06:14.501280 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-xhshk" Nov 27 16:06:14 crc kubenswrapper[4707]: I1127 16:06:14.514702 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Nov 27 16:06:14 crc kubenswrapper[4707]: I1127 16:06:14.535798 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-xhshk"] Nov 27 16:06:14 crc kubenswrapper[4707]: I1127 16:06:14.543093 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 27 16:06:14 crc kubenswrapper[4707]: E1127 16:06:14.543525 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-27 16:06:15.043508481 +0000 UTC m=+150.674957249 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 27 16:06:14 crc kubenswrapper[4707]: I1127 16:06:14.544119 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Nov 27 16:06:14 crc kubenswrapper[4707]: I1127 16:06:14.658855 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2b313184-19c4-42e4-b488-63f8e894feea-catalog-content\") pod \"redhat-marketplace-xhshk\" (UID: \"2b313184-19c4-42e4-b488-63f8e894feea\") " pod="openshift-marketplace/redhat-marketplace-xhshk" Nov 27 16:06:14 crc kubenswrapper[4707]: I1127 16:06:14.658924 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2b313184-19c4-42e4-b488-63f8e894feea-utilities\") pod \"redhat-marketplace-xhshk\" (UID: \"2b313184-19c4-42e4-b488-63f8e894feea\") " pod="openshift-marketplace/redhat-marketplace-xhshk" Nov 27 16:06:14 crc kubenswrapper[4707]: I1127 16:06:14.658953 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kwvst\" (UID: \"b5e0fd66-2bb6-4b8a-aaf0-e24490a2f1a0\") " pod="openshift-image-registry/image-registry-697d97f7c8-kwvst" Nov 27 16:06:14 crc kubenswrapper[4707]: I1127 16:06:14.658982 4707 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vv7fb\" (UniqueName: \"kubernetes.io/projected/2b313184-19c4-42e4-b488-63f8e894feea-kube-api-access-vv7fb\") pod \"redhat-marketplace-xhshk\" (UID: \"2b313184-19c4-42e4-b488-63f8e894feea\") " pod="openshift-marketplace/redhat-marketplace-xhshk" Nov 27 16:06:14 crc kubenswrapper[4707]: E1127 16:06:14.659436 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-27 16:06:15.159415963 +0000 UTC m=+150.790864731 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kwvst" (UID: "b5e0fd66-2bb6-4b8a-aaf0-e24490a2f1a0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 27 16:06:14 crc kubenswrapper[4707]: I1127 16:06:14.748503 4707 patch_prober.go:28] interesting pod/router-default-5444994796-tggrz container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Nov 27 16:06:14 crc kubenswrapper[4707]: [-]has-synced failed: reason withheld Nov 27 16:06:14 crc kubenswrapper[4707]: [+]process-running ok Nov 27 16:06:14 crc kubenswrapper[4707]: healthz check failed Nov 27 16:06:14 crc kubenswrapper[4707]: I1127 16:06:14.748861 4707 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-tggrz" podUID="b37b9b2e-05d2-434f-bd01-93cda5a05b52" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 
500" Nov 27 16:06:14 crc kubenswrapper[4707]: I1127 16:06:14.770821 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 27 16:06:14 crc kubenswrapper[4707]: I1127 16:06:14.771013 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2b313184-19c4-42e4-b488-63f8e894feea-utilities\") pod \"redhat-marketplace-xhshk\" (UID: \"2b313184-19c4-42e4-b488-63f8e894feea\") " pod="openshift-marketplace/redhat-marketplace-xhshk" Nov 27 16:06:14 crc kubenswrapper[4707]: I1127 16:06:14.771062 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vv7fb\" (UniqueName: \"kubernetes.io/projected/2b313184-19c4-42e4-b488-63f8e894feea-kube-api-access-vv7fb\") pod \"redhat-marketplace-xhshk\" (UID: \"2b313184-19c4-42e4-b488-63f8e894feea\") " pod="openshift-marketplace/redhat-marketplace-xhshk" Nov 27 16:06:14 crc kubenswrapper[4707]: I1127 16:06:14.771143 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2b313184-19c4-42e4-b488-63f8e894feea-catalog-content\") pod \"redhat-marketplace-xhshk\" (UID: \"2b313184-19c4-42e4-b488-63f8e894feea\") " pod="openshift-marketplace/redhat-marketplace-xhshk" Nov 27 16:06:14 crc kubenswrapper[4707]: I1127 16:06:14.771607 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2b313184-19c4-42e4-b488-63f8e894feea-catalog-content\") pod \"redhat-marketplace-xhshk\" (UID: \"2b313184-19c4-42e4-b488-63f8e894feea\") " pod="openshift-marketplace/redhat-marketplace-xhshk" Nov 27 16:06:14 crc 
kubenswrapper[4707]: E1127 16:06:14.771917 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-27 16:06:15.271900738 +0000 UTC m=+150.903349506 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 27 16:06:14 crc kubenswrapper[4707]: I1127 16:06:14.789076 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2b313184-19c4-42e4-b488-63f8e894feea-utilities\") pod \"redhat-marketplace-xhshk\" (UID: \"2b313184-19c4-42e4-b488-63f8e894feea\") " pod="openshift-marketplace/redhat-marketplace-xhshk" Nov 27 16:06:14 crc kubenswrapper[4707]: I1127 16:06:14.828173 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vv7fb\" (UniqueName: \"kubernetes.io/projected/2b313184-19c4-42e4-b488-63f8e894feea-kube-api-access-vv7fb\") pod \"redhat-marketplace-xhshk\" (UID: \"2b313184-19c4-42e4-b488-63f8e894feea\") " pod="openshift-marketplace/redhat-marketplace-xhshk" Nov 27 16:06:14 crc kubenswrapper[4707]: I1127 16:06:14.842406 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-57fvk"] Nov 27 16:06:14 crc kubenswrapper[4707]: I1127 16:06:14.857929 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-vfv9n"] Nov 27 16:06:14 crc kubenswrapper[4707]: I1127 
16:06:14.860229 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-vfv9n" Nov 27 16:06:14 crc kubenswrapper[4707]: I1127 16:06:14.872278 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kwvst\" (UID: \"b5e0fd66-2bb6-4b8a-aaf0-e24490a2f1a0\") " pod="openshift-image-registry/image-registry-697d97f7c8-kwvst" Nov 27 16:06:14 crc kubenswrapper[4707]: E1127 16:06:14.872788 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-27 16:06:15.372774186 +0000 UTC m=+151.004222944 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kwvst" (UID: "b5e0fd66-2bb6-4b8a-aaf0-e24490a2f1a0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 27 16:06:14 crc kubenswrapper[4707]: I1127 16:06:14.902108 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-vfv9n"] Nov 27 16:06:14 crc kubenswrapper[4707]: I1127 16:06:14.912247 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-xhshk" Nov 27 16:06:14 crc kubenswrapper[4707]: I1127 16:06:14.980597 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 27 16:06:14 crc kubenswrapper[4707]: I1127 16:06:14.981019 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r7q62\" (UniqueName: \"kubernetes.io/projected/ea8aba87-c190-4442-bd3f-5d2d9e2f38e4-kube-api-access-r7q62\") pod \"redhat-marketplace-vfv9n\" (UID: \"ea8aba87-c190-4442-bd3f-5d2d9e2f38e4\") " pod="openshift-marketplace/redhat-marketplace-vfv9n" Nov 27 16:06:14 crc kubenswrapper[4707]: I1127 16:06:14.981072 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ea8aba87-c190-4442-bd3f-5d2d9e2f38e4-utilities\") pod \"redhat-marketplace-vfv9n\" (UID: \"ea8aba87-c190-4442-bd3f-5d2d9e2f38e4\") " pod="openshift-marketplace/redhat-marketplace-vfv9n" Nov 27 16:06:14 crc kubenswrapper[4707]: I1127 16:06:14.981092 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ea8aba87-c190-4442-bd3f-5d2d9e2f38e4-catalog-content\") pod \"redhat-marketplace-vfv9n\" (UID: \"ea8aba87-c190-4442-bd3f-5d2d9e2f38e4\") " pod="openshift-marketplace/redhat-marketplace-vfv9n" Nov 27 16:06:14 crc kubenswrapper[4707]: E1127 16:06:14.981190 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-11-27 16:06:15.481174246 +0000 UTC m=+151.112623014 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 27 16:06:15 crc kubenswrapper[4707]: I1127 16:06:15.082494 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kwvst\" (UID: \"b5e0fd66-2bb6-4b8a-aaf0-e24490a2f1a0\") " pod="openshift-image-registry/image-registry-697d97f7c8-kwvst" Nov 27 16:06:15 crc kubenswrapper[4707]: I1127 16:06:15.083092 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r7q62\" (UniqueName: \"kubernetes.io/projected/ea8aba87-c190-4442-bd3f-5d2d9e2f38e4-kube-api-access-r7q62\") pod \"redhat-marketplace-vfv9n\" (UID: \"ea8aba87-c190-4442-bd3f-5d2d9e2f38e4\") " pod="openshift-marketplace/redhat-marketplace-vfv9n" Nov 27 16:06:15 crc kubenswrapper[4707]: I1127 16:06:15.083147 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ea8aba87-c190-4442-bd3f-5d2d9e2f38e4-utilities\") pod \"redhat-marketplace-vfv9n\" (UID: \"ea8aba87-c190-4442-bd3f-5d2d9e2f38e4\") " pod="openshift-marketplace/redhat-marketplace-vfv9n" Nov 27 16:06:15 crc kubenswrapper[4707]: I1127 16:06:15.083171 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/ea8aba87-c190-4442-bd3f-5d2d9e2f38e4-catalog-content\") pod \"redhat-marketplace-vfv9n\" (UID: \"ea8aba87-c190-4442-bd3f-5d2d9e2f38e4\") " pod="openshift-marketplace/redhat-marketplace-vfv9n" Nov 27 16:06:15 crc kubenswrapper[4707]: I1127 16:06:15.083732 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ea8aba87-c190-4442-bd3f-5d2d9e2f38e4-catalog-content\") pod \"redhat-marketplace-vfv9n\" (UID: \"ea8aba87-c190-4442-bd3f-5d2d9e2f38e4\") " pod="openshift-marketplace/redhat-marketplace-vfv9n" Nov 27 16:06:15 crc kubenswrapper[4707]: E1127 16:06:15.084106 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-27 16:06:15.584090065 +0000 UTC m=+151.215538833 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kwvst" (UID: "b5e0fd66-2bb6-4b8a-aaf0-e24490a2f1a0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 27 16:06:15 crc kubenswrapper[4707]: I1127 16:06:15.084853 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ea8aba87-c190-4442-bd3f-5d2d9e2f38e4-utilities\") pod \"redhat-marketplace-vfv9n\" (UID: \"ea8aba87-c190-4442-bd3f-5d2d9e2f38e4\") " pod="openshift-marketplace/redhat-marketplace-vfv9n" Nov 27 16:06:15 crc kubenswrapper[4707]: I1127 16:06:15.112201 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r7q62\" (UniqueName: 
\"kubernetes.io/projected/ea8aba87-c190-4442-bd3f-5d2d9e2f38e4-kube-api-access-r7q62\") pod \"redhat-marketplace-vfv9n\" (UID: \"ea8aba87-c190-4442-bd3f-5d2d9e2f38e4\") " pod="openshift-marketplace/redhat-marketplace-vfv9n" Nov 27 16:06:15 crc kubenswrapper[4707]: I1127 16:06:15.128972 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-7777fb866f-jqntr" Nov 27 16:06:15 crc kubenswrapper[4707]: I1127 16:06:15.184400 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 27 16:06:15 crc kubenswrapper[4707]: E1127 16:06:15.184860 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-27 16:06:15.684841509 +0000 UTC m=+151.316290277 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 27 16:06:15 crc kubenswrapper[4707]: I1127 16:06:15.205209 4707 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-h4m82 container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.36:5443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Nov 27 16:06:15 crc kubenswrapper[4707]: I1127 16:06:15.205295 4707 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-h4m82" podUID="efcebfb2-d130-4f47-a514-01d1bf5eb567" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.36:5443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Nov 27 16:06:15 crc kubenswrapper[4707]: I1127 16:06:15.286423 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kwvst\" (UID: \"b5e0fd66-2bb6-4b8a-aaf0-e24490a2f1a0\") " pod="openshift-image-registry/image-registry-697d97f7c8-kwvst" Nov 27 16:06:15 crc kubenswrapper[4707]: E1127 16:06:15.286963 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" 
failed. No retries permitted until 2025-11-27 16:06:15.786939178 +0000 UTC m=+151.418387946 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kwvst" (UID: "b5e0fd66-2bb6-4b8a-aaf0-e24490a2f1a0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 27 16:06:15 crc kubenswrapper[4707]: I1127 16:06:15.291574 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"4d369a1b4c2780d77d881dac74c3f7f70e97ef504c3a944c3575902d96312d06"} Nov 27 16:06:15 crc kubenswrapper[4707]: I1127 16:06:15.293537 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-vfv9n" Nov 27 16:06:15 crc kubenswrapper[4707]: I1127 16:06:15.311016 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-slj9l" event={"ID":"24bf08cf-63a0-47ca-be2a-0f38a51109c9","Type":"ContainerStarted","Data":"0d4f8882752597c1a06f44b8f5d231241585e292625d619499f5f15b35d2e2ca"} Nov 27 16:06:15 crc kubenswrapper[4707]: I1127 16:06:15.352607 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-57fvk" event={"ID":"10e9e536-d557-44af-ae4d-7472fc20ee37","Type":"ContainerStarted","Data":"fec1dedbb4165174711d46771f90496c2facef1ca99b15743760606f71d921c6"} Nov 27 16:06:15 crc kubenswrapper[4707]: I1127 16:06:15.364159 4707 generic.go:334] "Generic (PLEG): container finished" podID="2832c567-f82e-487a-8e20-f5d5b10168f3" containerID="18f05b2cc342f9dde250ff35d25badbed221136876b8c3e913b681fa5db70a55" exitCode=0 Nov 27 16:06:15 crc kubenswrapper[4707]: I1127 16:06:15.364242 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nvkng" event={"ID":"2832c567-f82e-487a-8e20-f5d5b10168f3","Type":"ContainerDied","Data":"18f05b2cc342f9dde250ff35d25badbed221136876b8c3e913b681fa5db70a55"} Nov 27 16:06:15 crc kubenswrapper[4707]: I1127 16:06:15.364271 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nvkng" event={"ID":"2832c567-f82e-487a-8e20-f5d5b10168f3","Type":"ContainerStarted","Data":"a12f448491630f1a08ed692225a7a9d2f243be97286502ae677ca3b99a8d0b8e"} Nov 27 16:06:15 crc kubenswrapper[4707]: I1127 16:06:15.374558 4707 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Nov 27 16:06:15 crc kubenswrapper[4707]: I1127 16:06:15.392044 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 27 16:06:15 crc kubenswrapper[4707]: E1127 16:06:15.392761 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-27 16:06:15.892743282 +0000 UTC m=+151.524192050 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 27 16:06:15 crc kubenswrapper[4707]: I1127 16:06:15.406539 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Nov 27 16:06:15 crc kubenswrapper[4707]: I1127 16:06:15.410568 4707 generic.go:334] "Generic (PLEG): container finished" podID="bafb40a6-a701-4082-a791-65fce73e8669" containerID="ba9f98f6b84df7f95cb72ef86b44e6a6c08801f9ac5dc5bfcd517529fbd7fc21" exitCode=0 Nov 27 16:06:15 crc kubenswrapper[4707]: I1127 16:06:15.410667 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-dnk6z" event={"ID":"bafb40a6-a701-4082-a791-65fce73e8669","Type":"ContainerDied","Data":"ba9f98f6b84df7f95cb72ef86b44e6a6c08801f9ac5dc5bfcd517529fbd7fc21"} Nov 27 16:06:15 crc kubenswrapper[4707]: I1127 16:06:15.410752 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-dnk6z" 
event={"ID":"bafb40a6-a701-4082-a791-65fce73e8669","Type":"ContainerStarted","Data":"d128c0ff7399430c722aa19967dfb6b9eb9c9f02bb5858723944cea2a5cee014"} Nov 27 16:06:15 crc kubenswrapper[4707]: I1127 16:06:15.450709 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"ec03ecaa48bb8f6785a992c96059edd01fcb74a4e396939caecd48c54e3227db"} Nov 27 16:06:15 crc kubenswrapper[4707]: I1127 16:06:15.459119 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-s5nbn"] Nov 27 16:06:15 crc kubenswrapper[4707]: I1127 16:06:15.461883 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-s5nbn" Nov 27 16:06:15 crc kubenswrapper[4707]: I1127 16:06:15.474218 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Nov 27 16:06:15 crc kubenswrapper[4707]: I1127 16:06:15.477711 4707 generic.go:334] "Generic (PLEG): container finished" podID="f0d7e53b-3b6f-469d-b550-e61dbea7724f" containerID="29bcbd99c9e0940a37b30fa6eae894f0c121373597e19291f9abb1b10d814829" exitCode=0 Nov 27 16:06:15 crc kubenswrapper[4707]: I1127 16:06:15.477793 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-d97n2" event={"ID":"f0d7e53b-3b6f-469d-b550-e61dbea7724f","Type":"ContainerDied","Data":"29bcbd99c9e0940a37b30fa6eae894f0c121373597e19291f9abb1b10d814829"} Nov 27 16:06:15 crc kubenswrapper[4707]: I1127 16:06:15.477829 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-d97n2" event={"ID":"f0d7e53b-3b6f-469d-b550-e61dbea7724f","Type":"ContainerStarted","Data":"bcbc2755d8fde736ca79c664d9ad52162b326fb62c35335e122e25fd0c738771"} Nov 27 16:06:15 crc kubenswrapper[4707]: I1127 
16:06:15.494081 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-s5nbn"] Nov 27 16:06:15 crc kubenswrapper[4707]: I1127 16:06:15.494418 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kwvst\" (UID: \"b5e0fd66-2bb6-4b8a-aaf0-e24490a2f1a0\") " pod="openshift-image-registry/image-registry-697d97f7c8-kwvst" Nov 27 16:06:15 crc kubenswrapper[4707]: E1127 16:06:15.497115 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-27 16:06:15.997090029 +0000 UTC m=+151.628538797 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kwvst" (UID: "b5e0fd66-2bb6-4b8a-aaf0-e24490a2f1a0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 27 16:06:15 crc kubenswrapper[4707]: I1127 16:06:15.528871 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"ccad8535b1100fc7f2dcb13b346d5cabd43f332ce0269506976e5beec65b520a"} Nov 27 16:06:15 crc kubenswrapper[4707]: I1127 16:06:15.557905 4707 generic.go:334] "Generic (PLEG): container finished" podID="f55dc6de-bb5d-4221-a670-4b65c3992031" containerID="6f9e1859fb3d4cc72392a9151b239b734bac5c17488cfb399958b7f1f94a7520" exitCode=0 Nov 27 16:06:15 
crc kubenswrapper[4707]: I1127 16:06:15.559268 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29404320-sxj55" event={"ID":"f55dc6de-bb5d-4221-a670-4b65c3992031","Type":"ContainerDied","Data":"6f9e1859fb3d4cc72392a9151b239b734bac5c17488cfb399958b7f1f94a7520"} Nov 27 16:06:15 crc kubenswrapper[4707]: I1127 16:06:15.596905 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 27 16:06:15 crc kubenswrapper[4707]: I1127 16:06:15.597163 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fb266fde-0b2c-4866-8274-9ed2d4821c14-catalog-content\") pod \"redhat-operators-s5nbn\" (UID: \"fb266fde-0b2c-4866-8274-9ed2d4821c14\") " pod="openshift-marketplace/redhat-operators-s5nbn" Nov 27 16:06:15 crc kubenswrapper[4707]: I1127 16:06:15.597225 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6lz5k\" (UniqueName: \"kubernetes.io/projected/fb266fde-0b2c-4866-8274-9ed2d4821c14-kube-api-access-6lz5k\") pod \"redhat-operators-s5nbn\" (UID: \"fb266fde-0b2c-4866-8274-9ed2d4821c14\") " pod="openshift-marketplace/redhat-operators-s5nbn" Nov 27 16:06:15 crc kubenswrapper[4707]: I1127 16:06:15.597275 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fb266fde-0b2c-4866-8274-9ed2d4821c14-utilities\") pod \"redhat-operators-s5nbn\" (UID: \"fb266fde-0b2c-4866-8274-9ed2d4821c14\") " pod="openshift-marketplace/redhat-operators-s5nbn" Nov 27 16:06:15 crc kubenswrapper[4707]: E1127 
16:06:15.600239 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-27 16:06:16.100201444 +0000 UTC m=+151.731650202 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 27 16:06:15 crc kubenswrapper[4707]: I1127 16:06:15.703317 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fb266fde-0b2c-4866-8274-9ed2d4821c14-catalog-content\") pod \"redhat-operators-s5nbn\" (UID: \"fb266fde-0b2c-4866-8274-9ed2d4821c14\") " pod="openshift-marketplace/redhat-operators-s5nbn" Nov 27 16:06:15 crc kubenswrapper[4707]: I1127 16:06:15.706738 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kwvst\" (UID: \"b5e0fd66-2bb6-4b8a-aaf0-e24490a2f1a0\") " pod="openshift-image-registry/image-registry-697d97f7c8-kwvst" Nov 27 16:06:15 crc kubenswrapper[4707]: I1127 16:06:15.706849 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6lz5k\" (UniqueName: \"kubernetes.io/projected/fb266fde-0b2c-4866-8274-9ed2d4821c14-kube-api-access-6lz5k\") pod \"redhat-operators-s5nbn\" (UID: \"fb266fde-0b2c-4866-8274-9ed2d4821c14\") " 
pod="openshift-marketplace/redhat-operators-s5nbn" Nov 27 16:06:15 crc kubenswrapper[4707]: I1127 16:06:15.707091 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fb266fde-0b2c-4866-8274-9ed2d4821c14-utilities\") pod \"redhat-operators-s5nbn\" (UID: \"fb266fde-0b2c-4866-8274-9ed2d4821c14\") " pod="openshift-marketplace/redhat-operators-s5nbn" Nov 27 16:06:15 crc kubenswrapper[4707]: E1127 16:06:15.708982 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-27 16:06:16.208923732 +0000 UTC m=+151.840372500 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kwvst" (UID: "b5e0fd66-2bb6-4b8a-aaf0-e24490a2f1a0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 27 16:06:15 crc kubenswrapper[4707]: I1127 16:06:15.709646 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fb266fde-0b2c-4866-8274-9ed2d4821c14-catalog-content\") pod \"redhat-operators-s5nbn\" (UID: \"fb266fde-0b2c-4866-8274-9ed2d4821c14\") " pod="openshift-marketplace/redhat-operators-s5nbn" Nov 27 16:06:15 crc kubenswrapper[4707]: I1127 16:06:15.711238 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fb266fde-0b2c-4866-8274-9ed2d4821c14-utilities\") pod \"redhat-operators-s5nbn\" (UID: \"fb266fde-0b2c-4866-8274-9ed2d4821c14\") " 
pod="openshift-marketplace/redhat-operators-s5nbn" Nov 27 16:06:15 crc kubenswrapper[4707]: I1127 16:06:15.716957 4707 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock" Nov 27 16:06:15 crc kubenswrapper[4707]: I1127 16:06:15.732017 4707 patch_prober.go:28] interesting pod/router-default-5444994796-tggrz container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Nov 27 16:06:15 crc kubenswrapper[4707]: [-]has-synced failed: reason withheld Nov 27 16:06:15 crc kubenswrapper[4707]: [+]process-running ok Nov 27 16:06:15 crc kubenswrapper[4707]: healthz check failed Nov 27 16:06:15 crc kubenswrapper[4707]: I1127 16:06:15.732072 4707 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-tggrz" podUID="b37b9b2e-05d2-434f-bd01-93cda5a05b52" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Nov 27 16:06:15 crc kubenswrapper[4707]: I1127 16:06:15.732648 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6lz5k\" (UniqueName: \"kubernetes.io/projected/fb266fde-0b2c-4866-8274-9ed2d4821c14-kube-api-access-6lz5k\") pod \"redhat-operators-s5nbn\" (UID: \"fb266fde-0b2c-4866-8274-9ed2d4821c14\") " pod="openshift-marketplace/redhat-operators-s5nbn" Nov 27 16:06:15 crc kubenswrapper[4707]: I1127 16:06:15.808933 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 27 16:06:15 crc kubenswrapper[4707]: E1127 16:06:15.809274 4707 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-27 16:06:16.309256006 +0000 UTC m=+151.940704774 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 27 16:06:15 crc kubenswrapper[4707]: I1127 16:06:15.829464 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-xr65n"] Nov 27 16:06:15 crc kubenswrapper[4707]: I1127 16:06:15.830560 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-xr65n" Nov 27 16:06:15 crc kubenswrapper[4707]: I1127 16:06:15.846722 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-vfv9n"] Nov 27 16:06:15 crc kubenswrapper[4707]: I1127 16:06:15.860550 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-xr65n"] Nov 27 16:06:15 crc kubenswrapper[4707]: I1127 16:06:15.866166 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-xhshk"] Nov 27 16:06:15 crc kubenswrapper[4707]: W1127 16:06:15.878672 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podea8aba87_c190_4442_bd3f_5d2d9e2f38e4.slice/crio-e516ea2d56d317b04a415c88d372b50864ee3bf7dd5283a317ad930d4465f053 WatchSource:0}: Error finding container e516ea2d56d317b04a415c88d372b50864ee3bf7dd5283a317ad930d4465f053: Status 404 returned error can't find the container with id e516ea2d56d317b04a415c88d372b50864ee3bf7dd5283a317ad930d4465f053 Nov 27 16:06:15 crc kubenswrapper[4707]: I1127 16:06:15.887113 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-s5nbn" Nov 27 16:06:15 crc kubenswrapper[4707]: I1127 16:06:15.911165 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kwvst\" (UID: \"b5e0fd66-2bb6-4b8a-aaf0-e24490a2f1a0\") " pod="openshift-image-registry/image-registry-697d97f7c8-kwvst" Nov 27 16:06:15 crc kubenswrapper[4707]: E1127 16:06:15.911530 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-27 16:06:16.411509609 +0000 UTC m=+152.042958377 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kwvst" (UID: "b5e0fd66-2bb6-4b8a-aaf0-e24490a2f1a0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 27 16:06:16 crc kubenswrapper[4707]: I1127 16:06:16.012763 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 27 16:06:16 crc kubenswrapper[4707]: I1127 16:06:16.012971 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/22f274d1-48a8-4b2e-a80e-4163a9fa069e-catalog-content\") pod \"redhat-operators-xr65n\" (UID: \"22f274d1-48a8-4b2e-a80e-4163a9fa069e\") " pod="openshift-marketplace/redhat-operators-xr65n" Nov 27 16:06:16 crc kubenswrapper[4707]: I1127 16:06:16.013005 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/22f274d1-48a8-4b2e-a80e-4163a9fa069e-utilities\") pod \"redhat-operators-xr65n\" (UID: \"22f274d1-48a8-4b2e-a80e-4163a9fa069e\") " pod="openshift-marketplace/redhat-operators-xr65n" Nov 27 16:06:16 crc kubenswrapper[4707]: I1127 16:06:16.013030 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8wgst\" (UniqueName: \"kubernetes.io/projected/22f274d1-48a8-4b2e-a80e-4163a9fa069e-kube-api-access-8wgst\") pod \"redhat-operators-xr65n\" (UID: \"22f274d1-48a8-4b2e-a80e-4163a9fa069e\") " pod="openshift-marketplace/redhat-operators-xr65n" Nov 27 16:06:16 crc kubenswrapper[4707]: E1127 16:06:16.013464 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-27 16:06:16.513201458 +0000 UTC m=+152.144650226 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 27 16:06:16 crc kubenswrapper[4707]: I1127 16:06:16.113972 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/22f274d1-48a8-4b2e-a80e-4163a9fa069e-catalog-content\") pod \"redhat-operators-xr65n\" (UID: \"22f274d1-48a8-4b2e-a80e-4163a9fa069e\") " pod="openshift-marketplace/redhat-operators-xr65n" Nov 27 16:06:16 crc kubenswrapper[4707]: I1127 16:06:16.114019 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/22f274d1-48a8-4b2e-a80e-4163a9fa069e-utilities\") pod \"redhat-operators-xr65n\" (UID: \"22f274d1-48a8-4b2e-a80e-4163a9fa069e\") " pod="openshift-marketplace/redhat-operators-xr65n" Nov 27 16:06:16 crc kubenswrapper[4707]: I1127 16:06:16.114040 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8wgst\" (UniqueName: \"kubernetes.io/projected/22f274d1-48a8-4b2e-a80e-4163a9fa069e-kube-api-access-8wgst\") pod \"redhat-operators-xr65n\" (UID: \"22f274d1-48a8-4b2e-a80e-4163a9fa069e\") " pod="openshift-marketplace/redhat-operators-xr65n" Nov 27 16:06:16 crc kubenswrapper[4707]: I1127 16:06:16.114097 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kwvst\" (UID: 
\"b5e0fd66-2bb6-4b8a-aaf0-e24490a2f1a0\") " pod="openshift-image-registry/image-registry-697d97f7c8-kwvst" Nov 27 16:06:16 crc kubenswrapper[4707]: E1127 16:06:16.114421 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-27 16:06:16.614406974 +0000 UTC m=+152.245855742 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kwvst" (UID: "b5e0fd66-2bb6-4b8a-aaf0-e24490a2f1a0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 27 16:06:16 crc kubenswrapper[4707]: I1127 16:06:16.115737 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/22f274d1-48a8-4b2e-a80e-4163a9fa069e-utilities\") pod \"redhat-operators-xr65n\" (UID: \"22f274d1-48a8-4b2e-a80e-4163a9fa069e\") " pod="openshift-marketplace/redhat-operators-xr65n" Nov 27 16:06:16 crc kubenswrapper[4707]: I1127 16:06:16.116515 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/22f274d1-48a8-4b2e-a80e-4163a9fa069e-catalog-content\") pod \"redhat-operators-xr65n\" (UID: \"22f274d1-48a8-4b2e-a80e-4163a9fa069e\") " pod="openshift-marketplace/redhat-operators-xr65n" Nov 27 16:06:16 crc kubenswrapper[4707]: I1127 16:06:16.140794 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8wgst\" (UniqueName: \"kubernetes.io/projected/22f274d1-48a8-4b2e-a80e-4163a9fa069e-kube-api-access-8wgst\") pod \"redhat-operators-xr65n\" (UID: 
\"22f274d1-48a8-4b2e-a80e-4163a9fa069e\") " pod="openshift-marketplace/redhat-operators-xr65n" Nov 27 16:06:16 crc kubenswrapper[4707]: I1127 16:06:16.156939 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-s5nbn"] Nov 27 16:06:16 crc kubenswrapper[4707]: I1127 16:06:16.162878 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-xr65n" Nov 27 16:06:16 crc kubenswrapper[4707]: W1127 16:06:16.189286 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfb266fde_0b2c_4866_8274_9ed2d4821c14.slice/crio-cc1ed1694a4d759da6aff6e636ec0123a60ecae0aef537d69b79140010f5067e WatchSource:0}: Error finding container cc1ed1694a4d759da6aff6e636ec0123a60ecae0aef537d69b79140010f5067e: Status 404 returned error can't find the container with id cc1ed1694a4d759da6aff6e636ec0123a60ecae0aef537d69b79140010f5067e Nov 27 16:06:16 crc kubenswrapper[4707]: I1127 16:06:16.214936 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 27 16:06:16 crc kubenswrapper[4707]: E1127 16:06:16.215433 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-27 16:06:16.715398535 +0000 UTC m=+152.346847303 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 27 16:06:16 crc kubenswrapper[4707]: I1127 16:06:16.215595 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kwvst\" (UID: \"b5e0fd66-2bb6-4b8a-aaf0-e24490a2f1a0\") " pod="openshift-image-registry/image-registry-697d97f7c8-kwvst" Nov 27 16:06:16 crc kubenswrapper[4707]: E1127 16:06:16.215997 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-27 16:06:16.71598219 +0000 UTC m=+152.347430958 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kwvst" (UID: "b5e0fd66-2bb6-4b8a-aaf0-e24490a2f1a0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 27 16:06:16 crc kubenswrapper[4707]: I1127 16:06:16.317235 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 27 16:06:16 crc kubenswrapper[4707]: E1127 16:06:16.317794 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-27 16:06:16.817775531 +0000 UTC m=+152.449224299 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 27 16:06:16 crc kubenswrapper[4707]: I1127 16:06:16.395078 4707 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock","Timestamp":"2025-11-27T16:06:15.716979278Z","Handler":null,"Name":""} Nov 27 16:06:16 crc kubenswrapper[4707]: I1127 16:06:16.404581 4707 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: kubevirt.io.hostpath-provisioner endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock versions: 1.0.0 Nov 27 16:06:16 crc kubenswrapper[4707]: I1127 16:06:16.404642 4707 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: kubevirt.io.hostpath-provisioner at endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock Nov 27 16:06:16 crc kubenswrapper[4707]: I1127 16:06:16.418923 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kwvst\" (UID: \"b5e0fd66-2bb6-4b8a-aaf0-e24490a2f1a0\") " pod="openshift-image-registry/image-registry-697d97f7c8-kwvst" Nov 27 16:06:16 crc kubenswrapper[4707]: I1127 16:06:16.426149 4707 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Nov 27 16:06:16 crc kubenswrapper[4707]: I1127 16:06:16.426195 4707 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kwvst\" (UID: \"b5e0fd66-2bb6-4b8a-aaf0-e24490a2f1a0\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount\"" pod="openshift-image-registry/image-registry-697d97f7c8-kwvst" Nov 27 16:06:16 crc kubenswrapper[4707]: I1127 16:06:16.451760 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-xr65n"] Nov 27 16:06:16 crc kubenswrapper[4707]: I1127 16:06:16.475438 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kwvst\" (UID: \"b5e0fd66-2bb6-4b8a-aaf0-e24490a2f1a0\") " pod="openshift-image-registry/image-registry-697d97f7c8-kwvst" Nov 27 16:06:16 crc kubenswrapper[4707]: E1127 16:06:16.515706 4707 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfb266fde_0b2c_4866_8274_9ed2d4821c14.slice/crio-conmon-5ee77d0c1f89c66e0ab40d12c85589400b9c8752e161144d60f3c9641625f8de.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfb266fde_0b2c_4866_8274_9ed2d4821c14.slice/crio-5ee77d0c1f89c66e0ab40d12c85589400b9c8752e161144d60f3c9641625f8de.scope\": RecentStats: unable to find data in memory cache]" Nov 27 16:06:16 crc kubenswrapper[4707]: I1127 16:06:16.520348 4707 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 27 16:06:16 crc kubenswrapper[4707]: I1127 16:06:16.525695 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Nov 27 16:06:16 crc kubenswrapper[4707]: I1127 16:06:16.568659 4707 generic.go:334] "Generic (PLEG): container finished" podID="10e9e536-d557-44af-ae4d-7472fc20ee37" containerID="e93cbfe868f5e4777bd0769b6e1dca827960ea9507d9c7025c22f72fb0392f26" exitCode=0 Nov 27 16:06:16 crc kubenswrapper[4707]: I1127 16:06:16.568736 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-57fvk" event={"ID":"10e9e536-d557-44af-ae4d-7472fc20ee37","Type":"ContainerDied","Data":"e93cbfe868f5e4777bd0769b6e1dca827960ea9507d9c7025c22f72fb0392f26"} Nov 27 16:06:16 crc kubenswrapper[4707]: I1127 16:06:16.573466 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"4599ff2b-0390-49da-a99e-f003afa58144","Type":"ContainerStarted","Data":"59ca17cf8112738dcb47218350c82d4a72440be39093be2b9f4e9e894fde74fb"} Nov 27 16:06:16 crc kubenswrapper[4707]: I1127 16:06:16.573496 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" 
event={"ID":"4599ff2b-0390-49da-a99e-f003afa58144","Type":"ContainerStarted","Data":"42e6309af34514ee9660ef4c0918dc9e37a8d2e69d84cd7e93de747c480adc39"} Nov 27 16:06:16 crc kubenswrapper[4707]: I1127 16:06:16.576478 4707 generic.go:334] "Generic (PLEG): container finished" podID="fb266fde-0b2c-4866-8274-9ed2d4821c14" containerID="5ee77d0c1f89c66e0ab40d12c85589400b9c8752e161144d60f3c9641625f8de" exitCode=0 Nov 27 16:06:16 crc kubenswrapper[4707]: I1127 16:06:16.576526 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-s5nbn" event={"ID":"fb266fde-0b2c-4866-8274-9ed2d4821c14","Type":"ContainerDied","Data":"5ee77d0c1f89c66e0ab40d12c85589400b9c8752e161144d60f3c9641625f8de"} Nov 27 16:06:16 crc kubenswrapper[4707]: I1127 16:06:16.576701 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-s5nbn" event={"ID":"fb266fde-0b2c-4866-8274-9ed2d4821c14","Type":"ContainerStarted","Data":"cc1ed1694a4d759da6aff6e636ec0123a60ecae0aef537d69b79140010f5067e"} Nov 27 16:06:16 crc kubenswrapper[4707]: I1127 16:06:16.579810 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"ca6bbe343bbc9484af28920cdcd1ce0c19facc8ad74b3eba628d576338d5b9cc"} Nov 27 16:06:16 crc kubenswrapper[4707]: I1127 16:06:16.582009 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"805b2c507da2ce5191ca5f2bc06790c79910db55da8b2aa2048aad536afbc834"} Nov 27 16:06:16 crc kubenswrapper[4707]: I1127 16:06:16.582519 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 27 16:06:16 crc kubenswrapper[4707]: I1127 16:06:16.589063 4707 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xr65n" event={"ID":"22f274d1-48a8-4b2e-a80e-4163a9fa069e","Type":"ContainerStarted","Data":"9d251cb0b8a97801b6ac3f22df4299e8beef2b6e5d29a13693c813f43fa771e0"} Nov 27 16:06:16 crc kubenswrapper[4707]: I1127 16:06:16.604133 4707 generic.go:334] "Generic (PLEG): container finished" podID="2b313184-19c4-42e4-b488-63f8e894feea" containerID="5e67c9bfeda69e9fc00ddd75f3965a7d4d42d4bc2465a853e186a115aa32d252" exitCode=0 Nov 27 16:06:16 crc kubenswrapper[4707]: I1127 16:06:16.604607 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xhshk" event={"ID":"2b313184-19c4-42e4-b488-63f8e894feea","Type":"ContainerDied","Data":"5e67c9bfeda69e9fc00ddd75f3965a7d4d42d4bc2465a853e186a115aa32d252"} Nov 27 16:06:16 crc kubenswrapper[4707]: I1127 16:06:16.604780 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xhshk" event={"ID":"2b313184-19c4-42e4-b488-63f8e894feea","Type":"ContainerStarted","Data":"fd1fa894e27a86ea400eeb7d951d7813dc087f76875acbcc9ce70c5d58d2a1b5"} Nov 27 16:06:16 crc kubenswrapper[4707]: I1127 16:06:16.621130 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"43ea0a2a97ebfb4aa3a231afb8a48970e2159956305fb00b8c2a2d7c1578d4e4"} Nov 27 16:06:16 crc kubenswrapper[4707]: I1127 16:06:16.623211 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/revision-pruner-9-crc" podStartSLOduration=2.623189816 podStartE2EDuration="2.623189816s" podCreationTimestamp="2025-11-27 16:06:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-27 16:06:16.613757735 +0000 UTC m=+152.245206503" 
watchObservedRunningTime="2025-11-27 16:06:16.623189816 +0000 UTC m=+152.254638574" Nov 27 16:06:16 crc kubenswrapper[4707]: I1127 16:06:16.623510 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vfv9n" event={"ID":"ea8aba87-c190-4442-bd3f-5d2d9e2f38e4","Type":"ContainerDied","Data":"933603ddecef30d3a2ab56586105b0709a10841713808e143fb233808a7b271a"} Nov 27 16:06:16 crc kubenswrapper[4707]: I1127 16:06:16.624356 4707 generic.go:334] "Generic (PLEG): container finished" podID="ea8aba87-c190-4442-bd3f-5d2d9e2f38e4" containerID="933603ddecef30d3a2ab56586105b0709a10841713808e143fb233808a7b271a" exitCode=0 Nov 27 16:06:16 crc kubenswrapper[4707]: I1127 16:06:16.624681 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vfv9n" event={"ID":"ea8aba87-c190-4442-bd3f-5d2d9e2f38e4","Type":"ContainerStarted","Data":"e516ea2d56d317b04a415c88d372b50864ee3bf7dd5283a317ad930d4465f053"} Nov 27 16:06:16 crc kubenswrapper[4707]: I1127 16:06:16.638088 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-slj9l" event={"ID":"24bf08cf-63a0-47ca-be2a-0f38a51109c9","Type":"ContainerStarted","Data":"b737f157adba044f2528b841202dd40f8b82c016087e03545c101d77c772a08d"} Nov 27 16:06:16 crc kubenswrapper[4707]: I1127 16:06:16.638123 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-slj9l" event={"ID":"24bf08cf-63a0-47ca-be2a-0f38a51109c9","Type":"ContainerStarted","Data":"a01a0b0ca4dfb606f849022e9249ff97727501534cb1e8ac6d003a9ee77b0786"} Nov 27 16:06:16 crc kubenswrapper[4707]: I1127 16:06:16.718239 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="hostpath-provisioner/csi-hostpathplugin-slj9l" podStartSLOduration=12.718222224 podStartE2EDuration="12.718222224s" podCreationTimestamp="2025-11-27 16:06:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-27 16:06:16.71415443 +0000 UTC m=+152.345603198" watchObservedRunningTime="2025-11-27 16:06:16.718222224 +0000 UTC m=+152.349670992" Nov 27 16:06:16 crc kubenswrapper[4707]: I1127 16:06:16.719300 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-kwvst" Nov 27 16:06:16 crc kubenswrapper[4707]: I1127 16:06:16.728094 4707 patch_prober.go:28] interesting pod/router-default-5444994796-tggrz container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Nov 27 16:06:16 crc kubenswrapper[4707]: [-]has-synced failed: reason withheld Nov 27 16:06:16 crc kubenswrapper[4707]: [+]process-running ok Nov 27 16:06:16 crc kubenswrapper[4707]: healthz check failed Nov 27 16:06:16 crc kubenswrapper[4707]: I1127 16:06:16.728143 4707 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-tggrz" podUID="b37b9b2e-05d2-434f-bd01-93cda5a05b52" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Nov 27 16:06:16 crc kubenswrapper[4707]: I1127 16:06:16.969017 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29404320-sxj55" Nov 27 16:06:17 crc kubenswrapper[4707]: I1127 16:06:17.057423 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-kwvst"] Nov 27 16:06:17 crc kubenswrapper[4707]: W1127 16:06:17.075511 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb5e0fd66_2bb6_4b8a_aaf0_e24490a2f1a0.slice/crio-1189d4d97de40294d3fba3e19f4890b28617a7ead30a5f7bfe5052d0ddb9c0d5 WatchSource:0}: Error finding container 1189d4d97de40294d3fba3e19f4890b28617a7ead30a5f7bfe5052d0ddb9c0d5: Status 404 returned error can't find the container with id 1189d4d97de40294d3fba3e19f4890b28617a7ead30a5f7bfe5052d0ddb9c0d5 Nov 27 16:06:17 crc kubenswrapper[4707]: I1127 16:06:17.132754 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f55dc6de-bb5d-4221-a670-4b65c3992031-secret-volume\") pod \"f55dc6de-bb5d-4221-a670-4b65c3992031\" (UID: \"f55dc6de-bb5d-4221-a670-4b65c3992031\") " Nov 27 16:06:17 crc kubenswrapper[4707]: I1127 16:06:17.132829 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f4z89\" (UniqueName: \"kubernetes.io/projected/f55dc6de-bb5d-4221-a670-4b65c3992031-kube-api-access-f4z89\") pod \"f55dc6de-bb5d-4221-a670-4b65c3992031\" (UID: \"f55dc6de-bb5d-4221-a670-4b65c3992031\") " Nov 27 16:06:17 crc kubenswrapper[4707]: I1127 16:06:17.132942 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f55dc6de-bb5d-4221-a670-4b65c3992031-config-volume\") pod \"f55dc6de-bb5d-4221-a670-4b65c3992031\" (UID: \"f55dc6de-bb5d-4221-a670-4b65c3992031\") " Nov 27 16:06:17 crc kubenswrapper[4707]: I1127 16:06:17.133732 4707 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f55dc6de-bb5d-4221-a670-4b65c3992031-config-volume" (OuterVolumeSpecName: "config-volume") pod "f55dc6de-bb5d-4221-a670-4b65c3992031" (UID: "f55dc6de-bb5d-4221-a670-4b65c3992031"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 16:06:17 crc kubenswrapper[4707]: I1127 16:06:17.145015 4707 patch_prober.go:28] interesting pod/downloads-7954f5f757-zdxsk container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.7:8080/\": dial tcp 10.217.0.7:8080: connect: connection refused" start-of-body= Nov 27 16:06:17 crc kubenswrapper[4707]: I1127 16:06:17.145422 4707 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-zdxsk" podUID="50e5ce2e-3776-4890-8b73-b0ae2b2d0237" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.7:8080/\": dial tcp 10.217.0.7:8080: connect: connection refused" Nov 27 16:06:17 crc kubenswrapper[4707]: I1127 16:06:17.145276 4707 patch_prober.go:28] interesting pod/downloads-7954f5f757-zdxsk container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.7:8080/\": dial tcp 10.217.0.7:8080: connect: connection refused" start-of-body= Nov 27 16:06:17 crc kubenswrapper[4707]: I1127 16:06:17.145506 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f55dc6de-bb5d-4221-a670-4b65c3992031-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "f55dc6de-bb5d-4221-a670-4b65c3992031" (UID: "f55dc6de-bb5d-4221-a670-4b65c3992031"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 16:06:17 crc kubenswrapper[4707]: I1127 16:06:17.145556 4707 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-zdxsk" podUID="50e5ce2e-3776-4890-8b73-b0ae2b2d0237" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.7:8080/\": dial tcp 10.217.0.7:8080: connect: connection refused" Nov 27 16:06:17 crc kubenswrapper[4707]: I1127 16:06:17.146128 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f55dc6de-bb5d-4221-a670-4b65c3992031-kube-api-access-f4z89" (OuterVolumeSpecName: "kube-api-access-f4z89") pod "f55dc6de-bb5d-4221-a670-4b65c3992031" (UID: "f55dc6de-bb5d-4221-a670-4b65c3992031"). InnerVolumeSpecName "kube-api-access-f4z89". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 16:06:17 crc kubenswrapper[4707]: I1127 16:06:17.210204 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f668bae-612b-4b75-9490-919e737c6a3b" path="/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes" Nov 27 16:06:17 crc kubenswrapper[4707]: I1127 16:06:17.235306 4707 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f55dc6de-bb5d-4221-a670-4b65c3992031-secret-volume\") on node \"crc\" DevicePath \"\"" Nov 27 16:06:17 crc kubenswrapper[4707]: I1127 16:06:17.235354 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f4z89\" (UniqueName: \"kubernetes.io/projected/f55dc6de-bb5d-4221-a670-4b65c3992031-kube-api-access-f4z89\") on node \"crc\" DevicePath \"\"" Nov 27 16:06:17 crc kubenswrapper[4707]: I1127 16:06:17.235385 4707 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f55dc6de-bb5d-4221-a670-4b65c3992031-config-volume\") on node \"crc\" DevicePath \"\"" Nov 27 16:06:17 crc kubenswrapper[4707]: I1127 
16:06:17.512338 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-76f77b778f-khskd" Nov 27 16:06:17 crc kubenswrapper[4707]: I1127 16:06:17.513637 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-76f77b778f-khskd" Nov 27 16:06:17 crc kubenswrapper[4707]: I1127 16:06:17.530680 4707 patch_prober.go:28] interesting pod/apiserver-76f77b778f-khskd container/openshift-apiserver namespace/openshift-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Nov 27 16:06:17 crc kubenswrapper[4707]: [+]log ok Nov 27 16:06:17 crc kubenswrapper[4707]: [+]etcd ok Nov 27 16:06:17 crc kubenswrapper[4707]: [+]poststarthook/start-apiserver-admission-initializer ok Nov 27 16:06:17 crc kubenswrapper[4707]: [+]poststarthook/generic-apiserver-start-informers ok Nov 27 16:06:17 crc kubenswrapper[4707]: [+]poststarthook/max-in-flight-filter ok Nov 27 16:06:17 crc kubenswrapper[4707]: [+]poststarthook/storage-object-count-tracker-hook ok Nov 27 16:06:17 crc kubenswrapper[4707]: [+]poststarthook/image.openshift.io-apiserver-caches ok Nov 27 16:06:17 crc kubenswrapper[4707]: [-]poststarthook/authorization.openshift.io-bootstrapclusterroles failed: reason withheld Nov 27 16:06:17 crc kubenswrapper[4707]: [+]poststarthook/authorization.openshift.io-ensurenodebootstrap-sa ok Nov 27 16:06:17 crc kubenswrapper[4707]: [+]poststarthook/project.openshift.io-projectcache ok Nov 27 16:06:17 crc kubenswrapper[4707]: [+]poststarthook/project.openshift.io-projectauthorizationcache ok Nov 27 16:06:17 crc kubenswrapper[4707]: [+]poststarthook/openshift.io-startinformers ok Nov 27 16:06:17 crc kubenswrapper[4707]: [+]poststarthook/openshift.io-restmapperupdater ok Nov 27 16:06:17 crc kubenswrapper[4707]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok Nov 27 16:06:17 crc kubenswrapper[4707]: livez check failed Nov 27 16:06:17 
crc kubenswrapper[4707]: I1127 16:06:17.530813 4707 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-apiserver/apiserver-76f77b778f-khskd" podUID="9744e361-2c35-436c-a453-984cff9d923f" containerName="openshift-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Nov 27 16:06:17 crc kubenswrapper[4707]: I1127 16:06:17.651789 4707 generic.go:334] "Generic (PLEG): container finished" podID="4599ff2b-0390-49da-a99e-f003afa58144" containerID="59ca17cf8112738dcb47218350c82d4a72440be39093be2b9f4e9e894fde74fb" exitCode=0 Nov 27 16:06:17 crc kubenswrapper[4707]: I1127 16:06:17.651950 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"4599ff2b-0390-49da-a99e-f003afa58144","Type":"ContainerDied","Data":"59ca17cf8112738dcb47218350c82d4a72440be39093be2b9f4e9e894fde74fb"} Nov 27 16:06:17 crc kubenswrapper[4707]: I1127 16:06:17.659492 4707 generic.go:334] "Generic (PLEG): container finished" podID="22f274d1-48a8-4b2e-a80e-4163a9fa069e" containerID="d6d3f96814f614e7cb456adc4d7c795305cd729a46243ea889046772bb72d413" exitCode=0 Nov 27 16:06:17 crc kubenswrapper[4707]: I1127 16:06:17.659628 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xr65n" event={"ID":"22f274d1-48a8-4b2e-a80e-4163a9fa069e","Type":"ContainerDied","Data":"d6d3f96814f614e7cb456adc4d7c795305cd729a46243ea889046772bb72d413"} Nov 27 16:06:17 crc kubenswrapper[4707]: I1127 16:06:17.662800 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-kwvst" event={"ID":"b5e0fd66-2bb6-4b8a-aaf0-e24490a2f1a0","Type":"ContainerStarted","Data":"a95f24395e8e931982ca3f8a27913f86ea821531d010ca50f1f490bb01511a50"} Nov 27 16:06:17 crc kubenswrapper[4707]: I1127 16:06:17.662830 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-kwvst" 
event={"ID":"b5e0fd66-2bb6-4b8a-aaf0-e24490a2f1a0","Type":"ContainerStarted","Data":"1189d4d97de40294d3fba3e19f4890b28617a7ead30a5f7bfe5052d0ddb9c0d5"} Nov 27 16:06:17 crc kubenswrapper[4707]: I1127 16:06:17.663296 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-697d97f7c8-kwvst" Nov 27 16:06:17 crc kubenswrapper[4707]: I1127 16:06:17.688166 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29404320-sxj55" event={"ID":"f55dc6de-bb5d-4221-a670-4b65c3992031","Type":"ContainerDied","Data":"1fd6d1666a2a28e854197a69e8ff0663d5e84dfc1816406495ad454e11a77296"} Nov 27 16:06:17 crc kubenswrapper[4707]: I1127 16:06:17.688263 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1fd6d1666a2a28e854197a69e8ff0663d5e84dfc1816406495ad454e11a77296" Nov 27 16:06:17 crc kubenswrapper[4707]: I1127 16:06:17.693278 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-697d97f7c8-kwvst" podStartSLOduration=132.693258121 podStartE2EDuration="2m12.693258121s" podCreationTimestamp="2025-11-27 16:04:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-27 16:06:17.689379662 +0000 UTC m=+153.320828430" watchObservedRunningTime="2025-11-27 16:06:17.693258121 +0000 UTC m=+153.324706889" Nov 27 16:06:17 crc kubenswrapper[4707]: I1127 16:06:17.688328 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29404320-sxj55" Nov 27 16:06:17 crc kubenswrapper[4707]: I1127 16:06:17.726162 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-5444994796-tggrz" Nov 27 16:06:17 crc kubenswrapper[4707]: I1127 16:06:17.730336 4707 patch_prober.go:28] interesting pod/router-default-5444994796-tggrz container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Nov 27 16:06:17 crc kubenswrapper[4707]: [-]has-synced failed: reason withheld Nov 27 16:06:17 crc kubenswrapper[4707]: [+]process-running ok Nov 27 16:06:17 crc kubenswrapper[4707]: healthz check failed Nov 27 16:06:17 crc kubenswrapper[4707]: I1127 16:06:17.730404 4707 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-tggrz" podUID="b37b9b2e-05d2-434f-bd01-93cda5a05b52" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Nov 27 16:06:17 crc kubenswrapper[4707]: I1127 16:06:17.840219 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-f9d7485db-5r8tf" Nov 27 16:06:17 crc kubenswrapper[4707]: I1127 16:06:17.840854 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-f9d7485db-5r8tf" Nov 27 16:06:17 crc kubenswrapper[4707]: I1127 16:06:17.842809 4707 patch_prober.go:28] interesting pod/console-f9d7485db-5r8tf container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.34:8443/health\": dial tcp 10.217.0.34:8443: connect: connection refused" start-of-body= Nov 27 16:06:17 crc kubenswrapper[4707]: I1127 16:06:17.842876 4707 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-5r8tf" 
podUID="36bf60c9-93cb-431f-9df1-1d3e245c49ef" containerName="console" probeResult="failure" output="Get \"https://10.217.0.34:8443/health\": dial tcp 10.217.0.34:8443: connect: connection refused" Nov 27 16:06:18 crc kubenswrapper[4707]: I1127 16:06:18.161481 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-jtqhs" Nov 27 16:06:18 crc kubenswrapper[4707]: I1127 16:06:18.203856 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-h4m82" Nov 27 16:06:18 crc kubenswrapper[4707]: I1127 16:06:18.729581 4707 patch_prober.go:28] interesting pod/router-default-5444994796-tggrz container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Nov 27 16:06:18 crc kubenswrapper[4707]: [-]has-synced failed: reason withheld Nov 27 16:06:18 crc kubenswrapper[4707]: [+]process-running ok Nov 27 16:06:18 crc kubenswrapper[4707]: healthz check failed Nov 27 16:06:18 crc kubenswrapper[4707]: I1127 16:06:18.729672 4707 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-tggrz" podUID="b37b9b2e-05d2-434f-bd01-93cda5a05b52" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Nov 27 16:06:19 crc kubenswrapper[4707]: I1127 16:06:19.157528 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Nov 27 16:06:19 crc kubenswrapper[4707]: I1127 16:06:19.324072 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4599ff2b-0390-49da-a99e-f003afa58144-kube-api-access\") pod \"4599ff2b-0390-49da-a99e-f003afa58144\" (UID: \"4599ff2b-0390-49da-a99e-f003afa58144\") " Nov 27 16:06:19 crc kubenswrapper[4707]: I1127 16:06:19.324224 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/4599ff2b-0390-49da-a99e-f003afa58144-kubelet-dir\") pod \"4599ff2b-0390-49da-a99e-f003afa58144\" (UID: \"4599ff2b-0390-49da-a99e-f003afa58144\") " Nov 27 16:06:19 crc kubenswrapper[4707]: I1127 16:06:19.324722 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4599ff2b-0390-49da-a99e-f003afa58144-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "4599ff2b-0390-49da-a99e-f003afa58144" (UID: "4599ff2b-0390-49da-a99e-f003afa58144"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 27 16:06:19 crc kubenswrapper[4707]: I1127 16:06:19.332480 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4599ff2b-0390-49da-a99e-f003afa58144-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "4599ff2b-0390-49da-a99e-f003afa58144" (UID: "4599ff2b-0390-49da-a99e-f003afa58144"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 16:06:19 crc kubenswrapper[4707]: I1127 16:06:19.427430 4707 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/4599ff2b-0390-49da-a99e-f003afa58144-kubelet-dir\") on node \"crc\" DevicePath \"\"" Nov 27 16:06:19 crc kubenswrapper[4707]: I1127 16:06:19.427466 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4599ff2b-0390-49da-a99e-f003afa58144-kube-api-access\") on node \"crc\" DevicePath \"\"" Nov 27 16:06:19 crc kubenswrapper[4707]: I1127 16:06:19.729984 4707 patch_prober.go:28] interesting pod/router-default-5444994796-tggrz container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Nov 27 16:06:19 crc kubenswrapper[4707]: [-]has-synced failed: reason withheld Nov 27 16:06:19 crc kubenswrapper[4707]: [+]process-running ok Nov 27 16:06:19 crc kubenswrapper[4707]: healthz check failed Nov 27 16:06:19 crc kubenswrapper[4707]: I1127 16:06:19.730595 4707 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-tggrz" podUID="b37b9b2e-05d2-434f-bd01-93cda5a05b52" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Nov 27 16:06:19 crc kubenswrapper[4707]: I1127 16:06:19.764118 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"4599ff2b-0390-49da-a99e-f003afa58144","Type":"ContainerDied","Data":"42e6309af34514ee9660ef4c0918dc9e37a8d2e69d84cd7e93de747c480adc39"} Nov 27 16:06:19 crc kubenswrapper[4707]: I1127 16:06:19.764177 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Nov 27 16:06:19 crc kubenswrapper[4707]: I1127 16:06:19.764191 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="42e6309af34514ee9660ef4c0918dc9e37a8d2e69d84cd7e93de747c480adc39" Nov 27 16:06:20 crc kubenswrapper[4707]: I1127 16:06:20.251658 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-zg2ss" Nov 27 16:06:20 crc kubenswrapper[4707]: I1127 16:06:20.728619 4707 patch_prober.go:28] interesting pod/router-default-5444994796-tggrz container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Nov 27 16:06:20 crc kubenswrapper[4707]: [-]has-synced failed: reason withheld Nov 27 16:06:20 crc kubenswrapper[4707]: [+]process-running ok Nov 27 16:06:20 crc kubenswrapper[4707]: healthz check failed Nov 27 16:06:20 crc kubenswrapper[4707]: I1127 16:06:20.728740 4707 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-tggrz" podUID="b37b9b2e-05d2-434f-bd01-93cda5a05b52" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Nov 27 16:06:20 crc kubenswrapper[4707]: I1127 16:06:20.889944 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Nov 27 16:06:20 crc kubenswrapper[4707]: E1127 16:06:20.891725 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4599ff2b-0390-49da-a99e-f003afa58144" containerName="pruner" Nov 27 16:06:20 crc kubenswrapper[4707]: I1127 16:06:20.891768 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="4599ff2b-0390-49da-a99e-f003afa58144" containerName="pruner" Nov 27 16:06:20 crc kubenswrapper[4707]: E1127 16:06:20.891793 4707 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="f55dc6de-bb5d-4221-a670-4b65c3992031" containerName="collect-profiles" Nov 27 16:06:20 crc kubenswrapper[4707]: I1127 16:06:20.891801 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="f55dc6de-bb5d-4221-a670-4b65c3992031" containerName="collect-profiles" Nov 27 16:06:20 crc kubenswrapper[4707]: I1127 16:06:20.892126 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="4599ff2b-0390-49da-a99e-f003afa58144" containerName="pruner" Nov 27 16:06:20 crc kubenswrapper[4707]: I1127 16:06:20.892143 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="f55dc6de-bb5d-4221-a670-4b65c3992031" containerName="collect-profiles" Nov 27 16:06:20 crc kubenswrapper[4707]: I1127 16:06:20.892510 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Nov 27 16:06:20 crc kubenswrapper[4707]: I1127 16:06:20.898166 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Nov 27 16:06:20 crc kubenswrapper[4707]: I1127 16:06:20.898837 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Nov 27 16:06:20 crc kubenswrapper[4707]: I1127 16:06:20.903803 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Nov 27 16:06:21 crc kubenswrapper[4707]: I1127 16:06:21.057309 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/78280efc-fc57-4b8f-bb04-011a561fdd78-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"78280efc-fc57-4b8f-bb04-011a561fdd78\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Nov 27 16:06:21 crc kubenswrapper[4707]: I1127 16:06:21.057503 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" 
(UniqueName: \"kubernetes.io/host-path/78280efc-fc57-4b8f-bb04-011a561fdd78-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"78280efc-fc57-4b8f-bb04-011a561fdd78\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Nov 27 16:06:21 crc kubenswrapper[4707]: I1127 16:06:21.159487 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/78280efc-fc57-4b8f-bb04-011a561fdd78-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"78280efc-fc57-4b8f-bb04-011a561fdd78\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Nov 27 16:06:21 crc kubenswrapper[4707]: I1127 16:06:21.159562 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/78280efc-fc57-4b8f-bb04-011a561fdd78-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"78280efc-fc57-4b8f-bb04-011a561fdd78\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Nov 27 16:06:21 crc kubenswrapper[4707]: I1127 16:06:21.159602 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/78280efc-fc57-4b8f-bb04-011a561fdd78-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"78280efc-fc57-4b8f-bb04-011a561fdd78\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Nov 27 16:06:21 crc kubenswrapper[4707]: I1127 16:06:21.214747 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/78280efc-fc57-4b8f-bb04-011a561fdd78-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"78280efc-fc57-4b8f-bb04-011a561fdd78\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Nov 27 16:06:21 crc kubenswrapper[4707]: I1127 16:06:21.234927 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Nov 27 16:06:21 crc kubenswrapper[4707]: I1127 16:06:21.728473 4707 patch_prober.go:28] interesting pod/router-default-5444994796-tggrz container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Nov 27 16:06:21 crc kubenswrapper[4707]: [-]has-synced failed: reason withheld Nov 27 16:06:21 crc kubenswrapper[4707]: [+]process-running ok Nov 27 16:06:21 crc kubenswrapper[4707]: healthz check failed Nov 27 16:06:21 crc kubenswrapper[4707]: I1127 16:06:21.728794 4707 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-tggrz" podUID="b37b9b2e-05d2-434f-bd01-93cda5a05b52" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Nov 27 16:06:21 crc kubenswrapper[4707]: I1127 16:06:21.838217 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Nov 27 16:06:22 crc kubenswrapper[4707]: I1127 16:06:22.522214 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-76f77b778f-khskd" Nov 27 16:06:22 crc kubenswrapper[4707]: I1127 16:06:22.529760 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-76f77b778f-khskd" Nov 27 16:06:22 crc kubenswrapper[4707]: I1127 16:06:22.735587 4707 patch_prober.go:28] interesting pod/router-default-5444994796-tggrz container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Nov 27 16:06:22 crc kubenswrapper[4707]: [-]has-synced failed: reason withheld Nov 27 16:06:22 crc kubenswrapper[4707]: [+]process-running ok Nov 27 16:06:22 crc kubenswrapper[4707]: healthz check failed Nov 27 16:06:22 crc 
kubenswrapper[4707]: I1127 16:06:22.735668 4707 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-tggrz" podUID="b37b9b2e-05d2-434f-bd01-93cda5a05b52" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Nov 27 16:06:23 crc kubenswrapper[4707]: I1127 16:06:23.731689 4707 patch_prober.go:28] interesting pod/router-default-5444994796-tggrz container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Nov 27 16:06:23 crc kubenswrapper[4707]: [-]has-synced failed: reason withheld
Nov 27 16:06:23 crc kubenswrapper[4707]: [+]process-running ok
Nov 27 16:06:23 crc kubenswrapper[4707]: healthz check failed
Nov 27 16:06:23 crc kubenswrapper[4707]: I1127 16:06:23.731769 4707 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-tggrz" podUID="b37b9b2e-05d2-434f-bd01-93cda5a05b52" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Nov 27 16:06:24 crc kubenswrapper[4707]: I1127 16:06:24.730517 4707 patch_prober.go:28] interesting pod/router-default-5444994796-tggrz container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Nov 27 16:06:24 crc kubenswrapper[4707]: [-]has-synced failed: reason withheld
Nov 27 16:06:24 crc kubenswrapper[4707]: [+]process-running ok
Nov 27 16:06:24 crc kubenswrapper[4707]: healthz check failed
Nov 27 16:06:24 crc kubenswrapper[4707]: I1127 16:06:24.730753 4707 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-tggrz" podUID="b37b9b2e-05d2-434f-bd01-93cda5a05b52" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Nov 27 16:06:25 crc kubenswrapper[4707]: I1127 16:06:25.727403 4707 patch_prober.go:28] interesting pod/router-default-5444994796-tggrz container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Nov 27 16:06:25 crc kubenswrapper[4707]: [-]has-synced failed: reason withheld
Nov 27 16:06:25 crc kubenswrapper[4707]: [+]process-running ok
Nov 27 16:06:25 crc kubenswrapper[4707]: healthz check failed
Nov 27 16:06:25 crc kubenswrapper[4707]: I1127 16:06:25.727456 4707 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-tggrz" podUID="b37b9b2e-05d2-434f-bd01-93cda5a05b52" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Nov 27 16:06:26 crc kubenswrapper[4707]: I1127 16:06:26.729535 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-5444994796-tggrz"
Nov 27 16:06:26 crc kubenswrapper[4707]: I1127 16:06:26.732510 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5444994796-tggrz"
Nov 27 16:06:27 crc kubenswrapper[4707]: I1127 16:06:27.157248 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-7954f5f757-zdxsk"
Nov 27 16:06:27 crc kubenswrapper[4707]: I1127 16:06:27.845642 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-f9d7485db-5r8tf"
Nov 27 16:06:27 crc kubenswrapper[4707]: I1127 16:06:27.849890 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-f9d7485db-5r8tf"
Nov 27 16:06:29 crc kubenswrapper[4707]: I1127 16:06:29.402503 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7d382481-3c1e-49ed-8e27-265d495aa776-metrics-certs\") pod \"network-metrics-daemon-qcl5k\" (UID: \"7d382481-3c1e-49ed-8e27-265d495aa776\") " pod="openshift-multus/network-metrics-daemon-qcl5k"
Nov 27 16:06:29 crc kubenswrapper[4707]: I1127 16:06:29.414566 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7d382481-3c1e-49ed-8e27-265d495aa776-metrics-certs\") pod \"network-metrics-daemon-qcl5k\" (UID: \"7d382481-3c1e-49ed-8e27-265d495aa776\") " pod="openshift-multus/network-metrics-daemon-qcl5k"
Nov 27 16:06:29 crc kubenswrapper[4707]: I1127 16:06:29.439716 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qcl5k"
Nov 27 16:06:31 crc kubenswrapper[4707]: W1127 16:06:31.505101 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod78280efc_fc57_4b8f_bb04_011a561fdd78.slice/crio-0107cf24bcdb8cfb2d2f1f1f5d99fd5a1a858140d4223203a2159c79153bee40 WatchSource:0}: Error finding container 0107cf24bcdb8cfb2d2f1f1f5d99fd5a1a858140d4223203a2159c79153bee40: Status 404 returned error can't find the container with id 0107cf24bcdb8cfb2d2f1f1f5d99fd5a1a858140d4223203a2159c79153bee40
Nov 27 16:06:31 crc kubenswrapper[4707]: I1127 16:06:31.878348 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"78280efc-fc57-4b8f-bb04-011a561fdd78","Type":"ContainerStarted","Data":"0107cf24bcdb8cfb2d2f1f1f5d99fd5a1a858140d4223203a2159c79153bee40"}
Nov 27 16:06:31 crc kubenswrapper[4707]: I1127 16:06:31.940820 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-qcl5k"]
Nov 27 16:06:32 crc kubenswrapper[4707]: I1127 16:06:32.885014 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"78280efc-fc57-4b8f-bb04-011a561fdd78","Type":"ContainerStarted","Data":"6850bfea39621043396374149623ecc4373468d85d12a1bee02fa7d061da8abf"}
Nov 27 16:06:33 crc kubenswrapper[4707]: I1127 16:06:33.624110 4707 patch_prober.go:28] interesting pod/machine-config-daemon-c995m container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Nov 27 16:06:33 crc kubenswrapper[4707]: I1127 16:06:33.624185 4707 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-c995m" podUID="a83beb0d-8dd1-434a-ace2-933f98e3956f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Nov 27 16:06:33 crc kubenswrapper[4707]: I1127 16:06:33.893072 4707 generic.go:334] "Generic (PLEG): container finished" podID="78280efc-fc57-4b8f-bb04-011a561fdd78" containerID="6850bfea39621043396374149623ecc4373468d85d12a1bee02fa7d061da8abf" exitCode=0
Nov 27 16:06:33 crc kubenswrapper[4707]: I1127 16:06:33.893147 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"78280efc-fc57-4b8f-bb04-011a561fdd78","Type":"ContainerDied","Data":"6850bfea39621043396374149623ecc4373468d85d12a1bee02fa7d061da8abf"}
Nov 27 16:06:35 crc kubenswrapper[4707]: W1127 16:06:35.007841 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7d382481_3c1e_49ed_8e27_265d495aa776.slice/crio-0f80c32721ea1222ffdbb25dbffbb60b19455823f84d709bffa67fa90e2316cd WatchSource:0}: Error finding container 0f80c32721ea1222ffdbb25dbffbb60b19455823f84d709bffa67fa90e2316cd: Status 404 returned error can't find the container with id 0f80c32721ea1222ffdbb25dbffbb60b19455823f84d709bffa67fa90e2316cd
Nov 27 16:06:35 crc kubenswrapper[4707]: I1127 16:06:35.905521 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-qcl5k" event={"ID":"7d382481-3c1e-49ed-8e27-265d495aa776","Type":"ContainerStarted","Data":"0f80c32721ea1222ffdbb25dbffbb60b19455823f84d709bffa67fa90e2316cd"}
Nov 27 16:06:36 crc kubenswrapper[4707]: I1127 16:06:36.732434 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-697d97f7c8-kwvst"
Nov 27 16:06:43 crc kubenswrapper[4707]: I1127 16:06:43.480199 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc"
Nov 27 16:06:43 crc kubenswrapper[4707]: I1127 16:06:43.491239 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/78280efc-fc57-4b8f-bb04-011a561fdd78-kubelet-dir\") pod \"78280efc-fc57-4b8f-bb04-011a561fdd78\" (UID: \"78280efc-fc57-4b8f-bb04-011a561fdd78\") "
Nov 27 16:06:43 crc kubenswrapper[4707]: I1127 16:06:43.491396 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/78280efc-fc57-4b8f-bb04-011a561fdd78-kube-api-access\") pod \"78280efc-fc57-4b8f-bb04-011a561fdd78\" (UID: \"78280efc-fc57-4b8f-bb04-011a561fdd78\") "
Nov 27 16:06:43 crc kubenswrapper[4707]: I1127 16:06:43.491481 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/78280efc-fc57-4b8f-bb04-011a561fdd78-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "78280efc-fc57-4b8f-bb04-011a561fdd78" (UID: "78280efc-fc57-4b8f-bb04-011a561fdd78"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Nov 27 16:06:43 crc kubenswrapper[4707]: I1127 16:06:43.491728 4707 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/78280efc-fc57-4b8f-bb04-011a561fdd78-kubelet-dir\") on node \"crc\" DevicePath \"\""
Nov 27 16:06:43 crc kubenswrapper[4707]: I1127 16:06:43.503518 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/78280efc-fc57-4b8f-bb04-011a561fdd78-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "78280efc-fc57-4b8f-bb04-011a561fdd78" (UID: "78280efc-fc57-4b8f-bb04-011a561fdd78"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 27 16:06:43 crc kubenswrapper[4707]: I1127 16:06:43.593758 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/78280efc-fc57-4b8f-bb04-011a561fdd78-kube-api-access\") on node \"crc\" DevicePath \"\""
Nov 27 16:06:43 crc kubenswrapper[4707]: I1127 16:06:43.959563 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"78280efc-fc57-4b8f-bb04-011a561fdd78","Type":"ContainerDied","Data":"0107cf24bcdb8cfb2d2f1f1f5d99fd5a1a858140d4223203a2159c79153bee40"}
Nov 27 16:06:43 crc kubenswrapper[4707]: I1127 16:06:43.959625 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0107cf24bcdb8cfb2d2f1f1f5d99fd5a1a858140d4223203a2159c79153bee40"
Nov 27 16:06:43 crc kubenswrapper[4707]: I1127 16:06:43.959686 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc"
Nov 27 16:06:47 crc kubenswrapper[4707]: I1127 16:06:47.534069 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-p28h9"
Nov 27 16:06:51 crc kubenswrapper[4707]: E1127 16:06:51.612793 4707 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18"
Nov 27 16:06:51 crc kubenswrapper[4707]: E1127 16:06:51.613641 4707 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-8wgst,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-xr65n_openshift-marketplace(22f274d1-48a8-4b2e-a80e-4163a9fa069e): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError"
Nov 27 16:06:51 crc kubenswrapper[4707]: E1127 16:06:51.614827 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-xr65n" podUID="22f274d1-48a8-4b2e-a80e-4163a9fa069e"
Nov 27 16:06:52 crc kubenswrapper[4707]: E1127 16:06:52.360916 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-xr65n" podUID="22f274d1-48a8-4b2e-a80e-4163a9fa069e"
Nov 27 16:06:52 crc kubenswrapper[4707]: E1127 16:06:52.477575 4707 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18"
Nov 27 16:06:52 crc kubenswrapper[4707]: E1127 16:06:52.477745 4707 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-r7q62,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-vfv9n_openshift-marketplace(ea8aba87-c190-4442-bd3f-5d2d9e2f38e4): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError"
Nov 27 16:06:52 crc kubenswrapper[4707]: E1127 16:06:52.481590 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-vfv9n" podUID="ea8aba87-c190-4442-bd3f-5d2d9e2f38e4"
Nov 27 16:06:52 crc kubenswrapper[4707]: E1127 16:06:52.493653 4707 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18"
Nov 27 16:06:52 crc kubenswrapper[4707]: E1127 16:06:52.493811 4707 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-6lz5k,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-s5nbn_openshift-marketplace(fb266fde-0b2c-4866-8274-9ed2d4821c14): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError"
Nov 27 16:06:52 crc kubenswrapper[4707]: E1127 16:06:52.495760 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-s5nbn" podUID="fb266fde-0b2c-4866-8274-9ed2d4821c14"
Nov 27 16:06:52 crc kubenswrapper[4707]: E1127 16:06:52.503353 4707 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18"
Nov 27 16:06:52 crc kubenswrapper[4707]: E1127 16:06:52.503736 4707 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-vv7fb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-xhshk_openshift-marketplace(2b313184-19c4-42e4-b488-63f8e894feea): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError"
Nov 27 16:06:52 crc kubenswrapper[4707]: E1127 16:06:52.504917 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-xhshk" podUID="2b313184-19c4-42e4-b488-63f8e894feea"
Nov 27 16:06:52 crc kubenswrapper[4707]: E1127 16:06:52.529975 4707 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18"
Nov 27 16:06:52 crc kubenswrapper[4707]: E1127 16:06:52.530117 4707 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-s9g5w,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-57fvk_openshift-marketplace(10e9e536-d557-44af-ae4d-7472fc20ee37): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError"
Nov 27 16:06:52 crc kubenswrapper[4707]: E1127 16:06:52.531769 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-57fvk" podUID="10e9e536-d557-44af-ae4d-7472fc20ee37"
Nov 27 16:06:53 crc kubenswrapper[4707]: I1127 16:06:53.012249 4707 generic.go:334] "Generic (PLEG): container finished" podID="2832c567-f82e-487a-8e20-f5d5b10168f3" containerID="1db60ce5b8699f929125f441672456b8c52b616da0313a737e027bc5c2b63241" exitCode=0
Nov 27 16:06:53 crc kubenswrapper[4707]: I1127 16:06:53.012600 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nvkng" event={"ID":"2832c567-f82e-487a-8e20-f5d5b10168f3","Type":"ContainerDied","Data":"1db60ce5b8699f929125f441672456b8c52b616da0313a737e027bc5c2b63241"}
Nov 27 16:06:53 crc kubenswrapper[4707]: I1127 16:06:53.015199 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-qcl5k" event={"ID":"7d382481-3c1e-49ed-8e27-265d495aa776","Type":"ContainerStarted","Data":"7faa0a5b824b5a9f23bb950a5c7fd3faab744c2c07dbe8403c587e952c7f1009"}
Nov 27 16:06:53 crc kubenswrapper[4707]: I1127 16:06:53.015245 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-qcl5k" event={"ID":"7d382481-3c1e-49ed-8e27-265d495aa776","Type":"ContainerStarted","Data":"aa904c56977f5bf8f6cd404b7bfa9da17ffa1bb85b8022c53c6ce455d1de577a"}
Nov 27 16:06:53 crc kubenswrapper[4707]: I1127 16:06:53.017566 4707 generic.go:334] "Generic (PLEG): container finished" podID="bafb40a6-a701-4082-a791-65fce73e8669" containerID="8cec4971c53347b57e9b9eed3670ad76e33e8157afc3c80c723c4218e7c64cba" exitCode=0
Nov 27 16:06:53 crc kubenswrapper[4707]: I1127 16:06:53.017616 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-dnk6z" event={"ID":"bafb40a6-a701-4082-a791-65fce73e8669","Type":"ContainerDied","Data":"8cec4971c53347b57e9b9eed3670ad76e33e8157afc3c80c723c4218e7c64cba"}
Nov 27 16:06:53 crc kubenswrapper[4707]: I1127 16:06:53.022344 4707 generic.go:334] "Generic (PLEG): container finished" podID="f0d7e53b-3b6f-469d-b550-e61dbea7724f" containerID="172950d4544e72c28098b45d2c5a118a6a136bc5e3c1346cf09ce6f618212f2a" exitCode=0
Nov 27 16:06:53 crc kubenswrapper[4707]: I1127 16:06:53.023362 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-d97n2" event={"ID":"f0d7e53b-3b6f-469d-b550-e61dbea7724f","Type":"ContainerDied","Data":"172950d4544e72c28098b45d2c5a118a6a136bc5e3c1346cf09ce6f618212f2a"}
Nov 27 16:06:53 crc kubenswrapper[4707]: E1127 16:06:53.026257 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-s5nbn" podUID="fb266fde-0b2c-4866-8274-9ed2d4821c14"
Nov 27 16:06:53 crc kubenswrapper[4707]: E1127 16:06:53.026307 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-57fvk" podUID="10e9e536-d557-44af-ae4d-7472fc20ee37"
Nov 27 16:06:53 crc kubenswrapper[4707]: E1127 16:06:53.026342 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-xhshk" podUID="2b313184-19c4-42e4-b488-63f8e894feea"
Nov 27 16:06:53 crc kubenswrapper[4707]: E1127 16:06:53.028151 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-vfv9n" podUID="ea8aba87-c190-4442-bd3f-5d2d9e2f38e4"
Nov 27 16:06:53 crc kubenswrapper[4707]: I1127 16:06:53.054123 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-qcl5k" podStartSLOduration=168.054098373 podStartE2EDuration="2m48.054098373s" podCreationTimestamp="2025-11-27 16:04:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-27 16:06:53.053272542 +0000 UTC m=+188.684721310" watchObservedRunningTime="2025-11-27 16:06:53.054098373 +0000 UTC m=+188.685547141"
Nov 27 16:06:53 crc kubenswrapper[4707]: I1127 16:06:53.220081 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-xd92c"
Nov 27 16:06:54 crc kubenswrapper[4707]: I1127 16:06:54.029768 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nvkng" event={"ID":"2832c567-f82e-487a-8e20-f5d5b10168f3","Type":"ContainerStarted","Data":"e1ed9522a4df2c9f9f86b13ca50d3fc17ad6642f4b022e23b3c1a4b199d0d0d5"}
Nov 27 16:06:54 crc kubenswrapper[4707]: I1127 16:06:54.033820 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-dnk6z" event={"ID":"bafb40a6-a701-4082-a791-65fce73e8669","Type":"ContainerStarted","Data":"5bb901eeac2a9f8650b26a3b5e6aab5057373bc5ba0cce0a32c4d9cc66c1c491"}
Nov 27 16:06:54 crc kubenswrapper[4707]: I1127 16:06:54.035654 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-d97n2" event={"ID":"f0d7e53b-3b6f-469d-b550-e61dbea7724f","Type":"ContainerStarted","Data":"a7892fdd0c003ba7d185cd20616c3a0312eac77264613069952d1c8311468f93"}
Nov 27 16:06:54 crc kubenswrapper[4707]: I1127 16:06:54.051829 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-nvkng" podStartSLOduration=3.674289098 podStartE2EDuration="42.051806379s" podCreationTimestamp="2025-11-27 16:06:12 +0000 UTC" firstStartedPulling="2025-11-27 16:06:15.37424876 +0000 UTC m=+151.005697528" lastFinishedPulling="2025-11-27 16:06:53.751766041 +0000 UTC m=+189.383214809" observedRunningTime="2025-11-27 16:06:54.047385346 +0000 UTC m=+189.678834134" watchObservedRunningTime="2025-11-27 16:06:54.051806379 +0000 UTC m=+189.683255147"
Nov 27 16:06:54 crc kubenswrapper[4707]: I1127 16:06:54.066893 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-dnk6z" podStartSLOduration=3.974405067 podStartE2EDuration="42.066875184s" podCreationTimestamp="2025-11-27 16:06:12 +0000 UTC" firstStartedPulling="2025-11-27 16:06:15.478431282 +0000 UTC m=+151.109880050" lastFinishedPulling="2025-11-27 16:06:53.570901359 +0000 UTC m=+189.202350167" observedRunningTime="2025-11-27 16:06:54.065739675 +0000 UTC m=+189.697188453" watchObservedRunningTime="2025-11-27 16:06:54.066875184 +0000 UTC m=+189.698323952"
Nov 27 16:06:54 crc kubenswrapper[4707]: I1127 16:06:54.091866 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-d97n2" podStartSLOduration=4.114830315 podStartE2EDuration="42.091844712s" podCreationTimestamp="2025-11-27 16:06:12 +0000 UTC" firstStartedPulling="2025-11-27 16:06:15.488692764 +0000 UTC m=+151.120141532" lastFinishedPulling="2025-11-27 16:06:53.465707131 +0000 UTC m=+189.097155929" observedRunningTime="2025-11-27 16:06:54.087678365 +0000 UTC m=+189.719127143" watchObservedRunningTime="2025-11-27 16:06:54.091844712 +0000 UTC m=+189.723293490"
Nov 27 16:06:54 crc kubenswrapper[4707]: I1127 16:06:54.486535 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"]
Nov 27 16:06:54 crc kubenswrapper[4707]: E1127 16:06:54.486801 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="78280efc-fc57-4b8f-bb04-011a561fdd78" containerName="pruner"
Nov 27 16:06:54 crc kubenswrapper[4707]: I1127 16:06:54.486818 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="78280efc-fc57-4b8f-bb04-011a561fdd78" containerName="pruner"
Nov 27 16:06:54 crc kubenswrapper[4707]: I1127 16:06:54.486983 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="78280efc-fc57-4b8f-bb04-011a561fdd78" containerName="pruner"
Nov 27 16:06:54 crc kubenswrapper[4707]: I1127 16:06:54.487483 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc"
Nov 27 16:06:54 crc kubenswrapper[4707]: I1127 16:06:54.490593 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt"
Nov 27 16:06:54 crc kubenswrapper[4707]: I1127 16:06:54.490929 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n"
Nov 27 16:06:54 crc kubenswrapper[4707]: I1127 16:06:54.500453 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"]
Nov 27 16:06:54 crc kubenswrapper[4707]: I1127 16:06:54.662563 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/138e4e6b-0e1c-438b-b7a7-f6486632a3ce-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"138e4e6b-0e1c-438b-b7a7-f6486632a3ce\") " pod="openshift-kube-apiserver/revision-pruner-9-crc"
Nov 27 16:06:54 crc kubenswrapper[4707]: I1127 16:06:54.663130 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/138e4e6b-0e1c-438b-b7a7-f6486632a3ce-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"138e4e6b-0e1c-438b-b7a7-f6486632a3ce\") " pod="openshift-kube-apiserver/revision-pruner-9-crc"
Nov 27 16:06:54 crc kubenswrapper[4707]: I1127 16:06:54.764655 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/138e4e6b-0e1c-438b-b7a7-f6486632a3ce-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"138e4e6b-0e1c-438b-b7a7-f6486632a3ce\") " pod="openshift-kube-apiserver/revision-pruner-9-crc"
Nov 27 16:06:54 crc kubenswrapper[4707]: I1127 16:06:54.764792 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/138e4e6b-0e1c-438b-b7a7-f6486632a3ce-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"138e4e6b-0e1c-438b-b7a7-f6486632a3ce\") " pod="openshift-kube-apiserver/revision-pruner-9-crc"
Nov 27 16:06:54 crc kubenswrapper[4707]: I1127 16:06:54.764885 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/138e4e6b-0e1c-438b-b7a7-f6486632a3ce-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"138e4e6b-0e1c-438b-b7a7-f6486632a3ce\") " pod="openshift-kube-apiserver/revision-pruner-9-crc"
Nov 27 16:06:54 crc kubenswrapper[4707]: I1127 16:06:54.789684 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/138e4e6b-0e1c-438b-b7a7-f6486632a3ce-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"138e4e6b-0e1c-438b-b7a7-f6486632a3ce\") " pod="openshift-kube-apiserver/revision-pruner-9-crc"
Nov 27 16:06:54 crc kubenswrapper[4707]: I1127 16:06:54.813574 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc"
Nov 27 16:06:55 crc kubenswrapper[4707]: I1127 16:06:55.288061 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"]
Nov 27 16:06:56 crc kubenswrapper[4707]: I1127 16:06:56.029561 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-jxgm7"]
Nov 27 16:06:56 crc kubenswrapper[4707]: I1127 16:06:56.046700 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"138e4e6b-0e1c-438b-b7a7-f6486632a3ce","Type":"ContainerStarted","Data":"29a0c073a72baf7a90bb0828d58c28338c0b876fffca216d44e3e189e54fe574"}
Nov 27 16:06:56 crc kubenswrapper[4707]: I1127 16:06:56.046743 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"138e4e6b-0e1c-438b-b7a7-f6486632a3ce","Type":"ContainerStarted","Data":"7b7fb3aea849e3abbaaaf5755a7243e6293e323647e3a89655e6ed3541243965"}
Nov 27 16:06:56 crc kubenswrapper[4707]: I1127 16:06:56.068934 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-9-crc" podStartSLOduration=2.068915824 podStartE2EDuration="2.068915824s" podCreationTimestamp="2025-11-27 16:06:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-27 16:06:56.065524357 +0000 UTC m=+191.696973125" watchObservedRunningTime="2025-11-27 16:06:56.068915824 +0000 UTC m=+191.700364592"
Nov 27 16:06:57 crc kubenswrapper[4707]: I1127 16:06:57.055690 4707 generic.go:334] "Generic (PLEG): container finished" podID="138e4e6b-0e1c-438b-b7a7-f6486632a3ce" containerID="29a0c073a72baf7a90bb0828d58c28338c0b876fffca216d44e3e189e54fe574" exitCode=0
Nov 27 16:06:57 crc kubenswrapper[4707]: I1127 16:06:57.055729 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"138e4e6b-0e1c-438b-b7a7-f6486632a3ce","Type":"ContainerDied","Data":"29a0c073a72baf7a90bb0828d58c28338c0b876fffca216d44e3e189e54fe574"}
Nov 27 16:06:58 crc kubenswrapper[4707]: I1127 16:06:58.370855 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc"
Nov 27 16:06:58 crc kubenswrapper[4707]: I1127 16:06:58.518014 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/138e4e6b-0e1c-438b-b7a7-f6486632a3ce-kubelet-dir\") pod \"138e4e6b-0e1c-438b-b7a7-f6486632a3ce\" (UID: \"138e4e6b-0e1c-438b-b7a7-f6486632a3ce\") "
Nov 27 16:06:58 crc kubenswrapper[4707]: I1127 16:06:58.518061 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/138e4e6b-0e1c-438b-b7a7-f6486632a3ce-kube-api-access\") pod \"138e4e6b-0e1c-438b-b7a7-f6486632a3ce\" (UID: \"138e4e6b-0e1c-438b-b7a7-f6486632a3ce\") "
Nov 27 16:06:58 crc kubenswrapper[4707]: I1127 16:06:58.518206 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/138e4e6b-0e1c-438b-b7a7-f6486632a3ce-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "138e4e6b-0e1c-438b-b7a7-f6486632a3ce" (UID: "138e4e6b-0e1c-438b-b7a7-f6486632a3ce"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 27 16:06:58 crc kubenswrapper[4707]: I1127 16:06:58.518397 4707 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/138e4e6b-0e1c-438b-b7a7-f6486632a3ce-kubelet-dir\") on node \"crc\" DevicePath \"\"" Nov 27 16:06:58 crc kubenswrapper[4707]: I1127 16:06:58.523096 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/138e4e6b-0e1c-438b-b7a7-f6486632a3ce-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "138e4e6b-0e1c-438b-b7a7-f6486632a3ce" (UID: "138e4e6b-0e1c-438b-b7a7-f6486632a3ce"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 16:06:58 crc kubenswrapper[4707]: I1127 16:06:58.619303 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/138e4e6b-0e1c-438b-b7a7-f6486632a3ce-kube-api-access\") on node \"crc\" DevicePath \"\"" Nov 27 16:06:59 crc kubenswrapper[4707]: I1127 16:06:59.067485 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"138e4e6b-0e1c-438b-b7a7-f6486632a3ce","Type":"ContainerDied","Data":"7b7fb3aea849e3abbaaaf5755a7243e6293e323647e3a89655e6ed3541243965"} Nov 27 16:06:59 crc kubenswrapper[4707]: I1127 16:06:59.067523 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7b7fb3aea849e3abbaaaf5755a7243e6293e323647e3a89655e6ed3541243965" Nov 27 16:06:59 crc kubenswrapper[4707]: I1127 16:06:59.067575 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Nov 27 16:07:01 crc kubenswrapper[4707]: I1127 16:07:01.679185 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Nov 27 16:07:01 crc kubenswrapper[4707]: E1127 16:07:01.679622 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="138e4e6b-0e1c-438b-b7a7-f6486632a3ce" containerName="pruner" Nov 27 16:07:01 crc kubenswrapper[4707]: I1127 16:07:01.679632 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="138e4e6b-0e1c-438b-b7a7-f6486632a3ce" containerName="pruner" Nov 27 16:07:01 crc kubenswrapper[4707]: I1127 16:07:01.679725 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="138e4e6b-0e1c-438b-b7a7-f6486632a3ce" containerName="pruner" Nov 27 16:07:01 crc kubenswrapper[4707]: I1127 16:07:01.680067 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Nov 27 16:07:01 crc kubenswrapper[4707]: I1127 16:07:01.685444 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Nov 27 16:07:01 crc kubenswrapper[4707]: I1127 16:07:01.697981 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Nov 27 16:07:01 crc kubenswrapper[4707]: I1127 16:07:01.702758 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Nov 27 16:07:01 crc kubenswrapper[4707]: I1127 16:07:01.761722 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/c43ce123-cfbd-4c97-87d9-9b7144b8443c-kubelet-dir\") pod \"installer-9-crc\" (UID: \"c43ce123-cfbd-4c97-87d9-9b7144b8443c\") " pod="openshift-kube-apiserver/installer-9-crc" Nov 27 16:07:01 crc kubenswrapper[4707]: I1127 16:07:01.761765 4707 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/c43ce123-cfbd-4c97-87d9-9b7144b8443c-var-lock\") pod \"installer-9-crc\" (UID: \"c43ce123-cfbd-4c97-87d9-9b7144b8443c\") " pod="openshift-kube-apiserver/installer-9-crc" Nov 27 16:07:01 crc kubenswrapper[4707]: I1127 16:07:01.761829 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c43ce123-cfbd-4c97-87d9-9b7144b8443c-kube-api-access\") pod \"installer-9-crc\" (UID: \"c43ce123-cfbd-4c97-87d9-9b7144b8443c\") " pod="openshift-kube-apiserver/installer-9-crc" Nov 27 16:07:01 crc kubenswrapper[4707]: I1127 16:07:01.863635 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/c43ce123-cfbd-4c97-87d9-9b7144b8443c-kubelet-dir\") pod \"installer-9-crc\" (UID: \"c43ce123-cfbd-4c97-87d9-9b7144b8443c\") " pod="openshift-kube-apiserver/installer-9-crc" Nov 27 16:07:01 crc kubenswrapper[4707]: I1127 16:07:01.863686 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/c43ce123-cfbd-4c97-87d9-9b7144b8443c-var-lock\") pod \"installer-9-crc\" (UID: \"c43ce123-cfbd-4c97-87d9-9b7144b8443c\") " pod="openshift-kube-apiserver/installer-9-crc" Nov 27 16:07:01 crc kubenswrapper[4707]: I1127 16:07:01.863760 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/c43ce123-cfbd-4c97-87d9-9b7144b8443c-kubelet-dir\") pod \"installer-9-crc\" (UID: \"c43ce123-cfbd-4c97-87d9-9b7144b8443c\") " pod="openshift-kube-apiserver/installer-9-crc" Nov 27 16:07:01 crc kubenswrapper[4707]: I1127 16:07:01.863850 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/c43ce123-cfbd-4c97-87d9-9b7144b8443c-kube-api-access\") pod \"installer-9-crc\" (UID: \"c43ce123-cfbd-4c97-87d9-9b7144b8443c\") " pod="openshift-kube-apiserver/installer-9-crc" Nov 27 16:07:01 crc kubenswrapper[4707]: I1127 16:07:01.863880 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/c43ce123-cfbd-4c97-87d9-9b7144b8443c-var-lock\") pod \"installer-9-crc\" (UID: \"c43ce123-cfbd-4c97-87d9-9b7144b8443c\") " pod="openshift-kube-apiserver/installer-9-crc" Nov 27 16:07:01 crc kubenswrapper[4707]: I1127 16:07:01.880856 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c43ce123-cfbd-4c97-87d9-9b7144b8443c-kube-api-access\") pod \"installer-9-crc\" (UID: \"c43ce123-cfbd-4c97-87d9-9b7144b8443c\") " pod="openshift-kube-apiserver/installer-9-crc" Nov 27 16:07:02 crc kubenswrapper[4707]: I1127 16:07:02.002039 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Nov 27 16:07:02 crc kubenswrapper[4707]: I1127 16:07:02.437295 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Nov 27 16:07:02 crc kubenswrapper[4707]: I1127 16:07:02.651155 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-dnk6z" Nov 27 16:07:02 crc kubenswrapper[4707]: I1127 16:07:02.651199 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-dnk6z" Nov 27 16:07:02 crc kubenswrapper[4707]: I1127 16:07:02.773792 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-d97n2" Nov 27 16:07:02 crc kubenswrapper[4707]: I1127 16:07:02.774184 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-d97n2" Nov 27 16:07:02 crc kubenswrapper[4707]: I1127 16:07:02.812391 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-dnk6z" Nov 27 16:07:02 crc kubenswrapper[4707]: I1127 16:07:02.832065 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-d97n2" Nov 27 16:07:02 crc kubenswrapper[4707]: I1127 16:07:02.957569 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-nvkng" Nov 27 16:07:02 crc kubenswrapper[4707]: I1127 16:07:02.957622 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-nvkng" Nov 27 16:07:02 crc kubenswrapper[4707]: I1127 16:07:02.996546 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-nvkng" Nov 27 16:07:03 crc kubenswrapper[4707]: I1127 
16:07:03.096221 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"c43ce123-cfbd-4c97-87d9-9b7144b8443c","Type":"ContainerStarted","Data":"7292858334cd71186cf1e15e24096a69d8d7e6dcc30e4202095c417d81505dae"} Nov 27 16:07:03 crc kubenswrapper[4707]: I1127 16:07:03.096277 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"c43ce123-cfbd-4c97-87d9-9b7144b8443c","Type":"ContainerStarted","Data":"094a9715f936bc11fc44597e99e6b6ea5b912015c7c1fcf921b03ac0ea361860"} Nov 27 16:07:03 crc kubenswrapper[4707]: I1127 16:07:03.115928 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/installer-9-crc" podStartSLOduration=2.115908842 podStartE2EDuration="2.115908842s" podCreationTimestamp="2025-11-27 16:07:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-27 16:07:03.108626642 +0000 UTC m=+198.740075420" watchObservedRunningTime="2025-11-27 16:07:03.115908842 +0000 UTC m=+198.747357630" Nov 27 16:07:03 crc kubenswrapper[4707]: I1127 16:07:03.141568 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-d97n2" Nov 27 16:07:03 crc kubenswrapper[4707]: I1127 16:07:03.143952 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-dnk6z" Nov 27 16:07:03 crc kubenswrapper[4707]: I1127 16:07:03.168429 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-nvkng" Nov 27 16:07:03 crc kubenswrapper[4707]: I1127 16:07:03.623752 4707 patch_prober.go:28] interesting pod/machine-config-daemon-c995m container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get 
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 27 16:07:03 crc kubenswrapper[4707]: I1127 16:07:03.623847 4707 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-c995m" podUID="a83beb0d-8dd1-434a-ace2-933f98e3956f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 27 16:07:04 crc kubenswrapper[4707]: I1127 16:07:04.844766 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-nvkng"] Nov 27 16:07:05 crc kubenswrapper[4707]: I1127 16:07:05.107864 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-nvkng" podUID="2832c567-f82e-487a-8e20-f5d5b10168f3" containerName="registry-server" containerID="cri-o://e1ed9522a4df2c9f9f86b13ca50d3fc17ad6642f4b022e23b3c1a4b199d0d0d5" gracePeriod=2 Nov 27 16:07:06 crc kubenswrapper[4707]: I1127 16:07:06.114406 4707 generic.go:334] "Generic (PLEG): container finished" podID="2832c567-f82e-487a-8e20-f5d5b10168f3" containerID="e1ed9522a4df2c9f9f86b13ca50d3fc17ad6642f4b022e23b3c1a4b199d0d0d5" exitCode=0 Nov 27 16:07:06 crc kubenswrapper[4707]: I1127 16:07:06.114444 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nvkng" event={"ID":"2832c567-f82e-487a-8e20-f5d5b10168f3","Type":"ContainerDied","Data":"e1ed9522a4df2c9f9f86b13ca50d3fc17ad6642f4b022e23b3c1a4b199d0d0d5"} Nov 27 16:07:06 crc kubenswrapper[4707]: I1127 16:07:06.771495 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-nvkng" Nov 27 16:07:06 crc kubenswrapper[4707]: I1127 16:07:06.934949 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2832c567-f82e-487a-8e20-f5d5b10168f3-utilities\") pod \"2832c567-f82e-487a-8e20-f5d5b10168f3\" (UID: \"2832c567-f82e-487a-8e20-f5d5b10168f3\") " Nov 27 16:07:06 crc kubenswrapper[4707]: I1127 16:07:06.936068 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2832c567-f82e-487a-8e20-f5d5b10168f3-utilities" (OuterVolumeSpecName: "utilities") pod "2832c567-f82e-487a-8e20-f5d5b10168f3" (UID: "2832c567-f82e-487a-8e20-f5d5b10168f3"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 27 16:07:06 crc kubenswrapper[4707]: I1127 16:07:06.936214 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2832c567-f82e-487a-8e20-f5d5b10168f3-catalog-content\") pod \"2832c567-f82e-487a-8e20-f5d5b10168f3\" (UID: \"2832c567-f82e-487a-8e20-f5d5b10168f3\") " Nov 27 16:07:06 crc kubenswrapper[4707]: I1127 16:07:06.950641 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l85hg\" (UniqueName: \"kubernetes.io/projected/2832c567-f82e-487a-8e20-f5d5b10168f3-kube-api-access-l85hg\") pod \"2832c567-f82e-487a-8e20-f5d5b10168f3\" (UID: \"2832c567-f82e-487a-8e20-f5d5b10168f3\") " Nov 27 16:07:06 crc kubenswrapper[4707]: I1127 16:07:06.951106 4707 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2832c567-f82e-487a-8e20-f5d5b10168f3-utilities\") on node \"crc\" DevicePath \"\"" Nov 27 16:07:06 crc kubenswrapper[4707]: I1127 16:07:06.960593 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/2832c567-f82e-487a-8e20-f5d5b10168f3-kube-api-access-l85hg" (OuterVolumeSpecName: "kube-api-access-l85hg") pod "2832c567-f82e-487a-8e20-f5d5b10168f3" (UID: "2832c567-f82e-487a-8e20-f5d5b10168f3"). InnerVolumeSpecName "kube-api-access-l85hg". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 16:07:07 crc kubenswrapper[4707]: I1127 16:07:07.003067 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2832c567-f82e-487a-8e20-f5d5b10168f3-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2832c567-f82e-487a-8e20-f5d5b10168f3" (UID: "2832c567-f82e-487a-8e20-f5d5b10168f3"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 27 16:07:07 crc kubenswrapper[4707]: I1127 16:07:07.052027 4707 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2832c567-f82e-487a-8e20-f5d5b10168f3-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 27 16:07:07 crc kubenswrapper[4707]: I1127 16:07:07.052067 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l85hg\" (UniqueName: \"kubernetes.io/projected/2832c567-f82e-487a-8e20-f5d5b10168f3-kube-api-access-l85hg\") on node \"crc\" DevicePath \"\"" Nov 27 16:07:07 crc kubenswrapper[4707]: I1127 16:07:07.120285 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xr65n" event={"ID":"22f274d1-48a8-4b2e-a80e-4163a9fa069e","Type":"ContainerStarted","Data":"273e7d37b7fbb9ddca1feedc6754ba33abf58a1bd49c23a48c97c20461d0b7b5"} Nov 27 16:07:07 crc kubenswrapper[4707]: I1127 16:07:07.126221 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nvkng" event={"ID":"2832c567-f82e-487a-8e20-f5d5b10168f3","Type":"ContainerDied","Data":"a12f448491630f1a08ed692225a7a9d2f243be97286502ae677ca3b99a8d0b8e"} Nov 27 16:07:07 crc 
kubenswrapper[4707]: I1127 16:07:07.126289 4707 scope.go:117] "RemoveContainer" containerID="e1ed9522a4df2c9f9f86b13ca50d3fc17ad6642f4b022e23b3c1a4b199d0d0d5" Nov 27 16:07:07 crc kubenswrapper[4707]: I1127 16:07:07.126494 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-nvkng" Nov 27 16:07:07 crc kubenswrapper[4707]: I1127 16:07:07.136155 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-s5nbn" event={"ID":"fb266fde-0b2c-4866-8274-9ed2d4821c14","Type":"ContainerStarted","Data":"b956a09e7444a0cea00c6b60dc9f657f225f01c55c417a871c44a349114d24c1"} Nov 27 16:07:07 crc kubenswrapper[4707]: I1127 16:07:07.154381 4707 scope.go:117] "RemoveContainer" containerID="1db60ce5b8699f929125f441672456b8c52b616da0313a737e027bc5c2b63241" Nov 27 16:07:07 crc kubenswrapper[4707]: I1127 16:07:07.194429 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-nvkng"] Nov 27 16:07:07 crc kubenswrapper[4707]: I1127 16:07:07.203314 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-nvkng"] Nov 27 16:07:07 crc kubenswrapper[4707]: I1127 16:07:07.203691 4707 scope.go:117] "RemoveContainer" containerID="18f05b2cc342f9dde250ff35d25badbed221136876b8c3e913b681fa5db70a55" Nov 27 16:07:07 crc kubenswrapper[4707]: E1127 16:07:07.243632 4707 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2832c567_f82e_487a_8e20_f5d5b10168f3.slice/crio-a12f448491630f1a08ed692225a7a9d2f243be97286502ae677ca3b99a8d0b8e\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2832c567_f82e_487a_8e20_f5d5b10168f3.slice\": RecentStats: unable to find data in memory cache]" Nov 27 16:07:08 crc kubenswrapper[4707]: I1127 
16:07:08.147025 4707 generic.go:334] "Generic (PLEG): container finished" podID="22f274d1-48a8-4b2e-a80e-4163a9fa069e" containerID="273e7d37b7fbb9ddca1feedc6754ba33abf58a1bd49c23a48c97c20461d0b7b5" exitCode=0 Nov 27 16:07:08 crc kubenswrapper[4707]: I1127 16:07:08.147357 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xr65n" event={"ID":"22f274d1-48a8-4b2e-a80e-4163a9fa069e","Type":"ContainerDied","Data":"273e7d37b7fbb9ddca1feedc6754ba33abf58a1bd49c23a48c97c20461d0b7b5"} Nov 27 16:07:08 crc kubenswrapper[4707]: I1127 16:07:08.149142 4707 generic.go:334] "Generic (PLEG): container finished" podID="fb266fde-0b2c-4866-8274-9ed2d4821c14" containerID="b956a09e7444a0cea00c6b60dc9f657f225f01c55c417a871c44a349114d24c1" exitCode=0 Nov 27 16:07:08 crc kubenswrapper[4707]: I1127 16:07:08.149210 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-s5nbn" event={"ID":"fb266fde-0b2c-4866-8274-9ed2d4821c14","Type":"ContainerDied","Data":"b956a09e7444a0cea00c6b60dc9f657f225f01c55c417a871c44a349114d24c1"} Nov 27 16:07:09 crc kubenswrapper[4707]: I1127 16:07:09.154221 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-57fvk" event={"ID":"10e9e536-d557-44af-ae4d-7472fc20ee37","Type":"ContainerStarted","Data":"4606ce324026f2348c84b7f1fe6cbec6381ad5afd5c3310e2e517f7f3a5362cf"} Nov 27 16:07:09 crc kubenswrapper[4707]: I1127 16:07:09.156815 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xr65n" event={"ID":"22f274d1-48a8-4b2e-a80e-4163a9fa069e","Type":"ContainerStarted","Data":"028cf21d1cc166f04cef598fbb560014503d617c23c6d6c0a207226a961f0a27"} Nov 27 16:07:09 crc kubenswrapper[4707]: I1127 16:07:09.159554 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-s5nbn" 
event={"ID":"fb266fde-0b2c-4866-8274-9ed2d4821c14","Type":"ContainerStarted","Data":"ca005959ca3389d12f7fe2ba04ba36c258787a1cc702ced6cf3a43daf0195480"} Nov 27 16:07:09 crc kubenswrapper[4707]: I1127 16:07:09.190348 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-s5nbn" podStartSLOduration=2.105753605 podStartE2EDuration="54.190335668s" podCreationTimestamp="2025-11-27 16:06:15 +0000 UTC" firstStartedPulling="2025-11-27 16:06:16.578592536 +0000 UTC m=+152.210041304" lastFinishedPulling="2025-11-27 16:07:08.663174599 +0000 UTC m=+204.294623367" observedRunningTime="2025-11-27 16:07:09.189544919 +0000 UTC m=+204.820993677" watchObservedRunningTime="2025-11-27 16:07:09.190335668 +0000 UTC m=+204.821784436" Nov 27 16:07:09 crc kubenswrapper[4707]: I1127 16:07:09.200880 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2832c567-f82e-487a-8e20-f5d5b10168f3" path="/var/lib/kubelet/pods/2832c567-f82e-487a-8e20-f5d5b10168f3/volumes" Nov 27 16:07:09 crc kubenswrapper[4707]: I1127 16:07:09.211199 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-xr65n" podStartSLOduration=3.190302412 podStartE2EDuration="54.211189653s" podCreationTimestamp="2025-11-27 16:06:15 +0000 UTC" firstStartedPulling="2025-11-27 16:06:17.665660215 +0000 UTC m=+153.297108983" lastFinishedPulling="2025-11-27 16:07:08.686547456 +0000 UTC m=+204.317996224" observedRunningTime="2025-11-27 16:07:09.209691356 +0000 UTC m=+204.841140124" watchObservedRunningTime="2025-11-27 16:07:09.211189653 +0000 UTC m=+204.842638411" Nov 27 16:07:10 crc kubenswrapper[4707]: I1127 16:07:10.190072 4707 generic.go:334] "Generic (PLEG): container finished" podID="ea8aba87-c190-4442-bd3f-5d2d9e2f38e4" containerID="ee2bd4d06b21c19a1f6399821c816c36adfcef2f7499896d7ed2ba87a59f2bbd" exitCode=0 Nov 27 16:07:10 crc kubenswrapper[4707]: I1127 16:07:10.190281 4707 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vfv9n" event={"ID":"ea8aba87-c190-4442-bd3f-5d2d9e2f38e4","Type":"ContainerDied","Data":"ee2bd4d06b21c19a1f6399821c816c36adfcef2f7499896d7ed2ba87a59f2bbd"} Nov 27 16:07:10 crc kubenswrapper[4707]: I1127 16:07:10.195084 4707 generic.go:334] "Generic (PLEG): container finished" podID="10e9e536-d557-44af-ae4d-7472fc20ee37" containerID="4606ce324026f2348c84b7f1fe6cbec6381ad5afd5c3310e2e517f7f3a5362cf" exitCode=0 Nov 27 16:07:10 crc kubenswrapper[4707]: I1127 16:07:10.195131 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-57fvk" event={"ID":"10e9e536-d557-44af-ae4d-7472fc20ee37","Type":"ContainerDied","Data":"4606ce324026f2348c84b7f1fe6cbec6381ad5afd5c3310e2e517f7f3a5362cf"} Nov 27 16:07:11 crc kubenswrapper[4707]: I1127 16:07:11.202475 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vfv9n" event={"ID":"ea8aba87-c190-4442-bd3f-5d2d9e2f38e4","Type":"ContainerStarted","Data":"82686fed4a99fd4966e47606dc35ef1cf90d3b0abb0fbe4b6eb9f11d8eea01a6"} Nov 27 16:07:11 crc kubenswrapper[4707]: I1127 16:07:11.204925 4707 generic.go:334] "Generic (PLEG): container finished" podID="2b313184-19c4-42e4-b488-63f8e894feea" containerID="75857b3ef4ac137564d3343fc8267f35f20bd1a9ab8431393d27176a0239467e" exitCode=0 Nov 27 16:07:11 crc kubenswrapper[4707]: I1127 16:07:11.204979 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xhshk" event={"ID":"2b313184-19c4-42e4-b488-63f8e894feea","Type":"ContainerDied","Data":"75857b3ef4ac137564d3343fc8267f35f20bd1a9ab8431393d27176a0239467e"} Nov 27 16:07:11 crc kubenswrapper[4707]: I1127 16:07:11.224729 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-vfv9n" podStartSLOduration=3.2225019599999998 podStartE2EDuration="57.224709834s" 
podCreationTimestamp="2025-11-27 16:06:14 +0000 UTC" firstStartedPulling="2025-11-27 16:06:16.625750311 +0000 UTC m=+152.257199079" lastFinishedPulling="2025-11-27 16:07:10.627958175 +0000 UTC m=+206.259406953" observedRunningTime="2025-11-27 16:07:11.224295444 +0000 UTC m=+206.855744212" watchObservedRunningTime="2025-11-27 16:07:11.224709834 +0000 UTC m=+206.856158612" Nov 27 16:07:12 crc kubenswrapper[4707]: I1127 16:07:12.213235 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-57fvk" event={"ID":"10e9e536-d557-44af-ae4d-7472fc20ee37","Type":"ContainerStarted","Data":"7072c5ee2cdff715fac4399a36ad6275d925dbdb7f3eadc56ab15edeb695da40"} Nov 27 16:07:12 crc kubenswrapper[4707]: I1127 16:07:12.240256 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-57fvk" podStartSLOduration=5.641575293 podStartE2EDuration="1m0.240235926s" podCreationTimestamp="2025-11-27 16:06:12 +0000 UTC" firstStartedPulling="2025-11-27 16:06:16.570496549 +0000 UTC m=+152.201945317" lastFinishedPulling="2025-11-27 16:07:11.169157182 +0000 UTC m=+206.800605950" observedRunningTime="2025-11-27 16:07:12.238725658 +0000 UTC m=+207.870174426" watchObservedRunningTime="2025-11-27 16:07:12.240235926 +0000 UTC m=+207.871684704" Nov 27 16:07:13 crc kubenswrapper[4707]: I1127 16:07:13.382093 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-57fvk" Nov 27 16:07:13 crc kubenswrapper[4707]: I1127 16:07:13.384763 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-57fvk" Nov 27 16:07:13 crc kubenswrapper[4707]: I1127 16:07:13.449484 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-57fvk" Nov 27 16:07:15 crc kubenswrapper[4707]: I1127 16:07:15.235278 4707 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xhshk" event={"ID":"2b313184-19c4-42e4-b488-63f8e894feea","Type":"ContainerStarted","Data":"ab899d6f4e06dbc7cee2fde1e92d57be3549ff6c1d4053bd8a03a25a9cb2750e"} Nov 27 16:07:15 crc kubenswrapper[4707]: I1127 16:07:15.266698 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-xhshk" podStartSLOduration=4.116929898 podStartE2EDuration="1m1.266641651s" podCreationTimestamp="2025-11-27 16:06:14 +0000 UTC" firstStartedPulling="2025-11-27 16:06:16.610983634 +0000 UTC m=+152.242432402" lastFinishedPulling="2025-11-27 16:07:13.760695367 +0000 UTC m=+209.392144155" observedRunningTime="2025-11-27 16:07:15.259236939 +0000 UTC m=+210.890685777" watchObservedRunningTime="2025-11-27 16:07:15.266641651 +0000 UTC m=+210.898090469" Nov 27 16:07:15 crc kubenswrapper[4707]: I1127 16:07:15.294177 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-vfv9n" Nov 27 16:07:15 crc kubenswrapper[4707]: I1127 16:07:15.294309 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-vfv9n" Nov 27 16:07:15 crc kubenswrapper[4707]: I1127 16:07:15.370865 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-vfv9n" Nov 27 16:07:15 crc kubenswrapper[4707]: I1127 16:07:15.888616 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-s5nbn" Nov 27 16:07:15 crc kubenswrapper[4707]: I1127 16:07:15.888698 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-s5nbn" Nov 27 16:07:15 crc kubenswrapper[4707]: I1127 16:07:15.945478 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-s5nbn" Nov 27 
16:07:16 crc kubenswrapper[4707]: I1127 16:07:16.163185 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-xr65n" Nov 27 16:07:16 crc kubenswrapper[4707]: I1127 16:07:16.163284 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-xr65n" Nov 27 16:07:16 crc kubenswrapper[4707]: I1127 16:07:16.231926 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-xr65n" Nov 27 16:07:16 crc kubenswrapper[4707]: I1127 16:07:16.325324 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-vfv9n" Nov 27 16:07:16 crc kubenswrapper[4707]: I1127 16:07:16.362485 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-s5nbn" Nov 27 16:07:16 crc kubenswrapper[4707]: I1127 16:07:16.372438 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-xr65n" Nov 27 16:07:17 crc kubenswrapper[4707]: I1127 16:07:17.451032 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-vfv9n"] Nov 27 16:07:19 crc kubenswrapper[4707]: I1127 16:07:19.262928 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-vfv9n" podUID="ea8aba87-c190-4442-bd3f-5d2d9e2f38e4" containerName="registry-server" containerID="cri-o://82686fed4a99fd4966e47606dc35ef1cf90d3b0abb0fbe4b6eb9f11d8eea01a6" gracePeriod=2 Nov 27 16:07:19 crc kubenswrapper[4707]: I1127 16:07:19.700832 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-vfv9n" Nov 27 16:07:19 crc kubenswrapper[4707]: I1127 16:07:19.847702 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-xr65n"] Nov 27 16:07:19 crc kubenswrapper[4707]: I1127 16:07:19.848056 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-xr65n" podUID="22f274d1-48a8-4b2e-a80e-4163a9fa069e" containerName="registry-server" containerID="cri-o://028cf21d1cc166f04cef598fbb560014503d617c23c6d6c0a207226a961f0a27" gracePeriod=2 Nov 27 16:07:19 crc kubenswrapper[4707]: I1127 16:07:19.866932 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ea8aba87-c190-4442-bd3f-5d2d9e2f38e4-catalog-content\") pod \"ea8aba87-c190-4442-bd3f-5d2d9e2f38e4\" (UID: \"ea8aba87-c190-4442-bd3f-5d2d9e2f38e4\") " Nov 27 16:07:19 crc kubenswrapper[4707]: I1127 16:07:19.867076 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ea8aba87-c190-4442-bd3f-5d2d9e2f38e4-utilities\") pod \"ea8aba87-c190-4442-bd3f-5d2d9e2f38e4\" (UID: \"ea8aba87-c190-4442-bd3f-5d2d9e2f38e4\") " Nov 27 16:07:19 crc kubenswrapper[4707]: I1127 16:07:19.867139 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r7q62\" (UniqueName: \"kubernetes.io/projected/ea8aba87-c190-4442-bd3f-5d2d9e2f38e4-kube-api-access-r7q62\") pod \"ea8aba87-c190-4442-bd3f-5d2d9e2f38e4\" (UID: \"ea8aba87-c190-4442-bd3f-5d2d9e2f38e4\") " Nov 27 16:07:19 crc kubenswrapper[4707]: I1127 16:07:19.868953 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ea8aba87-c190-4442-bd3f-5d2d9e2f38e4-utilities" (OuterVolumeSpecName: "utilities") pod "ea8aba87-c190-4442-bd3f-5d2d9e2f38e4" (UID: 
"ea8aba87-c190-4442-bd3f-5d2d9e2f38e4"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 27 16:07:19 crc kubenswrapper[4707]: I1127 16:07:19.874300 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ea8aba87-c190-4442-bd3f-5d2d9e2f38e4-kube-api-access-r7q62" (OuterVolumeSpecName: "kube-api-access-r7q62") pod "ea8aba87-c190-4442-bd3f-5d2d9e2f38e4" (UID: "ea8aba87-c190-4442-bd3f-5d2d9e2f38e4"). InnerVolumeSpecName "kube-api-access-r7q62". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 16:07:19 crc kubenswrapper[4707]: I1127 16:07:19.907612 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ea8aba87-c190-4442-bd3f-5d2d9e2f38e4-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ea8aba87-c190-4442-bd3f-5d2d9e2f38e4" (UID: "ea8aba87-c190-4442-bd3f-5d2d9e2f38e4"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 27 16:07:19 crc kubenswrapper[4707]: I1127 16:07:19.968261 4707 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ea8aba87-c190-4442-bd3f-5d2d9e2f38e4-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 27 16:07:19 crc kubenswrapper[4707]: I1127 16:07:19.968293 4707 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ea8aba87-c190-4442-bd3f-5d2d9e2f38e4-utilities\") on node \"crc\" DevicePath \"\"" Nov 27 16:07:19 crc kubenswrapper[4707]: I1127 16:07:19.968306 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r7q62\" (UniqueName: \"kubernetes.io/projected/ea8aba87-c190-4442-bd3f-5d2d9e2f38e4-kube-api-access-r7q62\") on node \"crc\" DevicePath \"\"" Nov 27 16:07:20 crc kubenswrapper[4707]: I1127 16:07:20.277742 4707 generic.go:334] "Generic (PLEG): container finished" 
podID="ea8aba87-c190-4442-bd3f-5d2d9e2f38e4" containerID="82686fed4a99fd4966e47606dc35ef1cf90d3b0abb0fbe4b6eb9f11d8eea01a6" exitCode=0 Nov 27 16:07:20 crc kubenswrapper[4707]: I1127 16:07:20.277819 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vfv9n" event={"ID":"ea8aba87-c190-4442-bd3f-5d2d9e2f38e4","Type":"ContainerDied","Data":"82686fed4a99fd4966e47606dc35ef1cf90d3b0abb0fbe4b6eb9f11d8eea01a6"} Nov 27 16:07:20 crc kubenswrapper[4707]: I1127 16:07:20.277839 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-vfv9n" Nov 27 16:07:20 crc kubenswrapper[4707]: I1127 16:07:20.277869 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vfv9n" event={"ID":"ea8aba87-c190-4442-bd3f-5d2d9e2f38e4","Type":"ContainerDied","Data":"e516ea2d56d317b04a415c88d372b50864ee3bf7dd5283a317ad930d4465f053"} Nov 27 16:07:20 crc kubenswrapper[4707]: I1127 16:07:20.277907 4707 scope.go:117] "RemoveContainer" containerID="82686fed4a99fd4966e47606dc35ef1cf90d3b0abb0fbe4b6eb9f11d8eea01a6" Nov 27 16:07:20 crc kubenswrapper[4707]: I1127 16:07:20.305788 4707 scope.go:117] "RemoveContainer" containerID="ee2bd4d06b21c19a1f6399821c816c36adfcef2f7499896d7ed2ba87a59f2bbd" Nov 27 16:07:20 crc kubenswrapper[4707]: I1127 16:07:20.333444 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-vfv9n"] Nov 27 16:07:20 crc kubenswrapper[4707]: I1127 16:07:20.338896 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-vfv9n"] Nov 27 16:07:20 crc kubenswrapper[4707]: I1127 16:07:20.340413 4707 scope.go:117] "RemoveContainer" containerID="933603ddecef30d3a2ab56586105b0709a10841713808e143fb233808a7b271a" Nov 27 16:07:20 crc kubenswrapper[4707]: I1127 16:07:20.364758 4707 scope.go:117] "RemoveContainer" 
containerID="82686fed4a99fd4966e47606dc35ef1cf90d3b0abb0fbe4b6eb9f11d8eea01a6" Nov 27 16:07:20 crc kubenswrapper[4707]: E1127 16:07:20.365274 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"82686fed4a99fd4966e47606dc35ef1cf90d3b0abb0fbe4b6eb9f11d8eea01a6\": container with ID starting with 82686fed4a99fd4966e47606dc35ef1cf90d3b0abb0fbe4b6eb9f11d8eea01a6 not found: ID does not exist" containerID="82686fed4a99fd4966e47606dc35ef1cf90d3b0abb0fbe4b6eb9f11d8eea01a6" Nov 27 16:07:20 crc kubenswrapper[4707]: I1127 16:07:20.365515 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"82686fed4a99fd4966e47606dc35ef1cf90d3b0abb0fbe4b6eb9f11d8eea01a6"} err="failed to get container status \"82686fed4a99fd4966e47606dc35ef1cf90d3b0abb0fbe4b6eb9f11d8eea01a6\": rpc error: code = NotFound desc = could not find container \"82686fed4a99fd4966e47606dc35ef1cf90d3b0abb0fbe4b6eb9f11d8eea01a6\": container with ID starting with 82686fed4a99fd4966e47606dc35ef1cf90d3b0abb0fbe4b6eb9f11d8eea01a6 not found: ID does not exist" Nov 27 16:07:20 crc kubenswrapper[4707]: I1127 16:07:20.365739 4707 scope.go:117] "RemoveContainer" containerID="ee2bd4d06b21c19a1f6399821c816c36adfcef2f7499896d7ed2ba87a59f2bbd" Nov 27 16:07:20 crc kubenswrapper[4707]: E1127 16:07:20.366490 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ee2bd4d06b21c19a1f6399821c816c36adfcef2f7499896d7ed2ba87a59f2bbd\": container with ID starting with ee2bd4d06b21c19a1f6399821c816c36adfcef2f7499896d7ed2ba87a59f2bbd not found: ID does not exist" containerID="ee2bd4d06b21c19a1f6399821c816c36adfcef2f7499896d7ed2ba87a59f2bbd" Nov 27 16:07:20 crc kubenswrapper[4707]: I1127 16:07:20.366543 4707 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"ee2bd4d06b21c19a1f6399821c816c36adfcef2f7499896d7ed2ba87a59f2bbd"} err="failed to get container status \"ee2bd4d06b21c19a1f6399821c816c36adfcef2f7499896d7ed2ba87a59f2bbd\": rpc error: code = NotFound desc = could not find container \"ee2bd4d06b21c19a1f6399821c816c36adfcef2f7499896d7ed2ba87a59f2bbd\": container with ID starting with ee2bd4d06b21c19a1f6399821c816c36adfcef2f7499896d7ed2ba87a59f2bbd not found: ID does not exist" Nov 27 16:07:20 crc kubenswrapper[4707]: I1127 16:07:20.366585 4707 scope.go:117] "RemoveContainer" containerID="933603ddecef30d3a2ab56586105b0709a10841713808e143fb233808a7b271a" Nov 27 16:07:20 crc kubenswrapper[4707]: E1127 16:07:20.367228 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"933603ddecef30d3a2ab56586105b0709a10841713808e143fb233808a7b271a\": container with ID starting with 933603ddecef30d3a2ab56586105b0709a10841713808e143fb233808a7b271a not found: ID does not exist" containerID="933603ddecef30d3a2ab56586105b0709a10841713808e143fb233808a7b271a" Nov 27 16:07:20 crc kubenswrapper[4707]: I1127 16:07:20.367439 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"933603ddecef30d3a2ab56586105b0709a10841713808e143fb233808a7b271a"} err="failed to get container status \"933603ddecef30d3a2ab56586105b0709a10841713808e143fb233808a7b271a\": rpc error: code = NotFound desc = could not find container \"933603ddecef30d3a2ab56586105b0709a10841713808e143fb233808a7b271a\": container with ID starting with 933603ddecef30d3a2ab56586105b0709a10841713808e143fb233808a7b271a not found: ID does not exist" Nov 27 16:07:21 crc kubenswrapper[4707]: I1127 16:07:21.058524 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication/oauth-openshift-558db77b4-jxgm7" podUID="3fdf4f4a-233e-409a-8cd7-dcd6efd5b2cf" containerName="oauth-openshift" 
containerID="cri-o://1b2502aea9b75734859c51be22c13dff311965d64b0a82b9d83678094b25fb90" gracePeriod=15 Nov 27 16:07:21 crc kubenswrapper[4707]: I1127 16:07:21.207117 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ea8aba87-c190-4442-bd3f-5d2d9e2f38e4" path="/var/lib/kubelet/pods/ea8aba87-c190-4442-bd3f-5d2d9e2f38e4/volumes" Nov 27 16:07:21 crc kubenswrapper[4707]: I1127 16:07:21.294754 4707 generic.go:334] "Generic (PLEG): container finished" podID="22f274d1-48a8-4b2e-a80e-4163a9fa069e" containerID="028cf21d1cc166f04cef598fbb560014503d617c23c6d6c0a207226a961f0a27" exitCode=0 Nov 27 16:07:21 crc kubenswrapper[4707]: I1127 16:07:21.294816 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xr65n" event={"ID":"22f274d1-48a8-4b2e-a80e-4163a9fa069e","Type":"ContainerDied","Data":"028cf21d1cc166f04cef598fbb560014503d617c23c6d6c0a207226a961f0a27"} Nov 27 16:07:22 crc kubenswrapper[4707]: I1127 16:07:22.304047 4707 generic.go:334] "Generic (PLEG): container finished" podID="3fdf4f4a-233e-409a-8cd7-dcd6efd5b2cf" containerID="1b2502aea9b75734859c51be22c13dff311965d64b0a82b9d83678094b25fb90" exitCode=0 Nov 27 16:07:22 crc kubenswrapper[4707]: I1127 16:07:22.304106 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-jxgm7" event={"ID":"3fdf4f4a-233e-409a-8cd7-dcd6efd5b2cf","Type":"ContainerDied","Data":"1b2502aea9b75734859c51be22c13dff311965d64b0a82b9d83678094b25fb90"} Nov 27 16:07:22 crc kubenswrapper[4707]: I1127 16:07:22.557010 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-xr65n" Nov 27 16:07:22 crc kubenswrapper[4707]: I1127 16:07:22.609295 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/22f274d1-48a8-4b2e-a80e-4163a9fa069e-catalog-content\") pod \"22f274d1-48a8-4b2e-a80e-4163a9fa069e\" (UID: \"22f274d1-48a8-4b2e-a80e-4163a9fa069e\") " Nov 27 16:07:22 crc kubenswrapper[4707]: I1127 16:07:22.609445 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8wgst\" (UniqueName: \"kubernetes.io/projected/22f274d1-48a8-4b2e-a80e-4163a9fa069e-kube-api-access-8wgst\") pod \"22f274d1-48a8-4b2e-a80e-4163a9fa069e\" (UID: \"22f274d1-48a8-4b2e-a80e-4163a9fa069e\") " Nov 27 16:07:22 crc kubenswrapper[4707]: I1127 16:07:22.609486 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/22f274d1-48a8-4b2e-a80e-4163a9fa069e-utilities\") pod \"22f274d1-48a8-4b2e-a80e-4163a9fa069e\" (UID: \"22f274d1-48a8-4b2e-a80e-4163a9fa069e\") " Nov 27 16:07:22 crc kubenswrapper[4707]: I1127 16:07:22.610217 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/22f274d1-48a8-4b2e-a80e-4163a9fa069e-utilities" (OuterVolumeSpecName: "utilities") pod "22f274d1-48a8-4b2e-a80e-4163a9fa069e" (UID: "22f274d1-48a8-4b2e-a80e-4163a9fa069e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 27 16:07:22 crc kubenswrapper[4707]: I1127 16:07:22.614330 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22f274d1-48a8-4b2e-a80e-4163a9fa069e-kube-api-access-8wgst" (OuterVolumeSpecName: "kube-api-access-8wgst") pod "22f274d1-48a8-4b2e-a80e-4163a9fa069e" (UID: "22f274d1-48a8-4b2e-a80e-4163a9fa069e"). InnerVolumeSpecName "kube-api-access-8wgst". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 16:07:22 crc kubenswrapper[4707]: I1127 16:07:22.662556 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-jxgm7" Nov 27 16:07:22 crc kubenswrapper[4707]: I1127 16:07:22.710817 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8wgst\" (UniqueName: \"kubernetes.io/projected/22f274d1-48a8-4b2e-a80e-4163a9fa069e-kube-api-access-8wgst\") on node \"crc\" DevicePath \"\"" Nov 27 16:07:22 crc kubenswrapper[4707]: I1127 16:07:22.710850 4707 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/22f274d1-48a8-4b2e-a80e-4163a9fa069e-utilities\") on node \"crc\" DevicePath \"\"" Nov 27 16:07:22 crc kubenswrapper[4707]: I1127 16:07:22.811813 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/3fdf4f4a-233e-409a-8cd7-dcd6efd5b2cf-v4-0-config-system-router-certs\") pod \"3fdf4f4a-233e-409a-8cd7-dcd6efd5b2cf\" (UID: \"3fdf4f4a-233e-409a-8cd7-dcd6efd5b2cf\") " Nov 27 16:07:22 crc kubenswrapper[4707]: I1127 16:07:22.811865 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/3fdf4f4a-233e-409a-8cd7-dcd6efd5b2cf-v4-0-config-system-session\") pod \"3fdf4f4a-233e-409a-8cd7-dcd6efd5b2cf\" (UID: \"3fdf4f4a-233e-409a-8cd7-dcd6efd5b2cf\") " Nov 27 16:07:22 crc kubenswrapper[4707]: I1127 16:07:22.811892 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/3fdf4f4a-233e-409a-8cd7-dcd6efd5b2cf-v4-0-config-user-idp-0-file-data\") pod \"3fdf4f4a-233e-409a-8cd7-dcd6efd5b2cf\" (UID: \"3fdf4f4a-233e-409a-8cd7-dcd6efd5b2cf\") " Nov 27 16:07:22 crc kubenswrapper[4707]: I1127 
16:07:22.811921 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/3fdf4f4a-233e-409a-8cd7-dcd6efd5b2cf-v4-0-config-system-serving-cert\") pod \"3fdf4f4a-233e-409a-8cd7-dcd6efd5b2cf\" (UID: \"3fdf4f4a-233e-409a-8cd7-dcd6efd5b2cf\") " Nov 27 16:07:22 crc kubenswrapper[4707]: I1127 16:07:22.811979 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/3fdf4f4a-233e-409a-8cd7-dcd6efd5b2cf-v4-0-config-system-ocp-branding-template\") pod \"3fdf4f4a-233e-409a-8cd7-dcd6efd5b2cf\" (UID: \"3fdf4f4a-233e-409a-8cd7-dcd6efd5b2cf\") " Nov 27 16:07:22 crc kubenswrapper[4707]: I1127 16:07:22.812012 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/3fdf4f4a-233e-409a-8cd7-dcd6efd5b2cf-audit-policies\") pod \"3fdf4f4a-233e-409a-8cd7-dcd6efd5b2cf\" (UID: \"3fdf4f4a-233e-409a-8cd7-dcd6efd5b2cf\") " Nov 27 16:07:22 crc kubenswrapper[4707]: I1127 16:07:22.812043 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/3fdf4f4a-233e-409a-8cd7-dcd6efd5b2cf-v4-0-config-user-template-login\") pod \"3fdf4f4a-233e-409a-8cd7-dcd6efd5b2cf\" (UID: \"3fdf4f4a-233e-409a-8cd7-dcd6efd5b2cf\") " Nov 27 16:07:22 crc kubenswrapper[4707]: I1127 16:07:22.812084 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/3fdf4f4a-233e-409a-8cd7-dcd6efd5b2cf-v4-0-config-system-service-ca\") pod \"3fdf4f4a-233e-409a-8cd7-dcd6efd5b2cf\" (UID: \"3fdf4f4a-233e-409a-8cd7-dcd6efd5b2cf\") " Nov 27 16:07:22 crc kubenswrapper[4707]: I1127 16:07:22.812125 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started 
for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/3fdf4f4a-233e-409a-8cd7-dcd6efd5b2cf-v4-0-config-system-cliconfig\") pod \"3fdf4f4a-233e-409a-8cd7-dcd6efd5b2cf\" (UID: \"3fdf4f4a-233e-409a-8cd7-dcd6efd5b2cf\") " Nov 27 16:07:22 crc kubenswrapper[4707]: I1127 16:07:22.812152 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3fdf4f4a-233e-409a-8cd7-dcd6efd5b2cf-v4-0-config-system-trusted-ca-bundle\") pod \"3fdf4f4a-233e-409a-8cd7-dcd6efd5b2cf\" (UID: \"3fdf4f4a-233e-409a-8cd7-dcd6efd5b2cf\") " Nov 27 16:07:22 crc kubenswrapper[4707]: I1127 16:07:22.812173 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/3fdf4f4a-233e-409a-8cd7-dcd6efd5b2cf-audit-dir\") pod \"3fdf4f4a-233e-409a-8cd7-dcd6efd5b2cf\" (UID: \"3fdf4f4a-233e-409a-8cd7-dcd6efd5b2cf\") " Nov 27 16:07:22 crc kubenswrapper[4707]: I1127 16:07:22.812217 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/3fdf4f4a-233e-409a-8cd7-dcd6efd5b2cf-v4-0-config-user-template-provider-selection\") pod \"3fdf4f4a-233e-409a-8cd7-dcd6efd5b2cf\" (UID: \"3fdf4f4a-233e-409a-8cd7-dcd6efd5b2cf\") " Nov 27 16:07:22 crc kubenswrapper[4707]: I1127 16:07:22.812247 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5zgc6\" (UniqueName: \"kubernetes.io/projected/3fdf4f4a-233e-409a-8cd7-dcd6efd5b2cf-kube-api-access-5zgc6\") pod \"3fdf4f4a-233e-409a-8cd7-dcd6efd5b2cf\" (UID: \"3fdf4f4a-233e-409a-8cd7-dcd6efd5b2cf\") " Nov 27 16:07:22 crc kubenswrapper[4707]: I1127 16:07:22.812308 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: 
\"kubernetes.io/secret/3fdf4f4a-233e-409a-8cd7-dcd6efd5b2cf-v4-0-config-user-template-error\") pod \"3fdf4f4a-233e-409a-8cd7-dcd6efd5b2cf\" (UID: \"3fdf4f4a-233e-409a-8cd7-dcd6efd5b2cf\") " Nov 27 16:07:22 crc kubenswrapper[4707]: I1127 16:07:22.812834 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3fdf4f4a-233e-409a-8cd7-dcd6efd5b2cf-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "3fdf4f4a-233e-409a-8cd7-dcd6efd5b2cf" (UID: "3fdf4f4a-233e-409a-8cd7-dcd6efd5b2cf"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 27 16:07:22 crc kubenswrapper[4707]: I1127 16:07:22.813571 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3fdf4f4a-233e-409a-8cd7-dcd6efd5b2cf-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "3fdf4f4a-233e-409a-8cd7-dcd6efd5b2cf" (UID: "3fdf4f4a-233e-409a-8cd7-dcd6efd5b2cf"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 16:07:22 crc kubenswrapper[4707]: I1127 16:07:22.813607 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3fdf4f4a-233e-409a-8cd7-dcd6efd5b2cf-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "3fdf4f4a-233e-409a-8cd7-dcd6efd5b2cf" (UID: "3fdf4f4a-233e-409a-8cd7-dcd6efd5b2cf"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 16:07:22 crc kubenswrapper[4707]: I1127 16:07:22.813654 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3fdf4f4a-233e-409a-8cd7-dcd6efd5b2cf-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "3fdf4f4a-233e-409a-8cd7-dcd6efd5b2cf" (UID: "3fdf4f4a-233e-409a-8cd7-dcd6efd5b2cf"). InnerVolumeSpecName "v4-0-config-system-cliconfig". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 16:07:22 crc kubenswrapper[4707]: I1127 16:07:22.814308 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3fdf4f4a-233e-409a-8cd7-dcd6efd5b2cf-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "3fdf4f4a-233e-409a-8cd7-dcd6efd5b2cf" (UID: "3fdf4f4a-233e-409a-8cd7-dcd6efd5b2cf"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 16:07:22 crc kubenswrapper[4707]: I1127 16:07:22.815516 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3fdf4f4a-233e-409a-8cd7-dcd6efd5b2cf-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "3fdf4f4a-233e-409a-8cd7-dcd6efd5b2cf" (UID: "3fdf4f4a-233e-409a-8cd7-dcd6efd5b2cf"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 16:07:22 crc kubenswrapper[4707]: I1127 16:07:22.816035 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3fdf4f4a-233e-409a-8cd7-dcd6efd5b2cf-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "3fdf4f4a-233e-409a-8cd7-dcd6efd5b2cf" (UID: "3fdf4f4a-233e-409a-8cd7-dcd6efd5b2cf"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 16:07:22 crc kubenswrapper[4707]: I1127 16:07:22.816237 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3fdf4f4a-233e-409a-8cd7-dcd6efd5b2cf-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "3fdf4f4a-233e-409a-8cd7-dcd6efd5b2cf" (UID: "3fdf4f4a-233e-409a-8cd7-dcd6efd5b2cf"). InnerVolumeSpecName "v4-0-config-system-session". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 16:07:22 crc kubenswrapper[4707]: I1127 16:07:22.816507 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3fdf4f4a-233e-409a-8cd7-dcd6efd5b2cf-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "3fdf4f4a-233e-409a-8cd7-dcd6efd5b2cf" (UID: "3fdf4f4a-233e-409a-8cd7-dcd6efd5b2cf"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 16:07:22 crc kubenswrapper[4707]: I1127 16:07:22.819969 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3fdf4f4a-233e-409a-8cd7-dcd6efd5b2cf-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "3fdf4f4a-233e-409a-8cd7-dcd6efd5b2cf" (UID: "3fdf4f4a-233e-409a-8cd7-dcd6efd5b2cf"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 16:07:22 crc kubenswrapper[4707]: I1127 16:07:22.820008 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3fdf4f4a-233e-409a-8cd7-dcd6efd5b2cf-kube-api-access-5zgc6" (OuterVolumeSpecName: "kube-api-access-5zgc6") pod "3fdf4f4a-233e-409a-8cd7-dcd6efd5b2cf" (UID: "3fdf4f4a-233e-409a-8cd7-dcd6efd5b2cf"). InnerVolumeSpecName "kube-api-access-5zgc6". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 16:07:22 crc kubenswrapper[4707]: I1127 16:07:22.820129 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3fdf4f4a-233e-409a-8cd7-dcd6efd5b2cf-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "3fdf4f4a-233e-409a-8cd7-dcd6efd5b2cf" (UID: "3fdf4f4a-233e-409a-8cd7-dcd6efd5b2cf"). InnerVolumeSpecName "v4-0-config-user-template-login". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 16:07:22 crc kubenswrapper[4707]: I1127 16:07:22.821918 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3fdf4f4a-233e-409a-8cd7-dcd6efd5b2cf-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "3fdf4f4a-233e-409a-8cd7-dcd6efd5b2cf" (UID: "3fdf4f4a-233e-409a-8cd7-dcd6efd5b2cf"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 16:07:22 crc kubenswrapper[4707]: I1127 16:07:22.830667 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3fdf4f4a-233e-409a-8cd7-dcd6efd5b2cf-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "3fdf4f4a-233e-409a-8cd7-dcd6efd5b2cf" (UID: "3fdf4f4a-233e-409a-8cd7-dcd6efd5b2cf"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 16:07:22 crc kubenswrapper[4707]: I1127 16:07:22.913918 4707 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/3fdf4f4a-233e-409a-8cd7-dcd6efd5b2cf-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Nov 27 16:07:22 crc kubenswrapper[4707]: I1127 16:07:22.914147 4707 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/3fdf4f4a-233e-409a-8cd7-dcd6efd5b2cf-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Nov 27 16:07:22 crc kubenswrapper[4707]: I1127 16:07:22.914209 4707 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/3fdf4f4a-233e-409a-8cd7-dcd6efd5b2cf-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Nov 27 16:07:22 crc kubenswrapper[4707]: I1127 
16:07:22.914285 4707 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/3fdf4f4a-233e-409a-8cd7-dcd6efd5b2cf-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 27 16:07:22 crc kubenswrapper[4707]: I1127 16:07:22.914342 4707 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/3fdf4f4a-233e-409a-8cd7-dcd6efd5b2cf-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Nov 27 16:07:22 crc kubenswrapper[4707]: I1127 16:07:22.914418 4707 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/3fdf4f4a-233e-409a-8cd7-dcd6efd5b2cf-audit-policies\") on node \"crc\" DevicePath \"\"" Nov 27 16:07:22 crc kubenswrapper[4707]: I1127 16:07:22.914486 4707 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/3fdf4f4a-233e-409a-8cd7-dcd6efd5b2cf-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Nov 27 16:07:22 crc kubenswrapper[4707]: I1127 16:07:22.914542 4707 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/3fdf4f4a-233e-409a-8cd7-dcd6efd5b2cf-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Nov 27 16:07:22 crc kubenswrapper[4707]: I1127 16:07:22.914595 4707 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/3fdf4f4a-233e-409a-8cd7-dcd6efd5b2cf-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Nov 27 16:07:22 crc kubenswrapper[4707]: I1127 16:07:22.914662 4707 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/3fdf4f4a-233e-409a-8cd7-dcd6efd5b2cf-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 27 16:07:22 crc kubenswrapper[4707]: I1127 16:07:22.914717 4707 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/3fdf4f4a-233e-409a-8cd7-dcd6efd5b2cf-audit-dir\") on node \"crc\" DevicePath \"\"" Nov 27 16:07:22 crc kubenswrapper[4707]: I1127 16:07:22.914772 4707 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/3fdf4f4a-233e-409a-8cd7-dcd6efd5b2cf-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Nov 27 16:07:22 crc kubenswrapper[4707]: I1127 16:07:22.914828 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5zgc6\" (UniqueName: \"kubernetes.io/projected/3fdf4f4a-233e-409a-8cd7-dcd6efd5b2cf-kube-api-access-5zgc6\") on node \"crc\" DevicePath \"\"" Nov 27 16:07:22 crc kubenswrapper[4707]: I1127 16:07:22.914891 4707 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/3fdf4f4a-233e-409a-8cd7-dcd6efd5b2cf-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Nov 27 16:07:23 crc kubenswrapper[4707]: I1127 16:07:23.053855 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/22f274d1-48a8-4b2e-a80e-4163a9fa069e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "22f274d1-48a8-4b2e-a80e-4163a9fa069e" (UID: "22f274d1-48a8-4b2e-a80e-4163a9fa069e"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 27 16:07:23 crc kubenswrapper[4707]: I1127 16:07:23.118497 4707 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/22f274d1-48a8-4b2e-a80e-4163a9fa069e-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 27 16:07:23 crc kubenswrapper[4707]: I1127 16:07:23.316127 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xr65n" event={"ID":"22f274d1-48a8-4b2e-a80e-4163a9fa069e","Type":"ContainerDied","Data":"9d251cb0b8a97801b6ac3f22df4299e8beef2b6e5d29a13693c813f43fa771e0"} Nov 27 16:07:23 crc kubenswrapper[4707]: I1127 16:07:23.316251 4707 scope.go:117] "RemoveContainer" containerID="028cf21d1cc166f04cef598fbb560014503d617c23c6d6c0a207226a961f0a27" Nov 27 16:07:23 crc kubenswrapper[4707]: I1127 16:07:23.316435 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-xr65n" Nov 27 16:07:23 crc kubenswrapper[4707]: I1127 16:07:23.320307 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-jxgm7" event={"ID":"3fdf4f4a-233e-409a-8cd7-dcd6efd5b2cf","Type":"ContainerDied","Data":"8d98173d610ce56df91ded86fe8e461d5e44b6d95c467d0f07607215081f4dcd"} Nov 27 16:07:23 crc kubenswrapper[4707]: I1127 16:07:23.320446 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-jxgm7" Nov 27 16:07:23 crc kubenswrapper[4707]: I1127 16:07:23.348957 4707 scope.go:117] "RemoveContainer" containerID="273e7d37b7fbb9ddca1feedc6754ba33abf58a1bd49c23a48c97c20461d0b7b5" Nov 27 16:07:23 crc kubenswrapper[4707]: I1127 16:07:23.363335 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-xr65n"] Nov 27 16:07:23 crc kubenswrapper[4707]: I1127 16:07:23.369538 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-xr65n"] Nov 27 16:07:23 crc kubenswrapper[4707]: I1127 16:07:23.377611 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-jxgm7"] Nov 27 16:07:23 crc kubenswrapper[4707]: I1127 16:07:23.383554 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-jxgm7"] Nov 27 16:07:23 crc kubenswrapper[4707]: I1127 16:07:23.387734 4707 scope.go:117] "RemoveContainer" containerID="d6d3f96814f614e7cb456adc4d7c795305cd729a46243ea889046772bb72d413" Nov 27 16:07:23 crc kubenswrapper[4707]: I1127 16:07:23.407548 4707 scope.go:117] "RemoveContainer" containerID="1b2502aea9b75734859c51be22c13dff311965d64b0a82b9d83678094b25fb90" Nov 27 16:07:23 crc kubenswrapper[4707]: I1127 16:07:23.456867 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-57fvk" Nov 27 16:07:24 crc kubenswrapper[4707]: I1127 16:07:24.914041 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-xhshk" Nov 27 16:07:24 crc kubenswrapper[4707]: I1127 16:07:24.914294 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-xhshk" Nov 27 16:07:24 crc kubenswrapper[4707]: I1127 16:07:24.965245 4707 kubelet.go:2542] "SyncLoop 
(probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-xhshk" Nov 27 16:07:25 crc kubenswrapper[4707]: I1127 16:07:25.202791 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22f274d1-48a8-4b2e-a80e-4163a9fa069e" path="/var/lib/kubelet/pods/22f274d1-48a8-4b2e-a80e-4163a9fa069e/volumes" Nov 27 16:07:25 crc kubenswrapper[4707]: I1127 16:07:25.204542 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3fdf4f4a-233e-409a-8cd7-dcd6efd5b2cf" path="/var/lib/kubelet/pods/3fdf4f4a-233e-409a-8cd7-dcd6efd5b2cf/volumes" Nov 27 16:07:25 crc kubenswrapper[4707]: I1127 16:07:25.419672 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-xhshk" Nov 27 16:07:25 crc kubenswrapper[4707]: I1127 16:07:25.850792 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-57fvk"] Nov 27 16:07:25 crc kubenswrapper[4707]: I1127 16:07:25.851146 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-57fvk" podUID="10e9e536-d557-44af-ae4d-7472fc20ee37" containerName="registry-server" containerID="cri-o://7072c5ee2cdff715fac4399a36ad6275d925dbdb7f3eadc56ab15edeb695da40" gracePeriod=2 Nov 27 16:07:26 crc kubenswrapper[4707]: I1127 16:07:26.301988 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-57fvk" Nov 27 16:07:26 crc kubenswrapper[4707]: I1127 16:07:26.359895 4707 generic.go:334] "Generic (PLEG): container finished" podID="10e9e536-d557-44af-ae4d-7472fc20ee37" containerID="7072c5ee2cdff715fac4399a36ad6275d925dbdb7f3eadc56ab15edeb695da40" exitCode=0 Nov 27 16:07:26 crc kubenswrapper[4707]: I1127 16:07:26.360523 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-57fvk" event={"ID":"10e9e536-d557-44af-ae4d-7472fc20ee37","Type":"ContainerDied","Data":"7072c5ee2cdff715fac4399a36ad6275d925dbdb7f3eadc56ab15edeb695da40"} Nov 27 16:07:26 crc kubenswrapper[4707]: I1127 16:07:26.360614 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-57fvk" Nov 27 16:07:26 crc kubenswrapper[4707]: I1127 16:07:26.360646 4707 scope.go:117] "RemoveContainer" containerID="7072c5ee2cdff715fac4399a36ad6275d925dbdb7f3eadc56ab15edeb695da40" Nov 27 16:07:26 crc kubenswrapper[4707]: I1127 16:07:26.360628 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-57fvk" event={"ID":"10e9e536-d557-44af-ae4d-7472fc20ee37","Type":"ContainerDied","Data":"fec1dedbb4165174711d46771f90496c2facef1ca99b15743760606f71d921c6"} Nov 27 16:07:26 crc kubenswrapper[4707]: I1127 16:07:26.368777 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s9g5w\" (UniqueName: \"kubernetes.io/projected/10e9e536-d557-44af-ae4d-7472fc20ee37-kube-api-access-s9g5w\") pod \"10e9e536-d557-44af-ae4d-7472fc20ee37\" (UID: \"10e9e536-d557-44af-ae4d-7472fc20ee37\") " Nov 27 16:07:26 crc kubenswrapper[4707]: I1127 16:07:26.368867 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/10e9e536-d557-44af-ae4d-7472fc20ee37-catalog-content\") pod 
\"10e9e536-d557-44af-ae4d-7472fc20ee37\" (UID: \"10e9e536-d557-44af-ae4d-7472fc20ee37\") " Nov 27 16:07:26 crc kubenswrapper[4707]: I1127 16:07:26.368917 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/10e9e536-d557-44af-ae4d-7472fc20ee37-utilities\") pod \"10e9e536-d557-44af-ae4d-7472fc20ee37\" (UID: \"10e9e536-d557-44af-ae4d-7472fc20ee37\") " Nov 27 16:07:26 crc kubenswrapper[4707]: I1127 16:07:26.369823 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/10e9e536-d557-44af-ae4d-7472fc20ee37-utilities" (OuterVolumeSpecName: "utilities") pod "10e9e536-d557-44af-ae4d-7472fc20ee37" (UID: "10e9e536-d557-44af-ae4d-7472fc20ee37"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 27 16:07:26 crc kubenswrapper[4707]: I1127 16:07:26.381390 4707 scope.go:117] "RemoveContainer" containerID="4606ce324026f2348c84b7f1fe6cbec6381ad5afd5c3310e2e517f7f3a5362cf" Nov 27 16:07:26 crc kubenswrapper[4707]: I1127 16:07:26.387202 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/10e9e536-d557-44af-ae4d-7472fc20ee37-kube-api-access-s9g5w" (OuterVolumeSpecName: "kube-api-access-s9g5w") pod "10e9e536-d557-44af-ae4d-7472fc20ee37" (UID: "10e9e536-d557-44af-ae4d-7472fc20ee37"). InnerVolumeSpecName "kube-api-access-s9g5w". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 16:07:26 crc kubenswrapper[4707]: I1127 16:07:26.397521 4707 scope.go:117] "RemoveContainer" containerID="e93cbfe868f5e4777bd0769b6e1dca827960ea9507d9c7025c22f72fb0392f26" Nov 27 16:07:26 crc kubenswrapper[4707]: I1127 16:07:26.421441 4707 scope.go:117] "RemoveContainer" containerID="7072c5ee2cdff715fac4399a36ad6275d925dbdb7f3eadc56ab15edeb695da40" Nov 27 16:07:26 crc kubenswrapper[4707]: E1127 16:07:26.421932 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7072c5ee2cdff715fac4399a36ad6275d925dbdb7f3eadc56ab15edeb695da40\": container with ID starting with 7072c5ee2cdff715fac4399a36ad6275d925dbdb7f3eadc56ab15edeb695da40 not found: ID does not exist" containerID="7072c5ee2cdff715fac4399a36ad6275d925dbdb7f3eadc56ab15edeb695da40" Nov 27 16:07:26 crc kubenswrapper[4707]: I1127 16:07:26.421977 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7072c5ee2cdff715fac4399a36ad6275d925dbdb7f3eadc56ab15edeb695da40"} err="failed to get container status \"7072c5ee2cdff715fac4399a36ad6275d925dbdb7f3eadc56ab15edeb695da40\": rpc error: code = NotFound desc = could not find container \"7072c5ee2cdff715fac4399a36ad6275d925dbdb7f3eadc56ab15edeb695da40\": container with ID starting with 7072c5ee2cdff715fac4399a36ad6275d925dbdb7f3eadc56ab15edeb695da40 not found: ID does not exist" Nov 27 16:07:26 crc kubenswrapper[4707]: I1127 16:07:26.422001 4707 scope.go:117] "RemoveContainer" containerID="4606ce324026f2348c84b7f1fe6cbec6381ad5afd5c3310e2e517f7f3a5362cf" Nov 27 16:07:26 crc kubenswrapper[4707]: E1127 16:07:26.422393 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4606ce324026f2348c84b7f1fe6cbec6381ad5afd5c3310e2e517f7f3a5362cf\": container with ID starting with 
4606ce324026f2348c84b7f1fe6cbec6381ad5afd5c3310e2e517f7f3a5362cf not found: ID does not exist" containerID="4606ce324026f2348c84b7f1fe6cbec6381ad5afd5c3310e2e517f7f3a5362cf" Nov 27 16:07:26 crc kubenswrapper[4707]: I1127 16:07:26.422414 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4606ce324026f2348c84b7f1fe6cbec6381ad5afd5c3310e2e517f7f3a5362cf"} err="failed to get container status \"4606ce324026f2348c84b7f1fe6cbec6381ad5afd5c3310e2e517f7f3a5362cf\": rpc error: code = NotFound desc = could not find container \"4606ce324026f2348c84b7f1fe6cbec6381ad5afd5c3310e2e517f7f3a5362cf\": container with ID starting with 4606ce324026f2348c84b7f1fe6cbec6381ad5afd5c3310e2e517f7f3a5362cf not found: ID does not exist" Nov 27 16:07:26 crc kubenswrapper[4707]: I1127 16:07:26.422426 4707 scope.go:117] "RemoveContainer" containerID="e93cbfe868f5e4777bd0769b6e1dca827960ea9507d9c7025c22f72fb0392f26" Nov 27 16:07:26 crc kubenswrapper[4707]: E1127 16:07:26.422645 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e93cbfe868f5e4777bd0769b6e1dca827960ea9507d9c7025c22f72fb0392f26\": container with ID starting with e93cbfe868f5e4777bd0769b6e1dca827960ea9507d9c7025c22f72fb0392f26 not found: ID does not exist" containerID="e93cbfe868f5e4777bd0769b6e1dca827960ea9507d9c7025c22f72fb0392f26" Nov 27 16:07:26 crc kubenswrapper[4707]: I1127 16:07:26.422663 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e93cbfe868f5e4777bd0769b6e1dca827960ea9507d9c7025c22f72fb0392f26"} err="failed to get container status \"e93cbfe868f5e4777bd0769b6e1dca827960ea9507d9c7025c22f72fb0392f26\": rpc error: code = NotFound desc = could not find container \"e93cbfe868f5e4777bd0769b6e1dca827960ea9507d9c7025c22f72fb0392f26\": container with ID starting with e93cbfe868f5e4777bd0769b6e1dca827960ea9507d9c7025c22f72fb0392f26 not found: ID does not 
exist" Nov 27 16:07:26 crc kubenswrapper[4707]: I1127 16:07:26.422695 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/10e9e536-d557-44af-ae4d-7472fc20ee37-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "10e9e536-d557-44af-ae4d-7472fc20ee37" (UID: "10e9e536-d557-44af-ae4d-7472fc20ee37"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 27 16:07:26 crc kubenswrapper[4707]: I1127 16:07:26.470543 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s9g5w\" (UniqueName: \"kubernetes.io/projected/10e9e536-d557-44af-ae4d-7472fc20ee37-kube-api-access-s9g5w\") on node \"crc\" DevicePath \"\"" Nov 27 16:07:26 crc kubenswrapper[4707]: I1127 16:07:26.470584 4707 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/10e9e536-d557-44af-ae4d-7472fc20ee37-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 27 16:07:26 crc kubenswrapper[4707]: I1127 16:07:26.470593 4707 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/10e9e536-d557-44af-ae4d-7472fc20ee37-utilities\") on node \"crc\" DevicePath \"\"" Nov 27 16:07:26 crc kubenswrapper[4707]: I1127 16:07:26.689590 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-57fvk"] Nov 27 16:07:26 crc kubenswrapper[4707]: I1127 16:07:26.693135 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-57fvk"] Nov 27 16:07:27 crc kubenswrapper[4707]: I1127 16:07:27.208746 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="10e9e536-d557-44af-ae4d-7472fc20ee37" path="/var/lib/kubelet/pods/10e9e536-d557-44af-ae4d-7472fc20ee37/volumes" Nov 27 16:07:28 crc kubenswrapper[4707]: I1127 16:07:28.627691 4707 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-authentication/oauth-openshift-54b5c98c4-b44q7"] Nov 27 16:07:28 crc kubenswrapper[4707]: E1127 16:07:28.627938 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ea8aba87-c190-4442-bd3f-5d2d9e2f38e4" containerName="registry-server" Nov 27 16:07:28 crc kubenswrapper[4707]: I1127 16:07:28.627955 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="ea8aba87-c190-4442-bd3f-5d2d9e2f38e4" containerName="registry-server" Nov 27 16:07:28 crc kubenswrapper[4707]: E1127 16:07:28.627970 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="22f274d1-48a8-4b2e-a80e-4163a9fa069e" containerName="registry-server" Nov 27 16:07:28 crc kubenswrapper[4707]: I1127 16:07:28.627978 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="22f274d1-48a8-4b2e-a80e-4163a9fa069e" containerName="registry-server" Nov 27 16:07:28 crc kubenswrapper[4707]: E1127 16:07:28.627989 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2832c567-f82e-487a-8e20-f5d5b10168f3" containerName="registry-server" Nov 27 16:07:28 crc kubenswrapper[4707]: I1127 16:07:28.627998 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="2832c567-f82e-487a-8e20-f5d5b10168f3" containerName="registry-server" Nov 27 16:07:28 crc kubenswrapper[4707]: E1127 16:07:28.628014 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="10e9e536-d557-44af-ae4d-7472fc20ee37" containerName="extract-content" Nov 27 16:07:28 crc kubenswrapper[4707]: I1127 16:07:28.628022 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="10e9e536-d557-44af-ae4d-7472fc20ee37" containerName="extract-content" Nov 27 16:07:28 crc kubenswrapper[4707]: E1127 16:07:28.628033 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3fdf4f4a-233e-409a-8cd7-dcd6efd5b2cf" containerName="oauth-openshift" Nov 27 16:07:28 crc kubenswrapper[4707]: I1127 16:07:28.628042 4707 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="3fdf4f4a-233e-409a-8cd7-dcd6efd5b2cf" containerName="oauth-openshift" Nov 27 16:07:28 crc kubenswrapper[4707]: E1127 16:07:28.628053 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2832c567-f82e-487a-8e20-f5d5b10168f3" containerName="extract-utilities" Nov 27 16:07:28 crc kubenswrapper[4707]: I1127 16:07:28.628063 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="2832c567-f82e-487a-8e20-f5d5b10168f3" containerName="extract-utilities" Nov 27 16:07:28 crc kubenswrapper[4707]: E1127 16:07:28.628075 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="22f274d1-48a8-4b2e-a80e-4163a9fa069e" containerName="extract-utilities" Nov 27 16:07:28 crc kubenswrapper[4707]: I1127 16:07:28.628083 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="22f274d1-48a8-4b2e-a80e-4163a9fa069e" containerName="extract-utilities" Nov 27 16:07:28 crc kubenswrapper[4707]: E1127 16:07:28.628121 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="10e9e536-d557-44af-ae4d-7472fc20ee37" containerName="registry-server" Nov 27 16:07:28 crc kubenswrapper[4707]: I1127 16:07:28.628130 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="10e9e536-d557-44af-ae4d-7472fc20ee37" containerName="registry-server" Nov 27 16:07:28 crc kubenswrapper[4707]: E1127 16:07:28.628139 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="10e9e536-d557-44af-ae4d-7472fc20ee37" containerName="extract-utilities" Nov 27 16:07:28 crc kubenswrapper[4707]: I1127 16:07:28.628146 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="10e9e536-d557-44af-ae4d-7472fc20ee37" containerName="extract-utilities" Nov 27 16:07:28 crc kubenswrapper[4707]: E1127 16:07:28.628157 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="22f274d1-48a8-4b2e-a80e-4163a9fa069e" containerName="extract-content" Nov 27 16:07:28 crc kubenswrapper[4707]: I1127 16:07:28.628164 4707 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="22f274d1-48a8-4b2e-a80e-4163a9fa069e" containerName="extract-content" Nov 27 16:07:28 crc kubenswrapper[4707]: E1127 16:07:28.628175 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2832c567-f82e-487a-8e20-f5d5b10168f3" containerName="extract-content" Nov 27 16:07:28 crc kubenswrapper[4707]: I1127 16:07:28.628182 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="2832c567-f82e-487a-8e20-f5d5b10168f3" containerName="extract-content" Nov 27 16:07:28 crc kubenswrapper[4707]: E1127 16:07:28.628191 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ea8aba87-c190-4442-bd3f-5d2d9e2f38e4" containerName="extract-utilities" Nov 27 16:07:28 crc kubenswrapper[4707]: I1127 16:07:28.628199 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="ea8aba87-c190-4442-bd3f-5d2d9e2f38e4" containerName="extract-utilities" Nov 27 16:07:28 crc kubenswrapper[4707]: E1127 16:07:28.628211 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ea8aba87-c190-4442-bd3f-5d2d9e2f38e4" containerName="extract-content" Nov 27 16:07:28 crc kubenswrapper[4707]: I1127 16:07:28.628219 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="ea8aba87-c190-4442-bd3f-5d2d9e2f38e4" containerName="extract-content" Nov 27 16:07:28 crc kubenswrapper[4707]: I1127 16:07:28.628334 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="22f274d1-48a8-4b2e-a80e-4163a9fa069e" containerName="registry-server" Nov 27 16:07:28 crc kubenswrapper[4707]: I1127 16:07:28.628348 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="2832c567-f82e-487a-8e20-f5d5b10168f3" containerName="registry-server" Nov 27 16:07:28 crc kubenswrapper[4707]: I1127 16:07:28.628362 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="10e9e536-d557-44af-ae4d-7472fc20ee37" containerName="registry-server" Nov 27 16:07:28 crc kubenswrapper[4707]: I1127 16:07:28.628389 4707 memory_manager.go:354] "RemoveStaleState removing 
state" podUID="ea8aba87-c190-4442-bd3f-5d2d9e2f38e4" containerName="registry-server" Nov 27 16:07:28 crc kubenswrapper[4707]: I1127 16:07:28.628403 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="3fdf4f4a-233e-409a-8cd7-dcd6efd5b2cf" containerName="oauth-openshift" Nov 27 16:07:28 crc kubenswrapper[4707]: I1127 16:07:28.628820 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-54b5c98c4-b44q7" Nov 27 16:07:28 crc kubenswrapper[4707]: I1127 16:07:28.631329 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Nov 27 16:07:28 crc kubenswrapper[4707]: I1127 16:07:28.631350 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Nov 27 16:07:28 crc kubenswrapper[4707]: I1127 16:07:28.631634 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Nov 27 16:07:28 crc kubenswrapper[4707]: I1127 16:07:28.631642 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Nov 27 16:07:28 crc kubenswrapper[4707]: I1127 16:07:28.631917 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Nov 27 16:07:28 crc kubenswrapper[4707]: I1127 16:07:28.634250 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Nov 27 16:07:28 crc kubenswrapper[4707]: I1127 16:07:28.634358 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Nov 27 16:07:28 crc kubenswrapper[4707]: I1127 16:07:28.634897 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Nov 27 
16:07:28 crc kubenswrapper[4707]: I1127 16:07:28.634968 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Nov 27 16:07:28 crc kubenswrapper[4707]: I1127 16:07:28.635014 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Nov 27 16:07:28 crc kubenswrapper[4707]: I1127 16:07:28.635198 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Nov 27 16:07:28 crc kubenswrapper[4707]: I1127 16:07:28.636459 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Nov 27 16:07:28 crc kubenswrapper[4707]: I1127 16:07:28.653957 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-54b5c98c4-b44q7"] Nov 27 16:07:28 crc kubenswrapper[4707]: I1127 16:07:28.656394 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Nov 27 16:07:28 crc kubenswrapper[4707]: I1127 16:07:28.658214 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Nov 27 16:07:28 crc kubenswrapper[4707]: I1127 16:07:28.667011 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Nov 27 16:07:28 crc kubenswrapper[4707]: I1127 16:07:28.707616 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/6dc7e00e-4c59-4e40-9d77-c4d393bceb70-v4-0-config-user-template-error\") pod \"oauth-openshift-54b5c98c4-b44q7\" (UID: \"6dc7e00e-4c59-4e40-9d77-c4d393bceb70\") " pod="openshift-authentication/oauth-openshift-54b5c98c4-b44q7" Nov 27 
16:07:28 crc kubenswrapper[4707]: I1127 16:07:28.707682 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/6dc7e00e-4c59-4e40-9d77-c4d393bceb70-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-54b5c98c4-b44q7\" (UID: \"6dc7e00e-4c59-4e40-9d77-c4d393bceb70\") " pod="openshift-authentication/oauth-openshift-54b5c98c4-b44q7" Nov 27 16:07:28 crc kubenswrapper[4707]: I1127 16:07:28.707712 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/6dc7e00e-4c59-4e40-9d77-c4d393bceb70-audit-dir\") pod \"oauth-openshift-54b5c98c4-b44q7\" (UID: \"6dc7e00e-4c59-4e40-9d77-c4d393bceb70\") " pod="openshift-authentication/oauth-openshift-54b5c98c4-b44q7" Nov 27 16:07:28 crc kubenswrapper[4707]: I1127 16:07:28.707757 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/6dc7e00e-4c59-4e40-9d77-c4d393bceb70-audit-policies\") pod \"oauth-openshift-54b5c98c4-b44q7\" (UID: \"6dc7e00e-4c59-4e40-9d77-c4d393bceb70\") " pod="openshift-authentication/oauth-openshift-54b5c98c4-b44q7" Nov 27 16:07:28 crc kubenswrapper[4707]: I1127 16:07:28.707782 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/6dc7e00e-4c59-4e40-9d77-c4d393bceb70-v4-0-config-system-serving-cert\") pod \"oauth-openshift-54b5c98c4-b44q7\" (UID: \"6dc7e00e-4c59-4e40-9d77-c4d393bceb70\") " pod="openshift-authentication/oauth-openshift-54b5c98c4-b44q7" Nov 27 16:07:28 crc kubenswrapper[4707]: I1127 16:07:28.707820 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: 
\"kubernetes.io/secret/6dc7e00e-4c59-4e40-9d77-c4d393bceb70-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-54b5c98c4-b44q7\" (UID: \"6dc7e00e-4c59-4e40-9d77-c4d393bceb70\") " pod="openshift-authentication/oauth-openshift-54b5c98c4-b44q7" Nov 27 16:07:28 crc kubenswrapper[4707]: I1127 16:07:28.707848 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/6dc7e00e-4c59-4e40-9d77-c4d393bceb70-v4-0-config-user-template-login\") pod \"oauth-openshift-54b5c98c4-b44q7\" (UID: \"6dc7e00e-4c59-4e40-9d77-c4d393bceb70\") " pod="openshift-authentication/oauth-openshift-54b5c98c4-b44q7" Nov 27 16:07:28 crc kubenswrapper[4707]: I1127 16:07:28.707891 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/6dc7e00e-4c59-4e40-9d77-c4d393bceb70-v4-0-config-system-router-certs\") pod \"oauth-openshift-54b5c98c4-b44q7\" (UID: \"6dc7e00e-4c59-4e40-9d77-c4d393bceb70\") " pod="openshift-authentication/oauth-openshift-54b5c98c4-b44q7" Nov 27 16:07:28 crc kubenswrapper[4707]: I1127 16:07:28.707930 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/6dc7e00e-4c59-4e40-9d77-c4d393bceb70-v4-0-config-system-service-ca\") pod \"oauth-openshift-54b5c98c4-b44q7\" (UID: \"6dc7e00e-4c59-4e40-9d77-c4d393bceb70\") " pod="openshift-authentication/oauth-openshift-54b5c98c4-b44q7" Nov 27 16:07:28 crc kubenswrapper[4707]: I1127 16:07:28.707963 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6dc7e00e-4c59-4e40-9d77-c4d393bceb70-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-54b5c98c4-b44q7\" 
(UID: \"6dc7e00e-4c59-4e40-9d77-c4d393bceb70\") " pod="openshift-authentication/oauth-openshift-54b5c98c4-b44q7" Nov 27 16:07:28 crc kubenswrapper[4707]: I1127 16:07:28.707989 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/6dc7e00e-4c59-4e40-9d77-c4d393bceb70-v4-0-config-system-session\") pod \"oauth-openshift-54b5c98c4-b44q7\" (UID: \"6dc7e00e-4c59-4e40-9d77-c4d393bceb70\") " pod="openshift-authentication/oauth-openshift-54b5c98c4-b44q7" Nov 27 16:07:28 crc kubenswrapper[4707]: I1127 16:07:28.708013 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/6dc7e00e-4c59-4e40-9d77-c4d393bceb70-v4-0-config-system-cliconfig\") pod \"oauth-openshift-54b5c98c4-b44q7\" (UID: \"6dc7e00e-4c59-4e40-9d77-c4d393bceb70\") " pod="openshift-authentication/oauth-openshift-54b5c98c4-b44q7" Nov 27 16:07:28 crc kubenswrapper[4707]: I1127 16:07:28.708027 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/6dc7e00e-4c59-4e40-9d77-c4d393bceb70-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-54b5c98c4-b44q7\" (UID: \"6dc7e00e-4c59-4e40-9d77-c4d393bceb70\") " pod="openshift-authentication/oauth-openshift-54b5c98c4-b44q7" Nov 27 16:07:28 crc kubenswrapper[4707]: I1127 16:07:28.708042 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8t6sh\" (UniqueName: \"kubernetes.io/projected/6dc7e00e-4c59-4e40-9d77-c4d393bceb70-kube-api-access-8t6sh\") pod \"oauth-openshift-54b5c98c4-b44q7\" (UID: \"6dc7e00e-4c59-4e40-9d77-c4d393bceb70\") " pod="openshift-authentication/oauth-openshift-54b5c98c4-b44q7" Nov 27 16:07:28 crc kubenswrapper[4707]: I1127 
16:07:28.809288 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/6dc7e00e-4c59-4e40-9d77-c4d393bceb70-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-54b5c98c4-b44q7\" (UID: \"6dc7e00e-4c59-4e40-9d77-c4d393bceb70\") " pod="openshift-authentication/oauth-openshift-54b5c98c4-b44q7" Nov 27 16:07:28 crc kubenswrapper[4707]: I1127 16:07:28.809656 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/6dc7e00e-4c59-4e40-9d77-c4d393bceb70-v4-0-config-user-template-login\") pod \"oauth-openshift-54b5c98c4-b44q7\" (UID: \"6dc7e00e-4c59-4e40-9d77-c4d393bceb70\") " pod="openshift-authentication/oauth-openshift-54b5c98c4-b44q7" Nov 27 16:07:28 crc kubenswrapper[4707]: I1127 16:07:28.809697 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/6dc7e00e-4c59-4e40-9d77-c4d393bceb70-v4-0-config-system-router-certs\") pod \"oauth-openshift-54b5c98c4-b44q7\" (UID: \"6dc7e00e-4c59-4e40-9d77-c4d393bceb70\") " pod="openshift-authentication/oauth-openshift-54b5c98c4-b44q7" Nov 27 16:07:28 crc kubenswrapper[4707]: I1127 16:07:28.809722 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/6dc7e00e-4c59-4e40-9d77-c4d393bceb70-v4-0-config-system-service-ca\") pod \"oauth-openshift-54b5c98c4-b44q7\" (UID: \"6dc7e00e-4c59-4e40-9d77-c4d393bceb70\") " pod="openshift-authentication/oauth-openshift-54b5c98c4-b44q7" Nov 27 16:07:28 crc kubenswrapper[4707]: I1127 16:07:28.809746 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/6dc7e00e-4c59-4e40-9d77-c4d393bceb70-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-54b5c98c4-b44q7\" (UID: \"6dc7e00e-4c59-4e40-9d77-c4d393bceb70\") " pod="openshift-authentication/oauth-openshift-54b5c98c4-b44q7" Nov 27 16:07:28 crc kubenswrapper[4707]: I1127 16:07:28.809783 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/6dc7e00e-4c59-4e40-9d77-c4d393bceb70-v4-0-config-system-session\") pod \"oauth-openshift-54b5c98c4-b44q7\" (UID: \"6dc7e00e-4c59-4e40-9d77-c4d393bceb70\") " pod="openshift-authentication/oauth-openshift-54b5c98c4-b44q7" Nov 27 16:07:28 crc kubenswrapper[4707]: I1127 16:07:28.809817 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/6dc7e00e-4c59-4e40-9d77-c4d393bceb70-v4-0-config-system-cliconfig\") pod \"oauth-openshift-54b5c98c4-b44q7\" (UID: \"6dc7e00e-4c59-4e40-9d77-c4d393bceb70\") " pod="openshift-authentication/oauth-openshift-54b5c98c4-b44q7" Nov 27 16:07:28 crc kubenswrapper[4707]: I1127 16:07:28.809840 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/6dc7e00e-4c59-4e40-9d77-c4d393bceb70-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-54b5c98c4-b44q7\" (UID: \"6dc7e00e-4c59-4e40-9d77-c4d393bceb70\") " pod="openshift-authentication/oauth-openshift-54b5c98c4-b44q7" Nov 27 16:07:28 crc kubenswrapper[4707]: I1127 16:07:28.809867 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8t6sh\" (UniqueName: \"kubernetes.io/projected/6dc7e00e-4c59-4e40-9d77-c4d393bceb70-kube-api-access-8t6sh\") pod \"oauth-openshift-54b5c98c4-b44q7\" (UID: \"6dc7e00e-4c59-4e40-9d77-c4d393bceb70\") " 
pod="openshift-authentication/oauth-openshift-54b5c98c4-b44q7" Nov 27 16:07:28 crc kubenswrapper[4707]: I1127 16:07:28.809891 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/6dc7e00e-4c59-4e40-9d77-c4d393bceb70-v4-0-config-user-template-error\") pod \"oauth-openshift-54b5c98c4-b44q7\" (UID: \"6dc7e00e-4c59-4e40-9d77-c4d393bceb70\") " pod="openshift-authentication/oauth-openshift-54b5c98c4-b44q7" Nov 27 16:07:28 crc kubenswrapper[4707]: I1127 16:07:28.809926 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/6dc7e00e-4c59-4e40-9d77-c4d393bceb70-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-54b5c98c4-b44q7\" (UID: \"6dc7e00e-4c59-4e40-9d77-c4d393bceb70\") " pod="openshift-authentication/oauth-openshift-54b5c98c4-b44q7" Nov 27 16:07:28 crc kubenswrapper[4707]: I1127 16:07:28.809955 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/6dc7e00e-4c59-4e40-9d77-c4d393bceb70-audit-dir\") pod \"oauth-openshift-54b5c98c4-b44q7\" (UID: \"6dc7e00e-4c59-4e40-9d77-c4d393bceb70\") " pod="openshift-authentication/oauth-openshift-54b5c98c4-b44q7" Nov 27 16:07:28 crc kubenswrapper[4707]: I1127 16:07:28.809987 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/6dc7e00e-4c59-4e40-9d77-c4d393bceb70-audit-policies\") pod \"oauth-openshift-54b5c98c4-b44q7\" (UID: \"6dc7e00e-4c59-4e40-9d77-c4d393bceb70\") " pod="openshift-authentication/oauth-openshift-54b5c98c4-b44q7" Nov 27 16:07:28 crc kubenswrapper[4707]: I1127 16:07:28.810012 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/6dc7e00e-4c59-4e40-9d77-c4d393bceb70-v4-0-config-system-serving-cert\") pod \"oauth-openshift-54b5c98c4-b44q7\" (UID: \"6dc7e00e-4c59-4e40-9d77-c4d393bceb70\") " pod="openshift-authentication/oauth-openshift-54b5c98c4-b44q7" Nov 27 16:07:28 crc kubenswrapper[4707]: I1127 16:07:28.812104 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/6dc7e00e-4c59-4e40-9d77-c4d393bceb70-audit-dir\") pod \"oauth-openshift-54b5c98c4-b44q7\" (UID: \"6dc7e00e-4c59-4e40-9d77-c4d393bceb70\") " pod="openshift-authentication/oauth-openshift-54b5c98c4-b44q7" Nov 27 16:07:28 crc kubenswrapper[4707]: I1127 16:07:28.813117 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/6dc7e00e-4c59-4e40-9d77-c4d393bceb70-v4-0-config-system-service-ca\") pod \"oauth-openshift-54b5c98c4-b44q7\" (UID: \"6dc7e00e-4c59-4e40-9d77-c4d393bceb70\") " pod="openshift-authentication/oauth-openshift-54b5c98c4-b44q7" Nov 27 16:07:28 crc kubenswrapper[4707]: I1127 16:07:28.813160 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/6dc7e00e-4c59-4e40-9d77-c4d393bceb70-audit-policies\") pod \"oauth-openshift-54b5c98c4-b44q7\" (UID: \"6dc7e00e-4c59-4e40-9d77-c4d393bceb70\") " pod="openshift-authentication/oauth-openshift-54b5c98c4-b44q7" Nov 27 16:07:28 crc kubenswrapper[4707]: I1127 16:07:28.813176 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/6dc7e00e-4c59-4e40-9d77-c4d393bceb70-v4-0-config-system-cliconfig\") pod \"oauth-openshift-54b5c98c4-b44q7\" (UID: \"6dc7e00e-4c59-4e40-9d77-c4d393bceb70\") " pod="openshift-authentication/oauth-openshift-54b5c98c4-b44q7" Nov 27 16:07:28 crc kubenswrapper[4707]: I1127 16:07:28.813850 4707 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6dc7e00e-4c59-4e40-9d77-c4d393bceb70-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-54b5c98c4-b44q7\" (UID: \"6dc7e00e-4c59-4e40-9d77-c4d393bceb70\") " pod="openshift-authentication/oauth-openshift-54b5c98c4-b44q7" Nov 27 16:07:28 crc kubenswrapper[4707]: I1127 16:07:28.815408 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/6dc7e00e-4c59-4e40-9d77-c4d393bceb70-v4-0-config-system-router-certs\") pod \"oauth-openshift-54b5c98c4-b44q7\" (UID: \"6dc7e00e-4c59-4e40-9d77-c4d393bceb70\") " pod="openshift-authentication/oauth-openshift-54b5c98c4-b44q7" Nov 27 16:07:28 crc kubenswrapper[4707]: I1127 16:07:28.816309 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/6dc7e00e-4c59-4e40-9d77-c4d393bceb70-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-54b5c98c4-b44q7\" (UID: \"6dc7e00e-4c59-4e40-9d77-c4d393bceb70\") " pod="openshift-authentication/oauth-openshift-54b5c98c4-b44q7" Nov 27 16:07:28 crc kubenswrapper[4707]: I1127 16:07:28.816683 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/6dc7e00e-4c59-4e40-9d77-c4d393bceb70-v4-0-config-user-template-login\") pod \"oauth-openshift-54b5c98c4-b44q7\" (UID: \"6dc7e00e-4c59-4e40-9d77-c4d393bceb70\") " pod="openshift-authentication/oauth-openshift-54b5c98c4-b44q7" Nov 27 16:07:28 crc kubenswrapper[4707]: I1127 16:07:28.817876 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/6dc7e00e-4c59-4e40-9d77-c4d393bceb70-v4-0-config-user-template-error\") pod \"oauth-openshift-54b5c98c4-b44q7\" 
(UID: \"6dc7e00e-4c59-4e40-9d77-c4d393bceb70\") " pod="openshift-authentication/oauth-openshift-54b5c98c4-b44q7" Nov 27 16:07:28 crc kubenswrapper[4707]: I1127 16:07:28.818638 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/6dc7e00e-4c59-4e40-9d77-c4d393bceb70-v4-0-config-system-session\") pod \"oauth-openshift-54b5c98c4-b44q7\" (UID: \"6dc7e00e-4c59-4e40-9d77-c4d393bceb70\") " pod="openshift-authentication/oauth-openshift-54b5c98c4-b44q7" Nov 27 16:07:28 crc kubenswrapper[4707]: I1127 16:07:28.819167 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/6dc7e00e-4c59-4e40-9d77-c4d393bceb70-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-54b5c98c4-b44q7\" (UID: \"6dc7e00e-4c59-4e40-9d77-c4d393bceb70\") " pod="openshift-authentication/oauth-openshift-54b5c98c4-b44q7" Nov 27 16:07:28 crc kubenswrapper[4707]: I1127 16:07:28.822694 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/6dc7e00e-4c59-4e40-9d77-c4d393bceb70-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-54b5c98c4-b44q7\" (UID: \"6dc7e00e-4c59-4e40-9d77-c4d393bceb70\") " pod="openshift-authentication/oauth-openshift-54b5c98c4-b44q7" Nov 27 16:07:28 crc kubenswrapper[4707]: I1127 16:07:28.823092 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/6dc7e00e-4c59-4e40-9d77-c4d393bceb70-v4-0-config-system-serving-cert\") pod \"oauth-openshift-54b5c98c4-b44q7\" (UID: \"6dc7e00e-4c59-4e40-9d77-c4d393bceb70\") " pod="openshift-authentication/oauth-openshift-54b5c98c4-b44q7" Nov 27 16:07:28 crc kubenswrapper[4707]: I1127 16:07:28.830519 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-8t6sh\" (UniqueName: \"kubernetes.io/projected/6dc7e00e-4c59-4e40-9d77-c4d393bceb70-kube-api-access-8t6sh\") pod \"oauth-openshift-54b5c98c4-b44q7\" (UID: \"6dc7e00e-4c59-4e40-9d77-c4d393bceb70\") " pod="openshift-authentication/oauth-openshift-54b5c98c4-b44q7" Nov 27 16:07:28 crc kubenswrapper[4707]: I1127 16:07:28.943910 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-54b5c98c4-b44q7" Nov 27 16:07:29 crc kubenswrapper[4707]: I1127 16:07:29.355195 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-54b5c98c4-b44q7"] Nov 27 16:07:29 crc kubenswrapper[4707]: I1127 16:07:29.391102 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-54b5c98c4-b44q7" event={"ID":"6dc7e00e-4c59-4e40-9d77-c4d393bceb70","Type":"ContainerStarted","Data":"be5d92d0dd6d5e9c3f3c6b25a44f5632fdcce2d120f20446700d8fc22777f293"} Nov 27 16:07:30 crc kubenswrapper[4707]: I1127 16:07:30.396838 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-54b5c98c4-b44q7" event={"ID":"6dc7e00e-4c59-4e40-9d77-c4d393bceb70","Type":"ContainerStarted","Data":"1a6557b2d0f43e59d3093c118fb851d2e8813d1ee7dc84b60e2216dd191523f6"} Nov 27 16:07:30 crc kubenswrapper[4707]: I1127 16:07:30.397063 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-54b5c98c4-b44q7" Nov 27 16:07:30 crc kubenswrapper[4707]: I1127 16:07:30.404219 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-54b5c98c4-b44q7" Nov 27 16:07:30 crc kubenswrapper[4707]: I1127 16:07:30.420039 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-54b5c98c4-b44q7" podStartSLOduration=34.420018571 podStartE2EDuration="34.420018571s" 
podCreationTimestamp="2025-11-27 16:06:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-27 16:07:30.418693408 +0000 UTC m=+226.050142176" watchObservedRunningTime="2025-11-27 16:07:30.420018571 +0000 UTC m=+226.051467339" Nov 27 16:07:33 crc kubenswrapper[4707]: I1127 16:07:33.623738 4707 patch_prober.go:28] interesting pod/machine-config-daemon-c995m container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 27 16:07:33 crc kubenswrapper[4707]: I1127 16:07:33.624225 4707 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-c995m" podUID="a83beb0d-8dd1-434a-ace2-933f98e3956f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 27 16:07:33 crc kubenswrapper[4707]: I1127 16:07:33.624332 4707 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-c995m" Nov 27 16:07:33 crc kubenswrapper[4707]: I1127 16:07:33.625263 4707 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"6227d8aa794cce73ef7607c3e6cf8f853c829bf56493acf04c0b555ddd264ba5"} pod="openshift-machine-config-operator/machine-config-daemon-c995m" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Nov 27 16:07:33 crc kubenswrapper[4707]: I1127 16:07:33.625351 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-c995m" podUID="a83beb0d-8dd1-434a-ace2-933f98e3956f" containerName="machine-config-daemon" 
containerID="cri-o://6227d8aa794cce73ef7607c3e6cf8f853c829bf56493acf04c0b555ddd264ba5" gracePeriod=600 Nov 27 16:07:34 crc kubenswrapper[4707]: I1127 16:07:34.422242 4707 generic.go:334] "Generic (PLEG): container finished" podID="a83beb0d-8dd1-434a-ace2-933f98e3956f" containerID="6227d8aa794cce73ef7607c3e6cf8f853c829bf56493acf04c0b555ddd264ba5" exitCode=0 Nov 27 16:07:34 crc kubenswrapper[4707]: I1127 16:07:34.422338 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-c995m" event={"ID":"a83beb0d-8dd1-434a-ace2-933f98e3956f","Type":"ContainerDied","Data":"6227d8aa794cce73ef7607c3e6cf8f853c829bf56493acf04c0b555ddd264ba5"} Nov 27 16:07:34 crc kubenswrapper[4707]: I1127 16:07:34.423005 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-c995m" event={"ID":"a83beb0d-8dd1-434a-ace2-933f98e3956f","Type":"ContainerStarted","Data":"31f688d6285990a2c90910fc7e1b42b1f5156cb6f243c758badf144c47b276ff"} Nov 27 16:07:40 crc kubenswrapper[4707]: I1127 16:07:40.569576 4707 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Nov 27 16:07:40 crc kubenswrapper[4707]: I1127 16:07:40.571392 4707 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Nov 27 16:07:40 crc kubenswrapper[4707]: I1127 16:07:40.571559 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Nov 27 16:07:40 crc kubenswrapper[4707]: I1127 16:07:40.571760 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" containerID="cri-o://4a25770e0e3eebd7e807f3775101a1cd5d47073e4a86850da475fecd4a9c88ce" gracePeriod=15 Nov 27 16:07:40 crc kubenswrapper[4707]: I1127 16:07:40.571864 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" containerID="cri-o://7412dcdb5a249551f558a01a8fecef5f873e5e0a16e917d5cbc1ca74a917f228" gracePeriod=15 Nov 27 16:07:40 crc kubenswrapper[4707]: I1127 16:07:40.571790 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" containerID="cri-o://cf16f5508845d69184573b51f3052f79123556667cbd3ddf4b2adfb135312a3a" gracePeriod=15 Nov 27 16:07:40 crc kubenswrapper[4707]: I1127 16:07:40.572018 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" containerID="cri-o://791f332aec1698d02da5ec585ad2fbfcb77529f24abc0e3a77581be38d7ce277" gracePeriod=15 Nov 27 16:07:40 crc kubenswrapper[4707]: I1127 16:07:40.571849 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" containerID="cri-o://4a47eb0592819f0bc514399e163c88e2c63111aa0ec4a28f00f809ce1bdf2aa8" gracePeriod=15 Nov 27 16:07:40 crc 
kubenswrapper[4707]: I1127 16:07:40.572501 4707 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Nov 27 16:07:40 crc kubenswrapper[4707]: E1127 16:07:40.572685 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Nov 27 16:07:40 crc kubenswrapper[4707]: I1127 16:07:40.572757 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Nov 27 16:07:40 crc kubenswrapper[4707]: E1127 16:07:40.572815 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Nov 27 16:07:40 crc kubenswrapper[4707]: I1127 16:07:40.572874 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Nov 27 16:07:40 crc kubenswrapper[4707]: E1127 16:07:40.572932 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Nov 27 16:07:40 crc kubenswrapper[4707]: I1127 16:07:40.572989 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Nov 27 16:07:40 crc kubenswrapper[4707]: E1127 16:07:40.573809 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Nov 27 16:07:40 crc kubenswrapper[4707]: I1127 16:07:40.573869 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Nov 27 16:07:40 crc kubenswrapper[4707]: E1127 16:07:40.573935 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Nov 27 
16:07:40 crc kubenswrapper[4707]: I1127 16:07:40.573992 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Nov 27 16:07:40 crc kubenswrapper[4707]: E1127 16:07:40.574049 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Nov 27 16:07:40 crc kubenswrapper[4707]: I1127 16:07:40.574103 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Nov 27 16:07:40 crc kubenswrapper[4707]: E1127 16:07:40.574161 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Nov 27 16:07:40 crc kubenswrapper[4707]: I1127 16:07:40.574216 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Nov 27 16:07:40 crc kubenswrapper[4707]: I1127 16:07:40.574381 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Nov 27 16:07:40 crc kubenswrapper[4707]: I1127 16:07:40.574453 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Nov 27 16:07:40 crc kubenswrapper[4707]: I1127 16:07:40.574515 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Nov 27 16:07:40 crc kubenswrapper[4707]: I1127 16:07:40.574575 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Nov 27 16:07:40 crc kubenswrapper[4707]: I1127 16:07:40.574636 4707 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Nov 27 16:07:40 crc kubenswrapper[4707]: I1127 16:07:40.574871 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Nov 27 16:07:40 crc kubenswrapper[4707]: I1127 16:07:40.603239 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Nov 27 16:07:40 crc kubenswrapper[4707]: I1127 16:07:40.734147 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Nov 27 16:07:40 crc kubenswrapper[4707]: I1127 16:07:40.734202 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 27 16:07:40 crc kubenswrapper[4707]: I1127 16:07:40.734234 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 27 16:07:40 crc kubenswrapper[4707]: I1127 16:07:40.734426 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: 
\"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Nov 27 16:07:40 crc kubenswrapper[4707]: I1127 16:07:40.734472 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Nov 27 16:07:40 crc kubenswrapper[4707]: I1127 16:07:40.734671 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Nov 27 16:07:40 crc kubenswrapper[4707]: I1127 16:07:40.735085 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 27 16:07:40 crc kubenswrapper[4707]: I1127 16:07:40.735178 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Nov 27 16:07:40 crc kubenswrapper[4707]: I1127 16:07:40.836979 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: 
\"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Nov 27 16:07:40 crc kubenswrapper[4707]: I1127 16:07:40.837063 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Nov 27 16:07:40 crc kubenswrapper[4707]: I1127 16:07:40.837070 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Nov 27 16:07:40 crc kubenswrapper[4707]: I1127 16:07:40.837131 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Nov 27 16:07:40 crc kubenswrapper[4707]: I1127 16:07:40.837159 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 27 16:07:40 crc kubenswrapper[4707]: I1127 16:07:40.837136 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod 
\"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 27 16:07:40 crc kubenswrapper[4707]: I1127 16:07:40.837245 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 27 16:07:40 crc kubenswrapper[4707]: I1127 16:07:40.837313 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Nov 27 16:07:40 crc kubenswrapper[4707]: I1127 16:07:40.837349 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 27 16:07:40 crc kubenswrapper[4707]: I1127 16:07:40.837349 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Nov 27 16:07:40 crc kubenswrapper[4707]: I1127 16:07:40.837440 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " 
pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Nov 27 16:07:40 crc kubenswrapper[4707]: I1127 16:07:40.837465 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Nov 27 16:07:40 crc kubenswrapper[4707]: I1127 16:07:40.837473 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Nov 27 16:07:40 crc kubenswrapper[4707]: I1127 16:07:40.837509 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 27 16:07:40 crc kubenswrapper[4707]: I1127 16:07:40.837549 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Nov 27 16:07:40 crc kubenswrapper[4707]: I1127 16:07:40.837568 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " 
pod="openshift-kube-apiserver/kube-apiserver-crc"
Nov 27 16:07:40 crc kubenswrapper[4707]: I1127 16:07:40.902321 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Nov 27 16:07:40 crc kubenswrapper[4707]: W1127 16:07:40.927642 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf85e55b1a89d02b0cb034b1ea31ed45a.slice/crio-ac0984c34d755fa012844545385c500de1d7193a8a07a9bb9c9399dc2ee992e6 WatchSource:0}: Error finding container ac0984c34d755fa012844545385c500de1d7193a8a07a9bb9c9399dc2ee992e6: Status 404 returned error can't find the container with id ac0984c34d755fa012844545385c500de1d7193a8a07a9bb9c9399dc2ee992e6
Nov 27 16:07:40 crc kubenswrapper[4707]: E1127 16:07:40.930730 4707 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.30:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.187be8d06d43f818 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-11-27 16:07:40.92969372 +0000 UTC m=+236.561142498,LastTimestamp:2025-11-27 16:07:40.92969372 +0000 UTC m=+236.561142498,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Nov 27 16:07:41 crc kubenswrapper[4707]: I1127 16:07:41.475504 4707 generic.go:334] "Generic (PLEG): container finished" podID="c43ce123-cfbd-4c97-87d9-9b7144b8443c" containerID="7292858334cd71186cf1e15e24096a69d8d7e6dcc30e4202095c417d81505dae" exitCode=0
Nov 27 16:07:41 crc kubenswrapper[4707]: I1127 16:07:41.475638 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"c43ce123-cfbd-4c97-87d9-9b7144b8443c","Type":"ContainerDied","Data":"7292858334cd71186cf1e15e24096a69d8d7e6dcc30e4202095c417d81505dae"}
Nov 27 16:07:41 crc kubenswrapper[4707]: I1127 16:07:41.476832 4707 status_manager.go:851] "Failed to get status for pod" podUID="c43ce123-cfbd-4c97-87d9-9b7144b8443c" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.30:6443: connect: connection refused"
Nov 27 16:07:41 crc kubenswrapper[4707]: I1127 16:07:41.477398 4707 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.30:6443: connect: connection refused"
Nov 27 16:07:41 crc kubenswrapper[4707]: I1127 16:07:41.481669 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log"
Nov 27 16:07:41 crc kubenswrapper[4707]: I1127 16:07:41.483046 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log"
Nov 27 16:07:41 crc kubenswrapper[4707]: I1127 16:07:41.484054 4707 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="4a47eb0592819f0bc514399e163c88e2c63111aa0ec4a28f00f809ce1bdf2aa8" exitCode=0
Nov 27 16:07:41 crc kubenswrapper[4707]: I1127 16:07:41.484106 4707 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="cf16f5508845d69184573b51f3052f79123556667cbd3ddf4b2adfb135312a3a" exitCode=0
Nov 27 16:07:41 crc kubenswrapper[4707]: I1127 16:07:41.484127 4707 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="7412dcdb5a249551f558a01a8fecef5f873e5e0a16e917d5cbc1ca74a917f228" exitCode=0
Nov 27 16:07:41 crc kubenswrapper[4707]: I1127 16:07:41.484145 4707 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="791f332aec1698d02da5ec585ad2fbfcb77529f24abc0e3a77581be38d7ce277" exitCode=2
Nov 27 16:07:41 crc kubenswrapper[4707]: I1127 16:07:41.484260 4707 scope.go:117] "RemoveContainer" containerID="4f12d0d89698cd6f702c595882dc7ed97fbc585b8a4ef3a8b3755c984a823dd9"
Nov 27 16:07:41 crc kubenswrapper[4707]: I1127 16:07:41.486765 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"98b966fd870bd66240664fce34c15c697c153b51164eadb2e31c9caf542d9216"}
Nov 27 16:07:41 crc kubenswrapper[4707]: I1127 16:07:41.486805 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"ac0984c34d755fa012844545385c500de1d7193a8a07a9bb9c9399dc2ee992e6"}
Nov 27 16:07:41 crc kubenswrapper[4707]: I1127 16:07:41.487453 4707 status_manager.go:851] "Failed to get status for pod" podUID="c43ce123-cfbd-4c97-87d9-9b7144b8443c" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.30:6443: connect: connection refused"
Nov 27 16:07:41 crc kubenswrapper[4707]: I1127 16:07:41.487806 4707 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.30:6443: connect: connection refused"
Nov 27 16:07:42 crc kubenswrapper[4707]: I1127 16:07:42.495620 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log"
Nov 27 16:07:42 crc kubenswrapper[4707]: I1127 16:07:42.932032 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc"
Nov 27 16:07:42 crc kubenswrapper[4707]: I1127 16:07:42.933556 4707 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.30:6443: connect: connection refused"
Nov 27 16:07:42 crc kubenswrapper[4707]: I1127 16:07:42.933937 4707 status_manager.go:851] "Failed to get status for pod" podUID="c43ce123-cfbd-4c97-87d9-9b7144b8443c" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.30:6443: connect: connection refused"
Nov 27 16:07:42 crc kubenswrapper[4707]: I1127 16:07:42.939521 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log"
Nov 27 16:07:42 crc kubenswrapper[4707]: I1127 16:07:42.940442 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc"
Nov 27 16:07:42 crc kubenswrapper[4707]: I1127 16:07:42.940855 4707 status_manager.go:851] "Failed to get status for pod" podUID="c43ce123-cfbd-4c97-87d9-9b7144b8443c" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.30:6443: connect: connection refused"
Nov 27 16:07:42 crc kubenswrapper[4707]: I1127 16:07:42.941182 4707 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.30:6443: connect: connection refused"
Nov 27 16:07:42 crc kubenswrapper[4707]: I1127 16:07:42.941500 4707 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.30:6443: connect: connection refused"
Nov 27 16:07:43 crc kubenswrapper[4707]: I1127 16:07:43.106015 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") "
Nov 27 16:07:43 crc kubenswrapper[4707]: I1127 16:07:43.106147 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir" (OuterVolumeSpecName: "cert-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "cert-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Nov 27 16:07:43 crc kubenswrapper[4707]: I1127 16:07:43.106605 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") "
Nov 27 16:07:43 crc kubenswrapper[4707]: I1127 16:07:43.106766 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/c43ce123-cfbd-4c97-87d9-9b7144b8443c-var-lock\") pod \"c43ce123-cfbd-4c97-87d9-9b7144b8443c\" (UID: \"c43ce123-cfbd-4c97-87d9-9b7144b8443c\") "
Nov 27 16:07:43 crc kubenswrapper[4707]: I1127 16:07:43.106963 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") "
Nov 27 16:07:43 crc kubenswrapper[4707]: I1127 16:07:43.107194 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c43ce123-cfbd-4c97-87d9-9b7144b8443c-kube-api-access\") pod \"c43ce123-cfbd-4c97-87d9-9b7144b8443c\" (UID: \"c43ce123-cfbd-4c97-87d9-9b7144b8443c\") "
Nov 27 16:07:43 crc kubenswrapper[4707]: I1127 16:07:43.107429 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/c43ce123-cfbd-4c97-87d9-9b7144b8443c-kubelet-dir\") pod \"c43ce123-cfbd-4c97-87d9-9b7144b8443c\" (UID: \"c43ce123-cfbd-4c97-87d9-9b7144b8443c\") "
Nov 27 16:07:43 crc kubenswrapper[4707]: I1127 16:07:43.106658 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Nov 27 16:07:43 crc kubenswrapper[4707]: I1127 16:07:43.106857 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c43ce123-cfbd-4c97-87d9-9b7144b8443c-var-lock" (OuterVolumeSpecName: "var-lock") pod "c43ce123-cfbd-4c97-87d9-9b7144b8443c" (UID: "c43ce123-cfbd-4c97-87d9-9b7144b8443c"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Nov 27 16:07:43 crc kubenswrapper[4707]: I1127 16:07:43.107056 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Nov 27 16:07:43 crc kubenswrapper[4707]: I1127 16:07:43.107545 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c43ce123-cfbd-4c97-87d9-9b7144b8443c-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "c43ce123-cfbd-4c97-87d9-9b7144b8443c" (UID: "c43ce123-cfbd-4c97-87d9-9b7144b8443c"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Nov 27 16:07:43 crc kubenswrapper[4707]: I1127 16:07:43.108707 4707 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/c43ce123-cfbd-4c97-87d9-9b7144b8443c-kubelet-dir\") on node \"crc\" DevicePath \"\""
Nov 27 16:07:43 crc kubenswrapper[4707]: I1127 16:07:43.108866 4707 reconciler_common.go:293] "Volume detached for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") on node \"crc\" DevicePath \"\""
Nov 27 16:07:43 crc kubenswrapper[4707]: I1127 16:07:43.109054 4707 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") on node \"crc\" DevicePath \"\""
Nov 27 16:07:43 crc kubenswrapper[4707]: I1127 16:07:43.109214 4707 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/c43ce123-cfbd-4c97-87d9-9b7144b8443c-var-lock\") on node \"crc\" DevicePath \"\""
Nov 27 16:07:43 crc kubenswrapper[4707]: I1127 16:07:43.109332 4707 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") on node \"crc\" DevicePath \"\""
Nov 27 16:07:43 crc kubenswrapper[4707]: I1127 16:07:43.114952 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c43ce123-cfbd-4c97-87d9-9b7144b8443c-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "c43ce123-cfbd-4c97-87d9-9b7144b8443c" (UID: "c43ce123-cfbd-4c97-87d9-9b7144b8443c"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 27 16:07:43 crc kubenswrapper[4707]: I1127 16:07:43.210488 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4b27818a5e8e43d0dc095d08835c792" path="/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/volumes"
Nov 27 16:07:43 crc kubenswrapper[4707]: I1127 16:07:43.210911 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c43ce123-cfbd-4c97-87d9-9b7144b8443c-kube-api-access\") on node \"crc\" DevicePath \"\""
Nov 27 16:07:43 crc kubenswrapper[4707]: I1127 16:07:43.505429 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc"
Nov 27 16:07:43 crc kubenswrapper[4707]: I1127 16:07:43.505430 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"c43ce123-cfbd-4c97-87d9-9b7144b8443c","Type":"ContainerDied","Data":"094a9715f936bc11fc44597e99e6b6ea5b912015c7c1fcf921b03ac0ea361860"}
Nov 27 16:07:43 crc kubenswrapper[4707]: I1127 16:07:43.505856 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="094a9715f936bc11fc44597e99e6b6ea5b912015c7c1fcf921b03ac0ea361860"
Nov 27 16:07:43 crc kubenswrapper[4707]: I1127 16:07:43.508576 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log"
Nov 27 16:07:43 crc kubenswrapper[4707]: I1127 16:07:43.509448 4707 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="4a25770e0e3eebd7e807f3775101a1cd5d47073e4a86850da475fecd4a9c88ce" exitCode=0
Nov 27 16:07:43 crc kubenswrapper[4707]: I1127 16:07:43.509556 4707 scope.go:117] "RemoveContainer" containerID="4a47eb0592819f0bc514399e163c88e2c63111aa0ec4a28f00f809ce1bdf2aa8"
Nov 27 16:07:43 crc kubenswrapper[4707]: I1127 16:07:43.509602 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc"
Nov 27 16:07:43 crc kubenswrapper[4707]: I1127 16:07:43.510243 4707 status_manager.go:851] "Failed to get status for pod" podUID="c43ce123-cfbd-4c97-87d9-9b7144b8443c" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.30:6443: connect: connection refused"
Nov 27 16:07:43 crc kubenswrapper[4707]: I1127 16:07:43.510679 4707 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.30:6443: connect: connection refused"
Nov 27 16:07:43 crc kubenswrapper[4707]: I1127 16:07:43.510840 4707 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.30:6443: connect: connection refused"
Nov 27 16:07:43 crc kubenswrapper[4707]: I1127 16:07:43.511098 4707 status_manager.go:851] "Failed to get status for pod" podUID="c43ce123-cfbd-4c97-87d9-9b7144b8443c" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.30:6443: connect: connection refused"
Nov 27 16:07:43 crc kubenswrapper[4707]: I1127 16:07:43.511351 4707 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.30:6443: connect: connection refused"
Nov 27 16:07:43 crc kubenswrapper[4707]: I1127 16:07:43.512143 4707 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.30:6443: connect: connection refused"
Nov 27 16:07:43 crc kubenswrapper[4707]: I1127 16:07:43.512418 4707 status_manager.go:851] "Failed to get status for pod" podUID="c43ce123-cfbd-4c97-87d9-9b7144b8443c" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.30:6443: connect: connection refused"
Nov 27 16:07:43 crc kubenswrapper[4707]: I1127 16:07:43.512651 4707 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.30:6443: connect: connection refused"
Nov 27 16:07:43 crc kubenswrapper[4707]: I1127 16:07:43.512863 4707 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.30:6443: connect: connection refused"
Nov 27 16:07:43 crc kubenswrapper[4707]: I1127 16:07:43.530507 4707 scope.go:117] "RemoveContainer" containerID="cf16f5508845d69184573b51f3052f79123556667cbd3ddf4b2adfb135312a3a"
Nov 27 16:07:43 crc kubenswrapper[4707]: I1127 16:07:43.545347 4707 scope.go:117] "RemoveContainer" containerID="7412dcdb5a249551f558a01a8fecef5f873e5e0a16e917d5cbc1ca74a917f228"
Nov 27 16:07:43 crc kubenswrapper[4707]: I1127 16:07:43.560718 4707 scope.go:117] "RemoveContainer" containerID="791f332aec1698d02da5ec585ad2fbfcb77529f24abc0e3a77581be38d7ce277"
Nov 27 16:07:43 crc kubenswrapper[4707]: I1127 16:07:43.578986 4707 scope.go:117] "RemoveContainer" containerID="4a25770e0e3eebd7e807f3775101a1cd5d47073e4a86850da475fecd4a9c88ce"
Nov 27 16:07:43 crc kubenswrapper[4707]: I1127 16:07:43.596614 4707 scope.go:117] "RemoveContainer" containerID="608ad63f8f5352dad650253df603590a1cc20bc05d86dd561d659295264ab4b5"
Nov 27 16:07:43 crc kubenswrapper[4707]: I1127 16:07:43.632453 4707 scope.go:117] "RemoveContainer" containerID="4a47eb0592819f0bc514399e163c88e2c63111aa0ec4a28f00f809ce1bdf2aa8"
Nov 27 16:07:43 crc kubenswrapper[4707]: E1127 16:07:43.633059 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4a47eb0592819f0bc514399e163c88e2c63111aa0ec4a28f00f809ce1bdf2aa8\": container with ID starting with 4a47eb0592819f0bc514399e163c88e2c63111aa0ec4a28f00f809ce1bdf2aa8 not found: ID does not exist" containerID="4a47eb0592819f0bc514399e163c88e2c63111aa0ec4a28f00f809ce1bdf2aa8"
Nov 27 16:07:43 crc kubenswrapper[4707]: I1127 16:07:43.633146 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4a47eb0592819f0bc514399e163c88e2c63111aa0ec4a28f00f809ce1bdf2aa8"} err="failed to get container status \"4a47eb0592819f0bc514399e163c88e2c63111aa0ec4a28f00f809ce1bdf2aa8\": rpc error: code = NotFound desc = could not find container \"4a47eb0592819f0bc514399e163c88e2c63111aa0ec4a28f00f809ce1bdf2aa8\": container with ID starting with 4a47eb0592819f0bc514399e163c88e2c63111aa0ec4a28f00f809ce1bdf2aa8 not found: ID does not exist"
Nov 27 16:07:43 crc kubenswrapper[4707]: I1127 16:07:43.633218 4707 scope.go:117] "RemoveContainer" containerID="cf16f5508845d69184573b51f3052f79123556667cbd3ddf4b2adfb135312a3a"
Nov 27 16:07:43 crc kubenswrapper[4707]: E1127 16:07:43.633691 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cf16f5508845d69184573b51f3052f79123556667cbd3ddf4b2adfb135312a3a\": container with ID starting with cf16f5508845d69184573b51f3052f79123556667cbd3ddf4b2adfb135312a3a not found: ID does not exist" containerID="cf16f5508845d69184573b51f3052f79123556667cbd3ddf4b2adfb135312a3a"
Nov 27 16:07:43 crc kubenswrapper[4707]: I1127 16:07:43.633728 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cf16f5508845d69184573b51f3052f79123556667cbd3ddf4b2adfb135312a3a"} err="failed to get container status \"cf16f5508845d69184573b51f3052f79123556667cbd3ddf4b2adfb135312a3a\": rpc error: code = NotFound desc = could not find container \"cf16f5508845d69184573b51f3052f79123556667cbd3ddf4b2adfb135312a3a\": container with ID starting with cf16f5508845d69184573b51f3052f79123556667cbd3ddf4b2adfb135312a3a not found: ID does not exist"
Nov 27 16:07:43 crc kubenswrapper[4707]: I1127 16:07:43.633751 4707 scope.go:117] "RemoveContainer" containerID="7412dcdb5a249551f558a01a8fecef5f873e5e0a16e917d5cbc1ca74a917f228"
Nov 27 16:07:43 crc kubenswrapper[4707]: E1127 16:07:43.634010 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7412dcdb5a249551f558a01a8fecef5f873e5e0a16e917d5cbc1ca74a917f228\": container with ID starting with 7412dcdb5a249551f558a01a8fecef5f873e5e0a16e917d5cbc1ca74a917f228 not found: ID does not exist" containerID="7412dcdb5a249551f558a01a8fecef5f873e5e0a16e917d5cbc1ca74a917f228"
Nov 27 16:07:43 crc kubenswrapper[4707]: I1127 16:07:43.634038 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7412dcdb5a249551f558a01a8fecef5f873e5e0a16e917d5cbc1ca74a917f228"} err="failed to get container status \"7412dcdb5a249551f558a01a8fecef5f873e5e0a16e917d5cbc1ca74a917f228\": rpc error: code = NotFound desc = could not find container \"7412dcdb5a249551f558a01a8fecef5f873e5e0a16e917d5cbc1ca74a917f228\": container with ID starting with 7412dcdb5a249551f558a01a8fecef5f873e5e0a16e917d5cbc1ca74a917f228 not found: ID does not exist"
Nov 27 16:07:43 crc kubenswrapper[4707]: I1127 16:07:43.634058 4707 scope.go:117] "RemoveContainer" containerID="791f332aec1698d02da5ec585ad2fbfcb77529f24abc0e3a77581be38d7ce277"
Nov 27 16:07:43 crc kubenswrapper[4707]: E1127 16:07:43.634387 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"791f332aec1698d02da5ec585ad2fbfcb77529f24abc0e3a77581be38d7ce277\": container with ID starting with 791f332aec1698d02da5ec585ad2fbfcb77529f24abc0e3a77581be38d7ce277 not found: ID does not exist" containerID="791f332aec1698d02da5ec585ad2fbfcb77529f24abc0e3a77581be38d7ce277"
Nov 27 16:07:43 crc kubenswrapper[4707]: I1127 16:07:43.634416 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"791f332aec1698d02da5ec585ad2fbfcb77529f24abc0e3a77581be38d7ce277"} err="failed to get container status \"791f332aec1698d02da5ec585ad2fbfcb77529f24abc0e3a77581be38d7ce277\": rpc error: code = NotFound desc = could not find container \"791f332aec1698d02da5ec585ad2fbfcb77529f24abc0e3a77581be38d7ce277\": container with ID starting with 791f332aec1698d02da5ec585ad2fbfcb77529f24abc0e3a77581be38d7ce277 not found: ID does not exist"
Nov 27 16:07:43 crc kubenswrapper[4707]: I1127 16:07:43.634457 4707 scope.go:117] "RemoveContainer" containerID="4a25770e0e3eebd7e807f3775101a1cd5d47073e4a86850da475fecd4a9c88ce"
Nov 27 16:07:43 crc kubenswrapper[4707]: E1127 16:07:43.634766 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4a25770e0e3eebd7e807f3775101a1cd5d47073e4a86850da475fecd4a9c88ce\": container with ID starting with 4a25770e0e3eebd7e807f3775101a1cd5d47073e4a86850da475fecd4a9c88ce not found: ID does not exist" containerID="4a25770e0e3eebd7e807f3775101a1cd5d47073e4a86850da475fecd4a9c88ce"
Nov 27 16:07:43 crc kubenswrapper[4707]: I1127 16:07:43.634802 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4a25770e0e3eebd7e807f3775101a1cd5d47073e4a86850da475fecd4a9c88ce"} err="failed to get container status \"4a25770e0e3eebd7e807f3775101a1cd5d47073e4a86850da475fecd4a9c88ce\": rpc error: code = NotFound desc = could not find container \"4a25770e0e3eebd7e807f3775101a1cd5d47073e4a86850da475fecd4a9c88ce\": container with ID starting with 4a25770e0e3eebd7e807f3775101a1cd5d47073e4a86850da475fecd4a9c88ce not found: ID does not exist"
Nov 27 16:07:43 crc kubenswrapper[4707]: I1127 16:07:43.634826 4707 scope.go:117] "RemoveContainer" containerID="608ad63f8f5352dad650253df603590a1cc20bc05d86dd561d659295264ab4b5"
Nov 27 16:07:43 crc kubenswrapper[4707]: E1127 16:07:43.635081 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"608ad63f8f5352dad650253df603590a1cc20bc05d86dd561d659295264ab4b5\": container with ID starting with 608ad63f8f5352dad650253df603590a1cc20bc05d86dd561d659295264ab4b5 not found: ID does not exist" containerID="608ad63f8f5352dad650253df603590a1cc20bc05d86dd561d659295264ab4b5"
Nov 27 16:07:43 crc kubenswrapper[4707]: I1127 16:07:43.635113 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"608ad63f8f5352dad650253df603590a1cc20bc05d86dd561d659295264ab4b5"} err="failed to get container status \"608ad63f8f5352dad650253df603590a1cc20bc05d86dd561d659295264ab4b5\": rpc error: code = NotFound desc = could not find container \"608ad63f8f5352dad650253df603590a1cc20bc05d86dd561d659295264ab4b5\": container with ID starting with 608ad63f8f5352dad650253df603590a1cc20bc05d86dd561d659295264ab4b5 not found: ID does not exist"
Nov 27 16:07:45 crc kubenswrapper[4707]: I1127 16:07:45.198003 4707 status_manager.go:851] "Failed to get status for pod" podUID="c43ce123-cfbd-4c97-87d9-9b7144b8443c" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.30:6443: connect: connection refused"
Nov 27 16:07:45 crc kubenswrapper[4707]: I1127 16:07:45.198675 4707 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.30:6443: connect: connection refused"
Nov 27 16:07:45 crc kubenswrapper[4707]: I1127 16:07:45.199162 4707 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.30:6443: connect: connection refused"
Nov 27 16:07:46 crc kubenswrapper[4707]: E1127 16:07:46.370211 4707 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.30:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.187be8d06d43f818 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-11-27 16:07:40.92969372 +0000 UTC m=+236.561142498,LastTimestamp:2025-11-27 16:07:40.92969372 +0000 UTC m=+236.561142498,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Nov 27 16:07:46 crc kubenswrapper[4707]: E1127 16:07:46.584097 4707 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.30:6443: connect: connection refused"
Nov 27 16:07:46 crc kubenswrapper[4707]: E1127 16:07:46.584483 4707 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.30:6443: connect: connection refused"
Nov 27 16:07:46 crc kubenswrapper[4707]: E1127 16:07:46.584784 4707 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.30:6443: connect: connection refused"
Nov 27 16:07:46 crc kubenswrapper[4707]: E1127 16:07:46.585195 4707 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.30:6443: connect: connection refused"
Nov 27 16:07:46 crc kubenswrapper[4707]: E1127 16:07:46.585537 4707 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.30:6443: connect: connection refused"
Nov 27 16:07:46 crc kubenswrapper[4707]: I1127 16:07:46.585578 4707 controller.go:115] "failed to update lease using latest lease, fallback to ensure lease" err="failed 5 attempts to update lease"
Nov 27 16:07:46 crc kubenswrapper[4707]: E1127 16:07:46.585879 4707 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.30:6443: connect: connection refused" interval="200ms"
Nov 27 16:07:46 crc kubenswrapper[4707]: E1127 16:07:46.787204 4707 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.30:6443: connect: connection refused" interval="400ms"
Nov 27 16:07:47 crc kubenswrapper[4707]: E1127 16:07:47.188154 4707 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.30:6443: connect: connection refused" interval="800ms"
Nov 27 16:07:47 crc kubenswrapper[4707]: E1127 16:07:47.989938 4707 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.30:6443: connect: connection refused" interval="1.6s"
Nov 27 16:07:49 crc kubenswrapper[4707]: E1127 16:07:49.590929 4707 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.30:6443: connect: connection refused" interval="3.2s"
Nov 27 16:07:52 crc kubenswrapper[4707]: E1127 16:07:52.791916 4707 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.30:6443: connect: connection refused" interval="6.4s"
Nov 27 16:07:54 crc kubenswrapper[4707]: I1127 16:07:54.590544 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log"
Nov 27 16:07:54 crc kubenswrapper[4707]: I1127 16:07:54.590674 4707 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="36f6a5537d5031d21ea157fbcedea40756eac2c44aa1859c0cfbf3dc9a9d9a13" exitCode=1
Nov 27 16:07:54 crc kubenswrapper[4707]: I1127 16:07:54.590749 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"36f6a5537d5031d21ea157fbcedea40756eac2c44aa1859c0cfbf3dc9a9d9a13"}
Nov 27 16:07:54 crc kubenswrapper[4707]: I1127 16:07:54.591461 4707 scope.go:117] "RemoveContainer" containerID="36f6a5537d5031d21ea157fbcedea40756eac2c44aa1859c0cfbf3dc9a9d9a13"
Nov 27 16:07:54 crc kubenswrapper[4707]: I1127 16:07:54.592184 4707 status_manager.go:851] "Failed to get status for pod" podUID="c43ce123-cfbd-4c97-87d9-9b7144b8443c" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.30:6443: connect: connection refused"
Nov 27 16:07:54 crc kubenswrapper[4707]: I1127 16:07:54.592992 4707 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.30:6443: connect: connection refused"
Nov 27 16:07:54 crc kubenswrapper[4707]: I1127 16:07:54.593499 4707 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.30:6443: connect: connection refused"
Nov 27 16:07:55 crc kubenswrapper[4707]: I1127 16:07:55.194995 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc"
Nov 27 16:07:55 crc kubenswrapper[4707]: I1127 16:07:55.201120 4707 status_manager.go:851] "Failed to get status for pod" podUID="c43ce123-cfbd-4c97-87d9-9b7144b8443c" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.30:6443: connect: connection refused"
Nov 27 16:07:55 crc kubenswrapper[4707]: I1127 16:07:55.201637 4707 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.30:6443: connect: connection refused"
Nov 27 16:07:55 crc kubenswrapper[4707]: I1127 16:07:55.202031 4707 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.30:6443: connect: connection refused" Nov 27 16:07:55 crc kubenswrapper[4707]: I1127 16:07:55.202726 4707 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.30:6443: connect: connection refused" Nov 27 16:07:55 crc kubenswrapper[4707]: I1127 16:07:55.203178 4707 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.30:6443: connect: connection refused" Nov 27 16:07:55 crc kubenswrapper[4707]: I1127 16:07:55.204210 4707 status_manager.go:851] "Failed to get status for pod" podUID="c43ce123-cfbd-4c97-87d9-9b7144b8443c" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.30:6443: connect: connection refused" Nov 27 16:07:55 crc kubenswrapper[4707]: I1127 16:07:55.229742 4707 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f300806f-b97e-4453-b011-19442ca1240d" Nov 27 16:07:55 crc kubenswrapper[4707]: I1127 16:07:55.229801 4707 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f300806f-b97e-4453-b011-19442ca1240d" Nov 27 16:07:55 crc kubenswrapper[4707]: E1127 16:07:55.230537 4707 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.30:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 27 16:07:55 crc kubenswrapper[4707]: I1127 16:07:55.231281 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 27 16:07:55 crc kubenswrapper[4707]: W1127 16:07:55.265537 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod71bb4a3aecc4ba5b26c4b7318770ce13.slice/crio-1c635c98e734ebb593a7e79ae2bf0e6466e715592c0fe85b57a68f3e478a8e80 WatchSource:0}: Error finding container 1c635c98e734ebb593a7e79ae2bf0e6466e715592c0fe85b57a68f3e478a8e80: Status 404 returned error can't find the container with id 1c635c98e734ebb593a7e79ae2bf0e6466e715592c0fe85b57a68f3e478a8e80 Nov 27 16:07:55 crc kubenswrapper[4707]: I1127 16:07:55.600056 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"1c635c98e734ebb593a7e79ae2bf0e6466e715592c0fe85b57a68f3e478a8e80"} Nov 27 16:07:55 crc kubenswrapper[4707]: I1127 16:07:55.604447 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Nov 27 16:07:55 crc kubenswrapper[4707]: I1127 16:07:55.604505 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"b6ec467c6ba66e25d48bfa045b602452d66481eb089068b9d112c5f70fbdd3f1"} Nov 27 16:07:55 crc kubenswrapper[4707]: I1127 16:07:55.605794 4707 status_manager.go:851] "Failed to get status for pod" 
podUID="c43ce123-cfbd-4c97-87d9-9b7144b8443c" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.30:6443: connect: connection refused" Nov 27 16:07:55 crc kubenswrapper[4707]: I1127 16:07:55.606590 4707 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.30:6443: connect: connection refused" Nov 27 16:07:55 crc kubenswrapper[4707]: I1127 16:07:55.607189 4707 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.30:6443: connect: connection refused" Nov 27 16:07:56 crc kubenswrapper[4707]: E1127 16:07:56.371893 4707 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.30:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.187be8d06d43f818 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on 
machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-11-27 16:07:40.92969372 +0000 UTC m=+236.561142498,LastTimestamp:2025-11-27 16:07:40.92969372 +0000 UTC m=+236.561142498,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Nov 27 16:07:56 crc kubenswrapper[4707]: I1127 16:07:56.615532 4707 generic.go:334] "Generic (PLEG): container finished" podID="71bb4a3aecc4ba5b26c4b7318770ce13" containerID="d726c77427616ac5cbabfd5150dd7b1579a891878d219c55887bdb08982ad41e" exitCode=0 Nov 27 16:07:56 crc kubenswrapper[4707]: I1127 16:07:56.615604 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerDied","Data":"d726c77427616ac5cbabfd5150dd7b1579a891878d219c55887bdb08982ad41e"} Nov 27 16:07:56 crc kubenswrapper[4707]: I1127 16:07:56.616031 4707 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f300806f-b97e-4453-b011-19442ca1240d" Nov 27 16:07:56 crc kubenswrapper[4707]: I1127 16:07:56.616054 4707 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f300806f-b97e-4453-b011-19442ca1240d" Nov 27 16:07:56 crc kubenswrapper[4707]: I1127 16:07:56.616663 4707 status_manager.go:851] "Failed to get status for pod" podUID="c43ce123-cfbd-4c97-87d9-9b7144b8443c" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.30:6443: connect: connection refused" Nov 27 16:07:56 crc kubenswrapper[4707]: E1127 16:07:56.616787 4707 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.30:6443: connect: connection refused" 
pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 27 16:07:56 crc kubenswrapper[4707]: I1127 16:07:56.617226 4707 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.30:6443: connect: connection refused" Nov 27 16:07:56 crc kubenswrapper[4707]: I1127 16:07:56.617672 4707 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.30:6443: connect: connection refused" Nov 27 16:07:57 crc kubenswrapper[4707]: I1127 16:07:57.143726 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 27 16:07:57 crc kubenswrapper[4707]: I1127 16:07:57.625528 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"756ec80209f6bfbe573f24c0b2eada46d900d10f8e81cb5191b86768048efdd4"} Nov 27 16:07:57 crc kubenswrapper[4707]: I1127 16:07:57.625802 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"cdac3a8481df8ada3c53d9b408d415541cabc0e15f73815961c6feaf8be4a904"} Nov 27 16:07:57 crc kubenswrapper[4707]: I1127 16:07:57.625813 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" 
event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"53a4a64d17cae7c1660034e196235a3c83b28014bb9ce8720c341d60a2eb1fd1"} Nov 27 16:07:58 crc kubenswrapper[4707]: I1127 16:07:58.637305 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"9d704b5ff3507e405951c5cc6e7c1bbb41f9ea13a3b9a5df518e91289b6fe4c2"} Nov 27 16:07:58 crc kubenswrapper[4707]: I1127 16:07:58.637731 4707 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f300806f-b97e-4453-b011-19442ca1240d" Nov 27 16:07:58 crc kubenswrapper[4707]: I1127 16:07:58.637765 4707 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f300806f-b97e-4453-b011-19442ca1240d" Nov 27 16:07:58 crc kubenswrapper[4707]: I1127 16:07:58.637745 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 27 16:07:58 crc kubenswrapper[4707]: I1127 16:07:58.637859 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"74a136a26fc93286e32de9ed62607cc6238a6a4db5c88091f6a3f8b2364b2ed6"} Nov 27 16:08:00 crc kubenswrapper[4707]: I1127 16:08:00.232531 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 27 16:08:00 crc kubenswrapper[4707]: I1127 16:08:00.232597 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 27 16:08:00 crc kubenswrapper[4707]: I1127 16:08:00.238005 4707 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with 
statuscode: 500" start-of-body=[+]ping ok Nov 27 16:08:00 crc kubenswrapper[4707]: [+]log ok Nov 27 16:08:00 crc kubenswrapper[4707]: [+]etcd ok Nov 27 16:08:00 crc kubenswrapper[4707]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok Nov 27 16:08:00 crc kubenswrapper[4707]: [+]poststarthook/openshift.io-api-request-count-filter ok Nov 27 16:08:00 crc kubenswrapper[4707]: [+]poststarthook/openshift.io-startkubeinformers ok Nov 27 16:08:00 crc kubenswrapper[4707]: [+]poststarthook/openshift.io-openshift-apiserver-reachable ok Nov 27 16:08:00 crc kubenswrapper[4707]: [+]poststarthook/openshift.io-oauth-apiserver-reachable ok Nov 27 16:08:00 crc kubenswrapper[4707]: [+]poststarthook/start-apiserver-admission-initializer ok Nov 27 16:08:00 crc kubenswrapper[4707]: [+]poststarthook/generic-apiserver-start-informers ok Nov 27 16:08:00 crc kubenswrapper[4707]: [+]poststarthook/priority-and-fairness-config-consumer ok Nov 27 16:08:00 crc kubenswrapper[4707]: [+]poststarthook/priority-and-fairness-filter ok Nov 27 16:08:00 crc kubenswrapper[4707]: [+]poststarthook/storage-object-count-tracker-hook ok Nov 27 16:08:00 crc kubenswrapper[4707]: [+]poststarthook/start-apiextensions-informers ok Nov 27 16:08:00 crc kubenswrapper[4707]: [+]poststarthook/start-apiextensions-controllers ok Nov 27 16:08:00 crc kubenswrapper[4707]: [+]poststarthook/crd-informer-synced ok Nov 27 16:08:00 crc kubenswrapper[4707]: [+]poststarthook/start-system-namespaces-controller ok Nov 27 16:08:00 crc kubenswrapper[4707]: [+]poststarthook/start-cluster-authentication-info-controller ok Nov 27 16:08:00 crc kubenswrapper[4707]: [+]poststarthook/start-kube-apiserver-identity-lease-controller ok Nov 27 16:08:00 crc kubenswrapper[4707]: [+]poststarthook/start-kube-apiserver-identity-lease-garbage-collector ok Nov 27 16:08:00 crc kubenswrapper[4707]: [+]poststarthook/start-legacy-token-tracking-controller ok Nov 27 16:08:00 crc kubenswrapper[4707]: 
[+]poststarthook/start-service-ip-repair-controllers ok Nov 27 16:08:00 crc kubenswrapper[4707]: [-]poststarthook/rbac/bootstrap-roles failed: reason withheld Nov 27 16:08:00 crc kubenswrapper[4707]: [+]poststarthook/scheduling/bootstrap-system-priority-classes ok Nov 27 16:08:00 crc kubenswrapper[4707]: [+]poststarthook/priority-and-fairness-config-producer ok Nov 27 16:08:00 crc kubenswrapper[4707]: [+]poststarthook/bootstrap-controller ok Nov 27 16:08:00 crc kubenswrapper[4707]: [+]poststarthook/aggregator-reload-proxy-client-cert ok Nov 27 16:08:00 crc kubenswrapper[4707]: [+]poststarthook/start-kube-aggregator-informers ok Nov 27 16:08:00 crc kubenswrapper[4707]: [+]poststarthook/apiservice-status-local-available-controller ok Nov 27 16:08:00 crc kubenswrapper[4707]: [+]poststarthook/apiservice-status-remote-available-controller ok Nov 27 16:08:00 crc kubenswrapper[4707]: [+]poststarthook/apiservice-registration-controller ok Nov 27 16:08:00 crc kubenswrapper[4707]: [+]poststarthook/apiservice-wait-for-first-sync ok Nov 27 16:08:00 crc kubenswrapper[4707]: [+]poststarthook/apiservice-discovery-controller ok Nov 27 16:08:00 crc kubenswrapper[4707]: [+]poststarthook/kube-apiserver-autoregistration ok Nov 27 16:08:00 crc kubenswrapper[4707]: [+]autoregister-completion ok Nov 27 16:08:00 crc kubenswrapper[4707]: [+]poststarthook/apiservice-openapi-controller ok Nov 27 16:08:00 crc kubenswrapper[4707]: [+]poststarthook/apiservice-openapiv3-controller ok Nov 27 16:08:00 crc kubenswrapper[4707]: livez check failed Nov 27 16:08:00 crc kubenswrapper[4707]: I1127 16:08:00.238504 4707 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="71bb4a3aecc4ba5b26c4b7318770ce13" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Nov 27 16:08:02 crc kubenswrapper[4707]: I1127 16:08:02.509593 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 27 16:08:02 crc kubenswrapper[4707]: I1127 16:08:02.509820 4707 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/kube-controller-manager namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" start-of-body= Nov 27 16:08:02 crc kubenswrapper[4707]: I1127 16:08:02.510738 4707 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" Nov 27 16:08:04 crc kubenswrapper[4707]: I1127 16:08:04.002364 4707 kubelet.go:1914] "Deleted mirror pod because it is outdated" pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 27 16:08:04 crc kubenswrapper[4707]: I1127 16:08:04.680230 4707 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f300806f-b97e-4453-b011-19442ca1240d" Nov 27 16:08:04 crc kubenswrapper[4707]: I1127 16:08:04.680981 4707 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f300806f-b97e-4453-b011-19442ca1240d" Nov 27 16:08:05 crc kubenswrapper[4707]: I1127 16:08:05.238840 4707 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="16cce2a5-8edf-45dd-a296-bb63b013d7fe" Nov 27 16:08:05 crc kubenswrapper[4707]: I1127 16:08:05.241464 4707 status_manager.go:370] "Container startup changed before pod has synced" pod="openshift-kube-apiserver/kube-apiserver-crc" 
containerID="cri-o://53a4a64d17cae7c1660034e196235a3c83b28014bb9ce8720c341d60a2eb1fd1" Nov 27 16:08:05 crc kubenswrapper[4707]: I1127 16:08:05.241521 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 27 16:08:05 crc kubenswrapper[4707]: I1127 16:08:05.686788 4707 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f300806f-b97e-4453-b011-19442ca1240d" Nov 27 16:08:05 crc kubenswrapper[4707]: I1127 16:08:05.686835 4707 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f300806f-b97e-4453-b011-19442ca1240d" Nov 27 16:08:05 crc kubenswrapper[4707]: I1127 16:08:05.690929 4707 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="16cce2a5-8edf-45dd-a296-bb63b013d7fe" Nov 27 16:08:05 crc kubenswrapper[4707]: I1127 16:08:05.694238 4707 status_manager.go:308] "Container readiness changed before pod has synced" pod="openshift-kube-apiserver/kube-apiserver-crc" containerID="cri-o://53a4a64d17cae7c1660034e196235a3c83b28014bb9ce8720c341d60a2eb1fd1" Nov 27 16:08:05 crc kubenswrapper[4707]: I1127 16:08:05.694270 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 27 16:08:06 crc kubenswrapper[4707]: I1127 16:08:06.694756 4707 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f300806f-b97e-4453-b011-19442ca1240d" Nov 27 16:08:06 crc kubenswrapper[4707]: I1127 16:08:06.694806 4707 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f300806f-b97e-4453-b011-19442ca1240d" Nov 27 16:08:06 crc kubenswrapper[4707]: I1127 16:08:06.702604 4707 status_manager.go:861] "Pod was deleted and then recreated, skipping 
status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="16cce2a5-8edf-45dd-a296-bb63b013d7fe" Nov 27 16:08:12 crc kubenswrapper[4707]: I1127 16:08:12.510677 4707 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/kube-controller-manager namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" start-of-body= Nov 27 16:08:12 crc kubenswrapper[4707]: I1127 16:08:12.511473 4707 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" Nov 27 16:08:14 crc kubenswrapper[4707]: I1127 16:08:14.444859 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Nov 27 16:08:14 crc kubenswrapper[4707]: I1127 16:08:14.497092 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Nov 27 16:08:14 crc kubenswrapper[4707]: I1127 16:08:14.971271 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Nov 27 16:08:15 crc kubenswrapper[4707]: I1127 16:08:15.263995 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Nov 27 16:08:15 crc kubenswrapper[4707]: I1127 16:08:15.719760 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Nov 27 16:08:15 crc kubenswrapper[4707]: I1127 16:08:15.743651 4707 reflector.go:368] Caches populated for *v1.ConfigMap 
from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Nov 27 16:08:15 crc kubenswrapper[4707]: I1127 16:08:15.821343 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Nov 27 16:08:15 crc kubenswrapper[4707]: I1127 16:08:15.840014 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Nov 27 16:08:15 crc kubenswrapper[4707]: I1127 16:08:15.846270 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Nov 27 16:08:16 crc kubenswrapper[4707]: I1127 16:08:16.144653 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Nov 27 16:08:16 crc kubenswrapper[4707]: I1127 16:08:16.173986 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Nov 27 16:08:16 crc kubenswrapper[4707]: I1127 16:08:16.179101 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Nov 27 16:08:16 crc kubenswrapper[4707]: I1127 16:08:16.184105 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Nov 27 16:08:16 crc kubenswrapper[4707]: I1127 16:08:16.238482 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Nov 27 16:08:16 crc kubenswrapper[4707]: I1127 16:08:16.501870 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Nov 27 16:08:16 crc kubenswrapper[4707]: I1127 16:08:16.676358 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Nov 27 16:08:16 crc kubenswrapper[4707]: I1127 16:08:16.720117 
4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Nov 27 16:08:16 crc kubenswrapper[4707]: I1127 16:08:16.935888 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Nov 27 16:08:16 crc kubenswrapper[4707]: I1127 16:08:16.999214 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Nov 27 16:08:17 crc kubenswrapper[4707]: I1127 16:08:17.197608 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Nov 27 16:08:17 crc kubenswrapper[4707]: I1127 16:08:17.307373 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Nov 27 16:08:17 crc kubenswrapper[4707]: I1127 16:08:17.353100 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Nov 27 16:08:17 crc kubenswrapper[4707]: I1127 16:08:17.391766 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Nov 27 16:08:17 crc kubenswrapper[4707]: I1127 16:08:17.592124 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Nov 27 16:08:17 crc kubenswrapper[4707]: I1127 16:08:17.619650 4707 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Nov 27 16:08:17 crc kubenswrapper[4707]: I1127 16:08:17.674591 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Nov 27 16:08:17 crc kubenswrapper[4707]: I1127 16:08:17.765192 4707 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Nov 27 16:08:17 crc kubenswrapper[4707]: I1127 16:08:17.781581 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Nov 27 16:08:17 crc kubenswrapper[4707]: I1127 16:08:17.797334 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Nov 27 16:08:17 crc kubenswrapper[4707]: I1127 16:08:17.937781 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Nov 27 16:08:17 crc kubenswrapper[4707]: I1127 16:08:17.975197 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Nov 27 16:08:18 crc kubenswrapper[4707]: I1127 16:08:18.042457 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Nov 27 16:08:18 crc kubenswrapper[4707]: I1127 16:08:18.078724 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Nov 27 16:08:18 crc kubenswrapper[4707]: I1127 16:08:18.080189 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Nov 27 16:08:18 crc kubenswrapper[4707]: I1127 16:08:18.095935 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Nov 27 16:08:18 crc kubenswrapper[4707]: I1127 16:08:18.136063 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Nov 27 16:08:18 crc kubenswrapper[4707]: I1127 16:08:18.228801 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Nov 27 16:08:18 crc kubenswrapper[4707]: I1127 16:08:18.326960 
4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Nov 27 16:08:18 crc kubenswrapper[4707]: I1127 16:08:18.433918 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Nov 27 16:08:18 crc kubenswrapper[4707]: I1127 16:08:18.608638 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Nov 27 16:08:18 crc kubenswrapper[4707]: I1127 16:08:18.658825 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Nov 27 16:08:18 crc kubenswrapper[4707]: I1127 16:08:18.783105 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Nov 27 16:08:18 crc kubenswrapper[4707]: I1127 16:08:18.906522 4707 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Nov 27 16:08:19 crc kubenswrapper[4707]: I1127 16:08:19.009609 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Nov 27 16:08:19 crc kubenswrapper[4707]: I1127 16:08:19.013213 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Nov 27 16:08:19 crc kubenswrapper[4707]: I1127 16:08:19.081528 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Nov 27 16:08:19 crc kubenswrapper[4707]: I1127 16:08:19.188823 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Nov 27 16:08:19 crc kubenswrapper[4707]: I1127 16:08:19.256667 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Nov 27 16:08:19 crc 
kubenswrapper[4707]: I1127 16:08:19.310390 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Nov 27 16:08:19 crc kubenswrapper[4707]: I1127 16:08:19.317388 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Nov 27 16:08:19 crc kubenswrapper[4707]: I1127 16:08:19.373440 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Nov 27 16:08:19 crc kubenswrapper[4707]: I1127 16:08:19.376565 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Nov 27 16:08:19 crc kubenswrapper[4707]: I1127 16:08:19.427485 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Nov 27 16:08:19 crc kubenswrapper[4707]: I1127 16:08:19.430915 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Nov 27 16:08:19 crc kubenswrapper[4707]: I1127 16:08:19.493248 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Nov 27 16:08:19 crc kubenswrapper[4707]: I1127 16:08:19.587046 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Nov 27 16:08:19 crc kubenswrapper[4707]: I1127 16:08:19.621597 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Nov 27 16:08:19 crc kubenswrapper[4707]: I1127 16:08:19.673802 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Nov 27 16:08:19 crc kubenswrapper[4707]: I1127 16:08:19.744189 4707 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Nov 27 16:08:19 crc kubenswrapper[4707]: I1127 16:08:19.774698 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Nov 27 16:08:19 crc kubenswrapper[4707]: I1127 16:08:19.780827 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Nov 27 16:08:19 crc kubenswrapper[4707]: I1127 16:08:19.829729 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Nov 27 16:08:19 crc kubenswrapper[4707]: I1127 16:08:19.866488 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Nov 27 16:08:19 crc kubenswrapper[4707]: I1127 16:08:19.884924 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Nov 27 16:08:19 crc kubenswrapper[4707]: I1127 16:08:19.998511 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Nov 27 16:08:20 crc kubenswrapper[4707]: I1127 16:08:20.005057 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Nov 27 16:08:20 crc kubenswrapper[4707]: I1127 16:08:20.037883 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Nov 27 16:08:20 crc kubenswrapper[4707]: I1127 16:08:20.104141 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Nov 27 16:08:20 crc kubenswrapper[4707]: I1127 16:08:20.122531 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Nov 27 16:08:20 crc kubenswrapper[4707]: I1127 
16:08:20.150477 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Nov 27 16:08:20 crc kubenswrapper[4707]: I1127 16:08:20.164538 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Nov 27 16:08:20 crc kubenswrapper[4707]: I1127 16:08:20.174741 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Nov 27 16:08:20 crc kubenswrapper[4707]: I1127 16:08:20.198194 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Nov 27 16:08:20 crc kubenswrapper[4707]: I1127 16:08:20.233499 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Nov 27 16:08:20 crc kubenswrapper[4707]: I1127 16:08:20.246317 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Nov 27 16:08:20 crc kubenswrapper[4707]: I1127 16:08:20.272405 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Nov 27 16:08:20 crc kubenswrapper[4707]: I1127 16:08:20.293481 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Nov 27 16:08:20 crc kubenswrapper[4707]: I1127 16:08:20.313342 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Nov 27 16:08:20 crc kubenswrapper[4707]: I1127 16:08:20.340980 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Nov 27 16:08:20 crc kubenswrapper[4707]: I1127 16:08:20.354151 4707 reflector.go:368] Caches populated for 
*v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Nov 27 16:08:20 crc kubenswrapper[4707]: I1127 16:08:20.364927 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Nov 27 16:08:20 crc kubenswrapper[4707]: I1127 16:08:20.365953 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Nov 27 16:08:20 crc kubenswrapper[4707]: I1127 16:08:20.457711 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Nov 27 16:08:20 crc kubenswrapper[4707]: I1127 16:08:20.460273 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Nov 27 16:08:20 crc kubenswrapper[4707]: I1127 16:08:20.491027 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Nov 27 16:08:20 crc kubenswrapper[4707]: I1127 16:08:20.520621 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Nov 27 16:08:20 crc kubenswrapper[4707]: I1127 16:08:20.557740 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Nov 27 16:08:20 crc kubenswrapper[4707]: I1127 16:08:20.575419 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Nov 27 16:08:20 crc kubenswrapper[4707]: I1127 16:08:20.608044 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Nov 27 16:08:20 crc kubenswrapper[4707]: I1127 16:08:20.650944 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Nov 27 16:08:20 crc kubenswrapper[4707]: I1127 
16:08:20.771230 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Nov 27 16:08:20 crc kubenswrapper[4707]: I1127 16:08:20.802256 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Nov 27 16:08:20 crc kubenswrapper[4707]: I1127 16:08:20.874710 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Nov 27 16:08:20 crc kubenswrapper[4707]: I1127 16:08:20.917857 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Nov 27 16:08:20 crc kubenswrapper[4707]: I1127 16:08:20.963037 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Nov 27 16:08:20 crc kubenswrapper[4707]: I1127 16:08:20.978322 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Nov 27 16:08:20 crc kubenswrapper[4707]: I1127 16:08:20.998145 4707 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Nov 27 16:08:21 crc kubenswrapper[4707]: I1127 16:08:21.087340 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Nov 27 16:08:21 crc kubenswrapper[4707]: I1127 16:08:21.093193 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Nov 27 16:08:21 crc kubenswrapper[4707]: I1127 16:08:21.095280 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Nov 27 16:08:21 crc kubenswrapper[4707]: I1127 16:08:21.098047 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Nov 27 16:08:21 crc 
kubenswrapper[4707]: I1127 16:08:21.220461 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Nov 27 16:08:21 crc kubenswrapper[4707]: I1127 16:08:21.272219 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Nov 27 16:08:21 crc kubenswrapper[4707]: I1127 16:08:21.328332 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Nov 27 16:08:21 crc kubenswrapper[4707]: I1127 16:08:21.329840 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Nov 27 16:08:21 crc kubenswrapper[4707]: I1127 16:08:21.370953 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Nov 27 16:08:21 crc kubenswrapper[4707]: I1127 16:08:21.404765 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Nov 27 16:08:21 crc kubenswrapper[4707]: I1127 16:08:21.521446 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Nov 27 16:08:21 crc kubenswrapper[4707]: I1127 16:08:21.704283 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Nov 27 16:08:21 crc kubenswrapper[4707]: I1127 16:08:21.729140 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Nov 27 16:08:21 crc kubenswrapper[4707]: I1127 16:08:21.761146 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Nov 27 16:08:21 crc kubenswrapper[4707]: I1127 16:08:21.773745 4707 reflector.go:368] Caches populated for 
*v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Nov 27 16:08:21 crc kubenswrapper[4707]: I1127 16:08:21.825921 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Nov 27 16:08:21 crc kubenswrapper[4707]: I1127 16:08:21.837919 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Nov 27 16:08:21 crc kubenswrapper[4707]: I1127 16:08:21.897101 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Nov 27 16:08:21 crc kubenswrapper[4707]: I1127 16:08:21.988155 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Nov 27 16:08:22 crc kubenswrapper[4707]: I1127 16:08:22.149992 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Nov 27 16:08:22 crc kubenswrapper[4707]: I1127 16:08:22.173455 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Nov 27 16:08:22 crc kubenswrapper[4707]: I1127 16:08:22.425650 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Nov 27 16:08:22 crc kubenswrapper[4707]: I1127 16:08:22.438136 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Nov 27 16:08:22 crc kubenswrapper[4707]: I1127 16:08:22.443625 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Nov 27 16:08:22 crc kubenswrapper[4707]: I1127 16:08:22.477035 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Nov 27 
16:08:22 crc kubenswrapper[4707]: I1127 16:08:22.510280 4707 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/kube-controller-manager namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" start-of-body= Nov 27 16:08:22 crc kubenswrapper[4707]: I1127 16:08:22.510337 4707 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" Nov 27 16:08:22 crc kubenswrapper[4707]: I1127 16:08:22.510456 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 27 16:08:22 crc kubenswrapper[4707]: I1127 16:08:22.511486 4707 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="kube-controller-manager" containerStatusID={"Type":"cri-o","ID":"b6ec467c6ba66e25d48bfa045b602452d66481eb089068b9d112c5f70fbdd3f1"} pod="openshift-kube-controller-manager/kube-controller-manager-crc" containerMessage="Container kube-controller-manager failed startup probe, will be restarted" Nov 27 16:08:22 crc kubenswrapper[4707]: I1127 16:08:22.511802 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" containerID="cri-o://b6ec467c6ba66e25d48bfa045b602452d66481eb089068b9d112c5f70fbdd3f1" gracePeriod=30 Nov 27 16:08:22 crc kubenswrapper[4707]: I1127 16:08:22.519080 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Nov 27 16:08:22 crc 
kubenswrapper[4707]: I1127 16:08:22.548183 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Nov 27 16:08:22 crc kubenswrapper[4707]: I1127 16:08:22.590988 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Nov 27 16:08:22 crc kubenswrapper[4707]: I1127 16:08:22.712702 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Nov 27 16:08:22 crc kubenswrapper[4707]: I1127 16:08:22.758154 4707 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Nov 27 16:08:22 crc kubenswrapper[4707]: I1127 16:08:22.807224 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Nov 27 16:08:22 crc kubenswrapper[4707]: I1127 16:08:22.809012 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Nov 27 16:08:22 crc kubenswrapper[4707]: I1127 16:08:22.852302 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Nov 27 16:08:22 crc kubenswrapper[4707]: I1127 16:08:22.948650 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Nov 27 16:08:22 crc kubenswrapper[4707]: I1127 16:08:22.969653 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Nov 27 16:08:22 crc kubenswrapper[4707]: I1127 16:08:22.972018 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Nov 27 16:08:23 crc kubenswrapper[4707]: I1127 16:08:23.029317 4707 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Nov 27 16:08:23 crc kubenswrapper[4707]: I1127 16:08:23.034735 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Nov 27 16:08:23 crc kubenswrapper[4707]: I1127 16:08:23.053701 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Nov 27 16:08:23 crc kubenswrapper[4707]: I1127 16:08:23.182804 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Nov 27 16:08:23 crc kubenswrapper[4707]: I1127 16:08:23.271853 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Nov 27 16:08:23 crc kubenswrapper[4707]: I1127 16:08:23.320480 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Nov 27 16:08:23 crc kubenswrapper[4707]: I1127 16:08:23.380996 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Nov 27 16:08:23 crc kubenswrapper[4707]: I1127 16:08:23.403489 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Nov 27 16:08:23 crc kubenswrapper[4707]: I1127 16:08:23.583956 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Nov 27 16:08:23 crc kubenswrapper[4707]: I1127 16:08:23.601116 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Nov 27 16:08:23 crc kubenswrapper[4707]: I1127 16:08:23.910246 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Nov 27 16:08:23 crc 
kubenswrapper[4707]: I1127 16:08:23.950197 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Nov 27 16:08:24 crc kubenswrapper[4707]: I1127 16:08:24.016360 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Nov 27 16:08:24 crc kubenswrapper[4707]: I1127 16:08:24.032683 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Nov 27 16:08:24 crc kubenswrapper[4707]: I1127 16:08:24.038410 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Nov 27 16:08:24 crc kubenswrapper[4707]: I1127 16:08:24.086233 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Nov 27 16:08:24 crc kubenswrapper[4707]: I1127 16:08:24.099748 4707 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Nov 27 16:08:24 crc kubenswrapper[4707]: I1127 16:08:24.299404 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Nov 27 16:08:24 crc kubenswrapper[4707]: I1127 16:08:24.316315 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Nov 27 16:08:24 crc kubenswrapper[4707]: I1127 16:08:24.355271 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Nov 27 16:08:24 crc kubenswrapper[4707]: I1127 16:08:24.362364 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Nov 27 16:08:24 crc kubenswrapper[4707]: I1127 16:08:24.414940 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Nov 27 16:08:24 crc kubenswrapper[4707]: I1127 
16:08:24.418761 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Nov 27 16:08:24 crc kubenswrapper[4707]: I1127 16:08:24.439464 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Nov 27 16:08:24 crc kubenswrapper[4707]: I1127 16:08:24.459876 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Nov 27 16:08:24 crc kubenswrapper[4707]: I1127 16:08:24.477812 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Nov 27 16:08:24 crc kubenswrapper[4707]: I1127 16:08:24.495699 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Nov 27 16:08:24 crc kubenswrapper[4707]: I1127 16:08:24.527327 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Nov 27 16:08:24 crc kubenswrapper[4707]: I1127 16:08:24.529557 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Nov 27 16:08:24 crc kubenswrapper[4707]: I1127 16:08:24.584436 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Nov 27 16:08:24 crc kubenswrapper[4707]: I1127 16:08:24.584925 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Nov 27 16:08:24 crc kubenswrapper[4707]: I1127 16:08:24.600047 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Nov 27 16:08:24 crc kubenswrapper[4707]: I1127 16:08:24.713529 4707 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Nov 27 16:08:24 crc kubenswrapper[4707]: I1127 
16:08:24.714823 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podStartSLOduration=44.714804309 podStartE2EDuration="44.714804309s" podCreationTimestamp="2025-11-27 16:07:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-27 16:08:04.043204699 +0000 UTC m=+259.674653467" watchObservedRunningTime="2025-11-27 16:08:24.714804309 +0000 UTC m=+280.346253067" Nov 27 16:08:24 crc kubenswrapper[4707]: I1127 16:08:24.717807 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Nov 27 16:08:24 crc kubenswrapper[4707]: I1127 16:08:24.717845 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Nov 27 16:08:24 crc kubenswrapper[4707]: I1127 16:08:24.722851 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 27 16:08:24 crc kubenswrapper[4707]: I1127 16:08:24.732546 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Nov 27 16:08:24 crc kubenswrapper[4707]: I1127 16:08:24.758024 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=20.758007826 podStartE2EDuration="20.758007826s" podCreationTimestamp="2025-11-27 16:08:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-27 16:08:24.737997772 +0000 UTC m=+280.369446560" watchObservedRunningTime="2025-11-27 16:08:24.758007826 +0000 UTC m=+280.389456594" Nov 27 16:08:24 crc kubenswrapper[4707]: I1127 16:08:24.784187 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Nov 
27 16:08:24 crc kubenswrapper[4707]: I1127 16:08:24.934455 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Nov 27 16:08:25 crc kubenswrapper[4707]: I1127 16:08:25.091066 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Nov 27 16:08:25 crc kubenswrapper[4707]: I1127 16:08:25.098088 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Nov 27 16:08:25 crc kubenswrapper[4707]: I1127 16:08:25.109970 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Nov 27 16:08:25 crc kubenswrapper[4707]: I1127 16:08:25.110618 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Nov 27 16:08:25 crc kubenswrapper[4707]: I1127 16:08:25.139181 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Nov 27 16:08:25 crc kubenswrapper[4707]: I1127 16:08:25.169812 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Nov 27 16:08:25 crc kubenswrapper[4707]: I1127 16:08:25.280668 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Nov 27 16:08:25 crc kubenswrapper[4707]: I1127 16:08:25.337316 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Nov 27 16:08:25 crc kubenswrapper[4707]: I1127 16:08:25.399837 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Nov 27 16:08:25 crc kubenswrapper[4707]: I1127 16:08:25.447211 4707 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Nov 27 16:08:25 crc kubenswrapper[4707]: I1127 16:08:25.468363 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Nov 27 16:08:25 crc kubenswrapper[4707]: I1127 16:08:25.476628 4707 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Nov 27 16:08:25 crc kubenswrapper[4707]: I1127 16:08:25.477066 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" containerID="cri-o://98b966fd870bd66240664fce34c15c697c153b51164eadb2e31c9caf542d9216" gracePeriod=5 Nov 27 16:08:25 crc kubenswrapper[4707]: I1127 16:08:25.481180 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Nov 27 16:08:25 crc kubenswrapper[4707]: I1127 16:08:25.744604 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Nov 27 16:08:25 crc kubenswrapper[4707]: I1127 16:08:25.804622 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Nov 27 16:08:25 crc kubenswrapper[4707]: I1127 16:08:25.821892 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Nov 27 16:08:25 crc kubenswrapper[4707]: I1127 16:08:25.851781 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Nov 27 16:08:25 crc kubenswrapper[4707]: I1127 16:08:25.870355 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Nov 27 16:08:25 crc 
kubenswrapper[4707]: I1127 16:08:25.873152 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Nov 27 16:08:25 crc kubenswrapper[4707]: I1127 16:08:25.954674 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Nov 27 16:08:25 crc kubenswrapper[4707]: I1127 16:08:25.977493 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Nov 27 16:08:26 crc kubenswrapper[4707]: I1127 16:08:26.012935 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Nov 27 16:08:26 crc kubenswrapper[4707]: I1127 16:08:26.092909 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Nov 27 16:08:26 crc kubenswrapper[4707]: I1127 16:08:26.154036 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Nov 27 16:08:26 crc kubenswrapper[4707]: I1127 16:08:26.191206 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Nov 27 16:08:26 crc kubenswrapper[4707]: I1127 16:08:26.207602 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Nov 27 16:08:26 crc kubenswrapper[4707]: I1127 16:08:26.294280 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Nov 27 16:08:26 crc kubenswrapper[4707]: I1127 16:08:26.341188 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Nov 27 16:08:26 crc kubenswrapper[4707]: I1127 16:08:26.361013 4707 reflector.go:368] Caches populated for 
*v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Nov 27 16:08:26 crc kubenswrapper[4707]: I1127 16:08:26.428010 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Nov 27 16:08:26 crc kubenswrapper[4707]: I1127 16:08:26.554439 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Nov 27 16:08:26 crc kubenswrapper[4707]: I1127 16:08:26.605373 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Nov 27 16:08:26 crc kubenswrapper[4707]: I1127 16:08:26.669934 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Nov 27 16:08:26 crc kubenswrapper[4707]: I1127 16:08:26.670010 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Nov 27 16:08:26 crc kubenswrapper[4707]: I1127 16:08:26.822994 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Nov 27 16:08:26 crc kubenswrapper[4707]: I1127 16:08:26.835302 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Nov 27 16:08:27 crc kubenswrapper[4707]: I1127 16:08:27.088557 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Nov 27 16:08:27 crc kubenswrapper[4707]: I1127 16:08:27.142401 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Nov 27 16:08:27 crc kubenswrapper[4707]: I1127 16:08:27.163692 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Nov 27 16:08:27 crc 
kubenswrapper[4707]: I1127 16:08:27.201665 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Nov 27 16:08:27 crc kubenswrapper[4707]: I1127 16:08:27.257893 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Nov 27 16:08:27 crc kubenswrapper[4707]: I1127 16:08:27.309249 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Nov 27 16:08:27 crc kubenswrapper[4707]: I1127 16:08:27.342856 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Nov 27 16:08:27 crc kubenswrapper[4707]: I1127 16:08:27.519445 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Nov 27 16:08:27 crc kubenswrapper[4707]: I1127 16:08:27.609088 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Nov 27 16:08:27 crc kubenswrapper[4707]: I1127 16:08:27.751118 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Nov 27 16:08:27 crc kubenswrapper[4707]: I1127 16:08:27.762006 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Nov 27 16:08:27 crc kubenswrapper[4707]: I1127 16:08:27.775815 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Nov 27 16:08:27 crc kubenswrapper[4707]: I1127 16:08:27.868424 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Nov 27 16:08:27 crc kubenswrapper[4707]: I1127 16:08:27.885770 4707 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Nov 27 16:08:27 crc kubenswrapper[4707]: I1127 16:08:27.946000 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Nov 27 16:08:28 crc kubenswrapper[4707]: I1127 16:08:28.165008 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Nov 27 16:08:28 crc kubenswrapper[4707]: I1127 16:08:28.203506 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Nov 27 16:08:28 crc kubenswrapper[4707]: I1127 16:08:28.221988 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Nov 27 16:08:28 crc kubenswrapper[4707]: I1127 16:08:28.252829 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Nov 27 16:08:28 crc kubenswrapper[4707]: I1127 16:08:28.280997 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Nov 27 16:08:28 crc kubenswrapper[4707]: I1127 16:08:28.597896 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Nov 27 16:08:28 crc kubenswrapper[4707]: I1127 16:08:28.693141 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Nov 27 16:08:28 crc kubenswrapper[4707]: I1127 16:08:28.780678 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Nov 27 16:08:28 crc kubenswrapper[4707]: I1127 16:08:28.792900 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Nov 27 16:08:28 crc 
kubenswrapper[4707]: I1127 16:08:28.835338 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Nov 27 16:08:28 crc kubenswrapper[4707]: I1127 16:08:28.887960 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Nov 27 16:08:29 crc kubenswrapper[4707]: I1127 16:08:29.019774 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Nov 27 16:08:29 crc kubenswrapper[4707]: I1127 16:08:29.023474 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Nov 27 16:08:29 crc kubenswrapper[4707]: I1127 16:08:29.223952 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Nov 27 16:08:29 crc kubenswrapper[4707]: I1127 16:08:29.249198 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Nov 27 16:08:29 crc kubenswrapper[4707]: I1127 16:08:29.392015 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Nov 27 16:08:29 crc kubenswrapper[4707]: I1127 16:08:29.401675 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Nov 27 16:08:29 crc kubenswrapper[4707]: I1127 16:08:29.551575 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Nov 27 16:08:29 crc kubenswrapper[4707]: I1127 16:08:29.574800 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Nov 27 16:08:30 crc kubenswrapper[4707]: I1127 16:08:30.136101 4707 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-controller-manager"/"openshift-global-ca" Nov 27 16:08:30 crc kubenswrapper[4707]: I1127 16:08:30.142765 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Nov 27 16:08:30 crc kubenswrapper[4707]: I1127 16:08:30.451544 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Nov 27 16:08:30 crc kubenswrapper[4707]: I1127 16:08:30.857540 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Nov 27 16:08:30 crc kubenswrapper[4707]: I1127 16:08:30.857643 4707 generic.go:334] "Generic (PLEG): container finished" podID="f85e55b1a89d02b0cb034b1ea31ed45a" containerID="98b966fd870bd66240664fce34c15c697c153b51164eadb2e31c9caf542d9216" exitCode=137 Nov 27 16:08:31 crc kubenswrapper[4707]: I1127 16:08:31.114775 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Nov 27 16:08:31 crc kubenswrapper[4707]: I1127 16:08:31.114874 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Nov 27 16:08:31 crc kubenswrapper[4707]: I1127 16:08:31.203222 4707 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="" Nov 27 16:08:31 crc kubenswrapper[4707]: I1127 16:08:31.222059 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Nov 27 16:08:31 crc kubenswrapper[4707]: I1127 16:08:31.222098 4707 kubelet.go:2649] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" mirrorPodUID="bbde5ea7-1ce2-4c84-bf9b-612c67a6afcc" Nov 27 16:08:31 crc kubenswrapper[4707]: I1127 16:08:31.225308 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Nov 27 16:08:31 crc kubenswrapper[4707]: I1127 16:08:31.225347 4707 kubelet.go:2673] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" mirrorPodUID="bbde5ea7-1ce2-4c84-bf9b-612c67a6afcc" Nov 27 16:08:31 crc kubenswrapper[4707]: I1127 16:08:31.226449 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Nov 27 16:08:31 crc kubenswrapper[4707]: I1127 16:08:31.226507 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Nov 27 16:08:31 crc kubenswrapper[4707]: I1127 16:08:31.226595 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock" (OuterVolumeSpecName: "var-lock") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 27 16:08:31 crc kubenswrapper[4707]: I1127 16:08:31.226633 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Nov 27 16:08:31 crc kubenswrapper[4707]: I1127 16:08:31.226662 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 27 16:08:31 crc kubenswrapper[4707]: I1127 16:08:31.226697 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Nov 27 16:08:31 crc kubenswrapper[4707]: I1127 16:08:31.226807 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Nov 27 16:08:31 crc kubenswrapper[4707]: I1127 16:08:31.226941 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log" (OuterVolumeSpecName: "var-log") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: 
"f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 27 16:08:31 crc kubenswrapper[4707]: I1127 16:08:31.227058 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests" (OuterVolumeSpecName: "manifests") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "manifests". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 27 16:08:31 crc kubenswrapper[4707]: I1127 16:08:31.227252 4707 reconciler_common.go:293] "Volume detached for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") on node \"crc\" DevicePath \"\"" Nov 27 16:08:31 crc kubenswrapper[4707]: I1127 16:08:31.227285 4707 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") on node \"crc\" DevicePath \"\"" Nov 27 16:08:31 crc kubenswrapper[4707]: I1127 16:08:31.227303 4707 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") on node \"crc\" DevicePath \"\"" Nov 27 16:08:31 crc kubenswrapper[4707]: I1127 16:08:31.227322 4707 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") on node \"crc\" DevicePath \"\"" Nov 27 16:08:31 crc kubenswrapper[4707]: I1127 16:08:31.237881 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir" (OuterVolumeSpecName: "pod-resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "pod-resource-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 27 16:08:31 crc kubenswrapper[4707]: I1127 16:08:31.329151 4707 reconciler_common.go:293] "Volume detached for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") on node \"crc\" DevicePath \"\"" Nov 27 16:08:31 crc kubenswrapper[4707]: I1127 16:08:31.868732 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Nov 27 16:08:31 crc kubenswrapper[4707]: I1127 16:08:31.868841 4707 scope.go:117] "RemoveContainer" containerID="98b966fd870bd66240664fce34c15c697c153b51164eadb2e31c9caf542d9216" Nov 27 16:08:31 crc kubenswrapper[4707]: I1127 16:08:31.868926 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Nov 27 16:08:33 crc kubenswrapper[4707]: I1127 16:08:33.205620 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" path="/var/lib/kubelet/pods/f85e55b1a89d02b0cb034b1ea31ed45a/volumes" Nov 27 16:08:38 crc kubenswrapper[4707]: I1127 16:08:38.968432 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-d97n2"] Nov 27 16:08:38 crc kubenswrapper[4707]: I1127 16:08:38.968991 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-d97n2" podUID="f0d7e53b-3b6f-469d-b550-e61dbea7724f" containerName="registry-server" containerID="cri-o://a7892fdd0c003ba7d185cd20616c3a0312eac77264613069952d1c8311468f93" gracePeriod=30 Nov 27 16:08:39 crc kubenswrapper[4707]: I1127 16:08:39.008881 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-dnk6z"] Nov 27 16:08:39 crc kubenswrapper[4707]: I1127 16:08:39.009305 4707 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-dnk6z" podUID="bafb40a6-a701-4082-a791-65fce73e8669" containerName="registry-server" containerID="cri-o://5bb901eeac2a9f8650b26a3b5e6aab5057373bc5ba0cce0a32c4d9cc66c1c491" gracePeriod=30 Nov 27 16:08:39 crc kubenswrapper[4707]: I1127 16:08:39.025121 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-jtqhs"] Nov 27 16:08:39 crc kubenswrapper[4707]: I1127 16:08:39.030805 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-xhshk"] Nov 27 16:08:39 crc kubenswrapper[4707]: I1127 16:08:39.034088 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-xhshk" podUID="2b313184-19c4-42e4-b488-63f8e894feea" containerName="registry-server" containerID="cri-o://ab899d6f4e06dbc7cee2fde1e92d57be3549ff6c1d4053bd8a03a25a9cb2750e" gracePeriod=30 Nov 27 16:08:39 crc kubenswrapper[4707]: I1127 16:08:39.037514 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-s5nbn"] Nov 27 16:08:39 crc kubenswrapper[4707]: I1127 16:08:39.037800 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-s5nbn" podUID="fb266fde-0b2c-4866-8274-9ed2d4821c14" containerName="registry-server" containerID="cri-o://ca005959ca3389d12f7fe2ba04ba36c258787a1cc702ced6cf3a43daf0195480" gracePeriod=30 Nov 27 16:08:39 crc kubenswrapper[4707]: I1127 16:08:39.039552 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/marketplace-operator-79b997595-jtqhs" podUID="d9ba519a-a9a8-4d1a-96c1-e66ee87d5c8f" containerName="marketplace-operator" containerID="cri-o://3cd8b5f527947be370d0566254e890545d846bef58ddad0e5936ea65f25417bc" gracePeriod=30 Nov 27 16:08:39 crc kubenswrapper[4707]: 
I1127 16:08:39.408297 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-d97n2" Nov 27 16:08:39 crc kubenswrapper[4707]: I1127 16:08:39.435418 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-dnk6z" Nov 27 16:08:39 crc kubenswrapper[4707]: I1127 16:08:39.443833 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-s5nbn" Nov 27 16:08:39 crc kubenswrapper[4707]: I1127 16:08:39.447786 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-xhshk" Nov 27 16:08:39 crc kubenswrapper[4707]: I1127 16:08:39.454119 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-jtqhs" Nov 27 16:08:39 crc kubenswrapper[4707]: I1127 16:08:39.545244 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f0d7e53b-3b6f-469d-b550-e61dbea7724f-catalog-content\") pod \"f0d7e53b-3b6f-469d-b550-e61dbea7724f\" (UID: \"f0d7e53b-3b6f-469d-b550-e61dbea7724f\") " Nov 27 16:08:39 crc kubenswrapper[4707]: I1127 16:08:39.545299 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2b313184-19c4-42e4-b488-63f8e894feea-utilities\") pod \"2b313184-19c4-42e4-b488-63f8e894feea\" (UID: \"2b313184-19c4-42e4-b488-63f8e894feea\") " Nov 27 16:08:39 crc kubenswrapper[4707]: I1127 16:08:39.545339 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vw6s5\" (UniqueName: \"kubernetes.io/projected/d9ba519a-a9a8-4d1a-96c1-e66ee87d5c8f-kube-api-access-vw6s5\") pod \"d9ba519a-a9a8-4d1a-96c1-e66ee87d5c8f\" (UID: 
\"d9ba519a-a9a8-4d1a-96c1-e66ee87d5c8f\") " Nov 27 16:08:39 crc kubenswrapper[4707]: I1127 16:08:39.545388 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2b313184-19c4-42e4-b488-63f8e894feea-catalog-content\") pod \"2b313184-19c4-42e4-b488-63f8e894feea\" (UID: \"2b313184-19c4-42e4-b488-63f8e894feea\") " Nov 27 16:08:39 crc kubenswrapper[4707]: I1127 16:08:39.545416 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6lz5k\" (UniqueName: \"kubernetes.io/projected/fb266fde-0b2c-4866-8274-9ed2d4821c14-kube-api-access-6lz5k\") pod \"fb266fde-0b2c-4866-8274-9ed2d4821c14\" (UID: \"fb266fde-0b2c-4866-8274-9ed2d4821c14\") " Nov 27 16:08:39 crc kubenswrapper[4707]: I1127 16:08:39.545495 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bafb40a6-a701-4082-a791-65fce73e8669-utilities\") pod \"bafb40a6-a701-4082-a791-65fce73e8669\" (UID: \"bafb40a6-a701-4082-a791-65fce73e8669\") " Nov 27 16:08:39 crc kubenswrapper[4707]: I1127 16:08:39.545523 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fb266fde-0b2c-4866-8274-9ed2d4821c14-utilities\") pod \"fb266fde-0b2c-4866-8274-9ed2d4821c14\" (UID: \"fb266fde-0b2c-4866-8274-9ed2d4821c14\") " Nov 27 16:08:39 crc kubenswrapper[4707]: I1127 16:08:39.545546 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vv7fb\" (UniqueName: \"kubernetes.io/projected/2b313184-19c4-42e4-b488-63f8e894feea-kube-api-access-vv7fb\") pod \"2b313184-19c4-42e4-b488-63f8e894feea\" (UID: \"2b313184-19c4-42e4-b488-63f8e894feea\") " Nov 27 16:08:39 crc kubenswrapper[4707]: I1127 16:08:39.545576 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5vm85\" 
(UniqueName: \"kubernetes.io/projected/bafb40a6-a701-4082-a791-65fce73e8669-kube-api-access-5vm85\") pod \"bafb40a6-a701-4082-a791-65fce73e8669\" (UID: \"bafb40a6-a701-4082-a791-65fce73e8669\") " Nov 27 16:08:39 crc kubenswrapper[4707]: I1127 16:08:39.545600 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bafb40a6-a701-4082-a791-65fce73e8669-catalog-content\") pod \"bafb40a6-a701-4082-a791-65fce73e8669\" (UID: \"bafb40a6-a701-4082-a791-65fce73e8669\") " Nov 27 16:08:39 crc kubenswrapper[4707]: I1127 16:08:39.545628 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f0d7e53b-3b6f-469d-b550-e61dbea7724f-utilities\") pod \"f0d7e53b-3b6f-469d-b550-e61dbea7724f\" (UID: \"f0d7e53b-3b6f-469d-b550-e61dbea7724f\") " Nov 27 16:08:39 crc kubenswrapper[4707]: I1127 16:08:39.545653 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fb266fde-0b2c-4866-8274-9ed2d4821c14-catalog-content\") pod \"fb266fde-0b2c-4866-8274-9ed2d4821c14\" (UID: \"fb266fde-0b2c-4866-8274-9ed2d4821c14\") " Nov 27 16:08:39 crc kubenswrapper[4707]: I1127 16:08:39.545677 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d9ba519a-a9a8-4d1a-96c1-e66ee87d5c8f-marketplace-trusted-ca\") pod \"d9ba519a-a9a8-4d1a-96c1-e66ee87d5c8f\" (UID: \"d9ba519a-a9a8-4d1a-96c1-e66ee87d5c8f\") " Nov 27 16:08:39 crc kubenswrapper[4707]: I1127 16:08:39.545720 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/d9ba519a-a9a8-4d1a-96c1-e66ee87d5c8f-marketplace-operator-metrics\") pod \"d9ba519a-a9a8-4d1a-96c1-e66ee87d5c8f\" (UID: \"d9ba519a-a9a8-4d1a-96c1-e66ee87d5c8f\") 
" Nov 27 16:08:39 crc kubenswrapper[4707]: I1127 16:08:39.545744 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wx5bp\" (UniqueName: \"kubernetes.io/projected/f0d7e53b-3b6f-469d-b550-e61dbea7724f-kube-api-access-wx5bp\") pod \"f0d7e53b-3b6f-469d-b550-e61dbea7724f\" (UID: \"f0d7e53b-3b6f-469d-b550-e61dbea7724f\") " Nov 27 16:08:39 crc kubenswrapper[4707]: I1127 16:08:39.546269 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bafb40a6-a701-4082-a791-65fce73e8669-utilities" (OuterVolumeSpecName: "utilities") pod "bafb40a6-a701-4082-a791-65fce73e8669" (UID: "bafb40a6-a701-4082-a791-65fce73e8669"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 27 16:08:39 crc kubenswrapper[4707]: I1127 16:08:39.546776 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2b313184-19c4-42e4-b488-63f8e894feea-utilities" (OuterVolumeSpecName: "utilities") pod "2b313184-19c4-42e4-b488-63f8e894feea" (UID: "2b313184-19c4-42e4-b488-63f8e894feea"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 27 16:08:39 crc kubenswrapper[4707]: I1127 16:08:39.548360 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f0d7e53b-3b6f-469d-b550-e61dbea7724f-utilities" (OuterVolumeSpecName: "utilities") pod "f0d7e53b-3b6f-469d-b550-e61dbea7724f" (UID: "f0d7e53b-3b6f-469d-b550-e61dbea7724f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 27 16:08:39 crc kubenswrapper[4707]: I1127 16:08:39.549262 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fb266fde-0b2c-4866-8274-9ed2d4821c14-utilities" (OuterVolumeSpecName: "utilities") pod "fb266fde-0b2c-4866-8274-9ed2d4821c14" (UID: "fb266fde-0b2c-4866-8274-9ed2d4821c14"). 
InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 27 16:08:39 crc kubenswrapper[4707]: I1127 16:08:39.550249 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d9ba519a-a9a8-4d1a-96c1-e66ee87d5c8f-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "d9ba519a-a9a8-4d1a-96c1-e66ee87d5c8f" (UID: "d9ba519a-a9a8-4d1a-96c1-e66ee87d5c8f"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 16:08:39 crc kubenswrapper[4707]: I1127 16:08:39.552225 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2b313184-19c4-42e4-b488-63f8e894feea-kube-api-access-vv7fb" (OuterVolumeSpecName: "kube-api-access-vv7fb") pod "2b313184-19c4-42e4-b488-63f8e894feea" (UID: "2b313184-19c4-42e4-b488-63f8e894feea"). InnerVolumeSpecName "kube-api-access-vv7fb". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 16:08:39 crc kubenswrapper[4707]: I1127 16:08:39.553571 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bafb40a6-a701-4082-a791-65fce73e8669-kube-api-access-5vm85" (OuterVolumeSpecName: "kube-api-access-5vm85") pod "bafb40a6-a701-4082-a791-65fce73e8669" (UID: "bafb40a6-a701-4082-a791-65fce73e8669"). InnerVolumeSpecName "kube-api-access-5vm85". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 16:08:39 crc kubenswrapper[4707]: I1127 16:08:39.554235 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d9ba519a-a9a8-4d1a-96c1-e66ee87d5c8f-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "d9ba519a-a9a8-4d1a-96c1-e66ee87d5c8f" (UID: "d9ba519a-a9a8-4d1a-96c1-e66ee87d5c8f"). InnerVolumeSpecName "marketplace-operator-metrics". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 16:08:39 crc kubenswrapper[4707]: I1127 16:08:39.563138 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fb266fde-0b2c-4866-8274-9ed2d4821c14-kube-api-access-6lz5k" (OuterVolumeSpecName: "kube-api-access-6lz5k") pod "fb266fde-0b2c-4866-8274-9ed2d4821c14" (UID: "fb266fde-0b2c-4866-8274-9ed2d4821c14"). InnerVolumeSpecName "kube-api-access-6lz5k". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 16:08:39 crc kubenswrapper[4707]: I1127 16:08:39.563216 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f0d7e53b-3b6f-469d-b550-e61dbea7724f-kube-api-access-wx5bp" (OuterVolumeSpecName: "kube-api-access-wx5bp") pod "f0d7e53b-3b6f-469d-b550-e61dbea7724f" (UID: "f0d7e53b-3b6f-469d-b550-e61dbea7724f"). InnerVolumeSpecName "kube-api-access-wx5bp". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 16:08:39 crc kubenswrapper[4707]: I1127 16:08:39.566526 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d9ba519a-a9a8-4d1a-96c1-e66ee87d5c8f-kube-api-access-vw6s5" (OuterVolumeSpecName: "kube-api-access-vw6s5") pod "d9ba519a-a9a8-4d1a-96c1-e66ee87d5c8f" (UID: "d9ba519a-a9a8-4d1a-96c1-e66ee87d5c8f"). InnerVolumeSpecName "kube-api-access-vw6s5". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 16:08:39 crc kubenswrapper[4707]: I1127 16:08:39.585884 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2b313184-19c4-42e4-b488-63f8e894feea-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2b313184-19c4-42e4-b488-63f8e894feea" (UID: "2b313184-19c4-42e4-b488-63f8e894feea"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 27 16:08:39 crc kubenswrapper[4707]: I1127 16:08:39.599859 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bafb40a6-a701-4082-a791-65fce73e8669-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "bafb40a6-a701-4082-a791-65fce73e8669" (UID: "bafb40a6-a701-4082-a791-65fce73e8669"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 27 16:08:39 crc kubenswrapper[4707]: I1127 16:08:39.621969 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f0d7e53b-3b6f-469d-b550-e61dbea7724f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f0d7e53b-3b6f-469d-b550-e61dbea7724f" (UID: "f0d7e53b-3b6f-469d-b550-e61dbea7724f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 27 16:08:39 crc kubenswrapper[4707]: I1127 16:08:39.647208 4707 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bafb40a6-a701-4082-a791-65fce73e8669-utilities\") on node \"crc\" DevicePath \"\"" Nov 27 16:08:39 crc kubenswrapper[4707]: I1127 16:08:39.647263 4707 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fb266fde-0b2c-4866-8274-9ed2d4821c14-utilities\") on node \"crc\" DevicePath \"\"" Nov 27 16:08:39 crc kubenswrapper[4707]: I1127 16:08:39.647281 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vv7fb\" (UniqueName: \"kubernetes.io/projected/2b313184-19c4-42e4-b488-63f8e894feea-kube-api-access-vv7fb\") on node \"crc\" DevicePath \"\"" Nov 27 16:08:39 crc kubenswrapper[4707]: I1127 16:08:39.647297 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5vm85\" (UniqueName: \"kubernetes.io/projected/bafb40a6-a701-4082-a791-65fce73e8669-kube-api-access-5vm85\") on 
node \"crc\" DevicePath \"\"" Nov 27 16:08:39 crc kubenswrapper[4707]: I1127 16:08:39.647519 4707 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bafb40a6-a701-4082-a791-65fce73e8669-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 27 16:08:39 crc kubenswrapper[4707]: I1127 16:08:39.647532 4707 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f0d7e53b-3b6f-469d-b550-e61dbea7724f-utilities\") on node \"crc\" DevicePath \"\"" Nov 27 16:08:39 crc kubenswrapper[4707]: I1127 16:08:39.647544 4707 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d9ba519a-a9a8-4d1a-96c1-e66ee87d5c8f-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Nov 27 16:08:39 crc kubenswrapper[4707]: I1127 16:08:39.647556 4707 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/d9ba519a-a9a8-4d1a-96c1-e66ee87d5c8f-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Nov 27 16:08:39 crc kubenswrapper[4707]: I1127 16:08:39.647570 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wx5bp\" (UniqueName: \"kubernetes.io/projected/f0d7e53b-3b6f-469d-b550-e61dbea7724f-kube-api-access-wx5bp\") on node \"crc\" DevicePath \"\"" Nov 27 16:08:39 crc kubenswrapper[4707]: I1127 16:08:39.647580 4707 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f0d7e53b-3b6f-469d-b550-e61dbea7724f-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 27 16:08:39 crc kubenswrapper[4707]: I1127 16:08:39.647593 4707 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2b313184-19c4-42e4-b488-63f8e894feea-utilities\") on node \"crc\" DevicePath \"\"" Nov 27 16:08:39 crc kubenswrapper[4707]: I1127 
16:08:39.647604 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vw6s5\" (UniqueName: \"kubernetes.io/projected/d9ba519a-a9a8-4d1a-96c1-e66ee87d5c8f-kube-api-access-vw6s5\") on node \"crc\" DevicePath \"\"" Nov 27 16:08:39 crc kubenswrapper[4707]: I1127 16:08:39.647617 4707 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2b313184-19c4-42e4-b488-63f8e894feea-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 27 16:08:39 crc kubenswrapper[4707]: I1127 16:08:39.647627 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6lz5k\" (UniqueName: \"kubernetes.io/projected/fb266fde-0b2c-4866-8274-9ed2d4821c14-kube-api-access-6lz5k\") on node \"crc\" DevicePath \"\"" Nov 27 16:08:39 crc kubenswrapper[4707]: I1127 16:08:39.671240 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fb266fde-0b2c-4866-8274-9ed2d4821c14-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "fb266fde-0b2c-4866-8274-9ed2d4821c14" (UID: "fb266fde-0b2c-4866-8274-9ed2d4821c14"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 27 16:08:39 crc kubenswrapper[4707]: I1127 16:08:39.748891 4707 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fb266fde-0b2c-4866-8274-9ed2d4821c14-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 27 16:08:39 crc kubenswrapper[4707]: I1127 16:08:39.929084 4707 generic.go:334] "Generic (PLEG): container finished" podID="f0d7e53b-3b6f-469d-b550-e61dbea7724f" containerID="a7892fdd0c003ba7d185cd20616c3a0312eac77264613069952d1c8311468f93" exitCode=0 Nov 27 16:08:39 crc kubenswrapper[4707]: I1127 16:08:39.929169 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-d97n2" event={"ID":"f0d7e53b-3b6f-469d-b550-e61dbea7724f","Type":"ContainerDied","Data":"a7892fdd0c003ba7d185cd20616c3a0312eac77264613069952d1c8311468f93"} Nov 27 16:08:39 crc kubenswrapper[4707]: I1127 16:08:39.929211 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-d97n2" event={"ID":"f0d7e53b-3b6f-469d-b550-e61dbea7724f","Type":"ContainerDied","Data":"bcbc2755d8fde736ca79c664d9ad52162b326fb62c35335e122e25fd0c738771"} Nov 27 16:08:39 crc kubenswrapper[4707]: I1127 16:08:39.929233 4707 scope.go:117] "RemoveContainer" containerID="a7892fdd0c003ba7d185cd20616c3a0312eac77264613069952d1c8311468f93" Nov 27 16:08:39 crc kubenswrapper[4707]: I1127 16:08:39.929244 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-d97n2" Nov 27 16:08:39 crc kubenswrapper[4707]: I1127 16:08:39.931575 4707 generic.go:334] "Generic (PLEG): container finished" podID="d9ba519a-a9a8-4d1a-96c1-e66ee87d5c8f" containerID="3cd8b5f527947be370d0566254e890545d846bef58ddad0e5936ea65f25417bc" exitCode=0 Nov 27 16:08:39 crc kubenswrapper[4707]: I1127 16:08:39.931755 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-jtqhs" Nov 27 16:08:39 crc kubenswrapper[4707]: I1127 16:08:39.931746 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-jtqhs" event={"ID":"d9ba519a-a9a8-4d1a-96c1-e66ee87d5c8f","Type":"ContainerDied","Data":"3cd8b5f527947be370d0566254e890545d846bef58ddad0e5936ea65f25417bc"} Nov 27 16:08:39 crc kubenswrapper[4707]: I1127 16:08:39.931830 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-jtqhs" event={"ID":"d9ba519a-a9a8-4d1a-96c1-e66ee87d5c8f","Type":"ContainerDied","Data":"b812b66ced3f98fc1ee09340abd229271a2068f3a2dd9a8821ceac0dbbbeb19c"} Nov 27 16:08:39 crc kubenswrapper[4707]: I1127 16:08:39.936878 4707 generic.go:334] "Generic (PLEG): container finished" podID="2b313184-19c4-42e4-b488-63f8e894feea" containerID="ab899d6f4e06dbc7cee2fde1e92d57be3549ff6c1d4053bd8a03a25a9cb2750e" exitCode=0 Nov 27 16:08:39 crc kubenswrapper[4707]: I1127 16:08:39.936942 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xhshk" event={"ID":"2b313184-19c4-42e4-b488-63f8e894feea","Type":"ContainerDied","Data":"ab899d6f4e06dbc7cee2fde1e92d57be3549ff6c1d4053bd8a03a25a9cb2750e"} Nov 27 16:08:39 crc kubenswrapper[4707]: I1127 16:08:39.936972 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xhshk" event={"ID":"2b313184-19c4-42e4-b488-63f8e894feea","Type":"ContainerDied","Data":"fd1fa894e27a86ea400eeb7d951d7813dc087f76875acbcc9ce70c5d58d2a1b5"} Nov 27 16:08:39 crc kubenswrapper[4707]: I1127 16:08:39.937047 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-xhshk" Nov 27 16:08:39 crc kubenswrapper[4707]: I1127 16:08:39.949905 4707 generic.go:334] "Generic (PLEG): container finished" podID="fb266fde-0b2c-4866-8274-9ed2d4821c14" containerID="ca005959ca3389d12f7fe2ba04ba36c258787a1cc702ced6cf3a43daf0195480" exitCode=0 Nov 27 16:08:39 crc kubenswrapper[4707]: I1127 16:08:39.950344 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-s5nbn" event={"ID":"fb266fde-0b2c-4866-8274-9ed2d4821c14","Type":"ContainerDied","Data":"ca005959ca3389d12f7fe2ba04ba36c258787a1cc702ced6cf3a43daf0195480"} Nov 27 16:08:39 crc kubenswrapper[4707]: I1127 16:08:39.950415 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-s5nbn" event={"ID":"fb266fde-0b2c-4866-8274-9ed2d4821c14","Type":"ContainerDied","Data":"cc1ed1694a4d759da6aff6e636ec0123a60ecae0aef537d69b79140010f5067e"} Nov 27 16:08:39 crc kubenswrapper[4707]: I1127 16:08:39.950509 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-s5nbn" Nov 27 16:08:39 crc kubenswrapper[4707]: I1127 16:08:39.957759 4707 generic.go:334] "Generic (PLEG): container finished" podID="bafb40a6-a701-4082-a791-65fce73e8669" containerID="5bb901eeac2a9f8650b26a3b5e6aab5057373bc5ba0cce0a32c4d9cc66c1c491" exitCode=0 Nov 27 16:08:39 crc kubenswrapper[4707]: I1127 16:08:39.957807 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-dnk6z" event={"ID":"bafb40a6-a701-4082-a791-65fce73e8669","Type":"ContainerDied","Data":"5bb901eeac2a9f8650b26a3b5e6aab5057373bc5ba0cce0a32c4d9cc66c1c491"} Nov 27 16:08:39 crc kubenswrapper[4707]: I1127 16:08:39.957836 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-dnk6z" event={"ID":"bafb40a6-a701-4082-a791-65fce73e8669","Type":"ContainerDied","Data":"d128c0ff7399430c722aa19967dfb6b9eb9c9f02bb5858723944cea2a5cee014"} Nov 27 16:08:39 crc kubenswrapper[4707]: I1127 16:08:39.957873 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-dnk6z" Nov 27 16:08:39 crc kubenswrapper[4707]: I1127 16:08:39.970575 4707 scope.go:117] "RemoveContainer" containerID="172950d4544e72c28098b45d2c5a118a6a136bc5e3c1346cf09ce6f618212f2a" Nov 27 16:08:39 crc kubenswrapper[4707]: I1127 16:08:39.976103 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-jtqhs"] Nov 27 16:08:39 crc kubenswrapper[4707]: I1127 16:08:39.987241 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-jtqhs"] Nov 27 16:08:39 crc kubenswrapper[4707]: I1127 16:08:39.992587 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-xhshk"] Nov 27 16:08:40 crc kubenswrapper[4707]: I1127 16:08:40.001228 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-xhshk"] Nov 27 16:08:40 crc kubenswrapper[4707]: I1127 16:08:40.006596 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-d97n2"] Nov 27 16:08:40 crc kubenswrapper[4707]: I1127 16:08:40.010495 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-d97n2"] Nov 27 16:08:40 crc kubenswrapper[4707]: I1127 16:08:40.012347 4707 scope.go:117] "RemoveContainer" containerID="29bcbd99c9e0940a37b30fa6eae894f0c121373597e19291f9abb1b10d814829" Nov 27 16:08:40 crc kubenswrapper[4707]: I1127 16:08:40.014231 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-s5nbn"] Nov 27 16:08:40 crc kubenswrapper[4707]: I1127 16:08:40.019260 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-s5nbn"] Nov 27 16:08:40 crc kubenswrapper[4707]: I1127 16:08:40.023982 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/community-operators-dnk6z"] Nov 27 16:08:40 crc kubenswrapper[4707]: I1127 16:08:40.030105 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-dnk6z"] Nov 27 16:08:40 crc kubenswrapper[4707]: I1127 16:08:40.039927 4707 scope.go:117] "RemoveContainer" containerID="a7892fdd0c003ba7d185cd20616c3a0312eac77264613069952d1c8311468f93" Nov 27 16:08:40 crc kubenswrapper[4707]: E1127 16:08:40.041431 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a7892fdd0c003ba7d185cd20616c3a0312eac77264613069952d1c8311468f93\": container with ID starting with a7892fdd0c003ba7d185cd20616c3a0312eac77264613069952d1c8311468f93 not found: ID does not exist" containerID="a7892fdd0c003ba7d185cd20616c3a0312eac77264613069952d1c8311468f93" Nov 27 16:08:40 crc kubenswrapper[4707]: I1127 16:08:40.041484 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a7892fdd0c003ba7d185cd20616c3a0312eac77264613069952d1c8311468f93"} err="failed to get container status \"a7892fdd0c003ba7d185cd20616c3a0312eac77264613069952d1c8311468f93\": rpc error: code = NotFound desc = could not find container \"a7892fdd0c003ba7d185cd20616c3a0312eac77264613069952d1c8311468f93\": container with ID starting with a7892fdd0c003ba7d185cd20616c3a0312eac77264613069952d1c8311468f93 not found: ID does not exist" Nov 27 16:08:40 crc kubenswrapper[4707]: I1127 16:08:40.041517 4707 scope.go:117] "RemoveContainer" containerID="172950d4544e72c28098b45d2c5a118a6a136bc5e3c1346cf09ce6f618212f2a" Nov 27 16:08:40 crc kubenswrapper[4707]: E1127 16:08:40.042058 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"172950d4544e72c28098b45d2c5a118a6a136bc5e3c1346cf09ce6f618212f2a\": container with ID starting with 
172950d4544e72c28098b45d2c5a118a6a136bc5e3c1346cf09ce6f618212f2a not found: ID does not exist" containerID="172950d4544e72c28098b45d2c5a118a6a136bc5e3c1346cf09ce6f618212f2a" Nov 27 16:08:40 crc kubenswrapper[4707]: I1127 16:08:40.042099 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"172950d4544e72c28098b45d2c5a118a6a136bc5e3c1346cf09ce6f618212f2a"} err="failed to get container status \"172950d4544e72c28098b45d2c5a118a6a136bc5e3c1346cf09ce6f618212f2a\": rpc error: code = NotFound desc = could not find container \"172950d4544e72c28098b45d2c5a118a6a136bc5e3c1346cf09ce6f618212f2a\": container with ID starting with 172950d4544e72c28098b45d2c5a118a6a136bc5e3c1346cf09ce6f618212f2a not found: ID does not exist" Nov 27 16:08:40 crc kubenswrapper[4707]: I1127 16:08:40.042128 4707 scope.go:117] "RemoveContainer" containerID="29bcbd99c9e0940a37b30fa6eae894f0c121373597e19291f9abb1b10d814829" Nov 27 16:08:40 crc kubenswrapper[4707]: E1127 16:08:40.042625 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"29bcbd99c9e0940a37b30fa6eae894f0c121373597e19291f9abb1b10d814829\": container with ID starting with 29bcbd99c9e0940a37b30fa6eae894f0c121373597e19291f9abb1b10d814829 not found: ID does not exist" containerID="29bcbd99c9e0940a37b30fa6eae894f0c121373597e19291f9abb1b10d814829" Nov 27 16:08:40 crc kubenswrapper[4707]: I1127 16:08:40.042704 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"29bcbd99c9e0940a37b30fa6eae894f0c121373597e19291f9abb1b10d814829"} err="failed to get container status \"29bcbd99c9e0940a37b30fa6eae894f0c121373597e19291f9abb1b10d814829\": rpc error: code = NotFound desc = could not find container \"29bcbd99c9e0940a37b30fa6eae894f0c121373597e19291f9abb1b10d814829\": container with ID starting with 29bcbd99c9e0940a37b30fa6eae894f0c121373597e19291f9abb1b10d814829 not found: ID does not 
exist" Nov 27 16:08:40 crc kubenswrapper[4707]: I1127 16:08:40.042797 4707 scope.go:117] "RemoveContainer" containerID="3cd8b5f527947be370d0566254e890545d846bef58ddad0e5936ea65f25417bc" Nov 27 16:08:40 crc kubenswrapper[4707]: I1127 16:08:40.057226 4707 scope.go:117] "RemoveContainer" containerID="3cd8b5f527947be370d0566254e890545d846bef58ddad0e5936ea65f25417bc" Nov 27 16:08:40 crc kubenswrapper[4707]: E1127 16:08:40.057599 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3cd8b5f527947be370d0566254e890545d846bef58ddad0e5936ea65f25417bc\": container with ID starting with 3cd8b5f527947be370d0566254e890545d846bef58ddad0e5936ea65f25417bc not found: ID does not exist" containerID="3cd8b5f527947be370d0566254e890545d846bef58ddad0e5936ea65f25417bc" Nov 27 16:08:40 crc kubenswrapper[4707]: I1127 16:08:40.057684 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3cd8b5f527947be370d0566254e890545d846bef58ddad0e5936ea65f25417bc"} err="failed to get container status \"3cd8b5f527947be370d0566254e890545d846bef58ddad0e5936ea65f25417bc\": rpc error: code = NotFound desc = could not find container \"3cd8b5f527947be370d0566254e890545d846bef58ddad0e5936ea65f25417bc\": container with ID starting with 3cd8b5f527947be370d0566254e890545d846bef58ddad0e5936ea65f25417bc not found: ID does not exist" Nov 27 16:08:40 crc kubenswrapper[4707]: I1127 16:08:40.057767 4707 scope.go:117] "RemoveContainer" containerID="ab899d6f4e06dbc7cee2fde1e92d57be3549ff6c1d4053bd8a03a25a9cb2750e" Nov 27 16:08:40 crc kubenswrapper[4707]: I1127 16:08:40.076964 4707 scope.go:117] "RemoveContainer" containerID="75857b3ef4ac137564d3343fc8267f35f20bd1a9ab8431393d27176a0239467e" Nov 27 16:08:40 crc kubenswrapper[4707]: I1127 16:08:40.090725 4707 scope.go:117] "RemoveContainer" containerID="5e67c9bfeda69e9fc00ddd75f3965a7d4d42d4bc2465a853e186a115aa32d252" Nov 27 16:08:40 crc 
kubenswrapper[4707]: I1127 16:08:40.105764 4707 scope.go:117] "RemoveContainer" containerID="ab899d6f4e06dbc7cee2fde1e92d57be3549ff6c1d4053bd8a03a25a9cb2750e" Nov 27 16:08:40 crc kubenswrapper[4707]: E1127 16:08:40.107637 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ab899d6f4e06dbc7cee2fde1e92d57be3549ff6c1d4053bd8a03a25a9cb2750e\": container with ID starting with ab899d6f4e06dbc7cee2fde1e92d57be3549ff6c1d4053bd8a03a25a9cb2750e not found: ID does not exist" containerID="ab899d6f4e06dbc7cee2fde1e92d57be3549ff6c1d4053bd8a03a25a9cb2750e" Nov 27 16:08:40 crc kubenswrapper[4707]: I1127 16:08:40.107695 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ab899d6f4e06dbc7cee2fde1e92d57be3549ff6c1d4053bd8a03a25a9cb2750e"} err="failed to get container status \"ab899d6f4e06dbc7cee2fde1e92d57be3549ff6c1d4053bd8a03a25a9cb2750e\": rpc error: code = NotFound desc = could not find container \"ab899d6f4e06dbc7cee2fde1e92d57be3549ff6c1d4053bd8a03a25a9cb2750e\": container with ID starting with ab899d6f4e06dbc7cee2fde1e92d57be3549ff6c1d4053bd8a03a25a9cb2750e not found: ID does not exist" Nov 27 16:08:40 crc kubenswrapper[4707]: I1127 16:08:40.107734 4707 scope.go:117] "RemoveContainer" containerID="75857b3ef4ac137564d3343fc8267f35f20bd1a9ab8431393d27176a0239467e" Nov 27 16:08:40 crc kubenswrapper[4707]: E1127 16:08:40.107969 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"75857b3ef4ac137564d3343fc8267f35f20bd1a9ab8431393d27176a0239467e\": container with ID starting with 75857b3ef4ac137564d3343fc8267f35f20bd1a9ab8431393d27176a0239467e not found: ID does not exist" containerID="75857b3ef4ac137564d3343fc8267f35f20bd1a9ab8431393d27176a0239467e" Nov 27 16:08:40 crc kubenswrapper[4707]: I1127 16:08:40.108076 4707 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"75857b3ef4ac137564d3343fc8267f35f20bd1a9ab8431393d27176a0239467e"} err="failed to get container status \"75857b3ef4ac137564d3343fc8267f35f20bd1a9ab8431393d27176a0239467e\": rpc error: code = NotFound desc = could not find container \"75857b3ef4ac137564d3343fc8267f35f20bd1a9ab8431393d27176a0239467e\": container with ID starting with 75857b3ef4ac137564d3343fc8267f35f20bd1a9ab8431393d27176a0239467e not found: ID does not exist" Nov 27 16:08:40 crc kubenswrapper[4707]: I1127 16:08:40.108146 4707 scope.go:117] "RemoveContainer" containerID="5e67c9bfeda69e9fc00ddd75f3965a7d4d42d4bc2465a853e186a115aa32d252" Nov 27 16:08:40 crc kubenswrapper[4707]: E1127 16:08:40.108543 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5e67c9bfeda69e9fc00ddd75f3965a7d4d42d4bc2465a853e186a115aa32d252\": container with ID starting with 5e67c9bfeda69e9fc00ddd75f3965a7d4d42d4bc2465a853e186a115aa32d252 not found: ID does not exist" containerID="5e67c9bfeda69e9fc00ddd75f3965a7d4d42d4bc2465a853e186a115aa32d252" Nov 27 16:08:40 crc kubenswrapper[4707]: I1127 16:08:40.108582 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5e67c9bfeda69e9fc00ddd75f3965a7d4d42d4bc2465a853e186a115aa32d252"} err="failed to get container status \"5e67c9bfeda69e9fc00ddd75f3965a7d4d42d4bc2465a853e186a115aa32d252\": rpc error: code = NotFound desc = could not find container \"5e67c9bfeda69e9fc00ddd75f3965a7d4d42d4bc2465a853e186a115aa32d252\": container with ID starting with 5e67c9bfeda69e9fc00ddd75f3965a7d4d42d4bc2465a853e186a115aa32d252 not found: ID does not exist" Nov 27 16:08:40 crc kubenswrapper[4707]: I1127 16:08:40.108610 4707 scope.go:117] "RemoveContainer" containerID="ca005959ca3389d12f7fe2ba04ba36c258787a1cc702ced6cf3a43daf0195480" Nov 27 16:08:40 crc kubenswrapper[4707]: I1127 16:08:40.121469 4707 scope.go:117] "RemoveContainer" 
containerID="b956a09e7444a0cea00c6b60dc9f657f225f01c55c417a871c44a349114d24c1" Nov 27 16:08:40 crc kubenswrapper[4707]: I1127 16:08:40.135574 4707 scope.go:117] "RemoveContainer" containerID="5ee77d0c1f89c66e0ab40d12c85589400b9c8752e161144d60f3c9641625f8de" Nov 27 16:08:40 crc kubenswrapper[4707]: I1127 16:08:40.200099 4707 scope.go:117] "RemoveContainer" containerID="ca005959ca3389d12f7fe2ba04ba36c258787a1cc702ced6cf3a43daf0195480" Nov 27 16:08:40 crc kubenswrapper[4707]: E1127 16:08:40.200936 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ca005959ca3389d12f7fe2ba04ba36c258787a1cc702ced6cf3a43daf0195480\": container with ID starting with ca005959ca3389d12f7fe2ba04ba36c258787a1cc702ced6cf3a43daf0195480 not found: ID does not exist" containerID="ca005959ca3389d12f7fe2ba04ba36c258787a1cc702ced6cf3a43daf0195480" Nov 27 16:08:40 crc kubenswrapper[4707]: I1127 16:08:40.200978 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ca005959ca3389d12f7fe2ba04ba36c258787a1cc702ced6cf3a43daf0195480"} err="failed to get container status \"ca005959ca3389d12f7fe2ba04ba36c258787a1cc702ced6cf3a43daf0195480\": rpc error: code = NotFound desc = could not find container \"ca005959ca3389d12f7fe2ba04ba36c258787a1cc702ced6cf3a43daf0195480\": container with ID starting with ca005959ca3389d12f7fe2ba04ba36c258787a1cc702ced6cf3a43daf0195480 not found: ID does not exist" Nov 27 16:08:40 crc kubenswrapper[4707]: I1127 16:08:40.201005 4707 scope.go:117] "RemoveContainer" containerID="b956a09e7444a0cea00c6b60dc9f657f225f01c55c417a871c44a349114d24c1" Nov 27 16:08:40 crc kubenswrapper[4707]: E1127 16:08:40.201778 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b956a09e7444a0cea00c6b60dc9f657f225f01c55c417a871c44a349114d24c1\": container with ID starting with 
b956a09e7444a0cea00c6b60dc9f657f225f01c55c417a871c44a349114d24c1 not found: ID does not exist" containerID="b956a09e7444a0cea00c6b60dc9f657f225f01c55c417a871c44a349114d24c1" Nov 27 16:08:40 crc kubenswrapper[4707]: I1127 16:08:40.201832 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b956a09e7444a0cea00c6b60dc9f657f225f01c55c417a871c44a349114d24c1"} err="failed to get container status \"b956a09e7444a0cea00c6b60dc9f657f225f01c55c417a871c44a349114d24c1\": rpc error: code = NotFound desc = could not find container \"b956a09e7444a0cea00c6b60dc9f657f225f01c55c417a871c44a349114d24c1\": container with ID starting with b956a09e7444a0cea00c6b60dc9f657f225f01c55c417a871c44a349114d24c1 not found: ID does not exist" Nov 27 16:08:40 crc kubenswrapper[4707]: I1127 16:08:40.201870 4707 scope.go:117] "RemoveContainer" containerID="5ee77d0c1f89c66e0ab40d12c85589400b9c8752e161144d60f3c9641625f8de" Nov 27 16:08:40 crc kubenswrapper[4707]: E1127 16:08:40.202223 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5ee77d0c1f89c66e0ab40d12c85589400b9c8752e161144d60f3c9641625f8de\": container with ID starting with 5ee77d0c1f89c66e0ab40d12c85589400b9c8752e161144d60f3c9641625f8de not found: ID does not exist" containerID="5ee77d0c1f89c66e0ab40d12c85589400b9c8752e161144d60f3c9641625f8de" Nov 27 16:08:40 crc kubenswrapper[4707]: I1127 16:08:40.202251 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5ee77d0c1f89c66e0ab40d12c85589400b9c8752e161144d60f3c9641625f8de"} err="failed to get container status \"5ee77d0c1f89c66e0ab40d12c85589400b9c8752e161144d60f3c9641625f8de\": rpc error: code = NotFound desc = could not find container \"5ee77d0c1f89c66e0ab40d12c85589400b9c8752e161144d60f3c9641625f8de\": container with ID starting with 5ee77d0c1f89c66e0ab40d12c85589400b9c8752e161144d60f3c9641625f8de not found: ID does not 
exist" Nov 27 16:08:40 crc kubenswrapper[4707]: I1127 16:08:40.202264 4707 scope.go:117] "RemoveContainer" containerID="5bb901eeac2a9f8650b26a3b5e6aab5057373bc5ba0cce0a32c4d9cc66c1c491" Nov 27 16:08:40 crc kubenswrapper[4707]: I1127 16:08:40.220248 4707 scope.go:117] "RemoveContainer" containerID="8cec4971c53347b57e9b9eed3670ad76e33e8157afc3c80c723c4218e7c64cba" Nov 27 16:08:40 crc kubenswrapper[4707]: I1127 16:08:40.237138 4707 scope.go:117] "RemoveContainer" containerID="ba9f98f6b84df7f95cb72ef86b44e6a6c08801f9ac5dc5bfcd517529fbd7fc21" Nov 27 16:08:40 crc kubenswrapper[4707]: I1127 16:08:40.257892 4707 scope.go:117] "RemoveContainer" containerID="5bb901eeac2a9f8650b26a3b5e6aab5057373bc5ba0cce0a32c4d9cc66c1c491" Nov 27 16:08:40 crc kubenswrapper[4707]: E1127 16:08:40.258382 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5bb901eeac2a9f8650b26a3b5e6aab5057373bc5ba0cce0a32c4d9cc66c1c491\": container with ID starting with 5bb901eeac2a9f8650b26a3b5e6aab5057373bc5ba0cce0a32c4d9cc66c1c491 not found: ID does not exist" containerID="5bb901eeac2a9f8650b26a3b5e6aab5057373bc5ba0cce0a32c4d9cc66c1c491" Nov 27 16:08:40 crc kubenswrapper[4707]: I1127 16:08:40.258492 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5bb901eeac2a9f8650b26a3b5e6aab5057373bc5ba0cce0a32c4d9cc66c1c491"} err="failed to get container status \"5bb901eeac2a9f8650b26a3b5e6aab5057373bc5ba0cce0a32c4d9cc66c1c491\": rpc error: code = NotFound desc = could not find container \"5bb901eeac2a9f8650b26a3b5e6aab5057373bc5ba0cce0a32c4d9cc66c1c491\": container with ID starting with 5bb901eeac2a9f8650b26a3b5e6aab5057373bc5ba0cce0a32c4d9cc66c1c491 not found: ID does not exist" Nov 27 16:08:40 crc kubenswrapper[4707]: I1127 16:08:40.258623 4707 scope.go:117] "RemoveContainer" containerID="8cec4971c53347b57e9b9eed3670ad76e33e8157afc3c80c723c4218e7c64cba" Nov 27 16:08:40 crc 
kubenswrapper[4707]: E1127 16:08:40.259095 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8cec4971c53347b57e9b9eed3670ad76e33e8157afc3c80c723c4218e7c64cba\": container with ID starting with 8cec4971c53347b57e9b9eed3670ad76e33e8157afc3c80c723c4218e7c64cba not found: ID does not exist" containerID="8cec4971c53347b57e9b9eed3670ad76e33e8157afc3c80c723c4218e7c64cba" Nov 27 16:08:40 crc kubenswrapper[4707]: I1127 16:08:40.259136 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8cec4971c53347b57e9b9eed3670ad76e33e8157afc3c80c723c4218e7c64cba"} err="failed to get container status \"8cec4971c53347b57e9b9eed3670ad76e33e8157afc3c80c723c4218e7c64cba\": rpc error: code = NotFound desc = could not find container \"8cec4971c53347b57e9b9eed3670ad76e33e8157afc3c80c723c4218e7c64cba\": container with ID starting with 8cec4971c53347b57e9b9eed3670ad76e33e8157afc3c80c723c4218e7c64cba not found: ID does not exist" Nov 27 16:08:40 crc kubenswrapper[4707]: I1127 16:08:40.259165 4707 scope.go:117] "RemoveContainer" containerID="ba9f98f6b84df7f95cb72ef86b44e6a6c08801f9ac5dc5bfcd517529fbd7fc21" Nov 27 16:08:40 crc kubenswrapper[4707]: E1127 16:08:40.259839 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ba9f98f6b84df7f95cb72ef86b44e6a6c08801f9ac5dc5bfcd517529fbd7fc21\": container with ID starting with ba9f98f6b84df7f95cb72ef86b44e6a6c08801f9ac5dc5bfcd517529fbd7fc21 not found: ID does not exist" containerID="ba9f98f6b84df7f95cb72ef86b44e6a6c08801f9ac5dc5bfcd517529fbd7fc21" Nov 27 16:08:40 crc kubenswrapper[4707]: I1127 16:08:40.259896 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ba9f98f6b84df7f95cb72ef86b44e6a6c08801f9ac5dc5bfcd517529fbd7fc21"} err="failed to get container status 
\"ba9f98f6b84df7f95cb72ef86b44e6a6c08801f9ac5dc5bfcd517529fbd7fc21\": rpc error: code = NotFound desc = could not find container \"ba9f98f6b84df7f95cb72ef86b44e6a6c08801f9ac5dc5bfcd517529fbd7fc21\": container with ID starting with ba9f98f6b84df7f95cb72ef86b44e6a6c08801f9ac5dc5bfcd517529fbd7fc21 not found: ID does not exist" Nov 27 16:08:41 crc kubenswrapper[4707]: I1127 16:08:41.206333 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2b313184-19c4-42e4-b488-63f8e894feea" path="/var/lib/kubelet/pods/2b313184-19c4-42e4-b488-63f8e894feea/volumes" Nov 27 16:08:41 crc kubenswrapper[4707]: I1127 16:08:41.209743 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bafb40a6-a701-4082-a791-65fce73e8669" path="/var/lib/kubelet/pods/bafb40a6-a701-4082-a791-65fce73e8669/volumes" Nov 27 16:08:41 crc kubenswrapper[4707]: I1127 16:08:41.211202 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d9ba519a-a9a8-4d1a-96c1-e66ee87d5c8f" path="/var/lib/kubelet/pods/d9ba519a-a9a8-4d1a-96c1-e66ee87d5c8f/volumes" Nov 27 16:08:41 crc kubenswrapper[4707]: I1127 16:08:41.213242 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f0d7e53b-3b6f-469d-b550-e61dbea7724f" path="/var/lib/kubelet/pods/f0d7e53b-3b6f-469d-b550-e61dbea7724f/volumes" Nov 27 16:08:41 crc kubenswrapper[4707]: I1127 16:08:41.214646 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fb266fde-0b2c-4866-8274-9ed2d4821c14" path="/var/lib/kubelet/pods/fb266fde-0b2c-4866-8274-9ed2d4821c14/volumes" Nov 27 16:08:43 crc kubenswrapper[4707]: I1127 16:08:43.013977 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Nov 27 16:08:45 crc kubenswrapper[4707]: I1127 16:08:45.887813 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-k5pzx"] Nov 27 16:08:45 crc 
kubenswrapper[4707]: E1127 16:08:45.888255 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bafb40a6-a701-4082-a791-65fce73e8669" containerName="registry-server" Nov 27 16:08:45 crc kubenswrapper[4707]: I1127 16:08:45.888267 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="bafb40a6-a701-4082-a791-65fce73e8669" containerName="registry-server" Nov 27 16:08:45 crc kubenswrapper[4707]: E1127 16:08:45.888280 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Nov 27 16:08:45 crc kubenswrapper[4707]: I1127 16:08:45.888286 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Nov 27 16:08:45 crc kubenswrapper[4707]: E1127 16:08:45.888294 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fb266fde-0b2c-4866-8274-9ed2d4821c14" containerName="extract-content" Nov 27 16:08:45 crc kubenswrapper[4707]: I1127 16:08:45.888300 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="fb266fde-0b2c-4866-8274-9ed2d4821c14" containerName="extract-content" Nov 27 16:08:45 crc kubenswrapper[4707]: E1127 16:08:45.888312 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f0d7e53b-3b6f-469d-b550-e61dbea7724f" containerName="extract-utilities" Nov 27 16:08:45 crc kubenswrapper[4707]: I1127 16:08:45.888317 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="f0d7e53b-3b6f-469d-b550-e61dbea7724f" containerName="extract-utilities" Nov 27 16:08:45 crc kubenswrapper[4707]: E1127 16:08:45.888326 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fb266fde-0b2c-4866-8274-9ed2d4821c14" containerName="registry-server" Nov 27 16:08:45 crc kubenswrapper[4707]: I1127 16:08:45.888333 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="fb266fde-0b2c-4866-8274-9ed2d4821c14" containerName="registry-server" Nov 27 16:08:45 crc 
kubenswrapper[4707]: E1127 16:08:45.888339 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2b313184-19c4-42e4-b488-63f8e894feea" containerName="registry-server" Nov 27 16:08:45 crc kubenswrapper[4707]: I1127 16:08:45.888344 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="2b313184-19c4-42e4-b488-63f8e894feea" containerName="registry-server" Nov 27 16:08:45 crc kubenswrapper[4707]: E1127 16:08:45.888350 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d9ba519a-a9a8-4d1a-96c1-e66ee87d5c8f" containerName="marketplace-operator" Nov 27 16:08:45 crc kubenswrapper[4707]: I1127 16:08:45.888356 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="d9ba519a-a9a8-4d1a-96c1-e66ee87d5c8f" containerName="marketplace-operator" Nov 27 16:08:45 crc kubenswrapper[4707]: E1127 16:08:45.888421 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2b313184-19c4-42e4-b488-63f8e894feea" containerName="extract-utilities" Nov 27 16:08:45 crc kubenswrapper[4707]: I1127 16:08:45.888434 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="2b313184-19c4-42e4-b488-63f8e894feea" containerName="extract-utilities" Nov 27 16:08:45 crc kubenswrapper[4707]: E1127 16:08:45.888448 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f0d7e53b-3b6f-469d-b550-e61dbea7724f" containerName="extract-content" Nov 27 16:08:45 crc kubenswrapper[4707]: I1127 16:08:45.888456 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="f0d7e53b-3b6f-469d-b550-e61dbea7724f" containerName="extract-content" Nov 27 16:08:45 crc kubenswrapper[4707]: E1127 16:08:45.888466 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fb266fde-0b2c-4866-8274-9ed2d4821c14" containerName="extract-utilities" Nov 27 16:08:45 crc kubenswrapper[4707]: I1127 16:08:45.888473 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="fb266fde-0b2c-4866-8274-9ed2d4821c14" containerName="extract-utilities" Nov 27 16:08:45 
crc kubenswrapper[4707]: E1127 16:08:45.888485 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f0d7e53b-3b6f-469d-b550-e61dbea7724f" containerName="registry-server" Nov 27 16:08:45 crc kubenswrapper[4707]: I1127 16:08:45.888495 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="f0d7e53b-3b6f-469d-b550-e61dbea7724f" containerName="registry-server" Nov 27 16:08:45 crc kubenswrapper[4707]: E1127 16:08:45.888504 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bafb40a6-a701-4082-a791-65fce73e8669" containerName="extract-content" Nov 27 16:08:45 crc kubenswrapper[4707]: I1127 16:08:45.888512 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="bafb40a6-a701-4082-a791-65fce73e8669" containerName="extract-content" Nov 27 16:08:45 crc kubenswrapper[4707]: E1127 16:08:45.888519 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2b313184-19c4-42e4-b488-63f8e894feea" containerName="extract-content" Nov 27 16:08:45 crc kubenswrapper[4707]: I1127 16:08:45.888527 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="2b313184-19c4-42e4-b488-63f8e894feea" containerName="extract-content" Nov 27 16:08:45 crc kubenswrapper[4707]: E1127 16:08:45.888534 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bafb40a6-a701-4082-a791-65fce73e8669" containerName="extract-utilities" Nov 27 16:08:45 crc kubenswrapper[4707]: I1127 16:08:45.888541 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="bafb40a6-a701-4082-a791-65fce73e8669" containerName="extract-utilities" Nov 27 16:08:45 crc kubenswrapper[4707]: E1127 16:08:45.888549 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c43ce123-cfbd-4c97-87d9-9b7144b8443c" containerName="installer" Nov 27 16:08:45 crc kubenswrapper[4707]: I1127 16:08:45.888556 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="c43ce123-cfbd-4c97-87d9-9b7144b8443c" containerName="installer" Nov 27 16:08:45 crc 
kubenswrapper[4707]: I1127 16:08:45.888659 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="2b313184-19c4-42e4-b488-63f8e894feea" containerName="registry-server" Nov 27 16:08:45 crc kubenswrapper[4707]: I1127 16:08:45.888668 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="c43ce123-cfbd-4c97-87d9-9b7144b8443c" containerName="installer" Nov 27 16:08:45 crc kubenswrapper[4707]: I1127 16:08:45.888677 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Nov 27 16:08:45 crc kubenswrapper[4707]: I1127 16:08:45.888687 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="bafb40a6-a701-4082-a791-65fce73e8669" containerName="registry-server" Nov 27 16:08:45 crc kubenswrapper[4707]: I1127 16:08:45.888699 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="d9ba519a-a9a8-4d1a-96c1-e66ee87d5c8f" containerName="marketplace-operator" Nov 27 16:08:45 crc kubenswrapper[4707]: I1127 16:08:45.888707 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="f0d7e53b-3b6f-469d-b550-e61dbea7724f" containerName="registry-server" Nov 27 16:08:45 crc kubenswrapper[4707]: I1127 16:08:45.888714 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="fb266fde-0b2c-4866-8274-9ed2d4821c14" containerName="registry-server" Nov 27 16:08:45 crc kubenswrapper[4707]: I1127 16:08:45.889058 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-k5pzx" Nov 27 16:08:45 crc kubenswrapper[4707]: I1127 16:08:45.892395 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Nov 27 16:08:45 crc kubenswrapper[4707]: I1127 16:08:45.892425 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Nov 27 16:08:45 crc kubenswrapper[4707]: I1127 16:08:45.893912 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Nov 27 16:08:45 crc kubenswrapper[4707]: I1127 16:08:45.894733 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Nov 27 16:08:45 crc kubenswrapper[4707]: I1127 16:08:45.901458 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-k5pzx"] Nov 27 16:08:45 crc kubenswrapper[4707]: I1127 16:08:45.902484 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Nov 27 16:08:45 crc kubenswrapper[4707]: I1127 16:08:45.934627 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6jk2l\" (UniqueName: \"kubernetes.io/projected/d8fb3604-08bd-4ad8-9838-8275101534c7-kube-api-access-6jk2l\") pod \"marketplace-operator-79b997595-k5pzx\" (UID: \"d8fb3604-08bd-4ad8-9838-8275101534c7\") " pod="openshift-marketplace/marketplace-operator-79b997595-k5pzx" Nov 27 16:08:45 crc kubenswrapper[4707]: I1127 16:08:45.934771 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/d8fb3604-08bd-4ad8-9838-8275101534c7-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-k5pzx\" (UID: 
\"d8fb3604-08bd-4ad8-9838-8275101534c7\") " pod="openshift-marketplace/marketplace-operator-79b997595-k5pzx" Nov 27 16:08:45 crc kubenswrapper[4707]: I1127 16:08:45.934924 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d8fb3604-08bd-4ad8-9838-8275101534c7-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-k5pzx\" (UID: \"d8fb3604-08bd-4ad8-9838-8275101534c7\") " pod="openshift-marketplace/marketplace-operator-79b997595-k5pzx" Nov 27 16:08:46 crc kubenswrapper[4707]: I1127 16:08:46.035841 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/d8fb3604-08bd-4ad8-9838-8275101534c7-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-k5pzx\" (UID: \"d8fb3604-08bd-4ad8-9838-8275101534c7\") " pod="openshift-marketplace/marketplace-operator-79b997595-k5pzx" Nov 27 16:08:46 crc kubenswrapper[4707]: I1127 16:08:46.035906 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d8fb3604-08bd-4ad8-9838-8275101534c7-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-k5pzx\" (UID: \"d8fb3604-08bd-4ad8-9838-8275101534c7\") " pod="openshift-marketplace/marketplace-operator-79b997595-k5pzx" Nov 27 16:08:46 crc kubenswrapper[4707]: I1127 16:08:46.035934 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6jk2l\" (UniqueName: \"kubernetes.io/projected/d8fb3604-08bd-4ad8-9838-8275101534c7-kube-api-access-6jk2l\") pod \"marketplace-operator-79b997595-k5pzx\" (UID: \"d8fb3604-08bd-4ad8-9838-8275101534c7\") " pod="openshift-marketplace/marketplace-operator-79b997595-k5pzx" Nov 27 16:08:46 crc kubenswrapper[4707]: I1127 16:08:46.037131 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d8fb3604-08bd-4ad8-9838-8275101534c7-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-k5pzx\" (UID: \"d8fb3604-08bd-4ad8-9838-8275101534c7\") " pod="openshift-marketplace/marketplace-operator-79b997595-k5pzx" Nov 27 16:08:46 crc kubenswrapper[4707]: I1127 16:08:46.052075 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/d8fb3604-08bd-4ad8-9838-8275101534c7-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-k5pzx\" (UID: \"d8fb3604-08bd-4ad8-9838-8275101534c7\") " pod="openshift-marketplace/marketplace-operator-79b997595-k5pzx" Nov 27 16:08:46 crc kubenswrapper[4707]: I1127 16:08:46.052617 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6jk2l\" (UniqueName: \"kubernetes.io/projected/d8fb3604-08bd-4ad8-9838-8275101534c7-kube-api-access-6jk2l\") pod \"marketplace-operator-79b997595-k5pzx\" (UID: \"d8fb3604-08bd-4ad8-9838-8275101534c7\") " pod="openshift-marketplace/marketplace-operator-79b997595-k5pzx" Nov 27 16:08:46 crc kubenswrapper[4707]: I1127 16:08:46.203966 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-k5pzx" Nov 27 16:08:46 crc kubenswrapper[4707]: I1127 16:08:46.685513 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-k5pzx"] Nov 27 16:08:47 crc kubenswrapper[4707]: I1127 16:08:47.001160 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Nov 27 16:08:47 crc kubenswrapper[4707]: I1127 16:08:47.011826 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-k5pzx" event={"ID":"d8fb3604-08bd-4ad8-9838-8275101534c7","Type":"ContainerStarted","Data":"bfd7598e2c2384f4c3eed95f985fca2ca0ecc92378aa4a4f84055bda0b371faa"} Nov 27 16:08:47 crc kubenswrapper[4707]: I1127 16:08:47.011882 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-k5pzx" event={"ID":"d8fb3604-08bd-4ad8-9838-8275101534c7","Type":"ContainerStarted","Data":"563d1f4c2a9869fb7db0068faa4894e473cefffd1903d46daa9741ec3fd39256"} Nov 27 16:08:47 crc kubenswrapper[4707]: I1127 16:08:47.012722 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-k5pzx" Nov 27 16:08:47 crc kubenswrapper[4707]: I1127 16:08:47.013559 4707 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-k5pzx container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.57:8080/healthz\": dial tcp 10.217.0.57:8080: connect: connection refused" start-of-body= Nov 27 16:08:47 crc kubenswrapper[4707]: I1127 16:08:47.013605 4707 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-k5pzx" podUID="d8fb3604-08bd-4ad8-9838-8275101534c7" containerName="marketplace-operator" probeResult="failure" 
output="Get \"http://10.217.0.57:8080/healthz\": dial tcp 10.217.0.57:8080: connect: connection refused" Nov 27 16:08:47 crc kubenswrapper[4707]: I1127 16:08:47.026889 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-k5pzx" podStartSLOduration=2.026870123 podStartE2EDuration="2.026870123s" podCreationTimestamp="2025-11-27 16:08:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-27 16:08:47.026140105 +0000 UTC m=+302.657588873" watchObservedRunningTime="2025-11-27 16:08:47.026870123 +0000 UTC m=+302.658318891" Nov 27 16:08:48 crc kubenswrapper[4707]: I1127 16:08:48.020333 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-k5pzx" Nov 27 16:08:52 crc kubenswrapper[4707]: I1127 16:08:52.112547 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Nov 27 16:08:53 crc kubenswrapper[4707]: I1127 16:08:53.059401 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/1.log" Nov 27 16:08:53 crc kubenswrapper[4707]: I1127 16:08:53.064700 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Nov 27 16:08:53 crc kubenswrapper[4707]: I1127 16:08:53.064774 4707 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="b6ec467c6ba66e25d48bfa045b602452d66481eb089068b9d112c5f70fbdd3f1" exitCode=137 Nov 27 16:08:53 crc kubenswrapper[4707]: I1127 16:08:53.064814 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"b6ec467c6ba66e25d48bfa045b602452d66481eb089068b9d112c5f70fbdd3f1"} Nov 27 16:08:53 crc kubenswrapper[4707]: I1127 16:08:53.064859 4707 scope.go:117] "RemoveContainer" containerID="36f6a5537d5031d21ea157fbcedea40756eac2c44aa1859c0cfbf3dc9a9d9a13" Nov 27 16:08:53 crc kubenswrapper[4707]: I1127 16:08:53.658532 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Nov 27 16:08:54 crc kubenswrapper[4707]: I1127 16:08:54.073868 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/1.log" Nov 27 16:08:54 crc kubenswrapper[4707]: I1127 16:08:54.077072 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"54a3a71d3b9fc098e6d19a5ec5af5db7d8365c6b0abce56451b3d150434e773a"} Nov 27 16:08:54 crc kubenswrapper[4707]: I1127 16:08:54.637728 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Nov 27 16:08:57 crc kubenswrapper[4707]: I1127 16:08:57.142218 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 27 16:08:58 crc kubenswrapper[4707]: I1127 16:08:58.596705 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Nov 27 16:09:00 crc kubenswrapper[4707]: I1127 16:09:00.792760 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Nov 27 16:09:01 crc kubenswrapper[4707]: I1127 16:09:01.725305 4707 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Nov 27 16:09:02 crc kubenswrapper[4707]: I1127 16:09:02.394092 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Nov 27 16:09:02 crc kubenswrapper[4707]: I1127 16:09:02.510236 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 27 16:09:02 crc kubenswrapper[4707]: I1127 16:09:02.518766 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 27 16:09:03 crc kubenswrapper[4707]: I1127 16:09:03.140182 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 27 16:09:06 crc kubenswrapper[4707]: I1127 16:09:06.020965 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Nov 27 16:09:12 crc kubenswrapper[4707]: I1127 16:09:12.149898 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Nov 27 16:09:22 crc kubenswrapper[4707]: I1127 16:09:22.370941 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-4h99l"] Nov 27 16:09:22 crc kubenswrapper[4707]: I1127 16:09:22.373290 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-4h99l" podUID="b43ea736-a4b5-473d-a3a2-3d779a856a86" containerName="route-controller-manager" containerID="cri-o://31feca582c3e58d3e4a3e9c4f008b67c76ef182fd875abb92ea66e013e5a9a7d" gracePeriod=30 Nov 27 16:09:22 crc kubenswrapper[4707]: I1127 16:09:22.379481 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-controller-manager/controller-manager-879f6c89f-gg89j"] Nov 27 16:09:22 crc kubenswrapper[4707]: I1127 16:09:22.380552 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-879f6c89f-gg89j" podUID="30208d57-8abd-4956-92a1-1b1aa21b754a" containerName="controller-manager" containerID="cri-o://5684188d28916ccd6015ae03a44bfd1f8c50f19d434bb6e2d457c3766232f1df" gracePeriod=30 Nov 27 16:09:22 crc kubenswrapper[4707]: I1127 16:09:22.790249 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-4h99l" Nov 27 16:09:22 crc kubenswrapper[4707]: I1127 16:09:22.795193 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-gg89j" Nov 27 16:09:22 crc kubenswrapper[4707]: I1127 16:09:22.803746 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/30208d57-8abd-4956-92a1-1b1aa21b754a-serving-cert\") pod \"30208d57-8abd-4956-92a1-1b1aa21b754a\" (UID: \"30208d57-8abd-4956-92a1-1b1aa21b754a\") " Nov 27 16:09:22 crc kubenswrapper[4707]: I1127 16:09:22.803823 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b43ea736-a4b5-473d-a3a2-3d779a856a86-serving-cert\") pod \"b43ea736-a4b5-473d-a3a2-3d779a856a86\" (UID: \"b43ea736-a4b5-473d-a3a2-3d779a856a86\") " Nov 27 16:09:22 crc kubenswrapper[4707]: I1127 16:09:22.803890 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b43ea736-a4b5-473d-a3a2-3d779a856a86-client-ca\") pod \"b43ea736-a4b5-473d-a3a2-3d779a856a86\" (UID: \"b43ea736-a4b5-473d-a3a2-3d779a856a86\") " Nov 27 16:09:22 crc kubenswrapper[4707]: I1127 
16:09:22.803911 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/30208d57-8abd-4956-92a1-1b1aa21b754a-client-ca\") pod \"30208d57-8abd-4956-92a1-1b1aa21b754a\" (UID: \"30208d57-8abd-4956-92a1-1b1aa21b754a\") " Nov 27 16:09:22 crc kubenswrapper[4707]: I1127 16:09:22.803955 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/30208d57-8abd-4956-92a1-1b1aa21b754a-proxy-ca-bundles\") pod \"30208d57-8abd-4956-92a1-1b1aa21b754a\" (UID: \"30208d57-8abd-4956-92a1-1b1aa21b754a\") " Nov 27 16:09:22 crc kubenswrapper[4707]: I1127 16:09:22.803981 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b43ea736-a4b5-473d-a3a2-3d779a856a86-config\") pod \"b43ea736-a4b5-473d-a3a2-3d779a856a86\" (UID: \"b43ea736-a4b5-473d-a3a2-3d779a856a86\") " Nov 27 16:09:22 crc kubenswrapper[4707]: I1127 16:09:22.804020 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nrzcj\" (UniqueName: \"kubernetes.io/projected/30208d57-8abd-4956-92a1-1b1aa21b754a-kube-api-access-nrzcj\") pod \"30208d57-8abd-4956-92a1-1b1aa21b754a\" (UID: \"30208d57-8abd-4956-92a1-1b1aa21b754a\") " Nov 27 16:09:22 crc kubenswrapper[4707]: I1127 16:09:22.804057 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/30208d57-8abd-4956-92a1-1b1aa21b754a-config\") pod \"30208d57-8abd-4956-92a1-1b1aa21b754a\" (UID: \"30208d57-8abd-4956-92a1-1b1aa21b754a\") " Nov 27 16:09:22 crc kubenswrapper[4707]: I1127 16:09:22.804082 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qkt6m\" (UniqueName: \"kubernetes.io/projected/b43ea736-a4b5-473d-a3a2-3d779a856a86-kube-api-access-qkt6m\") pod 
\"b43ea736-a4b5-473d-a3a2-3d779a856a86\" (UID: \"b43ea736-a4b5-473d-a3a2-3d779a856a86\") " Nov 27 16:09:22 crc kubenswrapper[4707]: I1127 16:09:22.805387 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b43ea736-a4b5-473d-a3a2-3d779a856a86-client-ca" (OuterVolumeSpecName: "client-ca") pod "b43ea736-a4b5-473d-a3a2-3d779a856a86" (UID: "b43ea736-a4b5-473d-a3a2-3d779a856a86"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 16:09:22 crc kubenswrapper[4707]: I1127 16:09:22.805555 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/30208d57-8abd-4956-92a1-1b1aa21b754a-client-ca" (OuterVolumeSpecName: "client-ca") pod "30208d57-8abd-4956-92a1-1b1aa21b754a" (UID: "30208d57-8abd-4956-92a1-1b1aa21b754a"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 16:09:22 crc kubenswrapper[4707]: I1127 16:09:22.805685 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b43ea736-a4b5-473d-a3a2-3d779a856a86-config" (OuterVolumeSpecName: "config") pod "b43ea736-a4b5-473d-a3a2-3d779a856a86" (UID: "b43ea736-a4b5-473d-a3a2-3d779a856a86"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 16:09:22 crc kubenswrapper[4707]: I1127 16:09:22.805735 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/30208d57-8abd-4956-92a1-1b1aa21b754a-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "30208d57-8abd-4956-92a1-1b1aa21b754a" (UID: "30208d57-8abd-4956-92a1-1b1aa21b754a"). InnerVolumeSpecName "proxy-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 16:09:22 crc kubenswrapper[4707]: I1127 16:09:22.805945 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/30208d57-8abd-4956-92a1-1b1aa21b754a-config" (OuterVolumeSpecName: "config") pod "30208d57-8abd-4956-92a1-1b1aa21b754a" (UID: "30208d57-8abd-4956-92a1-1b1aa21b754a"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 16:09:22 crc kubenswrapper[4707]: I1127 16:09:22.810264 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b43ea736-a4b5-473d-a3a2-3d779a856a86-kube-api-access-qkt6m" (OuterVolumeSpecName: "kube-api-access-qkt6m") pod "b43ea736-a4b5-473d-a3a2-3d779a856a86" (UID: "b43ea736-a4b5-473d-a3a2-3d779a856a86"). InnerVolumeSpecName "kube-api-access-qkt6m". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 16:09:22 crc kubenswrapper[4707]: I1127 16:09:22.810378 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b43ea736-a4b5-473d-a3a2-3d779a856a86-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "b43ea736-a4b5-473d-a3a2-3d779a856a86" (UID: "b43ea736-a4b5-473d-a3a2-3d779a856a86"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 16:09:22 crc kubenswrapper[4707]: I1127 16:09:22.844469 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/30208d57-8abd-4956-92a1-1b1aa21b754a-kube-api-access-nrzcj" (OuterVolumeSpecName: "kube-api-access-nrzcj") pod "30208d57-8abd-4956-92a1-1b1aa21b754a" (UID: "30208d57-8abd-4956-92a1-1b1aa21b754a"). InnerVolumeSpecName "kube-api-access-nrzcj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 16:09:22 crc kubenswrapper[4707]: I1127 16:09:22.844619 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/30208d57-8abd-4956-92a1-1b1aa21b754a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "30208d57-8abd-4956-92a1-1b1aa21b754a" (UID: "30208d57-8abd-4956-92a1-1b1aa21b754a"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 16:09:22 crc kubenswrapper[4707]: I1127 16:09:22.905594 4707 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/30208d57-8abd-4956-92a1-1b1aa21b754a-config\") on node \"crc\" DevicePath \"\"" Nov 27 16:09:22 crc kubenswrapper[4707]: I1127 16:09:22.905635 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qkt6m\" (UniqueName: \"kubernetes.io/projected/b43ea736-a4b5-473d-a3a2-3d779a856a86-kube-api-access-qkt6m\") on node \"crc\" DevicePath \"\"" Nov 27 16:09:22 crc kubenswrapper[4707]: I1127 16:09:22.905652 4707 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/30208d57-8abd-4956-92a1-1b1aa21b754a-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 27 16:09:22 crc kubenswrapper[4707]: I1127 16:09:22.905666 4707 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b43ea736-a4b5-473d-a3a2-3d779a856a86-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 27 16:09:22 crc kubenswrapper[4707]: I1127 16:09:22.905678 4707 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b43ea736-a4b5-473d-a3a2-3d779a856a86-client-ca\") on node \"crc\" DevicePath \"\"" Nov 27 16:09:22 crc kubenswrapper[4707]: I1127 16:09:22.905690 4707 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/30208d57-8abd-4956-92a1-1b1aa21b754a-client-ca\") on node \"crc\" DevicePath \"\"" Nov 27 16:09:22 crc kubenswrapper[4707]: I1127 16:09:22.905704 4707 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b43ea736-a4b5-473d-a3a2-3d779a856a86-config\") on node \"crc\" DevicePath \"\"" Nov 27 16:09:22 crc kubenswrapper[4707]: I1127 16:09:22.905719 4707 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/30208d57-8abd-4956-92a1-1b1aa21b754a-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Nov 27 16:09:22 crc kubenswrapper[4707]: I1127 16:09:22.905735 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nrzcj\" (UniqueName: \"kubernetes.io/projected/30208d57-8abd-4956-92a1-1b1aa21b754a-kube-api-access-nrzcj\") on node \"crc\" DevicePath \"\"" Nov 27 16:09:23 crc kubenswrapper[4707]: I1127 16:09:23.298086 4707 generic.go:334] "Generic (PLEG): container finished" podID="b43ea736-a4b5-473d-a3a2-3d779a856a86" containerID="31feca582c3e58d3e4a3e9c4f008b67c76ef182fd875abb92ea66e013e5a9a7d" exitCode=0 Nov 27 16:09:23 crc kubenswrapper[4707]: I1127 16:09:23.298221 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-4h99l" Nov 27 16:09:23 crc kubenswrapper[4707]: I1127 16:09:23.298241 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-4h99l" event={"ID":"b43ea736-a4b5-473d-a3a2-3d779a856a86","Type":"ContainerDied","Data":"31feca582c3e58d3e4a3e9c4f008b67c76ef182fd875abb92ea66e013e5a9a7d"} Nov 27 16:09:23 crc kubenswrapper[4707]: I1127 16:09:23.298297 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-4h99l" event={"ID":"b43ea736-a4b5-473d-a3a2-3d779a856a86","Type":"ContainerDied","Data":"d5df3df523ed7d1b1141340fee7f04d386e882c4300f8d0af10dc3d004c7e88e"} Nov 27 16:09:23 crc kubenswrapper[4707]: I1127 16:09:23.298331 4707 scope.go:117] "RemoveContainer" containerID="31feca582c3e58d3e4a3e9c4f008b67c76ef182fd875abb92ea66e013e5a9a7d" Nov 27 16:09:23 crc kubenswrapper[4707]: I1127 16:09:23.302592 4707 generic.go:334] "Generic (PLEG): container finished" podID="30208d57-8abd-4956-92a1-1b1aa21b754a" containerID="5684188d28916ccd6015ae03a44bfd1f8c50f19d434bb6e2d457c3766232f1df" exitCode=0 Nov 27 16:09:23 crc kubenswrapper[4707]: I1127 16:09:23.302651 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-gg89j" event={"ID":"30208d57-8abd-4956-92a1-1b1aa21b754a","Type":"ContainerDied","Data":"5684188d28916ccd6015ae03a44bfd1f8c50f19d434bb6e2d457c3766232f1df"} Nov 27 16:09:23 crc kubenswrapper[4707]: I1127 16:09:23.302694 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-gg89j" event={"ID":"30208d57-8abd-4956-92a1-1b1aa21b754a","Type":"ContainerDied","Data":"941d944101a47580f4cdbac8a54a1f585ed7037f86417f59ebe45067f09ab75a"} Nov 27 16:09:23 crc kubenswrapper[4707]: I1127 16:09:23.302857 4707 util.go:48] 
"No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-gg89j" Nov 27 16:09:23 crc kubenswrapper[4707]: I1127 16:09:23.332583 4707 scope.go:117] "RemoveContainer" containerID="31feca582c3e58d3e4a3e9c4f008b67c76ef182fd875abb92ea66e013e5a9a7d" Nov 27 16:09:23 crc kubenswrapper[4707]: E1127 16:09:23.333475 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"31feca582c3e58d3e4a3e9c4f008b67c76ef182fd875abb92ea66e013e5a9a7d\": container with ID starting with 31feca582c3e58d3e4a3e9c4f008b67c76ef182fd875abb92ea66e013e5a9a7d not found: ID does not exist" containerID="31feca582c3e58d3e4a3e9c4f008b67c76ef182fd875abb92ea66e013e5a9a7d" Nov 27 16:09:23 crc kubenswrapper[4707]: I1127 16:09:23.333527 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"31feca582c3e58d3e4a3e9c4f008b67c76ef182fd875abb92ea66e013e5a9a7d"} err="failed to get container status \"31feca582c3e58d3e4a3e9c4f008b67c76ef182fd875abb92ea66e013e5a9a7d\": rpc error: code = NotFound desc = could not find container \"31feca582c3e58d3e4a3e9c4f008b67c76ef182fd875abb92ea66e013e5a9a7d\": container with ID starting with 31feca582c3e58d3e4a3e9c4f008b67c76ef182fd875abb92ea66e013e5a9a7d not found: ID does not exist" Nov 27 16:09:23 crc kubenswrapper[4707]: I1127 16:09:23.333562 4707 scope.go:117] "RemoveContainer" containerID="5684188d28916ccd6015ae03a44bfd1f8c50f19d434bb6e2d457c3766232f1df" Nov 27 16:09:23 crc kubenswrapper[4707]: I1127 16:09:23.334458 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-4h99l"] Nov 27 16:09:23 crc kubenswrapper[4707]: I1127 16:09:23.338629 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-4h99l"] Nov 27 16:09:23 crc kubenswrapper[4707]: 
I1127 16:09:23.352109 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-gg89j"] Nov 27 16:09:23 crc kubenswrapper[4707]: I1127 16:09:23.355613 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-gg89j"] Nov 27 16:09:23 crc kubenswrapper[4707]: I1127 16:09:23.360003 4707 scope.go:117] "RemoveContainer" containerID="5684188d28916ccd6015ae03a44bfd1f8c50f19d434bb6e2d457c3766232f1df" Nov 27 16:09:23 crc kubenswrapper[4707]: E1127 16:09:23.360773 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5684188d28916ccd6015ae03a44bfd1f8c50f19d434bb6e2d457c3766232f1df\": container with ID starting with 5684188d28916ccd6015ae03a44bfd1f8c50f19d434bb6e2d457c3766232f1df not found: ID does not exist" containerID="5684188d28916ccd6015ae03a44bfd1f8c50f19d434bb6e2d457c3766232f1df" Nov 27 16:09:23 crc kubenswrapper[4707]: I1127 16:09:23.360894 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5684188d28916ccd6015ae03a44bfd1f8c50f19d434bb6e2d457c3766232f1df"} err="failed to get container status \"5684188d28916ccd6015ae03a44bfd1f8c50f19d434bb6e2d457c3766232f1df\": rpc error: code = NotFound desc = could not find container \"5684188d28916ccd6015ae03a44bfd1f8c50f19d434bb6e2d457c3766232f1df\": container with ID starting with 5684188d28916ccd6015ae03a44bfd1f8c50f19d434bb6e2d457c3766232f1df not found: ID does not exist" Nov 27 16:09:23 crc kubenswrapper[4707]: I1127 16:09:23.723287 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-856f74b64f-9lcrm"] Nov 27 16:09:23 crc kubenswrapper[4707]: E1127 16:09:23.723704 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b43ea736-a4b5-473d-a3a2-3d779a856a86" containerName="route-controller-manager" Nov 27 16:09:23 crc 
kubenswrapper[4707]: I1127 16:09:23.723733 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="b43ea736-a4b5-473d-a3a2-3d779a856a86" containerName="route-controller-manager" Nov 27 16:09:23 crc kubenswrapper[4707]: E1127 16:09:23.723761 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="30208d57-8abd-4956-92a1-1b1aa21b754a" containerName="controller-manager" Nov 27 16:09:23 crc kubenswrapper[4707]: I1127 16:09:23.723780 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="30208d57-8abd-4956-92a1-1b1aa21b754a" containerName="controller-manager" Nov 27 16:09:23 crc kubenswrapper[4707]: I1127 16:09:23.723992 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="b43ea736-a4b5-473d-a3a2-3d779a856a86" containerName="route-controller-manager" Nov 27 16:09:23 crc kubenswrapper[4707]: I1127 16:09:23.724016 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="30208d57-8abd-4956-92a1-1b1aa21b754a" containerName="controller-manager" Nov 27 16:09:23 crc kubenswrapper[4707]: I1127 16:09:23.724670 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-856f74b64f-9lcrm" Nov 27 16:09:23 crc kubenswrapper[4707]: I1127 16:09:23.730895 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Nov 27 16:09:23 crc kubenswrapper[4707]: I1127 16:09:23.732598 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-67cc8d88b-2lms8"] Nov 27 16:09:23 crc kubenswrapper[4707]: I1127 16:09:23.733978 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Nov 27 16:09:23 crc kubenswrapper[4707]: I1127 16:09:23.734104 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Nov 27 16:09:23 crc kubenswrapper[4707]: I1127 16:09:23.734002 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Nov 27 16:09:23 crc kubenswrapper[4707]: I1127 16:09:23.734224 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Nov 27 16:09:23 crc kubenswrapper[4707]: I1127 16:09:23.734292 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Nov 27 16:09:23 crc kubenswrapper[4707]: I1127 16:09:23.736521 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-67cc8d88b-2lms8" Nov 27 16:09:23 crc kubenswrapper[4707]: I1127 16:09:23.739619 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Nov 27 16:09:23 crc kubenswrapper[4707]: I1127 16:09:23.742908 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Nov 27 16:09:23 crc kubenswrapper[4707]: I1127 16:09:23.742965 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Nov 27 16:09:23 crc kubenswrapper[4707]: I1127 16:09:23.742962 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Nov 27 16:09:23 crc kubenswrapper[4707]: I1127 16:09:23.743510 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Nov 27 16:09:23 crc kubenswrapper[4707]: I1127 16:09:23.743748 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Nov 27 16:09:23 crc kubenswrapper[4707]: I1127 16:09:23.744777 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Nov 27 16:09:23 crc kubenswrapper[4707]: I1127 16:09:23.746716 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-856f74b64f-9lcrm"] Nov 27 16:09:23 crc kubenswrapper[4707]: I1127 16:09:23.757542 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-67cc8d88b-2lms8"] Nov 27 16:09:23 crc kubenswrapper[4707]: I1127 16:09:23.817076 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-qh74h\" (UniqueName: \"kubernetes.io/projected/24f6a8b1-589f-4953-917f-6e21345d3c38-kube-api-access-qh74h\") pod \"route-controller-manager-67cc8d88b-2lms8\" (UID: \"24f6a8b1-589f-4953-917f-6e21345d3c38\") " pod="openshift-route-controller-manager/route-controller-manager-67cc8d88b-2lms8" Nov 27 16:09:23 crc kubenswrapper[4707]: I1127 16:09:23.817232 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/286a1cd3-29f8-4539-be07-c4366889e170-serving-cert\") pod \"controller-manager-856f74b64f-9lcrm\" (UID: \"286a1cd3-29f8-4539-be07-c4366889e170\") " pod="openshift-controller-manager/controller-manager-856f74b64f-9lcrm" Nov 27 16:09:23 crc kubenswrapper[4707]: I1127 16:09:23.817319 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/286a1cd3-29f8-4539-be07-c4366889e170-proxy-ca-bundles\") pod \"controller-manager-856f74b64f-9lcrm\" (UID: \"286a1cd3-29f8-4539-be07-c4366889e170\") " pod="openshift-controller-manager/controller-manager-856f74b64f-9lcrm" Nov 27 16:09:23 crc kubenswrapper[4707]: I1127 16:09:23.817356 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/286a1cd3-29f8-4539-be07-c4366889e170-client-ca\") pod \"controller-manager-856f74b64f-9lcrm\" (UID: \"286a1cd3-29f8-4539-be07-c4366889e170\") " pod="openshift-controller-manager/controller-manager-856f74b64f-9lcrm" Nov 27 16:09:23 crc kubenswrapper[4707]: I1127 16:09:23.817426 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6gbr9\" (UniqueName: \"kubernetes.io/projected/286a1cd3-29f8-4539-be07-c4366889e170-kube-api-access-6gbr9\") pod \"controller-manager-856f74b64f-9lcrm\" (UID: \"286a1cd3-29f8-4539-be07-c4366889e170\") 
" pod="openshift-controller-manager/controller-manager-856f74b64f-9lcrm" Nov 27 16:09:23 crc kubenswrapper[4707]: I1127 16:09:23.817501 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/24f6a8b1-589f-4953-917f-6e21345d3c38-config\") pod \"route-controller-manager-67cc8d88b-2lms8\" (UID: \"24f6a8b1-589f-4953-917f-6e21345d3c38\") " pod="openshift-route-controller-manager/route-controller-manager-67cc8d88b-2lms8" Nov 27 16:09:23 crc kubenswrapper[4707]: I1127 16:09:23.817538 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/286a1cd3-29f8-4539-be07-c4366889e170-config\") pod \"controller-manager-856f74b64f-9lcrm\" (UID: \"286a1cd3-29f8-4539-be07-c4366889e170\") " pod="openshift-controller-manager/controller-manager-856f74b64f-9lcrm" Nov 27 16:09:23 crc kubenswrapper[4707]: I1127 16:09:23.817746 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/24f6a8b1-589f-4953-917f-6e21345d3c38-client-ca\") pod \"route-controller-manager-67cc8d88b-2lms8\" (UID: \"24f6a8b1-589f-4953-917f-6e21345d3c38\") " pod="openshift-route-controller-manager/route-controller-manager-67cc8d88b-2lms8" Nov 27 16:09:23 crc kubenswrapper[4707]: I1127 16:09:23.817810 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/24f6a8b1-589f-4953-917f-6e21345d3c38-serving-cert\") pod \"route-controller-manager-67cc8d88b-2lms8\" (UID: \"24f6a8b1-589f-4953-917f-6e21345d3c38\") " pod="openshift-route-controller-manager/route-controller-manager-67cc8d88b-2lms8" Nov 27 16:09:23 crc kubenswrapper[4707]: I1127 16:09:23.918858 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/24f6a8b1-589f-4953-917f-6e21345d3c38-config\") pod \"route-controller-manager-67cc8d88b-2lms8\" (UID: \"24f6a8b1-589f-4953-917f-6e21345d3c38\") " pod="openshift-route-controller-manager/route-controller-manager-67cc8d88b-2lms8" Nov 27 16:09:23 crc kubenswrapper[4707]: I1127 16:09:23.918937 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/286a1cd3-29f8-4539-be07-c4366889e170-config\") pod \"controller-manager-856f74b64f-9lcrm\" (UID: \"286a1cd3-29f8-4539-be07-c4366889e170\") " pod="openshift-controller-manager/controller-manager-856f74b64f-9lcrm" Nov 27 16:09:23 crc kubenswrapper[4707]: I1127 16:09:23.918996 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/24f6a8b1-589f-4953-917f-6e21345d3c38-client-ca\") pod \"route-controller-manager-67cc8d88b-2lms8\" (UID: \"24f6a8b1-589f-4953-917f-6e21345d3c38\") " pod="openshift-route-controller-manager/route-controller-manager-67cc8d88b-2lms8" Nov 27 16:09:23 crc kubenswrapper[4707]: I1127 16:09:23.919030 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/24f6a8b1-589f-4953-917f-6e21345d3c38-serving-cert\") pod \"route-controller-manager-67cc8d88b-2lms8\" (UID: \"24f6a8b1-589f-4953-917f-6e21345d3c38\") " pod="openshift-route-controller-manager/route-controller-manager-67cc8d88b-2lms8" Nov 27 16:09:23 crc kubenswrapper[4707]: I1127 16:09:23.919065 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qh74h\" (UniqueName: \"kubernetes.io/projected/24f6a8b1-589f-4953-917f-6e21345d3c38-kube-api-access-qh74h\") pod \"route-controller-manager-67cc8d88b-2lms8\" (UID: \"24f6a8b1-589f-4953-917f-6e21345d3c38\") " pod="openshift-route-controller-manager/route-controller-manager-67cc8d88b-2lms8" Nov 27 16:09:23 
crc kubenswrapper[4707]: I1127 16:09:23.919119 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/286a1cd3-29f8-4539-be07-c4366889e170-serving-cert\") pod \"controller-manager-856f74b64f-9lcrm\" (UID: \"286a1cd3-29f8-4539-be07-c4366889e170\") " pod="openshift-controller-manager/controller-manager-856f74b64f-9lcrm" Nov 27 16:09:23 crc kubenswrapper[4707]: I1127 16:09:23.919147 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/286a1cd3-29f8-4539-be07-c4366889e170-proxy-ca-bundles\") pod \"controller-manager-856f74b64f-9lcrm\" (UID: \"286a1cd3-29f8-4539-be07-c4366889e170\") " pod="openshift-controller-manager/controller-manager-856f74b64f-9lcrm" Nov 27 16:09:23 crc kubenswrapper[4707]: I1127 16:09:23.919197 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/286a1cd3-29f8-4539-be07-c4366889e170-client-ca\") pod \"controller-manager-856f74b64f-9lcrm\" (UID: \"286a1cd3-29f8-4539-be07-c4366889e170\") " pod="openshift-controller-manager/controller-manager-856f74b64f-9lcrm" Nov 27 16:09:23 crc kubenswrapper[4707]: I1127 16:09:23.919235 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6gbr9\" (UniqueName: \"kubernetes.io/projected/286a1cd3-29f8-4539-be07-c4366889e170-kube-api-access-6gbr9\") pod \"controller-manager-856f74b64f-9lcrm\" (UID: \"286a1cd3-29f8-4539-be07-c4366889e170\") " pod="openshift-controller-manager/controller-manager-856f74b64f-9lcrm" Nov 27 16:09:23 crc kubenswrapper[4707]: I1127 16:09:23.921222 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/286a1cd3-29f8-4539-be07-c4366889e170-client-ca\") pod \"controller-manager-856f74b64f-9lcrm\" (UID: 
\"286a1cd3-29f8-4539-be07-c4366889e170\") " pod="openshift-controller-manager/controller-manager-856f74b64f-9lcrm" Nov 27 16:09:23 crc kubenswrapper[4707]: I1127 16:09:23.921495 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/24f6a8b1-589f-4953-917f-6e21345d3c38-config\") pod \"route-controller-manager-67cc8d88b-2lms8\" (UID: \"24f6a8b1-589f-4953-917f-6e21345d3c38\") " pod="openshift-route-controller-manager/route-controller-manager-67cc8d88b-2lms8" Nov 27 16:09:23 crc kubenswrapper[4707]: I1127 16:09:23.921980 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/24f6a8b1-589f-4953-917f-6e21345d3c38-client-ca\") pod \"route-controller-manager-67cc8d88b-2lms8\" (UID: \"24f6a8b1-589f-4953-917f-6e21345d3c38\") " pod="openshift-route-controller-manager/route-controller-manager-67cc8d88b-2lms8" Nov 27 16:09:23 crc kubenswrapper[4707]: I1127 16:09:23.923874 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/286a1cd3-29f8-4539-be07-c4366889e170-config\") pod \"controller-manager-856f74b64f-9lcrm\" (UID: \"286a1cd3-29f8-4539-be07-c4366889e170\") " pod="openshift-controller-manager/controller-manager-856f74b64f-9lcrm" Nov 27 16:09:23 crc kubenswrapper[4707]: I1127 16:09:23.926620 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/24f6a8b1-589f-4953-917f-6e21345d3c38-serving-cert\") pod \"route-controller-manager-67cc8d88b-2lms8\" (UID: \"24f6a8b1-589f-4953-917f-6e21345d3c38\") " pod="openshift-route-controller-manager/route-controller-manager-67cc8d88b-2lms8" Nov 27 16:09:23 crc kubenswrapper[4707]: I1127 16:09:23.927212 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/286a1cd3-29f8-4539-be07-c4366889e170-serving-cert\") pod \"controller-manager-856f74b64f-9lcrm\" (UID: \"286a1cd3-29f8-4539-be07-c4366889e170\") " pod="openshift-controller-manager/controller-manager-856f74b64f-9lcrm" Nov 27 16:09:23 crc kubenswrapper[4707]: I1127 16:09:23.928753 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/286a1cd3-29f8-4539-be07-c4366889e170-proxy-ca-bundles\") pod \"controller-manager-856f74b64f-9lcrm\" (UID: \"286a1cd3-29f8-4539-be07-c4366889e170\") " pod="openshift-controller-manager/controller-manager-856f74b64f-9lcrm" Nov 27 16:09:23 crc kubenswrapper[4707]: I1127 16:09:23.950272 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qh74h\" (UniqueName: \"kubernetes.io/projected/24f6a8b1-589f-4953-917f-6e21345d3c38-kube-api-access-qh74h\") pod \"route-controller-manager-67cc8d88b-2lms8\" (UID: \"24f6a8b1-589f-4953-917f-6e21345d3c38\") " pod="openshift-route-controller-manager/route-controller-manager-67cc8d88b-2lms8" Nov 27 16:09:23 crc kubenswrapper[4707]: I1127 16:09:23.955836 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6gbr9\" (UniqueName: \"kubernetes.io/projected/286a1cd3-29f8-4539-be07-c4366889e170-kube-api-access-6gbr9\") pod \"controller-manager-856f74b64f-9lcrm\" (UID: \"286a1cd3-29f8-4539-be07-c4366889e170\") " pod="openshift-controller-manager/controller-manager-856f74b64f-9lcrm" Nov 27 16:09:24 crc kubenswrapper[4707]: I1127 16:09:24.096697 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-856f74b64f-9lcrm" Nov 27 16:09:24 crc kubenswrapper[4707]: I1127 16:09:24.111898 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-67cc8d88b-2lms8" Nov 27 16:09:24 crc kubenswrapper[4707]: I1127 16:09:24.457432 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-856f74b64f-9lcrm"] Nov 27 16:09:24 crc kubenswrapper[4707]: I1127 16:09:24.610941 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-67cc8d88b-2lms8"] Nov 27 16:09:25 crc kubenswrapper[4707]: I1127 16:09:25.204029 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="30208d57-8abd-4956-92a1-1b1aa21b754a" path="/var/lib/kubelet/pods/30208d57-8abd-4956-92a1-1b1aa21b754a/volumes" Nov 27 16:09:25 crc kubenswrapper[4707]: I1127 16:09:25.205147 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b43ea736-a4b5-473d-a3a2-3d779a856a86" path="/var/lib/kubelet/pods/b43ea736-a4b5-473d-a3a2-3d779a856a86/volumes" Nov 27 16:09:25 crc kubenswrapper[4707]: I1127 16:09:25.325689 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-67cc8d88b-2lms8" event={"ID":"24f6a8b1-589f-4953-917f-6e21345d3c38","Type":"ContainerStarted","Data":"014f1357812454a0844e637742ea67ff3a7820d580c0886e612acdf7def31431"} Nov 27 16:09:25 crc kubenswrapper[4707]: I1127 16:09:25.325744 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-67cc8d88b-2lms8" event={"ID":"24f6a8b1-589f-4953-917f-6e21345d3c38","Type":"ContainerStarted","Data":"5af1a7e1537ab28141a19da871f9cd66d3ac4e11155e7a852a10510a2923518d"} Nov 27 16:09:25 crc kubenswrapper[4707]: I1127 16:09:25.325939 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-67cc8d88b-2lms8" Nov 27 16:09:25 crc kubenswrapper[4707]: I1127 16:09:25.328188 4707 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-856f74b64f-9lcrm" event={"ID":"286a1cd3-29f8-4539-be07-c4366889e170","Type":"ContainerStarted","Data":"68e7eb6af29f79e56be28b0d360828b30a55ff861aa90a9c962bef121b5a6635"} Nov 27 16:09:25 crc kubenswrapper[4707]: I1127 16:09:25.328224 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-856f74b64f-9lcrm" event={"ID":"286a1cd3-29f8-4539-be07-c4366889e170","Type":"ContainerStarted","Data":"5cb8ffbc4e4bef4d580b3472479c7429e0001a2be39b6486a48306ba3f3d381a"} Nov 27 16:09:25 crc kubenswrapper[4707]: I1127 16:09:25.328431 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-856f74b64f-9lcrm" Nov 27 16:09:25 crc kubenswrapper[4707]: I1127 16:09:25.331744 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-67cc8d88b-2lms8" Nov 27 16:09:25 crc kubenswrapper[4707]: I1127 16:09:25.333417 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-856f74b64f-9lcrm" Nov 27 16:09:25 crc kubenswrapper[4707]: I1127 16:09:25.348359 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-67cc8d88b-2lms8" podStartSLOduration=3.348331931 podStartE2EDuration="3.348331931s" podCreationTimestamp="2025-11-27 16:09:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-27 16:09:25.346899905 +0000 UTC m=+340.978348713" watchObservedRunningTime="2025-11-27 16:09:25.348331931 +0000 UTC m=+340.979780739" Nov 27 16:09:25 crc kubenswrapper[4707]: I1127 16:09:25.373128 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-controller-manager/controller-manager-856f74b64f-9lcrm" podStartSLOduration=3.373104793 podStartE2EDuration="3.373104793s" podCreationTimestamp="2025-11-27 16:09:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-27 16:09:25.368021337 +0000 UTC m=+340.999470115" watchObservedRunningTime="2025-11-27 16:09:25.373104793 +0000 UTC m=+341.004553601" Nov 27 16:09:33 crc kubenswrapper[4707]: I1127 16:09:33.623787 4707 patch_prober.go:28] interesting pod/machine-config-daemon-c995m container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 27 16:09:33 crc kubenswrapper[4707]: I1127 16:09:33.624598 4707 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-c995m" podUID="a83beb0d-8dd1-434a-ace2-933f98e3956f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 27 16:09:33 crc kubenswrapper[4707]: I1127 16:09:33.825300 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-zn2pc"] Nov 27 16:09:33 crc kubenswrapper[4707]: I1127 16:09:33.827453 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-zn2pc" Nov 27 16:09:33 crc kubenswrapper[4707]: I1127 16:09:33.835723 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Nov 27 16:09:33 crc kubenswrapper[4707]: I1127 16:09:33.836050 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-zn2pc"] Nov 27 16:09:33 crc kubenswrapper[4707]: I1127 16:09:33.864308 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vlvpp\" (UniqueName: \"kubernetes.io/projected/363ac0a9-1890-4420-9afd-5a2b2ead2c51-kube-api-access-vlvpp\") pod \"redhat-marketplace-zn2pc\" (UID: \"363ac0a9-1890-4420-9afd-5a2b2ead2c51\") " pod="openshift-marketplace/redhat-marketplace-zn2pc" Nov 27 16:09:33 crc kubenswrapper[4707]: I1127 16:09:33.864595 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/363ac0a9-1890-4420-9afd-5a2b2ead2c51-catalog-content\") pod \"redhat-marketplace-zn2pc\" (UID: \"363ac0a9-1890-4420-9afd-5a2b2ead2c51\") " pod="openshift-marketplace/redhat-marketplace-zn2pc" Nov 27 16:09:33 crc kubenswrapper[4707]: I1127 16:09:33.864721 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/363ac0a9-1890-4420-9afd-5a2b2ead2c51-utilities\") pod \"redhat-marketplace-zn2pc\" (UID: \"363ac0a9-1890-4420-9afd-5a2b2ead2c51\") " pod="openshift-marketplace/redhat-marketplace-zn2pc" Nov 27 16:09:33 crc kubenswrapper[4707]: I1127 16:09:33.966410 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/363ac0a9-1890-4420-9afd-5a2b2ead2c51-utilities\") pod \"redhat-marketplace-zn2pc\" (UID: 
\"363ac0a9-1890-4420-9afd-5a2b2ead2c51\") " pod="openshift-marketplace/redhat-marketplace-zn2pc" Nov 27 16:09:33 crc kubenswrapper[4707]: I1127 16:09:33.966476 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vlvpp\" (UniqueName: \"kubernetes.io/projected/363ac0a9-1890-4420-9afd-5a2b2ead2c51-kube-api-access-vlvpp\") pod \"redhat-marketplace-zn2pc\" (UID: \"363ac0a9-1890-4420-9afd-5a2b2ead2c51\") " pod="openshift-marketplace/redhat-marketplace-zn2pc" Nov 27 16:09:33 crc kubenswrapper[4707]: I1127 16:09:33.966517 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/363ac0a9-1890-4420-9afd-5a2b2ead2c51-catalog-content\") pod \"redhat-marketplace-zn2pc\" (UID: \"363ac0a9-1890-4420-9afd-5a2b2ead2c51\") " pod="openshift-marketplace/redhat-marketplace-zn2pc" Nov 27 16:09:33 crc kubenswrapper[4707]: I1127 16:09:33.967046 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/363ac0a9-1890-4420-9afd-5a2b2ead2c51-catalog-content\") pod \"redhat-marketplace-zn2pc\" (UID: \"363ac0a9-1890-4420-9afd-5a2b2ead2c51\") " pod="openshift-marketplace/redhat-marketplace-zn2pc" Nov 27 16:09:33 crc kubenswrapper[4707]: I1127 16:09:33.967291 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/363ac0a9-1890-4420-9afd-5a2b2ead2c51-utilities\") pod \"redhat-marketplace-zn2pc\" (UID: \"363ac0a9-1890-4420-9afd-5a2b2ead2c51\") " pod="openshift-marketplace/redhat-marketplace-zn2pc" Nov 27 16:09:34 crc kubenswrapper[4707]: I1127 16:09:34.000185 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vlvpp\" (UniqueName: \"kubernetes.io/projected/363ac0a9-1890-4420-9afd-5a2b2ead2c51-kube-api-access-vlvpp\") pod \"redhat-marketplace-zn2pc\" (UID: 
\"363ac0a9-1890-4420-9afd-5a2b2ead2c51\") " pod="openshift-marketplace/redhat-marketplace-zn2pc" Nov 27 16:09:34 crc kubenswrapper[4707]: I1127 16:09:34.018413 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-zthlf"] Nov 27 16:09:34 crc kubenswrapper[4707]: I1127 16:09:34.019590 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-zthlf" Nov 27 16:09:34 crc kubenswrapper[4707]: I1127 16:09:34.023257 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Nov 27 16:09:34 crc kubenswrapper[4707]: I1127 16:09:34.038338 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-zthlf"] Nov 27 16:09:34 crc kubenswrapper[4707]: I1127 16:09:34.153078 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-zn2pc" Nov 27 16:09:34 crc kubenswrapper[4707]: I1127 16:09:34.188762 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/61f2d660-6553-4b21-b002-7bbc56c701ea-catalog-content\") pod \"redhat-operators-zthlf\" (UID: \"61f2d660-6553-4b21-b002-7bbc56c701ea\") " pod="openshift-marketplace/redhat-operators-zthlf" Nov 27 16:09:34 crc kubenswrapper[4707]: I1127 16:09:34.188832 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l58tn\" (UniqueName: \"kubernetes.io/projected/61f2d660-6553-4b21-b002-7bbc56c701ea-kube-api-access-l58tn\") pod \"redhat-operators-zthlf\" (UID: \"61f2d660-6553-4b21-b002-7bbc56c701ea\") " pod="openshift-marketplace/redhat-operators-zthlf" Nov 27 16:09:34 crc kubenswrapper[4707]: I1127 16:09:34.188879 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"utilities\" (UniqueName: \"kubernetes.io/empty-dir/61f2d660-6553-4b21-b002-7bbc56c701ea-utilities\") pod \"redhat-operators-zthlf\" (UID: \"61f2d660-6553-4b21-b002-7bbc56c701ea\") " pod="openshift-marketplace/redhat-operators-zthlf" Nov 27 16:09:34 crc kubenswrapper[4707]: I1127 16:09:34.291316 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/61f2d660-6553-4b21-b002-7bbc56c701ea-catalog-content\") pod \"redhat-operators-zthlf\" (UID: \"61f2d660-6553-4b21-b002-7bbc56c701ea\") " pod="openshift-marketplace/redhat-operators-zthlf" Nov 27 16:09:34 crc kubenswrapper[4707]: I1127 16:09:34.291811 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l58tn\" (UniqueName: \"kubernetes.io/projected/61f2d660-6553-4b21-b002-7bbc56c701ea-kube-api-access-l58tn\") pod \"redhat-operators-zthlf\" (UID: \"61f2d660-6553-4b21-b002-7bbc56c701ea\") " pod="openshift-marketplace/redhat-operators-zthlf" Nov 27 16:09:34 crc kubenswrapper[4707]: I1127 16:09:34.291914 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/61f2d660-6553-4b21-b002-7bbc56c701ea-utilities\") pod \"redhat-operators-zthlf\" (UID: \"61f2d660-6553-4b21-b002-7bbc56c701ea\") " pod="openshift-marketplace/redhat-operators-zthlf" Nov 27 16:09:34 crc kubenswrapper[4707]: I1127 16:09:34.292238 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/61f2d660-6553-4b21-b002-7bbc56c701ea-catalog-content\") pod \"redhat-operators-zthlf\" (UID: \"61f2d660-6553-4b21-b002-7bbc56c701ea\") " pod="openshift-marketplace/redhat-operators-zthlf" Nov 27 16:09:34 crc kubenswrapper[4707]: I1127 16:09:34.293131 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/61f2d660-6553-4b21-b002-7bbc56c701ea-utilities\") pod \"redhat-operators-zthlf\" (UID: \"61f2d660-6553-4b21-b002-7bbc56c701ea\") " pod="openshift-marketplace/redhat-operators-zthlf" Nov 27 16:09:34 crc kubenswrapper[4707]: I1127 16:09:34.329748 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l58tn\" (UniqueName: \"kubernetes.io/projected/61f2d660-6553-4b21-b002-7bbc56c701ea-kube-api-access-l58tn\") pod \"redhat-operators-zthlf\" (UID: \"61f2d660-6553-4b21-b002-7bbc56c701ea\") " pod="openshift-marketplace/redhat-operators-zthlf" Nov 27 16:09:34 crc kubenswrapper[4707]: I1127 16:09:34.402707 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-zthlf" Nov 27 16:09:34 crc kubenswrapper[4707]: I1127 16:09:34.597270 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-zn2pc"] Nov 27 16:09:34 crc kubenswrapper[4707]: I1127 16:09:34.810909 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-zthlf"] Nov 27 16:09:34 crc kubenswrapper[4707]: W1127 16:09:34.820666 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod61f2d660_6553_4b21_b002_7bbc56c701ea.slice/crio-061b58b38c15f1b196ae193f858a0420a88a07586138bfddf0cf1687e6feb6cf WatchSource:0}: Error finding container 061b58b38c15f1b196ae193f858a0420a88a07586138bfddf0cf1687e6feb6cf: Status 404 returned error can't find the container with id 061b58b38c15f1b196ae193f858a0420a88a07586138bfddf0cf1687e6feb6cf Nov 27 16:09:35 crc kubenswrapper[4707]: I1127 16:09:35.395196 4707 generic.go:334] "Generic (PLEG): container finished" podID="61f2d660-6553-4b21-b002-7bbc56c701ea" containerID="51e386f5b18460f359af5fb6ddc0c45976d6fbb0f0b9b9e0a0629198aa75b072" exitCode=0 Nov 27 16:09:35 crc kubenswrapper[4707]: I1127 16:09:35.395305 4707 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zthlf" event={"ID":"61f2d660-6553-4b21-b002-7bbc56c701ea","Type":"ContainerDied","Data":"51e386f5b18460f359af5fb6ddc0c45976d6fbb0f0b9b9e0a0629198aa75b072"} Nov 27 16:09:35 crc kubenswrapper[4707]: I1127 16:09:35.395390 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zthlf" event={"ID":"61f2d660-6553-4b21-b002-7bbc56c701ea","Type":"ContainerStarted","Data":"061b58b38c15f1b196ae193f858a0420a88a07586138bfddf0cf1687e6feb6cf"} Nov 27 16:09:35 crc kubenswrapper[4707]: I1127 16:09:35.396949 4707 generic.go:334] "Generic (PLEG): container finished" podID="363ac0a9-1890-4420-9afd-5a2b2ead2c51" containerID="fb744b399b0ec54e1d4f9ce0964f33243353f7a8f5b727985b9ccf4dca09fa0b" exitCode=0 Nov 27 16:09:35 crc kubenswrapper[4707]: I1127 16:09:35.397004 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zn2pc" event={"ID":"363ac0a9-1890-4420-9afd-5a2b2ead2c51","Type":"ContainerDied","Data":"fb744b399b0ec54e1d4f9ce0964f33243353f7a8f5b727985b9ccf4dca09fa0b"} Nov 27 16:09:35 crc kubenswrapper[4707]: I1127 16:09:35.397052 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zn2pc" event={"ID":"363ac0a9-1890-4420-9afd-5a2b2ead2c51","Type":"ContainerStarted","Data":"8eb48d16466e1de745f751b8b379c59b8d84712e96b83a747e70e7a4f2d9344d"} Nov 27 16:09:36 crc kubenswrapper[4707]: I1127 16:09:36.227510 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-sb7xq"] Nov 27 16:09:36 crc kubenswrapper[4707]: I1127 16:09:36.229586 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-sb7xq" Nov 27 16:09:36 crc kubenswrapper[4707]: I1127 16:09:36.232266 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Nov 27 16:09:36 crc kubenswrapper[4707]: I1127 16:09:36.250973 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-sb7xq"] Nov 27 16:09:36 crc kubenswrapper[4707]: I1127 16:09:36.417086 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rg874\" (UniqueName: \"kubernetes.io/projected/a875a4be-fcc1-4b9e-b14e-5f0fc714639c-kube-api-access-rg874\") pod \"community-operators-sb7xq\" (UID: \"a875a4be-fcc1-4b9e-b14e-5f0fc714639c\") " pod="openshift-marketplace/community-operators-sb7xq" Nov 27 16:09:36 crc kubenswrapper[4707]: I1127 16:09:36.417192 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a875a4be-fcc1-4b9e-b14e-5f0fc714639c-utilities\") pod \"community-operators-sb7xq\" (UID: \"a875a4be-fcc1-4b9e-b14e-5f0fc714639c\") " pod="openshift-marketplace/community-operators-sb7xq" Nov 27 16:09:36 crc kubenswrapper[4707]: I1127 16:09:36.417589 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a875a4be-fcc1-4b9e-b14e-5f0fc714639c-catalog-content\") pod \"community-operators-sb7xq\" (UID: \"a875a4be-fcc1-4b9e-b14e-5f0fc714639c\") " pod="openshift-marketplace/community-operators-sb7xq" Nov 27 16:09:36 crc kubenswrapper[4707]: I1127 16:09:36.422847 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-hmrsj"] Nov 27 16:09:36 crc kubenswrapper[4707]: I1127 16:09:36.424244 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-hmrsj" Nov 27 16:09:36 crc kubenswrapper[4707]: I1127 16:09:36.426868 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Nov 27 16:09:36 crc kubenswrapper[4707]: I1127 16:09:36.438216 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-hmrsj"] Nov 27 16:09:36 crc kubenswrapper[4707]: I1127 16:09:36.519964 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5mqr2\" (UniqueName: \"kubernetes.io/projected/c38b0f06-ab2f-48e1-9181-d9410e8896be-kube-api-access-5mqr2\") pod \"certified-operators-hmrsj\" (UID: \"c38b0f06-ab2f-48e1-9181-d9410e8896be\") " pod="openshift-marketplace/certified-operators-hmrsj" Nov 27 16:09:36 crc kubenswrapper[4707]: I1127 16:09:36.520122 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a875a4be-fcc1-4b9e-b14e-5f0fc714639c-utilities\") pod \"community-operators-sb7xq\" (UID: \"a875a4be-fcc1-4b9e-b14e-5f0fc714639c\") " pod="openshift-marketplace/community-operators-sb7xq" Nov 27 16:09:36 crc kubenswrapper[4707]: I1127 16:09:36.520186 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c38b0f06-ab2f-48e1-9181-d9410e8896be-catalog-content\") pod \"certified-operators-hmrsj\" (UID: \"c38b0f06-ab2f-48e1-9181-d9410e8896be\") " pod="openshift-marketplace/certified-operators-hmrsj" Nov 27 16:09:36 crc kubenswrapper[4707]: I1127 16:09:36.520229 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a875a4be-fcc1-4b9e-b14e-5f0fc714639c-catalog-content\") pod \"community-operators-sb7xq\" (UID: 
\"a875a4be-fcc1-4b9e-b14e-5f0fc714639c\") " pod="openshift-marketplace/community-operators-sb7xq" Nov 27 16:09:36 crc kubenswrapper[4707]: I1127 16:09:36.520268 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c38b0f06-ab2f-48e1-9181-d9410e8896be-utilities\") pod \"certified-operators-hmrsj\" (UID: \"c38b0f06-ab2f-48e1-9181-d9410e8896be\") " pod="openshift-marketplace/certified-operators-hmrsj" Nov 27 16:09:36 crc kubenswrapper[4707]: I1127 16:09:36.520299 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rg874\" (UniqueName: \"kubernetes.io/projected/a875a4be-fcc1-4b9e-b14e-5f0fc714639c-kube-api-access-rg874\") pod \"community-operators-sb7xq\" (UID: \"a875a4be-fcc1-4b9e-b14e-5f0fc714639c\") " pod="openshift-marketplace/community-operators-sb7xq" Nov 27 16:09:36 crc kubenswrapper[4707]: I1127 16:09:36.520631 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a875a4be-fcc1-4b9e-b14e-5f0fc714639c-utilities\") pod \"community-operators-sb7xq\" (UID: \"a875a4be-fcc1-4b9e-b14e-5f0fc714639c\") " pod="openshift-marketplace/community-operators-sb7xq" Nov 27 16:09:36 crc kubenswrapper[4707]: I1127 16:09:36.520658 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a875a4be-fcc1-4b9e-b14e-5f0fc714639c-catalog-content\") pod \"community-operators-sb7xq\" (UID: \"a875a4be-fcc1-4b9e-b14e-5f0fc714639c\") " pod="openshift-marketplace/community-operators-sb7xq" Nov 27 16:09:36 crc kubenswrapper[4707]: I1127 16:09:36.548009 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rg874\" (UniqueName: \"kubernetes.io/projected/a875a4be-fcc1-4b9e-b14e-5f0fc714639c-kube-api-access-rg874\") pod \"community-operators-sb7xq\" (UID: 
\"a875a4be-fcc1-4b9e-b14e-5f0fc714639c\") " pod="openshift-marketplace/community-operators-sb7xq" Nov 27 16:09:36 crc kubenswrapper[4707]: I1127 16:09:36.568934 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-sb7xq" Nov 27 16:09:36 crc kubenswrapper[4707]: I1127 16:09:36.620759 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c38b0f06-ab2f-48e1-9181-d9410e8896be-utilities\") pod \"certified-operators-hmrsj\" (UID: \"c38b0f06-ab2f-48e1-9181-d9410e8896be\") " pod="openshift-marketplace/certified-operators-hmrsj" Nov 27 16:09:36 crc kubenswrapper[4707]: I1127 16:09:36.620813 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5mqr2\" (UniqueName: \"kubernetes.io/projected/c38b0f06-ab2f-48e1-9181-d9410e8896be-kube-api-access-5mqr2\") pod \"certified-operators-hmrsj\" (UID: \"c38b0f06-ab2f-48e1-9181-d9410e8896be\") " pod="openshift-marketplace/certified-operators-hmrsj" Nov 27 16:09:36 crc kubenswrapper[4707]: I1127 16:09:36.620870 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c38b0f06-ab2f-48e1-9181-d9410e8896be-catalog-content\") pod \"certified-operators-hmrsj\" (UID: \"c38b0f06-ab2f-48e1-9181-d9410e8896be\") " pod="openshift-marketplace/certified-operators-hmrsj" Nov 27 16:09:36 crc kubenswrapper[4707]: I1127 16:09:36.621418 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c38b0f06-ab2f-48e1-9181-d9410e8896be-utilities\") pod \"certified-operators-hmrsj\" (UID: \"c38b0f06-ab2f-48e1-9181-d9410e8896be\") " pod="openshift-marketplace/certified-operators-hmrsj" Nov 27 16:09:36 crc kubenswrapper[4707]: I1127 16:09:36.621482 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c38b0f06-ab2f-48e1-9181-d9410e8896be-catalog-content\") pod \"certified-operators-hmrsj\" (UID: \"c38b0f06-ab2f-48e1-9181-d9410e8896be\") " pod="openshift-marketplace/certified-operators-hmrsj" Nov 27 16:09:36 crc kubenswrapper[4707]: I1127 16:09:36.642157 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5mqr2\" (UniqueName: \"kubernetes.io/projected/c38b0f06-ab2f-48e1-9181-d9410e8896be-kube-api-access-5mqr2\") pod \"certified-operators-hmrsj\" (UID: \"c38b0f06-ab2f-48e1-9181-d9410e8896be\") " pod="openshift-marketplace/certified-operators-hmrsj" Nov 27 16:09:36 crc kubenswrapper[4707]: I1127 16:09:36.739830 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-hmrsj" Nov 27 16:09:37 crc kubenswrapper[4707]: I1127 16:09:37.192637 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-sb7xq"] Nov 27 16:09:37 crc kubenswrapper[4707]: I1127 16:09:37.264972 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-hmrsj"] Nov 27 16:09:37 crc kubenswrapper[4707]: W1127 16:09:37.280641 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc38b0f06_ab2f_48e1_9181_d9410e8896be.slice/crio-50336e5d183e121811b30bb071a499cd9fb6ebf7726dc35b424b6ac0f7c5fe3a WatchSource:0}: Error finding container 50336e5d183e121811b30bb071a499cd9fb6ebf7726dc35b424b6ac0f7c5fe3a: Status 404 returned error can't find the container with id 50336e5d183e121811b30bb071a499cd9fb6ebf7726dc35b424b6ac0f7c5fe3a Nov 27 16:09:37 crc kubenswrapper[4707]: I1127 16:09:37.411692 4707 generic.go:334] "Generic (PLEG): container finished" podID="363ac0a9-1890-4420-9afd-5a2b2ead2c51" containerID="a029db1c326afbbad03cdf2e9cd9482d1ee0ef59e2a28c3d1584abc693ee1822" exitCode=0 Nov 27 
16:09:37 crc kubenswrapper[4707]: I1127 16:09:37.412791 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zn2pc" event={"ID":"363ac0a9-1890-4420-9afd-5a2b2ead2c51","Type":"ContainerDied","Data":"a029db1c326afbbad03cdf2e9cd9482d1ee0ef59e2a28c3d1584abc693ee1822"} Nov 27 16:09:37 crc kubenswrapper[4707]: I1127 16:09:37.413753 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hmrsj" event={"ID":"c38b0f06-ab2f-48e1-9181-d9410e8896be","Type":"ContainerStarted","Data":"50336e5d183e121811b30bb071a499cd9fb6ebf7726dc35b424b6ac0f7c5fe3a"} Nov 27 16:09:37 crc kubenswrapper[4707]: I1127 16:09:37.416867 4707 generic.go:334] "Generic (PLEG): container finished" podID="a875a4be-fcc1-4b9e-b14e-5f0fc714639c" containerID="2436268214a92c7bab6dceee35e87e8bb53d7869e46baae1302454089f32d4a9" exitCode=0 Nov 27 16:09:37 crc kubenswrapper[4707]: I1127 16:09:37.416920 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sb7xq" event={"ID":"a875a4be-fcc1-4b9e-b14e-5f0fc714639c","Type":"ContainerDied","Data":"2436268214a92c7bab6dceee35e87e8bb53d7869e46baae1302454089f32d4a9"} Nov 27 16:09:37 crc kubenswrapper[4707]: I1127 16:09:37.416941 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sb7xq" event={"ID":"a875a4be-fcc1-4b9e-b14e-5f0fc714639c","Type":"ContainerStarted","Data":"a74f64345f2091826d3c177c70752e45f32902972e49a5da7e62fa7aa44db154"} Nov 27 16:09:37 crc kubenswrapper[4707]: I1127 16:09:37.421700 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zthlf" event={"ID":"61f2d660-6553-4b21-b002-7bbc56c701ea","Type":"ContainerStarted","Data":"9dc3b46819865069c69629c2a6b4ae7fffd53381267a213bfc9424f908a7c3d2"} Nov 27 16:09:38 crc kubenswrapper[4707]: I1127 16:09:38.429987 4707 generic.go:334] "Generic (PLEG): container finished" 
podID="61f2d660-6553-4b21-b002-7bbc56c701ea" containerID="9dc3b46819865069c69629c2a6b4ae7fffd53381267a213bfc9424f908a7c3d2" exitCode=0 Nov 27 16:09:38 crc kubenswrapper[4707]: I1127 16:09:38.430113 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zthlf" event={"ID":"61f2d660-6553-4b21-b002-7bbc56c701ea","Type":"ContainerDied","Data":"9dc3b46819865069c69629c2a6b4ae7fffd53381267a213bfc9424f908a7c3d2"} Nov 27 16:09:38 crc kubenswrapper[4707]: I1127 16:09:38.437837 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zn2pc" event={"ID":"363ac0a9-1890-4420-9afd-5a2b2ead2c51","Type":"ContainerStarted","Data":"a4544b0c05b4fb4d8a6c400778a0dcc0b6a30d11cbf1f5686cce37028d5bd0f5"} Nov 27 16:09:38 crc kubenswrapper[4707]: I1127 16:09:38.439839 4707 generic.go:334] "Generic (PLEG): container finished" podID="c38b0f06-ab2f-48e1-9181-d9410e8896be" containerID="1614b8541e715e4274fd896b7761a5b5c61ad4656a1894060f7e8f927ebc80d2" exitCode=0 Nov 27 16:09:38 crc kubenswrapper[4707]: I1127 16:09:38.439882 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hmrsj" event={"ID":"c38b0f06-ab2f-48e1-9181-d9410e8896be","Type":"ContainerDied","Data":"1614b8541e715e4274fd896b7761a5b5c61ad4656a1894060f7e8f927ebc80d2"} Nov 27 16:09:38 crc kubenswrapper[4707]: I1127 16:09:38.524071 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-zn2pc" podStartSLOduration=2.945260535 podStartE2EDuration="5.524023735s" podCreationTimestamp="2025-11-27 16:09:33 +0000 UTC" firstStartedPulling="2025-11-27 16:09:35.399090255 +0000 UTC m=+351.030539023" lastFinishedPulling="2025-11-27 16:09:37.977853425 +0000 UTC m=+353.609302223" observedRunningTime="2025-11-27 16:09:38.519986065 +0000 UTC m=+354.151434873" watchObservedRunningTime="2025-11-27 16:09:38.524023735 +0000 UTC m=+354.155472513" Nov 27 16:09:39 
crc kubenswrapper[4707]: I1127 16:09:39.447613 4707 generic.go:334] "Generic (PLEG): container finished" podID="a875a4be-fcc1-4b9e-b14e-5f0fc714639c" containerID="17c0cefed402ffcff218aa6d85851adbe9d3432f15de54f2bf0abfea42623810" exitCode=0 Nov 27 16:09:39 crc kubenswrapper[4707]: I1127 16:09:39.447692 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sb7xq" event={"ID":"a875a4be-fcc1-4b9e-b14e-5f0fc714639c","Type":"ContainerDied","Data":"17c0cefed402ffcff218aa6d85851adbe9d3432f15de54f2bf0abfea42623810"} Nov 27 16:09:40 crc kubenswrapper[4707]: I1127 16:09:40.462392 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zthlf" event={"ID":"61f2d660-6553-4b21-b002-7bbc56c701ea","Type":"ContainerStarted","Data":"c6a2b7c4bd4859aab98592bbbea87866d8053e151bf8209ccecd2ce8781fdbdf"} Nov 27 16:09:40 crc kubenswrapper[4707]: I1127 16:09:40.464829 4707 generic.go:334] "Generic (PLEG): container finished" podID="c38b0f06-ab2f-48e1-9181-d9410e8896be" containerID="9f679c5a4041abfb64a12a0f0b2d9520ebd798865624b3dbb79c15ad7e494a94" exitCode=0 Nov 27 16:09:40 crc kubenswrapper[4707]: I1127 16:09:40.464867 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hmrsj" event={"ID":"c38b0f06-ab2f-48e1-9181-d9410e8896be","Type":"ContainerDied","Data":"9f679c5a4041abfb64a12a0f0b2d9520ebd798865624b3dbb79c15ad7e494a94"} Nov 27 16:09:40 crc kubenswrapper[4707]: I1127 16:09:40.468303 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sb7xq" event={"ID":"a875a4be-fcc1-4b9e-b14e-5f0fc714639c","Type":"ContainerStarted","Data":"7aff2cfb12bf7c388208239d38004d97b94cfff4b0ec19c84f6cd38fab171961"} Nov 27 16:09:40 crc kubenswrapper[4707]: I1127 16:09:40.481098 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-zthlf" 
podStartSLOduration=3.319013536 podStartE2EDuration="7.48107719s" podCreationTimestamp="2025-11-27 16:09:33 +0000 UTC" firstStartedPulling="2025-11-27 16:09:35.396474801 +0000 UTC m=+351.027923579" lastFinishedPulling="2025-11-27 16:09:39.558538445 +0000 UTC m=+355.189987233" observedRunningTime="2025-11-27 16:09:40.478483776 +0000 UTC m=+356.109932554" watchObservedRunningTime="2025-11-27 16:09:40.48107719 +0000 UTC m=+356.112525958" Nov 27 16:09:40 crc kubenswrapper[4707]: I1127 16:09:40.512451 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-sb7xq" podStartSLOduration=2.054289173 podStartE2EDuration="4.512431904s" podCreationTimestamp="2025-11-27 16:09:36 +0000 UTC" firstStartedPulling="2025-11-27 16:09:37.419057364 +0000 UTC m=+353.050506142" lastFinishedPulling="2025-11-27 16:09:39.877200105 +0000 UTC m=+355.508648873" observedRunningTime="2025-11-27 16:09:40.508345303 +0000 UTC m=+356.139794071" watchObservedRunningTime="2025-11-27 16:09:40.512431904 +0000 UTC m=+356.143880672" Nov 27 16:09:41 crc kubenswrapper[4707]: I1127 16:09:41.482510 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hmrsj" event={"ID":"c38b0f06-ab2f-48e1-9181-d9410e8896be","Type":"ContainerStarted","Data":"479925f3c9ff844fbc1e391cb046d2be48fdc13ea46b088dbefc4b5c21d70e0c"} Nov 27 16:09:42 crc kubenswrapper[4707]: I1127 16:09:42.311547 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-hmrsj" podStartSLOduration=3.481976734 podStartE2EDuration="6.311518838s" podCreationTimestamp="2025-11-27 16:09:36 +0000 UTC" firstStartedPulling="2025-11-27 16:09:38.441470076 +0000 UTC m=+354.072918844" lastFinishedPulling="2025-11-27 16:09:41.27101217 +0000 UTC m=+356.902460948" observedRunningTime="2025-11-27 16:09:41.506636159 +0000 UTC m=+357.138084967" watchObservedRunningTime="2025-11-27 16:09:42.311518838 
+0000 UTC m=+357.942967627" Nov 27 16:09:42 crc kubenswrapper[4707]: I1127 16:09:42.313647 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-856f74b64f-9lcrm"] Nov 27 16:09:42 crc kubenswrapper[4707]: I1127 16:09:42.313938 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-856f74b64f-9lcrm" podUID="286a1cd3-29f8-4539-be07-c4366889e170" containerName="controller-manager" containerID="cri-o://68e7eb6af29f79e56be28b0d360828b30a55ff861aa90a9c962bef121b5a6635" gracePeriod=30 Nov 27 16:09:44 crc kubenswrapper[4707]: I1127 16:09:44.097643 4707 patch_prober.go:28] interesting pod/controller-manager-856f74b64f-9lcrm container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.58:8443/healthz\": dial tcp 10.217.0.58:8443: connect: connection refused" start-of-body= Nov 27 16:09:44 crc kubenswrapper[4707]: I1127 16:09:44.098069 4707 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-856f74b64f-9lcrm" podUID="286a1cd3-29f8-4539-be07-c4366889e170" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.58:8443/healthz\": dial tcp 10.217.0.58:8443: connect: connection refused" Nov 27 16:09:44 crc kubenswrapper[4707]: I1127 16:09:44.154924 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-zn2pc" Nov 27 16:09:44 crc kubenswrapper[4707]: I1127 16:09:44.154995 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-zn2pc" Nov 27 16:09:44 crc kubenswrapper[4707]: I1127 16:09:44.222645 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-zn2pc" Nov 27 16:09:44 crc kubenswrapper[4707]: I1127 
16:09:44.404230 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-zthlf" Nov 27 16:09:44 crc kubenswrapper[4707]: I1127 16:09:44.404303 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-zthlf" Nov 27 16:09:44 crc kubenswrapper[4707]: I1127 16:09:44.503269 4707 generic.go:334] "Generic (PLEG): container finished" podID="286a1cd3-29f8-4539-be07-c4366889e170" containerID="68e7eb6af29f79e56be28b0d360828b30a55ff861aa90a9c962bef121b5a6635" exitCode=0 Nov 27 16:09:44 crc kubenswrapper[4707]: I1127 16:09:44.504120 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-856f74b64f-9lcrm" event={"ID":"286a1cd3-29f8-4539-be07-c4366889e170","Type":"ContainerDied","Data":"68e7eb6af29f79e56be28b0d360828b30a55ff861aa90a9c962bef121b5a6635"} Nov 27 16:09:44 crc kubenswrapper[4707]: I1127 16:09:44.550297 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-zn2pc" Nov 27 16:09:44 crc kubenswrapper[4707]: I1127 16:09:44.787473 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-856f74b64f-9lcrm" Nov 27 16:09:44 crc kubenswrapper[4707]: I1127 16:09:44.831228 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-5b599ffdbd-zcdjm"] Nov 27 16:09:44 crc kubenswrapper[4707]: E1127 16:09:44.831655 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="286a1cd3-29f8-4539-be07-c4366889e170" containerName="controller-manager" Nov 27 16:09:44 crc kubenswrapper[4707]: I1127 16:09:44.831675 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="286a1cd3-29f8-4539-be07-c4366889e170" containerName="controller-manager" Nov 27 16:09:44 crc kubenswrapper[4707]: I1127 16:09:44.831792 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="286a1cd3-29f8-4539-be07-c4366889e170" containerName="controller-manager" Nov 27 16:09:44 crc kubenswrapper[4707]: I1127 16:09:44.832545 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-5b599ffdbd-zcdjm" Nov 27 16:09:44 crc kubenswrapper[4707]: I1127 16:09:44.841301 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-5b599ffdbd-zcdjm"] Nov 27 16:09:44 crc kubenswrapper[4707]: I1127 16:09:44.848181 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/286a1cd3-29f8-4539-be07-c4366889e170-client-ca\") pod \"286a1cd3-29f8-4539-be07-c4366889e170\" (UID: \"286a1cd3-29f8-4539-be07-c4366889e170\") " Nov 27 16:09:44 crc kubenswrapper[4707]: I1127 16:09:44.848245 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/286a1cd3-29f8-4539-be07-c4366889e170-config\") pod \"286a1cd3-29f8-4539-be07-c4366889e170\" (UID: \"286a1cd3-29f8-4539-be07-c4366889e170\") " Nov 27 16:09:44 crc kubenswrapper[4707]: I1127 16:09:44.848306 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/286a1cd3-29f8-4539-be07-c4366889e170-serving-cert\") pod \"286a1cd3-29f8-4539-be07-c4366889e170\" (UID: \"286a1cd3-29f8-4539-be07-c4366889e170\") " Nov 27 16:09:44 crc kubenswrapper[4707]: I1127 16:09:44.848391 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/286a1cd3-29f8-4539-be07-c4366889e170-proxy-ca-bundles\") pod \"286a1cd3-29f8-4539-be07-c4366889e170\" (UID: \"286a1cd3-29f8-4539-be07-c4366889e170\") " Nov 27 16:09:44 crc kubenswrapper[4707]: I1127 16:09:44.848459 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6gbr9\" (UniqueName: \"kubernetes.io/projected/286a1cd3-29f8-4539-be07-c4366889e170-kube-api-access-6gbr9\") pod \"286a1cd3-29f8-4539-be07-c4366889e170\" (UID: 
\"286a1cd3-29f8-4539-be07-c4366889e170\") " Nov 27 16:09:44 crc kubenswrapper[4707]: I1127 16:09:44.850641 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/286a1cd3-29f8-4539-be07-c4366889e170-config" (OuterVolumeSpecName: "config") pod "286a1cd3-29f8-4539-be07-c4366889e170" (UID: "286a1cd3-29f8-4539-be07-c4366889e170"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 16:09:44 crc kubenswrapper[4707]: I1127 16:09:44.851100 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/286a1cd3-29f8-4539-be07-c4366889e170-client-ca" (OuterVolumeSpecName: "client-ca") pod "286a1cd3-29f8-4539-be07-c4366889e170" (UID: "286a1cd3-29f8-4539-be07-c4366889e170"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 16:09:44 crc kubenswrapper[4707]: I1127 16:09:44.853182 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/286a1cd3-29f8-4539-be07-c4366889e170-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "286a1cd3-29f8-4539-be07-c4366889e170" (UID: "286a1cd3-29f8-4539-be07-c4366889e170"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 16:09:44 crc kubenswrapper[4707]: I1127 16:09:44.855188 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/286a1cd3-29f8-4539-be07-c4366889e170-kube-api-access-6gbr9" (OuterVolumeSpecName: "kube-api-access-6gbr9") pod "286a1cd3-29f8-4539-be07-c4366889e170" (UID: "286a1cd3-29f8-4539-be07-c4366889e170"). InnerVolumeSpecName "kube-api-access-6gbr9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 16:09:44 crc kubenswrapper[4707]: I1127 16:09:44.867530 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/286a1cd3-29f8-4539-be07-c4366889e170-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "286a1cd3-29f8-4539-be07-c4366889e170" (UID: "286a1cd3-29f8-4539-be07-c4366889e170"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 16:09:44 crc kubenswrapper[4707]: I1127 16:09:44.950238 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/75c539cf-b204-4667-99c5-84a503376430-client-ca\") pod \"controller-manager-5b599ffdbd-zcdjm\" (UID: \"75c539cf-b204-4667-99c5-84a503376430\") " pod="openshift-controller-manager/controller-manager-5b599ffdbd-zcdjm" Nov 27 16:09:44 crc kubenswrapper[4707]: I1127 16:09:44.950345 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sf4sc\" (UniqueName: \"kubernetes.io/projected/75c539cf-b204-4667-99c5-84a503376430-kube-api-access-sf4sc\") pod \"controller-manager-5b599ffdbd-zcdjm\" (UID: \"75c539cf-b204-4667-99c5-84a503376430\") " pod="openshift-controller-manager/controller-manager-5b599ffdbd-zcdjm" Nov 27 16:09:44 crc kubenswrapper[4707]: I1127 16:09:44.950428 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/75c539cf-b204-4667-99c5-84a503376430-proxy-ca-bundles\") pod \"controller-manager-5b599ffdbd-zcdjm\" (UID: \"75c539cf-b204-4667-99c5-84a503376430\") " pod="openshift-controller-manager/controller-manager-5b599ffdbd-zcdjm" Nov 27 16:09:44 crc kubenswrapper[4707]: I1127 16:09:44.950456 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" 
(UniqueName: \"kubernetes.io/secret/75c539cf-b204-4667-99c5-84a503376430-serving-cert\") pod \"controller-manager-5b599ffdbd-zcdjm\" (UID: \"75c539cf-b204-4667-99c5-84a503376430\") " pod="openshift-controller-manager/controller-manager-5b599ffdbd-zcdjm" Nov 27 16:09:44 crc kubenswrapper[4707]: I1127 16:09:44.950607 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/75c539cf-b204-4667-99c5-84a503376430-config\") pod \"controller-manager-5b599ffdbd-zcdjm\" (UID: \"75c539cf-b204-4667-99c5-84a503376430\") " pod="openshift-controller-manager/controller-manager-5b599ffdbd-zcdjm" Nov 27 16:09:44 crc kubenswrapper[4707]: I1127 16:09:44.950815 4707 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/286a1cd3-29f8-4539-be07-c4366889e170-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Nov 27 16:09:44 crc kubenswrapper[4707]: I1127 16:09:44.950834 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6gbr9\" (UniqueName: \"kubernetes.io/projected/286a1cd3-29f8-4539-be07-c4366889e170-kube-api-access-6gbr9\") on node \"crc\" DevicePath \"\"" Nov 27 16:09:44 crc kubenswrapper[4707]: I1127 16:09:44.950850 4707 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/286a1cd3-29f8-4539-be07-c4366889e170-client-ca\") on node \"crc\" DevicePath \"\"" Nov 27 16:09:44 crc kubenswrapper[4707]: I1127 16:09:44.950862 4707 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/286a1cd3-29f8-4539-be07-c4366889e170-config\") on node \"crc\" DevicePath \"\"" Nov 27 16:09:44 crc kubenswrapper[4707]: I1127 16:09:44.950877 4707 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/286a1cd3-29f8-4539-be07-c4366889e170-serving-cert\") on node \"crc\" DevicePath 
\"\"" Nov 27 16:09:45 crc kubenswrapper[4707]: I1127 16:09:45.052345 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/75c539cf-b204-4667-99c5-84a503376430-client-ca\") pod \"controller-manager-5b599ffdbd-zcdjm\" (UID: \"75c539cf-b204-4667-99c5-84a503376430\") " pod="openshift-controller-manager/controller-manager-5b599ffdbd-zcdjm" Nov 27 16:09:45 crc kubenswrapper[4707]: I1127 16:09:45.052457 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sf4sc\" (UniqueName: \"kubernetes.io/projected/75c539cf-b204-4667-99c5-84a503376430-kube-api-access-sf4sc\") pod \"controller-manager-5b599ffdbd-zcdjm\" (UID: \"75c539cf-b204-4667-99c5-84a503376430\") " pod="openshift-controller-manager/controller-manager-5b599ffdbd-zcdjm" Nov 27 16:09:45 crc kubenswrapper[4707]: I1127 16:09:45.052495 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/75c539cf-b204-4667-99c5-84a503376430-proxy-ca-bundles\") pod \"controller-manager-5b599ffdbd-zcdjm\" (UID: \"75c539cf-b204-4667-99c5-84a503376430\") " pod="openshift-controller-manager/controller-manager-5b599ffdbd-zcdjm" Nov 27 16:09:45 crc kubenswrapper[4707]: I1127 16:09:45.052527 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/75c539cf-b204-4667-99c5-84a503376430-serving-cert\") pod \"controller-manager-5b599ffdbd-zcdjm\" (UID: \"75c539cf-b204-4667-99c5-84a503376430\") " pod="openshift-controller-manager/controller-manager-5b599ffdbd-zcdjm" Nov 27 16:09:45 crc kubenswrapper[4707]: I1127 16:09:45.052574 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/75c539cf-b204-4667-99c5-84a503376430-config\") pod \"controller-manager-5b599ffdbd-zcdjm\" (UID: 
\"75c539cf-b204-4667-99c5-84a503376430\") " pod="openshift-controller-manager/controller-manager-5b599ffdbd-zcdjm" Nov 27 16:09:45 crc kubenswrapper[4707]: I1127 16:09:45.054023 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/75c539cf-b204-4667-99c5-84a503376430-client-ca\") pod \"controller-manager-5b599ffdbd-zcdjm\" (UID: \"75c539cf-b204-4667-99c5-84a503376430\") " pod="openshift-controller-manager/controller-manager-5b599ffdbd-zcdjm" Nov 27 16:09:45 crc kubenswrapper[4707]: I1127 16:09:45.054436 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/75c539cf-b204-4667-99c5-84a503376430-config\") pod \"controller-manager-5b599ffdbd-zcdjm\" (UID: \"75c539cf-b204-4667-99c5-84a503376430\") " pod="openshift-controller-manager/controller-manager-5b599ffdbd-zcdjm" Nov 27 16:09:45 crc kubenswrapper[4707]: I1127 16:09:45.054855 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/75c539cf-b204-4667-99c5-84a503376430-proxy-ca-bundles\") pod \"controller-manager-5b599ffdbd-zcdjm\" (UID: \"75c539cf-b204-4667-99c5-84a503376430\") " pod="openshift-controller-manager/controller-manager-5b599ffdbd-zcdjm" Nov 27 16:09:45 crc kubenswrapper[4707]: I1127 16:09:45.058709 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/75c539cf-b204-4667-99c5-84a503376430-serving-cert\") pod \"controller-manager-5b599ffdbd-zcdjm\" (UID: \"75c539cf-b204-4667-99c5-84a503376430\") " pod="openshift-controller-manager/controller-manager-5b599ffdbd-zcdjm" Nov 27 16:09:45 crc kubenswrapper[4707]: I1127 16:09:45.079334 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sf4sc\" (UniqueName: 
\"kubernetes.io/projected/75c539cf-b204-4667-99c5-84a503376430-kube-api-access-sf4sc\") pod \"controller-manager-5b599ffdbd-zcdjm\" (UID: \"75c539cf-b204-4667-99c5-84a503376430\") " pod="openshift-controller-manager/controller-manager-5b599ffdbd-zcdjm" Nov 27 16:09:45 crc kubenswrapper[4707]: I1127 16:09:45.148909 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-5b599ffdbd-zcdjm" Nov 27 16:09:45 crc kubenswrapper[4707]: I1127 16:09:45.458580 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-5b599ffdbd-zcdjm"] Nov 27 16:09:45 crc kubenswrapper[4707]: W1127 16:09:45.468511 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod75c539cf_b204_4667_99c5_84a503376430.slice/crio-9f79f82bef7c1b1d60e4f36aba7a4af2f13ac0cfb5d7c920068ba5c2f8a08b8f WatchSource:0}: Error finding container 9f79f82bef7c1b1d60e4f36aba7a4af2f13ac0cfb5d7c920068ba5c2f8a08b8f: Status 404 returned error can't find the container with id 9f79f82bef7c1b1d60e4f36aba7a4af2f13ac0cfb5d7c920068ba5c2f8a08b8f Nov 27 16:09:45 crc kubenswrapper[4707]: I1127 16:09:45.468570 4707 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-zthlf" podUID="61f2d660-6553-4b21-b002-7bbc56c701ea" containerName="registry-server" probeResult="failure" output=< Nov 27 16:09:45 crc kubenswrapper[4707]: timeout: failed to connect service ":50051" within 1s Nov 27 16:09:45 crc kubenswrapper[4707]: > Nov 27 16:09:45 crc kubenswrapper[4707]: I1127 16:09:45.511434 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5b599ffdbd-zcdjm" event={"ID":"75c539cf-b204-4667-99c5-84a503376430","Type":"ContainerStarted","Data":"9f79f82bef7c1b1d60e4f36aba7a4af2f13ac0cfb5d7c920068ba5c2f8a08b8f"} Nov 27 16:09:45 crc kubenswrapper[4707]: I1127 
16:09:45.514446 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-856f74b64f-9lcrm" event={"ID":"286a1cd3-29f8-4539-be07-c4366889e170","Type":"ContainerDied","Data":"5cb8ffbc4e4bef4d580b3472479c7429e0001a2be39b6486a48306ba3f3d381a"} Nov 27 16:09:45 crc kubenswrapper[4707]: I1127 16:09:45.514637 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-856f74b64f-9lcrm" Nov 27 16:09:45 crc kubenswrapper[4707]: I1127 16:09:45.514788 4707 scope.go:117] "RemoveContainer" containerID="68e7eb6af29f79e56be28b0d360828b30a55ff861aa90a9c962bef121b5a6635" Nov 27 16:09:45 crc kubenswrapper[4707]: I1127 16:09:45.552211 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-856f74b64f-9lcrm"] Nov 27 16:09:45 crc kubenswrapper[4707]: I1127 16:09:45.555636 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-856f74b64f-9lcrm"] Nov 27 16:09:46 crc kubenswrapper[4707]: I1127 16:09:46.520481 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5b599ffdbd-zcdjm" event={"ID":"75c539cf-b204-4667-99c5-84a503376430","Type":"ContainerStarted","Data":"c02d00a6da47d47bd5b7c0c066358e4ebc7b716095ff0c0163e9444f89d67848"} Nov 27 16:09:46 crc kubenswrapper[4707]: I1127 16:09:46.520700 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-5b599ffdbd-zcdjm" Nov 27 16:09:46 crc kubenswrapper[4707]: I1127 16:09:46.527918 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-5b599ffdbd-zcdjm" Nov 27 16:09:46 crc kubenswrapper[4707]: I1127 16:09:46.548339 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-controller-manager/controller-manager-5b599ffdbd-zcdjm" podStartSLOduration=4.548321618 podStartE2EDuration="4.548321618s" podCreationTimestamp="2025-11-27 16:09:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-27 16:09:46.54675742 +0000 UTC m=+362.178206188" watchObservedRunningTime="2025-11-27 16:09:46.548321618 +0000 UTC m=+362.179770386" Nov 27 16:09:46 crc kubenswrapper[4707]: I1127 16:09:46.569743 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-sb7xq" Nov 27 16:09:46 crc kubenswrapper[4707]: I1127 16:09:46.569960 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-sb7xq" Nov 27 16:09:46 crc kubenswrapper[4707]: I1127 16:09:46.638484 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-sb7xq" Nov 27 16:09:46 crc kubenswrapper[4707]: I1127 16:09:46.740074 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-hmrsj" Nov 27 16:09:46 crc kubenswrapper[4707]: I1127 16:09:46.740119 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-hmrsj" Nov 27 16:09:46 crc kubenswrapper[4707]: I1127 16:09:46.795448 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-hmrsj" Nov 27 16:09:47 crc kubenswrapper[4707]: I1127 16:09:47.202069 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="286a1cd3-29f8-4539-be07-c4366889e170" path="/var/lib/kubelet/pods/286a1cd3-29f8-4539-be07-c4366889e170/volumes" Nov 27 16:09:47 crc kubenswrapper[4707]: I1127 16:09:47.576058 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-marketplace/certified-operators-hmrsj" Nov 27 16:09:47 crc kubenswrapper[4707]: I1127 16:09:47.644894 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-sb7xq" Nov 27 16:09:52 crc kubenswrapper[4707]: I1127 16:09:52.395852 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-flczg"] Nov 27 16:09:52 crc kubenswrapper[4707]: I1127 16:09:52.397271 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-flczg" Nov 27 16:09:52 crc kubenswrapper[4707]: I1127 16:09:52.409240 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-flczg"] Nov 27 16:09:52 crc kubenswrapper[4707]: I1127 16:09:52.457451 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/0db8b17a-5cd7-47fb-9972-93ebb9e4b767-bound-sa-token\") pod \"image-registry-66df7c8f76-flczg\" (UID: \"0db8b17a-5cd7-47fb-9972-93ebb9e4b767\") " pod="openshift-image-registry/image-registry-66df7c8f76-flczg" Nov 27 16:09:52 crc kubenswrapper[4707]: I1127 16:09:52.457762 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/0db8b17a-5cd7-47fb-9972-93ebb9e4b767-ca-trust-extracted\") pod \"image-registry-66df7c8f76-flczg\" (UID: \"0db8b17a-5cd7-47fb-9972-93ebb9e4b767\") " pod="openshift-image-registry/image-registry-66df7c8f76-flczg" Nov 27 16:09:52 crc kubenswrapper[4707]: I1127 16:09:52.457794 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/0db8b17a-5cd7-47fb-9972-93ebb9e4b767-installation-pull-secrets\") pod 
\"image-registry-66df7c8f76-flczg\" (UID: \"0db8b17a-5cd7-47fb-9972-93ebb9e4b767\") " pod="openshift-image-registry/image-registry-66df7c8f76-flczg" Nov 27 16:09:52 crc kubenswrapper[4707]: I1127 16:09:52.457844 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/0db8b17a-5cd7-47fb-9972-93ebb9e4b767-registry-tls\") pod \"image-registry-66df7c8f76-flczg\" (UID: \"0db8b17a-5cd7-47fb-9972-93ebb9e4b767\") " pod="openshift-image-registry/image-registry-66df7c8f76-flczg" Nov 27 16:09:52 crc kubenswrapper[4707]: I1127 16:09:52.457870 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c5wwq\" (UniqueName: \"kubernetes.io/projected/0db8b17a-5cd7-47fb-9972-93ebb9e4b767-kube-api-access-c5wwq\") pod \"image-registry-66df7c8f76-flczg\" (UID: \"0db8b17a-5cd7-47fb-9972-93ebb9e4b767\") " pod="openshift-image-registry/image-registry-66df7c8f76-flczg" Nov 27 16:09:52 crc kubenswrapper[4707]: I1127 16:09:52.457901 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/0db8b17a-5cd7-47fb-9972-93ebb9e4b767-registry-certificates\") pod \"image-registry-66df7c8f76-flczg\" (UID: \"0db8b17a-5cd7-47fb-9972-93ebb9e4b767\") " pod="openshift-image-registry/image-registry-66df7c8f76-flczg" Nov 27 16:09:52 crc kubenswrapper[4707]: I1127 16:09:52.457940 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/0db8b17a-5cd7-47fb-9972-93ebb9e4b767-trusted-ca\") pod \"image-registry-66df7c8f76-flczg\" (UID: \"0db8b17a-5cd7-47fb-9972-93ebb9e4b767\") " pod="openshift-image-registry/image-registry-66df7c8f76-flczg" Nov 27 16:09:52 crc kubenswrapper[4707]: I1127 16:09:52.457973 4707 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-flczg\" (UID: \"0db8b17a-5cd7-47fb-9972-93ebb9e4b767\") " pod="openshift-image-registry/image-registry-66df7c8f76-flczg" Nov 27 16:09:52 crc kubenswrapper[4707]: I1127 16:09:52.491554 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-flczg\" (UID: \"0db8b17a-5cd7-47fb-9972-93ebb9e4b767\") " pod="openshift-image-registry/image-registry-66df7c8f76-flczg" Nov 27 16:09:52 crc kubenswrapper[4707]: I1127 16:09:52.558708 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/0db8b17a-5cd7-47fb-9972-93ebb9e4b767-registry-tls\") pod \"image-registry-66df7c8f76-flczg\" (UID: \"0db8b17a-5cd7-47fb-9972-93ebb9e4b767\") " pod="openshift-image-registry/image-registry-66df7c8f76-flczg" Nov 27 16:09:52 crc kubenswrapper[4707]: I1127 16:09:52.558752 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c5wwq\" (UniqueName: \"kubernetes.io/projected/0db8b17a-5cd7-47fb-9972-93ebb9e4b767-kube-api-access-c5wwq\") pod \"image-registry-66df7c8f76-flczg\" (UID: \"0db8b17a-5cd7-47fb-9972-93ebb9e4b767\") " pod="openshift-image-registry/image-registry-66df7c8f76-flczg" Nov 27 16:09:52 crc kubenswrapper[4707]: I1127 16:09:52.558774 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/0db8b17a-5cd7-47fb-9972-93ebb9e4b767-registry-certificates\") pod \"image-registry-66df7c8f76-flczg\" (UID: \"0db8b17a-5cd7-47fb-9972-93ebb9e4b767\") " 
pod="openshift-image-registry/image-registry-66df7c8f76-flczg" Nov 27 16:09:52 crc kubenswrapper[4707]: I1127 16:09:52.558803 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/0db8b17a-5cd7-47fb-9972-93ebb9e4b767-trusted-ca\") pod \"image-registry-66df7c8f76-flczg\" (UID: \"0db8b17a-5cd7-47fb-9972-93ebb9e4b767\") " pod="openshift-image-registry/image-registry-66df7c8f76-flczg" Nov 27 16:09:52 crc kubenswrapper[4707]: I1127 16:09:52.558844 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/0db8b17a-5cd7-47fb-9972-93ebb9e4b767-bound-sa-token\") pod \"image-registry-66df7c8f76-flczg\" (UID: \"0db8b17a-5cd7-47fb-9972-93ebb9e4b767\") " pod="openshift-image-registry/image-registry-66df7c8f76-flczg" Nov 27 16:09:52 crc kubenswrapper[4707]: I1127 16:09:52.558865 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/0db8b17a-5cd7-47fb-9972-93ebb9e4b767-ca-trust-extracted\") pod \"image-registry-66df7c8f76-flczg\" (UID: \"0db8b17a-5cd7-47fb-9972-93ebb9e4b767\") " pod="openshift-image-registry/image-registry-66df7c8f76-flczg" Nov 27 16:09:52 crc kubenswrapper[4707]: I1127 16:09:52.558883 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/0db8b17a-5cd7-47fb-9972-93ebb9e4b767-installation-pull-secrets\") pod \"image-registry-66df7c8f76-flczg\" (UID: \"0db8b17a-5cd7-47fb-9972-93ebb9e4b767\") " pod="openshift-image-registry/image-registry-66df7c8f76-flczg" Nov 27 16:09:52 crc kubenswrapper[4707]: I1127 16:09:52.559695 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/0db8b17a-5cd7-47fb-9972-93ebb9e4b767-ca-trust-extracted\") pod 
\"image-registry-66df7c8f76-flczg\" (UID: \"0db8b17a-5cd7-47fb-9972-93ebb9e4b767\") " pod="openshift-image-registry/image-registry-66df7c8f76-flczg" Nov 27 16:09:52 crc kubenswrapper[4707]: I1127 16:09:52.560190 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/0db8b17a-5cd7-47fb-9972-93ebb9e4b767-trusted-ca\") pod \"image-registry-66df7c8f76-flczg\" (UID: \"0db8b17a-5cd7-47fb-9972-93ebb9e4b767\") " pod="openshift-image-registry/image-registry-66df7c8f76-flczg" Nov 27 16:09:52 crc kubenswrapper[4707]: I1127 16:09:52.560242 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/0db8b17a-5cd7-47fb-9972-93ebb9e4b767-registry-certificates\") pod \"image-registry-66df7c8f76-flczg\" (UID: \"0db8b17a-5cd7-47fb-9972-93ebb9e4b767\") " pod="openshift-image-registry/image-registry-66df7c8f76-flczg" Nov 27 16:09:52 crc kubenswrapper[4707]: I1127 16:09:52.566591 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/0db8b17a-5cd7-47fb-9972-93ebb9e4b767-registry-tls\") pod \"image-registry-66df7c8f76-flczg\" (UID: \"0db8b17a-5cd7-47fb-9972-93ebb9e4b767\") " pod="openshift-image-registry/image-registry-66df7c8f76-flczg" Nov 27 16:09:52 crc kubenswrapper[4707]: I1127 16:09:52.566749 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/0db8b17a-5cd7-47fb-9972-93ebb9e4b767-installation-pull-secrets\") pod \"image-registry-66df7c8f76-flczg\" (UID: \"0db8b17a-5cd7-47fb-9972-93ebb9e4b767\") " pod="openshift-image-registry/image-registry-66df7c8f76-flczg" Nov 27 16:09:52 crc kubenswrapper[4707]: I1127 16:09:52.576361 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c5wwq\" (UniqueName: 
\"kubernetes.io/projected/0db8b17a-5cd7-47fb-9972-93ebb9e4b767-kube-api-access-c5wwq\") pod \"image-registry-66df7c8f76-flczg\" (UID: \"0db8b17a-5cd7-47fb-9972-93ebb9e4b767\") " pod="openshift-image-registry/image-registry-66df7c8f76-flczg" Nov 27 16:09:52 crc kubenswrapper[4707]: I1127 16:09:52.578115 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/0db8b17a-5cd7-47fb-9972-93ebb9e4b767-bound-sa-token\") pod \"image-registry-66df7c8f76-flczg\" (UID: \"0db8b17a-5cd7-47fb-9972-93ebb9e4b767\") " pod="openshift-image-registry/image-registry-66df7c8f76-flczg" Nov 27 16:09:52 crc kubenswrapper[4707]: I1127 16:09:52.717250 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-flczg" Nov 27 16:09:53 crc kubenswrapper[4707]: I1127 16:09:53.171654 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-flczg"] Nov 27 16:09:53 crc kubenswrapper[4707]: I1127 16:09:53.565421 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-flczg" event={"ID":"0db8b17a-5cd7-47fb-9972-93ebb9e4b767","Type":"ContainerStarted","Data":"2b04509ed71af028b8dd1bd1c9766d3f655623a64fee8e3a41464790ccc652dc"} Nov 27 16:09:53 crc kubenswrapper[4707]: I1127 16:09:53.565470 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-flczg" event={"ID":"0db8b17a-5cd7-47fb-9972-93ebb9e4b767","Type":"ContainerStarted","Data":"c5853e483587e06c67218e1eed2782b3b32890ac8257cd1f4f673bf192f3a7e0"} Nov 27 16:09:53 crc kubenswrapper[4707]: I1127 16:09:53.566537 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-66df7c8f76-flczg" Nov 27 16:09:54 crc kubenswrapper[4707]: I1127 16:09:54.439822 4707 kubelet.go:2542] "SyncLoop (probe)" 
probe="startup" status="started" pod="openshift-marketplace/redhat-operators-zthlf" Nov 27 16:09:54 crc kubenswrapper[4707]: I1127 16:09:54.472218 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-66df7c8f76-flczg" podStartSLOduration=2.472202422 podStartE2EDuration="2.472202422s" podCreationTimestamp="2025-11-27 16:09:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-27 16:09:53.594775652 +0000 UTC m=+369.226224420" watchObservedRunningTime="2025-11-27 16:09:54.472202422 +0000 UTC m=+370.103651190" Nov 27 16:09:54 crc kubenswrapper[4707]: I1127 16:09:54.477819 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-zthlf" Nov 27 16:10:02 crc kubenswrapper[4707]: I1127 16:10:02.355811 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-67cc8d88b-2lms8"] Nov 27 16:10:02 crc kubenswrapper[4707]: I1127 16:10:02.356775 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-67cc8d88b-2lms8" podUID="24f6a8b1-589f-4953-917f-6e21345d3c38" containerName="route-controller-manager" containerID="cri-o://014f1357812454a0844e637742ea67ff3a7820d580c0886e612acdf7def31431" gracePeriod=30 Nov 27 16:10:02 crc kubenswrapper[4707]: I1127 16:10:02.619762 4707 generic.go:334] "Generic (PLEG): container finished" podID="24f6a8b1-589f-4953-917f-6e21345d3c38" containerID="014f1357812454a0844e637742ea67ff3a7820d580c0886e612acdf7def31431" exitCode=0 Nov 27 16:10:02 crc kubenswrapper[4707]: I1127 16:10:02.619817 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-67cc8d88b-2lms8" 
event={"ID":"24f6a8b1-589f-4953-917f-6e21345d3c38","Type":"ContainerDied","Data":"014f1357812454a0844e637742ea67ff3a7820d580c0886e612acdf7def31431"} Nov 27 16:10:03 crc kubenswrapper[4707]: I1127 16:10:03.401875 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-67cc8d88b-2lms8" Nov 27 16:10:03 crc kubenswrapper[4707]: I1127 16:10:03.425755 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7454b5fff7-tht6r"] Nov 27 16:10:03 crc kubenswrapper[4707]: E1127 16:10:03.426109 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="24f6a8b1-589f-4953-917f-6e21345d3c38" containerName="route-controller-manager" Nov 27 16:10:03 crc kubenswrapper[4707]: I1127 16:10:03.426135 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="24f6a8b1-589f-4953-917f-6e21345d3c38" containerName="route-controller-manager" Nov 27 16:10:03 crc kubenswrapper[4707]: I1127 16:10:03.426278 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="24f6a8b1-589f-4953-917f-6e21345d3c38" containerName="route-controller-manager" Nov 27 16:10:03 crc kubenswrapper[4707]: I1127 16:10:03.426724 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7454b5fff7-tht6r" Nov 27 16:10:03 crc kubenswrapper[4707]: I1127 16:10:03.438489 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7454b5fff7-tht6r"] Nov 27 16:10:03 crc kubenswrapper[4707]: I1127 16:10:03.512619 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/24f6a8b1-589f-4953-917f-6e21345d3c38-serving-cert\") pod \"24f6a8b1-589f-4953-917f-6e21345d3c38\" (UID: \"24f6a8b1-589f-4953-917f-6e21345d3c38\") " Nov 27 16:10:03 crc kubenswrapper[4707]: I1127 16:10:03.512664 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/24f6a8b1-589f-4953-917f-6e21345d3c38-config\") pod \"24f6a8b1-589f-4953-917f-6e21345d3c38\" (UID: \"24f6a8b1-589f-4953-917f-6e21345d3c38\") " Nov 27 16:10:03 crc kubenswrapper[4707]: I1127 16:10:03.512725 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qh74h\" (UniqueName: \"kubernetes.io/projected/24f6a8b1-589f-4953-917f-6e21345d3c38-kube-api-access-qh74h\") pod \"24f6a8b1-589f-4953-917f-6e21345d3c38\" (UID: \"24f6a8b1-589f-4953-917f-6e21345d3c38\") " Nov 27 16:10:03 crc kubenswrapper[4707]: I1127 16:10:03.513477 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/24f6a8b1-589f-4953-917f-6e21345d3c38-config" (OuterVolumeSpecName: "config") pod "24f6a8b1-589f-4953-917f-6e21345d3c38" (UID: "24f6a8b1-589f-4953-917f-6e21345d3c38"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 16:10:03 crc kubenswrapper[4707]: I1127 16:10:03.513572 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/24f6a8b1-589f-4953-917f-6e21345d3c38-client-ca\") pod \"24f6a8b1-589f-4953-917f-6e21345d3c38\" (UID: \"24f6a8b1-589f-4953-917f-6e21345d3c38\") " Nov 27 16:10:03 crc kubenswrapper[4707]: I1127 16:10:03.513844 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/24f6a8b1-589f-4953-917f-6e21345d3c38-client-ca" (OuterVolumeSpecName: "client-ca") pod "24f6a8b1-589f-4953-917f-6e21345d3c38" (UID: "24f6a8b1-589f-4953-917f-6e21345d3c38"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 16:10:03 crc kubenswrapper[4707]: I1127 16:10:03.514055 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/df7ef39c-2ee0-45d5-a403-311d6963822e-client-ca\") pod \"route-controller-manager-7454b5fff7-tht6r\" (UID: \"df7ef39c-2ee0-45d5-a403-311d6963822e\") " pod="openshift-route-controller-manager/route-controller-manager-7454b5fff7-tht6r" Nov 27 16:10:03 crc kubenswrapper[4707]: I1127 16:10:03.514154 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/df7ef39c-2ee0-45d5-a403-311d6963822e-serving-cert\") pod \"route-controller-manager-7454b5fff7-tht6r\" (UID: \"df7ef39c-2ee0-45d5-a403-311d6963822e\") " pod="openshift-route-controller-manager/route-controller-manager-7454b5fff7-tht6r" Nov 27 16:10:03 crc kubenswrapper[4707]: I1127 16:10:03.514246 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pkngr\" (UniqueName: 
\"kubernetes.io/projected/df7ef39c-2ee0-45d5-a403-311d6963822e-kube-api-access-pkngr\") pod \"route-controller-manager-7454b5fff7-tht6r\" (UID: \"df7ef39c-2ee0-45d5-a403-311d6963822e\") " pod="openshift-route-controller-manager/route-controller-manager-7454b5fff7-tht6r" Nov 27 16:10:03 crc kubenswrapper[4707]: I1127 16:10:03.514284 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/df7ef39c-2ee0-45d5-a403-311d6963822e-config\") pod \"route-controller-manager-7454b5fff7-tht6r\" (UID: \"df7ef39c-2ee0-45d5-a403-311d6963822e\") " pod="openshift-route-controller-manager/route-controller-manager-7454b5fff7-tht6r" Nov 27 16:10:03 crc kubenswrapper[4707]: I1127 16:10:03.514344 4707 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/24f6a8b1-589f-4953-917f-6e21345d3c38-client-ca\") on node \"crc\" DevicePath \"\"" Nov 27 16:10:03 crc kubenswrapper[4707]: I1127 16:10:03.514359 4707 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/24f6a8b1-589f-4953-917f-6e21345d3c38-config\") on node \"crc\" DevicePath \"\"" Nov 27 16:10:03 crc kubenswrapper[4707]: I1127 16:10:03.517457 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/24f6a8b1-589f-4953-917f-6e21345d3c38-kube-api-access-qh74h" (OuterVolumeSpecName: "kube-api-access-qh74h") pod "24f6a8b1-589f-4953-917f-6e21345d3c38" (UID: "24f6a8b1-589f-4953-917f-6e21345d3c38"). InnerVolumeSpecName "kube-api-access-qh74h". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 16:10:03 crc kubenswrapper[4707]: I1127 16:10:03.519568 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/24f6a8b1-589f-4953-917f-6e21345d3c38-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "24f6a8b1-589f-4953-917f-6e21345d3c38" (UID: "24f6a8b1-589f-4953-917f-6e21345d3c38"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 16:10:03 crc kubenswrapper[4707]: I1127 16:10:03.615817 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pkngr\" (UniqueName: \"kubernetes.io/projected/df7ef39c-2ee0-45d5-a403-311d6963822e-kube-api-access-pkngr\") pod \"route-controller-manager-7454b5fff7-tht6r\" (UID: \"df7ef39c-2ee0-45d5-a403-311d6963822e\") " pod="openshift-route-controller-manager/route-controller-manager-7454b5fff7-tht6r" Nov 27 16:10:03 crc kubenswrapper[4707]: I1127 16:10:03.615865 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/df7ef39c-2ee0-45d5-a403-311d6963822e-config\") pod \"route-controller-manager-7454b5fff7-tht6r\" (UID: \"df7ef39c-2ee0-45d5-a403-311d6963822e\") " pod="openshift-route-controller-manager/route-controller-manager-7454b5fff7-tht6r" Nov 27 16:10:03 crc kubenswrapper[4707]: I1127 16:10:03.615910 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/df7ef39c-2ee0-45d5-a403-311d6963822e-client-ca\") pod \"route-controller-manager-7454b5fff7-tht6r\" (UID: \"df7ef39c-2ee0-45d5-a403-311d6963822e\") " pod="openshift-route-controller-manager/route-controller-manager-7454b5fff7-tht6r" Nov 27 16:10:03 crc kubenswrapper[4707]: I1127 16:10:03.615946 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/df7ef39c-2ee0-45d5-a403-311d6963822e-serving-cert\") pod \"route-controller-manager-7454b5fff7-tht6r\" (UID: \"df7ef39c-2ee0-45d5-a403-311d6963822e\") " pod="openshift-route-controller-manager/route-controller-manager-7454b5fff7-tht6r" Nov 27 16:10:03 crc kubenswrapper[4707]: I1127 16:10:03.616014 4707 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/24f6a8b1-589f-4953-917f-6e21345d3c38-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 27 16:10:03 crc kubenswrapper[4707]: I1127 16:10:03.616027 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qh74h\" (UniqueName: \"kubernetes.io/projected/24f6a8b1-589f-4953-917f-6e21345d3c38-kube-api-access-qh74h\") on node \"crc\" DevicePath \"\"" Nov 27 16:10:03 crc kubenswrapper[4707]: I1127 16:10:03.617179 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/df7ef39c-2ee0-45d5-a403-311d6963822e-client-ca\") pod \"route-controller-manager-7454b5fff7-tht6r\" (UID: \"df7ef39c-2ee0-45d5-a403-311d6963822e\") " pod="openshift-route-controller-manager/route-controller-manager-7454b5fff7-tht6r" Nov 27 16:10:03 crc kubenswrapper[4707]: I1127 16:10:03.617289 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/df7ef39c-2ee0-45d5-a403-311d6963822e-config\") pod \"route-controller-manager-7454b5fff7-tht6r\" (UID: \"df7ef39c-2ee0-45d5-a403-311d6963822e\") " pod="openshift-route-controller-manager/route-controller-manager-7454b5fff7-tht6r" Nov 27 16:10:03 crc kubenswrapper[4707]: I1127 16:10:03.620293 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/df7ef39c-2ee0-45d5-a403-311d6963822e-serving-cert\") pod \"route-controller-manager-7454b5fff7-tht6r\" (UID: \"df7ef39c-2ee0-45d5-a403-311d6963822e\") " 
pod="openshift-route-controller-manager/route-controller-manager-7454b5fff7-tht6r" Nov 27 16:10:03 crc kubenswrapper[4707]: I1127 16:10:03.623307 4707 patch_prober.go:28] interesting pod/machine-config-daemon-c995m container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 27 16:10:03 crc kubenswrapper[4707]: I1127 16:10:03.623343 4707 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-c995m" podUID="a83beb0d-8dd1-434a-ace2-933f98e3956f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 27 16:10:03 crc kubenswrapper[4707]: I1127 16:10:03.633816 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pkngr\" (UniqueName: \"kubernetes.io/projected/df7ef39c-2ee0-45d5-a403-311d6963822e-kube-api-access-pkngr\") pod \"route-controller-manager-7454b5fff7-tht6r\" (UID: \"df7ef39c-2ee0-45d5-a403-311d6963822e\") " pod="openshift-route-controller-manager/route-controller-manager-7454b5fff7-tht6r" Nov 27 16:10:03 crc kubenswrapper[4707]: I1127 16:10:03.633955 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-67cc8d88b-2lms8" event={"ID":"24f6a8b1-589f-4953-917f-6e21345d3c38","Type":"ContainerDied","Data":"5af1a7e1537ab28141a19da871f9cd66d3ac4e11155e7a852a10510a2923518d"} Nov 27 16:10:03 crc kubenswrapper[4707]: I1127 16:10:03.633998 4707 scope.go:117] "RemoveContainer" containerID="014f1357812454a0844e637742ea67ff3a7820d580c0886e612acdf7def31431" Nov 27 16:10:03 crc kubenswrapper[4707]: I1127 16:10:03.634156 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-67cc8d88b-2lms8" Nov 27 16:10:03 crc kubenswrapper[4707]: I1127 16:10:03.668323 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-67cc8d88b-2lms8"] Nov 27 16:10:03 crc kubenswrapper[4707]: I1127 16:10:03.671040 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-67cc8d88b-2lms8"] Nov 27 16:10:03 crc kubenswrapper[4707]: I1127 16:10:03.753217 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7454b5fff7-tht6r" Nov 27 16:10:04 crc kubenswrapper[4707]: I1127 16:10:04.266458 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7454b5fff7-tht6r"] Nov 27 16:10:04 crc kubenswrapper[4707]: W1127 16:10:04.267645 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddf7ef39c_2ee0_45d5_a403_311d6963822e.slice/crio-877da9b93f818e5e423e6c214e5542208d03ccdc6eeff50d66d337316684ec48 WatchSource:0}: Error finding container 877da9b93f818e5e423e6c214e5542208d03ccdc6eeff50d66d337316684ec48: Status 404 returned error can't find the container with id 877da9b93f818e5e423e6c214e5542208d03ccdc6eeff50d66d337316684ec48 Nov 27 16:10:04 crc kubenswrapper[4707]: I1127 16:10:04.639708 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7454b5fff7-tht6r" event={"ID":"df7ef39c-2ee0-45d5-a403-311d6963822e","Type":"ContainerStarted","Data":"4bdb83a882d9e9d38d205c32da799a298b18d8f1445c1a644c9b9242e38f07fc"} Nov 27 16:10:04 crc kubenswrapper[4707]: I1127 16:10:04.639974 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-route-controller-manager/route-controller-manager-7454b5fff7-tht6r" event={"ID":"df7ef39c-2ee0-45d5-a403-311d6963822e","Type":"ContainerStarted","Data":"877da9b93f818e5e423e6c214e5542208d03ccdc6eeff50d66d337316684ec48"} Nov 27 16:10:04 crc kubenswrapper[4707]: I1127 16:10:04.642137 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-7454b5fff7-tht6r" Nov 27 16:10:04 crc kubenswrapper[4707]: I1127 16:10:04.658601 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-7454b5fff7-tht6r" podStartSLOduration=2.658589798 podStartE2EDuration="2.658589798s" podCreationTimestamp="2025-11-27 16:10:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-27 16:10:04.657353628 +0000 UTC m=+380.288802416" watchObservedRunningTime="2025-11-27 16:10:04.658589798 +0000 UTC m=+380.290038566" Nov 27 16:10:04 crc kubenswrapper[4707]: I1127 16:10:04.870741 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-7454b5fff7-tht6r" Nov 27 16:10:05 crc kubenswrapper[4707]: I1127 16:10:05.209786 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="24f6a8b1-589f-4953-917f-6e21345d3c38" path="/var/lib/kubelet/pods/24f6a8b1-589f-4953-917f-6e21345d3c38/volumes" Nov 27 16:10:12 crc kubenswrapper[4707]: I1127 16:10:12.726191 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-66df7c8f76-flczg" Nov 27 16:10:12 crc kubenswrapper[4707]: I1127 16:10:12.800433 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-kwvst"] Nov 27 16:10:33 crc kubenswrapper[4707]: I1127 16:10:33.624062 4707 patch_prober.go:28] 
interesting pod/machine-config-daemon-c995m container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 27 16:10:33 crc kubenswrapper[4707]: I1127 16:10:33.624709 4707 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-c995m" podUID="a83beb0d-8dd1-434a-ace2-933f98e3956f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 27 16:10:33 crc kubenswrapper[4707]: I1127 16:10:33.624765 4707 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-c995m" Nov 27 16:10:33 crc kubenswrapper[4707]: I1127 16:10:33.625283 4707 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"31f688d6285990a2c90910fc7e1b42b1f5156cb6f243c758badf144c47b276ff"} pod="openshift-machine-config-operator/machine-config-daemon-c995m" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Nov 27 16:10:33 crc kubenswrapper[4707]: I1127 16:10:33.625361 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-c995m" podUID="a83beb0d-8dd1-434a-ace2-933f98e3956f" containerName="machine-config-daemon" containerID="cri-o://31f688d6285990a2c90910fc7e1b42b1f5156cb6f243c758badf144c47b276ff" gracePeriod=600 Nov 27 16:10:33 crc kubenswrapper[4707]: I1127 16:10:33.848119 4707 generic.go:334] "Generic (PLEG): container finished" podID="a83beb0d-8dd1-434a-ace2-933f98e3956f" containerID="31f688d6285990a2c90910fc7e1b42b1f5156cb6f243c758badf144c47b276ff" exitCode=0 Nov 27 16:10:33 crc kubenswrapper[4707]: I1127 
16:10:33.848233 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-c995m" event={"ID":"a83beb0d-8dd1-434a-ace2-933f98e3956f","Type":"ContainerDied","Data":"31f688d6285990a2c90910fc7e1b42b1f5156cb6f243c758badf144c47b276ff"} Nov 27 16:10:33 crc kubenswrapper[4707]: I1127 16:10:33.848653 4707 scope.go:117] "RemoveContainer" containerID="6227d8aa794cce73ef7607c3e6cf8f853c829bf56493acf04c0b555ddd264ba5" Nov 27 16:10:34 crc kubenswrapper[4707]: I1127 16:10:34.859902 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-c995m" event={"ID":"a83beb0d-8dd1-434a-ace2-933f98e3956f","Type":"ContainerStarted","Data":"3756a089c2c481a4f6e191b772b187f75506926ec4370423509a5187881583f9"} Nov 27 16:10:37 crc kubenswrapper[4707]: I1127 16:10:37.847232 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-image-registry/image-registry-697d97f7c8-kwvst" podUID="b5e0fd66-2bb6-4b8a-aaf0-e24490a2f1a0" containerName="registry" containerID="cri-o://a95f24395e8e931982ca3f8a27913f86ea821531d010ca50f1f490bb01511a50" gracePeriod=30 Nov 27 16:10:38 crc kubenswrapper[4707]: I1127 16:10:38.586589 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-kwvst" Nov 27 16:10:38 crc kubenswrapper[4707]: I1127 16:10:38.599315 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/b5e0fd66-2bb6-4b8a-aaf0-e24490a2f1a0-installation-pull-secrets\") pod \"b5e0fd66-2bb6-4b8a-aaf0-e24490a2f1a0\" (UID: \"b5e0fd66-2bb6-4b8a-aaf0-e24490a2f1a0\") " Nov 27 16:10:38 crc kubenswrapper[4707]: I1127 16:10:38.599478 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/b5e0fd66-2bb6-4b8a-aaf0-e24490a2f1a0-bound-sa-token\") pod \"b5e0fd66-2bb6-4b8a-aaf0-e24490a2f1a0\" (UID: \"b5e0fd66-2bb6-4b8a-aaf0-e24490a2f1a0\") " Nov 27 16:10:38 crc kubenswrapper[4707]: I1127 16:10:38.599555 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/b5e0fd66-2bb6-4b8a-aaf0-e24490a2f1a0-ca-trust-extracted\") pod \"b5e0fd66-2bb6-4b8a-aaf0-e24490a2f1a0\" (UID: \"b5e0fd66-2bb6-4b8a-aaf0-e24490a2f1a0\") " Nov 27 16:10:38 crc kubenswrapper[4707]: I1127 16:10:38.599651 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dpqnb\" (UniqueName: \"kubernetes.io/projected/b5e0fd66-2bb6-4b8a-aaf0-e24490a2f1a0-kube-api-access-dpqnb\") pod \"b5e0fd66-2bb6-4b8a-aaf0-e24490a2f1a0\" (UID: \"b5e0fd66-2bb6-4b8a-aaf0-e24490a2f1a0\") " Nov 27 16:10:38 crc kubenswrapper[4707]: I1127 16:10:38.599730 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/b5e0fd66-2bb6-4b8a-aaf0-e24490a2f1a0-registry-certificates\") pod \"b5e0fd66-2bb6-4b8a-aaf0-e24490a2f1a0\" (UID: \"b5e0fd66-2bb6-4b8a-aaf0-e24490a2f1a0\") " Nov 27 16:10:38 crc kubenswrapper[4707]: I1127 16:10:38.599828 4707 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b5e0fd66-2bb6-4b8a-aaf0-e24490a2f1a0-trusted-ca\") pod \"b5e0fd66-2bb6-4b8a-aaf0-e24490a2f1a0\" (UID: \"b5e0fd66-2bb6-4b8a-aaf0-e24490a2f1a0\") " Nov 27 16:10:38 crc kubenswrapper[4707]: I1127 16:10:38.599869 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/b5e0fd66-2bb6-4b8a-aaf0-e24490a2f1a0-registry-tls\") pod \"b5e0fd66-2bb6-4b8a-aaf0-e24490a2f1a0\" (UID: \"b5e0fd66-2bb6-4b8a-aaf0-e24490a2f1a0\") " Nov 27 16:10:38 crc kubenswrapper[4707]: I1127 16:10:38.600057 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-storage\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"b5e0fd66-2bb6-4b8a-aaf0-e24490a2f1a0\" (UID: \"b5e0fd66-2bb6-4b8a-aaf0-e24490a2f1a0\") " Nov 27 16:10:38 crc kubenswrapper[4707]: I1127 16:10:38.602451 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b5e0fd66-2bb6-4b8a-aaf0-e24490a2f1a0-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "b5e0fd66-2bb6-4b8a-aaf0-e24490a2f1a0" (UID: "b5e0fd66-2bb6-4b8a-aaf0-e24490a2f1a0"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 16:10:38 crc kubenswrapper[4707]: I1127 16:10:38.604639 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b5e0fd66-2bb6-4b8a-aaf0-e24490a2f1a0-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "b5e0fd66-2bb6-4b8a-aaf0-e24490a2f1a0" (UID: "b5e0fd66-2bb6-4b8a-aaf0-e24490a2f1a0"). InnerVolumeSpecName "registry-certificates". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 16:10:38 crc kubenswrapper[4707]: I1127 16:10:38.611920 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b5e0fd66-2bb6-4b8a-aaf0-e24490a2f1a0-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "b5e0fd66-2bb6-4b8a-aaf0-e24490a2f1a0" (UID: "b5e0fd66-2bb6-4b8a-aaf0-e24490a2f1a0"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 16:10:38 crc kubenswrapper[4707]: I1127 16:10:38.612667 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b5e0fd66-2bb6-4b8a-aaf0-e24490a2f1a0-kube-api-access-dpqnb" (OuterVolumeSpecName: "kube-api-access-dpqnb") pod "b5e0fd66-2bb6-4b8a-aaf0-e24490a2f1a0" (UID: "b5e0fd66-2bb6-4b8a-aaf0-e24490a2f1a0"). InnerVolumeSpecName "kube-api-access-dpqnb". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 16:10:38 crc kubenswrapper[4707]: I1127 16:10:38.612933 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b5e0fd66-2bb6-4b8a-aaf0-e24490a2f1a0-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "b5e0fd66-2bb6-4b8a-aaf0-e24490a2f1a0" (UID: "b5e0fd66-2bb6-4b8a-aaf0-e24490a2f1a0"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 16:10:38 crc kubenswrapper[4707]: I1127 16:10:38.613301 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b5e0fd66-2bb6-4b8a-aaf0-e24490a2f1a0-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "b5e0fd66-2bb6-4b8a-aaf0-e24490a2f1a0" (UID: "b5e0fd66-2bb6-4b8a-aaf0-e24490a2f1a0"). InnerVolumeSpecName "bound-sa-token". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 16:10:38 crc kubenswrapper[4707]: I1127 16:10:38.627053 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "registry-storage") pod "b5e0fd66-2bb6-4b8a-aaf0-e24490a2f1a0" (UID: "b5e0fd66-2bb6-4b8a-aaf0-e24490a2f1a0"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Nov 27 16:10:38 crc kubenswrapper[4707]: I1127 16:10:38.649339 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b5e0fd66-2bb6-4b8a-aaf0-e24490a2f1a0-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "b5e0fd66-2bb6-4b8a-aaf0-e24490a2f1a0" (UID: "b5e0fd66-2bb6-4b8a-aaf0-e24490a2f1a0"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 27 16:10:38 crc kubenswrapper[4707]: I1127 16:10:38.701698 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dpqnb\" (UniqueName: \"kubernetes.io/projected/b5e0fd66-2bb6-4b8a-aaf0-e24490a2f1a0-kube-api-access-dpqnb\") on node \"crc\" DevicePath \"\"" Nov 27 16:10:38 crc kubenswrapper[4707]: I1127 16:10:38.701741 4707 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/b5e0fd66-2bb6-4b8a-aaf0-e24490a2f1a0-registry-certificates\") on node \"crc\" DevicePath \"\"" Nov 27 16:10:38 crc kubenswrapper[4707]: I1127 16:10:38.701755 4707 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b5e0fd66-2bb6-4b8a-aaf0-e24490a2f1a0-trusted-ca\") on node \"crc\" DevicePath \"\"" Nov 27 16:10:38 crc kubenswrapper[4707]: I1127 16:10:38.701770 4707 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: 
\"kubernetes.io/projected/b5e0fd66-2bb6-4b8a-aaf0-e24490a2f1a0-registry-tls\") on node \"crc\" DevicePath \"\"" Nov 27 16:10:38 crc kubenswrapper[4707]: I1127 16:10:38.701781 4707 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/b5e0fd66-2bb6-4b8a-aaf0-e24490a2f1a0-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Nov 27 16:10:38 crc kubenswrapper[4707]: I1127 16:10:38.701794 4707 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/b5e0fd66-2bb6-4b8a-aaf0-e24490a2f1a0-bound-sa-token\") on node \"crc\" DevicePath \"\"" Nov 27 16:10:38 crc kubenswrapper[4707]: I1127 16:10:38.701806 4707 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/b5e0fd66-2bb6-4b8a-aaf0-e24490a2f1a0-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Nov 27 16:10:38 crc kubenswrapper[4707]: I1127 16:10:38.896187 4707 generic.go:334] "Generic (PLEG): container finished" podID="b5e0fd66-2bb6-4b8a-aaf0-e24490a2f1a0" containerID="a95f24395e8e931982ca3f8a27913f86ea821531d010ca50f1f490bb01511a50" exitCode=0 Nov 27 16:10:38 crc kubenswrapper[4707]: I1127 16:10:38.896281 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-kwvst" Nov 27 16:10:38 crc kubenswrapper[4707]: I1127 16:10:38.896269 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-kwvst" event={"ID":"b5e0fd66-2bb6-4b8a-aaf0-e24490a2f1a0","Type":"ContainerDied","Data":"a95f24395e8e931982ca3f8a27913f86ea821531d010ca50f1f490bb01511a50"} Nov 27 16:10:38 crc kubenswrapper[4707]: I1127 16:10:38.896877 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-kwvst" event={"ID":"b5e0fd66-2bb6-4b8a-aaf0-e24490a2f1a0","Type":"ContainerDied","Data":"1189d4d97de40294d3fba3e19f4890b28617a7ead30a5f7bfe5052d0ddb9c0d5"} Nov 27 16:10:38 crc kubenswrapper[4707]: I1127 16:10:38.896915 4707 scope.go:117] "RemoveContainer" containerID="a95f24395e8e931982ca3f8a27913f86ea821531d010ca50f1f490bb01511a50" Nov 27 16:10:38 crc kubenswrapper[4707]: I1127 16:10:38.924514 4707 scope.go:117] "RemoveContainer" containerID="a95f24395e8e931982ca3f8a27913f86ea821531d010ca50f1f490bb01511a50" Nov 27 16:10:38 crc kubenswrapper[4707]: E1127 16:10:38.925198 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a95f24395e8e931982ca3f8a27913f86ea821531d010ca50f1f490bb01511a50\": container with ID starting with a95f24395e8e931982ca3f8a27913f86ea821531d010ca50f1f490bb01511a50 not found: ID does not exist" containerID="a95f24395e8e931982ca3f8a27913f86ea821531d010ca50f1f490bb01511a50" Nov 27 16:10:38 crc kubenswrapper[4707]: I1127 16:10:38.925309 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a95f24395e8e931982ca3f8a27913f86ea821531d010ca50f1f490bb01511a50"} err="failed to get container status \"a95f24395e8e931982ca3f8a27913f86ea821531d010ca50f1f490bb01511a50\": rpc error: code = NotFound desc = could not find container 
\"a95f24395e8e931982ca3f8a27913f86ea821531d010ca50f1f490bb01511a50\": container with ID starting with a95f24395e8e931982ca3f8a27913f86ea821531d010ca50f1f490bb01511a50 not found: ID does not exist" Nov 27 16:10:38 crc kubenswrapper[4707]: I1127 16:10:38.948126 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-kwvst"] Nov 27 16:10:38 crc kubenswrapper[4707]: I1127 16:10:38.954700 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-kwvst"] Nov 27 16:10:39 crc kubenswrapper[4707]: I1127 16:10:39.206915 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b5e0fd66-2bb6-4b8a-aaf0-e24490a2f1a0" path="/var/lib/kubelet/pods/b5e0fd66-2bb6-4b8a-aaf0-e24490a2f1a0/volumes" Nov 27 16:12:33 crc kubenswrapper[4707]: I1127 16:12:33.624305 4707 patch_prober.go:28] interesting pod/machine-config-daemon-c995m container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 27 16:12:33 crc kubenswrapper[4707]: I1127 16:12:33.625071 4707 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-c995m" podUID="a83beb0d-8dd1-434a-ace2-933f98e3956f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 27 16:13:03 crc kubenswrapper[4707]: I1127 16:13:03.624135 4707 patch_prober.go:28] interesting pod/machine-config-daemon-c995m container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 27 16:13:03 crc kubenswrapper[4707]: I1127 16:13:03.624878 4707 prober.go:107] 
"Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-c995m" podUID="a83beb0d-8dd1-434a-ace2-933f98e3956f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 27 16:13:33 crc kubenswrapper[4707]: I1127 16:13:33.624146 4707 patch_prober.go:28] interesting pod/machine-config-daemon-c995m container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 27 16:13:33 crc kubenswrapper[4707]: I1127 16:13:33.625037 4707 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-c995m" podUID="a83beb0d-8dd1-434a-ace2-933f98e3956f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 27 16:13:33 crc kubenswrapper[4707]: I1127 16:13:33.625111 4707 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-c995m" Nov 27 16:13:33 crc kubenswrapper[4707]: I1127 16:13:33.626017 4707 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"3756a089c2c481a4f6e191b772b187f75506926ec4370423509a5187881583f9"} pod="openshift-machine-config-operator/machine-config-daemon-c995m" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Nov 27 16:13:33 crc kubenswrapper[4707]: I1127 16:13:33.626123 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-c995m" podUID="a83beb0d-8dd1-434a-ace2-933f98e3956f" containerName="machine-config-daemon" 
containerID="cri-o://3756a089c2c481a4f6e191b772b187f75506926ec4370423509a5187881583f9" gracePeriod=600 Nov 27 16:13:34 crc kubenswrapper[4707]: I1127 16:13:34.117128 4707 generic.go:334] "Generic (PLEG): container finished" podID="a83beb0d-8dd1-434a-ace2-933f98e3956f" containerID="3756a089c2c481a4f6e191b772b187f75506926ec4370423509a5187881583f9" exitCode=0 Nov 27 16:13:34 crc kubenswrapper[4707]: I1127 16:13:34.117214 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-c995m" event={"ID":"a83beb0d-8dd1-434a-ace2-933f98e3956f","Type":"ContainerDied","Data":"3756a089c2c481a4f6e191b772b187f75506926ec4370423509a5187881583f9"} Nov 27 16:13:34 crc kubenswrapper[4707]: I1127 16:13:34.118962 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-c995m" event={"ID":"a83beb0d-8dd1-434a-ace2-933f98e3956f","Type":"ContainerStarted","Data":"596e4dd118b814e12fbaccbb4655af72f02c2baaf706c8463cf822841fdaa729"} Nov 27 16:13:34 crc kubenswrapper[4707]: I1127 16:13:34.119003 4707 scope.go:117] "RemoveContainer" containerID="31f688d6285990a2c90910fc7e1b42b1f5156cb6f243c758badf144c47b276ff" Nov 27 16:13:47 crc kubenswrapper[4707]: I1127 16:13:47.877833 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-7f985d654d-7j6cd"] Nov 27 16:13:47 crc kubenswrapper[4707]: E1127 16:13:47.878894 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b5e0fd66-2bb6-4b8a-aaf0-e24490a2f1a0" containerName="registry" Nov 27 16:13:47 crc kubenswrapper[4707]: I1127 16:13:47.878917 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="b5e0fd66-2bb6-4b8a-aaf0-e24490a2f1a0" containerName="registry" Nov 27 16:13:47 crc kubenswrapper[4707]: I1127 16:13:47.879101 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="b5e0fd66-2bb6-4b8a-aaf0-e24490a2f1a0" containerName="registry" Nov 27 16:13:47 crc 
kubenswrapper[4707]: I1127 16:13:47.879852 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-7f985d654d-7j6cd" Nov 27 16:13:47 crc kubenswrapper[4707]: I1127 16:13:47.885010 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"openshift-service-ca.crt" Nov 27 16:13:47 crc kubenswrapper[4707]: I1127 16:13:47.885402 4707 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-cainjector-dockercfg-6mx84" Nov 27 16:13:47 crc kubenswrapper[4707]: I1127 16:13:47.887819 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-5b446d88c5-nc56g"] Nov 27 16:13:47 crc kubenswrapper[4707]: I1127 16:13:47.888446 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-5b446d88c5-nc56g" Nov 27 16:13:47 crc kubenswrapper[4707]: I1127 16:13:47.892198 4707 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-dockercfg-z7cpw" Nov 27 16:13:47 crc kubenswrapper[4707]: I1127 16:13:47.892321 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"kube-root-ca.crt" Nov 27 16:13:47 crc kubenswrapper[4707]: I1127 16:13:47.895592 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-7f985d654d-7j6cd"] Nov 27 16:13:47 crc kubenswrapper[4707]: I1127 16:13:47.905777 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-5b446d88c5-nc56g"] Nov 27 16:13:47 crc kubenswrapper[4707]: I1127 16:13:47.912740 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-5655c58dd6-5476f"] Nov 27 16:13:47 crc kubenswrapper[4707]: I1127 16:13:47.914549 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-webhook-5655c58dd6-5476f" Nov 27 16:13:47 crc kubenswrapper[4707]: I1127 16:13:47.923448 4707 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-webhook-dockercfg-5n8mk" Nov 27 16:13:47 crc kubenswrapper[4707]: I1127 16:13:47.923520 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d97p6\" (UniqueName: \"kubernetes.io/projected/5f4b6258-1be2-489a-9e59-86df4534d663-kube-api-access-d97p6\") pod \"cert-manager-cainjector-7f985d654d-7j6cd\" (UID: \"5f4b6258-1be2-489a-9e59-86df4534d663\") " pod="cert-manager/cert-manager-cainjector-7f985d654d-7j6cd" Nov 27 16:13:47 crc kubenswrapper[4707]: I1127 16:13:47.923595 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n7z2t\" (UniqueName: \"kubernetes.io/projected/4a4bd04a-ce38-46fe-a197-17214e851643-kube-api-access-n7z2t\") pod \"cert-manager-5b446d88c5-nc56g\" (UID: \"4a4bd04a-ce38-46fe-a197-17214e851643\") " pod="cert-manager/cert-manager-5b446d88c5-nc56g" Nov 27 16:13:47 crc kubenswrapper[4707]: I1127 16:13:47.941051 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-5655c58dd6-5476f"] Nov 27 16:13:48 crc kubenswrapper[4707]: I1127 16:13:48.025223 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zllnt\" (UniqueName: \"kubernetes.io/projected/9efe65d7-bb46-43bd-a343-c9a28fbad2ea-kube-api-access-zllnt\") pod \"cert-manager-webhook-5655c58dd6-5476f\" (UID: \"9efe65d7-bb46-43bd-a343-c9a28fbad2ea\") " pod="cert-manager/cert-manager-webhook-5655c58dd6-5476f" Nov 27 16:13:48 crc kubenswrapper[4707]: I1127 16:13:48.025285 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n7z2t\" (UniqueName: 
\"kubernetes.io/projected/4a4bd04a-ce38-46fe-a197-17214e851643-kube-api-access-n7z2t\") pod \"cert-manager-5b446d88c5-nc56g\" (UID: \"4a4bd04a-ce38-46fe-a197-17214e851643\") " pod="cert-manager/cert-manager-5b446d88c5-nc56g" Nov 27 16:13:48 crc kubenswrapper[4707]: I1127 16:13:48.025392 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d97p6\" (UniqueName: \"kubernetes.io/projected/5f4b6258-1be2-489a-9e59-86df4534d663-kube-api-access-d97p6\") pod \"cert-manager-cainjector-7f985d654d-7j6cd\" (UID: \"5f4b6258-1be2-489a-9e59-86df4534d663\") " pod="cert-manager/cert-manager-cainjector-7f985d654d-7j6cd" Nov 27 16:13:48 crc kubenswrapper[4707]: I1127 16:13:48.045222 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n7z2t\" (UniqueName: \"kubernetes.io/projected/4a4bd04a-ce38-46fe-a197-17214e851643-kube-api-access-n7z2t\") pod \"cert-manager-5b446d88c5-nc56g\" (UID: \"4a4bd04a-ce38-46fe-a197-17214e851643\") " pod="cert-manager/cert-manager-5b446d88c5-nc56g" Nov 27 16:13:48 crc kubenswrapper[4707]: I1127 16:13:48.045682 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d97p6\" (UniqueName: \"kubernetes.io/projected/5f4b6258-1be2-489a-9e59-86df4534d663-kube-api-access-d97p6\") pod \"cert-manager-cainjector-7f985d654d-7j6cd\" (UID: \"5f4b6258-1be2-489a-9e59-86df4534d663\") " pod="cert-manager/cert-manager-cainjector-7f985d654d-7j6cd" Nov 27 16:13:48 crc kubenswrapper[4707]: I1127 16:13:48.127028 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zllnt\" (UniqueName: \"kubernetes.io/projected/9efe65d7-bb46-43bd-a343-c9a28fbad2ea-kube-api-access-zllnt\") pod \"cert-manager-webhook-5655c58dd6-5476f\" (UID: \"9efe65d7-bb46-43bd-a343-c9a28fbad2ea\") " pod="cert-manager/cert-manager-webhook-5655c58dd6-5476f" Nov 27 16:13:48 crc kubenswrapper[4707]: I1127 16:13:48.148848 4707 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-zllnt\" (UniqueName: \"kubernetes.io/projected/9efe65d7-bb46-43bd-a343-c9a28fbad2ea-kube-api-access-zllnt\") pod \"cert-manager-webhook-5655c58dd6-5476f\" (UID: \"9efe65d7-bb46-43bd-a343-c9a28fbad2ea\") " pod="cert-manager/cert-manager-webhook-5655c58dd6-5476f" Nov 27 16:13:48 crc kubenswrapper[4707]: I1127 16:13:48.204824 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-7f985d654d-7j6cd" Nov 27 16:13:48 crc kubenswrapper[4707]: I1127 16:13:48.211908 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-5b446d88c5-nc56g" Nov 27 16:13:48 crc kubenswrapper[4707]: I1127 16:13:48.234904 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-5655c58dd6-5476f" Nov 27 16:13:48 crc kubenswrapper[4707]: I1127 16:13:48.629208 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-5b446d88c5-nc56g"] Nov 27 16:13:48 crc kubenswrapper[4707]: I1127 16:13:48.631478 4707 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Nov 27 16:13:48 crc kubenswrapper[4707]: W1127 16:13:48.680864 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5f4b6258_1be2_489a_9e59_86df4534d663.slice/crio-0199f081fb34798a8d7a218399e4514a3b7e73b149a231ad5d005db5a9992b17 WatchSource:0}: Error finding container 0199f081fb34798a8d7a218399e4514a3b7e73b149a231ad5d005db5a9992b17: Status 404 returned error can't find the container with id 0199f081fb34798a8d7a218399e4514a3b7e73b149a231ad5d005db5a9992b17 Nov 27 16:13:48 crc kubenswrapper[4707]: I1127 16:13:48.682504 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-7f985d654d-7j6cd"] Nov 27 16:13:48 crc kubenswrapper[4707]: 
I1127 16:13:48.695417 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-5655c58dd6-5476f"]
Nov 27 16:13:48 crc kubenswrapper[4707]: W1127 16:13:48.698669 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9efe65d7_bb46_43bd_a343_c9a28fbad2ea.slice/crio-88687368f605035344e80467b2efda4da5afff3aa177c757c264015977b47e6a WatchSource:0}: Error finding container 88687368f605035344e80467b2efda4da5afff3aa177c757c264015977b47e6a: Status 404 returned error can't find the container with id 88687368f605035344e80467b2efda4da5afff3aa177c757c264015977b47e6a
Nov 27 16:13:49 crc kubenswrapper[4707]: I1127 16:13:49.227218 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-5655c58dd6-5476f" event={"ID":"9efe65d7-bb46-43bd-a343-c9a28fbad2ea","Type":"ContainerStarted","Data":"88687368f605035344e80467b2efda4da5afff3aa177c757c264015977b47e6a"}
Nov 27 16:13:49 crc kubenswrapper[4707]: I1127 16:13:49.228575 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-7f985d654d-7j6cd" event={"ID":"5f4b6258-1be2-489a-9e59-86df4534d663","Type":"ContainerStarted","Data":"0199f081fb34798a8d7a218399e4514a3b7e73b149a231ad5d005db5a9992b17"}
Nov 27 16:13:49 crc kubenswrapper[4707]: I1127 16:13:49.229860 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-5b446d88c5-nc56g" event={"ID":"4a4bd04a-ce38-46fe-a197-17214e851643","Type":"ContainerStarted","Data":"dcf43d29d91c62bdac7cdd4cf88af4d3e01e04d3e22197264d0e35a52ae6d3e5"}
Nov 27 16:13:52 crc kubenswrapper[4707]: I1127 16:13:52.259544 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-5b446d88c5-nc56g" event={"ID":"4a4bd04a-ce38-46fe-a197-17214e851643","Type":"ContainerStarted","Data":"879e4867238036efd4a3011997600a8b19f9ca0540e11a087bceebb5a78294c1"}
Nov 27 16:13:52 crc kubenswrapper[4707]: I1127 16:13:52.268745 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-5655c58dd6-5476f" event={"ID":"9efe65d7-bb46-43bd-a343-c9a28fbad2ea","Type":"ContainerStarted","Data":"16822620b4a990accf5753914682b112524e9015551cce8d1eddff3f30a83f24"}
Nov 27 16:13:52 crc kubenswrapper[4707]: I1127 16:13:52.269297 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="cert-manager/cert-manager-webhook-5655c58dd6-5476f"
Nov 27 16:13:52 crc kubenswrapper[4707]: I1127 16:13:52.334070 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-5b446d88c5-nc56g" podStartSLOduration=2.024980387 podStartE2EDuration="5.334039649s" podCreationTimestamp="2025-11-27 16:13:47 +0000 UTC" firstStartedPulling="2025-11-27 16:13:48.631185618 +0000 UTC m=+604.262634386" lastFinishedPulling="2025-11-27 16:13:51.94024485 +0000 UTC m=+607.571693648" observedRunningTime="2025-11-27 16:13:52.290412155 +0000 UTC m=+607.921860923" watchObservedRunningTime="2025-11-27 16:13:52.334039649 +0000 UTC m=+607.965488457"
Nov 27 16:13:52 crc kubenswrapper[4707]: I1127 16:13:52.359073 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-5655c58dd6-5476f" podStartSLOduration=2.186516064 podStartE2EDuration="5.359048344s" podCreationTimestamp="2025-11-27 16:13:47 +0000 UTC" firstStartedPulling="2025-11-27 16:13:48.70136509 +0000 UTC m=+604.332813858" lastFinishedPulling="2025-11-27 16:13:51.87389737 +0000 UTC m=+607.505346138" observedRunningTime="2025-11-27 16:13:52.349630282 +0000 UTC m=+607.981079050" watchObservedRunningTime="2025-11-27 16:13:52.359048344 +0000 UTC m=+607.990497122"
Nov 27 16:13:53 crc kubenswrapper[4707]: I1127 16:13:53.278886 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-7f985d654d-7j6cd" event={"ID":"5f4b6258-1be2-489a-9e59-86df4534d663","Type":"ContainerStarted","Data":"e1a77e28ad27637cab969f858917a34dd73e25126127991f2ae6073db01fffb0"}
Nov 27 16:13:53 crc kubenswrapper[4707]: I1127 16:13:53.304347 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-7f985d654d-7j6cd" podStartSLOduration=1.945811581 podStartE2EDuration="6.304311635s" podCreationTimestamp="2025-11-27 16:13:47 +0000 UTC" firstStartedPulling="2025-11-27 16:13:48.683996178 +0000 UTC m=+604.315444956" lastFinishedPulling="2025-11-27 16:13:53.042496202 +0000 UTC m=+608.673945010" observedRunningTime="2025-11-27 16:13:53.299057272 +0000 UTC m=+608.930506080" watchObservedRunningTime="2025-11-27 16:13:53.304311635 +0000 UTC m=+608.935760443"
Nov 27 16:13:58 crc kubenswrapper[4707]: I1127 16:13:58.238333 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-5655c58dd6-5476f"
Nov 27 16:13:58 crc kubenswrapper[4707]: I1127 16:13:58.650099 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-xkmt7"]
Nov 27 16:13:58 crc kubenswrapper[4707]: I1127 16:13:58.650549 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-xkmt7" podUID="55af9c67-18ce-46f1-a761-d11ce16f42d6" containerName="ovn-controller" containerID="cri-o://0ae3fab13b078702f87ca8c586c9592c4554c95ead662d81ca65b50f368b6dcb" gracePeriod=30
Nov 27 16:13:58 crc kubenswrapper[4707]: I1127 16:13:58.650907 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-xkmt7" podUID="55af9c67-18ce-46f1-a761-d11ce16f42d6" containerName="kube-rbac-proxy-ovn-metrics" containerID="cri-o://3718d2f738ef780c738566c163d52afe355de6b8eb6004aa8e1dd5f37aa84d68" gracePeriod=30
Nov 27 16:13:58 crc kubenswrapper[4707]: I1127 16:13:58.650630 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-xkmt7" podUID="55af9c67-18ce-46f1-a761-d11ce16f42d6" containerName="nbdb" containerID="cri-o://44699a190cf1065e537f588d7479a10aab8fd4ffd1df2348c616d999ffdf7077" gracePeriod=30
Nov 27 16:13:58 crc kubenswrapper[4707]: I1127 16:13:58.650953 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-xkmt7" podUID="55af9c67-18ce-46f1-a761-d11ce16f42d6" containerName="northd" containerID="cri-o://e6432d53914cee9e016c2134d99e48fa9ff1c05c0b88033992c2e0c70fc40d2e" gracePeriod=30
Nov 27 16:13:58 crc kubenswrapper[4707]: I1127 16:13:58.651054 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-xkmt7" podUID="55af9c67-18ce-46f1-a761-d11ce16f42d6" containerName="kube-rbac-proxy-node" containerID="cri-o://2a4eadca4b464dec02ca7aba6bc23d95b9c3965294d4b0224299279c630f3718" gracePeriod=30
Nov 27 16:13:58 crc kubenswrapper[4707]: I1127 16:13:58.651035 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-xkmt7" podUID="55af9c67-18ce-46f1-a761-d11ce16f42d6" containerName="ovn-acl-logging" containerID="cri-o://93825fe5d9cf2d63c3e723b063f417f7e92442b14365852f935a06d316f443fe" gracePeriod=30
Nov 27 16:13:58 crc kubenswrapper[4707]: I1127 16:13:58.651012 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-xkmt7" podUID="55af9c67-18ce-46f1-a761-d11ce16f42d6" containerName="sbdb" containerID="cri-o://db3362aeb65b368d2d9181cc898270fffb60938d0e9d32a862130d06a9f572a0" gracePeriod=30
Nov 27 16:13:58 crc kubenswrapper[4707]: I1127 16:13:58.696038 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-xkmt7" podUID="55af9c67-18ce-46f1-a761-d11ce16f42d6" containerName="ovnkube-controller" containerID="cri-o://333494a77c14f7de080c8b9a6d70e630af6761c0406b730640c08ffc7785c54b" gracePeriod=30
Nov 27 16:13:58 crc kubenswrapper[4707]: I1127 16:13:58.995717 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xkmt7_55af9c67-18ce-46f1-a761-d11ce16f42d6/ovnkube-controller/3.log"
Nov 27 16:13:59 crc kubenswrapper[4707]: I1127 16:13:59.000904 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xkmt7_55af9c67-18ce-46f1-a761-d11ce16f42d6/ovn-acl-logging/0.log"
Nov 27 16:13:59 crc kubenswrapper[4707]: I1127 16:13:59.001557 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xkmt7_55af9c67-18ce-46f1-a761-d11ce16f42d6/ovn-controller/0.log"
Nov 27 16:13:59 crc kubenswrapper[4707]: I1127 16:13:59.002071 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-xkmt7"
Nov 27 16:13:59 crc kubenswrapper[4707]: I1127 16:13:59.066226 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-kxkld"]
Nov 27 16:13:59 crc kubenswrapper[4707]: E1127 16:13:59.066562 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="55af9c67-18ce-46f1-a761-d11ce16f42d6" containerName="ovnkube-controller"
Nov 27 16:13:59 crc kubenswrapper[4707]: I1127 16:13:59.066593 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="55af9c67-18ce-46f1-a761-d11ce16f42d6" containerName="ovnkube-controller"
Nov 27 16:13:59 crc kubenswrapper[4707]: E1127 16:13:59.066615 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="55af9c67-18ce-46f1-a761-d11ce16f42d6" containerName="northd"
Nov 27 16:13:59 crc kubenswrapper[4707]: I1127 16:13:59.066628 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="55af9c67-18ce-46f1-a761-d11ce16f42d6" containerName="northd"
Nov 27 16:13:59 crc kubenswrapper[4707]: E1127 16:13:59.066645 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="55af9c67-18ce-46f1-a761-d11ce16f42d6" containerName="ovn-acl-logging"
Nov 27 16:13:59 crc kubenswrapper[4707]: I1127 16:13:59.066660 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="55af9c67-18ce-46f1-a761-d11ce16f42d6" containerName="ovn-acl-logging"
Nov 27 16:13:59 crc kubenswrapper[4707]: E1127 16:13:59.066680 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="55af9c67-18ce-46f1-a761-d11ce16f42d6" containerName="ovn-controller"
Nov 27 16:13:59 crc kubenswrapper[4707]: I1127 16:13:59.066693 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="55af9c67-18ce-46f1-a761-d11ce16f42d6" containerName="ovn-controller"
Nov 27 16:13:59 crc kubenswrapper[4707]: E1127 16:13:59.066711 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="55af9c67-18ce-46f1-a761-d11ce16f42d6" containerName="sbdb"
Nov 27 16:13:59 crc kubenswrapper[4707]: I1127 16:13:59.066723 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="55af9c67-18ce-46f1-a761-d11ce16f42d6" containerName="sbdb"
Nov 27 16:13:59 crc kubenswrapper[4707]: E1127 16:13:59.066743 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="55af9c67-18ce-46f1-a761-d11ce16f42d6" containerName="kube-rbac-proxy-ovn-metrics"
Nov 27 16:13:59 crc kubenswrapper[4707]: I1127 16:13:59.066756 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="55af9c67-18ce-46f1-a761-d11ce16f42d6" containerName="kube-rbac-proxy-ovn-metrics"
Nov 27 16:13:59 crc kubenswrapper[4707]: E1127 16:13:59.066777 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="55af9c67-18ce-46f1-a761-d11ce16f42d6" containerName="ovnkube-controller"
Nov 27 16:13:59 crc kubenswrapper[4707]: I1127 16:13:59.066790 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="55af9c67-18ce-46f1-a761-d11ce16f42d6" containerName="ovnkube-controller"
Nov 27 16:13:59 crc kubenswrapper[4707]: E1127 16:13:59.066805 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="55af9c67-18ce-46f1-a761-d11ce16f42d6" containerName="kube-rbac-proxy-node"
Nov 27 16:13:59 crc kubenswrapper[4707]: I1127 16:13:59.066818 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="55af9c67-18ce-46f1-a761-d11ce16f42d6" containerName="kube-rbac-proxy-node"
Nov 27 16:13:59 crc kubenswrapper[4707]: E1127 16:13:59.066837 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="55af9c67-18ce-46f1-a761-d11ce16f42d6" containerName="nbdb"
Nov 27 16:13:59 crc kubenswrapper[4707]: I1127 16:13:59.066849 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="55af9c67-18ce-46f1-a761-d11ce16f42d6" containerName="nbdb"
Nov 27 16:13:59 crc kubenswrapper[4707]: E1127 16:13:59.066867 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="55af9c67-18ce-46f1-a761-d11ce16f42d6" containerName="ovnkube-controller"
Nov 27 16:13:59 crc kubenswrapper[4707]: I1127 16:13:59.066879 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="55af9c67-18ce-46f1-a761-d11ce16f42d6" containerName="ovnkube-controller"
Nov 27 16:13:59 crc kubenswrapper[4707]: E1127 16:13:59.066897 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="55af9c67-18ce-46f1-a761-d11ce16f42d6" containerName="kubecfg-setup"
Nov 27 16:13:59 crc kubenswrapper[4707]: I1127 16:13:59.066911 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="55af9c67-18ce-46f1-a761-d11ce16f42d6" containerName="kubecfg-setup"
Nov 27 16:13:59 crc kubenswrapper[4707]: E1127 16:13:59.066925 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="55af9c67-18ce-46f1-a761-d11ce16f42d6" containerName="ovnkube-controller"
Nov 27 16:13:59 crc kubenswrapper[4707]: I1127 16:13:59.066939 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="55af9c67-18ce-46f1-a761-d11ce16f42d6" containerName="ovnkube-controller"
Nov 27 16:13:59 crc kubenswrapper[4707]: I1127 16:13:59.067102 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="55af9c67-18ce-46f1-a761-d11ce16f42d6" containerName="ovn-acl-logging"
Nov 27 16:13:59 crc kubenswrapper[4707]: I1127 16:13:59.067128 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="55af9c67-18ce-46f1-a761-d11ce16f42d6" containerName="northd"
Nov 27 16:13:59 crc kubenswrapper[4707]: I1127 16:13:59.067146 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="55af9c67-18ce-46f1-a761-d11ce16f42d6" containerName="ovnkube-controller"
Nov 27 16:13:59 crc kubenswrapper[4707]: I1127 16:13:59.067163 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="55af9c67-18ce-46f1-a761-d11ce16f42d6" containerName="ovnkube-controller"
Nov 27 16:13:59 crc kubenswrapper[4707]: I1127 16:13:59.067176 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="55af9c67-18ce-46f1-a761-d11ce16f42d6" containerName="nbdb"
Nov 27 16:13:59 crc kubenswrapper[4707]: I1127 16:13:59.067192 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="55af9c67-18ce-46f1-a761-d11ce16f42d6" containerName="ovnkube-controller"
Nov 27 16:13:59 crc kubenswrapper[4707]: I1127 16:13:59.067211 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="55af9c67-18ce-46f1-a761-d11ce16f42d6" containerName="ovn-controller"
Nov 27 16:13:59 crc kubenswrapper[4707]: I1127 16:13:59.067228 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="55af9c67-18ce-46f1-a761-d11ce16f42d6" containerName="kube-rbac-proxy-node"
Nov 27 16:13:59 crc kubenswrapper[4707]: I1127 16:13:59.067241 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="55af9c67-18ce-46f1-a761-d11ce16f42d6" containerName="ovnkube-controller"
Nov 27 16:13:59 crc kubenswrapper[4707]: I1127 16:13:59.067255 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="55af9c67-18ce-46f1-a761-d11ce16f42d6" containerName="ovnkube-controller"
Nov 27 16:13:59 crc kubenswrapper[4707]: I1127 16:13:59.067273 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="55af9c67-18ce-46f1-a761-d11ce16f42d6" containerName="sbdb"
Nov 27 16:13:59 crc kubenswrapper[4707]: I1127 16:13:59.067294 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="55af9c67-18ce-46f1-a761-d11ce16f42d6" containerName="kube-rbac-proxy-ovn-metrics"
Nov 27 16:13:59 crc kubenswrapper[4707]: E1127 16:13:59.067481 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="55af9c67-18ce-46f1-a761-d11ce16f42d6" containerName="ovnkube-controller"
Nov 27 16:13:59 crc kubenswrapper[4707]: I1127 16:13:59.067496 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="55af9c67-18ce-46f1-a761-d11ce16f42d6" containerName="ovnkube-controller"
Nov 27 16:13:59 crc kubenswrapper[4707]: I1127 16:13:59.070677 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-kxkld"
Nov 27 16:13:59 crc kubenswrapper[4707]: I1127 16:13:59.078038 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/55af9c67-18ce-46f1-a761-d11ce16f42d6-etc-openvswitch\") pod \"55af9c67-18ce-46f1-a761-d11ce16f42d6\" (UID: \"55af9c67-18ce-46f1-a761-d11ce16f42d6\") "
Nov 27 16:13:59 crc kubenswrapper[4707]: I1127 16:13:59.078088 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/55af9c67-18ce-46f1-a761-d11ce16f42d6-run-openvswitch\") pod \"55af9c67-18ce-46f1-a761-d11ce16f42d6\" (UID: \"55af9c67-18ce-46f1-a761-d11ce16f42d6\") "
Nov 27 16:13:59 crc kubenswrapper[4707]: I1127 16:13:59.078120 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/55af9c67-18ce-46f1-a761-d11ce16f42d6-ovnkube-config\") pod \"55af9c67-18ce-46f1-a761-d11ce16f42d6\" (UID: \"55af9c67-18ce-46f1-a761-d11ce16f42d6\") "
Nov 27 16:13:59 crc kubenswrapper[4707]: I1127 16:13:59.078170 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/55af9c67-18ce-46f1-a761-d11ce16f42d6-host-kubelet\") pod \"55af9c67-18ce-46f1-a761-d11ce16f42d6\" (UID: \"55af9c67-18ce-46f1-a761-d11ce16f42d6\") "
Nov 27 16:13:59 crc kubenswrapper[4707]: I1127 16:13:59.078159 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/55af9c67-18ce-46f1-a761-d11ce16f42d6-etc-openvswitch" (OuterVolumeSpecName: "etc-openvswitch") pod "55af9c67-18ce-46f1-a761-d11ce16f42d6" (UID: "55af9c67-18ce-46f1-a761-d11ce16f42d6"). InnerVolumeSpecName "etc-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Nov 27 16:13:59 crc kubenswrapper[4707]: I1127 16:13:59.078194 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/55af9c67-18ce-46f1-a761-d11ce16f42d6-systemd-units\") pod \"55af9c67-18ce-46f1-a761-d11ce16f42d6\" (UID: \"55af9c67-18ce-46f1-a761-d11ce16f42d6\") "
Nov 27 16:13:59 crc kubenswrapper[4707]: I1127 16:13:59.078217 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/55af9c67-18ce-46f1-a761-d11ce16f42d6-env-overrides\") pod \"55af9c67-18ce-46f1-a761-d11ce16f42d6\" (UID: \"55af9c67-18ce-46f1-a761-d11ce16f42d6\") "
Nov 27 16:13:59 crc kubenswrapper[4707]: I1127 16:13:59.078207 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/55af9c67-18ce-46f1-a761-d11ce16f42d6-run-openvswitch" (OuterVolumeSpecName: "run-openvswitch") pod "55af9c67-18ce-46f1-a761-d11ce16f42d6" (UID: "55af9c67-18ce-46f1-a761-d11ce16f42d6"). InnerVolumeSpecName "run-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Nov 27 16:13:59 crc kubenswrapper[4707]: I1127 16:13:59.078245 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/55af9c67-18ce-46f1-a761-d11ce16f42d6-node-log\") pod \"55af9c67-18ce-46f1-a761-d11ce16f42d6\" (UID: \"55af9c67-18ce-46f1-a761-d11ce16f42d6\") "
Nov 27 16:13:59 crc kubenswrapper[4707]: I1127 16:13:59.078269 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/55af9c67-18ce-46f1-a761-d11ce16f42d6-run-systemd\") pod \"55af9c67-18ce-46f1-a761-d11ce16f42d6\" (UID: \"55af9c67-18ce-46f1-a761-d11ce16f42d6\") "
Nov 27 16:13:59 crc kubenswrapper[4707]: I1127 16:13:59.078276 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/55af9c67-18ce-46f1-a761-d11ce16f42d6-systemd-units" (OuterVolumeSpecName: "systemd-units") pod "55af9c67-18ce-46f1-a761-d11ce16f42d6" (UID: "55af9c67-18ce-46f1-a761-d11ce16f42d6"). InnerVolumeSpecName "systemd-units". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Nov 27 16:13:59 crc kubenswrapper[4707]: I1127 16:13:59.078271 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/55af9c67-18ce-46f1-a761-d11ce16f42d6-host-kubelet" (OuterVolumeSpecName: "host-kubelet") pod "55af9c67-18ce-46f1-a761-d11ce16f42d6" (UID: "55af9c67-18ce-46f1-a761-d11ce16f42d6"). InnerVolumeSpecName "host-kubelet". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Nov 27 16:13:59 crc kubenswrapper[4707]: I1127 16:13:59.078297 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/55af9c67-18ce-46f1-a761-d11ce16f42d6-host-var-lib-cni-networks-ovn-kubernetes\") pod \"55af9c67-18ce-46f1-a761-d11ce16f42d6\" (UID: \"55af9c67-18ce-46f1-a761-d11ce16f42d6\") "
Nov 27 16:13:59 crc kubenswrapper[4707]: I1127 16:13:59.078304 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/55af9c67-18ce-46f1-a761-d11ce16f42d6-node-log" (OuterVolumeSpecName: "node-log") pod "55af9c67-18ce-46f1-a761-d11ce16f42d6" (UID: "55af9c67-18ce-46f1-a761-d11ce16f42d6"). InnerVolumeSpecName "node-log". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Nov 27 16:13:59 crc kubenswrapper[4707]: I1127 16:13:59.078332 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/55af9c67-18ce-46f1-a761-d11ce16f42d6-host-cni-bin\") pod \"55af9c67-18ce-46f1-a761-d11ce16f42d6\" (UID: \"55af9c67-18ce-46f1-a761-d11ce16f42d6\") "
Nov 27 16:13:59 crc kubenswrapper[4707]: I1127 16:13:59.078355 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/55af9c67-18ce-46f1-a761-d11ce16f42d6-host-run-ovn-kubernetes\") pod \"55af9c67-18ce-46f1-a761-d11ce16f42d6\" (UID: \"55af9c67-18ce-46f1-a761-d11ce16f42d6\") "
Nov 27 16:13:59 crc kubenswrapper[4707]: I1127 16:13:59.078413 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p6xmv\" (UniqueName: \"kubernetes.io/projected/55af9c67-18ce-46f1-a761-d11ce16f42d6-kube-api-access-p6xmv\") pod \"55af9c67-18ce-46f1-a761-d11ce16f42d6\" (UID: \"55af9c67-18ce-46f1-a761-d11ce16f42d6\") "
Nov 27 16:13:59 crc kubenswrapper[4707]: I1127 16:13:59.078435 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/55af9c67-18ce-46f1-a761-d11ce16f42d6-host-run-netns\") pod \"55af9c67-18ce-46f1-a761-d11ce16f42d6\" (UID: \"55af9c67-18ce-46f1-a761-d11ce16f42d6\") "
Nov 27 16:13:59 crc kubenswrapper[4707]: I1127 16:13:59.078455 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/55af9c67-18ce-46f1-a761-d11ce16f42d6-var-lib-openvswitch\") pod \"55af9c67-18ce-46f1-a761-d11ce16f42d6\" (UID: \"55af9c67-18ce-46f1-a761-d11ce16f42d6\") "
Nov 27 16:13:59 crc kubenswrapper[4707]: I1127 16:13:59.078459 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/55af9c67-18ce-46f1-a761-d11ce16f42d6-host-cni-bin" (OuterVolumeSpecName: "host-cni-bin") pod "55af9c67-18ce-46f1-a761-d11ce16f42d6" (UID: "55af9c67-18ce-46f1-a761-d11ce16f42d6"). InnerVolumeSpecName "host-cni-bin". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Nov 27 16:13:59 crc kubenswrapper[4707]: I1127 16:13:59.078477 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/55af9c67-18ce-46f1-a761-d11ce16f42d6-log-socket\") pod \"55af9c67-18ce-46f1-a761-d11ce16f42d6\" (UID: \"55af9c67-18ce-46f1-a761-d11ce16f42d6\") "
Nov 27 16:13:59 crc kubenswrapper[4707]: I1127 16:13:59.078468 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/55af9c67-18ce-46f1-a761-d11ce16f42d6-host-var-lib-cni-networks-ovn-kubernetes" (OuterVolumeSpecName: "host-var-lib-cni-networks-ovn-kubernetes") pod "55af9c67-18ce-46f1-a761-d11ce16f42d6" (UID: "55af9c67-18ce-46f1-a761-d11ce16f42d6"). InnerVolumeSpecName "host-var-lib-cni-networks-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Nov 27 16:13:59 crc kubenswrapper[4707]: I1127 16:13:59.078492 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/55af9c67-18ce-46f1-a761-d11ce16f42d6-host-run-ovn-kubernetes" (OuterVolumeSpecName: "host-run-ovn-kubernetes") pod "55af9c67-18ce-46f1-a761-d11ce16f42d6" (UID: "55af9c67-18ce-46f1-a761-d11ce16f42d6"). InnerVolumeSpecName "host-run-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Nov 27 16:13:59 crc kubenswrapper[4707]: I1127 16:13:59.078529 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/55af9c67-18ce-46f1-a761-d11ce16f42d6-run-ovn\") pod \"55af9c67-18ce-46f1-a761-d11ce16f42d6\" (UID: \"55af9c67-18ce-46f1-a761-d11ce16f42d6\") "
Nov 27 16:13:59 crc kubenswrapper[4707]: I1127 16:13:59.078556 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/55af9c67-18ce-46f1-a761-d11ce16f42d6-ovn-node-metrics-cert\") pod \"55af9c67-18ce-46f1-a761-d11ce16f42d6\" (UID: \"55af9c67-18ce-46f1-a761-d11ce16f42d6\") "
Nov 27 16:13:59 crc kubenswrapper[4707]: I1127 16:13:59.078530 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/55af9c67-18ce-46f1-a761-d11ce16f42d6-host-run-netns" (OuterVolumeSpecName: "host-run-netns") pod "55af9c67-18ce-46f1-a761-d11ce16f42d6" (UID: "55af9c67-18ce-46f1-a761-d11ce16f42d6"). InnerVolumeSpecName "host-run-netns". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Nov 27 16:13:59 crc kubenswrapper[4707]: I1127 16:13:59.078572 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/55af9c67-18ce-46f1-a761-d11ce16f42d6-log-socket" (OuterVolumeSpecName: "log-socket") pod "55af9c67-18ce-46f1-a761-d11ce16f42d6" (UID: "55af9c67-18ce-46f1-a761-d11ce16f42d6"). InnerVolumeSpecName "log-socket". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Nov 27 16:13:59 crc kubenswrapper[4707]: I1127 16:13:59.078546 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/55af9c67-18ce-46f1-a761-d11ce16f42d6-var-lib-openvswitch" (OuterVolumeSpecName: "var-lib-openvswitch") pod "55af9c67-18ce-46f1-a761-d11ce16f42d6" (UID: "55af9c67-18ce-46f1-a761-d11ce16f42d6"). InnerVolumeSpecName "var-lib-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Nov 27 16:13:59 crc kubenswrapper[4707]: I1127 16:13:59.078582 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/55af9c67-18ce-46f1-a761-d11ce16f42d6-ovnkube-script-lib\") pod \"55af9c67-18ce-46f1-a761-d11ce16f42d6\" (UID: \"55af9c67-18ce-46f1-a761-d11ce16f42d6\") "
Nov 27 16:13:59 crc kubenswrapper[4707]: I1127 16:13:59.078597 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/55af9c67-18ce-46f1-a761-d11ce16f42d6-run-ovn" (OuterVolumeSpecName: "run-ovn") pod "55af9c67-18ce-46f1-a761-d11ce16f42d6" (UID: "55af9c67-18ce-46f1-a761-d11ce16f42d6"). InnerVolumeSpecName "run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Nov 27 16:13:59 crc kubenswrapper[4707]: I1127 16:13:59.078643 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/55af9c67-18ce-46f1-a761-d11ce16f42d6-host-slash\") pod \"55af9c67-18ce-46f1-a761-d11ce16f42d6\" (UID: \"55af9c67-18ce-46f1-a761-d11ce16f42d6\") "
Nov 27 16:13:59 crc kubenswrapper[4707]: I1127 16:13:59.078673 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/55af9c67-18ce-46f1-a761-d11ce16f42d6-host-cni-netd\") pod \"55af9c67-18ce-46f1-a761-d11ce16f42d6\" (UID: \"55af9c67-18ce-46f1-a761-d11ce16f42d6\") "
Nov 27 16:13:59 crc kubenswrapper[4707]: I1127 16:13:59.078721 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/55af9c67-18ce-46f1-a761-d11ce16f42d6-host-slash" (OuterVolumeSpecName: "host-slash") pod "55af9c67-18ce-46f1-a761-d11ce16f42d6" (UID: "55af9c67-18ce-46f1-a761-d11ce16f42d6"). InnerVolumeSpecName "host-slash". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Nov 27 16:13:59 crc kubenswrapper[4707]: I1127 16:13:59.078826 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/55af9c67-18ce-46f1-a761-d11ce16f42d6-host-cni-netd" (OuterVolumeSpecName: "host-cni-netd") pod "55af9c67-18ce-46f1-a761-d11ce16f42d6" (UID: "55af9c67-18ce-46f1-a761-d11ce16f42d6"). InnerVolumeSpecName "host-cni-netd". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Nov 27 16:13:59 crc kubenswrapper[4707]: I1127 16:13:59.078866 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/55af9c67-18ce-46f1-a761-d11ce16f42d6-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "55af9c67-18ce-46f1-a761-d11ce16f42d6" (UID: "55af9c67-18ce-46f1-a761-d11ce16f42d6"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 27 16:13:59 crc kubenswrapper[4707]: I1127 16:13:59.079025 4707 reconciler_common.go:293] "Volume detached for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/55af9c67-18ce-46f1-a761-d11ce16f42d6-run-ovn\") on node \"crc\" DevicePath \"\""
Nov 27 16:13:59 crc kubenswrapper[4707]: I1127 16:13:59.079050 4707 reconciler_common.go:293] "Volume detached for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/55af9c67-18ce-46f1-a761-d11ce16f42d6-host-slash\") on node \"crc\" DevicePath \"\""
Nov 27 16:13:59 crc kubenswrapper[4707]: I1127 16:13:59.079069 4707 reconciler_common.go:293] "Volume detached for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/55af9c67-18ce-46f1-a761-d11ce16f42d6-host-cni-netd\") on node \"crc\" DevicePath \"\""
Nov 27 16:13:59 crc kubenswrapper[4707]: I1127 16:13:59.079089 4707 reconciler_common.go:293] "Volume detached for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/55af9c67-18ce-46f1-a761-d11ce16f42d6-etc-openvswitch\") on node \"crc\" DevicePath \"\""
Nov 27 16:13:59 crc kubenswrapper[4707]: I1127 16:13:59.079106 4707 reconciler_common.go:293] "Volume detached for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/55af9c67-18ce-46f1-a761-d11ce16f42d6-run-openvswitch\") on node \"crc\" DevicePath \"\""
Nov 27 16:13:59 crc kubenswrapper[4707]: I1127 16:13:59.079123 4707 reconciler_common.go:293] "Volume detached for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/55af9c67-18ce-46f1-a761-d11ce16f42d6-host-kubelet\") on node \"crc\" DevicePath \"\""
Nov 27 16:13:59 crc kubenswrapper[4707]: I1127 16:13:59.079138 4707 reconciler_common.go:293] "Volume detached for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/55af9c67-18ce-46f1-a761-d11ce16f42d6-systemd-units\") on node \"crc\" DevicePath \"\""
Nov 27 16:13:59 crc kubenswrapper[4707]: I1127 16:13:59.079154 4707 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/55af9c67-18ce-46f1-a761-d11ce16f42d6-env-overrides\") on node \"crc\" DevicePath \"\""
Nov 27 16:13:59 crc kubenswrapper[4707]: I1127 16:13:59.079170 4707 reconciler_common.go:293] "Volume detached for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/55af9c67-18ce-46f1-a761-d11ce16f42d6-node-log\") on node \"crc\" DevicePath \"\""
Nov 27 16:13:59 crc kubenswrapper[4707]: I1127 16:13:59.079187 4707 reconciler_common.go:293] "Volume detached for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/55af9c67-18ce-46f1-a761-d11ce16f42d6-host-var-lib-cni-networks-ovn-kubernetes\") on node \"crc\" DevicePath \"\""
Nov 27 16:13:59 crc kubenswrapper[4707]: I1127 16:13:59.079205 4707 reconciler_common.go:293] "Volume detached for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/55af9c67-18ce-46f1-a761-d11ce16f42d6-host-cni-bin\") on node \"crc\" DevicePath \"\""
Nov 27 16:13:59 crc kubenswrapper[4707]: I1127 16:13:59.079225 4707 reconciler_common.go:293] "Volume detached for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/55af9c67-18ce-46f1-a761-d11ce16f42d6-host-run-ovn-kubernetes\") on node \"crc\" DevicePath \"\""
Nov 27 16:13:59 crc kubenswrapper[4707]: I1127 16:13:59.079245 4707 reconciler_common.go:293] "Volume detached for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/55af9c67-18ce-46f1-a761-d11ce16f42d6-host-run-netns\") on node \"crc\" DevicePath \"\""
Nov 27 16:13:59 crc kubenswrapper[4707]: I1127 16:13:59.079261 4707 reconciler_common.go:293] "Volume detached for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/55af9c67-18ce-46f1-a761-d11ce16f42d6-var-lib-openvswitch\") on node \"crc\" DevicePath \"\""
Nov 27 16:13:59 crc kubenswrapper[4707]: I1127 16:13:59.079278 4707 reconciler_common.go:293] "Volume detached for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/55af9c67-18ce-46f1-a761-d11ce16f42d6-log-socket\") on node \"crc\" DevicePath \"\""
Nov 27 16:13:59 crc kubenswrapper[4707]: I1127 16:13:59.079025 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/55af9c67-18ce-46f1-a761-d11ce16f42d6-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "55af9c67-18ce-46f1-a761-d11ce16f42d6" (UID: "55af9c67-18ce-46f1-a761-d11ce16f42d6"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 27 16:13:59 crc kubenswrapper[4707]: I1127 16:13:59.079477 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/55af9c67-18ce-46f1-a761-d11ce16f42d6-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "55af9c67-18ce-46f1-a761-d11ce16f42d6" (UID: "55af9c67-18ce-46f1-a761-d11ce16f42d6"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 27 16:13:59 crc kubenswrapper[4707]: I1127 16:13:59.086586 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/55af9c67-18ce-46f1-a761-d11ce16f42d6-kube-api-access-p6xmv" (OuterVolumeSpecName: "kube-api-access-p6xmv") pod "55af9c67-18ce-46f1-a761-d11ce16f42d6" (UID: "55af9c67-18ce-46f1-a761-d11ce16f42d6"). InnerVolumeSpecName "kube-api-access-p6xmv". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 27 16:13:59 crc kubenswrapper[4707]: I1127 16:13:59.087870 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/55af9c67-18ce-46f1-a761-d11ce16f42d6-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "55af9c67-18ce-46f1-a761-d11ce16f42d6" (UID: "55af9c67-18ce-46f1-a761-d11ce16f42d6"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 27 16:13:59 crc kubenswrapper[4707]: I1127 16:13:59.107810 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/55af9c67-18ce-46f1-a761-d11ce16f42d6-run-systemd" (OuterVolumeSpecName: "run-systemd") pod "55af9c67-18ce-46f1-a761-d11ce16f42d6" (UID: "55af9c67-18ce-46f1-a761-d11ce16f42d6"). InnerVolumeSpecName "run-systemd". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Nov 27 16:13:59 crc kubenswrapper[4707]: I1127 16:13:59.180038 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/e1011045-347d-49fc-add6-74ef236a60f7-host-slash\") pod \"ovnkube-node-kxkld\" (UID: \"e1011045-347d-49fc-add6-74ef236a60f7\") " pod="openshift-ovn-kubernetes/ovnkube-node-kxkld"
Nov 27 16:13:59 crc kubenswrapper[4707]: I1127 16:13:59.180092 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e1011045-347d-49fc-add6-74ef236a60f7-host-run-ovn-kubernetes\") pod \"ovnkube-node-kxkld\" (UID: \"e1011045-347d-49fc-add6-74ef236a60f7\") " pod="openshift-ovn-kubernetes/ovnkube-node-kxkld"
Nov 27 16:13:59 crc kubenswrapper[4707]: I1127 16:13:59.180117 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e1011045-347d-49fc-add6-74ef236a60f7-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-kxkld\" (UID: \"e1011045-347d-49fc-add6-74ef236a60f7\") " pod="openshift-ovn-kubernetes/ovnkube-node-kxkld"
Nov 27 16:13:59 crc kubenswrapper[4707]: I1127 16:13:59.180139 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jjq2d\" (UniqueName: \"kubernetes.io/projected/e1011045-347d-49fc-add6-74ef236a60f7-kube-api-access-jjq2d\") pod \"ovnkube-node-kxkld\" (UID: \"e1011045-347d-49fc-add6-74ef236a60f7\") " pod="openshift-ovn-kubernetes/ovnkube-node-kxkld"
Nov 27 16:13:59 crc kubenswrapper[4707]: I1127 16:13:59.180286 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/e1011045-347d-49fc-add6-74ef236a60f7-host-cni-bin\") pod \"ovnkube-node-kxkld\" (UID: \"e1011045-347d-49fc-add6-74ef236a60f7\") " pod="openshift-ovn-kubernetes/ovnkube-node-kxkld"
Nov 27 16:13:59 crc kubenswrapper[4707]: I1127 16:13:59.180352 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/e1011045-347d-49fc-add6-74ef236a60f7-host-run-netns\") pod \"ovnkube-node-kxkld\" (UID: \"e1011045-347d-49fc-add6-74ef236a60f7\") " pod="openshift-ovn-kubernetes/ovnkube-node-kxkld"
Nov 27 16:13:59 crc kubenswrapper[4707]: I1127 16:13:59.180414 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/e1011045-347d-49fc-add6-74ef236a60f7-run-ovn\") pod \"ovnkube-node-kxkld\" (UID: \"e1011045-347d-49fc-add6-74ef236a60f7\") " pod="openshift-ovn-kubernetes/ovnkube-node-kxkld"
Nov 27 16:13:59 crc kubenswrapper[4707]: I1127 16:13:59.180568 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e1011045-347d-49fc-add6-74ef236a60f7-var-lib-openvswitch\") pod \"ovnkube-node-kxkld\" (UID: \"e1011045-347d-49fc-add6-74ef236a60f7\") " pod="openshift-ovn-kubernetes/ovnkube-node-kxkld"
Nov 27 16:13:59 crc kubenswrapper[4707]: I1127 16:13:59.180625 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume
\"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/e1011045-347d-49fc-add6-74ef236a60f7-ovnkube-script-lib\") pod \"ovnkube-node-kxkld\" (UID: \"e1011045-347d-49fc-add6-74ef236a60f7\") " pod="openshift-ovn-kubernetes/ovnkube-node-kxkld" Nov 27 16:13:59 crc kubenswrapper[4707]: I1127 16:13:59.180682 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/e1011045-347d-49fc-add6-74ef236a60f7-ovn-node-metrics-cert\") pod \"ovnkube-node-kxkld\" (UID: \"e1011045-347d-49fc-add6-74ef236a60f7\") " pod="openshift-ovn-kubernetes/ovnkube-node-kxkld" Nov 27 16:13:59 crc kubenswrapper[4707]: I1127 16:13:59.180732 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/e1011045-347d-49fc-add6-74ef236a60f7-host-kubelet\") pod \"ovnkube-node-kxkld\" (UID: \"e1011045-347d-49fc-add6-74ef236a60f7\") " pod="openshift-ovn-kubernetes/ovnkube-node-kxkld" Nov 27 16:13:59 crc kubenswrapper[4707]: I1127 16:13:59.180844 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/e1011045-347d-49fc-add6-74ef236a60f7-systemd-units\") pod \"ovnkube-node-kxkld\" (UID: \"e1011045-347d-49fc-add6-74ef236a60f7\") " pod="openshift-ovn-kubernetes/ovnkube-node-kxkld" Nov 27 16:13:59 crc kubenswrapper[4707]: I1127 16:13:59.180940 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/e1011045-347d-49fc-add6-74ef236a60f7-log-socket\") pod \"ovnkube-node-kxkld\" (UID: \"e1011045-347d-49fc-add6-74ef236a60f7\") " pod="openshift-ovn-kubernetes/ovnkube-node-kxkld" Nov 27 16:13:59 crc kubenswrapper[4707]: I1127 16:13:59.181031 4707 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/e1011045-347d-49fc-add6-74ef236a60f7-node-log\") pod \"ovnkube-node-kxkld\" (UID: \"e1011045-347d-49fc-add6-74ef236a60f7\") " pod="openshift-ovn-kubernetes/ovnkube-node-kxkld" Nov 27 16:13:59 crc kubenswrapper[4707]: I1127 16:13:59.181095 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e1011045-347d-49fc-add6-74ef236a60f7-run-openvswitch\") pod \"ovnkube-node-kxkld\" (UID: \"e1011045-347d-49fc-add6-74ef236a60f7\") " pod="openshift-ovn-kubernetes/ovnkube-node-kxkld" Nov 27 16:13:59 crc kubenswrapper[4707]: I1127 16:13:59.181129 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/e1011045-347d-49fc-add6-74ef236a60f7-host-cni-netd\") pod \"ovnkube-node-kxkld\" (UID: \"e1011045-347d-49fc-add6-74ef236a60f7\") " pod="openshift-ovn-kubernetes/ovnkube-node-kxkld" Nov 27 16:13:59 crc kubenswrapper[4707]: I1127 16:13:59.181162 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/e1011045-347d-49fc-add6-74ef236a60f7-env-overrides\") pod \"ovnkube-node-kxkld\" (UID: \"e1011045-347d-49fc-add6-74ef236a60f7\") " pod="openshift-ovn-kubernetes/ovnkube-node-kxkld" Nov 27 16:13:59 crc kubenswrapper[4707]: I1127 16:13:59.181220 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/e1011045-347d-49fc-add6-74ef236a60f7-run-systemd\") pod \"ovnkube-node-kxkld\" (UID: \"e1011045-347d-49fc-add6-74ef236a60f7\") " pod="openshift-ovn-kubernetes/ovnkube-node-kxkld" Nov 27 16:13:59 crc kubenswrapper[4707]: I1127 16:13:59.181252 4707 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/e1011045-347d-49fc-add6-74ef236a60f7-ovnkube-config\") pod \"ovnkube-node-kxkld\" (UID: \"e1011045-347d-49fc-add6-74ef236a60f7\") " pod="openshift-ovn-kubernetes/ovnkube-node-kxkld" Nov 27 16:13:59 crc kubenswrapper[4707]: I1127 16:13:59.181318 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e1011045-347d-49fc-add6-74ef236a60f7-etc-openvswitch\") pod \"ovnkube-node-kxkld\" (UID: \"e1011045-347d-49fc-add6-74ef236a60f7\") " pod="openshift-ovn-kubernetes/ovnkube-node-kxkld" Nov 27 16:13:59 crc kubenswrapper[4707]: I1127 16:13:59.181425 4707 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/55af9c67-18ce-46f1-a761-d11ce16f42d6-ovnkube-config\") on node \"crc\" DevicePath \"\"" Nov 27 16:13:59 crc kubenswrapper[4707]: I1127 16:13:59.181446 4707 reconciler_common.go:293] "Volume detached for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/55af9c67-18ce-46f1-a761-d11ce16f42d6-run-systemd\") on node \"crc\" DevicePath \"\"" Nov 27 16:13:59 crc kubenswrapper[4707]: I1127 16:13:59.181459 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p6xmv\" (UniqueName: \"kubernetes.io/projected/55af9c67-18ce-46f1-a761-d11ce16f42d6-kube-api-access-p6xmv\") on node \"crc\" DevicePath \"\"" Nov 27 16:13:59 crc kubenswrapper[4707]: I1127 16:13:59.181473 4707 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/55af9c67-18ce-46f1-a761-d11ce16f42d6-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Nov 27 16:13:59 crc kubenswrapper[4707]: I1127 16:13:59.181484 4707 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: 
\"kubernetes.io/configmap/55af9c67-18ce-46f1-a761-d11ce16f42d6-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Nov 27 16:13:59 crc kubenswrapper[4707]: I1127 16:13:59.283125 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/e1011045-347d-49fc-add6-74ef236a60f7-ovnkube-script-lib\") pod \"ovnkube-node-kxkld\" (UID: \"e1011045-347d-49fc-add6-74ef236a60f7\") " pod="openshift-ovn-kubernetes/ovnkube-node-kxkld" Nov 27 16:13:59 crc kubenswrapper[4707]: I1127 16:13:59.283201 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/e1011045-347d-49fc-add6-74ef236a60f7-host-kubelet\") pod \"ovnkube-node-kxkld\" (UID: \"e1011045-347d-49fc-add6-74ef236a60f7\") " pod="openshift-ovn-kubernetes/ovnkube-node-kxkld" Nov 27 16:13:59 crc kubenswrapper[4707]: I1127 16:13:59.283234 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/e1011045-347d-49fc-add6-74ef236a60f7-ovn-node-metrics-cert\") pod \"ovnkube-node-kxkld\" (UID: \"e1011045-347d-49fc-add6-74ef236a60f7\") " pod="openshift-ovn-kubernetes/ovnkube-node-kxkld" Nov 27 16:13:59 crc kubenswrapper[4707]: I1127 16:13:59.283264 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/e1011045-347d-49fc-add6-74ef236a60f7-systemd-units\") pod \"ovnkube-node-kxkld\" (UID: \"e1011045-347d-49fc-add6-74ef236a60f7\") " pod="openshift-ovn-kubernetes/ovnkube-node-kxkld" Nov 27 16:13:59 crc kubenswrapper[4707]: I1127 16:13:59.283316 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/e1011045-347d-49fc-add6-74ef236a60f7-log-socket\") pod \"ovnkube-node-kxkld\" (UID: \"e1011045-347d-49fc-add6-74ef236a60f7\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-kxkld" Nov 27 16:13:59 crc kubenswrapper[4707]: I1127 16:13:59.283365 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/e1011045-347d-49fc-add6-74ef236a60f7-node-log\") pod \"ovnkube-node-kxkld\" (UID: \"e1011045-347d-49fc-add6-74ef236a60f7\") " pod="openshift-ovn-kubernetes/ovnkube-node-kxkld" Nov 27 16:13:59 crc kubenswrapper[4707]: I1127 16:13:59.283451 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e1011045-347d-49fc-add6-74ef236a60f7-run-openvswitch\") pod \"ovnkube-node-kxkld\" (UID: \"e1011045-347d-49fc-add6-74ef236a60f7\") " pod="openshift-ovn-kubernetes/ovnkube-node-kxkld" Nov 27 16:13:59 crc kubenswrapper[4707]: I1127 16:13:59.283502 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/e1011045-347d-49fc-add6-74ef236a60f7-host-cni-netd\") pod \"ovnkube-node-kxkld\" (UID: \"e1011045-347d-49fc-add6-74ef236a60f7\") " pod="openshift-ovn-kubernetes/ovnkube-node-kxkld" Nov 27 16:13:59 crc kubenswrapper[4707]: I1127 16:13:59.283511 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/e1011045-347d-49fc-add6-74ef236a60f7-log-socket\") pod \"ovnkube-node-kxkld\" (UID: \"e1011045-347d-49fc-add6-74ef236a60f7\") " pod="openshift-ovn-kubernetes/ovnkube-node-kxkld" Nov 27 16:13:59 crc kubenswrapper[4707]: I1127 16:13:59.283532 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/e1011045-347d-49fc-add6-74ef236a60f7-env-overrides\") pod \"ovnkube-node-kxkld\" (UID: \"e1011045-347d-49fc-add6-74ef236a60f7\") " pod="openshift-ovn-kubernetes/ovnkube-node-kxkld" Nov 27 16:13:59 crc kubenswrapper[4707]: I1127 
16:13:59.283590 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/e1011045-347d-49fc-add6-74ef236a60f7-run-systemd\") pod \"ovnkube-node-kxkld\" (UID: \"e1011045-347d-49fc-add6-74ef236a60f7\") " pod="openshift-ovn-kubernetes/ovnkube-node-kxkld" Nov 27 16:13:59 crc kubenswrapper[4707]: I1127 16:13:59.283602 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/e1011045-347d-49fc-add6-74ef236a60f7-systemd-units\") pod \"ovnkube-node-kxkld\" (UID: \"e1011045-347d-49fc-add6-74ef236a60f7\") " pod="openshift-ovn-kubernetes/ovnkube-node-kxkld" Nov 27 16:13:59 crc kubenswrapper[4707]: I1127 16:13:59.283618 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/e1011045-347d-49fc-add6-74ef236a60f7-ovnkube-config\") pod \"ovnkube-node-kxkld\" (UID: \"e1011045-347d-49fc-add6-74ef236a60f7\") " pod="openshift-ovn-kubernetes/ovnkube-node-kxkld" Nov 27 16:13:59 crc kubenswrapper[4707]: I1127 16:13:59.284324 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e1011045-347d-49fc-add6-74ef236a60f7-etc-openvswitch\") pod \"ovnkube-node-kxkld\" (UID: \"e1011045-347d-49fc-add6-74ef236a60f7\") " pod="openshift-ovn-kubernetes/ovnkube-node-kxkld" Nov 27 16:13:59 crc kubenswrapper[4707]: I1127 16:13:59.284362 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/e1011045-347d-49fc-add6-74ef236a60f7-host-slash\") pod \"ovnkube-node-kxkld\" (UID: \"e1011045-347d-49fc-add6-74ef236a60f7\") " pod="openshift-ovn-kubernetes/ovnkube-node-kxkld" Nov 27 16:13:59 crc kubenswrapper[4707]: I1127 16:13:59.284434 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e1011045-347d-49fc-add6-74ef236a60f7-host-run-ovn-kubernetes\") pod \"ovnkube-node-kxkld\" (UID: \"e1011045-347d-49fc-add6-74ef236a60f7\") " pod="openshift-ovn-kubernetes/ovnkube-node-kxkld" Nov 27 16:13:59 crc kubenswrapper[4707]: I1127 16:13:59.284481 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e1011045-347d-49fc-add6-74ef236a60f7-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-kxkld\" (UID: \"e1011045-347d-49fc-add6-74ef236a60f7\") " pod="openshift-ovn-kubernetes/ovnkube-node-kxkld" Nov 27 16:13:59 crc kubenswrapper[4707]: I1127 16:13:59.284532 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jjq2d\" (UniqueName: \"kubernetes.io/projected/e1011045-347d-49fc-add6-74ef236a60f7-kube-api-access-jjq2d\") pod \"ovnkube-node-kxkld\" (UID: \"e1011045-347d-49fc-add6-74ef236a60f7\") " pod="openshift-ovn-kubernetes/ovnkube-node-kxkld" Nov 27 16:13:59 crc kubenswrapper[4707]: I1127 16:13:59.284628 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/e1011045-347d-49fc-add6-74ef236a60f7-host-cni-bin\") pod \"ovnkube-node-kxkld\" (UID: \"e1011045-347d-49fc-add6-74ef236a60f7\") " pod="openshift-ovn-kubernetes/ovnkube-node-kxkld" Nov 27 16:13:59 crc kubenswrapper[4707]: I1127 16:13:59.284662 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/e1011045-347d-49fc-add6-74ef236a60f7-host-run-netns\") pod \"ovnkube-node-kxkld\" (UID: \"e1011045-347d-49fc-add6-74ef236a60f7\") " pod="openshift-ovn-kubernetes/ovnkube-node-kxkld" Nov 27 16:13:59 crc kubenswrapper[4707]: I1127 16:13:59.284701 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"run-ovn\" (UniqueName: \"kubernetes.io/host-path/e1011045-347d-49fc-add6-74ef236a60f7-run-ovn\") pod \"ovnkube-node-kxkld\" (UID: \"e1011045-347d-49fc-add6-74ef236a60f7\") " pod="openshift-ovn-kubernetes/ovnkube-node-kxkld" Nov 27 16:13:59 crc kubenswrapper[4707]: I1127 16:13:59.283445 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/e1011045-347d-49fc-add6-74ef236a60f7-host-kubelet\") pod \"ovnkube-node-kxkld\" (UID: \"e1011045-347d-49fc-add6-74ef236a60f7\") " pod="openshift-ovn-kubernetes/ovnkube-node-kxkld" Nov 27 16:13:59 crc kubenswrapper[4707]: I1127 16:13:59.284787 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/e1011045-347d-49fc-add6-74ef236a60f7-ovnkube-script-lib\") pod \"ovnkube-node-kxkld\" (UID: \"e1011045-347d-49fc-add6-74ef236a60f7\") " pod="openshift-ovn-kubernetes/ovnkube-node-kxkld" Nov 27 16:13:59 crc kubenswrapper[4707]: I1127 16:13:59.284884 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/e1011045-347d-49fc-add6-74ef236a60f7-run-ovn\") pod \"ovnkube-node-kxkld\" (UID: \"e1011045-347d-49fc-add6-74ef236a60f7\") " pod="openshift-ovn-kubernetes/ovnkube-node-kxkld" Nov 27 16:13:59 crc kubenswrapper[4707]: I1127 16:13:59.284917 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e1011045-347d-49fc-add6-74ef236a60f7-run-openvswitch\") pod \"ovnkube-node-kxkld\" (UID: \"e1011045-347d-49fc-add6-74ef236a60f7\") " pod="openshift-ovn-kubernetes/ovnkube-node-kxkld" Nov 27 16:13:59 crc kubenswrapper[4707]: I1127 16:13:59.284957 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/e1011045-347d-49fc-add6-74ef236a60f7-node-log\") pod \"ovnkube-node-kxkld\" (UID: 
\"e1011045-347d-49fc-add6-74ef236a60f7\") " pod="openshift-ovn-kubernetes/ovnkube-node-kxkld" Nov 27 16:13:59 crc kubenswrapper[4707]: I1127 16:13:59.285000 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e1011045-347d-49fc-add6-74ef236a60f7-etc-openvswitch\") pod \"ovnkube-node-kxkld\" (UID: \"e1011045-347d-49fc-add6-74ef236a60f7\") " pod="openshift-ovn-kubernetes/ovnkube-node-kxkld" Nov 27 16:13:59 crc kubenswrapper[4707]: I1127 16:13:59.285025 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e1011045-347d-49fc-add6-74ef236a60f7-var-lib-openvswitch\") pod \"ovnkube-node-kxkld\" (UID: \"e1011045-347d-49fc-add6-74ef236a60f7\") " pod="openshift-ovn-kubernetes/ovnkube-node-kxkld" Nov 27 16:13:59 crc kubenswrapper[4707]: I1127 16:13:59.285055 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e1011045-347d-49fc-add6-74ef236a60f7-host-run-ovn-kubernetes\") pod \"ovnkube-node-kxkld\" (UID: \"e1011045-347d-49fc-add6-74ef236a60f7\") " pod="openshift-ovn-kubernetes/ovnkube-node-kxkld" Nov 27 16:13:59 crc kubenswrapper[4707]: I1127 16:13:59.285111 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/e1011045-347d-49fc-add6-74ef236a60f7-host-slash\") pod \"ovnkube-node-kxkld\" (UID: \"e1011045-347d-49fc-add6-74ef236a60f7\") " pod="openshift-ovn-kubernetes/ovnkube-node-kxkld" Nov 27 16:13:59 crc kubenswrapper[4707]: I1127 16:13:59.285206 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e1011045-347d-49fc-add6-74ef236a60f7-var-lib-openvswitch\") pod \"ovnkube-node-kxkld\" (UID: \"e1011045-347d-49fc-add6-74ef236a60f7\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-kxkld" Nov 27 16:13:59 crc kubenswrapper[4707]: I1127 16:13:59.285461 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/e1011045-347d-49fc-add6-74ef236a60f7-host-cni-netd\") pod \"ovnkube-node-kxkld\" (UID: \"e1011045-347d-49fc-add6-74ef236a60f7\") " pod="openshift-ovn-kubernetes/ovnkube-node-kxkld" Nov 27 16:13:59 crc kubenswrapper[4707]: I1127 16:13:59.285520 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e1011045-347d-49fc-add6-74ef236a60f7-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-kxkld\" (UID: \"e1011045-347d-49fc-add6-74ef236a60f7\") " pod="openshift-ovn-kubernetes/ovnkube-node-kxkld" Nov 27 16:13:59 crc kubenswrapper[4707]: I1127 16:13:59.285561 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/e1011045-347d-49fc-add6-74ef236a60f7-run-systemd\") pod \"ovnkube-node-kxkld\" (UID: \"e1011045-347d-49fc-add6-74ef236a60f7\") " pod="openshift-ovn-kubernetes/ovnkube-node-kxkld" Nov 27 16:13:59 crc kubenswrapper[4707]: I1127 16:13:59.285561 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/e1011045-347d-49fc-add6-74ef236a60f7-host-cni-bin\") pod \"ovnkube-node-kxkld\" (UID: \"e1011045-347d-49fc-add6-74ef236a60f7\") " pod="openshift-ovn-kubernetes/ovnkube-node-kxkld" Nov 27 16:13:59 crc kubenswrapper[4707]: I1127 16:13:59.285743 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/e1011045-347d-49fc-add6-74ef236a60f7-host-run-netns\") pod \"ovnkube-node-kxkld\" (UID: \"e1011045-347d-49fc-add6-74ef236a60f7\") " pod="openshift-ovn-kubernetes/ovnkube-node-kxkld" Nov 27 16:13:59 crc 
kubenswrapper[4707]: I1127 16:13:59.286223 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/e1011045-347d-49fc-add6-74ef236a60f7-env-overrides\") pod \"ovnkube-node-kxkld\" (UID: \"e1011045-347d-49fc-add6-74ef236a60f7\") " pod="openshift-ovn-kubernetes/ovnkube-node-kxkld" Nov 27 16:13:59 crc kubenswrapper[4707]: I1127 16:13:59.287311 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/e1011045-347d-49fc-add6-74ef236a60f7-ovnkube-config\") pod \"ovnkube-node-kxkld\" (UID: \"e1011045-347d-49fc-add6-74ef236a60f7\") " pod="openshift-ovn-kubernetes/ovnkube-node-kxkld" Nov 27 16:13:59 crc kubenswrapper[4707]: I1127 16:13:59.288182 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/e1011045-347d-49fc-add6-74ef236a60f7-ovn-node-metrics-cert\") pod \"ovnkube-node-kxkld\" (UID: \"e1011045-347d-49fc-add6-74ef236a60f7\") " pod="openshift-ovn-kubernetes/ovnkube-node-kxkld" Nov 27 16:13:59 crc kubenswrapper[4707]: I1127 16:13:59.317084 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jjq2d\" (UniqueName: \"kubernetes.io/projected/e1011045-347d-49fc-add6-74ef236a60f7-kube-api-access-jjq2d\") pod \"ovnkube-node-kxkld\" (UID: \"e1011045-347d-49fc-add6-74ef236a60f7\") " pod="openshift-ovn-kubernetes/ovnkube-node-kxkld" Nov 27 16:13:59 crc kubenswrapper[4707]: I1127 16:13:59.321576 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-js6mm_9ca48c08-f39d-41a2-847a-c893a2111492/kube-multus/2.log" Nov 27 16:13:59 crc kubenswrapper[4707]: I1127 16:13:59.322656 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-js6mm_9ca48c08-f39d-41a2-847a-c893a2111492/kube-multus/1.log" Nov 27 16:13:59 crc kubenswrapper[4707]: I1127 16:13:59.322754 
4707 generic.go:334] "Generic (PLEG): container finished" podID="9ca48c08-f39d-41a2-847a-c893a2111492" containerID="3296b907d541dc79acbf2d75abe4ced1851608091496a03fc4a85f1879a836c6" exitCode=2 Nov 27 16:13:59 crc kubenswrapper[4707]: I1127 16:13:59.322874 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-js6mm" event={"ID":"9ca48c08-f39d-41a2-847a-c893a2111492","Type":"ContainerDied","Data":"3296b907d541dc79acbf2d75abe4ced1851608091496a03fc4a85f1879a836c6"} Nov 27 16:13:59 crc kubenswrapper[4707]: I1127 16:13:59.322969 4707 scope.go:117] "RemoveContainer" containerID="094325a9e2d642c065e7c06cf63ba0fcad78c5220368b42504c2e87735d4a542" Nov 27 16:13:59 crc kubenswrapper[4707]: I1127 16:13:59.323634 4707 scope.go:117] "RemoveContainer" containerID="3296b907d541dc79acbf2d75abe4ced1851608091496a03fc4a85f1879a836c6" Nov 27 16:13:59 crc kubenswrapper[4707]: E1127 16:13:59.323937 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-js6mm_openshift-multus(9ca48c08-f39d-41a2-847a-c893a2111492)\"" pod="openshift-multus/multus-js6mm" podUID="9ca48c08-f39d-41a2-847a-c893a2111492" Nov 27 16:13:59 crc kubenswrapper[4707]: I1127 16:13:59.329544 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xkmt7_55af9c67-18ce-46f1-a761-d11ce16f42d6/ovnkube-controller/3.log" Nov 27 16:13:59 crc kubenswrapper[4707]: I1127 16:13:59.334455 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xkmt7_55af9c67-18ce-46f1-a761-d11ce16f42d6/ovn-acl-logging/0.log" Nov 27 16:13:59 crc kubenswrapper[4707]: I1127 16:13:59.335317 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xkmt7_55af9c67-18ce-46f1-a761-d11ce16f42d6/ovn-controller/0.log" Nov 27 16:13:59 crc kubenswrapper[4707]: 
I1127 16:13:59.337270 4707 generic.go:334] "Generic (PLEG): container finished" podID="55af9c67-18ce-46f1-a761-d11ce16f42d6" containerID="333494a77c14f7de080c8b9a6d70e630af6761c0406b730640c08ffc7785c54b" exitCode=0 Nov 27 16:13:59 crc kubenswrapper[4707]: I1127 16:13:59.337318 4707 generic.go:334] "Generic (PLEG): container finished" podID="55af9c67-18ce-46f1-a761-d11ce16f42d6" containerID="db3362aeb65b368d2d9181cc898270fffb60938d0e9d32a862130d06a9f572a0" exitCode=0 Nov 27 16:13:59 crc kubenswrapper[4707]: I1127 16:13:59.337337 4707 generic.go:334] "Generic (PLEG): container finished" podID="55af9c67-18ce-46f1-a761-d11ce16f42d6" containerID="44699a190cf1065e537f588d7479a10aab8fd4ffd1df2348c616d999ffdf7077" exitCode=0 Nov 27 16:13:59 crc kubenswrapper[4707]: I1127 16:13:59.337352 4707 generic.go:334] "Generic (PLEG): container finished" podID="55af9c67-18ce-46f1-a761-d11ce16f42d6" containerID="e6432d53914cee9e016c2134d99e48fa9ff1c05c0b88033992c2e0c70fc40d2e" exitCode=0 Nov 27 16:13:59 crc kubenswrapper[4707]: I1127 16:13:59.337399 4707 generic.go:334] "Generic (PLEG): container finished" podID="55af9c67-18ce-46f1-a761-d11ce16f42d6" containerID="3718d2f738ef780c738566c163d52afe355de6b8eb6004aa8e1dd5f37aa84d68" exitCode=0 Nov 27 16:13:59 crc kubenswrapper[4707]: I1127 16:13:59.337413 4707 generic.go:334] "Generic (PLEG): container finished" podID="55af9c67-18ce-46f1-a761-d11ce16f42d6" containerID="2a4eadca4b464dec02ca7aba6bc23d95b9c3965294d4b0224299279c630f3718" exitCode=0 Nov 27 16:13:59 crc kubenswrapper[4707]: I1127 16:13:59.337426 4707 generic.go:334] "Generic (PLEG): container finished" podID="55af9c67-18ce-46f1-a761-d11ce16f42d6" containerID="93825fe5d9cf2d63c3e723b063f417f7e92442b14365852f935a06d316f443fe" exitCode=143 Nov 27 16:13:59 crc kubenswrapper[4707]: I1127 16:13:59.337439 4707 generic.go:334] "Generic (PLEG): container finished" podID="55af9c67-18ce-46f1-a761-d11ce16f42d6" containerID="0ae3fab13b078702f87ca8c586c9592c4554c95ead662d81ca65b50f368b6dcb" 
exitCode=143 Nov 27 16:13:59 crc kubenswrapper[4707]: I1127 16:13:59.337474 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xkmt7" event={"ID":"55af9c67-18ce-46f1-a761-d11ce16f42d6","Type":"ContainerDied","Data":"333494a77c14f7de080c8b9a6d70e630af6761c0406b730640c08ffc7785c54b"} Nov 27 16:13:59 crc kubenswrapper[4707]: I1127 16:13:59.337519 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xkmt7" event={"ID":"55af9c67-18ce-46f1-a761-d11ce16f42d6","Type":"ContainerDied","Data":"db3362aeb65b368d2d9181cc898270fffb60938d0e9d32a862130d06a9f572a0"} Nov 27 16:13:59 crc kubenswrapper[4707]: I1127 16:13:59.337544 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xkmt7" event={"ID":"55af9c67-18ce-46f1-a761-d11ce16f42d6","Type":"ContainerDied","Data":"44699a190cf1065e537f588d7479a10aab8fd4ffd1df2348c616d999ffdf7077"} Nov 27 16:13:59 crc kubenswrapper[4707]: I1127 16:13:59.337564 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xkmt7" event={"ID":"55af9c67-18ce-46f1-a761-d11ce16f42d6","Type":"ContainerDied","Data":"e6432d53914cee9e016c2134d99e48fa9ff1c05c0b88033992c2e0c70fc40d2e"} Nov 27 16:13:59 crc kubenswrapper[4707]: I1127 16:13:59.337582 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xkmt7" event={"ID":"55af9c67-18ce-46f1-a761-d11ce16f42d6","Type":"ContainerDied","Data":"3718d2f738ef780c738566c163d52afe355de6b8eb6004aa8e1dd5f37aa84d68"} Nov 27 16:13:59 crc kubenswrapper[4707]: I1127 16:13:59.337602 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xkmt7" event={"ID":"55af9c67-18ce-46f1-a761-d11ce16f42d6","Type":"ContainerDied","Data":"2a4eadca4b464dec02ca7aba6bc23d95b9c3965294d4b0224299279c630f3718"} Nov 27 16:13:59 crc kubenswrapper[4707]: I1127 16:13:59.337622 4707 
pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"333494a77c14f7de080c8b9a6d70e630af6761c0406b730640c08ffc7785c54b"} Nov 27 16:13:59 crc kubenswrapper[4707]: I1127 16:13:59.337639 4707 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"eec0d7ba21b34dfd1c2f196273fe7ab74d1f1d2ce8c0a31e345dc47ce1e25d27"} Nov 27 16:13:59 crc kubenswrapper[4707]: I1127 16:13:59.337651 4707 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"db3362aeb65b368d2d9181cc898270fffb60938d0e9d32a862130d06a9f572a0"} Nov 27 16:13:59 crc kubenswrapper[4707]: I1127 16:13:59.337662 4707 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"44699a190cf1065e537f588d7479a10aab8fd4ffd1df2348c616d999ffdf7077"} Nov 27 16:13:59 crc kubenswrapper[4707]: I1127 16:13:59.337703 4707 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e6432d53914cee9e016c2134d99e48fa9ff1c05c0b88033992c2e0c70fc40d2e"} Nov 27 16:13:59 crc kubenswrapper[4707]: I1127 16:13:59.337715 4707 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"3718d2f738ef780c738566c163d52afe355de6b8eb6004aa8e1dd5f37aa84d68"} Nov 27 16:13:59 crc kubenswrapper[4707]: I1127 16:13:59.337810 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-xkmt7" Nov 27 16:13:59 crc kubenswrapper[4707]: I1127 16:13:59.337816 4707 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"2a4eadca4b464dec02ca7aba6bc23d95b9c3965294d4b0224299279c630f3718"} Nov 27 16:13:59 crc kubenswrapper[4707]: I1127 16:13:59.338877 4707 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"93825fe5d9cf2d63c3e723b063f417f7e92442b14365852f935a06d316f443fe"} Nov 27 16:13:59 crc kubenswrapper[4707]: I1127 16:13:59.338902 4707 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"0ae3fab13b078702f87ca8c586c9592c4554c95ead662d81ca65b50f368b6dcb"} Nov 27 16:13:59 crc kubenswrapper[4707]: I1127 16:13:59.338915 4707 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c327fee553e5575dec76bc4456bc59424c89a9eea360cd4b23b594eff4c67a8d"} Nov 27 16:13:59 crc kubenswrapper[4707]: I1127 16:13:59.338943 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xkmt7" event={"ID":"55af9c67-18ce-46f1-a761-d11ce16f42d6","Type":"ContainerDied","Data":"93825fe5d9cf2d63c3e723b063f417f7e92442b14365852f935a06d316f443fe"} Nov 27 16:13:59 crc kubenswrapper[4707]: I1127 16:13:59.338973 4707 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"333494a77c14f7de080c8b9a6d70e630af6761c0406b730640c08ffc7785c54b"} Nov 27 16:13:59 crc kubenswrapper[4707]: I1127 16:13:59.338986 4707 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"eec0d7ba21b34dfd1c2f196273fe7ab74d1f1d2ce8c0a31e345dc47ce1e25d27"} Nov 27 16:13:59 crc kubenswrapper[4707]: I1127 16:13:59.338997 4707 pod_container_deletor.go:114] "Failed to 
issue the request to remove container" containerID={"Type":"cri-o","ID":"db3362aeb65b368d2d9181cc898270fffb60938d0e9d32a862130d06a9f572a0"} Nov 27 16:13:59 crc kubenswrapper[4707]: I1127 16:13:59.339008 4707 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"44699a190cf1065e537f588d7479a10aab8fd4ffd1df2348c616d999ffdf7077"} Nov 27 16:13:59 crc kubenswrapper[4707]: I1127 16:13:59.339019 4707 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e6432d53914cee9e016c2134d99e48fa9ff1c05c0b88033992c2e0c70fc40d2e"} Nov 27 16:13:59 crc kubenswrapper[4707]: I1127 16:13:59.339030 4707 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"3718d2f738ef780c738566c163d52afe355de6b8eb6004aa8e1dd5f37aa84d68"} Nov 27 16:13:59 crc kubenswrapper[4707]: I1127 16:13:59.339041 4707 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"2a4eadca4b464dec02ca7aba6bc23d95b9c3965294d4b0224299279c630f3718"} Nov 27 16:13:59 crc kubenswrapper[4707]: I1127 16:13:59.339053 4707 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"93825fe5d9cf2d63c3e723b063f417f7e92442b14365852f935a06d316f443fe"} Nov 27 16:13:59 crc kubenswrapper[4707]: I1127 16:13:59.339063 4707 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"0ae3fab13b078702f87ca8c586c9592c4554c95ead662d81ca65b50f368b6dcb"} Nov 27 16:13:59 crc kubenswrapper[4707]: I1127 16:13:59.339073 4707 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c327fee553e5575dec76bc4456bc59424c89a9eea360cd4b23b594eff4c67a8d"} Nov 27 16:13:59 crc kubenswrapper[4707]: I1127 16:13:59.339088 4707 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-ovn-kubernetes/ovnkube-node-xkmt7" event={"ID":"55af9c67-18ce-46f1-a761-d11ce16f42d6","Type":"ContainerDied","Data":"0ae3fab13b078702f87ca8c586c9592c4554c95ead662d81ca65b50f368b6dcb"} Nov 27 16:13:59 crc kubenswrapper[4707]: I1127 16:13:59.339105 4707 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"333494a77c14f7de080c8b9a6d70e630af6761c0406b730640c08ffc7785c54b"} Nov 27 16:13:59 crc kubenswrapper[4707]: I1127 16:13:59.339119 4707 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"eec0d7ba21b34dfd1c2f196273fe7ab74d1f1d2ce8c0a31e345dc47ce1e25d27"} Nov 27 16:13:59 crc kubenswrapper[4707]: I1127 16:13:59.339130 4707 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"db3362aeb65b368d2d9181cc898270fffb60938d0e9d32a862130d06a9f572a0"} Nov 27 16:13:59 crc kubenswrapper[4707]: I1127 16:13:59.339140 4707 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"44699a190cf1065e537f588d7479a10aab8fd4ffd1df2348c616d999ffdf7077"} Nov 27 16:13:59 crc kubenswrapper[4707]: I1127 16:13:59.339150 4707 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e6432d53914cee9e016c2134d99e48fa9ff1c05c0b88033992c2e0c70fc40d2e"} Nov 27 16:13:59 crc kubenswrapper[4707]: I1127 16:13:59.339161 4707 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"3718d2f738ef780c738566c163d52afe355de6b8eb6004aa8e1dd5f37aa84d68"} Nov 27 16:13:59 crc kubenswrapper[4707]: I1127 16:13:59.339171 4707 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"2a4eadca4b464dec02ca7aba6bc23d95b9c3965294d4b0224299279c630f3718"} Nov 27 16:13:59 crc kubenswrapper[4707]: I1127 
16:13:59.339181 4707 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"93825fe5d9cf2d63c3e723b063f417f7e92442b14365852f935a06d316f443fe"} Nov 27 16:13:59 crc kubenswrapper[4707]: I1127 16:13:59.339191 4707 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"0ae3fab13b078702f87ca8c586c9592c4554c95ead662d81ca65b50f368b6dcb"} Nov 27 16:13:59 crc kubenswrapper[4707]: I1127 16:13:59.339201 4707 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c327fee553e5575dec76bc4456bc59424c89a9eea360cd4b23b594eff4c67a8d"} Nov 27 16:13:59 crc kubenswrapper[4707]: I1127 16:13:59.339215 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xkmt7" event={"ID":"55af9c67-18ce-46f1-a761-d11ce16f42d6","Type":"ContainerDied","Data":"eb31900fe2142123c7385fe1e5b81150e1ae53282a92d68ddfb6be40f9dc3e46"} Nov 27 16:13:59 crc kubenswrapper[4707]: I1127 16:13:59.339232 4707 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"333494a77c14f7de080c8b9a6d70e630af6761c0406b730640c08ffc7785c54b"} Nov 27 16:13:59 crc kubenswrapper[4707]: I1127 16:13:59.339249 4707 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"eec0d7ba21b34dfd1c2f196273fe7ab74d1f1d2ce8c0a31e345dc47ce1e25d27"} Nov 27 16:13:59 crc kubenswrapper[4707]: I1127 16:13:59.339260 4707 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"db3362aeb65b368d2d9181cc898270fffb60938d0e9d32a862130d06a9f572a0"} Nov 27 16:13:59 crc kubenswrapper[4707]: I1127 16:13:59.339270 4707 pod_container_deletor.go:114] "Failed to issue the request to remove container" 
containerID={"Type":"cri-o","ID":"44699a190cf1065e537f588d7479a10aab8fd4ffd1df2348c616d999ffdf7077"} Nov 27 16:13:59 crc kubenswrapper[4707]: I1127 16:13:59.339280 4707 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e6432d53914cee9e016c2134d99e48fa9ff1c05c0b88033992c2e0c70fc40d2e"} Nov 27 16:13:59 crc kubenswrapper[4707]: I1127 16:13:59.339290 4707 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"3718d2f738ef780c738566c163d52afe355de6b8eb6004aa8e1dd5f37aa84d68"} Nov 27 16:13:59 crc kubenswrapper[4707]: I1127 16:13:59.339306 4707 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"2a4eadca4b464dec02ca7aba6bc23d95b9c3965294d4b0224299279c630f3718"} Nov 27 16:13:59 crc kubenswrapper[4707]: I1127 16:13:59.339316 4707 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"93825fe5d9cf2d63c3e723b063f417f7e92442b14365852f935a06d316f443fe"} Nov 27 16:13:59 crc kubenswrapper[4707]: I1127 16:13:59.339326 4707 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"0ae3fab13b078702f87ca8c586c9592c4554c95ead662d81ca65b50f368b6dcb"} Nov 27 16:13:59 crc kubenswrapper[4707]: I1127 16:13:59.339336 4707 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c327fee553e5575dec76bc4456bc59424c89a9eea360cd4b23b594eff4c67a8d"} Nov 27 16:13:59 crc kubenswrapper[4707]: I1127 16:13:59.378722 4707 scope.go:117] "RemoveContainer" containerID="333494a77c14f7de080c8b9a6d70e630af6761c0406b730640c08ffc7785c54b" Nov 27 16:13:59 crc kubenswrapper[4707]: I1127 16:13:59.384417 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-xkmt7"] Nov 27 16:13:59 crc kubenswrapper[4707]: I1127 
16:13:59.388793 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-kxkld" Nov 27 16:13:59 crc kubenswrapper[4707]: I1127 16:13:59.392238 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-xkmt7"] Nov 27 16:13:59 crc kubenswrapper[4707]: I1127 16:13:59.413114 4707 scope.go:117] "RemoveContainer" containerID="eec0d7ba21b34dfd1c2f196273fe7ab74d1f1d2ce8c0a31e345dc47ce1e25d27" Nov 27 16:13:59 crc kubenswrapper[4707]: W1127 16:13:59.425045 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode1011045_347d_49fc_add6_74ef236a60f7.slice/crio-66aeabb711738397b5d01f04a47c9ac1ede5160e8726723c31e8abaf2a0089e3 WatchSource:0}: Error finding container 66aeabb711738397b5d01f04a47c9ac1ede5160e8726723c31e8abaf2a0089e3: Status 404 returned error can't find the container with id 66aeabb711738397b5d01f04a47c9ac1ede5160e8726723c31e8abaf2a0089e3 Nov 27 16:13:59 crc kubenswrapper[4707]: I1127 16:13:59.437735 4707 scope.go:117] "RemoveContainer" containerID="db3362aeb65b368d2d9181cc898270fffb60938d0e9d32a862130d06a9f572a0" Nov 27 16:13:59 crc kubenswrapper[4707]: I1127 16:13:59.470678 4707 scope.go:117] "RemoveContainer" containerID="44699a190cf1065e537f588d7479a10aab8fd4ffd1df2348c616d999ffdf7077" Nov 27 16:13:59 crc kubenswrapper[4707]: I1127 16:13:59.492814 4707 scope.go:117] "RemoveContainer" containerID="e6432d53914cee9e016c2134d99e48fa9ff1c05c0b88033992c2e0c70fc40d2e" Nov 27 16:13:59 crc kubenswrapper[4707]: I1127 16:13:59.515007 4707 scope.go:117] "RemoveContainer" containerID="3718d2f738ef780c738566c163d52afe355de6b8eb6004aa8e1dd5f37aa84d68" Nov 27 16:13:59 crc kubenswrapper[4707]: I1127 16:13:59.544132 4707 scope.go:117] "RemoveContainer" containerID="2a4eadca4b464dec02ca7aba6bc23d95b9c3965294d4b0224299279c630f3718" Nov 27 16:13:59 crc kubenswrapper[4707]: I1127 16:13:59.572529 4707 
scope.go:117] "RemoveContainer" containerID="93825fe5d9cf2d63c3e723b063f417f7e92442b14365852f935a06d316f443fe" Nov 27 16:13:59 crc kubenswrapper[4707]: I1127 16:13:59.597057 4707 scope.go:117] "RemoveContainer" containerID="0ae3fab13b078702f87ca8c586c9592c4554c95ead662d81ca65b50f368b6dcb" Nov 27 16:13:59 crc kubenswrapper[4707]: I1127 16:13:59.670601 4707 scope.go:117] "RemoveContainer" containerID="c327fee553e5575dec76bc4456bc59424c89a9eea360cd4b23b594eff4c67a8d" Nov 27 16:13:59 crc kubenswrapper[4707]: I1127 16:13:59.696193 4707 scope.go:117] "RemoveContainer" containerID="333494a77c14f7de080c8b9a6d70e630af6761c0406b730640c08ffc7785c54b" Nov 27 16:13:59 crc kubenswrapper[4707]: E1127 16:13:59.696672 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"333494a77c14f7de080c8b9a6d70e630af6761c0406b730640c08ffc7785c54b\": container with ID starting with 333494a77c14f7de080c8b9a6d70e630af6761c0406b730640c08ffc7785c54b not found: ID does not exist" containerID="333494a77c14f7de080c8b9a6d70e630af6761c0406b730640c08ffc7785c54b" Nov 27 16:13:59 crc kubenswrapper[4707]: I1127 16:13:59.696723 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"333494a77c14f7de080c8b9a6d70e630af6761c0406b730640c08ffc7785c54b"} err="failed to get container status \"333494a77c14f7de080c8b9a6d70e630af6761c0406b730640c08ffc7785c54b\": rpc error: code = NotFound desc = could not find container \"333494a77c14f7de080c8b9a6d70e630af6761c0406b730640c08ffc7785c54b\": container with ID starting with 333494a77c14f7de080c8b9a6d70e630af6761c0406b730640c08ffc7785c54b not found: ID does not exist" Nov 27 16:13:59 crc kubenswrapper[4707]: I1127 16:13:59.696757 4707 scope.go:117] "RemoveContainer" containerID="eec0d7ba21b34dfd1c2f196273fe7ab74d1f1d2ce8c0a31e345dc47ce1e25d27" Nov 27 16:13:59 crc kubenswrapper[4707]: E1127 16:13:59.697266 4707 log.go:32] "ContainerStatus from runtime 
service failed" err="rpc error: code = NotFound desc = could not find container \"eec0d7ba21b34dfd1c2f196273fe7ab74d1f1d2ce8c0a31e345dc47ce1e25d27\": container with ID starting with eec0d7ba21b34dfd1c2f196273fe7ab74d1f1d2ce8c0a31e345dc47ce1e25d27 not found: ID does not exist" containerID="eec0d7ba21b34dfd1c2f196273fe7ab74d1f1d2ce8c0a31e345dc47ce1e25d27" Nov 27 16:13:59 crc kubenswrapper[4707]: I1127 16:13:59.697303 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eec0d7ba21b34dfd1c2f196273fe7ab74d1f1d2ce8c0a31e345dc47ce1e25d27"} err="failed to get container status \"eec0d7ba21b34dfd1c2f196273fe7ab74d1f1d2ce8c0a31e345dc47ce1e25d27\": rpc error: code = NotFound desc = could not find container \"eec0d7ba21b34dfd1c2f196273fe7ab74d1f1d2ce8c0a31e345dc47ce1e25d27\": container with ID starting with eec0d7ba21b34dfd1c2f196273fe7ab74d1f1d2ce8c0a31e345dc47ce1e25d27 not found: ID does not exist" Nov 27 16:13:59 crc kubenswrapper[4707]: I1127 16:13:59.697332 4707 scope.go:117] "RemoveContainer" containerID="db3362aeb65b368d2d9181cc898270fffb60938d0e9d32a862130d06a9f572a0" Nov 27 16:13:59 crc kubenswrapper[4707]: E1127 16:13:59.697866 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"db3362aeb65b368d2d9181cc898270fffb60938d0e9d32a862130d06a9f572a0\": container with ID starting with db3362aeb65b368d2d9181cc898270fffb60938d0e9d32a862130d06a9f572a0 not found: ID does not exist" containerID="db3362aeb65b368d2d9181cc898270fffb60938d0e9d32a862130d06a9f572a0" Nov 27 16:13:59 crc kubenswrapper[4707]: I1127 16:13:59.697928 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"db3362aeb65b368d2d9181cc898270fffb60938d0e9d32a862130d06a9f572a0"} err="failed to get container status \"db3362aeb65b368d2d9181cc898270fffb60938d0e9d32a862130d06a9f572a0\": rpc error: code = NotFound desc = could not find container 
\"db3362aeb65b368d2d9181cc898270fffb60938d0e9d32a862130d06a9f572a0\": container with ID starting with db3362aeb65b368d2d9181cc898270fffb60938d0e9d32a862130d06a9f572a0 not found: ID does not exist" Nov 27 16:13:59 crc kubenswrapper[4707]: I1127 16:13:59.697975 4707 scope.go:117] "RemoveContainer" containerID="44699a190cf1065e537f588d7479a10aab8fd4ffd1df2348c616d999ffdf7077" Nov 27 16:13:59 crc kubenswrapper[4707]: E1127 16:13:59.698531 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"44699a190cf1065e537f588d7479a10aab8fd4ffd1df2348c616d999ffdf7077\": container with ID starting with 44699a190cf1065e537f588d7479a10aab8fd4ffd1df2348c616d999ffdf7077 not found: ID does not exist" containerID="44699a190cf1065e537f588d7479a10aab8fd4ffd1df2348c616d999ffdf7077" Nov 27 16:13:59 crc kubenswrapper[4707]: I1127 16:13:59.698573 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"44699a190cf1065e537f588d7479a10aab8fd4ffd1df2348c616d999ffdf7077"} err="failed to get container status \"44699a190cf1065e537f588d7479a10aab8fd4ffd1df2348c616d999ffdf7077\": rpc error: code = NotFound desc = could not find container \"44699a190cf1065e537f588d7479a10aab8fd4ffd1df2348c616d999ffdf7077\": container with ID starting with 44699a190cf1065e537f588d7479a10aab8fd4ffd1df2348c616d999ffdf7077 not found: ID does not exist" Nov 27 16:13:59 crc kubenswrapper[4707]: I1127 16:13:59.698597 4707 scope.go:117] "RemoveContainer" containerID="e6432d53914cee9e016c2134d99e48fa9ff1c05c0b88033992c2e0c70fc40d2e" Nov 27 16:13:59 crc kubenswrapper[4707]: E1127 16:13:59.699108 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e6432d53914cee9e016c2134d99e48fa9ff1c05c0b88033992c2e0c70fc40d2e\": container with ID starting with e6432d53914cee9e016c2134d99e48fa9ff1c05c0b88033992c2e0c70fc40d2e not found: ID does not exist" 
containerID="e6432d53914cee9e016c2134d99e48fa9ff1c05c0b88033992c2e0c70fc40d2e" Nov 27 16:13:59 crc kubenswrapper[4707]: I1127 16:13:59.699147 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e6432d53914cee9e016c2134d99e48fa9ff1c05c0b88033992c2e0c70fc40d2e"} err="failed to get container status \"e6432d53914cee9e016c2134d99e48fa9ff1c05c0b88033992c2e0c70fc40d2e\": rpc error: code = NotFound desc = could not find container \"e6432d53914cee9e016c2134d99e48fa9ff1c05c0b88033992c2e0c70fc40d2e\": container with ID starting with e6432d53914cee9e016c2134d99e48fa9ff1c05c0b88033992c2e0c70fc40d2e not found: ID does not exist" Nov 27 16:13:59 crc kubenswrapper[4707]: I1127 16:13:59.699175 4707 scope.go:117] "RemoveContainer" containerID="3718d2f738ef780c738566c163d52afe355de6b8eb6004aa8e1dd5f37aa84d68" Nov 27 16:13:59 crc kubenswrapper[4707]: E1127 16:13:59.699612 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3718d2f738ef780c738566c163d52afe355de6b8eb6004aa8e1dd5f37aa84d68\": container with ID starting with 3718d2f738ef780c738566c163d52afe355de6b8eb6004aa8e1dd5f37aa84d68 not found: ID does not exist" containerID="3718d2f738ef780c738566c163d52afe355de6b8eb6004aa8e1dd5f37aa84d68" Nov 27 16:13:59 crc kubenswrapper[4707]: I1127 16:13:59.699653 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3718d2f738ef780c738566c163d52afe355de6b8eb6004aa8e1dd5f37aa84d68"} err="failed to get container status \"3718d2f738ef780c738566c163d52afe355de6b8eb6004aa8e1dd5f37aa84d68\": rpc error: code = NotFound desc = could not find container \"3718d2f738ef780c738566c163d52afe355de6b8eb6004aa8e1dd5f37aa84d68\": container with ID starting with 3718d2f738ef780c738566c163d52afe355de6b8eb6004aa8e1dd5f37aa84d68 not found: ID does not exist" Nov 27 16:13:59 crc kubenswrapper[4707]: I1127 16:13:59.699682 4707 scope.go:117] 
"RemoveContainer" containerID="2a4eadca4b464dec02ca7aba6bc23d95b9c3965294d4b0224299279c630f3718" Nov 27 16:13:59 crc kubenswrapper[4707]: E1127 16:13:59.700034 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2a4eadca4b464dec02ca7aba6bc23d95b9c3965294d4b0224299279c630f3718\": container with ID starting with 2a4eadca4b464dec02ca7aba6bc23d95b9c3965294d4b0224299279c630f3718 not found: ID does not exist" containerID="2a4eadca4b464dec02ca7aba6bc23d95b9c3965294d4b0224299279c630f3718" Nov 27 16:13:59 crc kubenswrapper[4707]: I1127 16:13:59.700066 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2a4eadca4b464dec02ca7aba6bc23d95b9c3965294d4b0224299279c630f3718"} err="failed to get container status \"2a4eadca4b464dec02ca7aba6bc23d95b9c3965294d4b0224299279c630f3718\": rpc error: code = NotFound desc = could not find container \"2a4eadca4b464dec02ca7aba6bc23d95b9c3965294d4b0224299279c630f3718\": container with ID starting with 2a4eadca4b464dec02ca7aba6bc23d95b9c3965294d4b0224299279c630f3718 not found: ID does not exist" Nov 27 16:13:59 crc kubenswrapper[4707]: I1127 16:13:59.700094 4707 scope.go:117] "RemoveContainer" containerID="93825fe5d9cf2d63c3e723b063f417f7e92442b14365852f935a06d316f443fe" Nov 27 16:13:59 crc kubenswrapper[4707]: E1127 16:13:59.700518 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"93825fe5d9cf2d63c3e723b063f417f7e92442b14365852f935a06d316f443fe\": container with ID starting with 93825fe5d9cf2d63c3e723b063f417f7e92442b14365852f935a06d316f443fe not found: ID does not exist" containerID="93825fe5d9cf2d63c3e723b063f417f7e92442b14365852f935a06d316f443fe" Nov 27 16:13:59 crc kubenswrapper[4707]: I1127 16:13:59.700575 4707 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"93825fe5d9cf2d63c3e723b063f417f7e92442b14365852f935a06d316f443fe"} err="failed to get container status \"93825fe5d9cf2d63c3e723b063f417f7e92442b14365852f935a06d316f443fe\": rpc error: code = NotFound desc = could not find container \"93825fe5d9cf2d63c3e723b063f417f7e92442b14365852f935a06d316f443fe\": container with ID starting with 93825fe5d9cf2d63c3e723b063f417f7e92442b14365852f935a06d316f443fe not found: ID does not exist" Nov 27 16:13:59 crc kubenswrapper[4707]: I1127 16:13:59.700617 4707 scope.go:117] "RemoveContainer" containerID="0ae3fab13b078702f87ca8c586c9592c4554c95ead662d81ca65b50f368b6dcb" Nov 27 16:13:59 crc kubenswrapper[4707]: E1127 16:13:59.701239 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0ae3fab13b078702f87ca8c586c9592c4554c95ead662d81ca65b50f368b6dcb\": container with ID starting with 0ae3fab13b078702f87ca8c586c9592c4554c95ead662d81ca65b50f368b6dcb not found: ID does not exist" containerID="0ae3fab13b078702f87ca8c586c9592c4554c95ead662d81ca65b50f368b6dcb" Nov 27 16:13:59 crc kubenswrapper[4707]: I1127 16:13:59.701288 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0ae3fab13b078702f87ca8c586c9592c4554c95ead662d81ca65b50f368b6dcb"} err="failed to get container status \"0ae3fab13b078702f87ca8c586c9592c4554c95ead662d81ca65b50f368b6dcb\": rpc error: code = NotFound desc = could not find container \"0ae3fab13b078702f87ca8c586c9592c4554c95ead662d81ca65b50f368b6dcb\": container with ID starting with 0ae3fab13b078702f87ca8c586c9592c4554c95ead662d81ca65b50f368b6dcb not found: ID does not exist" Nov 27 16:13:59 crc kubenswrapper[4707]: I1127 16:13:59.701319 4707 scope.go:117] "RemoveContainer" containerID="c327fee553e5575dec76bc4456bc59424c89a9eea360cd4b23b594eff4c67a8d" Nov 27 16:13:59 crc kubenswrapper[4707]: E1127 16:13:59.701793 4707 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"c327fee553e5575dec76bc4456bc59424c89a9eea360cd4b23b594eff4c67a8d\": container with ID starting with c327fee553e5575dec76bc4456bc59424c89a9eea360cd4b23b594eff4c67a8d not found: ID does not exist" containerID="c327fee553e5575dec76bc4456bc59424c89a9eea360cd4b23b594eff4c67a8d" Nov 27 16:13:59 crc kubenswrapper[4707]: I1127 16:13:59.701835 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c327fee553e5575dec76bc4456bc59424c89a9eea360cd4b23b594eff4c67a8d"} err="failed to get container status \"c327fee553e5575dec76bc4456bc59424c89a9eea360cd4b23b594eff4c67a8d\": rpc error: code = NotFound desc = could not find container \"c327fee553e5575dec76bc4456bc59424c89a9eea360cd4b23b594eff4c67a8d\": container with ID starting with c327fee553e5575dec76bc4456bc59424c89a9eea360cd4b23b594eff4c67a8d not found: ID does not exist" Nov 27 16:13:59 crc kubenswrapper[4707]: I1127 16:13:59.701862 4707 scope.go:117] "RemoveContainer" containerID="333494a77c14f7de080c8b9a6d70e630af6761c0406b730640c08ffc7785c54b" Nov 27 16:13:59 crc kubenswrapper[4707]: I1127 16:13:59.702230 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"333494a77c14f7de080c8b9a6d70e630af6761c0406b730640c08ffc7785c54b"} err="failed to get container status \"333494a77c14f7de080c8b9a6d70e630af6761c0406b730640c08ffc7785c54b\": rpc error: code = NotFound desc = could not find container \"333494a77c14f7de080c8b9a6d70e630af6761c0406b730640c08ffc7785c54b\": container with ID starting with 333494a77c14f7de080c8b9a6d70e630af6761c0406b730640c08ffc7785c54b not found: ID does not exist" Nov 27 16:13:59 crc kubenswrapper[4707]: I1127 16:13:59.702266 4707 scope.go:117] "RemoveContainer" containerID="eec0d7ba21b34dfd1c2f196273fe7ab74d1f1d2ce8c0a31e345dc47ce1e25d27" Nov 27 16:13:59 crc kubenswrapper[4707]: I1127 16:13:59.702661 4707 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eec0d7ba21b34dfd1c2f196273fe7ab74d1f1d2ce8c0a31e345dc47ce1e25d27"} err="failed to get container status \"eec0d7ba21b34dfd1c2f196273fe7ab74d1f1d2ce8c0a31e345dc47ce1e25d27\": rpc error: code = NotFound desc = could not find container \"eec0d7ba21b34dfd1c2f196273fe7ab74d1f1d2ce8c0a31e345dc47ce1e25d27\": container with ID starting with eec0d7ba21b34dfd1c2f196273fe7ab74d1f1d2ce8c0a31e345dc47ce1e25d27 not found: ID does not exist" Nov 27 16:13:59 crc kubenswrapper[4707]: I1127 16:13:59.702702 4707 scope.go:117] "RemoveContainer" containerID="db3362aeb65b368d2d9181cc898270fffb60938d0e9d32a862130d06a9f572a0" Nov 27 16:13:59 crc kubenswrapper[4707]: I1127 16:13:59.703067 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"db3362aeb65b368d2d9181cc898270fffb60938d0e9d32a862130d06a9f572a0"} err="failed to get container status \"db3362aeb65b368d2d9181cc898270fffb60938d0e9d32a862130d06a9f572a0\": rpc error: code = NotFound desc = could not find container \"db3362aeb65b368d2d9181cc898270fffb60938d0e9d32a862130d06a9f572a0\": container with ID starting with db3362aeb65b368d2d9181cc898270fffb60938d0e9d32a862130d06a9f572a0 not found: ID does not exist" Nov 27 16:13:59 crc kubenswrapper[4707]: I1127 16:13:59.703107 4707 scope.go:117] "RemoveContainer" containerID="44699a190cf1065e537f588d7479a10aab8fd4ffd1df2348c616d999ffdf7077" Nov 27 16:13:59 crc kubenswrapper[4707]: I1127 16:13:59.703500 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"44699a190cf1065e537f588d7479a10aab8fd4ffd1df2348c616d999ffdf7077"} err="failed to get container status \"44699a190cf1065e537f588d7479a10aab8fd4ffd1df2348c616d999ffdf7077\": rpc error: code = NotFound desc = could not find container \"44699a190cf1065e537f588d7479a10aab8fd4ffd1df2348c616d999ffdf7077\": container with ID starting with 
44699a190cf1065e537f588d7479a10aab8fd4ffd1df2348c616d999ffdf7077 not found: ID does not exist" Nov 27 16:13:59 crc kubenswrapper[4707]: I1127 16:13:59.703535 4707 scope.go:117] "RemoveContainer" containerID="e6432d53914cee9e016c2134d99e48fa9ff1c05c0b88033992c2e0c70fc40d2e" Nov 27 16:13:59 crc kubenswrapper[4707]: I1127 16:13:59.703820 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e6432d53914cee9e016c2134d99e48fa9ff1c05c0b88033992c2e0c70fc40d2e"} err="failed to get container status \"e6432d53914cee9e016c2134d99e48fa9ff1c05c0b88033992c2e0c70fc40d2e\": rpc error: code = NotFound desc = could not find container \"e6432d53914cee9e016c2134d99e48fa9ff1c05c0b88033992c2e0c70fc40d2e\": container with ID starting with e6432d53914cee9e016c2134d99e48fa9ff1c05c0b88033992c2e0c70fc40d2e not found: ID does not exist" Nov 27 16:13:59 crc kubenswrapper[4707]: I1127 16:13:59.703855 4707 scope.go:117] "RemoveContainer" containerID="3718d2f738ef780c738566c163d52afe355de6b8eb6004aa8e1dd5f37aa84d68" Nov 27 16:13:59 crc kubenswrapper[4707]: I1127 16:13:59.704170 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3718d2f738ef780c738566c163d52afe355de6b8eb6004aa8e1dd5f37aa84d68"} err="failed to get container status \"3718d2f738ef780c738566c163d52afe355de6b8eb6004aa8e1dd5f37aa84d68\": rpc error: code = NotFound desc = could not find container \"3718d2f738ef780c738566c163d52afe355de6b8eb6004aa8e1dd5f37aa84d68\": container with ID starting with 3718d2f738ef780c738566c163d52afe355de6b8eb6004aa8e1dd5f37aa84d68 not found: ID does not exist" Nov 27 16:13:59 crc kubenswrapper[4707]: I1127 16:13:59.704215 4707 scope.go:117] "RemoveContainer" containerID="2a4eadca4b464dec02ca7aba6bc23d95b9c3965294d4b0224299279c630f3718" Nov 27 16:13:59 crc kubenswrapper[4707]: I1127 16:13:59.704603 4707 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"2a4eadca4b464dec02ca7aba6bc23d95b9c3965294d4b0224299279c630f3718"} err="failed to get container status \"2a4eadca4b464dec02ca7aba6bc23d95b9c3965294d4b0224299279c630f3718\": rpc error: code = NotFound desc = could not find container \"2a4eadca4b464dec02ca7aba6bc23d95b9c3965294d4b0224299279c630f3718\": container with ID starting with 2a4eadca4b464dec02ca7aba6bc23d95b9c3965294d4b0224299279c630f3718 not found: ID does not exist" Nov 27 16:13:59 crc kubenswrapper[4707]: I1127 16:13:59.704639 4707 scope.go:117] "RemoveContainer" containerID="93825fe5d9cf2d63c3e723b063f417f7e92442b14365852f935a06d316f443fe" Nov 27 16:13:59 crc kubenswrapper[4707]: I1127 16:13:59.704930 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"93825fe5d9cf2d63c3e723b063f417f7e92442b14365852f935a06d316f443fe"} err="failed to get container status \"93825fe5d9cf2d63c3e723b063f417f7e92442b14365852f935a06d316f443fe\": rpc error: code = NotFound desc = could not find container \"93825fe5d9cf2d63c3e723b063f417f7e92442b14365852f935a06d316f443fe\": container with ID starting with 93825fe5d9cf2d63c3e723b063f417f7e92442b14365852f935a06d316f443fe not found: ID does not exist" Nov 27 16:13:59 crc kubenswrapper[4707]: I1127 16:13:59.704970 4707 scope.go:117] "RemoveContainer" containerID="0ae3fab13b078702f87ca8c586c9592c4554c95ead662d81ca65b50f368b6dcb" Nov 27 16:13:59 crc kubenswrapper[4707]: I1127 16:13:59.705341 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0ae3fab13b078702f87ca8c586c9592c4554c95ead662d81ca65b50f368b6dcb"} err="failed to get container status \"0ae3fab13b078702f87ca8c586c9592c4554c95ead662d81ca65b50f368b6dcb\": rpc error: code = NotFound desc = could not find container \"0ae3fab13b078702f87ca8c586c9592c4554c95ead662d81ca65b50f368b6dcb\": container with ID starting with 0ae3fab13b078702f87ca8c586c9592c4554c95ead662d81ca65b50f368b6dcb not found: ID does not 
exist" Nov 27 16:13:59 crc kubenswrapper[4707]: I1127 16:13:59.705414 4707 scope.go:117] "RemoveContainer" containerID="c327fee553e5575dec76bc4456bc59424c89a9eea360cd4b23b594eff4c67a8d" Nov 27 16:13:59 crc kubenswrapper[4707]: I1127 16:13:59.705828 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c327fee553e5575dec76bc4456bc59424c89a9eea360cd4b23b594eff4c67a8d"} err="failed to get container status \"c327fee553e5575dec76bc4456bc59424c89a9eea360cd4b23b594eff4c67a8d\": rpc error: code = NotFound desc = could not find container \"c327fee553e5575dec76bc4456bc59424c89a9eea360cd4b23b594eff4c67a8d\": container with ID starting with c327fee553e5575dec76bc4456bc59424c89a9eea360cd4b23b594eff4c67a8d not found: ID does not exist" Nov 27 16:13:59 crc kubenswrapper[4707]: I1127 16:13:59.705866 4707 scope.go:117] "RemoveContainer" containerID="333494a77c14f7de080c8b9a6d70e630af6761c0406b730640c08ffc7785c54b" Nov 27 16:13:59 crc kubenswrapper[4707]: I1127 16:13:59.706416 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"333494a77c14f7de080c8b9a6d70e630af6761c0406b730640c08ffc7785c54b"} err="failed to get container status \"333494a77c14f7de080c8b9a6d70e630af6761c0406b730640c08ffc7785c54b\": rpc error: code = NotFound desc = could not find container \"333494a77c14f7de080c8b9a6d70e630af6761c0406b730640c08ffc7785c54b\": container with ID starting with 333494a77c14f7de080c8b9a6d70e630af6761c0406b730640c08ffc7785c54b not found: ID does not exist" Nov 27 16:13:59 crc kubenswrapper[4707]: I1127 16:13:59.706504 4707 scope.go:117] "RemoveContainer" containerID="eec0d7ba21b34dfd1c2f196273fe7ab74d1f1d2ce8c0a31e345dc47ce1e25d27" Nov 27 16:13:59 crc kubenswrapper[4707]: I1127 16:13:59.706945 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eec0d7ba21b34dfd1c2f196273fe7ab74d1f1d2ce8c0a31e345dc47ce1e25d27"} err="failed to get container status 
\"eec0d7ba21b34dfd1c2f196273fe7ab74d1f1d2ce8c0a31e345dc47ce1e25d27\": rpc error: code = NotFound desc = could not find container \"eec0d7ba21b34dfd1c2f196273fe7ab74d1f1d2ce8c0a31e345dc47ce1e25d27\": container with ID starting with eec0d7ba21b34dfd1c2f196273fe7ab74d1f1d2ce8c0a31e345dc47ce1e25d27 not found: ID does not exist" Nov 27 16:13:59 crc kubenswrapper[4707]: I1127 16:13:59.707009 4707 scope.go:117] "RemoveContainer" containerID="db3362aeb65b368d2d9181cc898270fffb60938d0e9d32a862130d06a9f572a0" Nov 27 16:13:59 crc kubenswrapper[4707]: I1127 16:13:59.707473 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"db3362aeb65b368d2d9181cc898270fffb60938d0e9d32a862130d06a9f572a0"} err="failed to get container status \"db3362aeb65b368d2d9181cc898270fffb60938d0e9d32a862130d06a9f572a0\": rpc error: code = NotFound desc = could not find container \"db3362aeb65b368d2d9181cc898270fffb60938d0e9d32a862130d06a9f572a0\": container with ID starting with db3362aeb65b368d2d9181cc898270fffb60938d0e9d32a862130d06a9f572a0 not found: ID does not exist" Nov 27 16:13:59 crc kubenswrapper[4707]: I1127 16:13:59.707514 4707 scope.go:117] "RemoveContainer" containerID="44699a190cf1065e537f588d7479a10aab8fd4ffd1df2348c616d999ffdf7077" Nov 27 16:13:59 crc kubenswrapper[4707]: I1127 16:13:59.707858 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"44699a190cf1065e537f588d7479a10aab8fd4ffd1df2348c616d999ffdf7077"} err="failed to get container status \"44699a190cf1065e537f588d7479a10aab8fd4ffd1df2348c616d999ffdf7077\": rpc error: code = NotFound desc = could not find container \"44699a190cf1065e537f588d7479a10aab8fd4ffd1df2348c616d999ffdf7077\": container with ID starting with 44699a190cf1065e537f588d7479a10aab8fd4ffd1df2348c616d999ffdf7077 not found: ID does not exist" Nov 27 16:13:59 crc kubenswrapper[4707]: I1127 16:13:59.707895 4707 scope.go:117] "RemoveContainer" 
containerID="e6432d53914cee9e016c2134d99e48fa9ff1c05c0b88033992c2e0c70fc40d2e" Nov 27 16:13:59 crc kubenswrapper[4707]: I1127 16:13:59.708263 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e6432d53914cee9e016c2134d99e48fa9ff1c05c0b88033992c2e0c70fc40d2e"} err="failed to get container status \"e6432d53914cee9e016c2134d99e48fa9ff1c05c0b88033992c2e0c70fc40d2e\": rpc error: code = NotFound desc = could not find container \"e6432d53914cee9e016c2134d99e48fa9ff1c05c0b88033992c2e0c70fc40d2e\": container with ID starting with e6432d53914cee9e016c2134d99e48fa9ff1c05c0b88033992c2e0c70fc40d2e not found: ID does not exist" Nov 27 16:13:59 crc kubenswrapper[4707]: I1127 16:13:59.708305 4707 scope.go:117] "RemoveContainer" containerID="3718d2f738ef780c738566c163d52afe355de6b8eb6004aa8e1dd5f37aa84d68" Nov 27 16:13:59 crc kubenswrapper[4707]: I1127 16:13:59.708757 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3718d2f738ef780c738566c163d52afe355de6b8eb6004aa8e1dd5f37aa84d68"} err="failed to get container status \"3718d2f738ef780c738566c163d52afe355de6b8eb6004aa8e1dd5f37aa84d68\": rpc error: code = NotFound desc = could not find container \"3718d2f738ef780c738566c163d52afe355de6b8eb6004aa8e1dd5f37aa84d68\": container with ID starting with 3718d2f738ef780c738566c163d52afe355de6b8eb6004aa8e1dd5f37aa84d68 not found: ID does not exist" Nov 27 16:13:59 crc kubenswrapper[4707]: I1127 16:13:59.708794 4707 scope.go:117] "RemoveContainer" containerID="2a4eadca4b464dec02ca7aba6bc23d95b9c3965294d4b0224299279c630f3718" Nov 27 16:13:59 crc kubenswrapper[4707]: I1127 16:13:59.709165 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2a4eadca4b464dec02ca7aba6bc23d95b9c3965294d4b0224299279c630f3718"} err="failed to get container status \"2a4eadca4b464dec02ca7aba6bc23d95b9c3965294d4b0224299279c630f3718\": rpc error: code = NotFound desc = could 
not find container \"2a4eadca4b464dec02ca7aba6bc23d95b9c3965294d4b0224299279c630f3718\": container with ID starting with 2a4eadca4b464dec02ca7aba6bc23d95b9c3965294d4b0224299279c630f3718 not found: ID does not exist" Nov 27 16:13:59 crc kubenswrapper[4707]: I1127 16:13:59.709222 4707 scope.go:117] "RemoveContainer" containerID="93825fe5d9cf2d63c3e723b063f417f7e92442b14365852f935a06d316f443fe" Nov 27 16:13:59 crc kubenswrapper[4707]: I1127 16:13:59.709717 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"93825fe5d9cf2d63c3e723b063f417f7e92442b14365852f935a06d316f443fe"} err="failed to get container status \"93825fe5d9cf2d63c3e723b063f417f7e92442b14365852f935a06d316f443fe\": rpc error: code = NotFound desc = could not find container \"93825fe5d9cf2d63c3e723b063f417f7e92442b14365852f935a06d316f443fe\": container with ID starting with 93825fe5d9cf2d63c3e723b063f417f7e92442b14365852f935a06d316f443fe not found: ID does not exist" Nov 27 16:13:59 crc kubenswrapper[4707]: I1127 16:13:59.709751 4707 scope.go:117] "RemoveContainer" containerID="0ae3fab13b078702f87ca8c586c9592c4554c95ead662d81ca65b50f368b6dcb" Nov 27 16:13:59 crc kubenswrapper[4707]: I1127 16:13:59.710069 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0ae3fab13b078702f87ca8c586c9592c4554c95ead662d81ca65b50f368b6dcb"} err="failed to get container status \"0ae3fab13b078702f87ca8c586c9592c4554c95ead662d81ca65b50f368b6dcb\": rpc error: code = NotFound desc = could not find container \"0ae3fab13b078702f87ca8c586c9592c4554c95ead662d81ca65b50f368b6dcb\": container with ID starting with 0ae3fab13b078702f87ca8c586c9592c4554c95ead662d81ca65b50f368b6dcb not found: ID does not exist" Nov 27 16:13:59 crc kubenswrapper[4707]: I1127 16:13:59.710123 4707 scope.go:117] "RemoveContainer" containerID="c327fee553e5575dec76bc4456bc59424c89a9eea360cd4b23b594eff4c67a8d" Nov 27 16:13:59 crc kubenswrapper[4707]: I1127 
16:13:59.710582 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c327fee553e5575dec76bc4456bc59424c89a9eea360cd4b23b594eff4c67a8d"} err="failed to get container status \"c327fee553e5575dec76bc4456bc59424c89a9eea360cd4b23b594eff4c67a8d\": rpc error: code = NotFound desc = could not find container \"c327fee553e5575dec76bc4456bc59424c89a9eea360cd4b23b594eff4c67a8d\": container with ID starting with c327fee553e5575dec76bc4456bc59424c89a9eea360cd4b23b594eff4c67a8d not found: ID does not exist" Nov 27 16:13:59 crc kubenswrapper[4707]: I1127 16:13:59.710619 4707 scope.go:117] "RemoveContainer" containerID="333494a77c14f7de080c8b9a6d70e630af6761c0406b730640c08ffc7785c54b" Nov 27 16:13:59 crc kubenswrapper[4707]: I1127 16:13:59.711019 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"333494a77c14f7de080c8b9a6d70e630af6761c0406b730640c08ffc7785c54b"} err="failed to get container status \"333494a77c14f7de080c8b9a6d70e630af6761c0406b730640c08ffc7785c54b\": rpc error: code = NotFound desc = could not find container \"333494a77c14f7de080c8b9a6d70e630af6761c0406b730640c08ffc7785c54b\": container with ID starting with 333494a77c14f7de080c8b9a6d70e630af6761c0406b730640c08ffc7785c54b not found: ID does not exist" Nov 27 16:13:59 crc kubenswrapper[4707]: I1127 16:13:59.711059 4707 scope.go:117] "RemoveContainer" containerID="eec0d7ba21b34dfd1c2f196273fe7ab74d1f1d2ce8c0a31e345dc47ce1e25d27" Nov 27 16:13:59 crc kubenswrapper[4707]: I1127 16:13:59.711805 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eec0d7ba21b34dfd1c2f196273fe7ab74d1f1d2ce8c0a31e345dc47ce1e25d27"} err="failed to get container status \"eec0d7ba21b34dfd1c2f196273fe7ab74d1f1d2ce8c0a31e345dc47ce1e25d27\": rpc error: code = NotFound desc = could not find container \"eec0d7ba21b34dfd1c2f196273fe7ab74d1f1d2ce8c0a31e345dc47ce1e25d27\": container with ID starting with 
eec0d7ba21b34dfd1c2f196273fe7ab74d1f1d2ce8c0a31e345dc47ce1e25d27 not found: ID does not exist" Nov 27 16:13:59 crc kubenswrapper[4707]: I1127 16:13:59.711848 4707 scope.go:117] "RemoveContainer" containerID="db3362aeb65b368d2d9181cc898270fffb60938d0e9d32a862130d06a9f572a0" Nov 27 16:13:59 crc kubenswrapper[4707]: I1127 16:13:59.712202 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"db3362aeb65b368d2d9181cc898270fffb60938d0e9d32a862130d06a9f572a0"} err="failed to get container status \"db3362aeb65b368d2d9181cc898270fffb60938d0e9d32a862130d06a9f572a0\": rpc error: code = NotFound desc = could not find container \"db3362aeb65b368d2d9181cc898270fffb60938d0e9d32a862130d06a9f572a0\": container with ID starting with db3362aeb65b368d2d9181cc898270fffb60938d0e9d32a862130d06a9f572a0 not found: ID does not exist" Nov 27 16:13:59 crc kubenswrapper[4707]: I1127 16:13:59.712238 4707 scope.go:117] "RemoveContainer" containerID="44699a190cf1065e537f588d7479a10aab8fd4ffd1df2348c616d999ffdf7077" Nov 27 16:13:59 crc kubenswrapper[4707]: I1127 16:13:59.712667 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"44699a190cf1065e537f588d7479a10aab8fd4ffd1df2348c616d999ffdf7077"} err="failed to get container status \"44699a190cf1065e537f588d7479a10aab8fd4ffd1df2348c616d999ffdf7077\": rpc error: code = NotFound desc = could not find container \"44699a190cf1065e537f588d7479a10aab8fd4ffd1df2348c616d999ffdf7077\": container with ID starting with 44699a190cf1065e537f588d7479a10aab8fd4ffd1df2348c616d999ffdf7077 not found: ID does not exist" Nov 27 16:13:59 crc kubenswrapper[4707]: I1127 16:13:59.712699 4707 scope.go:117] "RemoveContainer" containerID="e6432d53914cee9e016c2134d99e48fa9ff1c05c0b88033992c2e0c70fc40d2e" Nov 27 16:13:59 crc kubenswrapper[4707]: I1127 16:13:59.713064 4707 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"e6432d53914cee9e016c2134d99e48fa9ff1c05c0b88033992c2e0c70fc40d2e"} err="failed to get container status \"e6432d53914cee9e016c2134d99e48fa9ff1c05c0b88033992c2e0c70fc40d2e\": rpc error: code = NotFound desc = could not find container \"e6432d53914cee9e016c2134d99e48fa9ff1c05c0b88033992c2e0c70fc40d2e\": container with ID starting with e6432d53914cee9e016c2134d99e48fa9ff1c05c0b88033992c2e0c70fc40d2e not found: ID does not exist" Nov 27 16:13:59 crc kubenswrapper[4707]: I1127 16:13:59.713105 4707 scope.go:117] "RemoveContainer" containerID="3718d2f738ef780c738566c163d52afe355de6b8eb6004aa8e1dd5f37aa84d68" Nov 27 16:13:59 crc kubenswrapper[4707]: I1127 16:13:59.713462 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3718d2f738ef780c738566c163d52afe355de6b8eb6004aa8e1dd5f37aa84d68"} err="failed to get container status \"3718d2f738ef780c738566c163d52afe355de6b8eb6004aa8e1dd5f37aa84d68\": rpc error: code = NotFound desc = could not find container \"3718d2f738ef780c738566c163d52afe355de6b8eb6004aa8e1dd5f37aa84d68\": container with ID starting with 3718d2f738ef780c738566c163d52afe355de6b8eb6004aa8e1dd5f37aa84d68 not found: ID does not exist" Nov 27 16:13:59 crc kubenswrapper[4707]: I1127 16:13:59.713570 4707 scope.go:117] "RemoveContainer" containerID="2a4eadca4b464dec02ca7aba6bc23d95b9c3965294d4b0224299279c630f3718" Nov 27 16:13:59 crc kubenswrapper[4707]: I1127 16:13:59.713954 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2a4eadca4b464dec02ca7aba6bc23d95b9c3965294d4b0224299279c630f3718"} err="failed to get container status \"2a4eadca4b464dec02ca7aba6bc23d95b9c3965294d4b0224299279c630f3718\": rpc error: code = NotFound desc = could not find container \"2a4eadca4b464dec02ca7aba6bc23d95b9c3965294d4b0224299279c630f3718\": container with ID starting with 2a4eadca4b464dec02ca7aba6bc23d95b9c3965294d4b0224299279c630f3718 not found: ID does not 
exist" Nov 27 16:13:59 crc kubenswrapper[4707]: I1127 16:13:59.713992 4707 scope.go:117] "RemoveContainer" containerID="93825fe5d9cf2d63c3e723b063f417f7e92442b14365852f935a06d316f443fe" Nov 27 16:13:59 crc kubenswrapper[4707]: I1127 16:13:59.714315 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"93825fe5d9cf2d63c3e723b063f417f7e92442b14365852f935a06d316f443fe"} err="failed to get container status \"93825fe5d9cf2d63c3e723b063f417f7e92442b14365852f935a06d316f443fe\": rpc error: code = NotFound desc = could not find container \"93825fe5d9cf2d63c3e723b063f417f7e92442b14365852f935a06d316f443fe\": container with ID starting with 93825fe5d9cf2d63c3e723b063f417f7e92442b14365852f935a06d316f443fe not found: ID does not exist" Nov 27 16:13:59 crc kubenswrapper[4707]: I1127 16:13:59.714357 4707 scope.go:117] "RemoveContainer" containerID="0ae3fab13b078702f87ca8c586c9592c4554c95ead662d81ca65b50f368b6dcb" Nov 27 16:13:59 crc kubenswrapper[4707]: I1127 16:13:59.714798 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0ae3fab13b078702f87ca8c586c9592c4554c95ead662d81ca65b50f368b6dcb"} err="failed to get container status \"0ae3fab13b078702f87ca8c586c9592c4554c95ead662d81ca65b50f368b6dcb\": rpc error: code = NotFound desc = could not find container \"0ae3fab13b078702f87ca8c586c9592c4554c95ead662d81ca65b50f368b6dcb\": container with ID starting with 0ae3fab13b078702f87ca8c586c9592c4554c95ead662d81ca65b50f368b6dcb not found: ID does not exist" Nov 27 16:13:59 crc kubenswrapper[4707]: I1127 16:13:59.714835 4707 scope.go:117] "RemoveContainer" containerID="c327fee553e5575dec76bc4456bc59424c89a9eea360cd4b23b594eff4c67a8d" Nov 27 16:13:59 crc kubenswrapper[4707]: I1127 16:13:59.715190 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c327fee553e5575dec76bc4456bc59424c89a9eea360cd4b23b594eff4c67a8d"} err="failed to get container status 
\"c327fee553e5575dec76bc4456bc59424c89a9eea360cd4b23b594eff4c67a8d\": rpc error: code = NotFound desc = could not find container \"c327fee553e5575dec76bc4456bc59424c89a9eea360cd4b23b594eff4c67a8d\": container with ID starting with c327fee553e5575dec76bc4456bc59424c89a9eea360cd4b23b594eff4c67a8d not found: ID does not exist" Nov 27 16:14:00 crc kubenswrapper[4707]: I1127 16:14:00.347124 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-js6mm_9ca48c08-f39d-41a2-847a-c893a2111492/kube-multus/2.log" Nov 27 16:14:00 crc kubenswrapper[4707]: I1127 16:14:00.351055 4707 generic.go:334] "Generic (PLEG): container finished" podID="e1011045-347d-49fc-add6-74ef236a60f7" containerID="174156ca46c8b8f34eef8089bcd2219fc38e8ae9c7205a5885780faaf8433b0f" exitCode=0 Nov 27 16:14:00 crc kubenswrapper[4707]: I1127 16:14:00.351120 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kxkld" event={"ID":"e1011045-347d-49fc-add6-74ef236a60f7","Type":"ContainerDied","Data":"174156ca46c8b8f34eef8089bcd2219fc38e8ae9c7205a5885780faaf8433b0f"} Nov 27 16:14:00 crc kubenswrapper[4707]: I1127 16:14:00.351168 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kxkld" event={"ID":"e1011045-347d-49fc-add6-74ef236a60f7","Type":"ContainerStarted","Data":"66aeabb711738397b5d01f04a47c9ac1ede5160e8726723c31e8abaf2a0089e3"} Nov 27 16:14:01 crc kubenswrapper[4707]: I1127 16:14:01.201788 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="55af9c67-18ce-46f1-a761-d11ce16f42d6" path="/var/lib/kubelet/pods/55af9c67-18ce-46f1-a761-d11ce16f42d6/volumes" Nov 27 16:14:01 crc kubenswrapper[4707]: I1127 16:14:01.361413 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kxkld" 
event={"ID":"e1011045-347d-49fc-add6-74ef236a60f7","Type":"ContainerStarted","Data":"0370c448455a1901b14dda4ecd5f67c9597d23ac615dbc02f94921d22c3356aa"} Nov 27 16:14:01 crc kubenswrapper[4707]: I1127 16:14:01.361473 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kxkld" event={"ID":"e1011045-347d-49fc-add6-74ef236a60f7","Type":"ContainerStarted","Data":"a6a96945b43b9fc6d1f0da0226aee8245516364d534c079c7c91f590d583ff79"} Nov 27 16:14:01 crc kubenswrapper[4707]: I1127 16:14:01.361498 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kxkld" event={"ID":"e1011045-347d-49fc-add6-74ef236a60f7","Type":"ContainerStarted","Data":"428b8ca550d350f9af0ecbbc818c749f6a6012a736ce0c87f2b406f5a0087ae6"} Nov 27 16:14:01 crc kubenswrapper[4707]: I1127 16:14:01.361518 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kxkld" event={"ID":"e1011045-347d-49fc-add6-74ef236a60f7","Type":"ContainerStarted","Data":"d7b169b4cfceff7aaa8fabaaed34aca08052d53219b502fbb28d58c03e1be854"} Nov 27 16:14:01 crc kubenswrapper[4707]: I1127 16:14:01.361535 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kxkld" event={"ID":"e1011045-347d-49fc-add6-74ef236a60f7","Type":"ContainerStarted","Data":"a9c82a2c7fbc4ea67819c229dad29d8b067df6435e1e08c37ff383b1e533e9e6"} Nov 27 16:14:01 crc kubenswrapper[4707]: I1127 16:14:01.361551 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kxkld" event={"ID":"e1011045-347d-49fc-add6-74ef236a60f7","Type":"ContainerStarted","Data":"44d86b60197a582f24e47fb9d10f6013756e8355ddd064293102024bae9017e8"} Nov 27 16:14:04 crc kubenswrapper[4707]: I1127 16:14:04.391430 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kxkld" 
event={"ID":"e1011045-347d-49fc-add6-74ef236a60f7","Type":"ContainerStarted","Data":"c536eab4f33a4b8b651ac092d54e7b6702c988fbde559c780324beff6751919c"} Nov 27 16:14:06 crc kubenswrapper[4707]: I1127 16:14:06.411276 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kxkld" event={"ID":"e1011045-347d-49fc-add6-74ef236a60f7","Type":"ContainerStarted","Data":"6b329e46c45e0e818601bf2e8869631c33e5be87a20070a7d2a8630068b2c1a6"} Nov 27 16:14:06 crc kubenswrapper[4707]: I1127 16:14:06.411847 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-kxkld" Nov 27 16:14:06 crc kubenswrapper[4707]: I1127 16:14:06.411866 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-kxkld" Nov 27 16:14:06 crc kubenswrapper[4707]: I1127 16:14:06.411878 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-kxkld" Nov 27 16:14:06 crc kubenswrapper[4707]: I1127 16:14:06.447731 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-kxkld" Nov 27 16:14:06 crc kubenswrapper[4707]: I1127 16:14:06.449026 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-kxkld" Nov 27 16:14:06 crc kubenswrapper[4707]: I1127 16:14:06.454117 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-kxkld" podStartSLOduration=7.454102376 podStartE2EDuration="7.454102376s" podCreationTimestamp="2025-11-27 16:13:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-27 16:14:06.450357676 +0000 UTC m=+622.081806474" watchObservedRunningTime="2025-11-27 16:14:06.454102376 +0000 UTC m=+622.085551154" Nov 27 16:14:12 crc 
kubenswrapper[4707]: I1127 16:14:12.195493 4707 scope.go:117] "RemoveContainer" containerID="3296b907d541dc79acbf2d75abe4ced1851608091496a03fc4a85f1879a836c6" Nov 27 16:14:12 crc kubenswrapper[4707]: E1127 16:14:12.196702 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-js6mm_openshift-multus(9ca48c08-f39d-41a2-847a-c893a2111492)\"" pod="openshift-multus/multus-js6mm" podUID="9ca48c08-f39d-41a2-847a-c893a2111492" Nov 27 16:14:26 crc kubenswrapper[4707]: I1127 16:14:26.195477 4707 scope.go:117] "RemoveContainer" containerID="3296b907d541dc79acbf2d75abe4ced1851608091496a03fc4a85f1879a836c6" Nov 27 16:14:26 crc kubenswrapper[4707]: I1127 16:14:26.565584 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-js6mm_9ca48c08-f39d-41a2-847a-c893a2111492/kube-multus/2.log" Nov 27 16:14:26 crc kubenswrapper[4707]: I1127 16:14:26.566172 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-js6mm" event={"ID":"9ca48c08-f39d-41a2-847a-c893a2111492","Type":"ContainerStarted","Data":"9b2a60f3a7cd537a12e46c19a16319e123f2ea2ced30e8c92c132ec5193d42bb"} Nov 27 16:14:29 crc kubenswrapper[4707]: I1127 16:14:29.423803 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-kxkld" Nov 27 16:14:36 crc kubenswrapper[4707]: I1127 16:14:36.628165 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fxmdd5"] Nov 27 16:14:36 crc kubenswrapper[4707]: I1127 16:14:36.629338 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fxmdd5" Nov 27 16:14:36 crc kubenswrapper[4707]: I1127 16:14:36.631843 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Nov 27 16:14:36 crc kubenswrapper[4707]: I1127 16:14:36.646841 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fxmdd5"] Nov 27 16:14:36 crc kubenswrapper[4707]: I1127 16:14:36.721550 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/367eee2c-7dc3-4a7e-a943-131037e46ca1-util\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fxmdd5\" (UID: \"367eee2c-7dc3-4a7e-a943-131037e46ca1\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fxmdd5" Nov 27 16:14:36 crc kubenswrapper[4707]: I1127 16:14:36.721622 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d2hcn\" (UniqueName: \"kubernetes.io/projected/367eee2c-7dc3-4a7e-a943-131037e46ca1-kube-api-access-d2hcn\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fxmdd5\" (UID: \"367eee2c-7dc3-4a7e-a943-131037e46ca1\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fxmdd5" Nov 27 16:14:36 crc kubenswrapper[4707]: I1127 16:14:36.721661 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/367eee2c-7dc3-4a7e-a943-131037e46ca1-bundle\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fxmdd5\" (UID: \"367eee2c-7dc3-4a7e-a943-131037e46ca1\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fxmdd5" Nov 27 16:14:36 crc kubenswrapper[4707]: 
I1127 16:14:36.823081 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/367eee2c-7dc3-4a7e-a943-131037e46ca1-bundle\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fxmdd5\" (UID: \"367eee2c-7dc3-4a7e-a943-131037e46ca1\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fxmdd5" Nov 27 16:14:36 crc kubenswrapper[4707]: I1127 16:14:36.823187 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/367eee2c-7dc3-4a7e-a943-131037e46ca1-util\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fxmdd5\" (UID: \"367eee2c-7dc3-4a7e-a943-131037e46ca1\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fxmdd5" Nov 27 16:14:36 crc kubenswrapper[4707]: I1127 16:14:36.823285 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d2hcn\" (UniqueName: \"kubernetes.io/projected/367eee2c-7dc3-4a7e-a943-131037e46ca1-kube-api-access-d2hcn\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fxmdd5\" (UID: \"367eee2c-7dc3-4a7e-a943-131037e46ca1\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fxmdd5" Nov 27 16:14:36 crc kubenswrapper[4707]: I1127 16:14:36.823623 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/367eee2c-7dc3-4a7e-a943-131037e46ca1-bundle\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fxmdd5\" (UID: \"367eee2c-7dc3-4a7e-a943-131037e46ca1\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fxmdd5" Nov 27 16:14:36 crc kubenswrapper[4707]: I1127 16:14:36.824098 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/367eee2c-7dc3-4a7e-a943-131037e46ca1-util\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fxmdd5\" (UID: \"367eee2c-7dc3-4a7e-a943-131037e46ca1\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fxmdd5" Nov 27 16:14:36 crc kubenswrapper[4707]: I1127 16:14:36.858843 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d2hcn\" (UniqueName: \"kubernetes.io/projected/367eee2c-7dc3-4a7e-a943-131037e46ca1-kube-api-access-d2hcn\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fxmdd5\" (UID: \"367eee2c-7dc3-4a7e-a943-131037e46ca1\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fxmdd5" Nov 27 16:14:36 crc kubenswrapper[4707]: I1127 16:14:36.952552 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fxmdd5" Nov 27 16:14:37 crc kubenswrapper[4707]: I1127 16:14:37.241875 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fxmdd5"] Nov 27 16:14:37 crc kubenswrapper[4707]: I1127 16:14:37.653082 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fxmdd5" event={"ID":"367eee2c-7dc3-4a7e-a943-131037e46ca1","Type":"ContainerStarted","Data":"f9cad236e97d41bc9f51d3b582010b72029a98b4b9802444bdec1eaaa3bdcdb2"} Nov 27 16:14:37 crc kubenswrapper[4707]: I1127 16:14:37.653542 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fxmdd5" event={"ID":"367eee2c-7dc3-4a7e-a943-131037e46ca1","Type":"ContainerStarted","Data":"e73617c757367186938aac79a3f7ac49c5282c81601a8251c5d0d9f8388a797b"} Nov 27 16:14:38 crc kubenswrapper[4707]: I1127 16:14:38.659817 4707 
generic.go:334] "Generic (PLEG): container finished" podID="367eee2c-7dc3-4a7e-a943-131037e46ca1" containerID="f9cad236e97d41bc9f51d3b582010b72029a98b4b9802444bdec1eaaa3bdcdb2" exitCode=0 Nov 27 16:14:38 crc kubenswrapper[4707]: I1127 16:14:38.659853 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fxmdd5" event={"ID":"367eee2c-7dc3-4a7e-a943-131037e46ca1","Type":"ContainerDied","Data":"f9cad236e97d41bc9f51d3b582010b72029a98b4b9802444bdec1eaaa3bdcdb2"} Nov 27 16:14:40 crc kubenswrapper[4707]: I1127 16:14:40.678593 4707 generic.go:334] "Generic (PLEG): container finished" podID="367eee2c-7dc3-4a7e-a943-131037e46ca1" containerID="934806b6e971eb416e026f19dd5f19fdc4a780d04cff48c57648fc5e865660f8" exitCode=0 Nov 27 16:14:40 crc kubenswrapper[4707]: I1127 16:14:40.678655 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fxmdd5" event={"ID":"367eee2c-7dc3-4a7e-a943-131037e46ca1","Type":"ContainerDied","Data":"934806b6e971eb416e026f19dd5f19fdc4a780d04cff48c57648fc5e865660f8"} Nov 27 16:14:41 crc kubenswrapper[4707]: I1127 16:14:41.688365 4707 generic.go:334] "Generic (PLEG): container finished" podID="367eee2c-7dc3-4a7e-a943-131037e46ca1" containerID="a795f273275fb0d244c18616b1c63613bdbe0a63c13809e32604d59c994c71cc" exitCode=0 Nov 27 16:14:41 crc kubenswrapper[4707]: I1127 16:14:41.688508 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fxmdd5" event={"ID":"367eee2c-7dc3-4a7e-a943-131037e46ca1","Type":"ContainerDied","Data":"a795f273275fb0d244c18616b1c63613bdbe0a63c13809e32604d59c994c71cc"} Nov 27 16:14:43 crc kubenswrapper[4707]: I1127 16:14:43.009753 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fxmdd5" Nov 27 16:14:43 crc kubenswrapper[4707]: I1127 16:14:43.113912 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/367eee2c-7dc3-4a7e-a943-131037e46ca1-bundle\") pod \"367eee2c-7dc3-4a7e-a943-131037e46ca1\" (UID: \"367eee2c-7dc3-4a7e-a943-131037e46ca1\") " Nov 27 16:14:43 crc kubenswrapper[4707]: I1127 16:14:43.113987 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d2hcn\" (UniqueName: \"kubernetes.io/projected/367eee2c-7dc3-4a7e-a943-131037e46ca1-kube-api-access-d2hcn\") pod \"367eee2c-7dc3-4a7e-a943-131037e46ca1\" (UID: \"367eee2c-7dc3-4a7e-a943-131037e46ca1\") " Nov 27 16:14:43 crc kubenswrapper[4707]: I1127 16:14:43.114170 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/367eee2c-7dc3-4a7e-a943-131037e46ca1-util\") pod \"367eee2c-7dc3-4a7e-a943-131037e46ca1\" (UID: \"367eee2c-7dc3-4a7e-a943-131037e46ca1\") " Nov 27 16:14:43 crc kubenswrapper[4707]: I1127 16:14:43.115867 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/367eee2c-7dc3-4a7e-a943-131037e46ca1-bundle" (OuterVolumeSpecName: "bundle") pod "367eee2c-7dc3-4a7e-a943-131037e46ca1" (UID: "367eee2c-7dc3-4a7e-a943-131037e46ca1"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 27 16:14:43 crc kubenswrapper[4707]: I1127 16:14:43.123041 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/367eee2c-7dc3-4a7e-a943-131037e46ca1-kube-api-access-d2hcn" (OuterVolumeSpecName: "kube-api-access-d2hcn") pod "367eee2c-7dc3-4a7e-a943-131037e46ca1" (UID: "367eee2c-7dc3-4a7e-a943-131037e46ca1"). InnerVolumeSpecName "kube-api-access-d2hcn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 16:14:43 crc kubenswrapper[4707]: I1127 16:14:43.216485 4707 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/367eee2c-7dc3-4a7e-a943-131037e46ca1-bundle\") on node \"crc\" DevicePath \"\"" Nov 27 16:14:43 crc kubenswrapper[4707]: I1127 16:14:43.216769 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d2hcn\" (UniqueName: \"kubernetes.io/projected/367eee2c-7dc3-4a7e-a943-131037e46ca1-kube-api-access-d2hcn\") on node \"crc\" DevicePath \"\"" Nov 27 16:14:43 crc kubenswrapper[4707]: I1127 16:14:43.278794 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/367eee2c-7dc3-4a7e-a943-131037e46ca1-util" (OuterVolumeSpecName: "util") pod "367eee2c-7dc3-4a7e-a943-131037e46ca1" (UID: "367eee2c-7dc3-4a7e-a943-131037e46ca1"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 27 16:14:43 crc kubenswrapper[4707]: I1127 16:14:43.318655 4707 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/367eee2c-7dc3-4a7e-a943-131037e46ca1-util\") on node \"crc\" DevicePath \"\"" Nov 27 16:14:43 crc kubenswrapper[4707]: I1127 16:14:43.707003 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fxmdd5" event={"ID":"367eee2c-7dc3-4a7e-a943-131037e46ca1","Type":"ContainerDied","Data":"e73617c757367186938aac79a3f7ac49c5282c81601a8251c5d0d9f8388a797b"} Nov 27 16:14:43 crc kubenswrapper[4707]: I1127 16:14:43.707056 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e73617c757367186938aac79a3f7ac49c5282c81601a8251c5d0d9f8388a797b" Nov 27 16:14:43 crc kubenswrapper[4707]: I1127 16:14:43.707071 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fxmdd5" Nov 27 16:14:45 crc kubenswrapper[4707]: I1127 16:14:45.630450 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-operator-5b5b58f5c8-v69r4"] Nov 27 16:14:45 crc kubenswrapper[4707]: E1127 16:14:45.630937 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="367eee2c-7dc3-4a7e-a943-131037e46ca1" containerName="extract" Nov 27 16:14:45 crc kubenswrapper[4707]: I1127 16:14:45.630952 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="367eee2c-7dc3-4a7e-a943-131037e46ca1" containerName="extract" Nov 27 16:14:45 crc kubenswrapper[4707]: E1127 16:14:45.630970 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="367eee2c-7dc3-4a7e-a943-131037e46ca1" containerName="pull" Nov 27 16:14:45 crc kubenswrapper[4707]: I1127 16:14:45.630978 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="367eee2c-7dc3-4a7e-a943-131037e46ca1" containerName="pull" Nov 27 16:14:45 crc kubenswrapper[4707]: E1127 16:14:45.630991 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="367eee2c-7dc3-4a7e-a943-131037e46ca1" containerName="util" Nov 27 16:14:45 crc kubenswrapper[4707]: I1127 16:14:45.630999 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="367eee2c-7dc3-4a7e-a943-131037e46ca1" containerName="util" Nov 27 16:14:45 crc kubenswrapper[4707]: I1127 16:14:45.631113 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="367eee2c-7dc3-4a7e-a943-131037e46ca1" containerName="extract" Nov 27 16:14:45 crc kubenswrapper[4707]: I1127 16:14:45.634419 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-v69r4" Nov 27 16:14:45 crc kubenswrapper[4707]: I1127 16:14:45.637453 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-operator-dockercfg-7xkx6" Nov 27 16:14:45 crc kubenswrapper[4707]: I1127 16:14:45.637772 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"kube-root-ca.crt" Nov 27 16:14:45 crc kubenswrapper[4707]: I1127 16:14:45.638148 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"openshift-service-ca.crt" Nov 27 16:14:45 crc kubenswrapper[4707]: I1127 16:14:45.651216 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-5b5b58f5c8-v69r4"] Nov 27 16:14:45 crc kubenswrapper[4707]: I1127 16:14:45.752278 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lb4rs\" (UniqueName: \"kubernetes.io/projected/0e1c66da-d3ca-4b17-83b7-62518c83721c-kube-api-access-lb4rs\") pod \"nmstate-operator-5b5b58f5c8-v69r4\" (UID: \"0e1c66da-d3ca-4b17-83b7-62518c83721c\") " pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-v69r4" Nov 27 16:14:45 crc kubenswrapper[4707]: I1127 16:14:45.853710 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lb4rs\" (UniqueName: \"kubernetes.io/projected/0e1c66da-d3ca-4b17-83b7-62518c83721c-kube-api-access-lb4rs\") pod \"nmstate-operator-5b5b58f5c8-v69r4\" (UID: \"0e1c66da-d3ca-4b17-83b7-62518c83721c\") " pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-v69r4" Nov 27 16:14:45 crc kubenswrapper[4707]: I1127 16:14:45.882155 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lb4rs\" (UniqueName: \"kubernetes.io/projected/0e1c66da-d3ca-4b17-83b7-62518c83721c-kube-api-access-lb4rs\") pod \"nmstate-operator-5b5b58f5c8-v69r4\" (UID: 
\"0e1c66da-d3ca-4b17-83b7-62518c83721c\") " pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-v69r4" Nov 27 16:14:45 crc kubenswrapper[4707]: I1127 16:14:45.955172 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-v69r4" Nov 27 16:14:46 crc kubenswrapper[4707]: I1127 16:14:46.203788 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-5b5b58f5c8-v69r4"] Nov 27 16:14:46 crc kubenswrapper[4707]: W1127 16:14:46.210461 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0e1c66da_d3ca_4b17_83b7_62518c83721c.slice/crio-69efb2b2eefd1355a6d274de0c5ceeef77560417dcbfe321b4e98a9a345b60f2 WatchSource:0}: Error finding container 69efb2b2eefd1355a6d274de0c5ceeef77560417dcbfe321b4e98a9a345b60f2: Status 404 returned error can't find the container with id 69efb2b2eefd1355a6d274de0c5ceeef77560417dcbfe321b4e98a9a345b60f2 Nov 27 16:14:46 crc kubenswrapper[4707]: I1127 16:14:46.726934 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-v69r4" event={"ID":"0e1c66da-d3ca-4b17-83b7-62518c83721c","Type":"ContainerStarted","Data":"69efb2b2eefd1355a6d274de0c5ceeef77560417dcbfe321b4e98a9a345b60f2"} Nov 27 16:14:49 crc kubenswrapper[4707]: I1127 16:14:49.746044 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-v69r4" event={"ID":"0e1c66da-d3ca-4b17-83b7-62518c83721c","Type":"ContainerStarted","Data":"cc260567ed2883ee81db79e9baecc13b4755461271fc41b6fbd0005198b4d241"} Nov 27 16:14:49 crc kubenswrapper[4707]: I1127 16:14:49.763790 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-v69r4" podStartSLOduration=1.98864437 podStartE2EDuration="4.763769039s" podCreationTimestamp="2025-11-27 16:14:45 +0000 UTC" 
firstStartedPulling="2025-11-27 16:14:46.21463251 +0000 UTC m=+661.846081298" lastFinishedPulling="2025-11-27 16:14:48.989757199 +0000 UTC m=+664.621205967" observedRunningTime="2025-11-27 16:14:49.761200936 +0000 UTC m=+665.392649714" watchObservedRunningTime="2025-11-27 16:14:49.763769039 +0000 UTC m=+665.395217817" Nov 27 16:14:50 crc kubenswrapper[4707]: I1127 16:14:50.779268 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-metrics-7f946cbc9-zmcdf"] Nov 27 16:14:50 crc kubenswrapper[4707]: I1127 16:14:50.780830 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-7f946cbc9-zmcdf" Nov 27 16:14:50 crc kubenswrapper[4707]: I1127 16:14:50.784122 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-handler-dockercfg-mf2vz" Nov 27 16:14:50 crc kubenswrapper[4707]: I1127 16:14:50.793550 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-webhook-5f6d4c5ccb-h6v87"] Nov 27 16:14:50 crc kubenswrapper[4707]: I1127 16:14:50.794051 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-h6v87" Nov 27 16:14:50 crc kubenswrapper[4707]: I1127 16:14:50.797541 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"openshift-nmstate-webhook" Nov 27 16:14:50 crc kubenswrapper[4707]: I1127 16:14:50.804232 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-7f946cbc9-zmcdf"] Nov 27 16:14:50 crc kubenswrapper[4707]: I1127 16:14:50.809905 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-5f6d4c5ccb-h6v87"] Nov 27 16:14:50 crc kubenswrapper[4707]: I1127 16:14:50.822436 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-handler-svvpq"] Nov 27 16:14:50 crc kubenswrapper[4707]: I1127 16:14:50.823177 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-svvpq" Nov 27 16:14:50 crc kubenswrapper[4707]: I1127 16:14:50.875439 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-console-plugin-7fbb5f6569-cfrns"] Nov 27 16:14:50 crc kubenswrapper[4707]: I1127 16:14:50.876343 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-cfrns" Nov 27 16:14:50 crc kubenswrapper[4707]: I1127 16:14:50.878244 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"nginx-conf" Nov 27 16:14:50 crc kubenswrapper[4707]: I1127 16:14:50.878423 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"default-dockercfg-qp88z" Nov 27 16:14:50 crc kubenswrapper[4707]: I1127 16:14:50.889256 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"plugin-serving-cert" Nov 27 16:14:50 crc kubenswrapper[4707]: I1127 16:14:50.893002 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-7fbb5f6569-cfrns"] Nov 27 16:14:50 crc kubenswrapper[4707]: I1127 16:14:50.923132 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p44tg\" (UniqueName: \"kubernetes.io/projected/444e03f1-9114-4150-8f28-3db614bb32e0-kube-api-access-p44tg\") pod \"nmstate-metrics-7f946cbc9-zmcdf\" (UID: \"444e03f1-9114-4150-8f28-3db614bb32e0\") " pod="openshift-nmstate/nmstate-metrics-7f946cbc9-zmcdf" Nov 27 16:14:50 crc kubenswrapper[4707]: I1127 16:14:50.923362 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8p69x\" (UniqueName: \"kubernetes.io/projected/d5b862a0-5504-43b2-9f8f-fa953310a52d-kube-api-access-8p69x\") pod \"nmstate-handler-svvpq\" (UID: \"d5b862a0-5504-43b2-9f8f-fa953310a52d\") " pod="openshift-nmstate/nmstate-handler-svvpq" Nov 27 16:14:50 crc kubenswrapper[4707]: I1127 16:14:50.923461 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/d5b862a0-5504-43b2-9f8f-fa953310a52d-dbus-socket\") pod \"nmstate-handler-svvpq\" (UID: \"d5b862a0-5504-43b2-9f8f-fa953310a52d\") " 
pod="openshift-nmstate/nmstate-handler-svvpq" Nov 27 16:14:50 crc kubenswrapper[4707]: I1127 16:14:50.923552 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/d5b862a0-5504-43b2-9f8f-fa953310a52d-ovs-socket\") pod \"nmstate-handler-svvpq\" (UID: \"d5b862a0-5504-43b2-9f8f-fa953310a52d\") " pod="openshift-nmstate/nmstate-handler-svvpq" Nov 27 16:14:50 crc kubenswrapper[4707]: I1127 16:14:50.923637 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/d051acac-59f4-434a-85bb-2cf7ec7e7107-tls-key-pair\") pod \"nmstate-webhook-5f6d4c5ccb-h6v87\" (UID: \"d051acac-59f4-434a-85bb-2cf7ec7e7107\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-h6v87" Nov 27 16:14:50 crc kubenswrapper[4707]: I1127 16:14:50.923704 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/d5b862a0-5504-43b2-9f8f-fa953310a52d-nmstate-lock\") pod \"nmstate-handler-svvpq\" (UID: \"d5b862a0-5504-43b2-9f8f-fa953310a52d\") " pod="openshift-nmstate/nmstate-handler-svvpq" Nov 27 16:14:50 crc kubenswrapper[4707]: I1127 16:14:50.923838 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-69xwp\" (UniqueName: \"kubernetes.io/projected/d051acac-59f4-434a-85bb-2cf7ec7e7107-kube-api-access-69xwp\") pod \"nmstate-webhook-5f6d4c5ccb-h6v87\" (UID: \"d051acac-59f4-434a-85bb-2cf7ec7e7107\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-h6v87" Nov 27 16:14:51 crc kubenswrapper[4707]: I1127 16:14:51.024982 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p44tg\" (UniqueName: \"kubernetes.io/projected/444e03f1-9114-4150-8f28-3db614bb32e0-kube-api-access-p44tg\") pod 
\"nmstate-metrics-7f946cbc9-zmcdf\" (UID: \"444e03f1-9114-4150-8f28-3db614bb32e0\") " pod="openshift-nmstate/nmstate-metrics-7f946cbc9-zmcdf" Nov 27 16:14:51 crc kubenswrapper[4707]: I1127 16:14:51.025020 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8p69x\" (UniqueName: \"kubernetes.io/projected/d5b862a0-5504-43b2-9f8f-fa953310a52d-kube-api-access-8p69x\") pod \"nmstate-handler-svvpq\" (UID: \"d5b862a0-5504-43b2-9f8f-fa953310a52d\") " pod="openshift-nmstate/nmstate-handler-svvpq" Nov 27 16:14:51 crc kubenswrapper[4707]: I1127 16:14:51.025042 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/2de77ffe-0aaf-4a49-86a3-3bb9a0123497-plugin-serving-cert\") pod \"nmstate-console-plugin-7fbb5f6569-cfrns\" (UID: \"2de77ffe-0aaf-4a49-86a3-3bb9a0123497\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-cfrns" Nov 27 16:14:51 crc kubenswrapper[4707]: I1127 16:14:51.025061 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/d5b862a0-5504-43b2-9f8f-fa953310a52d-dbus-socket\") pod \"nmstate-handler-svvpq\" (UID: \"d5b862a0-5504-43b2-9f8f-fa953310a52d\") " pod="openshift-nmstate/nmstate-handler-svvpq" Nov 27 16:14:51 crc kubenswrapper[4707]: I1127 16:14:51.025094 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/d5b862a0-5504-43b2-9f8f-fa953310a52d-ovs-socket\") pod \"nmstate-handler-svvpq\" (UID: \"d5b862a0-5504-43b2-9f8f-fa953310a52d\") " pod="openshift-nmstate/nmstate-handler-svvpq" Nov 27 16:14:51 crc kubenswrapper[4707]: I1127 16:14:51.025116 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: 
\"kubernetes.io/configmap/2de77ffe-0aaf-4a49-86a3-3bb9a0123497-nginx-conf\") pod \"nmstate-console-plugin-7fbb5f6569-cfrns\" (UID: \"2de77ffe-0aaf-4a49-86a3-3bb9a0123497\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-cfrns" Nov 27 16:14:51 crc kubenswrapper[4707]: I1127 16:14:51.025139 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/d051acac-59f4-434a-85bb-2cf7ec7e7107-tls-key-pair\") pod \"nmstate-webhook-5f6d4c5ccb-h6v87\" (UID: \"d051acac-59f4-434a-85bb-2cf7ec7e7107\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-h6v87" Nov 27 16:14:51 crc kubenswrapper[4707]: I1127 16:14:51.025155 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/d5b862a0-5504-43b2-9f8f-fa953310a52d-nmstate-lock\") pod \"nmstate-handler-svvpq\" (UID: \"d5b862a0-5504-43b2-9f8f-fa953310a52d\") " pod="openshift-nmstate/nmstate-handler-svvpq" Nov 27 16:14:51 crc kubenswrapper[4707]: I1127 16:14:51.025174 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8kx6s\" (UniqueName: \"kubernetes.io/projected/2de77ffe-0aaf-4a49-86a3-3bb9a0123497-kube-api-access-8kx6s\") pod \"nmstate-console-plugin-7fbb5f6569-cfrns\" (UID: \"2de77ffe-0aaf-4a49-86a3-3bb9a0123497\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-cfrns" Nov 27 16:14:51 crc kubenswrapper[4707]: I1127 16:14:51.025194 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-69xwp\" (UniqueName: \"kubernetes.io/projected/d051acac-59f4-434a-85bb-2cf7ec7e7107-kube-api-access-69xwp\") pod \"nmstate-webhook-5f6d4c5ccb-h6v87\" (UID: \"d051acac-59f4-434a-85bb-2cf7ec7e7107\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-h6v87" Nov 27 16:14:51 crc kubenswrapper[4707]: I1127 16:14:51.025749 4707 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/d5b862a0-5504-43b2-9f8f-fa953310a52d-ovs-socket\") pod \"nmstate-handler-svvpq\" (UID: \"d5b862a0-5504-43b2-9f8f-fa953310a52d\") " pod="openshift-nmstate/nmstate-handler-svvpq" Nov 27 16:14:51 crc kubenswrapper[4707]: I1127 16:14:51.025895 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/d5b862a0-5504-43b2-9f8f-fa953310a52d-nmstate-lock\") pod \"nmstate-handler-svvpq\" (UID: \"d5b862a0-5504-43b2-9f8f-fa953310a52d\") " pod="openshift-nmstate/nmstate-handler-svvpq" Nov 27 16:14:51 crc kubenswrapper[4707]: I1127 16:14:51.026448 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/d5b862a0-5504-43b2-9f8f-fa953310a52d-dbus-socket\") pod \"nmstate-handler-svvpq\" (UID: \"d5b862a0-5504-43b2-9f8f-fa953310a52d\") " pod="openshift-nmstate/nmstate-handler-svvpq" Nov 27 16:14:51 crc kubenswrapper[4707]: I1127 16:14:51.042505 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-69xwp\" (UniqueName: \"kubernetes.io/projected/d051acac-59f4-434a-85bb-2cf7ec7e7107-kube-api-access-69xwp\") pod \"nmstate-webhook-5f6d4c5ccb-h6v87\" (UID: \"d051acac-59f4-434a-85bb-2cf7ec7e7107\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-h6v87" Nov 27 16:14:51 crc kubenswrapper[4707]: I1127 16:14:51.045039 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/d051acac-59f4-434a-85bb-2cf7ec7e7107-tls-key-pair\") pod \"nmstate-webhook-5f6d4c5ccb-h6v87\" (UID: \"d051acac-59f4-434a-85bb-2cf7ec7e7107\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-h6v87" Nov 27 16:14:51 crc kubenswrapper[4707]: I1127 16:14:51.047820 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8p69x\" (UniqueName: 
\"kubernetes.io/projected/d5b862a0-5504-43b2-9f8f-fa953310a52d-kube-api-access-8p69x\") pod \"nmstate-handler-svvpq\" (UID: \"d5b862a0-5504-43b2-9f8f-fa953310a52d\") " pod="openshift-nmstate/nmstate-handler-svvpq" Nov 27 16:14:51 crc kubenswrapper[4707]: I1127 16:14:51.056721 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-5f89ddb749-lqqp5"] Nov 27 16:14:51 crc kubenswrapper[4707]: I1127 16:14:51.057332 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-5f89ddb749-lqqp5" Nov 27 16:14:51 crc kubenswrapper[4707]: I1127 16:14:51.061386 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p44tg\" (UniqueName: \"kubernetes.io/projected/444e03f1-9114-4150-8f28-3db614bb32e0-kube-api-access-p44tg\") pod \"nmstate-metrics-7f946cbc9-zmcdf\" (UID: \"444e03f1-9114-4150-8f28-3db614bb32e0\") " pod="openshift-nmstate/nmstate-metrics-7f946cbc9-zmcdf" Nov 27 16:14:51 crc kubenswrapper[4707]: I1127 16:14:51.067268 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-5f89ddb749-lqqp5"] Nov 27 16:14:51 crc kubenswrapper[4707]: I1127 16:14:51.113115 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-7f946cbc9-zmcdf" Nov 27 16:14:51 crc kubenswrapper[4707]: I1127 16:14:51.123142 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-h6v87" Nov 27 16:14:51 crc kubenswrapper[4707]: I1127 16:14:51.126226 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/2de77ffe-0aaf-4a49-86a3-3bb9a0123497-nginx-conf\") pod \"nmstate-console-plugin-7fbb5f6569-cfrns\" (UID: \"2de77ffe-0aaf-4a49-86a3-3bb9a0123497\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-cfrns" Nov 27 16:14:51 crc kubenswrapper[4707]: I1127 16:14:51.126264 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8kx6s\" (UniqueName: \"kubernetes.io/projected/2de77ffe-0aaf-4a49-86a3-3bb9a0123497-kube-api-access-8kx6s\") pod \"nmstate-console-plugin-7fbb5f6569-cfrns\" (UID: \"2de77ffe-0aaf-4a49-86a3-3bb9a0123497\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-cfrns" Nov 27 16:14:51 crc kubenswrapper[4707]: I1127 16:14:51.126326 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/2de77ffe-0aaf-4a49-86a3-3bb9a0123497-plugin-serving-cert\") pod \"nmstate-console-plugin-7fbb5f6569-cfrns\" (UID: \"2de77ffe-0aaf-4a49-86a3-3bb9a0123497\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-cfrns" Nov 27 16:14:51 crc kubenswrapper[4707]: I1127 16:14:51.127274 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/2de77ffe-0aaf-4a49-86a3-3bb9a0123497-nginx-conf\") pod \"nmstate-console-plugin-7fbb5f6569-cfrns\" (UID: \"2de77ffe-0aaf-4a49-86a3-3bb9a0123497\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-cfrns" Nov 27 16:14:51 crc kubenswrapper[4707]: E1127 16:14:51.127587 4707 secret.go:188] Couldn't get secret openshift-nmstate/plugin-serving-cert: secret "plugin-serving-cert" not found Nov 27 16:14:51 crc kubenswrapper[4707]: E1127 16:14:51.133895 
4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2de77ffe-0aaf-4a49-86a3-3bb9a0123497-plugin-serving-cert podName:2de77ffe-0aaf-4a49-86a3-3bb9a0123497 nodeName:}" failed. No retries permitted until 2025-11-27 16:14:51.633870666 +0000 UTC m=+667.265319434 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "plugin-serving-cert" (UniqueName: "kubernetes.io/secret/2de77ffe-0aaf-4a49-86a3-3bb9a0123497-plugin-serving-cert") pod "nmstate-console-plugin-7fbb5f6569-cfrns" (UID: "2de77ffe-0aaf-4a49-86a3-3bb9a0123497") : secret "plugin-serving-cert" not found Nov 27 16:14:51 crc kubenswrapper[4707]: I1127 16:14:51.141719 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-svvpq" Nov 27 16:14:51 crc kubenswrapper[4707]: I1127 16:14:51.145746 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8kx6s\" (UniqueName: \"kubernetes.io/projected/2de77ffe-0aaf-4a49-86a3-3bb9a0123497-kube-api-access-8kx6s\") pod \"nmstate-console-plugin-7fbb5f6569-cfrns\" (UID: \"2de77ffe-0aaf-4a49-86a3-3bb9a0123497\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-cfrns" Nov 27 16:14:51 crc kubenswrapper[4707]: W1127 16:14:51.161600 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd5b862a0_5504_43b2_9f8f_fa953310a52d.slice/crio-d4865ed845c61a42e899ea2e2443d73ed9132deeda85b5f6cae0bfc953e105cd WatchSource:0}: Error finding container d4865ed845c61a42e899ea2e2443d73ed9132deeda85b5f6cae0bfc953e105cd: Status 404 returned error can't find the container with id d4865ed845c61a42e899ea2e2443d73ed9132deeda85b5f6cae0bfc953e105cd Nov 27 16:14:51 crc kubenswrapper[4707]: I1127 16:14:51.235148 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/20981388-a9e6-4ea2-8889-0189263ded6f-console-serving-cert\") pod \"console-5f89ddb749-lqqp5\" (UID: \"20981388-a9e6-4ea2-8889-0189263ded6f\") " pod="openshift-console/console-5f89ddb749-lqqp5" Nov 27 16:14:51 crc kubenswrapper[4707]: I1127 16:14:51.235193 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/20981388-a9e6-4ea2-8889-0189263ded6f-console-oauth-config\") pod \"console-5f89ddb749-lqqp5\" (UID: \"20981388-a9e6-4ea2-8889-0189263ded6f\") " pod="openshift-console/console-5f89ddb749-lqqp5" Nov 27 16:14:51 crc kubenswrapper[4707]: I1127 16:14:51.235210 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/20981388-a9e6-4ea2-8889-0189263ded6f-oauth-serving-cert\") pod \"console-5f89ddb749-lqqp5\" (UID: \"20981388-a9e6-4ea2-8889-0189263ded6f\") " pod="openshift-console/console-5f89ddb749-lqqp5" Nov 27 16:14:51 crc kubenswrapper[4707]: I1127 16:14:51.235233 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/20981388-a9e6-4ea2-8889-0189263ded6f-service-ca\") pod \"console-5f89ddb749-lqqp5\" (UID: \"20981388-a9e6-4ea2-8889-0189263ded6f\") " pod="openshift-console/console-5f89ddb749-lqqp5" Nov 27 16:14:51 crc kubenswrapper[4707]: I1127 16:14:51.235413 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/20981388-a9e6-4ea2-8889-0189263ded6f-trusted-ca-bundle\") pod \"console-5f89ddb749-lqqp5\" (UID: \"20981388-a9e6-4ea2-8889-0189263ded6f\") " pod="openshift-console/console-5f89ddb749-lqqp5" Nov 27 16:14:51 crc kubenswrapper[4707]: I1127 16:14:51.235484 4707 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sfmjb\" (UniqueName: \"kubernetes.io/projected/20981388-a9e6-4ea2-8889-0189263ded6f-kube-api-access-sfmjb\") pod \"console-5f89ddb749-lqqp5\" (UID: \"20981388-a9e6-4ea2-8889-0189263ded6f\") " pod="openshift-console/console-5f89ddb749-lqqp5" Nov 27 16:14:51 crc kubenswrapper[4707]: I1127 16:14:51.235547 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/20981388-a9e6-4ea2-8889-0189263ded6f-console-config\") pod \"console-5f89ddb749-lqqp5\" (UID: \"20981388-a9e6-4ea2-8889-0189263ded6f\") " pod="openshift-console/console-5f89ddb749-lqqp5" Nov 27 16:14:51 crc kubenswrapper[4707]: I1127 16:14:51.336674 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/20981388-a9e6-4ea2-8889-0189263ded6f-console-serving-cert\") pod \"console-5f89ddb749-lqqp5\" (UID: \"20981388-a9e6-4ea2-8889-0189263ded6f\") " pod="openshift-console/console-5f89ddb749-lqqp5" Nov 27 16:14:51 crc kubenswrapper[4707]: I1127 16:14:51.336724 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/20981388-a9e6-4ea2-8889-0189263ded6f-console-oauth-config\") pod \"console-5f89ddb749-lqqp5\" (UID: \"20981388-a9e6-4ea2-8889-0189263ded6f\") " pod="openshift-console/console-5f89ddb749-lqqp5" Nov 27 16:14:51 crc kubenswrapper[4707]: I1127 16:14:51.336743 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/20981388-a9e6-4ea2-8889-0189263ded6f-oauth-serving-cert\") pod \"console-5f89ddb749-lqqp5\" (UID: \"20981388-a9e6-4ea2-8889-0189263ded6f\") " pod="openshift-console/console-5f89ddb749-lqqp5" Nov 27 16:14:51 crc kubenswrapper[4707]: I1127 16:14:51.336768 4707 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/20981388-a9e6-4ea2-8889-0189263ded6f-service-ca\") pod \"console-5f89ddb749-lqqp5\" (UID: \"20981388-a9e6-4ea2-8889-0189263ded6f\") " pod="openshift-console/console-5f89ddb749-lqqp5" Nov 27 16:14:51 crc kubenswrapper[4707]: I1127 16:14:51.336788 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/20981388-a9e6-4ea2-8889-0189263ded6f-trusted-ca-bundle\") pod \"console-5f89ddb749-lqqp5\" (UID: \"20981388-a9e6-4ea2-8889-0189263ded6f\") " pod="openshift-console/console-5f89ddb749-lqqp5" Nov 27 16:14:51 crc kubenswrapper[4707]: I1127 16:14:51.336805 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sfmjb\" (UniqueName: \"kubernetes.io/projected/20981388-a9e6-4ea2-8889-0189263ded6f-kube-api-access-sfmjb\") pod \"console-5f89ddb749-lqqp5\" (UID: \"20981388-a9e6-4ea2-8889-0189263ded6f\") " pod="openshift-console/console-5f89ddb749-lqqp5" Nov 27 16:14:51 crc kubenswrapper[4707]: I1127 16:14:51.336826 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/20981388-a9e6-4ea2-8889-0189263ded6f-console-config\") pod \"console-5f89ddb749-lqqp5\" (UID: \"20981388-a9e6-4ea2-8889-0189263ded6f\") " pod="openshift-console/console-5f89ddb749-lqqp5" Nov 27 16:14:51 crc kubenswrapper[4707]: I1127 16:14:51.337909 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/20981388-a9e6-4ea2-8889-0189263ded6f-console-config\") pod \"console-5f89ddb749-lqqp5\" (UID: \"20981388-a9e6-4ea2-8889-0189263ded6f\") " pod="openshift-console/console-5f89ddb749-lqqp5" Nov 27 16:14:51 crc kubenswrapper[4707]: I1127 16:14:51.337944 4707 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/20981388-a9e6-4ea2-8889-0189263ded6f-service-ca\") pod \"console-5f89ddb749-lqqp5\" (UID: \"20981388-a9e6-4ea2-8889-0189263ded6f\") " pod="openshift-console/console-5f89ddb749-lqqp5" Nov 27 16:14:51 crc kubenswrapper[4707]: I1127 16:14:51.338327 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/20981388-a9e6-4ea2-8889-0189263ded6f-oauth-serving-cert\") pod \"console-5f89ddb749-lqqp5\" (UID: \"20981388-a9e6-4ea2-8889-0189263ded6f\") " pod="openshift-console/console-5f89ddb749-lqqp5" Nov 27 16:14:51 crc kubenswrapper[4707]: I1127 16:14:51.338776 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/20981388-a9e6-4ea2-8889-0189263ded6f-trusted-ca-bundle\") pod \"console-5f89ddb749-lqqp5\" (UID: \"20981388-a9e6-4ea2-8889-0189263ded6f\") " pod="openshift-console/console-5f89ddb749-lqqp5" Nov 27 16:14:51 crc kubenswrapper[4707]: I1127 16:14:51.340315 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/20981388-a9e6-4ea2-8889-0189263ded6f-console-serving-cert\") pod \"console-5f89ddb749-lqqp5\" (UID: \"20981388-a9e6-4ea2-8889-0189263ded6f\") " pod="openshift-console/console-5f89ddb749-lqqp5" Nov 27 16:14:51 crc kubenswrapper[4707]: I1127 16:14:51.340603 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/20981388-a9e6-4ea2-8889-0189263ded6f-console-oauth-config\") pod \"console-5f89ddb749-lqqp5\" (UID: \"20981388-a9e6-4ea2-8889-0189263ded6f\") " pod="openshift-console/console-5f89ddb749-lqqp5" Nov 27 16:14:51 crc kubenswrapper[4707]: I1127 16:14:51.357018 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sfmjb\" (UniqueName: 
\"kubernetes.io/projected/20981388-a9e6-4ea2-8889-0189263ded6f-kube-api-access-sfmjb\") pod \"console-5f89ddb749-lqqp5\" (UID: \"20981388-a9e6-4ea2-8889-0189263ded6f\") " pod="openshift-console/console-5f89ddb749-lqqp5" Nov 27 16:14:51 crc kubenswrapper[4707]: I1127 16:14:51.403364 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-5f89ddb749-lqqp5" Nov 27 16:14:51 crc kubenswrapper[4707]: I1127 16:14:51.502113 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-5f6d4c5ccb-h6v87"] Nov 27 16:14:51 crc kubenswrapper[4707]: I1127 16:14:51.554394 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-7f946cbc9-zmcdf"] Nov 27 16:14:51 crc kubenswrapper[4707]: W1127 16:14:51.564610 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod444e03f1_9114_4150_8f28_3db614bb32e0.slice/crio-42f55ae4287e4017f553432f0f6bd32e7c73cf1f564ac5b9473091fcdb97eec6 WatchSource:0}: Error finding container 42f55ae4287e4017f553432f0f6bd32e7c73cf1f564ac5b9473091fcdb97eec6: Status 404 returned error can't find the container with id 42f55ae4287e4017f553432f0f6bd32e7c73cf1f564ac5b9473091fcdb97eec6 Nov 27 16:14:51 crc kubenswrapper[4707]: I1127 16:14:51.639905 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/2de77ffe-0aaf-4a49-86a3-3bb9a0123497-plugin-serving-cert\") pod \"nmstate-console-plugin-7fbb5f6569-cfrns\" (UID: \"2de77ffe-0aaf-4a49-86a3-3bb9a0123497\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-cfrns" Nov 27 16:14:51 crc kubenswrapper[4707]: I1127 16:14:51.645238 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/2de77ffe-0aaf-4a49-86a3-3bb9a0123497-plugin-serving-cert\") pod 
\"nmstate-console-plugin-7fbb5f6569-cfrns\" (UID: \"2de77ffe-0aaf-4a49-86a3-3bb9a0123497\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-cfrns" Nov 27 16:14:51 crc kubenswrapper[4707]: I1127 16:14:51.666527 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-5f89ddb749-lqqp5"] Nov 27 16:14:51 crc kubenswrapper[4707]: W1127 16:14:51.674358 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod20981388_a9e6_4ea2_8889_0189263ded6f.slice/crio-8b3ca428b74e72c7398d19ef5fad5882b6568bc1c478c795227a9dd843f67a24 WatchSource:0}: Error finding container 8b3ca428b74e72c7398d19ef5fad5882b6568bc1c478c795227a9dd843f67a24: Status 404 returned error can't find the container with id 8b3ca428b74e72c7398d19ef5fad5882b6568bc1c478c795227a9dd843f67a24 Nov 27 16:14:51 crc kubenswrapper[4707]: I1127 16:14:51.758501 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-h6v87" event={"ID":"d051acac-59f4-434a-85bb-2cf7ec7e7107","Type":"ContainerStarted","Data":"00ab09e5b5c4873852a0683c8e99691b711b11945221cbf6ed2efd189ed40773"} Nov 27 16:14:51 crc kubenswrapper[4707]: I1127 16:14:51.759791 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-7f946cbc9-zmcdf" event={"ID":"444e03f1-9114-4150-8f28-3db614bb32e0","Type":"ContainerStarted","Data":"42f55ae4287e4017f553432f0f6bd32e7c73cf1f564ac5b9473091fcdb97eec6"} Nov 27 16:14:51 crc kubenswrapper[4707]: I1127 16:14:51.760966 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5f89ddb749-lqqp5" event={"ID":"20981388-a9e6-4ea2-8889-0189263ded6f","Type":"ContainerStarted","Data":"8b3ca428b74e72c7398d19ef5fad5882b6568bc1c478c795227a9dd843f67a24"} Nov 27 16:14:51 crc kubenswrapper[4707]: I1127 16:14:51.762191 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-nmstate/nmstate-handler-svvpq" event={"ID":"d5b862a0-5504-43b2-9f8f-fa953310a52d","Type":"ContainerStarted","Data":"d4865ed845c61a42e899ea2e2443d73ed9132deeda85b5f6cae0bfc953e105cd"} Nov 27 16:14:51 crc kubenswrapper[4707]: I1127 16:14:51.790772 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-cfrns" Nov 27 16:14:52 crc kubenswrapper[4707]: I1127 16:14:52.292918 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-7fbb5f6569-cfrns"] Nov 27 16:14:52 crc kubenswrapper[4707]: W1127 16:14:52.304670 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2de77ffe_0aaf_4a49_86a3_3bb9a0123497.slice/crio-4d7793aa3e8338bbabb0feb5805b0ea098d325131a5e1ac4dd12d0e2453e0f65 WatchSource:0}: Error finding container 4d7793aa3e8338bbabb0feb5805b0ea098d325131a5e1ac4dd12d0e2453e0f65: Status 404 returned error can't find the container with id 4d7793aa3e8338bbabb0feb5805b0ea098d325131a5e1ac4dd12d0e2453e0f65 Nov 27 16:14:52 crc kubenswrapper[4707]: I1127 16:14:52.770986 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5f89ddb749-lqqp5" event={"ID":"20981388-a9e6-4ea2-8889-0189263ded6f","Type":"ContainerStarted","Data":"e54a4f232a3da7be1246dde8a91c12ccc60be8ce45e13a70548752778cb82fd4"} Nov 27 16:14:52 crc kubenswrapper[4707]: I1127 16:14:52.773678 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-cfrns" event={"ID":"2de77ffe-0aaf-4a49-86a3-3bb9a0123497","Type":"ContainerStarted","Data":"4d7793aa3e8338bbabb0feb5805b0ea098d325131a5e1ac4dd12d0e2453e0f65"} Nov 27 16:14:52 crc kubenswrapper[4707]: I1127 16:14:52.802361 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-5f89ddb749-lqqp5" podStartSLOduration=1.802326608 
podStartE2EDuration="1.802326608s" podCreationTimestamp="2025-11-27 16:14:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-27 16:14:52.800715709 +0000 UTC m=+668.432164517" watchObservedRunningTime="2025-11-27 16:14:52.802326608 +0000 UTC m=+668.433775426" Nov 27 16:14:55 crc kubenswrapper[4707]: I1127 16:14:55.796568 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-7f946cbc9-zmcdf" event={"ID":"444e03f1-9114-4150-8f28-3db614bb32e0","Type":"ContainerStarted","Data":"5bcc83b67d8b032b7a905e83c822251c7d95907aeb90a89fc8163c9c7d7dcb3a"} Nov 27 16:14:55 crc kubenswrapper[4707]: I1127 16:14:55.799912 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-svvpq" event={"ID":"d5b862a0-5504-43b2-9f8f-fa953310a52d","Type":"ContainerStarted","Data":"530c4de27c2a3d30f712de3d8c3c8daa76c006f4cd8b68ecd3df87a9dc7cb4f4"} Nov 27 16:14:55 crc kubenswrapper[4707]: I1127 16:14:55.800092 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-handler-svvpq" Nov 27 16:14:55 crc kubenswrapper[4707]: I1127 16:14:55.802776 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-h6v87" event={"ID":"d051acac-59f4-434a-85bb-2cf7ec7e7107","Type":"ContainerStarted","Data":"3e95dbdbfb2770b7c8ae490e11ea66b6ba3e8fa7e258c9951667c25503a61203"} Nov 27 16:14:55 crc kubenswrapper[4707]: I1127 16:14:55.802981 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-h6v87" Nov 27 16:14:55 crc kubenswrapper[4707]: I1127 16:14:55.804926 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-cfrns" 
event={"ID":"2de77ffe-0aaf-4a49-86a3-3bb9a0123497","Type":"ContainerStarted","Data":"6b941f85736d0b4f5bdf01f26a756723cbef85482196f79c3322e0fe1c10449f"} Nov 27 16:14:55 crc kubenswrapper[4707]: I1127 16:14:55.825307 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-handler-svvpq" podStartSLOduration=2.333078962 podStartE2EDuration="5.825282145s" podCreationTimestamp="2025-11-27 16:14:50 +0000 UTC" firstStartedPulling="2025-11-27 16:14:51.16422641 +0000 UTC m=+666.795675178" lastFinishedPulling="2025-11-27 16:14:54.656429573 +0000 UTC m=+670.287878361" observedRunningTime="2025-11-27 16:14:55.818009517 +0000 UTC m=+671.449458325" watchObservedRunningTime="2025-11-27 16:14:55.825282145 +0000 UTC m=+671.456730953" Nov 27 16:14:55 crc kubenswrapper[4707]: I1127 16:14:55.846929 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-h6v87" podStartSLOduration=2.740259857 podStartE2EDuration="5.846906646s" podCreationTimestamp="2025-11-27 16:14:50 +0000 UTC" firstStartedPulling="2025-11-27 16:14:51.526659377 +0000 UTC m=+667.158108185" lastFinishedPulling="2025-11-27 16:14:54.633306196 +0000 UTC m=+670.264754974" observedRunningTime="2025-11-27 16:14:55.841487413 +0000 UTC m=+671.472936211" watchObservedRunningTime="2025-11-27 16:14:55.846906646 +0000 UTC m=+671.478355414" Nov 27 16:14:58 crc kubenswrapper[4707]: I1127 16:14:58.830545 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-7f946cbc9-zmcdf" event={"ID":"444e03f1-9114-4150-8f28-3db614bb32e0","Type":"ContainerStarted","Data":"ac7d692967ba237a86ca46bcc827fd1e86bd3b953216ca8678c1495821bc099d"} Nov 27 16:14:58 crc kubenswrapper[4707]: I1127 16:14:58.856485 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-cfrns" podStartSLOduration=6.533249696 
podStartE2EDuration="8.856460714s" podCreationTimestamp="2025-11-27 16:14:50 +0000 UTC" firstStartedPulling="2025-11-27 16:14:52.308468348 +0000 UTC m=+667.939917156" lastFinishedPulling="2025-11-27 16:14:54.631679366 +0000 UTC m=+670.263128174" observedRunningTime="2025-11-27 16:14:55.872531984 +0000 UTC m=+671.503980762" watchObservedRunningTime="2025-11-27 16:14:58.856460714 +0000 UTC m=+674.487909522" Nov 27 16:14:58 crc kubenswrapper[4707]: I1127 16:14:58.859157 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-metrics-7f946cbc9-zmcdf" podStartSLOduration=2.399951731 podStartE2EDuration="8.859146479s" podCreationTimestamp="2025-11-27 16:14:50 +0000 UTC" firstStartedPulling="2025-11-27 16:14:51.567272363 +0000 UTC m=+667.198721121" lastFinishedPulling="2025-11-27 16:14:58.026467071 +0000 UTC m=+673.657915869" observedRunningTime="2025-11-27 16:14:58.856557586 +0000 UTC m=+674.488006394" watchObservedRunningTime="2025-11-27 16:14:58.859146479 +0000 UTC m=+674.490595287" Nov 27 16:15:00 crc kubenswrapper[4707]: I1127 16:15:00.179332 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29404335-sgbd8"] Nov 27 16:15:00 crc kubenswrapper[4707]: I1127 16:15:00.181066 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29404335-sgbd8" Nov 27 16:15:00 crc kubenswrapper[4707]: I1127 16:15:00.184490 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Nov 27 16:15:00 crc kubenswrapper[4707]: I1127 16:15:00.184867 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Nov 27 16:15:00 crc kubenswrapper[4707]: I1127 16:15:00.195890 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29404335-sgbd8"] Nov 27 16:15:00 crc kubenswrapper[4707]: I1127 16:15:00.260009 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v6sfz\" (UniqueName: \"kubernetes.io/projected/4de6811b-581b-4c53-a730-bc307193878c-kube-api-access-v6sfz\") pod \"collect-profiles-29404335-sgbd8\" (UID: \"4de6811b-581b-4c53-a730-bc307193878c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29404335-sgbd8" Nov 27 16:15:00 crc kubenswrapper[4707]: I1127 16:15:00.260088 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/4de6811b-581b-4c53-a730-bc307193878c-secret-volume\") pod \"collect-profiles-29404335-sgbd8\" (UID: \"4de6811b-581b-4c53-a730-bc307193878c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29404335-sgbd8" Nov 27 16:15:00 crc kubenswrapper[4707]: I1127 16:15:00.260360 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4de6811b-581b-4c53-a730-bc307193878c-config-volume\") pod \"collect-profiles-29404335-sgbd8\" (UID: \"4de6811b-581b-4c53-a730-bc307193878c\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29404335-sgbd8" Nov 27 16:15:00 crc kubenswrapper[4707]: I1127 16:15:00.361472 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4de6811b-581b-4c53-a730-bc307193878c-config-volume\") pod \"collect-profiles-29404335-sgbd8\" (UID: \"4de6811b-581b-4c53-a730-bc307193878c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29404335-sgbd8" Nov 27 16:15:00 crc kubenswrapper[4707]: I1127 16:15:00.361656 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v6sfz\" (UniqueName: \"kubernetes.io/projected/4de6811b-581b-4c53-a730-bc307193878c-kube-api-access-v6sfz\") pod \"collect-profiles-29404335-sgbd8\" (UID: \"4de6811b-581b-4c53-a730-bc307193878c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29404335-sgbd8" Nov 27 16:15:00 crc kubenswrapper[4707]: I1127 16:15:00.361699 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/4de6811b-581b-4c53-a730-bc307193878c-secret-volume\") pod \"collect-profiles-29404335-sgbd8\" (UID: \"4de6811b-581b-4c53-a730-bc307193878c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29404335-sgbd8" Nov 27 16:15:00 crc kubenswrapper[4707]: I1127 16:15:00.365056 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4de6811b-581b-4c53-a730-bc307193878c-config-volume\") pod \"collect-profiles-29404335-sgbd8\" (UID: \"4de6811b-581b-4c53-a730-bc307193878c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29404335-sgbd8" Nov 27 16:15:00 crc kubenswrapper[4707]: I1127 16:15:00.372862 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/4de6811b-581b-4c53-a730-bc307193878c-secret-volume\") pod \"collect-profiles-29404335-sgbd8\" (UID: \"4de6811b-581b-4c53-a730-bc307193878c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29404335-sgbd8" Nov 27 16:15:00 crc kubenswrapper[4707]: I1127 16:15:00.391513 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v6sfz\" (UniqueName: \"kubernetes.io/projected/4de6811b-581b-4c53-a730-bc307193878c-kube-api-access-v6sfz\") pod \"collect-profiles-29404335-sgbd8\" (UID: \"4de6811b-581b-4c53-a730-bc307193878c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29404335-sgbd8" Nov 27 16:15:00 crc kubenswrapper[4707]: I1127 16:15:00.540766 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29404335-sgbd8" Nov 27 16:15:00 crc kubenswrapper[4707]: I1127 16:15:00.810415 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29404335-sgbd8"] Nov 27 16:15:00 crc kubenswrapper[4707]: W1127 16:15:00.823623 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4de6811b_581b_4c53_a730_bc307193878c.slice/crio-d0ffb429dec04cf110a741c36d674638874bc84944e91ab3fa45ba647a0989f8 WatchSource:0}: Error finding container d0ffb429dec04cf110a741c36d674638874bc84944e91ab3fa45ba647a0989f8: Status 404 returned error can't find the container with id d0ffb429dec04cf110a741c36d674638874bc84944e91ab3fa45ba647a0989f8 Nov 27 16:15:00 crc kubenswrapper[4707]: I1127 16:15:00.845529 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29404335-sgbd8" event={"ID":"4de6811b-581b-4c53-a730-bc307193878c","Type":"ContainerStarted","Data":"d0ffb429dec04cf110a741c36d674638874bc84944e91ab3fa45ba647a0989f8"} Nov 27 16:15:01 crc 
kubenswrapper[4707]: I1127 16:15:01.179069 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-handler-svvpq" Nov 27 16:15:01 crc kubenswrapper[4707]: I1127 16:15:01.404996 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-5f89ddb749-lqqp5" Nov 27 16:15:01 crc kubenswrapper[4707]: I1127 16:15:01.405706 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-5f89ddb749-lqqp5" Nov 27 16:15:01 crc kubenswrapper[4707]: I1127 16:15:01.410430 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-5f89ddb749-lqqp5" Nov 27 16:15:01 crc kubenswrapper[4707]: I1127 16:15:01.859222 4707 generic.go:334] "Generic (PLEG): container finished" podID="4de6811b-581b-4c53-a730-bc307193878c" containerID="0450cc571faa30b7a6197a322deeea816f75a547912e9c35dab835a6a2e2632e" exitCode=0 Nov 27 16:15:01 crc kubenswrapper[4707]: I1127 16:15:01.859332 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29404335-sgbd8" event={"ID":"4de6811b-581b-4c53-a730-bc307193878c","Type":"ContainerDied","Data":"0450cc571faa30b7a6197a322deeea816f75a547912e9c35dab835a6a2e2632e"} Nov 27 16:15:01 crc kubenswrapper[4707]: I1127 16:15:01.867952 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-5f89ddb749-lqqp5" Nov 27 16:15:01 crc kubenswrapper[4707]: I1127 16:15:01.955792 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-5r8tf"] Nov 27 16:15:03 crc kubenswrapper[4707]: I1127 16:15:03.168257 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29404335-sgbd8" Nov 27 16:15:03 crc kubenswrapper[4707]: I1127 16:15:03.300615 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/4de6811b-581b-4c53-a730-bc307193878c-secret-volume\") pod \"4de6811b-581b-4c53-a730-bc307193878c\" (UID: \"4de6811b-581b-4c53-a730-bc307193878c\") " Nov 27 16:15:03 crc kubenswrapper[4707]: I1127 16:15:03.300763 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v6sfz\" (UniqueName: \"kubernetes.io/projected/4de6811b-581b-4c53-a730-bc307193878c-kube-api-access-v6sfz\") pod \"4de6811b-581b-4c53-a730-bc307193878c\" (UID: \"4de6811b-581b-4c53-a730-bc307193878c\") " Nov 27 16:15:03 crc kubenswrapper[4707]: I1127 16:15:03.300849 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4de6811b-581b-4c53-a730-bc307193878c-config-volume\") pod \"4de6811b-581b-4c53-a730-bc307193878c\" (UID: \"4de6811b-581b-4c53-a730-bc307193878c\") " Nov 27 16:15:03 crc kubenswrapper[4707]: I1127 16:15:03.302232 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4de6811b-581b-4c53-a730-bc307193878c-config-volume" (OuterVolumeSpecName: "config-volume") pod "4de6811b-581b-4c53-a730-bc307193878c" (UID: "4de6811b-581b-4c53-a730-bc307193878c"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 16:15:03 crc kubenswrapper[4707]: I1127 16:15:03.307635 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4de6811b-581b-4c53-a730-bc307193878c-kube-api-access-v6sfz" (OuterVolumeSpecName: "kube-api-access-v6sfz") pod "4de6811b-581b-4c53-a730-bc307193878c" (UID: "4de6811b-581b-4c53-a730-bc307193878c"). 
InnerVolumeSpecName "kube-api-access-v6sfz". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 16:15:03 crc kubenswrapper[4707]: I1127 16:15:03.308240 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4de6811b-581b-4c53-a730-bc307193878c-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "4de6811b-581b-4c53-a730-bc307193878c" (UID: "4de6811b-581b-4c53-a730-bc307193878c"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 16:15:03 crc kubenswrapper[4707]: I1127 16:15:03.401958 4707 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/4de6811b-581b-4c53-a730-bc307193878c-secret-volume\") on node \"crc\" DevicePath \"\"" Nov 27 16:15:03 crc kubenswrapper[4707]: I1127 16:15:03.402003 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v6sfz\" (UniqueName: \"kubernetes.io/projected/4de6811b-581b-4c53-a730-bc307193878c-kube-api-access-v6sfz\") on node \"crc\" DevicePath \"\"" Nov 27 16:15:03 crc kubenswrapper[4707]: I1127 16:15:03.402015 4707 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4de6811b-581b-4c53-a730-bc307193878c-config-volume\") on node \"crc\" DevicePath \"\"" Nov 27 16:15:03 crc kubenswrapper[4707]: I1127 16:15:03.874603 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29404335-sgbd8" Nov 27 16:15:03 crc kubenswrapper[4707]: I1127 16:15:03.874630 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29404335-sgbd8" event={"ID":"4de6811b-581b-4c53-a730-bc307193878c","Type":"ContainerDied","Data":"d0ffb429dec04cf110a741c36d674638874bc84944e91ab3fa45ba647a0989f8"} Nov 27 16:15:03 crc kubenswrapper[4707]: I1127 16:15:03.874677 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d0ffb429dec04cf110a741c36d674638874bc84944e91ab3fa45ba647a0989f8" Nov 27 16:15:11 crc kubenswrapper[4707]: I1127 16:15:11.132094 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-h6v87" Nov 27 16:15:26 crc kubenswrapper[4707]: I1127 16:15:26.335681 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f838sdgh"] Nov 27 16:15:26 crc kubenswrapper[4707]: E1127 16:15:26.336216 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4de6811b-581b-4c53-a730-bc307193878c" containerName="collect-profiles" Nov 27 16:15:26 crc kubenswrapper[4707]: I1127 16:15:26.336227 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="4de6811b-581b-4c53-a730-bc307193878c" containerName="collect-profiles" Nov 27 16:15:26 crc kubenswrapper[4707]: I1127 16:15:26.336328 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="4de6811b-581b-4c53-a730-bc307193878c" containerName="collect-profiles" Nov 27 16:15:26 crc kubenswrapper[4707]: I1127 16:15:26.336998 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f838sdgh" Nov 27 16:15:26 crc kubenswrapper[4707]: I1127 16:15:26.339252 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Nov 27 16:15:26 crc kubenswrapper[4707]: I1127 16:15:26.350728 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f838sdgh"] Nov 27 16:15:26 crc kubenswrapper[4707]: I1127 16:15:26.354654 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5n7cp\" (UniqueName: \"kubernetes.io/projected/b9fee02a-26ff-4676-843d-0159e9b2fe91-kube-api-access-5n7cp\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f838sdgh\" (UID: \"b9fee02a-26ff-4676-843d-0159e9b2fe91\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f838sdgh" Nov 27 16:15:26 crc kubenswrapper[4707]: I1127 16:15:26.354701 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b9fee02a-26ff-4676-843d-0159e9b2fe91-bundle\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f838sdgh\" (UID: \"b9fee02a-26ff-4676-843d-0159e9b2fe91\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f838sdgh" Nov 27 16:15:26 crc kubenswrapper[4707]: I1127 16:15:26.354726 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b9fee02a-26ff-4676-843d-0159e9b2fe91-util\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f838sdgh\" (UID: \"b9fee02a-26ff-4676-843d-0159e9b2fe91\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f838sdgh" Nov 27 16:15:26 crc kubenswrapper[4707]: 
I1127 16:15:26.455846 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5n7cp\" (UniqueName: \"kubernetes.io/projected/b9fee02a-26ff-4676-843d-0159e9b2fe91-kube-api-access-5n7cp\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f838sdgh\" (UID: \"b9fee02a-26ff-4676-843d-0159e9b2fe91\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f838sdgh" Nov 27 16:15:26 crc kubenswrapper[4707]: I1127 16:15:26.455916 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b9fee02a-26ff-4676-843d-0159e9b2fe91-bundle\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f838sdgh\" (UID: \"b9fee02a-26ff-4676-843d-0159e9b2fe91\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f838sdgh" Nov 27 16:15:26 crc kubenswrapper[4707]: I1127 16:15:26.455952 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b9fee02a-26ff-4676-843d-0159e9b2fe91-util\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f838sdgh\" (UID: \"b9fee02a-26ff-4676-843d-0159e9b2fe91\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f838sdgh" Nov 27 16:15:26 crc kubenswrapper[4707]: I1127 16:15:26.456407 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b9fee02a-26ff-4676-843d-0159e9b2fe91-bundle\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f838sdgh\" (UID: \"b9fee02a-26ff-4676-843d-0159e9b2fe91\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f838sdgh" Nov 27 16:15:26 crc kubenswrapper[4707]: I1127 16:15:26.456471 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/b9fee02a-26ff-4676-843d-0159e9b2fe91-util\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f838sdgh\" (UID: \"b9fee02a-26ff-4676-843d-0159e9b2fe91\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f838sdgh" Nov 27 16:15:26 crc kubenswrapper[4707]: I1127 16:15:26.472795 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5n7cp\" (UniqueName: \"kubernetes.io/projected/b9fee02a-26ff-4676-843d-0159e9b2fe91-kube-api-access-5n7cp\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f838sdgh\" (UID: \"b9fee02a-26ff-4676-843d-0159e9b2fe91\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f838sdgh" Nov 27 16:15:26 crc kubenswrapper[4707]: I1127 16:15:26.656880 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f838sdgh" Nov 27 16:15:26 crc kubenswrapper[4707]: I1127 16:15:26.933815 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f838sdgh"] Nov 27 16:15:26 crc kubenswrapper[4707]: W1127 16:15:26.942984 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb9fee02a_26ff_4676_843d_0159e9b2fe91.slice/crio-4c08eda60adc7ae7cebdf999848d8a3bb2ea78f54988e27f2c07acb2a43c0773 WatchSource:0}: Error finding container 4c08eda60adc7ae7cebdf999848d8a3bb2ea78f54988e27f2c07acb2a43c0773: Status 404 returned error can't find the container with id 4c08eda60adc7ae7cebdf999848d8a3bb2ea78f54988e27f2c07acb2a43c0773 Nov 27 16:15:27 crc kubenswrapper[4707]: I1127 16:15:27.015171 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-f9d7485db-5r8tf" podUID="36bf60c9-93cb-431f-9df1-1d3e245c49ef" containerName="console" 
containerID="cri-o://383edc5f3edba7d08269ed1ee8aeeacb7a71662692e94293f1ddaf4723a723d3" gracePeriod=15 Nov 27 16:15:27 crc kubenswrapper[4707]: I1127 16:15:27.064409 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f838sdgh" event={"ID":"b9fee02a-26ff-4676-843d-0159e9b2fe91","Type":"ContainerStarted","Data":"4c08eda60adc7ae7cebdf999848d8a3bb2ea78f54988e27f2c07acb2a43c0773"} Nov 27 16:15:27 crc kubenswrapper[4707]: I1127 16:15:27.431580 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-5r8tf_36bf60c9-93cb-431f-9df1-1d3e245c49ef/console/0.log" Nov 27 16:15:27 crc kubenswrapper[4707]: I1127 16:15:27.431945 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-5r8tf" Nov 27 16:15:27 crc kubenswrapper[4707]: I1127 16:15:27.491898 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/36bf60c9-93cb-431f-9df1-1d3e245c49ef-console-serving-cert\") pod \"36bf60c9-93cb-431f-9df1-1d3e245c49ef\" (UID: \"36bf60c9-93cb-431f-9df1-1d3e245c49ef\") " Nov 27 16:15:27 crc kubenswrapper[4707]: I1127 16:15:27.491996 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/36bf60c9-93cb-431f-9df1-1d3e245c49ef-service-ca\") pod \"36bf60c9-93cb-431f-9df1-1d3e245c49ef\" (UID: \"36bf60c9-93cb-431f-9df1-1d3e245c49ef\") " Nov 27 16:15:27 crc kubenswrapper[4707]: I1127 16:15:27.492132 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sqjhz\" (UniqueName: \"kubernetes.io/projected/36bf60c9-93cb-431f-9df1-1d3e245c49ef-kube-api-access-sqjhz\") pod \"36bf60c9-93cb-431f-9df1-1d3e245c49ef\" (UID: \"36bf60c9-93cb-431f-9df1-1d3e245c49ef\") " Nov 27 16:15:27 crc kubenswrapper[4707]: 
I1127 16:15:27.492173 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/36bf60c9-93cb-431f-9df1-1d3e245c49ef-trusted-ca-bundle\") pod \"36bf60c9-93cb-431f-9df1-1d3e245c49ef\" (UID: \"36bf60c9-93cb-431f-9df1-1d3e245c49ef\") " Nov 27 16:15:27 crc kubenswrapper[4707]: I1127 16:15:27.492230 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/36bf60c9-93cb-431f-9df1-1d3e245c49ef-console-config\") pod \"36bf60c9-93cb-431f-9df1-1d3e245c49ef\" (UID: \"36bf60c9-93cb-431f-9df1-1d3e245c49ef\") " Nov 27 16:15:27 crc kubenswrapper[4707]: I1127 16:15:27.492279 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/36bf60c9-93cb-431f-9df1-1d3e245c49ef-oauth-serving-cert\") pod \"36bf60c9-93cb-431f-9df1-1d3e245c49ef\" (UID: \"36bf60c9-93cb-431f-9df1-1d3e245c49ef\") " Nov 27 16:15:27 crc kubenswrapper[4707]: I1127 16:15:27.492315 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/36bf60c9-93cb-431f-9df1-1d3e245c49ef-console-oauth-config\") pod \"36bf60c9-93cb-431f-9df1-1d3e245c49ef\" (UID: \"36bf60c9-93cb-431f-9df1-1d3e245c49ef\") " Nov 27 16:15:27 crc kubenswrapper[4707]: I1127 16:15:27.493826 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/36bf60c9-93cb-431f-9df1-1d3e245c49ef-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "36bf60c9-93cb-431f-9df1-1d3e245c49ef" (UID: "36bf60c9-93cb-431f-9df1-1d3e245c49ef"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 16:15:27 crc kubenswrapper[4707]: I1127 16:15:27.494405 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/36bf60c9-93cb-431f-9df1-1d3e245c49ef-service-ca" (OuterVolumeSpecName: "service-ca") pod "36bf60c9-93cb-431f-9df1-1d3e245c49ef" (UID: "36bf60c9-93cb-431f-9df1-1d3e245c49ef"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 16:15:27 crc kubenswrapper[4707]: I1127 16:15:27.499737 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/36bf60c9-93cb-431f-9df1-1d3e245c49ef-kube-api-access-sqjhz" (OuterVolumeSpecName: "kube-api-access-sqjhz") pod "36bf60c9-93cb-431f-9df1-1d3e245c49ef" (UID: "36bf60c9-93cb-431f-9df1-1d3e245c49ef"). InnerVolumeSpecName "kube-api-access-sqjhz". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 16:15:27 crc kubenswrapper[4707]: I1127 16:15:27.500167 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/36bf60c9-93cb-431f-9df1-1d3e245c49ef-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "36bf60c9-93cb-431f-9df1-1d3e245c49ef" (UID: "36bf60c9-93cb-431f-9df1-1d3e245c49ef"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 16:15:27 crc kubenswrapper[4707]: I1127 16:15:27.500293 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/36bf60c9-93cb-431f-9df1-1d3e245c49ef-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "36bf60c9-93cb-431f-9df1-1d3e245c49ef" (UID: "36bf60c9-93cb-431f-9df1-1d3e245c49ef"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 16:15:27 crc kubenswrapper[4707]: I1127 16:15:27.500344 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/36bf60c9-93cb-431f-9df1-1d3e245c49ef-console-config" (OuterVolumeSpecName: "console-config") pod "36bf60c9-93cb-431f-9df1-1d3e245c49ef" (UID: "36bf60c9-93cb-431f-9df1-1d3e245c49ef"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 16:15:27 crc kubenswrapper[4707]: I1127 16:15:27.500866 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/36bf60c9-93cb-431f-9df1-1d3e245c49ef-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "36bf60c9-93cb-431f-9df1-1d3e245c49ef" (UID: "36bf60c9-93cb-431f-9df1-1d3e245c49ef"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 16:15:27 crc kubenswrapper[4707]: I1127 16:15:27.594261 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sqjhz\" (UniqueName: \"kubernetes.io/projected/36bf60c9-93cb-431f-9df1-1d3e245c49ef-kube-api-access-sqjhz\") on node \"crc\" DevicePath \"\"" Nov 27 16:15:27 crc kubenswrapper[4707]: I1127 16:15:27.594339 4707 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/36bf60c9-93cb-431f-9df1-1d3e245c49ef-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 27 16:15:27 crc kubenswrapper[4707]: I1127 16:15:27.594360 4707 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/36bf60c9-93cb-431f-9df1-1d3e245c49ef-console-config\") on node \"crc\" DevicePath \"\"" Nov 27 16:15:27 crc kubenswrapper[4707]: I1127 16:15:27.594403 4707 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: 
\"kubernetes.io/configmap/36bf60c9-93cb-431f-9df1-1d3e245c49ef-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 27 16:15:27 crc kubenswrapper[4707]: I1127 16:15:27.594424 4707 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/36bf60c9-93cb-431f-9df1-1d3e245c49ef-console-oauth-config\") on node \"crc\" DevicePath \"\"" Nov 27 16:15:27 crc kubenswrapper[4707]: I1127 16:15:27.594442 4707 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/36bf60c9-93cb-431f-9df1-1d3e245c49ef-console-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 27 16:15:27 crc kubenswrapper[4707]: I1127 16:15:27.594458 4707 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/36bf60c9-93cb-431f-9df1-1d3e245c49ef-service-ca\") on node \"crc\" DevicePath \"\"" Nov 27 16:15:28 crc kubenswrapper[4707]: I1127 16:15:28.075939 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-5r8tf_36bf60c9-93cb-431f-9df1-1d3e245c49ef/console/0.log" Nov 27 16:15:28 crc kubenswrapper[4707]: I1127 16:15:28.076007 4707 generic.go:334] "Generic (PLEG): container finished" podID="36bf60c9-93cb-431f-9df1-1d3e245c49ef" containerID="383edc5f3edba7d08269ed1ee8aeeacb7a71662692e94293f1ddaf4723a723d3" exitCode=2 Nov 27 16:15:28 crc kubenswrapper[4707]: I1127 16:15:28.076082 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-5r8tf" event={"ID":"36bf60c9-93cb-431f-9df1-1d3e245c49ef","Type":"ContainerDied","Data":"383edc5f3edba7d08269ed1ee8aeeacb7a71662692e94293f1ddaf4723a723d3"} Nov 27 16:15:28 crc kubenswrapper[4707]: I1127 16:15:28.076154 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-5r8tf" 
event={"ID":"36bf60c9-93cb-431f-9df1-1d3e245c49ef","Type":"ContainerDied","Data":"dfe0d7511f231419b5c61009b684627e9e719a6dbddfe2c4f90121d653871f84"} Nov 27 16:15:28 crc kubenswrapper[4707]: I1127 16:15:28.076185 4707 scope.go:117] "RemoveContainer" containerID="383edc5f3edba7d08269ed1ee8aeeacb7a71662692e94293f1ddaf4723a723d3" Nov 27 16:15:28 crc kubenswrapper[4707]: I1127 16:15:28.076336 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-5r8tf" Nov 27 16:15:28 crc kubenswrapper[4707]: I1127 16:15:28.090978 4707 generic.go:334] "Generic (PLEG): container finished" podID="b9fee02a-26ff-4676-843d-0159e9b2fe91" containerID="c59c133499fd5ad959ea6b0ebe0302505800b3f0fe0ab0a04568f699320edee5" exitCode=0 Nov 27 16:15:28 crc kubenswrapper[4707]: I1127 16:15:28.091052 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f838sdgh" event={"ID":"b9fee02a-26ff-4676-843d-0159e9b2fe91","Type":"ContainerDied","Data":"c59c133499fd5ad959ea6b0ebe0302505800b3f0fe0ab0a04568f699320edee5"} Nov 27 16:15:28 crc kubenswrapper[4707]: I1127 16:15:28.109443 4707 scope.go:117] "RemoveContainer" containerID="383edc5f3edba7d08269ed1ee8aeeacb7a71662692e94293f1ddaf4723a723d3" Nov 27 16:15:28 crc kubenswrapper[4707]: E1127 16:15:28.110596 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"383edc5f3edba7d08269ed1ee8aeeacb7a71662692e94293f1ddaf4723a723d3\": container with ID starting with 383edc5f3edba7d08269ed1ee8aeeacb7a71662692e94293f1ddaf4723a723d3 not found: ID does not exist" containerID="383edc5f3edba7d08269ed1ee8aeeacb7a71662692e94293f1ddaf4723a723d3" Nov 27 16:15:28 crc kubenswrapper[4707]: I1127 16:15:28.110639 4707 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"383edc5f3edba7d08269ed1ee8aeeacb7a71662692e94293f1ddaf4723a723d3"} err="failed to get container status \"383edc5f3edba7d08269ed1ee8aeeacb7a71662692e94293f1ddaf4723a723d3\": rpc error: code = NotFound desc = could not find container \"383edc5f3edba7d08269ed1ee8aeeacb7a71662692e94293f1ddaf4723a723d3\": container with ID starting with 383edc5f3edba7d08269ed1ee8aeeacb7a71662692e94293f1ddaf4723a723d3 not found: ID does not exist" Nov 27 16:15:28 crc kubenswrapper[4707]: I1127 16:15:28.137918 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-5r8tf"] Nov 27 16:15:28 crc kubenswrapper[4707]: I1127 16:15:28.146314 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-f9d7485db-5r8tf"] Nov 27 16:15:29 crc kubenswrapper[4707]: I1127 16:15:29.208139 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="36bf60c9-93cb-431f-9df1-1d3e245c49ef" path="/var/lib/kubelet/pods/36bf60c9-93cb-431f-9df1-1d3e245c49ef/volumes" Nov 27 16:15:32 crc kubenswrapper[4707]: I1127 16:15:32.129017 4707 generic.go:334] "Generic (PLEG): container finished" podID="b9fee02a-26ff-4676-843d-0159e9b2fe91" containerID="faf923db492471a840a994fdd7691304a8257f30a6da225045e9d881dd77ecc2" exitCode=0 Nov 27 16:15:32 crc kubenswrapper[4707]: I1127 16:15:32.129158 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f838sdgh" event={"ID":"b9fee02a-26ff-4676-843d-0159e9b2fe91","Type":"ContainerDied","Data":"faf923db492471a840a994fdd7691304a8257f30a6da225045e9d881dd77ecc2"} Nov 27 16:15:33 crc kubenswrapper[4707]: I1127 16:15:33.144579 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f838sdgh" 
event={"ID":"b9fee02a-26ff-4676-843d-0159e9b2fe91","Type":"ContainerStarted","Data":"1985eb2d73fdad51ca0fa79e0f29a0712c785d774979bf704acb01fd4f1db616"} Nov 27 16:15:33 crc kubenswrapper[4707]: I1127 16:15:33.624539 4707 patch_prober.go:28] interesting pod/machine-config-daemon-c995m container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 27 16:15:33 crc kubenswrapper[4707]: I1127 16:15:33.624619 4707 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-c995m" podUID="a83beb0d-8dd1-434a-ace2-933f98e3956f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 27 16:15:34 crc kubenswrapper[4707]: I1127 16:15:34.156017 4707 generic.go:334] "Generic (PLEG): container finished" podID="b9fee02a-26ff-4676-843d-0159e9b2fe91" containerID="1985eb2d73fdad51ca0fa79e0f29a0712c785d774979bf704acb01fd4f1db616" exitCode=0 Nov 27 16:15:34 crc kubenswrapper[4707]: I1127 16:15:34.156089 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f838sdgh" event={"ID":"b9fee02a-26ff-4676-843d-0159e9b2fe91","Type":"ContainerDied","Data":"1985eb2d73fdad51ca0fa79e0f29a0712c785d774979bf704acb01fd4f1db616"} Nov 27 16:15:35 crc kubenswrapper[4707]: I1127 16:15:35.472339 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f838sdgh" Nov 27 16:15:35 crc kubenswrapper[4707]: I1127 16:15:35.506341 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b9fee02a-26ff-4676-843d-0159e9b2fe91-util\") pod \"b9fee02a-26ff-4676-843d-0159e9b2fe91\" (UID: \"b9fee02a-26ff-4676-843d-0159e9b2fe91\") " Nov 27 16:15:35 crc kubenswrapper[4707]: I1127 16:15:35.506481 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5n7cp\" (UniqueName: \"kubernetes.io/projected/b9fee02a-26ff-4676-843d-0159e9b2fe91-kube-api-access-5n7cp\") pod \"b9fee02a-26ff-4676-843d-0159e9b2fe91\" (UID: \"b9fee02a-26ff-4676-843d-0159e9b2fe91\") " Nov 27 16:15:35 crc kubenswrapper[4707]: I1127 16:15:35.506549 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b9fee02a-26ff-4676-843d-0159e9b2fe91-bundle\") pod \"b9fee02a-26ff-4676-843d-0159e9b2fe91\" (UID: \"b9fee02a-26ff-4676-843d-0159e9b2fe91\") " Nov 27 16:15:35 crc kubenswrapper[4707]: I1127 16:15:35.510033 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b9fee02a-26ff-4676-843d-0159e9b2fe91-bundle" (OuterVolumeSpecName: "bundle") pod "b9fee02a-26ff-4676-843d-0159e9b2fe91" (UID: "b9fee02a-26ff-4676-843d-0159e9b2fe91"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 27 16:15:35 crc kubenswrapper[4707]: I1127 16:15:35.517598 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b9fee02a-26ff-4676-843d-0159e9b2fe91-kube-api-access-5n7cp" (OuterVolumeSpecName: "kube-api-access-5n7cp") pod "b9fee02a-26ff-4676-843d-0159e9b2fe91" (UID: "b9fee02a-26ff-4676-843d-0159e9b2fe91"). InnerVolumeSpecName "kube-api-access-5n7cp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 16:15:35 crc kubenswrapper[4707]: I1127 16:15:35.530116 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b9fee02a-26ff-4676-843d-0159e9b2fe91-util" (OuterVolumeSpecName: "util") pod "b9fee02a-26ff-4676-843d-0159e9b2fe91" (UID: "b9fee02a-26ff-4676-843d-0159e9b2fe91"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 27 16:15:35 crc kubenswrapper[4707]: I1127 16:15:35.608066 4707 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b9fee02a-26ff-4676-843d-0159e9b2fe91-util\") on node \"crc\" DevicePath \"\"" Nov 27 16:15:35 crc kubenswrapper[4707]: I1127 16:15:35.608097 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5n7cp\" (UniqueName: \"kubernetes.io/projected/b9fee02a-26ff-4676-843d-0159e9b2fe91-kube-api-access-5n7cp\") on node \"crc\" DevicePath \"\"" Nov 27 16:15:35 crc kubenswrapper[4707]: I1127 16:15:35.608111 4707 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b9fee02a-26ff-4676-843d-0159e9b2fe91-bundle\") on node \"crc\" DevicePath \"\"" Nov 27 16:15:36 crc kubenswrapper[4707]: I1127 16:15:36.199264 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f838sdgh" event={"ID":"b9fee02a-26ff-4676-843d-0159e9b2fe91","Type":"ContainerDied","Data":"4c08eda60adc7ae7cebdf999848d8a3bb2ea78f54988e27f2c07acb2a43c0773"} Nov 27 16:15:36 crc kubenswrapper[4707]: I1127 16:15:36.199330 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f838sdgh" Nov 27 16:15:36 crc kubenswrapper[4707]: I1127 16:15:36.199343 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4c08eda60adc7ae7cebdf999848d8a3bb2ea78f54988e27f2c07acb2a43c0773" Nov 27 16:15:45 crc kubenswrapper[4707]: I1127 16:15:45.365510 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-controller-manager-669b86894d-bdzj6"] Nov 27 16:15:45 crc kubenswrapper[4707]: E1127 16:15:45.366455 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="36bf60c9-93cb-431f-9df1-1d3e245c49ef" containerName="console" Nov 27 16:15:45 crc kubenswrapper[4707]: I1127 16:15:45.366484 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="36bf60c9-93cb-431f-9df1-1d3e245c49ef" containerName="console" Nov 27 16:15:45 crc kubenswrapper[4707]: E1127 16:15:45.366506 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b9fee02a-26ff-4676-843d-0159e9b2fe91" containerName="pull" Nov 27 16:15:45 crc kubenswrapper[4707]: I1127 16:15:45.366513 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="b9fee02a-26ff-4676-843d-0159e9b2fe91" containerName="pull" Nov 27 16:15:45 crc kubenswrapper[4707]: E1127 16:15:45.366537 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b9fee02a-26ff-4676-843d-0159e9b2fe91" containerName="extract" Nov 27 16:15:45 crc kubenswrapper[4707]: I1127 16:15:45.366543 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="b9fee02a-26ff-4676-843d-0159e9b2fe91" containerName="extract" Nov 27 16:15:45 crc kubenswrapper[4707]: E1127 16:15:45.366562 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b9fee02a-26ff-4676-843d-0159e9b2fe91" containerName="util" Nov 27 16:15:45 crc kubenswrapper[4707]: I1127 16:15:45.366567 4707 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="b9fee02a-26ff-4676-843d-0159e9b2fe91" containerName="util" Nov 27 16:15:45 crc kubenswrapper[4707]: I1127 16:15:45.366809 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="b9fee02a-26ff-4676-843d-0159e9b2fe91" containerName="extract" Nov 27 16:15:45 crc kubenswrapper[4707]: I1127 16:15:45.366825 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="36bf60c9-93cb-431f-9df1-1d3e245c49ef" containerName="console" Nov 27 16:15:45 crc kubenswrapper[4707]: I1127 16:15:45.367669 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-669b86894d-bdzj6" Nov 27 16:15:45 crc kubenswrapper[4707]: I1127 16:15:45.378145 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"openshift-service-ca.crt" Nov 27 16:15:45 crc kubenswrapper[4707]: I1127 16:15:45.378332 4707 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"manager-account-dockercfg-gm8m2" Nov 27 16:15:45 crc kubenswrapper[4707]: I1127 16:15:45.378398 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"kube-root-ca.crt" Nov 27 16:15:45 crc kubenswrapper[4707]: I1127 16:15:45.378577 4707 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-cert" Nov 27 16:15:45 crc kubenswrapper[4707]: I1127 16:15:45.378673 4707 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-controller-manager-service-cert" Nov 27 16:15:45 crc kubenswrapper[4707]: I1127 16:15:45.396536 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-669b86894d-bdzj6"] Nov 27 16:15:45 crc kubenswrapper[4707]: I1127 16:15:45.437474 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: 
\"kubernetes.io/secret/f3f84476-2107-4226-be6c-cdcc6380a697-webhook-cert\") pod \"metallb-operator-controller-manager-669b86894d-bdzj6\" (UID: \"f3f84476-2107-4226-be6c-cdcc6380a697\") " pod="metallb-system/metallb-operator-controller-manager-669b86894d-bdzj6" Nov 27 16:15:45 crc kubenswrapper[4707]: I1127 16:15:45.437546 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9nkb8\" (UniqueName: \"kubernetes.io/projected/f3f84476-2107-4226-be6c-cdcc6380a697-kube-api-access-9nkb8\") pod \"metallb-operator-controller-manager-669b86894d-bdzj6\" (UID: \"f3f84476-2107-4226-be6c-cdcc6380a697\") " pod="metallb-system/metallb-operator-controller-manager-669b86894d-bdzj6" Nov 27 16:15:45 crc kubenswrapper[4707]: I1127 16:15:45.437590 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/f3f84476-2107-4226-be6c-cdcc6380a697-apiservice-cert\") pod \"metallb-operator-controller-manager-669b86894d-bdzj6\" (UID: \"f3f84476-2107-4226-be6c-cdcc6380a697\") " pod="metallb-system/metallb-operator-controller-manager-669b86894d-bdzj6" Nov 27 16:15:45 crc kubenswrapper[4707]: I1127 16:15:45.538594 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9nkb8\" (UniqueName: \"kubernetes.io/projected/f3f84476-2107-4226-be6c-cdcc6380a697-kube-api-access-9nkb8\") pod \"metallb-operator-controller-manager-669b86894d-bdzj6\" (UID: \"f3f84476-2107-4226-be6c-cdcc6380a697\") " pod="metallb-system/metallb-operator-controller-manager-669b86894d-bdzj6" Nov 27 16:15:45 crc kubenswrapper[4707]: I1127 16:15:45.538656 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/f3f84476-2107-4226-be6c-cdcc6380a697-apiservice-cert\") pod \"metallb-operator-controller-manager-669b86894d-bdzj6\" (UID: 
\"f3f84476-2107-4226-be6c-cdcc6380a697\") " pod="metallb-system/metallb-operator-controller-manager-669b86894d-bdzj6" Nov 27 16:15:45 crc kubenswrapper[4707]: I1127 16:15:45.538689 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/f3f84476-2107-4226-be6c-cdcc6380a697-webhook-cert\") pod \"metallb-operator-controller-manager-669b86894d-bdzj6\" (UID: \"f3f84476-2107-4226-be6c-cdcc6380a697\") " pod="metallb-system/metallb-operator-controller-manager-669b86894d-bdzj6" Nov 27 16:15:45 crc kubenswrapper[4707]: I1127 16:15:45.545000 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/f3f84476-2107-4226-be6c-cdcc6380a697-webhook-cert\") pod \"metallb-operator-controller-manager-669b86894d-bdzj6\" (UID: \"f3f84476-2107-4226-be6c-cdcc6380a697\") " pod="metallb-system/metallb-operator-controller-manager-669b86894d-bdzj6" Nov 27 16:15:45 crc kubenswrapper[4707]: I1127 16:15:45.557153 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/f3f84476-2107-4226-be6c-cdcc6380a697-apiservice-cert\") pod \"metallb-operator-controller-manager-669b86894d-bdzj6\" (UID: \"f3f84476-2107-4226-be6c-cdcc6380a697\") " pod="metallb-system/metallb-operator-controller-manager-669b86894d-bdzj6" Nov 27 16:15:45 crc kubenswrapper[4707]: I1127 16:15:45.562040 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9nkb8\" (UniqueName: \"kubernetes.io/projected/f3f84476-2107-4226-be6c-cdcc6380a697-kube-api-access-9nkb8\") pod \"metallb-operator-controller-manager-669b86894d-bdzj6\" (UID: \"f3f84476-2107-4226-be6c-cdcc6380a697\") " pod="metallb-system/metallb-operator-controller-manager-669b86894d-bdzj6" Nov 27 16:15:45 crc kubenswrapper[4707]: I1127 16:15:45.702710 4707 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["metallb-system/metallb-operator-webhook-server-7bddc5f445-p9w4t"] Nov 27 16:15:45 crc kubenswrapper[4707]: I1127 16:15:45.703391 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-7bddc5f445-p9w4t" Nov 27 16:15:45 crc kubenswrapper[4707]: I1127 16:15:45.704933 4707 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-service-cert" Nov 27 16:15:45 crc kubenswrapper[4707]: I1127 16:15:45.705140 4707 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Nov 27 16:15:45 crc kubenswrapper[4707]: I1127 16:15:45.705975 4707 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-dockercfg-5jbcq" Nov 27 16:15:45 crc kubenswrapper[4707]: I1127 16:15:45.715754 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-7bddc5f445-p9w4t"] Nov 27 16:15:45 crc kubenswrapper[4707]: I1127 16:15:45.739938 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-669b86894d-bdzj6" Nov 27 16:15:45 crc kubenswrapper[4707]: I1127 16:15:45.741103 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/3226579b-b878-4494-83ca-cc7288089a7a-apiservice-cert\") pod \"metallb-operator-webhook-server-7bddc5f445-p9w4t\" (UID: \"3226579b-b878-4494-83ca-cc7288089a7a\") " pod="metallb-system/metallb-operator-webhook-server-7bddc5f445-p9w4t" Nov 27 16:15:45 crc kubenswrapper[4707]: I1127 16:15:45.741165 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s7h6b\" (UniqueName: \"kubernetes.io/projected/3226579b-b878-4494-83ca-cc7288089a7a-kube-api-access-s7h6b\") pod \"metallb-operator-webhook-server-7bddc5f445-p9w4t\" (UID: \"3226579b-b878-4494-83ca-cc7288089a7a\") " pod="metallb-system/metallb-operator-webhook-server-7bddc5f445-p9w4t" Nov 27 16:15:45 crc kubenswrapper[4707]: I1127 16:15:45.741307 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/3226579b-b878-4494-83ca-cc7288089a7a-webhook-cert\") pod \"metallb-operator-webhook-server-7bddc5f445-p9w4t\" (UID: \"3226579b-b878-4494-83ca-cc7288089a7a\") " pod="metallb-system/metallb-operator-webhook-server-7bddc5f445-p9w4t" Nov 27 16:15:45 crc kubenswrapper[4707]: I1127 16:15:45.842188 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/3226579b-b878-4494-83ca-cc7288089a7a-apiservice-cert\") pod \"metallb-operator-webhook-server-7bddc5f445-p9w4t\" (UID: \"3226579b-b878-4494-83ca-cc7288089a7a\") " pod="metallb-system/metallb-operator-webhook-server-7bddc5f445-p9w4t" Nov 27 16:15:45 crc kubenswrapper[4707]: I1127 16:15:45.842228 4707 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-s7h6b\" (UniqueName: \"kubernetes.io/projected/3226579b-b878-4494-83ca-cc7288089a7a-kube-api-access-s7h6b\") pod \"metallb-operator-webhook-server-7bddc5f445-p9w4t\" (UID: \"3226579b-b878-4494-83ca-cc7288089a7a\") " pod="metallb-system/metallb-operator-webhook-server-7bddc5f445-p9w4t" Nov 27 16:15:45 crc kubenswrapper[4707]: I1127 16:15:45.842310 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/3226579b-b878-4494-83ca-cc7288089a7a-webhook-cert\") pod \"metallb-operator-webhook-server-7bddc5f445-p9w4t\" (UID: \"3226579b-b878-4494-83ca-cc7288089a7a\") " pod="metallb-system/metallb-operator-webhook-server-7bddc5f445-p9w4t" Nov 27 16:15:45 crc kubenswrapper[4707]: I1127 16:15:45.845465 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/3226579b-b878-4494-83ca-cc7288089a7a-webhook-cert\") pod \"metallb-operator-webhook-server-7bddc5f445-p9w4t\" (UID: \"3226579b-b878-4494-83ca-cc7288089a7a\") " pod="metallb-system/metallb-operator-webhook-server-7bddc5f445-p9w4t" Nov 27 16:15:45 crc kubenswrapper[4707]: I1127 16:15:45.845945 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/3226579b-b878-4494-83ca-cc7288089a7a-apiservice-cert\") pod \"metallb-operator-webhook-server-7bddc5f445-p9w4t\" (UID: \"3226579b-b878-4494-83ca-cc7288089a7a\") " pod="metallb-system/metallb-operator-webhook-server-7bddc5f445-p9w4t" Nov 27 16:15:45 crc kubenswrapper[4707]: I1127 16:15:45.879622 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s7h6b\" (UniqueName: \"kubernetes.io/projected/3226579b-b878-4494-83ca-cc7288089a7a-kube-api-access-s7h6b\") pod \"metallb-operator-webhook-server-7bddc5f445-p9w4t\" (UID: \"3226579b-b878-4494-83ca-cc7288089a7a\") " 
pod="metallb-system/metallb-operator-webhook-server-7bddc5f445-p9w4t" Nov 27 16:15:45 crc kubenswrapper[4707]: I1127 16:15:45.924338 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-669b86894d-bdzj6"] Nov 27 16:15:45 crc kubenswrapper[4707]: W1127 16:15:45.936410 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf3f84476_2107_4226_be6c_cdcc6380a697.slice/crio-4869453fb5162b4d58a357cbc42ec1cf9c4c11e7cfbded5a2edc8a218cb55cff WatchSource:0}: Error finding container 4869453fb5162b4d58a357cbc42ec1cf9c4c11e7cfbded5a2edc8a218cb55cff: Status 404 returned error can't find the container with id 4869453fb5162b4d58a357cbc42ec1cf9c4c11e7cfbded5a2edc8a218cb55cff Nov 27 16:15:46 crc kubenswrapper[4707]: I1127 16:15:46.014992 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-7bddc5f445-p9w4t" Nov 27 16:15:46 crc kubenswrapper[4707]: I1127 16:15:46.263691 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-7bddc5f445-p9w4t"] Nov 27 16:15:46 crc kubenswrapper[4707]: W1127 16:15:46.267139 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3226579b_b878_4494_83ca_cc7288089a7a.slice/crio-d0fd464296a944e3bb9108cf5c963fa046b633e149afffc03b012095bfb6d5d7 WatchSource:0}: Error finding container d0fd464296a944e3bb9108cf5c963fa046b633e149afffc03b012095bfb6d5d7: Status 404 returned error can't find the container with id d0fd464296a944e3bb9108cf5c963fa046b633e149afffc03b012095bfb6d5d7 Nov 27 16:15:46 crc kubenswrapper[4707]: I1127 16:15:46.267173 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-669b86894d-bdzj6" 
event={"ID":"f3f84476-2107-4226-be6c-cdcc6380a697","Type":"ContainerStarted","Data":"4869453fb5162b4d58a357cbc42ec1cf9c4c11e7cfbded5a2edc8a218cb55cff"} Nov 27 16:15:47 crc kubenswrapper[4707]: I1127 16:15:47.275873 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-7bddc5f445-p9w4t" event={"ID":"3226579b-b878-4494-83ca-cc7288089a7a","Type":"ContainerStarted","Data":"d0fd464296a944e3bb9108cf5c963fa046b633e149afffc03b012095bfb6d5d7"} Nov 27 16:15:49 crc kubenswrapper[4707]: I1127 16:15:49.290136 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-669b86894d-bdzj6" event={"ID":"f3f84476-2107-4226-be6c-cdcc6380a697","Type":"ContainerStarted","Data":"c72db8a79afb95c51beadb0ccf7ec57c042770ff121942c7e9fb42c4b22ecf95"} Nov 27 16:15:49 crc kubenswrapper[4707]: I1127 16:15:49.290466 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-controller-manager-669b86894d-bdzj6" Nov 27 16:15:49 crc kubenswrapper[4707]: I1127 16:15:49.313005 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-controller-manager-669b86894d-bdzj6" podStartSLOduration=1.383844309 podStartE2EDuration="4.31299058s" podCreationTimestamp="2025-11-27 16:15:45 +0000 UTC" firstStartedPulling="2025-11-27 16:15:45.938700873 +0000 UTC m=+721.570149641" lastFinishedPulling="2025-11-27 16:15:48.867847144 +0000 UTC m=+724.499295912" observedRunningTime="2025-11-27 16:15:49.308754666 +0000 UTC m=+724.940203444" watchObservedRunningTime="2025-11-27 16:15:49.31299058 +0000 UTC m=+724.944439348" Nov 27 16:15:51 crc kubenswrapper[4707]: I1127 16:15:51.304153 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-7bddc5f445-p9w4t" 
event={"ID":"3226579b-b878-4494-83ca-cc7288089a7a","Type":"ContainerStarted","Data":"68e269ddece5610157e9a5fffcbe2f1ea5834b62e1e9ba1e002fbfaddab8c05c"} Nov 27 16:15:51 crc kubenswrapper[4707]: I1127 16:15:51.305723 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-webhook-server-7bddc5f445-p9w4t" Nov 27 16:15:51 crc kubenswrapper[4707]: I1127 16:15:51.327711 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-webhook-server-7bddc5f445-p9w4t" podStartSLOduration=1.699374348 podStartE2EDuration="6.327687318s" podCreationTimestamp="2025-11-27 16:15:45 +0000 UTC" firstStartedPulling="2025-11-27 16:15:46.272410008 +0000 UTC m=+721.903858786" lastFinishedPulling="2025-11-27 16:15:50.900722988 +0000 UTC m=+726.532171756" observedRunningTime="2025-11-27 16:15:51.325779621 +0000 UTC m=+726.957228399" watchObservedRunningTime="2025-11-27 16:15:51.327687318 +0000 UTC m=+726.959136106" Nov 27 16:16:03 crc kubenswrapper[4707]: I1127 16:16:03.624029 4707 patch_prober.go:28] interesting pod/machine-config-daemon-c995m container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 27 16:16:03 crc kubenswrapper[4707]: I1127 16:16:03.625509 4707 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-c995m" podUID="a83beb0d-8dd1-434a-ace2-933f98e3956f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 27 16:16:06 crc kubenswrapper[4707]: I1127 16:16:06.019058 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-webhook-server-7bddc5f445-p9w4t" Nov 27 16:16:20 crc kubenswrapper[4707]: 
I1127 16:16:20.991099 4707 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Nov 27 16:16:25 crc kubenswrapper[4707]: I1127 16:16:25.744185 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-controller-manager-669b86894d-bdzj6" Nov 27 16:16:26 crc kubenswrapper[4707]: I1127 16:16:26.535690 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-l5lzc"] Nov 27 16:16:26 crc kubenswrapper[4707]: I1127 16:16:26.539507 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-l5lzc" Nov 27 16:16:26 crc kubenswrapper[4707]: I1127 16:16:26.548591 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"frr-startup" Nov 27 16:16:26 crc kubenswrapper[4707]: I1127 16:16:26.549391 4707 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-certs-secret" Nov 27 16:16:26 crc kubenswrapper[4707]: I1127 16:16:26.549605 4707 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-daemon-dockercfg-h9p4k" Nov 27 16:16:26 crc kubenswrapper[4707]: I1127 16:16:26.555630 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-webhook-server-7fcb986d4-krwxc"] Nov 27 16:16:26 crc kubenswrapper[4707]: I1127 16:16:26.559066 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-krwxc" Nov 27 16:16:26 crc kubenswrapper[4707]: I1127 16:16:26.564340 4707 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-webhook-server-cert" Nov 27 16:16:26 crc kubenswrapper[4707]: I1127 16:16:26.583636 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-7fcb986d4-krwxc"] Nov 27 16:16:26 crc kubenswrapper[4707]: I1127 16:16:26.640676 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/speaker-m8h9q"] Nov 27 16:16:26 crc kubenswrapper[4707]: I1127 16:16:26.641637 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-m8h9q" Nov 27 16:16:26 crc kubenswrapper[4707]: I1127 16:16:26.645865 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"metallb-excludel2" Nov 27 16:16:26 crc kubenswrapper[4707]: I1127 16:16:26.646042 4707 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-dockercfg-t7k8b" Nov 27 16:16:26 crc kubenswrapper[4707]: I1127 16:16:26.646157 4707 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-memberlist" Nov 27 16:16:26 crc kubenswrapper[4707]: I1127 16:16:26.646307 4707 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-certs-secret" Nov 27 16:16:26 crc kubenswrapper[4707]: I1127 16:16:26.659164 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/controller-f8648f98b-fcnd7"] Nov 27 16:16:26 crc kubenswrapper[4707]: I1127 16:16:26.660161 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/controller-f8648f98b-fcnd7" Nov 27 16:16:26 crc kubenswrapper[4707]: I1127 16:16:26.662191 4707 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-certs-secret" Nov 27 16:16:26 crc kubenswrapper[4707]: I1127 16:16:26.670479 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-f8648f98b-fcnd7"] Nov 27 16:16:26 crc kubenswrapper[4707]: I1127 16:16:26.670967 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0dfd8369-d85e-41e1-8990-123e0de5e7d4-cert\") pod \"frr-k8s-webhook-server-7fcb986d4-krwxc\" (UID: \"0dfd8369-d85e-41e1-8990-123e0de5e7d4\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-krwxc" Nov 27 16:16:26 crc kubenswrapper[4707]: I1127 16:16:26.671000 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/f892e6ab-ac16-4945-b724-ddca8efed111-frr-conf\") pod \"frr-k8s-l5lzc\" (UID: \"f892e6ab-ac16-4945-b724-ddca8efed111\") " pod="metallb-system/frr-k8s-l5lzc" Nov 27 16:16:26 crc kubenswrapper[4707]: I1127 16:16:26.671029 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/f892e6ab-ac16-4945-b724-ddca8efed111-reloader\") pod \"frr-k8s-l5lzc\" (UID: \"f892e6ab-ac16-4945-b724-ddca8efed111\") " pod="metallb-system/frr-k8s-l5lzc" Nov 27 16:16:26 crc kubenswrapper[4707]: I1127 16:16:26.671056 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/f892e6ab-ac16-4945-b724-ddca8efed111-frr-sockets\") pod \"frr-k8s-l5lzc\" (UID: \"f892e6ab-ac16-4945-b724-ddca8efed111\") " pod="metallb-system/frr-k8s-l5lzc" Nov 27 16:16:26 crc kubenswrapper[4707]: I1127 
16:16:26.671162 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/f892e6ab-ac16-4945-b724-ddca8efed111-metrics\") pod \"frr-k8s-l5lzc\" (UID: \"f892e6ab-ac16-4945-b724-ddca8efed111\") " pod="metallb-system/frr-k8s-l5lzc" Nov 27 16:16:26 crc kubenswrapper[4707]: I1127 16:16:26.671191 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/f892e6ab-ac16-4945-b724-ddca8efed111-frr-startup\") pod \"frr-k8s-l5lzc\" (UID: \"f892e6ab-ac16-4945-b724-ddca8efed111\") " pod="metallb-system/frr-k8s-l5lzc" Nov 27 16:16:26 crc kubenswrapper[4707]: I1127 16:16:26.671216 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nnsj5\" (UniqueName: \"kubernetes.io/projected/f892e6ab-ac16-4945-b724-ddca8efed111-kube-api-access-nnsj5\") pod \"frr-k8s-l5lzc\" (UID: \"f892e6ab-ac16-4945-b724-ddca8efed111\") " pod="metallb-system/frr-k8s-l5lzc" Nov 27 16:16:26 crc kubenswrapper[4707]: I1127 16:16:26.671320 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f892e6ab-ac16-4945-b724-ddca8efed111-metrics-certs\") pod \"frr-k8s-l5lzc\" (UID: \"f892e6ab-ac16-4945-b724-ddca8efed111\") " pod="metallb-system/frr-k8s-l5lzc" Nov 27 16:16:26 crc kubenswrapper[4707]: I1127 16:16:26.671428 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rftxj\" (UniqueName: \"kubernetes.io/projected/0dfd8369-d85e-41e1-8990-123e0de5e7d4-kube-api-access-rftxj\") pod \"frr-k8s-webhook-server-7fcb986d4-krwxc\" (UID: \"0dfd8369-d85e-41e1-8990-123e0de5e7d4\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-krwxc" Nov 27 16:16:26 crc kubenswrapper[4707]: I1127 16:16:26.772926 4707 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0dfd8369-d85e-41e1-8990-123e0de5e7d4-cert\") pod \"frr-k8s-webhook-server-7fcb986d4-krwxc\" (UID: \"0dfd8369-d85e-41e1-8990-123e0de5e7d4\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-krwxc" Nov 27 16:16:26 crc kubenswrapper[4707]: I1127 16:16:26.772988 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/f892e6ab-ac16-4945-b724-ddca8efed111-frr-conf\") pod \"frr-k8s-l5lzc\" (UID: \"f892e6ab-ac16-4945-b724-ddca8efed111\") " pod="metallb-system/frr-k8s-l5lzc" Nov 27 16:16:26 crc kubenswrapper[4707]: I1127 16:16:26.773025 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8c251e95-c920-4c32-b7be-95367e79b151-metrics-certs\") pod \"speaker-m8h9q\" (UID: \"8c251e95-c920-4c32-b7be-95367e79b151\") " pod="metallb-system/speaker-m8h9q" Nov 27 16:16:26 crc kubenswrapper[4707]: E1127 16:16:26.773076 4707 secret.go:188] Couldn't get secret metallb-system/frr-k8s-webhook-server-cert: secret "frr-k8s-webhook-server-cert" not found Nov 27 16:16:26 crc kubenswrapper[4707]: E1127 16:16:26.773136 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0dfd8369-d85e-41e1-8990-123e0de5e7d4-cert podName:0dfd8369-d85e-41e1-8990-123e0de5e7d4 nodeName:}" failed. No retries permitted until 2025-11-27 16:16:27.273118961 +0000 UTC m=+762.904567729 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/0dfd8369-d85e-41e1-8990-123e0de5e7d4-cert") pod "frr-k8s-webhook-server-7fcb986d4-krwxc" (UID: "0dfd8369-d85e-41e1-8990-123e0de5e7d4") : secret "frr-k8s-webhook-server-cert" not found Nov 27 16:16:26 crc kubenswrapper[4707]: I1127 16:16:26.773079 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/f892e6ab-ac16-4945-b724-ddca8efed111-reloader\") pod \"frr-k8s-l5lzc\" (UID: \"f892e6ab-ac16-4945-b724-ddca8efed111\") " pod="metallb-system/frr-k8s-l5lzc" Nov 27 16:16:26 crc kubenswrapper[4707]: I1127 16:16:26.773257 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ld8h6\" (UniqueName: \"kubernetes.io/projected/8c251e95-c920-4c32-b7be-95367e79b151-kube-api-access-ld8h6\") pod \"speaker-m8h9q\" (UID: \"8c251e95-c920-4c32-b7be-95367e79b151\") " pod="metallb-system/speaker-m8h9q" Nov 27 16:16:26 crc kubenswrapper[4707]: I1127 16:16:26.773325 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-92pps\" (UniqueName: \"kubernetes.io/projected/0b63951d-0fad-479e-9a1d-e3978d75f5db-kube-api-access-92pps\") pod \"controller-f8648f98b-fcnd7\" (UID: \"0b63951d-0fad-479e-9a1d-e3978d75f5db\") " pod="metallb-system/controller-f8648f98b-fcnd7" Nov 27 16:16:26 crc kubenswrapper[4707]: I1127 16:16:26.773360 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/8c251e95-c920-4c32-b7be-95367e79b151-metallb-excludel2\") pod \"speaker-m8h9q\" (UID: \"8c251e95-c920-4c32-b7be-95367e79b151\") " pod="metallb-system/speaker-m8h9q" Nov 27 16:16:26 crc kubenswrapper[4707]: I1127 16:16:26.773450 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-sockets\" 
(UniqueName: \"kubernetes.io/empty-dir/f892e6ab-ac16-4945-b724-ddca8efed111-frr-sockets\") pod \"frr-k8s-l5lzc\" (UID: \"f892e6ab-ac16-4945-b724-ddca8efed111\") " pod="metallb-system/frr-k8s-l5lzc" Nov 27 16:16:26 crc kubenswrapper[4707]: I1127 16:16:26.773508 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/f892e6ab-ac16-4945-b724-ddca8efed111-metrics\") pod \"frr-k8s-l5lzc\" (UID: \"f892e6ab-ac16-4945-b724-ddca8efed111\") " pod="metallb-system/frr-k8s-l5lzc" Nov 27 16:16:26 crc kubenswrapper[4707]: I1127 16:16:26.773519 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/f892e6ab-ac16-4945-b724-ddca8efed111-frr-conf\") pod \"frr-k8s-l5lzc\" (UID: \"f892e6ab-ac16-4945-b724-ddca8efed111\") " pod="metallb-system/frr-k8s-l5lzc" Nov 27 16:16:26 crc kubenswrapper[4707]: I1127 16:16:26.773528 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/f892e6ab-ac16-4945-b724-ddca8efed111-frr-startup\") pod \"frr-k8s-l5lzc\" (UID: \"f892e6ab-ac16-4945-b724-ddca8efed111\") " pod="metallb-system/frr-k8s-l5lzc" Nov 27 16:16:26 crc kubenswrapper[4707]: I1127 16:16:26.773551 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nnsj5\" (UniqueName: \"kubernetes.io/projected/f892e6ab-ac16-4945-b724-ddca8efed111-kube-api-access-nnsj5\") pod \"frr-k8s-l5lzc\" (UID: \"f892e6ab-ac16-4945-b724-ddca8efed111\") " pod="metallb-system/frr-k8s-l5lzc" Nov 27 16:16:26 crc kubenswrapper[4707]: I1127 16:16:26.773610 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0b63951d-0fad-479e-9a1d-e3978d75f5db-metrics-certs\") pod \"controller-f8648f98b-fcnd7\" (UID: \"0b63951d-0fad-479e-9a1d-e3978d75f5db\") " 
pod="metallb-system/controller-f8648f98b-fcnd7" Nov 27 16:16:26 crc kubenswrapper[4707]: I1127 16:16:26.773617 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/f892e6ab-ac16-4945-b724-ddca8efed111-reloader\") pod \"frr-k8s-l5lzc\" (UID: \"f892e6ab-ac16-4945-b724-ddca8efed111\") " pod="metallb-system/frr-k8s-l5lzc" Nov 27 16:16:26 crc kubenswrapper[4707]: I1127 16:16:26.773643 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f892e6ab-ac16-4945-b724-ddca8efed111-metrics-certs\") pod \"frr-k8s-l5lzc\" (UID: \"f892e6ab-ac16-4945-b724-ddca8efed111\") " pod="metallb-system/frr-k8s-l5lzc" Nov 27 16:16:26 crc kubenswrapper[4707]: I1127 16:16:26.773669 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/8c251e95-c920-4c32-b7be-95367e79b151-memberlist\") pod \"speaker-m8h9q\" (UID: \"8c251e95-c920-4c32-b7be-95367e79b151\") " pod="metallb-system/speaker-m8h9q" Nov 27 16:16:26 crc kubenswrapper[4707]: I1127 16:16:26.773712 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rftxj\" (UniqueName: \"kubernetes.io/projected/0dfd8369-d85e-41e1-8990-123e0de5e7d4-kube-api-access-rftxj\") pod \"frr-k8s-webhook-server-7fcb986d4-krwxc\" (UID: \"0dfd8369-d85e-41e1-8990-123e0de5e7d4\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-krwxc" Nov 27 16:16:26 crc kubenswrapper[4707]: I1127 16:16:26.773763 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0b63951d-0fad-479e-9a1d-e3978d75f5db-cert\") pod \"controller-f8648f98b-fcnd7\" (UID: \"0b63951d-0fad-479e-9a1d-e3978d75f5db\") " pod="metallb-system/controller-f8648f98b-fcnd7" Nov 27 16:16:26 crc kubenswrapper[4707]: I1127 16:16:26.773822 
4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/f892e6ab-ac16-4945-b724-ddca8efed111-frr-sockets\") pod \"frr-k8s-l5lzc\" (UID: \"f892e6ab-ac16-4945-b724-ddca8efed111\") " pod="metallb-system/frr-k8s-l5lzc" Nov 27 16:16:26 crc kubenswrapper[4707]: I1127 16:16:26.773839 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/f892e6ab-ac16-4945-b724-ddca8efed111-metrics\") pod \"frr-k8s-l5lzc\" (UID: \"f892e6ab-ac16-4945-b724-ddca8efed111\") " pod="metallb-system/frr-k8s-l5lzc" Nov 27 16:16:26 crc kubenswrapper[4707]: I1127 16:16:26.774267 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/f892e6ab-ac16-4945-b724-ddca8efed111-frr-startup\") pod \"frr-k8s-l5lzc\" (UID: \"f892e6ab-ac16-4945-b724-ddca8efed111\") " pod="metallb-system/frr-k8s-l5lzc" Nov 27 16:16:26 crc kubenswrapper[4707]: I1127 16:16:26.791354 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f892e6ab-ac16-4945-b724-ddca8efed111-metrics-certs\") pod \"frr-k8s-l5lzc\" (UID: \"f892e6ab-ac16-4945-b724-ddca8efed111\") " pod="metallb-system/frr-k8s-l5lzc" Nov 27 16:16:26 crc kubenswrapper[4707]: I1127 16:16:26.794262 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rftxj\" (UniqueName: \"kubernetes.io/projected/0dfd8369-d85e-41e1-8990-123e0de5e7d4-kube-api-access-rftxj\") pod \"frr-k8s-webhook-server-7fcb986d4-krwxc\" (UID: \"0dfd8369-d85e-41e1-8990-123e0de5e7d4\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-krwxc" Nov 27 16:16:26 crc kubenswrapper[4707]: I1127 16:16:26.794685 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nnsj5\" (UniqueName: 
\"kubernetes.io/projected/f892e6ab-ac16-4945-b724-ddca8efed111-kube-api-access-nnsj5\") pod \"frr-k8s-l5lzc\" (UID: \"f892e6ab-ac16-4945-b724-ddca8efed111\") " pod="metallb-system/frr-k8s-l5lzc" Nov 27 16:16:26 crc kubenswrapper[4707]: I1127 16:16:26.861310 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-l5lzc" Nov 27 16:16:26 crc kubenswrapper[4707]: I1127 16:16:26.875358 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0b63951d-0fad-479e-9a1d-e3978d75f5db-cert\") pod \"controller-f8648f98b-fcnd7\" (UID: \"0b63951d-0fad-479e-9a1d-e3978d75f5db\") " pod="metallb-system/controller-f8648f98b-fcnd7" Nov 27 16:16:26 crc kubenswrapper[4707]: I1127 16:16:26.875432 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8c251e95-c920-4c32-b7be-95367e79b151-metrics-certs\") pod \"speaker-m8h9q\" (UID: \"8c251e95-c920-4c32-b7be-95367e79b151\") " pod="metallb-system/speaker-m8h9q" Nov 27 16:16:26 crc kubenswrapper[4707]: I1127 16:16:26.875469 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ld8h6\" (UniqueName: \"kubernetes.io/projected/8c251e95-c920-4c32-b7be-95367e79b151-kube-api-access-ld8h6\") pod \"speaker-m8h9q\" (UID: \"8c251e95-c920-4c32-b7be-95367e79b151\") " pod="metallb-system/speaker-m8h9q" Nov 27 16:16:26 crc kubenswrapper[4707]: I1127 16:16:26.875499 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-92pps\" (UniqueName: \"kubernetes.io/projected/0b63951d-0fad-479e-9a1d-e3978d75f5db-kube-api-access-92pps\") pod \"controller-f8648f98b-fcnd7\" (UID: \"0b63951d-0fad-479e-9a1d-e3978d75f5db\") " pod="metallb-system/controller-f8648f98b-fcnd7" Nov 27 16:16:26 crc kubenswrapper[4707]: I1127 16:16:26.875520 4707 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/8c251e95-c920-4c32-b7be-95367e79b151-metallb-excludel2\") pod \"speaker-m8h9q\" (UID: \"8c251e95-c920-4c32-b7be-95367e79b151\") " pod="metallb-system/speaker-m8h9q" Nov 27 16:16:26 crc kubenswrapper[4707]: I1127 16:16:26.875574 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0b63951d-0fad-479e-9a1d-e3978d75f5db-metrics-certs\") pod \"controller-f8648f98b-fcnd7\" (UID: \"0b63951d-0fad-479e-9a1d-e3978d75f5db\") " pod="metallb-system/controller-f8648f98b-fcnd7" Nov 27 16:16:26 crc kubenswrapper[4707]: I1127 16:16:26.875600 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/8c251e95-c920-4c32-b7be-95367e79b151-memberlist\") pod \"speaker-m8h9q\" (UID: \"8c251e95-c920-4c32-b7be-95367e79b151\") " pod="metallb-system/speaker-m8h9q" Nov 27 16:16:26 crc kubenswrapper[4707]: E1127 16:16:26.875711 4707 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Nov 27 16:16:26 crc kubenswrapper[4707]: E1127 16:16:26.875762 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8c251e95-c920-4c32-b7be-95367e79b151-memberlist podName:8c251e95-c920-4c32-b7be-95367e79b151 nodeName:}" failed. No retries permitted until 2025-11-27 16:16:27.375746096 +0000 UTC m=+763.007194864 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/8c251e95-c920-4c32-b7be-95367e79b151-memberlist") pod "speaker-m8h9q" (UID: "8c251e95-c920-4c32-b7be-95367e79b151") : secret "metallb-memberlist" not found Nov 27 16:16:26 crc kubenswrapper[4707]: I1127 16:16:26.877481 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/8c251e95-c920-4c32-b7be-95367e79b151-metallb-excludel2\") pod \"speaker-m8h9q\" (UID: \"8c251e95-c920-4c32-b7be-95367e79b151\") " pod="metallb-system/speaker-m8h9q" Nov 27 16:16:26 crc kubenswrapper[4707]: I1127 16:16:26.880155 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0b63951d-0fad-479e-9a1d-e3978d75f5db-cert\") pod \"controller-f8648f98b-fcnd7\" (UID: \"0b63951d-0fad-479e-9a1d-e3978d75f5db\") " pod="metallb-system/controller-f8648f98b-fcnd7" Nov 27 16:16:26 crc kubenswrapper[4707]: I1127 16:16:26.880441 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8c251e95-c920-4c32-b7be-95367e79b151-metrics-certs\") pod \"speaker-m8h9q\" (UID: \"8c251e95-c920-4c32-b7be-95367e79b151\") " pod="metallb-system/speaker-m8h9q" Nov 27 16:16:26 crc kubenswrapper[4707]: I1127 16:16:26.880696 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0b63951d-0fad-479e-9a1d-e3978d75f5db-metrics-certs\") pod \"controller-f8648f98b-fcnd7\" (UID: \"0b63951d-0fad-479e-9a1d-e3978d75f5db\") " pod="metallb-system/controller-f8648f98b-fcnd7" Nov 27 16:16:26 crc kubenswrapper[4707]: I1127 16:16:26.898237 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ld8h6\" (UniqueName: \"kubernetes.io/projected/8c251e95-c920-4c32-b7be-95367e79b151-kube-api-access-ld8h6\") pod \"speaker-m8h9q\" (UID: 
\"8c251e95-c920-4c32-b7be-95367e79b151\") " pod="metallb-system/speaker-m8h9q" Nov 27 16:16:26 crc kubenswrapper[4707]: I1127 16:16:26.908961 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-92pps\" (UniqueName: \"kubernetes.io/projected/0b63951d-0fad-479e-9a1d-e3978d75f5db-kube-api-access-92pps\") pod \"controller-f8648f98b-fcnd7\" (UID: \"0b63951d-0fad-479e-9a1d-e3978d75f5db\") " pod="metallb-system/controller-f8648f98b-fcnd7" Nov 27 16:16:26 crc kubenswrapper[4707]: I1127 16:16:26.976684 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/controller-f8648f98b-fcnd7" Nov 27 16:16:27 crc kubenswrapper[4707]: I1127 16:16:27.220339 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-f8648f98b-fcnd7"] Nov 27 16:16:27 crc kubenswrapper[4707]: W1127 16:16:27.228152 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0b63951d_0fad_479e_9a1d_e3978d75f5db.slice/crio-8204911f3de10233233786281269a851e36c4fb49c5f02675867c9999009e95b WatchSource:0}: Error finding container 8204911f3de10233233786281269a851e36c4fb49c5f02675867c9999009e95b: Status 404 returned error can't find the container with id 8204911f3de10233233786281269a851e36c4fb49c5f02675867c9999009e95b Nov 27 16:16:27 crc kubenswrapper[4707]: I1127 16:16:27.281764 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0dfd8369-d85e-41e1-8990-123e0de5e7d4-cert\") pod \"frr-k8s-webhook-server-7fcb986d4-krwxc\" (UID: \"0dfd8369-d85e-41e1-8990-123e0de5e7d4\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-krwxc" Nov 27 16:16:27 crc kubenswrapper[4707]: I1127 16:16:27.289326 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0dfd8369-d85e-41e1-8990-123e0de5e7d4-cert\") pod 
\"frr-k8s-webhook-server-7fcb986d4-krwxc\" (UID: \"0dfd8369-d85e-41e1-8990-123e0de5e7d4\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-krwxc" Nov 27 16:16:27 crc kubenswrapper[4707]: I1127 16:16:27.382950 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/8c251e95-c920-4c32-b7be-95367e79b151-memberlist\") pod \"speaker-m8h9q\" (UID: \"8c251e95-c920-4c32-b7be-95367e79b151\") " pod="metallb-system/speaker-m8h9q" Nov 27 16:16:27 crc kubenswrapper[4707]: E1127 16:16:27.383221 4707 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Nov 27 16:16:27 crc kubenswrapper[4707]: E1127 16:16:27.383804 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8c251e95-c920-4c32-b7be-95367e79b151-memberlist podName:8c251e95-c920-4c32-b7be-95367e79b151 nodeName:}" failed. No retries permitted until 2025-11-27 16:16:28.383649458 +0000 UTC m=+764.015098266 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/8c251e95-c920-4c32-b7be-95367e79b151-memberlist") pod "speaker-m8h9q" (UID: "8c251e95-c920-4c32-b7be-95367e79b151") : secret "metallb-memberlist" not found Nov 27 16:16:27 crc kubenswrapper[4707]: I1127 16:16:27.478960 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-krwxc" Nov 27 16:16:27 crc kubenswrapper[4707]: I1127 16:16:27.565637 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-f8648f98b-fcnd7" event={"ID":"0b63951d-0fad-479e-9a1d-e3978d75f5db","Type":"ContainerStarted","Data":"c25990a18e23147dde06aff2dddd004ab6a4f9fe718cca8ecef365f6cad9e716"} Nov 27 16:16:27 crc kubenswrapper[4707]: I1127 16:16:27.565679 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-f8648f98b-fcnd7" event={"ID":"0b63951d-0fad-479e-9a1d-e3978d75f5db","Type":"ContainerStarted","Data":"c7ca788b9ca2f16fe2c07181a4184819df1b4387211225a1c89b17ee297ebd5e"} Nov 27 16:16:27 crc kubenswrapper[4707]: I1127 16:16:27.565691 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-f8648f98b-fcnd7" event={"ID":"0b63951d-0fad-479e-9a1d-e3978d75f5db","Type":"ContainerStarted","Data":"8204911f3de10233233786281269a851e36c4fb49c5f02675867c9999009e95b"} Nov 27 16:16:27 crc kubenswrapper[4707]: I1127 16:16:27.565735 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/controller-f8648f98b-fcnd7" Nov 27 16:16:27 crc kubenswrapper[4707]: I1127 16:16:27.568156 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-l5lzc" event={"ID":"f892e6ab-ac16-4945-b724-ddca8efed111","Type":"ContainerStarted","Data":"596c41f89154802feea5805c3613496a1bd645c48b41fad16e6c01afe321d370"} Nov 27 16:16:27 crc kubenswrapper[4707]: I1127 16:16:27.601214 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/controller-f8648f98b-fcnd7" podStartSLOduration=1.601196178 podStartE2EDuration="1.601196178s" podCreationTimestamp="2025-11-27 16:16:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-27 16:16:27.596253847 +0000 UTC 
m=+763.227702625" watchObservedRunningTime="2025-11-27 16:16:27.601196178 +0000 UTC m=+763.232644956" Nov 27 16:16:27 crc kubenswrapper[4707]: I1127 16:16:27.737789 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-7fcb986d4-krwxc"] Nov 27 16:16:28 crc kubenswrapper[4707]: I1127 16:16:28.399626 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/8c251e95-c920-4c32-b7be-95367e79b151-memberlist\") pod \"speaker-m8h9q\" (UID: \"8c251e95-c920-4c32-b7be-95367e79b151\") " pod="metallb-system/speaker-m8h9q" Nov 27 16:16:28 crc kubenswrapper[4707]: I1127 16:16:28.403922 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/8c251e95-c920-4c32-b7be-95367e79b151-memberlist\") pod \"speaker-m8h9q\" (UID: \"8c251e95-c920-4c32-b7be-95367e79b151\") " pod="metallb-system/speaker-m8h9q" Nov 27 16:16:28 crc kubenswrapper[4707]: I1127 16:16:28.459409 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/speaker-m8h9q" Nov 27 16:16:28 crc kubenswrapper[4707]: W1127 16:16:28.479107 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8c251e95_c920_4c32_b7be_95367e79b151.slice/crio-81cc38853d71dfd32c76fbcafceea49e72784d6987f591995d492806812db034 WatchSource:0}: Error finding container 81cc38853d71dfd32c76fbcafceea49e72784d6987f591995d492806812db034: Status 404 returned error can't find the container with id 81cc38853d71dfd32c76fbcafceea49e72784d6987f591995d492806812db034 Nov 27 16:16:28 crc kubenswrapper[4707]: I1127 16:16:28.575331 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-krwxc" event={"ID":"0dfd8369-d85e-41e1-8990-123e0de5e7d4","Type":"ContainerStarted","Data":"6b0fbb514638782019abe357358b23035d08459a80bd5d7330e076544805f16f"} Nov 27 16:16:28 crc kubenswrapper[4707]: I1127 16:16:28.576422 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-m8h9q" event={"ID":"8c251e95-c920-4c32-b7be-95367e79b151","Type":"ContainerStarted","Data":"81cc38853d71dfd32c76fbcafceea49e72784d6987f591995d492806812db034"} Nov 27 16:16:29 crc kubenswrapper[4707]: I1127 16:16:29.582998 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-m8h9q" event={"ID":"8c251e95-c920-4c32-b7be-95367e79b151","Type":"ContainerStarted","Data":"a7bb4bdf821d417ef3accd4b1e55ea221715005ac408027a1045f61a98241510"} Nov 27 16:16:29 crc kubenswrapper[4707]: I1127 16:16:29.583244 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-m8h9q" event={"ID":"8c251e95-c920-4c32-b7be-95367e79b151","Type":"ContainerStarted","Data":"c16cf3fd0b5618eef9664be8a2950208a94eb64c6a1f03b02937cb507c39a629"} Nov 27 16:16:29 crc kubenswrapper[4707]: I1127 16:16:29.583280 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="metallb-system/speaker-m8h9q" Nov 27 16:16:29 crc kubenswrapper[4707]: I1127 16:16:29.600353 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/speaker-m8h9q" podStartSLOduration=3.600337896 podStartE2EDuration="3.600337896s" podCreationTimestamp="2025-11-27 16:16:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-27 16:16:29.596362368 +0000 UTC m=+765.227811136" watchObservedRunningTime="2025-11-27 16:16:29.600337896 +0000 UTC m=+765.231786664" Nov 27 16:16:33 crc kubenswrapper[4707]: I1127 16:16:33.628078 4707 patch_prober.go:28] interesting pod/machine-config-daemon-c995m container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 27 16:16:33 crc kubenswrapper[4707]: I1127 16:16:33.628466 4707 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-c995m" podUID="a83beb0d-8dd1-434a-ace2-933f98e3956f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 27 16:16:33 crc kubenswrapper[4707]: I1127 16:16:33.628529 4707 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-c995m" Nov 27 16:16:33 crc kubenswrapper[4707]: I1127 16:16:33.629224 4707 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"596e4dd118b814e12fbaccbb4655af72f02c2baaf706c8463cf822841fdaa729"} pod="openshift-machine-config-operator/machine-config-daemon-c995m" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Nov 27 16:16:33 
crc kubenswrapper[4707]: I1127 16:16:33.629308 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-c995m" podUID="a83beb0d-8dd1-434a-ace2-933f98e3956f" containerName="machine-config-daemon" containerID="cri-o://596e4dd118b814e12fbaccbb4655af72f02c2baaf706c8463cf822841fdaa729" gracePeriod=600 Nov 27 16:16:34 crc kubenswrapper[4707]: I1127 16:16:34.621918 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-krwxc" event={"ID":"0dfd8369-d85e-41e1-8990-123e0de5e7d4","Type":"ContainerStarted","Data":"59e48c064f2839f4a37561ae7a0fae623b3f72c5fa1921f046ad94eea9993f76"} Nov 27 16:16:34 crc kubenswrapper[4707]: I1127 16:16:34.622517 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-krwxc" Nov 27 16:16:34 crc kubenswrapper[4707]: I1127 16:16:34.624516 4707 generic.go:334] "Generic (PLEG): container finished" podID="f892e6ab-ac16-4945-b724-ddca8efed111" containerID="63280d51d979c1f599ef519928c3d23d3ef26ad75177d1d857da2695bd2c59a2" exitCode=0 Nov 27 16:16:34 crc kubenswrapper[4707]: I1127 16:16:34.624583 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-l5lzc" event={"ID":"f892e6ab-ac16-4945-b724-ddca8efed111","Type":"ContainerDied","Data":"63280d51d979c1f599ef519928c3d23d3ef26ad75177d1d857da2695bd2c59a2"} Nov 27 16:16:34 crc kubenswrapper[4707]: I1127 16:16:34.628275 4707 generic.go:334] "Generic (PLEG): container finished" podID="a83beb0d-8dd1-434a-ace2-933f98e3956f" containerID="596e4dd118b814e12fbaccbb4655af72f02c2baaf706c8463cf822841fdaa729" exitCode=0 Nov 27 16:16:34 crc kubenswrapper[4707]: I1127 16:16:34.628325 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-c995m" 
event={"ID":"a83beb0d-8dd1-434a-ace2-933f98e3956f","Type":"ContainerDied","Data":"596e4dd118b814e12fbaccbb4655af72f02c2baaf706c8463cf822841fdaa729"} Nov 27 16:16:34 crc kubenswrapper[4707]: I1127 16:16:34.628388 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-c995m" event={"ID":"a83beb0d-8dd1-434a-ace2-933f98e3956f","Type":"ContainerStarted","Data":"8aa0ec55553e2030c537e5b750cef10ee68d7cb3cbe0ae6f95e1e594b84cdc37"} Nov 27 16:16:34 crc kubenswrapper[4707]: I1127 16:16:34.628417 4707 scope.go:117] "RemoveContainer" containerID="3756a089c2c481a4f6e191b772b187f75506926ec4370423509a5187881583f9" Nov 27 16:16:34 crc kubenswrapper[4707]: I1127 16:16:34.641074 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-krwxc" podStartSLOduration=2.2353210629999998 podStartE2EDuration="8.641051558s" podCreationTimestamp="2025-11-27 16:16:26 +0000 UTC" firstStartedPulling="2025-11-27 16:16:27.751822268 +0000 UTC m=+763.383271046" lastFinishedPulling="2025-11-27 16:16:34.157552723 +0000 UTC m=+769.789001541" observedRunningTime="2025-11-27 16:16:34.639186542 +0000 UTC m=+770.270635350" watchObservedRunningTime="2025-11-27 16:16:34.641051558 +0000 UTC m=+770.272500336" Nov 27 16:16:35 crc kubenswrapper[4707]: I1127 16:16:35.641860 4707 generic.go:334] "Generic (PLEG): container finished" podID="f892e6ab-ac16-4945-b724-ddca8efed111" containerID="a5a56f27cfe24de3c7ae58f49082396053c3719cf99eea1ccc372eee429fff8a" exitCode=0 Nov 27 16:16:35 crc kubenswrapper[4707]: I1127 16:16:35.642939 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-l5lzc" event={"ID":"f892e6ab-ac16-4945-b724-ddca8efed111","Type":"ContainerDied","Data":"a5a56f27cfe24de3c7ae58f49082396053c3719cf99eea1ccc372eee429fff8a"} Nov 27 16:16:36 crc kubenswrapper[4707]: I1127 16:16:36.652614 4707 generic.go:334] "Generic (PLEG): container finished" 
podID="f892e6ab-ac16-4945-b724-ddca8efed111" containerID="d6999f7f6e5ecdc11e48f0353b2ac7457291f8426316981cb827a3460c7c3430" exitCode=0 Nov 27 16:16:36 crc kubenswrapper[4707]: I1127 16:16:36.652752 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-l5lzc" event={"ID":"f892e6ab-ac16-4945-b724-ddca8efed111","Type":"ContainerDied","Data":"d6999f7f6e5ecdc11e48f0353b2ac7457291f8426316981cb827a3460c7c3430"} Nov 27 16:16:37 crc kubenswrapper[4707]: I1127 16:16:37.665748 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-l5lzc" event={"ID":"f892e6ab-ac16-4945-b724-ddca8efed111","Type":"ContainerStarted","Data":"847f9de6a8513d74ffcb04076b72d1d5ceca58b31971a4791debf46965ffd83d"} Nov 27 16:16:37 crc kubenswrapper[4707]: I1127 16:16:37.666163 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-l5lzc" event={"ID":"f892e6ab-ac16-4945-b724-ddca8efed111","Type":"ContainerStarted","Data":"a488389865e2377596b49d0eb762eb99011ee07cfe0859122a0dfd2b91a1ef71"} Nov 27 16:16:37 crc kubenswrapper[4707]: I1127 16:16:37.666178 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-l5lzc" event={"ID":"f892e6ab-ac16-4945-b724-ddca8efed111","Type":"ContainerStarted","Data":"73939c11600a20549b717c1c74835588dacc217e2eabb18e28d03c4d3537da16"} Nov 27 16:16:37 crc kubenswrapper[4707]: I1127 16:16:37.666189 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-l5lzc" event={"ID":"f892e6ab-ac16-4945-b724-ddca8efed111","Type":"ContainerStarted","Data":"405b6293b793262535dba8c159c075515923691b9ed8e87940041a4bec1b80a4"} Nov 27 16:16:37 crc kubenswrapper[4707]: I1127 16:16:37.666200 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-l5lzc" event={"ID":"f892e6ab-ac16-4945-b724-ddca8efed111","Type":"ContainerStarted","Data":"8d1a3f91d38c74ab48331a351e288e81d7659d644627f1f503d58d2bc30fcc9b"} Nov 27 16:16:38 crc 
kubenswrapper[4707]: I1127 16:16:38.465028 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/speaker-m8h9q" Nov 27 16:16:38 crc kubenswrapper[4707]: I1127 16:16:38.692034 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-l5lzc" event={"ID":"f892e6ab-ac16-4945-b724-ddca8efed111","Type":"ContainerStarted","Data":"21862481fc0e9213de5516cc879824b03ef471648166dff6f7a98dbc452879fd"} Nov 27 16:16:38 crc kubenswrapper[4707]: I1127 16:16:38.693239 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-l5lzc" Nov 27 16:16:41 crc kubenswrapper[4707]: I1127 16:16:41.670431 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-l5lzc" podStartSLOduration=8.516618933 podStartE2EDuration="15.670403604s" podCreationTimestamp="2025-11-27 16:16:26 +0000 UTC" firstStartedPulling="2025-11-27 16:16:27.024670584 +0000 UTC m=+762.656119352" lastFinishedPulling="2025-11-27 16:16:34.178455215 +0000 UTC m=+769.809904023" observedRunningTime="2025-11-27 16:16:38.732392582 +0000 UTC m=+774.363841390" watchObservedRunningTime="2025-11-27 16:16:41.670403604 +0000 UTC m=+777.301852412" Nov 27 16:16:41 crc kubenswrapper[4707]: I1127 16:16:41.678771 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-4cdj4"] Nov 27 16:16:41 crc kubenswrapper[4707]: I1127 16:16:41.679939 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-4cdj4" Nov 27 16:16:41 crc kubenswrapper[4707]: I1127 16:16:41.683658 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-index-dockercfg-6d8x5" Nov 27 16:16:41 crc kubenswrapper[4707]: I1127 16:16:41.683927 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"openshift-service-ca.crt" Nov 27 16:16:41 crc kubenswrapper[4707]: I1127 16:16:41.683989 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"kube-root-ca.crt" Nov 27 16:16:41 crc kubenswrapper[4707]: I1127 16:16:41.697098 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-4cdj4"] Nov 27 16:16:41 crc kubenswrapper[4707]: I1127 16:16:41.812471 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p7vr7\" (UniqueName: \"kubernetes.io/projected/5db6fc93-164b-45b9-a14a-701822aaf6a8-kube-api-access-p7vr7\") pod \"openstack-operator-index-4cdj4\" (UID: \"5db6fc93-164b-45b9-a14a-701822aaf6a8\") " pod="openstack-operators/openstack-operator-index-4cdj4" Nov 27 16:16:41 crc kubenswrapper[4707]: I1127 16:16:41.862548 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="metallb-system/frr-k8s-l5lzc" Nov 27 16:16:41 crc kubenswrapper[4707]: I1127 16:16:41.904797 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="metallb-system/frr-k8s-l5lzc" Nov 27 16:16:41 crc kubenswrapper[4707]: I1127 16:16:41.914002 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p7vr7\" (UniqueName: \"kubernetes.io/projected/5db6fc93-164b-45b9-a14a-701822aaf6a8-kube-api-access-p7vr7\") pod \"openstack-operator-index-4cdj4\" (UID: \"5db6fc93-164b-45b9-a14a-701822aaf6a8\") " 
pod="openstack-operators/openstack-operator-index-4cdj4" Nov 27 16:16:41 crc kubenswrapper[4707]: I1127 16:16:41.945239 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p7vr7\" (UniqueName: \"kubernetes.io/projected/5db6fc93-164b-45b9-a14a-701822aaf6a8-kube-api-access-p7vr7\") pod \"openstack-operator-index-4cdj4\" (UID: \"5db6fc93-164b-45b9-a14a-701822aaf6a8\") " pod="openstack-operators/openstack-operator-index-4cdj4" Nov 27 16:16:42 crc kubenswrapper[4707]: I1127 16:16:42.002295 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-4cdj4" Nov 27 16:16:42 crc kubenswrapper[4707]: I1127 16:16:42.539595 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-4cdj4"] Nov 27 16:16:42 crc kubenswrapper[4707]: W1127 16:16:42.543524 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5db6fc93_164b_45b9_a14a_701822aaf6a8.slice/crio-80cca7404b7ba5a133790a3bd6d1f28ef4869c8192e8998be1882330faac620e WatchSource:0}: Error finding container 80cca7404b7ba5a133790a3bd6d1f28ef4869c8192e8998be1882330faac620e: Status 404 returned error can't find the container with id 80cca7404b7ba5a133790a3bd6d1f28ef4869c8192e8998be1882330faac620e Nov 27 16:16:42 crc kubenswrapper[4707]: I1127 16:16:42.720732 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-4cdj4" event={"ID":"5db6fc93-164b-45b9-a14a-701822aaf6a8","Type":"ContainerStarted","Data":"80cca7404b7ba5a133790a3bd6d1f28ef4869c8192e8998be1882330faac620e"} Nov 27 16:16:45 crc kubenswrapper[4707]: I1127 16:16:45.031872 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-4cdj4"] Nov 27 16:16:45 crc kubenswrapper[4707]: I1127 16:16:45.651791 4707 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack-operators/openstack-operator-index-qdn2b"] Nov 27 16:16:45 crc kubenswrapper[4707]: I1127 16:16:45.653196 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-qdn2b" Nov 27 16:16:45 crc kubenswrapper[4707]: I1127 16:16:45.661798 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-qdn2b"] Nov 27 16:16:45 crc kubenswrapper[4707]: I1127 16:16:45.766936 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-762bn\" (UniqueName: \"kubernetes.io/projected/4b76b0a5-e84e-427d-9cb4-4fac9969a278-kube-api-access-762bn\") pod \"openstack-operator-index-qdn2b\" (UID: \"4b76b0a5-e84e-427d-9cb4-4fac9969a278\") " pod="openstack-operators/openstack-operator-index-qdn2b" Nov 27 16:16:45 crc kubenswrapper[4707]: I1127 16:16:45.867882 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-762bn\" (UniqueName: \"kubernetes.io/projected/4b76b0a5-e84e-427d-9cb4-4fac9969a278-kube-api-access-762bn\") pod \"openstack-operator-index-qdn2b\" (UID: \"4b76b0a5-e84e-427d-9cb4-4fac9969a278\") " pod="openstack-operators/openstack-operator-index-qdn2b" Nov 27 16:16:45 crc kubenswrapper[4707]: I1127 16:16:45.910081 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-762bn\" (UniqueName: \"kubernetes.io/projected/4b76b0a5-e84e-427d-9cb4-4fac9969a278-kube-api-access-762bn\") pod \"openstack-operator-index-qdn2b\" (UID: \"4b76b0a5-e84e-427d-9cb4-4fac9969a278\") " pod="openstack-operators/openstack-operator-index-qdn2b" Nov 27 16:16:45 crc kubenswrapper[4707]: I1127 16:16:45.988057 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-qdn2b" Nov 27 16:16:46 crc kubenswrapper[4707]: I1127 16:16:46.865317 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-l5lzc" Nov 27 16:16:46 crc kubenswrapper[4707]: I1127 16:16:46.980649 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/controller-f8648f98b-fcnd7" Nov 27 16:16:47 crc kubenswrapper[4707]: I1127 16:16:47.228857 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-qdn2b"] Nov 27 16:16:47 crc kubenswrapper[4707]: W1127 16:16:47.240607 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4b76b0a5_e84e_427d_9cb4_4fac9969a278.slice/crio-b702c1c4fafbbcd20be7b7625875d45ea453051695a78236162b7fd3c1d495d7 WatchSource:0}: Error finding container b702c1c4fafbbcd20be7b7625875d45ea453051695a78236162b7fd3c1d495d7: Status 404 returned error can't find the container with id b702c1c4fafbbcd20be7b7625875d45ea453051695a78236162b7fd3c1d495d7 Nov 27 16:16:47 crc kubenswrapper[4707]: I1127 16:16:47.491787 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-krwxc" Nov 27 16:16:47 crc kubenswrapper[4707]: I1127 16:16:47.759505 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-4cdj4" event={"ID":"5db6fc93-164b-45b9-a14a-701822aaf6a8","Type":"ContainerStarted","Data":"98408a0aae3eccf36b7dbf6d477043503f13161063f81ca1eb6ec52b0be84c50"} Nov 27 16:16:47 crc kubenswrapper[4707]: I1127 16:16:47.759995 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/openstack-operator-index-4cdj4" podUID="5db6fc93-164b-45b9-a14a-701822aaf6a8" containerName="registry-server" 
containerID="cri-o://98408a0aae3eccf36b7dbf6d477043503f13161063f81ca1eb6ec52b0be84c50" gracePeriod=2 Nov 27 16:16:47 crc kubenswrapper[4707]: I1127 16:16:47.761177 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-qdn2b" event={"ID":"4b76b0a5-e84e-427d-9cb4-4fac9969a278","Type":"ContainerStarted","Data":"9df69953738d8552c87cda37d3c23fed756e0fc774346cebf7d21328ad04cd0c"} Nov 27 16:16:47 crc kubenswrapper[4707]: I1127 16:16:47.761228 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-qdn2b" event={"ID":"4b76b0a5-e84e-427d-9cb4-4fac9969a278","Type":"ContainerStarted","Data":"b702c1c4fafbbcd20be7b7625875d45ea453051695a78236162b7fd3c1d495d7"} Nov 27 16:16:47 crc kubenswrapper[4707]: I1127 16:16:47.782976 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-4cdj4" podStartSLOduration=2.583347409 podStartE2EDuration="6.782957961s" podCreationTimestamp="2025-11-27 16:16:41 +0000 UTC" firstStartedPulling="2025-11-27 16:16:42.546311808 +0000 UTC m=+778.177760606" lastFinishedPulling="2025-11-27 16:16:46.74592238 +0000 UTC m=+782.377371158" observedRunningTime="2025-11-27 16:16:47.779565788 +0000 UTC m=+783.411014636" watchObservedRunningTime="2025-11-27 16:16:47.782957961 +0000 UTC m=+783.414406729" Nov 27 16:16:47 crc kubenswrapper[4707]: I1127 16:16:47.807616 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-qdn2b" podStartSLOduration=2.74577703 podStartE2EDuration="2.807583304s" podCreationTimestamp="2025-11-27 16:16:45 +0000 UTC" firstStartedPulling="2025-11-27 16:16:47.245763513 +0000 UTC m=+782.877212311" lastFinishedPulling="2025-11-27 16:16:47.307569787 +0000 UTC m=+782.939018585" observedRunningTime="2025-11-27 16:16:47.800108111 +0000 UTC m=+783.431556879" watchObservedRunningTime="2025-11-27 16:16:47.807583304 +0000 
UTC m=+783.439032112" Nov 27 16:16:48 crc kubenswrapper[4707]: I1127 16:16:48.204974 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-4cdj4" Nov 27 16:16:48 crc kubenswrapper[4707]: I1127 16:16:48.304999 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p7vr7\" (UniqueName: \"kubernetes.io/projected/5db6fc93-164b-45b9-a14a-701822aaf6a8-kube-api-access-p7vr7\") pod \"5db6fc93-164b-45b9-a14a-701822aaf6a8\" (UID: \"5db6fc93-164b-45b9-a14a-701822aaf6a8\") " Nov 27 16:16:48 crc kubenswrapper[4707]: I1127 16:16:48.311980 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5db6fc93-164b-45b9-a14a-701822aaf6a8-kube-api-access-p7vr7" (OuterVolumeSpecName: "kube-api-access-p7vr7") pod "5db6fc93-164b-45b9-a14a-701822aaf6a8" (UID: "5db6fc93-164b-45b9-a14a-701822aaf6a8"). InnerVolumeSpecName "kube-api-access-p7vr7". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 16:16:48 crc kubenswrapper[4707]: I1127 16:16:48.406478 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p7vr7\" (UniqueName: \"kubernetes.io/projected/5db6fc93-164b-45b9-a14a-701822aaf6a8-kube-api-access-p7vr7\") on node \"crc\" DevicePath \"\"" Nov 27 16:16:48 crc kubenswrapper[4707]: I1127 16:16:48.774809 4707 generic.go:334] "Generic (PLEG): container finished" podID="5db6fc93-164b-45b9-a14a-701822aaf6a8" containerID="98408a0aae3eccf36b7dbf6d477043503f13161063f81ca1eb6ec52b0be84c50" exitCode=0 Nov 27 16:16:48 crc kubenswrapper[4707]: I1127 16:16:48.774938 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-4cdj4" Nov 27 16:16:48 crc kubenswrapper[4707]: I1127 16:16:48.775006 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-4cdj4" event={"ID":"5db6fc93-164b-45b9-a14a-701822aaf6a8","Type":"ContainerDied","Data":"98408a0aae3eccf36b7dbf6d477043503f13161063f81ca1eb6ec52b0be84c50"} Nov 27 16:16:48 crc kubenswrapper[4707]: I1127 16:16:48.775049 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-4cdj4" event={"ID":"5db6fc93-164b-45b9-a14a-701822aaf6a8","Type":"ContainerDied","Data":"80cca7404b7ba5a133790a3bd6d1f28ef4869c8192e8998be1882330faac620e"} Nov 27 16:16:48 crc kubenswrapper[4707]: I1127 16:16:48.775079 4707 scope.go:117] "RemoveContainer" containerID="98408a0aae3eccf36b7dbf6d477043503f13161063f81ca1eb6ec52b0be84c50" Nov 27 16:16:48 crc kubenswrapper[4707]: I1127 16:16:48.805606 4707 scope.go:117] "RemoveContainer" containerID="98408a0aae3eccf36b7dbf6d477043503f13161063f81ca1eb6ec52b0be84c50" Nov 27 16:16:48 crc kubenswrapper[4707]: E1127 16:16:48.817208 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"98408a0aae3eccf36b7dbf6d477043503f13161063f81ca1eb6ec52b0be84c50\": container with ID starting with 98408a0aae3eccf36b7dbf6d477043503f13161063f81ca1eb6ec52b0be84c50 not found: ID does not exist" containerID="98408a0aae3eccf36b7dbf6d477043503f13161063f81ca1eb6ec52b0be84c50" Nov 27 16:16:48 crc kubenswrapper[4707]: I1127 16:16:48.817274 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"98408a0aae3eccf36b7dbf6d477043503f13161063f81ca1eb6ec52b0be84c50"} err="failed to get container status \"98408a0aae3eccf36b7dbf6d477043503f13161063f81ca1eb6ec52b0be84c50\": rpc error: code = NotFound desc = could not find container 
\"98408a0aae3eccf36b7dbf6d477043503f13161063f81ca1eb6ec52b0be84c50\": container with ID starting with 98408a0aae3eccf36b7dbf6d477043503f13161063f81ca1eb6ec52b0be84c50 not found: ID does not exist" Nov 27 16:16:48 crc kubenswrapper[4707]: I1127 16:16:48.830255 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-4cdj4"] Nov 27 16:16:48 crc kubenswrapper[4707]: I1127 16:16:48.841588 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/openstack-operator-index-4cdj4"] Nov 27 16:16:49 crc kubenswrapper[4707]: I1127 16:16:49.209901 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5db6fc93-164b-45b9-a14a-701822aaf6a8" path="/var/lib/kubelet/pods/5db6fc93-164b-45b9-a14a-701822aaf6a8/volumes" Nov 27 16:16:55 crc kubenswrapper[4707]: I1127 16:16:55.989013 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/openstack-operator-index-qdn2b" Nov 27 16:16:55 crc kubenswrapper[4707]: I1127 16:16:55.989820 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-index-qdn2b" Nov 27 16:16:56 crc kubenswrapper[4707]: I1127 16:16:56.031183 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/openstack-operator-index-qdn2b" Nov 27 16:16:56 crc kubenswrapper[4707]: I1127 16:16:56.886737 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-index-qdn2b" Nov 27 16:17:01 crc kubenswrapper[4707]: I1127 16:17:01.691888 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/9b525f43158688e23ee73385ddac5626340eda066df53bb1a8cbdb78832m5nh"] Nov 27 16:17:01 crc kubenswrapper[4707]: E1127 16:17:01.692631 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5db6fc93-164b-45b9-a14a-701822aaf6a8" containerName="registry-server" Nov 27 16:17:01 
crc kubenswrapper[4707]: I1127 16:17:01.692653 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="5db6fc93-164b-45b9-a14a-701822aaf6a8" containerName="registry-server" Nov 27 16:17:01 crc kubenswrapper[4707]: I1127 16:17:01.692878 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="5db6fc93-164b-45b9-a14a-701822aaf6a8" containerName="registry-server" Nov 27 16:17:01 crc kubenswrapper[4707]: I1127 16:17:01.694231 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/9b525f43158688e23ee73385ddac5626340eda066df53bb1a8cbdb78832m5nh" Nov 27 16:17:01 crc kubenswrapper[4707]: I1127 16:17:01.701194 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-x2m7m" Nov 27 16:17:01 crc kubenswrapper[4707]: I1127 16:17:01.706785 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/9b525f43158688e23ee73385ddac5626340eda066df53bb1a8cbdb78832m5nh"] Nov 27 16:17:01 crc kubenswrapper[4707]: I1127 16:17:01.718732 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/82db51a9-76e7-4066-9dbc-83b27ff84cc8-util\") pod \"9b525f43158688e23ee73385ddac5626340eda066df53bb1a8cbdb78832m5nh\" (UID: \"82db51a9-76e7-4066-9dbc-83b27ff84cc8\") " pod="openstack-operators/9b525f43158688e23ee73385ddac5626340eda066df53bb1a8cbdb78832m5nh" Nov 27 16:17:01 crc kubenswrapper[4707]: I1127 16:17:01.718926 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/82db51a9-76e7-4066-9dbc-83b27ff84cc8-bundle\") pod \"9b525f43158688e23ee73385ddac5626340eda066df53bb1a8cbdb78832m5nh\" (UID: \"82db51a9-76e7-4066-9dbc-83b27ff84cc8\") " pod="openstack-operators/9b525f43158688e23ee73385ddac5626340eda066df53bb1a8cbdb78832m5nh" Nov 27 16:17:01 crc kubenswrapper[4707]: I1127 16:17:01.719192 
4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kmfhm\" (UniqueName: \"kubernetes.io/projected/82db51a9-76e7-4066-9dbc-83b27ff84cc8-kube-api-access-kmfhm\") pod \"9b525f43158688e23ee73385ddac5626340eda066df53bb1a8cbdb78832m5nh\" (UID: \"82db51a9-76e7-4066-9dbc-83b27ff84cc8\") " pod="openstack-operators/9b525f43158688e23ee73385ddac5626340eda066df53bb1a8cbdb78832m5nh" Nov 27 16:17:01 crc kubenswrapper[4707]: I1127 16:17:01.821590 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/82db51a9-76e7-4066-9dbc-83b27ff84cc8-bundle\") pod \"9b525f43158688e23ee73385ddac5626340eda066df53bb1a8cbdb78832m5nh\" (UID: \"82db51a9-76e7-4066-9dbc-83b27ff84cc8\") " pod="openstack-operators/9b525f43158688e23ee73385ddac5626340eda066df53bb1a8cbdb78832m5nh" Nov 27 16:17:01 crc kubenswrapper[4707]: I1127 16:17:01.822187 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kmfhm\" (UniqueName: \"kubernetes.io/projected/82db51a9-76e7-4066-9dbc-83b27ff84cc8-kube-api-access-kmfhm\") pod \"9b525f43158688e23ee73385ddac5626340eda066df53bb1a8cbdb78832m5nh\" (UID: \"82db51a9-76e7-4066-9dbc-83b27ff84cc8\") " pod="openstack-operators/9b525f43158688e23ee73385ddac5626340eda066df53bb1a8cbdb78832m5nh" Nov 27 16:17:01 crc kubenswrapper[4707]: I1127 16:17:01.822402 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/82db51a9-76e7-4066-9dbc-83b27ff84cc8-util\") pod \"9b525f43158688e23ee73385ddac5626340eda066df53bb1a8cbdb78832m5nh\" (UID: \"82db51a9-76e7-4066-9dbc-83b27ff84cc8\") " pod="openstack-operators/9b525f43158688e23ee73385ddac5626340eda066df53bb1a8cbdb78832m5nh" Nov 27 16:17:01 crc kubenswrapper[4707]: I1127 16:17:01.822519 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/82db51a9-76e7-4066-9dbc-83b27ff84cc8-bundle\") pod \"9b525f43158688e23ee73385ddac5626340eda066df53bb1a8cbdb78832m5nh\" (UID: \"82db51a9-76e7-4066-9dbc-83b27ff84cc8\") " pod="openstack-operators/9b525f43158688e23ee73385ddac5626340eda066df53bb1a8cbdb78832m5nh" Nov 27 16:17:01 crc kubenswrapper[4707]: I1127 16:17:01.823353 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/82db51a9-76e7-4066-9dbc-83b27ff84cc8-util\") pod \"9b525f43158688e23ee73385ddac5626340eda066df53bb1a8cbdb78832m5nh\" (UID: \"82db51a9-76e7-4066-9dbc-83b27ff84cc8\") " pod="openstack-operators/9b525f43158688e23ee73385ddac5626340eda066df53bb1a8cbdb78832m5nh" Nov 27 16:17:01 crc kubenswrapper[4707]: I1127 16:17:01.858145 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kmfhm\" (UniqueName: \"kubernetes.io/projected/82db51a9-76e7-4066-9dbc-83b27ff84cc8-kube-api-access-kmfhm\") pod \"9b525f43158688e23ee73385ddac5626340eda066df53bb1a8cbdb78832m5nh\" (UID: \"82db51a9-76e7-4066-9dbc-83b27ff84cc8\") " pod="openstack-operators/9b525f43158688e23ee73385ddac5626340eda066df53bb1a8cbdb78832m5nh" Nov 27 16:17:02 crc kubenswrapper[4707]: I1127 16:17:02.021160 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/9b525f43158688e23ee73385ddac5626340eda066df53bb1a8cbdb78832m5nh" Nov 27 16:17:02 crc kubenswrapper[4707]: I1127 16:17:02.548696 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/9b525f43158688e23ee73385ddac5626340eda066df53bb1a8cbdb78832m5nh"] Nov 27 16:17:02 crc kubenswrapper[4707]: I1127 16:17:02.890150 4707 generic.go:334] "Generic (PLEG): container finished" podID="82db51a9-76e7-4066-9dbc-83b27ff84cc8" containerID="5714397b0080ca22d12db0453853fec169055b5e218bb02e81a65501247b89d0" exitCode=0 Nov 27 16:17:02 crc kubenswrapper[4707]: I1127 16:17:02.890221 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/9b525f43158688e23ee73385ddac5626340eda066df53bb1a8cbdb78832m5nh" event={"ID":"82db51a9-76e7-4066-9dbc-83b27ff84cc8","Type":"ContainerDied","Data":"5714397b0080ca22d12db0453853fec169055b5e218bb02e81a65501247b89d0"} Nov 27 16:17:02 crc kubenswrapper[4707]: I1127 16:17:02.890269 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/9b525f43158688e23ee73385ddac5626340eda066df53bb1a8cbdb78832m5nh" event={"ID":"82db51a9-76e7-4066-9dbc-83b27ff84cc8","Type":"ContainerStarted","Data":"b6b63c94a1c90f5da962bd31dda1297d2a621796645e26ecf9cf52f180490db5"} Nov 27 16:17:03 crc kubenswrapper[4707]: I1127 16:17:03.902088 4707 generic.go:334] "Generic (PLEG): container finished" podID="82db51a9-76e7-4066-9dbc-83b27ff84cc8" containerID="e0c18d5366d8978cd6528d3d356eaf930e4fa72ffda2e2ad072f5b36bedf0093" exitCode=0 Nov 27 16:17:03 crc kubenswrapper[4707]: I1127 16:17:03.902163 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/9b525f43158688e23ee73385ddac5626340eda066df53bb1a8cbdb78832m5nh" event={"ID":"82db51a9-76e7-4066-9dbc-83b27ff84cc8","Type":"ContainerDied","Data":"e0c18d5366d8978cd6528d3d356eaf930e4fa72ffda2e2ad072f5b36bedf0093"} Nov 27 16:17:04 crc kubenswrapper[4707]: I1127 16:17:04.914659 4707 generic.go:334] 
"Generic (PLEG): container finished" podID="82db51a9-76e7-4066-9dbc-83b27ff84cc8" containerID="ed64f40316e04e5adf7862679c5def9c393a7cbb3961cd67dc1a1fd639bf7dd7" exitCode=0 Nov 27 16:17:04 crc kubenswrapper[4707]: I1127 16:17:04.914769 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/9b525f43158688e23ee73385ddac5626340eda066df53bb1a8cbdb78832m5nh" event={"ID":"82db51a9-76e7-4066-9dbc-83b27ff84cc8","Type":"ContainerDied","Data":"ed64f40316e04e5adf7862679c5def9c393a7cbb3961cd67dc1a1fd639bf7dd7"} Nov 27 16:17:06 crc kubenswrapper[4707]: I1127 16:17:06.319818 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/9b525f43158688e23ee73385ddac5626340eda066df53bb1a8cbdb78832m5nh" Nov 27 16:17:06 crc kubenswrapper[4707]: I1127 16:17:06.392836 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kmfhm\" (UniqueName: \"kubernetes.io/projected/82db51a9-76e7-4066-9dbc-83b27ff84cc8-kube-api-access-kmfhm\") pod \"82db51a9-76e7-4066-9dbc-83b27ff84cc8\" (UID: \"82db51a9-76e7-4066-9dbc-83b27ff84cc8\") " Nov 27 16:17:06 crc kubenswrapper[4707]: I1127 16:17:06.392956 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/82db51a9-76e7-4066-9dbc-83b27ff84cc8-util\") pod \"82db51a9-76e7-4066-9dbc-83b27ff84cc8\" (UID: \"82db51a9-76e7-4066-9dbc-83b27ff84cc8\") " Nov 27 16:17:06 crc kubenswrapper[4707]: I1127 16:17:06.393046 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/82db51a9-76e7-4066-9dbc-83b27ff84cc8-bundle\") pod \"82db51a9-76e7-4066-9dbc-83b27ff84cc8\" (UID: \"82db51a9-76e7-4066-9dbc-83b27ff84cc8\") " Nov 27 16:17:06 crc kubenswrapper[4707]: I1127 16:17:06.394538 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/82db51a9-76e7-4066-9dbc-83b27ff84cc8-bundle" (OuterVolumeSpecName: "bundle") pod "82db51a9-76e7-4066-9dbc-83b27ff84cc8" (UID: "82db51a9-76e7-4066-9dbc-83b27ff84cc8"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 27 16:17:06 crc kubenswrapper[4707]: I1127 16:17:06.401094 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/82db51a9-76e7-4066-9dbc-83b27ff84cc8-kube-api-access-kmfhm" (OuterVolumeSpecName: "kube-api-access-kmfhm") pod "82db51a9-76e7-4066-9dbc-83b27ff84cc8" (UID: "82db51a9-76e7-4066-9dbc-83b27ff84cc8"). InnerVolumeSpecName "kube-api-access-kmfhm". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 16:17:06 crc kubenswrapper[4707]: I1127 16:17:06.423328 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/82db51a9-76e7-4066-9dbc-83b27ff84cc8-util" (OuterVolumeSpecName: "util") pod "82db51a9-76e7-4066-9dbc-83b27ff84cc8" (UID: "82db51a9-76e7-4066-9dbc-83b27ff84cc8"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 27 16:17:06 crc kubenswrapper[4707]: I1127 16:17:06.495833 4707 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/82db51a9-76e7-4066-9dbc-83b27ff84cc8-util\") on node \"crc\" DevicePath \"\"" Nov 27 16:17:06 crc kubenswrapper[4707]: I1127 16:17:06.495910 4707 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/82db51a9-76e7-4066-9dbc-83b27ff84cc8-bundle\") on node \"crc\" DevicePath \"\"" Nov 27 16:17:06 crc kubenswrapper[4707]: I1127 16:17:06.495938 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kmfhm\" (UniqueName: \"kubernetes.io/projected/82db51a9-76e7-4066-9dbc-83b27ff84cc8-kube-api-access-kmfhm\") on node \"crc\" DevicePath \"\"" Nov 27 16:17:06 crc kubenswrapper[4707]: I1127 16:17:06.934493 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/9b525f43158688e23ee73385ddac5626340eda066df53bb1a8cbdb78832m5nh" event={"ID":"82db51a9-76e7-4066-9dbc-83b27ff84cc8","Type":"ContainerDied","Data":"b6b63c94a1c90f5da962bd31dda1297d2a621796645e26ecf9cf52f180490db5"} Nov 27 16:17:06 crc kubenswrapper[4707]: I1127 16:17:06.934536 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b6b63c94a1c90f5da962bd31dda1297d2a621796645e26ecf9cf52f180490db5" Nov 27 16:17:06 crc kubenswrapper[4707]: I1127 16:17:06.934579 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/9b525f43158688e23ee73385ddac5626340eda066df53bb1a8cbdb78832m5nh" Nov 27 16:17:08 crc kubenswrapper[4707]: I1127 16:17:08.792000 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-operator-7d7f8454cc-87c8c"] Nov 27 16:17:08 crc kubenswrapper[4707]: E1127 16:17:08.793869 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="82db51a9-76e7-4066-9dbc-83b27ff84cc8" containerName="pull" Nov 27 16:17:08 crc kubenswrapper[4707]: I1127 16:17:08.794011 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="82db51a9-76e7-4066-9dbc-83b27ff84cc8" containerName="pull" Nov 27 16:17:08 crc kubenswrapper[4707]: E1127 16:17:08.794682 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="82db51a9-76e7-4066-9dbc-83b27ff84cc8" containerName="util" Nov 27 16:17:08 crc kubenswrapper[4707]: I1127 16:17:08.794830 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="82db51a9-76e7-4066-9dbc-83b27ff84cc8" containerName="util" Nov 27 16:17:08 crc kubenswrapper[4707]: E1127 16:17:08.794953 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="82db51a9-76e7-4066-9dbc-83b27ff84cc8" containerName="extract" Nov 27 16:17:08 crc kubenswrapper[4707]: I1127 16:17:08.795087 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="82db51a9-76e7-4066-9dbc-83b27ff84cc8" containerName="extract" Nov 27 16:17:08 crc kubenswrapper[4707]: I1127 16:17:08.795507 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="82db51a9-76e7-4066-9dbc-83b27ff84cc8" containerName="extract" Nov 27 16:17:08 crc kubenswrapper[4707]: I1127 16:17:08.796286 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-operator-7d7f8454cc-87c8c" Nov 27 16:17:08 crc kubenswrapper[4707]: I1127 16:17:08.798155 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-operator-dockercfg-l24p8" Nov 27 16:17:08 crc kubenswrapper[4707]: I1127 16:17:08.816607 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-operator-7d7f8454cc-87c8c"] Nov 27 16:17:08 crc kubenswrapper[4707]: I1127 16:17:08.829443 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2hx48\" (UniqueName: \"kubernetes.io/projected/b517a267-0265-4e24-b102-b19b8d9eee18-kube-api-access-2hx48\") pod \"openstack-operator-controller-operator-7d7f8454cc-87c8c\" (UID: \"b517a267-0265-4e24-b102-b19b8d9eee18\") " pod="openstack-operators/openstack-operator-controller-operator-7d7f8454cc-87c8c" Nov 27 16:17:08 crc kubenswrapper[4707]: I1127 16:17:08.930687 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2hx48\" (UniqueName: \"kubernetes.io/projected/b517a267-0265-4e24-b102-b19b8d9eee18-kube-api-access-2hx48\") pod \"openstack-operator-controller-operator-7d7f8454cc-87c8c\" (UID: \"b517a267-0265-4e24-b102-b19b8d9eee18\") " pod="openstack-operators/openstack-operator-controller-operator-7d7f8454cc-87c8c" Nov 27 16:17:08 crc kubenswrapper[4707]: I1127 16:17:08.978091 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2hx48\" (UniqueName: \"kubernetes.io/projected/b517a267-0265-4e24-b102-b19b8d9eee18-kube-api-access-2hx48\") pod \"openstack-operator-controller-operator-7d7f8454cc-87c8c\" (UID: \"b517a267-0265-4e24-b102-b19b8d9eee18\") " pod="openstack-operators/openstack-operator-controller-operator-7d7f8454cc-87c8c" Nov 27 16:17:09 crc kubenswrapper[4707]: I1127 16:17:09.122286 4707 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-operator-7d7f8454cc-87c8c" Nov 27 16:17:09 crc kubenswrapper[4707]: I1127 16:17:09.338884 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-operator-7d7f8454cc-87c8c"] Nov 27 16:17:09 crc kubenswrapper[4707]: W1127 16:17:09.343261 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb517a267_0265_4e24_b102_b19b8d9eee18.slice/crio-1898ec6b52cbb0d40bfa41ecd0bb2fb841dd033215b1bf0f5cbdd69ca34ec7a4 WatchSource:0}: Error finding container 1898ec6b52cbb0d40bfa41ecd0bb2fb841dd033215b1bf0f5cbdd69ca34ec7a4: Status 404 returned error can't find the container with id 1898ec6b52cbb0d40bfa41ecd0bb2fb841dd033215b1bf0f5cbdd69ca34ec7a4 Nov 27 16:17:09 crc kubenswrapper[4707]: I1127 16:17:09.953357 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-7d7f8454cc-87c8c" event={"ID":"b517a267-0265-4e24-b102-b19b8d9eee18","Type":"ContainerStarted","Data":"1898ec6b52cbb0d40bfa41ecd0bb2fb841dd033215b1bf0f5cbdd69ca34ec7a4"} Nov 27 16:17:13 crc kubenswrapper[4707]: I1127 16:17:13.988523 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-7d7f8454cc-87c8c" event={"ID":"b517a267-0265-4e24-b102-b19b8d9eee18","Type":"ContainerStarted","Data":"2500c1fa0b1d32140e64f820e3d15108dfee8b1447be727842a9923bb0a341b4"} Nov 27 16:17:13 crc kubenswrapper[4707]: I1127 16:17:13.989330 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-operator-7d7f8454cc-87c8c" Nov 27 16:17:14 crc kubenswrapper[4707]: I1127 16:17:14.041941 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack-operators/openstack-operator-controller-operator-7d7f8454cc-87c8c" podStartSLOduration=2.338337047 podStartE2EDuration="6.04191694s" podCreationTimestamp="2025-11-27 16:17:08 +0000 UTC" firstStartedPulling="2025-11-27 16:17:09.346164156 +0000 UTC m=+804.977612924" lastFinishedPulling="2025-11-27 16:17:13.049744009 +0000 UTC m=+808.681192817" observedRunningTime="2025-11-27 16:17:14.035905683 +0000 UTC m=+809.667354481" watchObservedRunningTime="2025-11-27 16:17:14.04191694 +0000 UTC m=+809.673365738" Nov 27 16:17:19 crc kubenswrapper[4707]: I1127 16:17:19.126168 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-operator-7d7f8454cc-87c8c" Nov 27 16:17:39 crc kubenswrapper[4707]: I1127 16:17:39.063939 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/barbican-operator-controller-manager-7b64f4fb85-nhkfx"] Nov 27 16:17:39 crc kubenswrapper[4707]: I1127 16:17:39.065580 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-7b64f4fb85-nhkfx" Nov 27 16:17:39 crc kubenswrapper[4707]: I1127 16:17:39.067773 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"barbican-operator-controller-manager-dockercfg-fv2mx" Nov 27 16:17:39 crc kubenswrapper[4707]: I1127 16:17:39.093046 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/designate-operator-controller-manager-955677c94-f5g2d"] Nov 27 16:17:39 crc kubenswrapper[4707]: I1127 16:17:39.093971 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-955677c94-f5g2d" Nov 27 16:17:39 crc kubenswrapper[4707]: I1127 16:17:39.095869 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"designate-operator-controller-manager-dockercfg-l28s6" Nov 27 16:17:39 crc kubenswrapper[4707]: I1127 16:17:39.102552 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-7b64f4fb85-nhkfx"] Nov 27 16:17:39 crc kubenswrapper[4707]: I1127 16:17:39.113603 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/cinder-operator-controller-manager-6b7f75547b-sgwpt"] Nov 27 16:17:39 crc kubenswrapper[4707]: I1127 16:17:39.114768 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-6b7f75547b-sgwpt" Nov 27 16:17:39 crc kubenswrapper[4707]: I1127 16:17:39.120005 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-955677c94-f5g2d"] Nov 27 16:17:39 crc kubenswrapper[4707]: I1127 16:17:39.125144 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"cinder-operator-controller-manager-dockercfg-62t2m" Nov 27 16:17:39 crc kubenswrapper[4707]: I1127 16:17:39.128501 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/glance-operator-controller-manager-589cbd6b5b-cd4lh"] Nov 27 16:17:39 crc kubenswrapper[4707]: I1127 16:17:39.129543 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-589cbd6b5b-cd4lh" Nov 27 16:17:39 crc kubenswrapper[4707]: I1127 16:17:39.131922 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"glance-operator-controller-manager-dockercfg-xnjpv" Nov 27 16:17:39 crc kubenswrapper[4707]: I1127 16:17:39.138135 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-6b7f75547b-sgwpt"] Nov 27 16:17:39 crc kubenswrapper[4707]: I1127 16:17:39.156525 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-589cbd6b5b-cd4lh"] Nov 27 16:17:39 crc kubenswrapper[4707]: I1127 16:17:39.158954 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/heat-operator-controller-manager-5b77f656f-8qhwz"] Nov 27 16:17:39 crc kubenswrapper[4707]: I1127 16:17:39.160217 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-5b77f656f-8qhwz" Nov 27 16:17:39 crc kubenswrapper[4707]: I1127 16:17:39.165792 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"heat-operator-controller-manager-dockercfg-9s296" Nov 27 16:17:39 crc kubenswrapper[4707]: I1127 16:17:39.171540 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/infra-operator-controller-manager-57548d458d-kwlbb"] Nov 27 16:17:39 crc kubenswrapper[4707]: I1127 16:17:39.172448 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-57548d458d-kwlbb" Nov 27 16:17:39 crc kubenswrapper[4707]: I1127 16:17:39.177220 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-vxdsk" Nov 27 16:17:39 crc kubenswrapper[4707]: I1127 16:17:39.177330 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-webhook-server-cert" Nov 27 16:17:39 crc kubenswrapper[4707]: I1127 16:17:39.185122 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-897tg\" (UniqueName: \"kubernetes.io/projected/0267ac3a-4bee-42b9-a506-e2b1e1e3726e-kube-api-access-897tg\") pod \"barbican-operator-controller-manager-7b64f4fb85-nhkfx\" (UID: \"0267ac3a-4bee-42b9-a506-e2b1e1e3726e\") " pod="openstack-operators/barbican-operator-controller-manager-7b64f4fb85-nhkfx" Nov 27 16:17:39 crc kubenswrapper[4707]: I1127 16:17:39.206438 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-5b77f656f-8qhwz"] Nov 27 16:17:39 crc kubenswrapper[4707]: I1127 16:17:39.206473 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/horizon-operator-controller-manager-5d494799bf-hg8j4"] Nov 27 16:17:39 crc kubenswrapper[4707]: I1127 16:17:39.207312 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-5d494799bf-hg8j4" Nov 27 16:17:39 crc kubenswrapper[4707]: I1127 16:17:39.210742 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ironic-operator-controller-manager-67cb4dc6d4-5bj45"] Nov 27 16:17:39 crc kubenswrapper[4707]: I1127 16:17:39.210925 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"horizon-operator-controller-manager-dockercfg-2dmwr" Nov 27 16:17:39 crc kubenswrapper[4707]: I1127 16:17:39.211600 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-67cb4dc6d4-5bj45" Nov 27 16:17:39 crc kubenswrapper[4707]: I1127 16:17:39.215543 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ironic-operator-controller-manager-dockercfg-pb5ps" Nov 27 16:17:39 crc kubenswrapper[4707]: I1127 16:17:39.226734 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-57548d458d-kwlbb"] Nov 27 16:17:39 crc kubenswrapper[4707]: I1127 16:17:39.236881 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-5d494799bf-hg8j4"] Nov 27 16:17:39 crc kubenswrapper[4707]: I1127 16:17:39.249941 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-67cb4dc6d4-5bj45"] Nov 27 16:17:39 crc kubenswrapper[4707]: I1127 16:17:39.258426 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/keystone-operator-controller-manager-7b4567c7cf-4cd96"] Nov 27 16:17:39 crc kubenswrapper[4707]: I1127 16:17:39.259617 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-7b4567c7cf-4cd96" Nov 27 16:17:39 crc kubenswrapper[4707]: I1127 16:17:39.273257 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-controller-manager-dockercfg-cz5kp" Nov 27 16:17:39 crc kubenswrapper[4707]: I1127 16:17:39.287155 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-897tg\" (UniqueName: \"kubernetes.io/projected/0267ac3a-4bee-42b9-a506-e2b1e1e3726e-kube-api-access-897tg\") pod \"barbican-operator-controller-manager-7b64f4fb85-nhkfx\" (UID: \"0267ac3a-4bee-42b9-a506-e2b1e1e3726e\") " pod="openstack-operators/barbican-operator-controller-manager-7b64f4fb85-nhkfx" Nov 27 16:17:39 crc kubenswrapper[4707]: I1127 16:17:39.287915 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mvlbb\" (UniqueName: \"kubernetes.io/projected/6d78cbb1-56e6-428d-bba4-5d1edbbda363-kube-api-access-mvlbb\") pod \"cinder-operator-controller-manager-6b7f75547b-sgwpt\" (UID: \"6d78cbb1-56e6-428d-bba4-5d1edbbda363\") " pod="openstack-operators/cinder-operator-controller-manager-6b7f75547b-sgwpt" Nov 27 16:17:39 crc kubenswrapper[4707]: I1127 16:17:39.288009 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xn4xd\" (UniqueName: \"kubernetes.io/projected/99785491-bcbd-4946-b1a6-a3e08a4394b5-kube-api-access-xn4xd\") pod \"glance-operator-controller-manager-589cbd6b5b-cd4lh\" (UID: \"99785491-bcbd-4946-b1a6-a3e08a4394b5\") " pod="openstack-operators/glance-operator-controller-manager-589cbd6b5b-cd4lh" Nov 27 16:17:39 crc kubenswrapper[4707]: I1127 16:17:39.288105 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gb96r\" (UniqueName: 
\"kubernetes.io/projected/dc2542ce-f2fd-454b-b47f-92d3bbc93d91-kube-api-access-gb96r\") pod \"keystone-operator-controller-manager-7b4567c7cf-4cd96\" (UID: \"dc2542ce-f2fd-454b-b47f-92d3bbc93d91\") " pod="openstack-operators/keystone-operator-controller-manager-7b4567c7cf-4cd96" Nov 27 16:17:39 crc kubenswrapper[4707]: I1127 16:17:39.288195 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qz4fj\" (UniqueName: \"kubernetes.io/projected/88f24787-fedc-4d08-9a8e-16a24f242d02-kube-api-access-qz4fj\") pod \"infra-operator-controller-manager-57548d458d-kwlbb\" (UID: \"88f24787-fedc-4d08-9a8e-16a24f242d02\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-kwlbb" Nov 27 16:17:39 crc kubenswrapper[4707]: I1127 16:17:39.288272 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bfwdq\" (UniqueName: \"kubernetes.io/projected/8fea437e-0a8c-4836-b23c-56db9c7ea0fc-kube-api-access-bfwdq\") pod \"ironic-operator-controller-manager-67cb4dc6d4-5bj45\" (UID: \"8fea437e-0a8c-4836-b23c-56db9c7ea0fc\") " pod="openstack-operators/ironic-operator-controller-manager-67cb4dc6d4-5bj45" Nov 27 16:17:39 crc kubenswrapper[4707]: I1127 16:17:39.288349 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fn8cx\" (UniqueName: \"kubernetes.io/projected/5e9f9859-1f28-4183-b71c-e9459e2746b7-kube-api-access-fn8cx\") pod \"designate-operator-controller-manager-955677c94-f5g2d\" (UID: \"5e9f9859-1f28-4183-b71c-e9459e2746b7\") " pod="openstack-operators/designate-operator-controller-manager-955677c94-f5g2d" Nov 27 16:17:39 crc kubenswrapper[4707]: I1127 16:17:39.288448 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6j25f\" (UniqueName: 
\"kubernetes.io/projected/ac946592-ee39-443e-b64a-980caaace080-kube-api-access-6j25f\") pod \"horizon-operator-controller-manager-5d494799bf-hg8j4\" (UID: \"ac946592-ee39-443e-b64a-980caaace080\") " pod="openstack-operators/horizon-operator-controller-manager-5d494799bf-hg8j4" Nov 27 16:17:39 crc kubenswrapper[4707]: I1127 16:17:39.288531 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nvv4d\" (UniqueName: \"kubernetes.io/projected/a2c3c27e-3e0c-4d4e-a5f7-3deaaae0f660-kube-api-access-nvv4d\") pod \"heat-operator-controller-manager-5b77f656f-8qhwz\" (UID: \"a2c3c27e-3e0c-4d4e-a5f7-3deaaae0f660\") " pod="openstack-operators/heat-operator-controller-manager-5b77f656f-8qhwz" Nov 27 16:17:39 crc kubenswrapper[4707]: I1127 16:17:39.288608 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/88f24787-fedc-4d08-9a8e-16a24f242d02-cert\") pod \"infra-operator-controller-manager-57548d458d-kwlbb\" (UID: \"88f24787-fedc-4d08-9a8e-16a24f242d02\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-kwlbb" Nov 27 16:17:39 crc kubenswrapper[4707]: I1127 16:17:39.329662 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-66f4dd4bc7-dfw47"] Nov 27 16:17:39 crc kubenswrapper[4707]: I1127 16:17:39.333012 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-897tg\" (UniqueName: \"kubernetes.io/projected/0267ac3a-4bee-42b9-a506-e2b1e1e3726e-kube-api-access-897tg\") pod \"barbican-operator-controller-manager-7b64f4fb85-nhkfx\" (UID: \"0267ac3a-4bee-42b9-a506-e2b1e1e3726e\") " pod="openstack-operators/barbican-operator-controller-manager-7b64f4fb85-nhkfx" Nov 27 16:17:39 crc kubenswrapper[4707]: I1127 16:17:39.333915 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-66f4dd4bc7-dfw47" Nov 27 16:17:39 crc kubenswrapper[4707]: I1127 16:17:39.336744 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-controller-manager-dockercfg-t49hf" Nov 27 16:17:39 crc kubenswrapper[4707]: I1127 16:17:39.337507 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/manila-operator-controller-manager-5d499bf58b-x5k9j"] Nov 27 16:17:39 crc kubenswrapper[4707]: I1127 16:17:39.338490 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-5d499bf58b-x5k9j" Nov 27 16:17:39 crc kubenswrapper[4707]: I1127 16:17:39.341593 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-66f4dd4bc7-dfw47"] Nov 27 16:17:39 crc kubenswrapper[4707]: I1127 16:17:39.343698 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"manila-operator-controller-manager-dockercfg-nl45f" Nov 27 16:17:39 crc kubenswrapper[4707]: I1127 16:17:39.347979 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-5d499bf58b-x5k9j"] Nov 27 16:17:39 crc kubenswrapper[4707]: I1127 16:17:39.353937 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/neutron-operator-controller-manager-6fdcddb789-4bqw7"] Nov 27 16:17:39 crc kubenswrapper[4707]: I1127 16:17:39.355031 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-6fdcddb789-4bqw7" Nov 27 16:17:39 crc kubenswrapper[4707]: I1127 16:17:39.357007 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-7b4567c7cf-4cd96"] Nov 27 16:17:39 crc kubenswrapper[4707]: I1127 16:17:39.367592 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/nova-operator-controller-manager-79556f57fc-q6vcr"] Nov 27 16:17:39 crc kubenswrapper[4707]: I1127 16:17:39.368615 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"neutron-operator-controller-manager-dockercfg-xtnmp" Nov 27 16:17:39 crc kubenswrapper[4707]: I1127 16:17:39.368743 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-79556f57fc-q6vcr" Nov 27 16:17:39 crc kubenswrapper[4707]: I1127 16:17:39.374154 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"nova-operator-controller-manager-dockercfg-87xfd" Nov 27 16:17:39 crc kubenswrapper[4707]: I1127 16:17:39.382470 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-7b64f4fb85-nhkfx" Nov 27 16:17:39 crc kubenswrapper[4707]: I1127 16:17:39.386562 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-79556f57fc-q6vcr"] Nov 27 16:17:39 crc kubenswrapper[4707]: I1127 16:17:39.389632 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-6fdcddb789-4bqw7"] Nov 27 16:17:39 crc kubenswrapper[4707]: I1127 16:17:39.389898 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mvlbb\" (UniqueName: \"kubernetes.io/projected/6d78cbb1-56e6-428d-bba4-5d1edbbda363-kube-api-access-mvlbb\") pod \"cinder-operator-controller-manager-6b7f75547b-sgwpt\" (UID: \"6d78cbb1-56e6-428d-bba4-5d1edbbda363\") " pod="openstack-operators/cinder-operator-controller-manager-6b7f75547b-sgwpt" Nov 27 16:17:39 crc kubenswrapper[4707]: I1127 16:17:39.389931 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xn4xd\" (UniqueName: \"kubernetes.io/projected/99785491-bcbd-4946-b1a6-a3e08a4394b5-kube-api-access-xn4xd\") pod \"glance-operator-controller-manager-589cbd6b5b-cd4lh\" (UID: \"99785491-bcbd-4946-b1a6-a3e08a4394b5\") " pod="openstack-operators/glance-operator-controller-manager-589cbd6b5b-cd4lh" Nov 27 16:17:39 crc kubenswrapper[4707]: I1127 16:17:39.389980 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gb96r\" (UniqueName: \"kubernetes.io/projected/dc2542ce-f2fd-454b-b47f-92d3bbc93d91-kube-api-access-gb96r\") pod \"keystone-operator-controller-manager-7b4567c7cf-4cd96\" (UID: \"dc2542ce-f2fd-454b-b47f-92d3bbc93d91\") " pod="openstack-operators/keystone-operator-controller-manager-7b4567c7cf-4cd96" Nov 27 16:17:39 crc kubenswrapper[4707]: I1127 16:17:39.390016 4707 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-qz4fj\" (UniqueName: \"kubernetes.io/projected/88f24787-fedc-4d08-9a8e-16a24f242d02-kube-api-access-qz4fj\") pod \"infra-operator-controller-manager-57548d458d-kwlbb\" (UID: \"88f24787-fedc-4d08-9a8e-16a24f242d02\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-kwlbb" Nov 27 16:17:39 crc kubenswrapper[4707]: I1127 16:17:39.390600 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bfwdq\" (UniqueName: \"kubernetes.io/projected/8fea437e-0a8c-4836-b23c-56db9c7ea0fc-kube-api-access-bfwdq\") pod \"ironic-operator-controller-manager-67cb4dc6d4-5bj45\" (UID: \"8fea437e-0a8c-4836-b23c-56db9c7ea0fc\") " pod="openstack-operators/ironic-operator-controller-manager-67cb4dc6d4-5bj45" Nov 27 16:17:39 crc kubenswrapper[4707]: I1127 16:17:39.390624 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fn8cx\" (UniqueName: \"kubernetes.io/projected/5e9f9859-1f28-4183-b71c-e9459e2746b7-kube-api-access-fn8cx\") pod \"designate-operator-controller-manager-955677c94-f5g2d\" (UID: \"5e9f9859-1f28-4183-b71c-e9459e2746b7\") " pod="openstack-operators/designate-operator-controller-manager-955677c94-f5g2d" Nov 27 16:17:39 crc kubenswrapper[4707]: I1127 16:17:39.390646 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6j25f\" (UniqueName: \"kubernetes.io/projected/ac946592-ee39-443e-b64a-980caaace080-kube-api-access-6j25f\") pod \"horizon-operator-controller-manager-5d494799bf-hg8j4\" (UID: \"ac946592-ee39-443e-b64a-980caaace080\") " pod="openstack-operators/horizon-operator-controller-manager-5d494799bf-hg8j4" Nov 27 16:17:39 crc kubenswrapper[4707]: I1127 16:17:39.390666 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nvv4d\" (UniqueName: 
\"kubernetes.io/projected/a2c3c27e-3e0c-4d4e-a5f7-3deaaae0f660-kube-api-access-nvv4d\") pod \"heat-operator-controller-manager-5b77f656f-8qhwz\" (UID: \"a2c3c27e-3e0c-4d4e-a5f7-3deaaae0f660\") " pod="openstack-operators/heat-operator-controller-manager-5b77f656f-8qhwz" Nov 27 16:17:39 crc kubenswrapper[4707]: I1127 16:17:39.390683 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/88f24787-fedc-4d08-9a8e-16a24f242d02-cert\") pod \"infra-operator-controller-manager-57548d458d-kwlbb\" (UID: \"88f24787-fedc-4d08-9a8e-16a24f242d02\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-kwlbb" Nov 27 16:17:39 crc kubenswrapper[4707]: E1127 16:17:39.390759 4707 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Nov 27 16:17:39 crc kubenswrapper[4707]: E1127 16:17:39.390801 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/88f24787-fedc-4d08-9a8e-16a24f242d02-cert podName:88f24787-fedc-4d08-9a8e-16a24f242d02 nodeName:}" failed. No retries permitted until 2025-11-27 16:17:39.89078567 +0000 UTC m=+835.522234438 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/88f24787-fedc-4d08-9a8e-16a24f242d02-cert") pod "infra-operator-controller-manager-57548d458d-kwlbb" (UID: "88f24787-fedc-4d08-9a8e-16a24f242d02") : secret "infra-operator-webhook-server-cert" not found Nov 27 16:17:39 crc kubenswrapper[4707]: I1127 16:17:39.396810 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/octavia-operator-controller-manager-64cdc6ff96-blnpn"] Nov 27 16:17:39 crc kubenswrapper[4707]: I1127 16:17:39.397759 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-64cdc6ff96-blnpn" Nov 27 16:17:39 crc kubenswrapper[4707]: I1127 16:17:39.402039 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"octavia-operator-controller-manager-dockercfg-fdd2n" Nov 27 16:17:39 crc kubenswrapper[4707]: I1127 16:17:39.404405 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-64cdc6ff96-blnpn"] Nov 27 16:17:39 crc kubenswrapper[4707]: I1127 16:17:39.407063 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fn8cx\" (UniqueName: \"kubernetes.io/projected/5e9f9859-1f28-4183-b71c-e9459e2746b7-kube-api-access-fn8cx\") pod \"designate-operator-controller-manager-955677c94-f5g2d\" (UID: \"5e9f9859-1f28-4183-b71c-e9459e2746b7\") " pod="openstack-operators/designate-operator-controller-manager-955677c94-f5g2d" Nov 27 16:17:39 crc kubenswrapper[4707]: I1127 16:17:39.407963 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qz4fj\" (UniqueName: \"kubernetes.io/projected/88f24787-fedc-4d08-9a8e-16a24f242d02-kube-api-access-qz4fj\") pod \"infra-operator-controller-manager-57548d458d-kwlbb\" (UID: \"88f24787-fedc-4d08-9a8e-16a24f242d02\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-kwlbb" Nov 27 16:17:39 crc kubenswrapper[4707]: I1127 16:17:39.412101 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ovn-operator-controller-manager-56897c768d-nbgp6"] Nov 27 16:17:39 crc kubenswrapper[4707]: I1127 16:17:39.412961 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bfwdq\" (UniqueName: \"kubernetes.io/projected/8fea437e-0a8c-4836-b23c-56db9c7ea0fc-kube-api-access-bfwdq\") pod \"ironic-operator-controller-manager-67cb4dc6d4-5bj45\" (UID: \"8fea437e-0a8c-4836-b23c-56db9c7ea0fc\") " 
pod="openstack-operators/ironic-operator-controller-manager-67cb4dc6d4-5bj45" Nov 27 16:17:39 crc kubenswrapper[4707]: I1127 16:17:39.413124 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-56897c768d-nbgp6" Nov 27 16:17:39 crc kubenswrapper[4707]: I1127 16:17:39.413804 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gb96r\" (UniqueName: \"kubernetes.io/projected/dc2542ce-f2fd-454b-b47f-92d3bbc93d91-kube-api-access-gb96r\") pod \"keystone-operator-controller-manager-7b4567c7cf-4cd96\" (UID: \"dc2542ce-f2fd-454b-b47f-92d3bbc93d91\") " pod="openstack-operators/keystone-operator-controller-manager-7b4567c7cf-4cd96" Nov 27 16:17:39 crc kubenswrapper[4707]: I1127 16:17:39.414623 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xn4xd\" (UniqueName: \"kubernetes.io/projected/99785491-bcbd-4946-b1a6-a3e08a4394b5-kube-api-access-xn4xd\") pod \"glance-operator-controller-manager-589cbd6b5b-cd4lh\" (UID: \"99785491-bcbd-4946-b1a6-a3e08a4394b5\") " pod="openstack-operators/glance-operator-controller-manager-589cbd6b5b-cd4lh" Nov 27 16:17:39 crc kubenswrapper[4707]: I1127 16:17:39.415563 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ovn-operator-controller-manager-dockercfg-qfkcp" Nov 27 16:17:39 crc kubenswrapper[4707]: I1127 16:17:39.417352 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mvlbb\" (UniqueName: \"kubernetes.io/projected/6d78cbb1-56e6-428d-bba4-5d1edbbda363-kube-api-access-mvlbb\") pod \"cinder-operator-controller-manager-6b7f75547b-sgwpt\" (UID: \"6d78cbb1-56e6-428d-bba4-5d1edbbda363\") " pod="openstack-operators/cinder-operator-controller-manager-6b7f75547b-sgwpt" Nov 27 16:17:39 crc kubenswrapper[4707]: I1127 16:17:39.420287 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-6j25f\" (UniqueName: \"kubernetes.io/projected/ac946592-ee39-443e-b64a-980caaace080-kube-api-access-6j25f\") pod \"horizon-operator-controller-manager-5d494799bf-hg8j4\" (UID: \"ac946592-ee39-443e-b64a-980caaace080\") " pod="openstack-operators/horizon-operator-controller-manager-5d494799bf-hg8j4" Nov 27 16:17:39 crc kubenswrapper[4707]: I1127 16:17:39.420293 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nvv4d\" (UniqueName: \"kubernetes.io/projected/a2c3c27e-3e0c-4d4e-a5f7-3deaaae0f660-kube-api-access-nvv4d\") pod \"heat-operator-controller-manager-5b77f656f-8qhwz\" (UID: \"a2c3c27e-3e0c-4d4e-a5f7-3deaaae0f660\") " pod="openstack-operators/heat-operator-controller-manager-5b77f656f-8qhwz" Nov 27 16:17:39 crc kubenswrapper[4707]: I1127 16:17:39.420956 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-56897c768d-nbgp6"] Nov 27 16:17:39 crc kubenswrapper[4707]: I1127 16:17:39.425136 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-5fcdb54b6brxbww"] Nov 27 16:17:39 crc kubenswrapper[4707]: I1127 16:17:39.427541 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-5fcdb54b6brxbww" Nov 27 16:17:39 crc kubenswrapper[4707]: I1127 16:17:39.428894 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/placement-operator-controller-manager-57988cc5b5-v48r5"] Nov 27 16:17:39 crc kubenswrapper[4707]: I1127 16:17:39.428983 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-webhook-server-cert" Nov 27 16:17:39 crc kubenswrapper[4707]: I1127 16:17:39.429334 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-fcf6c" Nov 27 16:17:39 crc kubenswrapper[4707]: I1127 16:17:39.429943 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-57988cc5b5-v48r5" Nov 27 16:17:39 crc kubenswrapper[4707]: I1127 16:17:39.429976 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-6b7f75547b-sgwpt" Nov 27 16:17:39 crc kubenswrapper[4707]: I1127 16:17:39.431569 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"placement-operator-controller-manager-dockercfg-gx2c2" Nov 27 16:17:39 crc kubenswrapper[4707]: I1127 16:17:39.432750 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-57988cc5b5-v48r5"] Nov 27 16:17:39 crc kubenswrapper[4707]: I1127 16:17:39.436297 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-5fcdb54b6brxbww"] Nov 27 16:17:39 crc kubenswrapper[4707]: I1127 16:17:39.443183 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-589cbd6b5b-cd4lh" Nov 27 16:17:39 crc kubenswrapper[4707]: I1127 16:17:39.443541 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/swift-operator-controller-manager-d77b94747-qdh2s"] Nov 27 16:17:39 crc kubenswrapper[4707]: I1127 16:17:39.444525 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-d77b94747-qdh2s" Nov 27 16:17:39 crc kubenswrapper[4707]: I1127 16:17:39.447588 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"swift-operator-controller-manager-dockercfg-t79zs" Nov 27 16:17:39 crc kubenswrapper[4707]: I1127 16:17:39.450784 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-d77b94747-qdh2s"] Nov 27 16:17:39 crc kubenswrapper[4707]: I1127 16:17:39.484783 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-5b77f656f-8qhwz" Nov 27 16:17:39 crc kubenswrapper[4707]: I1127 16:17:39.493415 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zxbrg\" (UniqueName: \"kubernetes.io/projected/126e10c6-2740-47fc-8331-a8e4bb6549b8-kube-api-access-zxbrg\") pod \"manila-operator-controller-manager-5d499bf58b-x5k9j\" (UID: \"126e10c6-2740-47fc-8331-a8e4bb6549b8\") " pod="openstack-operators/manila-operator-controller-manager-5d499bf58b-x5k9j" Nov 27 16:17:39 crc kubenswrapper[4707]: I1127 16:17:39.493460 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bpfws\" (UniqueName: \"kubernetes.io/projected/20e41446-8d89-481e-bd9f-48dc14efb82e-kube-api-access-bpfws\") pod \"nova-operator-controller-manager-79556f57fc-q6vcr\" (UID: \"20e41446-8d89-481e-bd9f-48dc14efb82e\") " pod="openstack-operators/nova-operator-controller-manager-79556f57fc-q6vcr" Nov 27 16:17:39 crc kubenswrapper[4707]: I1127 16:17:39.493494 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bwm9s\" (UniqueName: \"kubernetes.io/projected/6aa8bbf9-4d33-4b2e-b53a-d6a341b3d841-kube-api-access-bwm9s\") pod \"ovn-operator-controller-manager-56897c768d-nbgp6\" (UID: \"6aa8bbf9-4d33-4b2e-b53a-d6a341b3d841\") " pod="openstack-operators/ovn-operator-controller-manager-56897c768d-nbgp6" Nov 27 16:17:39 crc kubenswrapper[4707]: I1127 16:17:39.493516 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vx9hq\" (UniqueName: \"kubernetes.io/projected/efa00950-13dd-4e9e-a215-6ebb89006545-kube-api-access-vx9hq\") pod \"swift-operator-controller-manager-d77b94747-qdh2s\" (UID: \"efa00950-13dd-4e9e-a215-6ebb89006545\") " pod="openstack-operators/swift-operator-controller-manager-d77b94747-qdh2s" 
Nov 27 16:17:39 crc kubenswrapper[4707]: I1127 16:17:39.493540 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7761d2b0-8cc7-4dc8-a956-df20e2efc081-cert\") pod \"openstack-baremetal-operator-controller-manager-5fcdb54b6brxbww\" (UID: \"7761d2b0-8cc7-4dc8-a956-df20e2efc081\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-5fcdb54b6brxbww" Nov 27 16:17:39 crc kubenswrapper[4707]: I1127 16:17:39.493571 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sd9ht\" (UniqueName: \"kubernetes.io/projected/3c3bf501-b545-45a2-b186-2df94990295d-kube-api-access-sd9ht\") pod \"placement-operator-controller-manager-57988cc5b5-v48r5\" (UID: \"3c3bf501-b545-45a2-b186-2df94990295d\") " pod="openstack-operators/placement-operator-controller-manager-57988cc5b5-v48r5" Nov 27 16:17:39 crc kubenswrapper[4707]: I1127 16:17:39.493604 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fj89g\" (UniqueName: \"kubernetes.io/projected/7c88f676-b4d3-46b2-aedd-eff62f8f1bfb-kube-api-access-fj89g\") pod \"octavia-operator-controller-manager-64cdc6ff96-blnpn\" (UID: \"7c88f676-b4d3-46b2-aedd-eff62f8f1bfb\") " pod="openstack-operators/octavia-operator-controller-manager-64cdc6ff96-blnpn" Nov 27 16:17:39 crc kubenswrapper[4707]: I1127 16:17:39.493626 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jxs4g\" (UniqueName: \"kubernetes.io/projected/50771ff9-4409-4f41-ad3c-98f730dbff77-kube-api-access-jxs4g\") pod \"mariadb-operator-controller-manager-66f4dd4bc7-dfw47\" (UID: \"50771ff9-4409-4f41-ad3c-98f730dbff77\") " pod="openstack-operators/mariadb-operator-controller-manager-66f4dd4bc7-dfw47" Nov 27 16:17:39 crc kubenswrapper[4707]: I1127 16:17:39.493643 4707 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sfhb7\" (UniqueName: \"kubernetes.io/projected/7761d2b0-8cc7-4dc8-a956-df20e2efc081-kube-api-access-sfhb7\") pod \"openstack-baremetal-operator-controller-manager-5fcdb54b6brxbww\" (UID: \"7761d2b0-8cc7-4dc8-a956-df20e2efc081\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-5fcdb54b6brxbww" Nov 27 16:17:39 crc kubenswrapper[4707]: I1127 16:17:39.493666 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4spzl\" (UniqueName: \"kubernetes.io/projected/9377949b-5979-44ff-bd3f-ea1389b4ef6f-kube-api-access-4spzl\") pod \"neutron-operator-controller-manager-6fdcddb789-4bqw7\" (UID: \"9377949b-5979-44ff-bd3f-ea1389b4ef6f\") " pod="openstack-operators/neutron-operator-controller-manager-6fdcddb789-4bqw7" Nov 27 16:17:39 crc kubenswrapper[4707]: I1127 16:17:39.504554 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-8665cb7d49-w7xqm"] Nov 27 16:17:39 crc kubenswrapper[4707]: I1127 16:17:39.505624 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-8665cb7d49-w7xqm" Nov 27 16:17:39 crc kubenswrapper[4707]: I1127 16:17:39.509513 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"telemetry-operator-controller-manager-dockercfg-mbz2k" Nov 27 16:17:39 crc kubenswrapper[4707]: I1127 16:17:39.528044 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-5d494799bf-hg8j4" Nov 27 16:17:39 crc kubenswrapper[4707]: I1127 16:17:39.539793 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-67cb4dc6d4-5bj45" Nov 27 16:17:39 crc kubenswrapper[4707]: I1127 16:17:39.541540 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-8665cb7d49-w7xqm"] Nov 27 16:17:39 crc kubenswrapper[4707]: I1127 16:17:39.611194 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vx9hq\" (UniqueName: \"kubernetes.io/projected/efa00950-13dd-4e9e-a215-6ebb89006545-kube-api-access-vx9hq\") pod \"swift-operator-controller-manager-d77b94747-qdh2s\" (UID: \"efa00950-13dd-4e9e-a215-6ebb89006545\") " pod="openstack-operators/swift-operator-controller-manager-d77b94747-qdh2s" Nov 27 16:17:39 crc kubenswrapper[4707]: I1127 16:17:39.611559 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7761d2b0-8cc7-4dc8-a956-df20e2efc081-cert\") pod \"openstack-baremetal-operator-controller-manager-5fcdb54b6brxbww\" (UID: \"7761d2b0-8cc7-4dc8-a956-df20e2efc081\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-5fcdb54b6brxbww" Nov 27 16:17:39 crc kubenswrapper[4707]: I1127 16:17:39.611644 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sd9ht\" (UniqueName: \"kubernetes.io/projected/3c3bf501-b545-45a2-b186-2df94990295d-kube-api-access-sd9ht\") pod \"placement-operator-controller-manager-57988cc5b5-v48r5\" (UID: \"3c3bf501-b545-45a2-b186-2df94990295d\") " pod="openstack-operators/placement-operator-controller-manager-57988cc5b5-v48r5" Nov 27 16:17:39 crc kubenswrapper[4707]: I1127 16:17:39.611687 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fj89g\" (UniqueName: \"kubernetes.io/projected/7c88f676-b4d3-46b2-aedd-eff62f8f1bfb-kube-api-access-fj89g\") pod 
\"octavia-operator-controller-manager-64cdc6ff96-blnpn\" (UID: \"7c88f676-b4d3-46b2-aedd-eff62f8f1bfb\") " pod="openstack-operators/octavia-operator-controller-manager-64cdc6ff96-blnpn" Nov 27 16:17:39 crc kubenswrapper[4707]: I1127 16:17:39.611732 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jxs4g\" (UniqueName: \"kubernetes.io/projected/50771ff9-4409-4f41-ad3c-98f730dbff77-kube-api-access-jxs4g\") pod \"mariadb-operator-controller-manager-66f4dd4bc7-dfw47\" (UID: \"50771ff9-4409-4f41-ad3c-98f730dbff77\") " pod="openstack-operators/mariadb-operator-controller-manager-66f4dd4bc7-dfw47" Nov 27 16:17:39 crc kubenswrapper[4707]: I1127 16:17:39.611756 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sfhb7\" (UniqueName: \"kubernetes.io/projected/7761d2b0-8cc7-4dc8-a956-df20e2efc081-kube-api-access-sfhb7\") pod \"openstack-baremetal-operator-controller-manager-5fcdb54b6brxbww\" (UID: \"7761d2b0-8cc7-4dc8-a956-df20e2efc081\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-5fcdb54b6brxbww" Nov 27 16:17:39 crc kubenswrapper[4707]: E1127 16:17:39.611776 4707 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Nov 27 16:17:39 crc kubenswrapper[4707]: I1127 16:17:39.611795 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4spzl\" (UniqueName: \"kubernetes.io/projected/9377949b-5979-44ff-bd3f-ea1389b4ef6f-kube-api-access-4spzl\") pod \"neutron-operator-controller-manager-6fdcddb789-4bqw7\" (UID: \"9377949b-5979-44ff-bd3f-ea1389b4ef6f\") " pod="openstack-operators/neutron-operator-controller-manager-6fdcddb789-4bqw7" Nov 27 16:17:39 crc kubenswrapper[4707]: E1127 16:17:39.611837 4707 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/7761d2b0-8cc7-4dc8-a956-df20e2efc081-cert podName:7761d2b0-8cc7-4dc8-a956-df20e2efc081 nodeName:}" failed. No retries permitted until 2025-11-27 16:17:40.111815814 +0000 UTC m=+835.743264582 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/7761d2b0-8cc7-4dc8-a956-df20e2efc081-cert") pod "openstack-baremetal-operator-controller-manager-5fcdb54b6brxbww" (UID: "7761d2b0-8cc7-4dc8-a956-df20e2efc081") : secret "openstack-baremetal-operator-webhook-server-cert" not found Nov 27 16:17:39 crc kubenswrapper[4707]: I1127 16:17:39.611856 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zxbrg\" (UniqueName: \"kubernetes.io/projected/126e10c6-2740-47fc-8331-a8e4bb6549b8-kube-api-access-zxbrg\") pod \"manila-operator-controller-manager-5d499bf58b-x5k9j\" (UID: \"126e10c6-2740-47fc-8331-a8e4bb6549b8\") " pod="openstack-operators/manila-operator-controller-manager-5d499bf58b-x5k9j" Nov 27 16:17:39 crc kubenswrapper[4707]: I1127 16:17:39.611897 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bpfws\" (UniqueName: \"kubernetes.io/projected/20e41446-8d89-481e-bd9f-48dc14efb82e-kube-api-access-bpfws\") pod \"nova-operator-controller-manager-79556f57fc-q6vcr\" (UID: \"20e41446-8d89-481e-bd9f-48dc14efb82e\") " pod="openstack-operators/nova-operator-controller-manager-79556f57fc-q6vcr" Nov 27 16:17:39 crc kubenswrapper[4707]: I1127 16:17:39.611964 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bwm9s\" (UniqueName: \"kubernetes.io/projected/6aa8bbf9-4d33-4b2e-b53a-d6a341b3d841-kube-api-access-bwm9s\") pod \"ovn-operator-controller-manager-56897c768d-nbgp6\" (UID: \"6aa8bbf9-4d33-4b2e-b53a-d6a341b3d841\") " pod="openstack-operators/ovn-operator-controller-manager-56897c768d-nbgp6" Nov 27 16:17:39 crc kubenswrapper[4707]: I1127 16:17:39.615903 
4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-7b4567c7cf-4cd96" Nov 27 16:17:39 crc kubenswrapper[4707]: I1127 16:17:39.652760 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sfhb7\" (UniqueName: \"kubernetes.io/projected/7761d2b0-8cc7-4dc8-a956-df20e2efc081-kube-api-access-sfhb7\") pod \"openstack-baremetal-operator-controller-manager-5fcdb54b6brxbww\" (UID: \"7761d2b0-8cc7-4dc8-a956-df20e2efc081\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-5fcdb54b6brxbww" Nov 27 16:17:39 crc kubenswrapper[4707]: I1127 16:17:39.659108 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sd9ht\" (UniqueName: \"kubernetes.io/projected/3c3bf501-b545-45a2-b186-2df94990295d-kube-api-access-sd9ht\") pod \"placement-operator-controller-manager-57988cc5b5-v48r5\" (UID: \"3c3bf501-b545-45a2-b186-2df94990295d\") " pod="openstack-operators/placement-operator-controller-manager-57988cc5b5-v48r5" Nov 27 16:17:39 crc kubenswrapper[4707]: I1127 16:17:39.662171 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fj89g\" (UniqueName: \"kubernetes.io/projected/7c88f676-b4d3-46b2-aedd-eff62f8f1bfb-kube-api-access-fj89g\") pod \"octavia-operator-controller-manager-64cdc6ff96-blnpn\" (UID: \"7c88f676-b4d3-46b2-aedd-eff62f8f1bfb\") " pod="openstack-operators/octavia-operator-controller-manager-64cdc6ff96-blnpn" Nov 27 16:17:39 crc kubenswrapper[4707]: I1127 16:17:39.633109 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/test-operator-controller-manager-5cd6c7f4c8-qwb4t"] Nov 27 16:17:39 crc kubenswrapper[4707]: I1127 16:17:39.664816 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vx9hq\" (UniqueName: \"kubernetes.io/projected/efa00950-13dd-4e9e-a215-6ebb89006545-kube-api-access-vx9hq\") pod 
\"swift-operator-controller-manager-d77b94747-qdh2s\" (UID: \"efa00950-13dd-4e9e-a215-6ebb89006545\") " pod="openstack-operators/swift-operator-controller-manager-d77b94747-qdh2s" Nov 27 16:17:39 crc kubenswrapper[4707]: I1127 16:17:39.664836 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-5cd6c7f4c8-qwb4t" Nov 27 16:17:39 crc kubenswrapper[4707]: I1127 16:17:39.665230 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bwm9s\" (UniqueName: \"kubernetes.io/projected/6aa8bbf9-4d33-4b2e-b53a-d6a341b3d841-kube-api-access-bwm9s\") pod \"ovn-operator-controller-manager-56897c768d-nbgp6\" (UID: \"6aa8bbf9-4d33-4b2e-b53a-d6a341b3d841\") " pod="openstack-operators/ovn-operator-controller-manager-56897c768d-nbgp6" Nov 27 16:17:39 crc kubenswrapper[4707]: I1127 16:17:39.665480 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zxbrg\" (UniqueName: \"kubernetes.io/projected/126e10c6-2740-47fc-8331-a8e4bb6549b8-kube-api-access-zxbrg\") pod \"manila-operator-controller-manager-5d499bf58b-x5k9j\" (UID: \"126e10c6-2740-47fc-8331-a8e4bb6549b8\") " pod="openstack-operators/manila-operator-controller-manager-5d499bf58b-x5k9j" Nov 27 16:17:39 crc kubenswrapper[4707]: I1127 16:17:39.666650 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jxs4g\" (UniqueName: \"kubernetes.io/projected/50771ff9-4409-4f41-ad3c-98f730dbff77-kube-api-access-jxs4g\") pod \"mariadb-operator-controller-manager-66f4dd4bc7-dfw47\" (UID: \"50771ff9-4409-4f41-ad3c-98f730dbff77\") " pod="openstack-operators/mariadb-operator-controller-manager-66f4dd4bc7-dfw47" Nov 27 16:17:39 crc kubenswrapper[4707]: I1127 16:17:39.668080 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"test-operator-controller-manager-dockercfg-zpw66" Nov 27 16:17:39 crc kubenswrapper[4707]: I1127 
16:17:39.669283 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-5d499bf58b-x5k9j" Nov 27 16:17:39 crc kubenswrapper[4707]: I1127 16:17:39.674173 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bpfws\" (UniqueName: \"kubernetes.io/projected/20e41446-8d89-481e-bd9f-48dc14efb82e-kube-api-access-bpfws\") pod \"nova-operator-controller-manager-79556f57fc-q6vcr\" (UID: \"20e41446-8d89-481e-bd9f-48dc14efb82e\") " pod="openstack-operators/nova-operator-controller-manager-79556f57fc-q6vcr" Nov 27 16:17:39 crc kubenswrapper[4707]: I1127 16:17:39.676681 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4spzl\" (UniqueName: \"kubernetes.io/projected/9377949b-5979-44ff-bd3f-ea1389b4ef6f-kube-api-access-4spzl\") pod \"neutron-operator-controller-manager-6fdcddb789-4bqw7\" (UID: \"9377949b-5979-44ff-bd3f-ea1389b4ef6f\") " pod="openstack-operators/neutron-operator-controller-manager-6fdcddb789-4bqw7" Nov 27 16:17:39 crc kubenswrapper[4707]: I1127 16:17:39.682096 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-5cd6c7f4c8-qwb4t"] Nov 27 16:17:39 crc kubenswrapper[4707]: I1127 16:17:39.688710 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-6fdcddb789-4bqw7" Nov 27 16:17:39 crc kubenswrapper[4707]: I1127 16:17:39.695719 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/watcher-operator-controller-manager-656dcb59d4-hgbtk"] Nov 27 16:17:39 crc kubenswrapper[4707]: I1127 16:17:39.696889 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-656dcb59d4-hgbtk" Nov 27 16:17:39 crc kubenswrapper[4707]: I1127 16:17:39.698505 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"watcher-operator-controller-manager-dockercfg-vbvmt" Nov 27 16:17:39 crc kubenswrapper[4707]: I1127 16:17:39.712712 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-79556f57fc-q6vcr" Nov 27 16:17:39 crc kubenswrapper[4707]: I1127 16:17:39.713884 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-955677c94-f5g2d" Nov 27 16:17:39 crc kubenswrapper[4707]: I1127 16:17:39.715239 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-d77b94747-qdh2s" Nov 27 16:17:39 crc kubenswrapper[4707]: I1127 16:17:39.720666 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jfcls\" (UniqueName: \"kubernetes.io/projected/94ba089c-d890-4d57-abc0-258a2b54a6f9-kube-api-access-jfcls\") pod \"telemetry-operator-controller-manager-8665cb7d49-w7xqm\" (UID: \"94ba089c-d890-4d57-abc0-258a2b54a6f9\") " pod="openstack-operators/telemetry-operator-controller-manager-8665cb7d49-w7xqm" Nov 27 16:17:39 crc kubenswrapper[4707]: I1127 16:17:39.731895 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-656dcb59d4-hgbtk"] Nov 27 16:17:39 crc kubenswrapper[4707]: I1127 16:17:39.819827 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-manager-54fcbb7454-dqgm9"] Nov 27 16:17:39 crc kubenswrapper[4707]: I1127 16:17:39.827462 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"kube-api-access-cpzf4\" (UniqueName: \"kubernetes.io/projected/00e99b4b-2bbd-445a-b075-74c47fe30f79-kube-api-access-cpzf4\") pod \"watcher-operator-controller-manager-656dcb59d4-hgbtk\" (UID: \"00e99b4b-2bbd-445a-b075-74c47fe30f79\") " pod="openstack-operators/watcher-operator-controller-manager-656dcb59d4-hgbtk" Nov 27 16:17:39 crc kubenswrapper[4707]: I1127 16:17:39.827545 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s5tzz\" (UniqueName: \"kubernetes.io/projected/97bb6c80-6996-4e91-bcdf-0f1c20e72fa3-kube-api-access-s5tzz\") pod \"test-operator-controller-manager-5cd6c7f4c8-qwb4t\" (UID: \"97bb6c80-6996-4e91-bcdf-0f1c20e72fa3\") " pod="openstack-operators/test-operator-controller-manager-5cd6c7f4c8-qwb4t" Nov 27 16:17:39 crc kubenswrapper[4707]: I1127 16:17:39.827601 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jfcls\" (UniqueName: \"kubernetes.io/projected/94ba089c-d890-4d57-abc0-258a2b54a6f9-kube-api-access-jfcls\") pod \"telemetry-operator-controller-manager-8665cb7d49-w7xqm\" (UID: \"94ba089c-d890-4d57-abc0-258a2b54a6f9\") " pod="openstack-operators/telemetry-operator-controller-manager-8665cb7d49-w7xqm" Nov 27 16:17:39 crc kubenswrapper[4707]: I1127 16:17:39.836254 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-54fcbb7454-dqgm9"] Nov 27 16:17:39 crc kubenswrapper[4707]: I1127 16:17:39.836350 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-54fcbb7454-dqgm9" Nov 27 16:17:39 crc kubenswrapper[4707]: I1127 16:17:39.840088 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"webhook-server-cert" Nov 27 16:17:39 crc kubenswrapper[4707]: I1127 16:17:39.840290 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-n8k5f" Nov 27 16:17:39 crc kubenswrapper[4707]: I1127 16:17:39.840407 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"metrics-server-cert" Nov 27 16:17:39 crc kubenswrapper[4707]: I1127 16:17:39.841115 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-64cdc6ff96-blnpn" Nov 27 16:17:39 crc kubenswrapper[4707]: I1127 16:17:39.853923 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jfcls\" (UniqueName: \"kubernetes.io/projected/94ba089c-d890-4d57-abc0-258a2b54a6f9-kube-api-access-jfcls\") pod \"telemetry-operator-controller-manager-8665cb7d49-w7xqm\" (UID: \"94ba089c-d890-4d57-abc0-258a2b54a6f9\") " pod="openstack-operators/telemetry-operator-controller-manager-8665cb7d49-w7xqm" Nov 27 16:17:39 crc kubenswrapper[4707]: I1127 16:17:39.855485 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-56897c768d-nbgp6" Nov 27 16:17:39 crc kubenswrapper[4707]: I1127 16:17:39.884625 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-v96x7"] Nov 27 16:17:39 crc kubenswrapper[4707]: I1127 16:17:39.885632 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-v96x7" Nov 27 16:17:39 crc kubenswrapper[4707]: I1127 16:17:39.893568 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"rabbitmq-cluster-operator-controller-manager-dockercfg-9sfs7" Nov 27 16:17:39 crc kubenswrapper[4707]: I1127 16:17:39.929813 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cpzf4\" (UniqueName: \"kubernetes.io/projected/00e99b4b-2bbd-445a-b075-74c47fe30f79-kube-api-access-cpzf4\") pod \"watcher-operator-controller-manager-656dcb59d4-hgbtk\" (UID: \"00e99b4b-2bbd-445a-b075-74c47fe30f79\") " pod="openstack-operators/watcher-operator-controller-manager-656dcb59d4-hgbtk" Nov 27 16:17:39 crc kubenswrapper[4707]: I1127 16:17:39.929890 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s5tzz\" (UniqueName: \"kubernetes.io/projected/97bb6c80-6996-4e91-bcdf-0f1c20e72fa3-kube-api-access-s5tzz\") pod \"test-operator-controller-manager-5cd6c7f4c8-qwb4t\" (UID: \"97bb6c80-6996-4e91-bcdf-0f1c20e72fa3\") " pod="openstack-operators/test-operator-controller-manager-5cd6c7f4c8-qwb4t" Nov 27 16:17:39 crc kubenswrapper[4707]: I1127 16:17:39.929919 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/88f24787-fedc-4d08-9a8e-16a24f242d02-cert\") pod \"infra-operator-controller-manager-57548d458d-kwlbb\" (UID: \"88f24787-fedc-4d08-9a8e-16a24f242d02\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-kwlbb" Nov 27 16:17:39 crc kubenswrapper[4707]: I1127 16:17:39.929964 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p9jdv\" (UniqueName: \"kubernetes.io/projected/6647f986-9d62-4939-907b-fde960b30a37-kube-api-access-p9jdv\") pod 
\"rabbitmq-cluster-operator-manager-668c99d594-v96x7\" (UID: \"6647f986-9d62-4939-907b-fde960b30a37\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-v96x7" Nov 27 16:17:39 crc kubenswrapper[4707]: I1127 16:17:39.930030 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-v96x7"] Nov 27 16:17:39 crc kubenswrapper[4707]: E1127 16:17:39.930310 4707 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Nov 27 16:17:39 crc kubenswrapper[4707]: E1127 16:17:39.930348 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/88f24787-fedc-4d08-9a8e-16a24f242d02-cert podName:88f24787-fedc-4d08-9a8e-16a24f242d02 nodeName:}" failed. No retries permitted until 2025-11-27 16:17:40.930334746 +0000 UTC m=+836.561783514 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/88f24787-fedc-4d08-9a8e-16a24f242d02-cert") pod "infra-operator-controller-manager-57548d458d-kwlbb" (UID: "88f24787-fedc-4d08-9a8e-16a24f242d02") : secret "infra-operator-webhook-server-cert" not found Nov 27 16:17:39 crc kubenswrapper[4707]: I1127 16:17:39.956827 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-66f4dd4bc7-dfw47" Nov 27 16:17:39 crc kubenswrapper[4707]: I1127 16:17:39.962501 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-57988cc5b5-v48r5" Nov 27 16:17:39 crc kubenswrapper[4707]: I1127 16:17:39.971103 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s5tzz\" (UniqueName: \"kubernetes.io/projected/97bb6c80-6996-4e91-bcdf-0f1c20e72fa3-kube-api-access-s5tzz\") pod \"test-operator-controller-manager-5cd6c7f4c8-qwb4t\" (UID: \"97bb6c80-6996-4e91-bcdf-0f1c20e72fa3\") " pod="openstack-operators/test-operator-controller-manager-5cd6c7f4c8-qwb4t" Nov 27 16:17:39 crc kubenswrapper[4707]: I1127 16:17:39.971624 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cpzf4\" (UniqueName: \"kubernetes.io/projected/00e99b4b-2bbd-445a-b075-74c47fe30f79-kube-api-access-cpzf4\") pod \"watcher-operator-controller-manager-656dcb59d4-hgbtk\" (UID: \"00e99b4b-2bbd-445a-b075-74c47fe30f79\") " pod="openstack-operators/watcher-operator-controller-manager-656dcb59d4-hgbtk" Nov 27 16:17:40 crc kubenswrapper[4707]: I1127 16:17:40.031258 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/894c1749-fccf-4178-b7a8-6c63e18266f6-metrics-certs\") pod \"openstack-operator-controller-manager-54fcbb7454-dqgm9\" (UID: \"894c1749-fccf-4178-b7a8-6c63e18266f6\") " pod="openstack-operators/openstack-operator-controller-manager-54fcbb7454-dqgm9" Nov 27 16:17:40 crc kubenswrapper[4707]: I1127 16:17:40.031384 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9tjqm\" (UniqueName: \"kubernetes.io/projected/894c1749-fccf-4178-b7a8-6c63e18266f6-kube-api-access-9tjqm\") pod \"openstack-operator-controller-manager-54fcbb7454-dqgm9\" (UID: \"894c1749-fccf-4178-b7a8-6c63e18266f6\") " pod="openstack-operators/openstack-operator-controller-manager-54fcbb7454-dqgm9" Nov 27 16:17:40 crc kubenswrapper[4707]: 
I1127 16:17:40.031414 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p9jdv\" (UniqueName: \"kubernetes.io/projected/6647f986-9d62-4939-907b-fde960b30a37-kube-api-access-p9jdv\") pod \"rabbitmq-cluster-operator-manager-668c99d594-v96x7\" (UID: \"6647f986-9d62-4939-907b-fde960b30a37\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-v96x7" Nov 27 16:17:40 crc kubenswrapper[4707]: I1127 16:17:40.031452 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/894c1749-fccf-4178-b7a8-6c63e18266f6-webhook-certs\") pod \"openstack-operator-controller-manager-54fcbb7454-dqgm9\" (UID: \"894c1749-fccf-4178-b7a8-6c63e18266f6\") " pod="openstack-operators/openstack-operator-controller-manager-54fcbb7454-dqgm9" Nov 27 16:17:40 crc kubenswrapper[4707]: I1127 16:17:40.044217 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-8665cb7d49-w7xqm" Nov 27 16:17:40 crc kubenswrapper[4707]: I1127 16:17:40.051251 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p9jdv\" (UniqueName: \"kubernetes.io/projected/6647f986-9d62-4939-907b-fde960b30a37-kube-api-access-p9jdv\") pod \"rabbitmq-cluster-operator-manager-668c99d594-v96x7\" (UID: \"6647f986-9d62-4939-907b-fde960b30a37\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-v96x7" Nov 27 16:17:40 crc kubenswrapper[4707]: I1127 16:17:40.056985 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/test-operator-controller-manager-5cd6c7f4c8-qwb4t" Nov 27 16:17:40 crc kubenswrapper[4707]: I1127 16:17:40.103030 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-7b64f4fb85-nhkfx"] Nov 27 16:17:40 crc kubenswrapper[4707]: W1127 16:17:40.116541 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0267ac3a_4bee_42b9_a506_e2b1e1e3726e.slice/crio-3d3bea326ef31c006a55ec5fd4eff2159f61076cead4ab1d65b7a651374f60c2 WatchSource:0}: Error finding container 3d3bea326ef31c006a55ec5fd4eff2159f61076cead4ab1d65b7a651374f60c2: Status 404 returned error can't find the container with id 3d3bea326ef31c006a55ec5fd4eff2159f61076cead4ab1d65b7a651374f60c2 Nov 27 16:17:40 crc kubenswrapper[4707]: I1127 16:17:40.133258 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-656dcb59d4-hgbtk" Nov 27 16:17:40 crc kubenswrapper[4707]: I1127 16:17:40.133916 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/894c1749-fccf-4178-b7a8-6c63e18266f6-metrics-certs\") pod \"openstack-operator-controller-manager-54fcbb7454-dqgm9\" (UID: \"894c1749-fccf-4178-b7a8-6c63e18266f6\") " pod="openstack-operators/openstack-operator-controller-manager-54fcbb7454-dqgm9" Nov 27 16:17:40 crc kubenswrapper[4707]: I1127 16:17:40.134012 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9tjqm\" (UniqueName: \"kubernetes.io/projected/894c1749-fccf-4178-b7a8-6c63e18266f6-kube-api-access-9tjqm\") pod \"openstack-operator-controller-manager-54fcbb7454-dqgm9\" (UID: \"894c1749-fccf-4178-b7a8-6c63e18266f6\") " pod="openstack-operators/openstack-operator-controller-manager-54fcbb7454-dqgm9" Nov 27 16:17:40 crc 
kubenswrapper[4707]: I1127 16:17:40.134045 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/894c1749-fccf-4178-b7a8-6c63e18266f6-webhook-certs\") pod \"openstack-operator-controller-manager-54fcbb7454-dqgm9\" (UID: \"894c1749-fccf-4178-b7a8-6c63e18266f6\") " pod="openstack-operators/openstack-operator-controller-manager-54fcbb7454-dqgm9" Nov 27 16:17:40 crc kubenswrapper[4707]: I1127 16:17:40.134082 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7761d2b0-8cc7-4dc8-a956-df20e2efc081-cert\") pod \"openstack-baremetal-operator-controller-manager-5fcdb54b6brxbww\" (UID: \"7761d2b0-8cc7-4dc8-a956-df20e2efc081\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-5fcdb54b6brxbww" Nov 27 16:17:40 crc kubenswrapper[4707]: E1127 16:17:40.134183 4707 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Nov 27 16:17:40 crc kubenswrapper[4707]: E1127 16:17:40.134238 4707 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Nov 27 16:17:40 crc kubenswrapper[4707]: E1127 16:17:40.134246 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/894c1749-fccf-4178-b7a8-6c63e18266f6-metrics-certs podName:894c1749-fccf-4178-b7a8-6c63e18266f6 nodeName:}" failed. No retries permitted until 2025-11-27 16:17:40.63422946 +0000 UTC m=+836.265678228 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/894c1749-fccf-4178-b7a8-6c63e18266f6-metrics-certs") pod "openstack-operator-controller-manager-54fcbb7454-dqgm9" (UID: "894c1749-fccf-4178-b7a8-6c63e18266f6") : secret "metrics-server-cert" not found Nov 27 16:17:40 crc kubenswrapper[4707]: E1127 16:17:40.134183 4707 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Nov 27 16:17:40 crc kubenswrapper[4707]: E1127 16:17:40.134311 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/894c1749-fccf-4178-b7a8-6c63e18266f6-webhook-certs podName:894c1749-fccf-4178-b7a8-6c63e18266f6 nodeName:}" failed. No retries permitted until 2025-11-27 16:17:40.634288692 +0000 UTC m=+836.265737460 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/894c1749-fccf-4178-b7a8-6c63e18266f6-webhook-certs") pod "openstack-operator-controller-manager-54fcbb7454-dqgm9" (UID: "894c1749-fccf-4178-b7a8-6c63e18266f6") : secret "webhook-server-cert" not found Nov 27 16:17:40 crc kubenswrapper[4707]: E1127 16:17:40.134415 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7761d2b0-8cc7-4dc8-a956-df20e2efc081-cert podName:7761d2b0-8cc7-4dc8-a956-df20e2efc081 nodeName:}" failed. No retries permitted until 2025-11-27 16:17:41.134391124 +0000 UTC m=+836.765839892 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/7761d2b0-8cc7-4dc8-a956-df20e2efc081-cert") pod "openstack-baremetal-operator-controller-manager-5fcdb54b6brxbww" (UID: "7761d2b0-8cc7-4dc8-a956-df20e2efc081") : secret "openstack-baremetal-operator-webhook-server-cert" not found Nov 27 16:17:40 crc kubenswrapper[4707]: I1127 16:17:40.160963 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9tjqm\" (UniqueName: \"kubernetes.io/projected/894c1749-fccf-4178-b7a8-6c63e18266f6-kube-api-access-9tjqm\") pod \"openstack-operator-controller-manager-54fcbb7454-dqgm9\" (UID: \"894c1749-fccf-4178-b7a8-6c63e18266f6\") " pod="openstack-operators/openstack-operator-controller-manager-54fcbb7454-dqgm9" Nov 27 16:17:40 crc kubenswrapper[4707]: I1127 16:17:40.199809 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-7b64f4fb85-nhkfx" event={"ID":"0267ac3a-4bee-42b9-a506-e2b1e1e3726e","Type":"ContainerStarted","Data":"3d3bea326ef31c006a55ec5fd4eff2159f61076cead4ab1d65b7a651374f60c2"} Nov 27 16:17:40 crc kubenswrapper[4707]: I1127 16:17:40.221558 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-v96x7" Nov 27 16:17:40 crc kubenswrapper[4707]: I1127 16:17:40.273337 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-5b77f656f-8qhwz"] Nov 27 16:17:40 crc kubenswrapper[4707]: I1127 16:17:40.278694 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-589cbd6b5b-cd4lh"] Nov 27 16:17:40 crc kubenswrapper[4707]: I1127 16:17:40.288205 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-6b7f75547b-sgwpt"] Nov 27 16:17:40 crc kubenswrapper[4707]: W1127 16:17:40.298188 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6d78cbb1_56e6_428d_bba4_5d1edbbda363.slice/crio-3140571e048099653af9bf686f2f225898faf2964838888777ab24640bf5be64 WatchSource:0}: Error finding container 3140571e048099653af9bf686f2f225898faf2964838888777ab24640bf5be64: Status 404 returned error can't find the container with id 3140571e048099653af9bf686f2f225898faf2964838888777ab24640bf5be64 Nov 27 16:17:40 crc kubenswrapper[4707]: I1127 16:17:40.423639 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-7b4567c7cf-4cd96"] Nov 27 16:17:40 crc kubenswrapper[4707]: I1127 16:17:40.428549 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-955677c94-f5g2d"] Nov 27 16:17:40 crc kubenswrapper[4707]: W1127 16:17:40.429396 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5e9f9859_1f28_4183_b71c_e9459e2746b7.slice/crio-935f858f79eec61dea23681fffd51621cb8d7a07a1397c344ce05f2eafc07099 WatchSource:0}: Error finding container 
935f858f79eec61dea23681fffd51621cb8d7a07a1397c344ce05f2eafc07099: Status 404 returned error can't find the container with id 935f858f79eec61dea23681fffd51621cb8d7a07a1397c344ce05f2eafc07099 Nov 27 16:17:40 crc kubenswrapper[4707]: I1127 16:17:40.432438 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-5d494799bf-hg8j4"] Nov 27 16:17:40 crc kubenswrapper[4707]: W1127 16:17:40.433057 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddc2542ce_f2fd_454b_b47f_92d3bbc93d91.slice/crio-4067d59b09fbb0e02f6b1f4c1f11c9d1e12a159fb7abc0b7473c028688f581f2 WatchSource:0}: Error finding container 4067d59b09fbb0e02f6b1f4c1f11c9d1e12a159fb7abc0b7473c028688f581f2: Status 404 returned error can't find the container with id 4067d59b09fbb0e02f6b1f4c1f11c9d1e12a159fb7abc0b7473c028688f581f2 Nov 27 16:17:40 crc kubenswrapper[4707]: I1127 16:17:40.436556 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-5d499bf58b-x5k9j"] Nov 27 16:17:40 crc kubenswrapper[4707]: I1127 16:17:40.439938 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-6fdcddb789-4bqw7"] Nov 27 16:17:40 crc kubenswrapper[4707]: I1127 16:17:40.624636 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-56897c768d-nbgp6"] Nov 27 16:17:40 crc kubenswrapper[4707]: W1127 16:17:40.629576 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6aa8bbf9_4d33_4b2e_b53a_d6a341b3d841.slice/crio-ad3e4d8501ace9c56b79473089eb26c723b96cd5284125525a7ae8632134b81a WatchSource:0}: Error finding container ad3e4d8501ace9c56b79473089eb26c723b96cd5284125525a7ae8632134b81a: Status 404 returned error can't find the container with id 
ad3e4d8501ace9c56b79473089eb26c723b96cd5284125525a7ae8632134b81a Nov 27 16:17:40 crc kubenswrapper[4707]: I1127 16:17:40.634407 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-79556f57fc-q6vcr"] Nov 27 16:17:40 crc kubenswrapper[4707]: I1127 16:17:40.641969 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/894c1749-fccf-4178-b7a8-6c63e18266f6-metrics-certs\") pod \"openstack-operator-controller-manager-54fcbb7454-dqgm9\" (UID: \"894c1749-fccf-4178-b7a8-6c63e18266f6\") " pod="openstack-operators/openstack-operator-controller-manager-54fcbb7454-dqgm9" Nov 27 16:17:40 crc kubenswrapper[4707]: I1127 16:17:40.642056 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/894c1749-fccf-4178-b7a8-6c63e18266f6-webhook-certs\") pod \"openstack-operator-controller-manager-54fcbb7454-dqgm9\" (UID: \"894c1749-fccf-4178-b7a8-6c63e18266f6\") " pod="openstack-operators/openstack-operator-controller-manager-54fcbb7454-dqgm9" Nov 27 16:17:40 crc kubenswrapper[4707]: E1127 16:17:40.642183 4707 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Nov 27 16:17:40 crc kubenswrapper[4707]: E1127 16:17:40.642230 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/894c1749-fccf-4178-b7a8-6c63e18266f6-webhook-certs podName:894c1749-fccf-4178-b7a8-6c63e18266f6 nodeName:}" failed. No retries permitted until 2025-11-27 16:17:41.642215632 +0000 UTC m=+837.273664400 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/894c1749-fccf-4178-b7a8-6c63e18266f6-webhook-certs") pod "openstack-operator-controller-manager-54fcbb7454-dqgm9" (UID: "894c1749-fccf-4178-b7a8-6c63e18266f6") : secret "webhook-server-cert" not found Nov 27 16:17:40 crc kubenswrapper[4707]: E1127 16:17:40.642266 4707 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Nov 27 16:17:40 crc kubenswrapper[4707]: E1127 16:17:40.642283 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/894c1749-fccf-4178-b7a8-6c63e18266f6-metrics-certs podName:894c1749-fccf-4178-b7a8-6c63e18266f6 nodeName:}" failed. No retries permitted until 2025-11-27 16:17:41.642278184 +0000 UTC m=+837.273726942 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/894c1749-fccf-4178-b7a8-6c63e18266f6-metrics-certs") pod "openstack-operator-controller-manager-54fcbb7454-dqgm9" (UID: "894c1749-fccf-4178-b7a8-6c63e18266f6") : secret "metrics-server-cert" not found Nov 27 16:17:40 crc kubenswrapper[4707]: I1127 16:17:40.648385 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-64cdc6ff96-blnpn"] Nov 27 16:17:40 crc kubenswrapper[4707]: I1127 16:17:40.657147 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-67cb4dc6d4-5bj45"] Nov 27 16:17:40 crc kubenswrapper[4707]: E1127 16:17:40.670131 4707 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/ironic-operator@sha256:d65dbfc956e9cf376f3c48fc3a0942cb7306b5164f898c40d1efca106df81db7,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-bfwdq,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ironic-operator-controller-manager-67cb4dc6d4-5bj45_openstack-operators(8fea437e-0a8c-4836-b23c-56db9c7ea0fc): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Nov 27 16:17:40 crc kubenswrapper[4707]: E1127 16:17:40.672930 4707 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-bfwdq,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ironic-operator-controller-manager-67cb4dc6d4-5bj45_openstack-operators(8fea437e-0a8c-4836-b23c-56db9c7ea0fc): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Nov 27 16:17:40 crc kubenswrapper[4707]: E1127 16:17:40.674634 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"pull QPS exceeded\"]" pod="openstack-operators/ironic-operator-controller-manager-67cb4dc6d4-5bj45" podUID="8fea437e-0a8c-4836-b23c-56db9c7ea0fc" Nov 27 16:17:40 crc kubenswrapper[4707]: I1127 16:17:40.679648 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-57988cc5b5-v48r5"] Nov 27 16:17:40 crc kubenswrapper[4707]: W1127 16:17:40.687597 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3c3bf501_b545_45a2_b186_2df94990295d.slice/crio-1a19dcc9d759f04eedbfa12a95d1aaaa6f40d11362eb67269964898d0fb7ee1c 
WatchSource:0}: Error finding container 1a19dcc9d759f04eedbfa12a95d1aaaa6f40d11362eb67269964898d0fb7ee1c: Status 404 returned error can't find the container with id 1a19dcc9d759f04eedbfa12a95d1aaaa6f40d11362eb67269964898d0fb7ee1c Nov 27 16:17:40 crc kubenswrapper[4707]: I1127 16:17:40.687933 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-d77b94747-qdh2s"] Nov 27 16:17:40 crc kubenswrapper[4707]: E1127 16:17:40.689992 4707 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/placement-operator@sha256:225958f250a1075b69439d776a13acc45c78695c21abda23600fb53ca1640423,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-sd9ht,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod placement-operator-controller-manager-57988cc5b5-v48r5_openstack-operators(3c3bf501-b545-45a2-b186-2df94990295d): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Nov 27 16:17:40 crc kubenswrapper[4707]: E1127 16:17:40.693135 4707 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-sd9ht,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod placement-operator-controller-manager-57988cc5b5-v48r5_openstack-operators(3c3bf501-b545-45a2-b186-2df94990295d): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Nov 27 16:17:40 crc kubenswrapper[4707]: E1127 16:17:40.694309 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"pull QPS exceeded\"]" pod="openstack-operators/placement-operator-controller-manager-57988cc5b5-v48r5" podUID="3c3bf501-b545-45a2-b186-2df94990295d" Nov 27 16:17:40 crc kubenswrapper[4707]: W1127 16:17:40.694814 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podefa00950_13dd_4e9e_a215_6ebb89006545.slice/crio-81903e142d34f1d144824c57a38640e2a8a070ba4e6714a5a75d564efe62541a WatchSource:0}: Error finding container 81903e142d34f1d144824c57a38640e2a8a070ba4e6714a5a75d564efe62541a: Status 404 returned error can't find the container with id 
81903e142d34f1d144824c57a38640e2a8a070ba4e6714a5a75d564efe62541a Nov 27 16:17:40 crc kubenswrapper[4707]: E1127 16:17:40.702095 4707 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/swift-operator@sha256:72236301580ff9080f7e311b832d7ba66666a9afeda51f969745229624ff26e4,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-vx9hq,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod swift-operator-controller-manager-d77b94747-qdh2s_openstack-operators(efa00950-13dd-4e9e-a215-6ebb89006545): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Nov 27 16:17:40 crc kubenswrapper[4707]: E1127 16:17:40.704511 4707 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-vx9hq,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod swift-operator-controller-manager-d77b94747-qdh2s_openstack-operators(efa00950-13dd-4e9e-a215-6ebb89006545): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Nov 27 16:17:40 crc kubenswrapper[4707]: E1127 16:17:40.709448 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"pull QPS exceeded\"]" pod="openstack-operators/swift-operator-controller-manager-d77b94747-qdh2s" podUID="efa00950-13dd-4e9e-a215-6ebb89006545" Nov 27 16:17:40 crc kubenswrapper[4707]: I1127 16:17:40.777760 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-8665cb7d49-w7xqm"] Nov 27 16:17:40 crc kubenswrapper[4707]: W1127 16:17:40.783555 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod94ba089c_d890_4d57_abc0_258a2b54a6f9.slice/crio-dccc19b081704dfff704f7c5cc37aa24bca9d7c8d0bea0adb581f88dcba2522e 
WatchSource:0}: Error finding container dccc19b081704dfff704f7c5cc37aa24bca9d7c8d0bea0adb581f88dcba2522e: Status 404 returned error can't find the container with id dccc19b081704dfff704f7c5cc37aa24bca9d7c8d0bea0adb581f88dcba2522e Nov 27 16:17:40 crc kubenswrapper[4707]: E1127 16:17:40.802682 4707 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/test-operator@sha256:210517b918e30df1c95fc7d961c8e57e9a9d1cc2b9fe7eb4dad2034dd53a90aa,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-s5tzz,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod test-operator-controller-manager-5cd6c7f4c8-qwb4t_openstack-operators(97bb6c80-6996-4e91-bcdf-0f1c20e72fa3): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Nov 27 16:17:40 crc kubenswrapper[4707]: I1127 16:17:40.805279 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-5cd6c7f4c8-qwb4t"] Nov 27 16:17:40 crc kubenswrapper[4707]: E1127 16:17:40.810293 4707 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-s5tzz,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod test-operator-controller-manager-5cd6c7f4c8-qwb4t_openstack-operators(97bb6c80-6996-4e91-bcdf-0f1c20e72fa3): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Nov 27 16:17:40 crc kubenswrapper[4707]: E1127 16:17:40.811362 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"pull QPS exceeded\"]" pod="openstack-operators/test-operator-controller-manager-5cd6c7f4c8-qwb4t" podUID="97bb6c80-6996-4e91-bcdf-0f1c20e72fa3" Nov 27 16:17:40 crc kubenswrapper[4707]: E1127 16:17:40.811551 4707 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/watcher-operator@sha256:6bed55b172b9ee8ccc3952cbfc543d8bd44e2690f6db94348a754152fd78f4cf,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-cpzf4,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod watcher-operator-controller-manager-656dcb59d4-hgbtk_openstack-operators(00e99b4b-2bbd-445a-b075-74c47fe30f79): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Nov 27 16:17:40 crc kubenswrapper[4707]: I1127 16:17:40.814258 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-66f4dd4bc7-dfw47"] Nov 27 16:17:40 crc kubenswrapper[4707]: E1127 16:17:40.819836 4707 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-cpzf4,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod watcher-operator-controller-manager-656dcb59d4-hgbtk_openstack-operators(00e99b4b-2bbd-445a-b075-74c47fe30f79): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Nov 27 16:17:40 crc kubenswrapper[4707]: E1127 16:17:40.821099 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"pull QPS exceeded\"]" pod="openstack-operators/watcher-operator-controller-manager-656dcb59d4-hgbtk" podUID="00e99b4b-2bbd-445a-b075-74c47fe30f79" Nov 27 16:17:40 crc kubenswrapper[4707]: E1127 16:17:40.822385 4707 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/mariadb-operator@sha256:888edf6f432e52eaa5fc3caeae616fe38a3302b006bbba0e38885b2beba9f0f2,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-jxs4g,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod mariadb-operator-controller-manager-66f4dd4bc7-dfw47_openstack-operators(50771ff9-4409-4f41-ad3c-98f730dbff77): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Nov 27 16:17:40 crc kubenswrapper[4707]: E1127 16:17:40.824837 4707 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-jxs4g,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod mariadb-operator-controller-manager-66f4dd4bc7-dfw47_openstack-operators(50771ff9-4409-4f41-ad3c-98f730dbff77): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Nov 27 16:17:40 crc kubenswrapper[4707]: E1127 16:17:40.825972 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"pull QPS exceeded\"]" pod="openstack-operators/mariadb-operator-controller-manager-66f4dd4bc7-dfw47" podUID="50771ff9-4409-4f41-ad3c-98f730dbff77" Nov 27 16:17:40 crc kubenswrapper[4707]: I1127 16:17:40.827047 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-656dcb59d4-hgbtk"] Nov 27 16:17:40 crc kubenswrapper[4707]: E1127 16:17:40.830684 4707 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:operator,Image:quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2,Command:[/manager],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:9782,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OPERATOR_NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.namespace,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{200 -3} {} 200m DecimalSI},memory: {{524288000 0} {} 500Mi BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-p9jdv,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-cluster-operator-manager-668c99d594-v96x7_openstack-operators(6647f986-9d62-4939-907b-fde960b30a37): 
ErrImagePull: pull QPS exceeded" logger="UnhandledError" Nov 27 16:17:40 crc kubenswrapper[4707]: E1127 16:17:40.832076 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-v96x7" podUID="6647f986-9d62-4939-907b-fde960b30a37" Nov 27 16:17:40 crc kubenswrapper[4707]: I1127 16:17:40.840380 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-v96x7"] Nov 27 16:17:40 crc kubenswrapper[4707]: I1127 16:17:40.946995 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/88f24787-fedc-4d08-9a8e-16a24f242d02-cert\") pod \"infra-operator-controller-manager-57548d458d-kwlbb\" (UID: \"88f24787-fedc-4d08-9a8e-16a24f242d02\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-kwlbb" Nov 27 16:17:40 crc kubenswrapper[4707]: E1127 16:17:40.947240 4707 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Nov 27 16:17:40 crc kubenswrapper[4707]: E1127 16:17:40.947305 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/88f24787-fedc-4d08-9a8e-16a24f242d02-cert podName:88f24787-fedc-4d08-9a8e-16a24f242d02 nodeName:}" failed. No retries permitted until 2025-11-27 16:17:42.947289085 +0000 UTC m=+838.578737853 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/88f24787-fedc-4d08-9a8e-16a24f242d02-cert") pod "infra-operator-controller-manager-57548d458d-kwlbb" (UID: "88f24787-fedc-4d08-9a8e-16a24f242d02") : secret "infra-operator-webhook-server-cert" not found Nov 27 16:17:41 crc kubenswrapper[4707]: I1127 16:17:41.151294 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7761d2b0-8cc7-4dc8-a956-df20e2efc081-cert\") pod \"openstack-baremetal-operator-controller-manager-5fcdb54b6brxbww\" (UID: \"7761d2b0-8cc7-4dc8-a956-df20e2efc081\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-5fcdb54b6brxbww" Nov 27 16:17:41 crc kubenswrapper[4707]: E1127 16:17:41.151496 4707 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Nov 27 16:17:41 crc kubenswrapper[4707]: E1127 16:17:41.151567 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7761d2b0-8cc7-4dc8-a956-df20e2efc081-cert podName:7761d2b0-8cc7-4dc8-a956-df20e2efc081 nodeName:}" failed. No retries permitted until 2025-11-27 16:17:43.151550478 +0000 UTC m=+838.782999256 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/7761d2b0-8cc7-4dc8-a956-df20e2efc081-cert") pod "openstack-baremetal-operator-controller-manager-5fcdb54b6brxbww" (UID: "7761d2b0-8cc7-4dc8-a956-df20e2efc081") : secret "openstack-baremetal-operator-webhook-server-cert" not found Nov 27 16:17:41 crc kubenswrapper[4707]: I1127 16:17:41.207203 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-5d494799bf-hg8j4" event={"ID":"ac946592-ee39-443e-b64a-980caaace080","Type":"ContainerStarted","Data":"7cdd6ae26682a98adde31f396e2bbf379f3a273843f09b87cfe2e227a77b618f"} Nov 27 16:17:41 crc kubenswrapper[4707]: I1127 16:17:41.208447 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-6fdcddb789-4bqw7" event={"ID":"9377949b-5979-44ff-bd3f-ea1389b4ef6f","Type":"ContainerStarted","Data":"786d3539eb4626daf75c4d7dfae5ea6b06fe917416bebd0b3ce66890325db753"} Nov 27 16:17:41 crc kubenswrapper[4707]: I1127 16:17:41.214686 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-66f4dd4bc7-dfw47" event={"ID":"50771ff9-4409-4f41-ad3c-98f730dbff77","Type":"ContainerStarted","Data":"8284aa9e9e677f44f2b57ae3ec5f8edaf4c8df0f2971ece35780d78480ada7f9"} Nov 27 16:17:41 crc kubenswrapper[4707]: E1127 16:17:41.217666 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/mariadb-operator@sha256:888edf6f432e52eaa5fc3caeae616fe38a3302b006bbba0e38885b2beba9f0f2\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/mariadb-operator-controller-manager-66f4dd4bc7-dfw47" 
podUID="50771ff9-4409-4f41-ad3c-98f730dbff77" Nov 27 16:17:41 crc kubenswrapper[4707]: I1127 16:17:41.219973 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-7b4567c7cf-4cd96" event={"ID":"dc2542ce-f2fd-454b-b47f-92d3bbc93d91","Type":"ContainerStarted","Data":"4067d59b09fbb0e02f6b1f4c1f11c9d1e12a159fb7abc0b7473c028688f581f2"} Nov 27 16:17:41 crc kubenswrapper[4707]: I1127 16:17:41.224222 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5cd6c7f4c8-qwb4t" event={"ID":"97bb6c80-6996-4e91-bcdf-0f1c20e72fa3","Type":"ContainerStarted","Data":"805b6c7a87c49cb13e8b729cf946d62aa942b62a67f89f919c24abf5c7e2f446"} Nov 27 16:17:41 crc kubenswrapper[4707]: E1127 16:17:41.225812 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:210517b918e30df1c95fc7d961c8e57e9a9d1cc2b9fe7eb4dad2034dd53a90aa\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/test-operator-controller-manager-5cd6c7f4c8-qwb4t" podUID="97bb6c80-6996-4e91-bcdf-0f1c20e72fa3" Nov 27 16:17:41 crc kubenswrapper[4707]: I1127 16:17:41.230553 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-67cb4dc6d4-5bj45" event={"ID":"8fea437e-0a8c-4836-b23c-56db9c7ea0fc","Type":"ContainerStarted","Data":"d226bbf16383d1202bd561fcf4da4c77c65d7eb386de18fe5b1d3ba69fe23b06"} Nov 27 16:17:41 crc kubenswrapper[4707]: I1127 16:17:41.243399 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-589cbd6b5b-cd4lh" 
event={"ID":"99785491-bcbd-4946-b1a6-a3e08a4394b5","Type":"ContainerStarted","Data":"94b5f09cf79cbee2f199fce1dfe33e21035981f2b82d5dc101850590e2844fce"} Nov 27 16:17:41 crc kubenswrapper[4707]: E1127 16:17:41.244210 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/ironic-operator@sha256:d65dbfc956e9cf376f3c48fc3a0942cb7306b5164f898c40d1efca106df81db7\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/ironic-operator-controller-manager-67cb4dc6d4-5bj45" podUID="8fea437e-0a8c-4836-b23c-56db9c7ea0fc" Nov 27 16:17:41 crc kubenswrapper[4707]: I1127 16:17:41.256925 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-79556f57fc-q6vcr" event={"ID":"20e41446-8d89-481e-bd9f-48dc14efb82e","Type":"ContainerStarted","Data":"4c8e8e92f5b6a85ace67a190ae224eda939e80edf860249823d06769fa6ac81a"} Nov 27 16:17:41 crc kubenswrapper[4707]: I1127 16:17:41.264644 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-5d499bf58b-x5k9j" event={"ID":"126e10c6-2740-47fc-8331-a8e4bb6549b8","Type":"ContainerStarted","Data":"027cc1c9b373f4f1cd52a7ea02230e08c438e59cced5aaf934e18e3d40e5570c"} Nov 27 16:17:41 crc kubenswrapper[4707]: I1127 16:17:41.268642 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-57988cc5b5-v48r5" event={"ID":"3c3bf501-b545-45a2-b186-2df94990295d","Type":"ContainerStarted","Data":"1a19dcc9d759f04eedbfa12a95d1aaaa6f40d11362eb67269964898d0fb7ee1c"} Nov 27 16:17:41 crc kubenswrapper[4707]: E1127 16:17:41.276614 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for 
\"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/placement-operator@sha256:225958f250a1075b69439d776a13acc45c78695c21abda23600fb53ca1640423\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/placement-operator-controller-manager-57988cc5b5-v48r5" podUID="3c3bf501-b545-45a2-b186-2df94990295d" Nov 27 16:17:41 crc kubenswrapper[4707]: I1127 16:17:41.277466 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-64cdc6ff96-blnpn" event={"ID":"7c88f676-b4d3-46b2-aedd-eff62f8f1bfb","Type":"ContainerStarted","Data":"d57184fb15c906e44955e2ce61ec781129c0e8f9a249799b65d8b862d4cc32a7"} Nov 27 16:17:41 crc kubenswrapper[4707]: I1127 16:17:41.296550 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-5b77f656f-8qhwz" event={"ID":"a2c3c27e-3e0c-4d4e-a5f7-3deaaae0f660","Type":"ContainerStarted","Data":"c647e9bc2d1a9d908b13cd92c48d556e965e6f03b5367988593245c37e7a2326"} Nov 27 16:17:41 crc kubenswrapper[4707]: I1127 16:17:41.298440 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-656dcb59d4-hgbtk" event={"ID":"00e99b4b-2bbd-445a-b075-74c47fe30f79","Type":"ContainerStarted","Data":"56916847caf4b7eab162f225c418f70ef4d0e6cc1d5f500b520eeac9fbbc2a13"} Nov 27 16:17:41 crc kubenswrapper[4707]: I1127 16:17:41.302202 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-v96x7" event={"ID":"6647f986-9d62-4939-907b-fde960b30a37","Type":"ContainerStarted","Data":"31de40530894ae2cb0acdeb764139bb4d8086edec29aba23ed7b74ac87a6de80"} Nov 27 16:17:41 crc kubenswrapper[4707]: E1127 16:17:41.302288 4707 pod_workers.go:1301] "Error syncing pod, skipping" 
err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:6bed55b172b9ee8ccc3952cbfc543d8bd44e2690f6db94348a754152fd78f4cf\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/watcher-operator-controller-manager-656dcb59d4-hgbtk" podUID="00e99b4b-2bbd-445a-b075-74c47fe30f79" Nov 27 16:17:41 crc kubenswrapper[4707]: I1127 16:17:41.304791 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-955677c94-f5g2d" event={"ID":"5e9f9859-1f28-4183-b71c-e9459e2746b7","Type":"ContainerStarted","Data":"935f858f79eec61dea23681fffd51621cb8d7a07a1397c344ce05f2eafc07099"} Nov 27 16:17:41 crc kubenswrapper[4707]: I1127 16:17:41.306766 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-8665cb7d49-w7xqm" event={"ID":"94ba089c-d890-4d57-abc0-258a2b54a6f9","Type":"ContainerStarted","Data":"dccc19b081704dfff704f7c5cc37aa24bca9d7c8d0bea0adb581f88dcba2522e"} Nov 27 16:17:41 crc kubenswrapper[4707]: E1127 16:17:41.308028 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-v96x7" podUID="6647f986-9d62-4939-907b-fde960b30a37" Nov 27 16:17:41 crc kubenswrapper[4707]: I1127 16:17:41.308911 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-6b7f75547b-sgwpt" 
event={"ID":"6d78cbb1-56e6-428d-bba4-5d1edbbda363","Type":"ContainerStarted","Data":"3140571e048099653af9bf686f2f225898faf2964838888777ab24640bf5be64"} Nov 27 16:17:41 crc kubenswrapper[4707]: I1127 16:17:41.314898 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-56897c768d-nbgp6" event={"ID":"6aa8bbf9-4d33-4b2e-b53a-d6a341b3d841","Type":"ContainerStarted","Data":"ad3e4d8501ace9c56b79473089eb26c723b96cd5284125525a7ae8632134b81a"} Nov 27 16:17:41 crc kubenswrapper[4707]: I1127 16:17:41.316687 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-d77b94747-qdh2s" event={"ID":"efa00950-13dd-4e9e-a215-6ebb89006545","Type":"ContainerStarted","Data":"81903e142d34f1d144824c57a38640e2a8a070ba4e6714a5a75d564efe62541a"} Nov 27 16:17:41 crc kubenswrapper[4707]: E1127 16:17:41.326217 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/swift-operator@sha256:72236301580ff9080f7e311b832d7ba66666a9afeda51f969745229624ff26e4\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/swift-operator-controller-manager-d77b94747-qdh2s" podUID="efa00950-13dd-4e9e-a215-6ebb89006545" Nov 27 16:17:41 crc kubenswrapper[4707]: I1127 16:17:41.663028 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/894c1749-fccf-4178-b7a8-6c63e18266f6-webhook-certs\") pod \"openstack-operator-controller-manager-54fcbb7454-dqgm9\" (UID: \"894c1749-fccf-4178-b7a8-6c63e18266f6\") " pod="openstack-operators/openstack-operator-controller-manager-54fcbb7454-dqgm9" Nov 27 16:17:41 crc kubenswrapper[4707]: I1127 16:17:41.663106 4707 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/894c1749-fccf-4178-b7a8-6c63e18266f6-metrics-certs\") pod \"openstack-operator-controller-manager-54fcbb7454-dqgm9\" (UID: \"894c1749-fccf-4178-b7a8-6c63e18266f6\") " pod="openstack-operators/openstack-operator-controller-manager-54fcbb7454-dqgm9" Nov 27 16:17:41 crc kubenswrapper[4707]: E1127 16:17:41.663245 4707 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Nov 27 16:17:41 crc kubenswrapper[4707]: E1127 16:17:41.663302 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/894c1749-fccf-4178-b7a8-6c63e18266f6-metrics-certs podName:894c1749-fccf-4178-b7a8-6c63e18266f6 nodeName:}" failed. No retries permitted until 2025-11-27 16:17:43.663284922 +0000 UTC m=+839.294733690 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/894c1749-fccf-4178-b7a8-6c63e18266f6-metrics-certs") pod "openstack-operator-controller-manager-54fcbb7454-dqgm9" (UID: "894c1749-fccf-4178-b7a8-6c63e18266f6") : secret "metrics-server-cert" not found Nov 27 16:17:41 crc kubenswrapper[4707]: E1127 16:17:41.663485 4707 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Nov 27 16:17:41 crc kubenswrapper[4707]: E1127 16:17:41.663562 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/894c1749-fccf-4178-b7a8-6c63e18266f6-webhook-certs podName:894c1749-fccf-4178-b7a8-6c63e18266f6 nodeName:}" failed. No retries permitted until 2025-11-27 16:17:43.663544829 +0000 UTC m=+839.294993597 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/894c1749-fccf-4178-b7a8-6c63e18266f6-webhook-certs") pod "openstack-operator-controller-manager-54fcbb7454-dqgm9" (UID: "894c1749-fccf-4178-b7a8-6c63e18266f6") : secret "webhook-server-cert" not found Nov 27 16:17:42 crc kubenswrapper[4707]: E1127 16:17:42.353221 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-v96x7" podUID="6647f986-9d62-4939-907b-fde960b30a37" Nov 27 16:17:42 crc kubenswrapper[4707]: E1127 16:17:42.353563 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/mariadb-operator@sha256:888edf6f432e52eaa5fc3caeae616fe38a3302b006bbba0e38885b2beba9f0f2\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/mariadb-operator-controller-manager-66f4dd4bc7-dfw47" podUID="50771ff9-4409-4f41-ad3c-98f730dbff77" Nov 27 16:17:42 crc kubenswrapper[4707]: E1127 16:17:42.354118 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/ironic-operator@sha256:d65dbfc956e9cf376f3c48fc3a0942cb7306b5164f898c40d1efca106df81db7\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" 
pod="openstack-operators/ironic-operator-controller-manager-67cb4dc6d4-5bj45" podUID="8fea437e-0a8c-4836-b23c-56db9c7ea0fc" Nov 27 16:17:42 crc kubenswrapper[4707]: E1127 16:17:42.354286 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/placement-operator@sha256:225958f250a1075b69439d776a13acc45c78695c21abda23600fb53ca1640423\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/placement-operator-controller-manager-57988cc5b5-v48r5" podUID="3c3bf501-b545-45a2-b186-2df94990295d" Nov 27 16:17:42 crc kubenswrapper[4707]: E1127 16:17:42.354484 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:210517b918e30df1c95fc7d961c8e57e9a9d1cc2b9fe7eb4dad2034dd53a90aa\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/test-operator-controller-manager-5cd6c7f4c8-qwb4t" podUID="97bb6c80-6996-4e91-bcdf-0f1c20e72fa3" Nov 27 16:17:42 crc kubenswrapper[4707]: E1127 16:17:42.354534 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:6bed55b172b9ee8ccc3952cbfc543d8bd44e2690f6db94348a754152fd78f4cf\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" 
pod="openstack-operators/watcher-operator-controller-manager-656dcb59d4-hgbtk" podUID="00e99b4b-2bbd-445a-b075-74c47fe30f79" Nov 27 16:17:42 crc kubenswrapper[4707]: E1127 16:17:42.366193 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/swift-operator@sha256:72236301580ff9080f7e311b832d7ba66666a9afeda51f969745229624ff26e4\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/swift-operator-controller-manager-d77b94747-qdh2s" podUID="efa00950-13dd-4e9e-a215-6ebb89006545" Nov 27 16:17:42 crc kubenswrapper[4707]: I1127 16:17:42.983864 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/88f24787-fedc-4d08-9a8e-16a24f242d02-cert\") pod \"infra-operator-controller-manager-57548d458d-kwlbb\" (UID: \"88f24787-fedc-4d08-9a8e-16a24f242d02\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-kwlbb" Nov 27 16:17:42 crc kubenswrapper[4707]: E1127 16:17:42.984084 4707 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Nov 27 16:17:42 crc kubenswrapper[4707]: E1127 16:17:42.984130 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/88f24787-fedc-4d08-9a8e-16a24f242d02-cert podName:88f24787-fedc-4d08-9a8e-16a24f242d02 nodeName:}" failed. No retries permitted until 2025-11-27 16:17:46.984116093 +0000 UTC m=+842.615564861 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/88f24787-fedc-4d08-9a8e-16a24f242d02-cert") pod "infra-operator-controller-manager-57548d458d-kwlbb" (UID: "88f24787-fedc-4d08-9a8e-16a24f242d02") : secret "infra-operator-webhook-server-cert" not found Nov 27 16:17:43 crc kubenswrapper[4707]: I1127 16:17:43.187673 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7761d2b0-8cc7-4dc8-a956-df20e2efc081-cert\") pod \"openstack-baremetal-operator-controller-manager-5fcdb54b6brxbww\" (UID: \"7761d2b0-8cc7-4dc8-a956-df20e2efc081\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-5fcdb54b6brxbww" Nov 27 16:17:43 crc kubenswrapper[4707]: E1127 16:17:43.188352 4707 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Nov 27 16:17:43 crc kubenswrapper[4707]: E1127 16:17:43.188525 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7761d2b0-8cc7-4dc8-a956-df20e2efc081-cert podName:7761d2b0-8cc7-4dc8-a956-df20e2efc081 nodeName:}" failed. No retries permitted until 2025-11-27 16:17:47.18850921 +0000 UTC m=+842.819957978 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/7761d2b0-8cc7-4dc8-a956-df20e2efc081-cert") pod "openstack-baremetal-operator-controller-manager-5fcdb54b6brxbww" (UID: "7761d2b0-8cc7-4dc8-a956-df20e2efc081") : secret "openstack-baremetal-operator-webhook-server-cert" not found Nov 27 16:17:43 crc kubenswrapper[4707]: I1127 16:17:43.695062 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/894c1749-fccf-4178-b7a8-6c63e18266f6-metrics-certs\") pod \"openstack-operator-controller-manager-54fcbb7454-dqgm9\" (UID: \"894c1749-fccf-4178-b7a8-6c63e18266f6\") " pod="openstack-operators/openstack-operator-controller-manager-54fcbb7454-dqgm9" Nov 27 16:17:43 crc kubenswrapper[4707]: I1127 16:17:43.695166 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/894c1749-fccf-4178-b7a8-6c63e18266f6-webhook-certs\") pod \"openstack-operator-controller-manager-54fcbb7454-dqgm9\" (UID: \"894c1749-fccf-4178-b7a8-6c63e18266f6\") " pod="openstack-operators/openstack-operator-controller-manager-54fcbb7454-dqgm9" Nov 27 16:17:43 crc kubenswrapper[4707]: E1127 16:17:43.695250 4707 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Nov 27 16:17:43 crc kubenswrapper[4707]: E1127 16:17:43.695289 4707 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Nov 27 16:17:43 crc kubenswrapper[4707]: E1127 16:17:43.695323 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/894c1749-fccf-4178-b7a8-6c63e18266f6-metrics-certs podName:894c1749-fccf-4178-b7a8-6c63e18266f6 nodeName:}" failed. No retries permitted until 2025-11-27 16:17:47.695304683 +0000 UTC m=+843.326753451 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/894c1749-fccf-4178-b7a8-6c63e18266f6-metrics-certs") pod "openstack-operator-controller-manager-54fcbb7454-dqgm9" (UID: "894c1749-fccf-4178-b7a8-6c63e18266f6") : secret "metrics-server-cert" not found Nov 27 16:17:43 crc kubenswrapper[4707]: E1127 16:17:43.695340 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/894c1749-fccf-4178-b7a8-6c63e18266f6-webhook-certs podName:894c1749-fccf-4178-b7a8-6c63e18266f6 nodeName:}" failed. No retries permitted until 2025-11-27 16:17:47.695334174 +0000 UTC m=+843.326782942 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/894c1749-fccf-4178-b7a8-6c63e18266f6-webhook-certs") pod "openstack-operator-controller-manager-54fcbb7454-dqgm9" (UID: "894c1749-fccf-4178-b7a8-6c63e18266f6") : secret "webhook-server-cert" not found Nov 27 16:17:47 crc kubenswrapper[4707]: I1127 16:17:47.056652 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/88f24787-fedc-4d08-9a8e-16a24f242d02-cert\") pod \"infra-operator-controller-manager-57548d458d-kwlbb\" (UID: \"88f24787-fedc-4d08-9a8e-16a24f242d02\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-kwlbb" Nov 27 16:17:47 crc kubenswrapper[4707]: E1127 16:17:47.056806 4707 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Nov 27 16:17:47 crc kubenswrapper[4707]: E1127 16:17:47.056860 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/88f24787-fedc-4d08-9a8e-16a24f242d02-cert podName:88f24787-fedc-4d08-9a8e-16a24f242d02 nodeName:}" failed. No retries permitted until 2025-11-27 16:17:55.056844868 +0000 UTC m=+850.688293636 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/88f24787-fedc-4d08-9a8e-16a24f242d02-cert") pod "infra-operator-controller-manager-57548d458d-kwlbb" (UID: "88f24787-fedc-4d08-9a8e-16a24f242d02") : secret "infra-operator-webhook-server-cert" not found Nov 27 16:17:47 crc kubenswrapper[4707]: I1127 16:17:47.259070 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7761d2b0-8cc7-4dc8-a956-df20e2efc081-cert\") pod \"openstack-baremetal-operator-controller-manager-5fcdb54b6brxbww\" (UID: \"7761d2b0-8cc7-4dc8-a956-df20e2efc081\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-5fcdb54b6brxbww" Nov 27 16:17:47 crc kubenswrapper[4707]: E1127 16:17:47.259280 4707 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Nov 27 16:17:47 crc kubenswrapper[4707]: E1127 16:17:47.259357 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7761d2b0-8cc7-4dc8-a956-df20e2efc081-cert podName:7761d2b0-8cc7-4dc8-a956-df20e2efc081 nodeName:}" failed. No retries permitted until 2025-11-27 16:17:55.259339458 +0000 UTC m=+850.890788226 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/7761d2b0-8cc7-4dc8-a956-df20e2efc081-cert") pod "openstack-baremetal-operator-controller-manager-5fcdb54b6brxbww" (UID: "7761d2b0-8cc7-4dc8-a956-df20e2efc081") : secret "openstack-baremetal-operator-webhook-server-cert" not found Nov 27 16:17:47 crc kubenswrapper[4707]: I1127 16:17:47.766992 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/894c1749-fccf-4178-b7a8-6c63e18266f6-webhook-certs\") pod \"openstack-operator-controller-manager-54fcbb7454-dqgm9\" (UID: \"894c1749-fccf-4178-b7a8-6c63e18266f6\") " pod="openstack-operators/openstack-operator-controller-manager-54fcbb7454-dqgm9" Nov 27 16:17:47 crc kubenswrapper[4707]: E1127 16:17:47.767203 4707 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Nov 27 16:17:47 crc kubenswrapper[4707]: I1127 16:17:47.767548 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/894c1749-fccf-4178-b7a8-6c63e18266f6-metrics-certs\") pod \"openstack-operator-controller-manager-54fcbb7454-dqgm9\" (UID: \"894c1749-fccf-4178-b7a8-6c63e18266f6\") " pod="openstack-operators/openstack-operator-controller-manager-54fcbb7454-dqgm9" Nov 27 16:17:47 crc kubenswrapper[4707]: E1127 16:17:47.767630 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/894c1749-fccf-4178-b7a8-6c63e18266f6-webhook-certs podName:894c1749-fccf-4178-b7a8-6c63e18266f6 nodeName:}" failed. No retries permitted until 2025-11-27 16:17:55.767602987 +0000 UTC m=+851.399051765 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/894c1749-fccf-4178-b7a8-6c63e18266f6-webhook-certs") pod "openstack-operator-controller-manager-54fcbb7454-dqgm9" (UID: "894c1749-fccf-4178-b7a8-6c63e18266f6") : secret "webhook-server-cert" not found Nov 27 16:17:47 crc kubenswrapper[4707]: E1127 16:17:47.767833 4707 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Nov 27 16:17:47 crc kubenswrapper[4707]: E1127 16:17:47.767945 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/894c1749-fccf-4178-b7a8-6c63e18266f6-metrics-certs podName:894c1749-fccf-4178-b7a8-6c63e18266f6 nodeName:}" failed. No retries permitted until 2025-11-27 16:17:55.767917615 +0000 UTC m=+851.399366413 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/894c1749-fccf-4178-b7a8-6c63e18266f6-metrics-certs") pod "openstack-operator-controller-manager-54fcbb7454-dqgm9" (UID: "894c1749-fccf-4178-b7a8-6c63e18266f6") : secret "metrics-server-cert" not found Nov 27 16:17:53 crc kubenswrapper[4707]: E1127 16:17:53.539055 4707 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/ovn-operator@sha256:bbb543d2d67c73e5df5d6357c3251363eb34a99575c5bf10416edd45dbdae2f6" Nov 27 16:17:53 crc kubenswrapper[4707]: E1127 16:17:53.539490 4707 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/ovn-operator@sha256:bbb543d2d67c73e5df5d6357c3251363eb34a99575c5bf10416edd45dbdae2f6,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-bwm9s,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovn-operator-controller-manager-56897c768d-nbgp6_openstack-operators(6aa8bbf9-4d33-4b2e-b53a-d6a341b3d841): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Nov 27 16:17:54 crc kubenswrapper[4707]: E1127 16:17:54.171131 4707 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/heat-operator@sha256:2ee37ff474bee3203447df4f326a9279a515e770573153338296dd074722c677" Nov 27 16:17:54 crc kubenswrapper[4707]: E1127 16:17:54.171588 4707 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/heat-operator@sha256:2ee37ff474bee3203447df4f326a9279a515e770573153338296dd074722c677,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-nvv4d,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod heat-operator-controller-manager-5b77f656f-8qhwz_openstack-operators(a2c3c27e-3e0c-4d4e-a5f7-3deaaae0f660): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Nov 27 16:17:54 crc kubenswrapper[4707]: E1127 16:17:54.253615 4707 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.102.83.75:5001/openstack-k8s-operators/telemetry-operator:9f281452e31a8aed56a67366170f98956301b3f8" Nov 27 16:17:54 crc kubenswrapper[4707]: E1127 16:17:54.253673 4707 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.102.83.75:5001/openstack-k8s-operators/telemetry-operator:9f281452e31a8aed56a67366170f98956301b3f8" Nov 27 16:17:54 crc kubenswrapper[4707]: E1127 16:17:54.253815 4707 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:38.102.83.75:5001/openstack-k8s-operators/telemetry-operator:9f281452e31a8aed56a67366170f98956301b3f8,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-jfcls,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod telemetry-operator-controller-manager-8665cb7d49-w7xqm_openstack-operators(94ba089c-d890-4d57-abc0-258a2b54a6f9): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Nov 27 16:17:55 crc kubenswrapper[4707]: I1127 16:17:55.110166 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/88f24787-fedc-4d08-9a8e-16a24f242d02-cert\") pod \"infra-operator-controller-manager-57548d458d-kwlbb\" (UID: \"88f24787-fedc-4d08-9a8e-16a24f242d02\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-kwlbb" Nov 27 16:17:55 crc kubenswrapper[4707]: E1127 16:17:55.110365 4707 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Nov 27 16:17:55 crc kubenswrapper[4707]: E1127 16:17:55.110662 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/88f24787-fedc-4d08-9a8e-16a24f242d02-cert podName:88f24787-fedc-4d08-9a8e-16a24f242d02 nodeName:}" failed. 
No retries permitted until 2025-11-27 16:18:11.110644043 +0000 UTC m=+866.742092801 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/88f24787-fedc-4d08-9a8e-16a24f242d02-cert") pod "infra-operator-controller-manager-57548d458d-kwlbb" (UID: "88f24787-fedc-4d08-9a8e-16a24f242d02") : secret "infra-operator-webhook-server-cert" not found Nov 27 16:17:55 crc kubenswrapper[4707]: I1127 16:17:55.314393 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7761d2b0-8cc7-4dc8-a956-df20e2efc081-cert\") pod \"openstack-baremetal-operator-controller-manager-5fcdb54b6brxbww\" (UID: \"7761d2b0-8cc7-4dc8-a956-df20e2efc081\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-5fcdb54b6brxbww" Nov 27 16:17:55 crc kubenswrapper[4707]: E1127 16:17:55.314800 4707 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Nov 27 16:17:55 crc kubenswrapper[4707]: E1127 16:17:55.314907 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7761d2b0-8cc7-4dc8-a956-df20e2efc081-cert podName:7761d2b0-8cc7-4dc8-a956-df20e2efc081 nodeName:}" failed. No retries permitted until 2025-11-27 16:18:11.314885656 +0000 UTC m=+866.946334424 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/7761d2b0-8cc7-4dc8-a956-df20e2efc081-cert") pod "openstack-baremetal-operator-controller-manager-5fcdb54b6brxbww" (UID: "7761d2b0-8cc7-4dc8-a956-df20e2efc081") : secret "openstack-baremetal-operator-webhook-server-cert" not found Nov 27 16:17:55 crc kubenswrapper[4707]: E1127 16:17:55.370886 4707 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-mvlbb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cinder-operator-controller-manager-6b7f75547b-sgwpt_openstack-operators(6d78cbb1-56e6-428d-bba4-5d1edbbda363): ErrImagePull: pull QPS 
exceeded" logger="UnhandledError" Nov 27 16:17:55 crc kubenswrapper[4707]: E1127 16:17:55.372071 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/cinder-operator-controller-manager-6b7f75547b-sgwpt" podUID="6d78cbb1-56e6-428d-bba4-5d1edbbda363" Nov 27 16:17:55 crc kubenswrapper[4707]: I1127 16:17:55.468706 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-64cdc6ff96-blnpn" event={"ID":"7c88f676-b4d3-46b2-aedd-eff62f8f1bfb","Type":"ContainerStarted","Data":"bad9aa6c8b6836d85572ace476e45a38f36f3851b57fc0b54706297b363f3b41"} Nov 27 16:17:55 crc kubenswrapper[4707]: I1127 16:17:55.479816 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-6fdcddb789-4bqw7" event={"ID":"9377949b-5979-44ff-bd3f-ea1389b4ef6f","Type":"ContainerStarted","Data":"44e2ebec5bc09fe070565cab075285555caa10bda218fd6849ebfb2afc1c2a38"} Nov 27 16:17:55 crc kubenswrapper[4707]: I1127 16:17:55.503966 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-589cbd6b5b-cd4lh" event={"ID":"99785491-bcbd-4946-b1a6-a3e08a4394b5","Type":"ContainerStarted","Data":"10c00bf6d5960a8f488a67a9f620b253c55418a65b8d0e008be77ff62a5570e5"} Nov 27 16:17:55 crc kubenswrapper[4707]: I1127 16:17:55.507043 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-955677c94-f5g2d" event={"ID":"5e9f9859-1f28-4183-b71c-e9459e2746b7","Type":"ContainerStarted","Data":"8c216f174dc56fee3bd82d01b722e7f0e4fda734203319e0954e1a8c02eb1e27"} Nov 27 16:17:55 crc kubenswrapper[4707]: I1127 16:17:55.508865 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-6b7f75547b-sgwpt" 
event={"ID":"6d78cbb1-56e6-428d-bba4-5d1edbbda363","Type":"ContainerStarted","Data":"da0652d9fe9031c6f2c8a41f904a577083044cd07e8e1b2796d76fc1450a7d02"} Nov 27 16:17:55 crc kubenswrapper[4707]: I1127 16:17:55.509014 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/cinder-operator-controller-manager-6b7f75547b-sgwpt" Nov 27 16:17:55 crc kubenswrapper[4707]: I1127 16:17:55.510196 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-7b4567c7cf-4cd96" event={"ID":"dc2542ce-f2fd-454b-b47f-92d3bbc93d91","Type":"ContainerStarted","Data":"616d660af3311cb33032cb8e1eae2ef89a1b0891e11f1ac4a32485f1b860f435"} Nov 27 16:17:55 crc kubenswrapper[4707]: E1127 16:17:55.510505 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"" pod="openstack-operators/cinder-operator-controller-manager-6b7f75547b-sgwpt" podUID="6d78cbb1-56e6-428d-bba4-5d1edbbda363" Nov 27 16:17:55 crc kubenswrapper[4707]: I1127 16:17:55.513896 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-79556f57fc-q6vcr" event={"ID":"20e41446-8d89-481e-bd9f-48dc14efb82e","Type":"ContainerStarted","Data":"2eeda95e27274134710ecb9d128af1075cdcafed74f758f2fd67ede0dd909b40"} Nov 27 16:17:55 crc kubenswrapper[4707]: I1127 16:17:55.516658 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-7b64f4fb85-nhkfx" event={"ID":"0267ac3a-4bee-42b9-a506-e2b1e1e3726e","Type":"ContainerStarted","Data":"ee50ad9bf4fbb6c4c47c952991e6d5b9490a4ff181519e85b1d80551c6039ac4"} Nov 27 16:17:55 crc kubenswrapper[4707]: I1127 16:17:55.520412 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/horizon-operator-controller-manager-5d494799bf-hg8j4" event={"ID":"ac946592-ee39-443e-b64a-980caaace080","Type":"ContainerStarted","Data":"d2c162261eaec7e14149a2a678401e91c9b7a24d65291d807c0eeff632b714b6"}
Nov 27 16:17:55 crc kubenswrapper[4707]: I1127 16:17:55.835903 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/894c1749-fccf-4178-b7a8-6c63e18266f6-metrics-certs\") pod \"openstack-operator-controller-manager-54fcbb7454-dqgm9\" (UID: \"894c1749-fccf-4178-b7a8-6c63e18266f6\") " pod="openstack-operators/openstack-operator-controller-manager-54fcbb7454-dqgm9"
Nov 27 16:17:55 crc kubenswrapper[4707]: I1127 16:17:55.836144 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/894c1749-fccf-4178-b7a8-6c63e18266f6-webhook-certs\") pod \"openstack-operator-controller-manager-54fcbb7454-dqgm9\" (UID: \"894c1749-fccf-4178-b7a8-6c63e18266f6\") " pod="openstack-operators/openstack-operator-controller-manager-54fcbb7454-dqgm9"
Nov 27 16:17:55 crc kubenswrapper[4707]: I1127 16:17:55.841772 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/894c1749-fccf-4178-b7a8-6c63e18266f6-metrics-certs\") pod \"openstack-operator-controller-manager-54fcbb7454-dqgm9\" (UID: \"894c1749-fccf-4178-b7a8-6c63e18266f6\") " pod="openstack-operators/openstack-operator-controller-manager-54fcbb7454-dqgm9"
Nov 27 16:17:55 crc kubenswrapper[4707]: I1127 16:17:55.841906 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/894c1749-fccf-4178-b7a8-6c63e18266f6-webhook-certs\") pod \"openstack-operator-controller-manager-54fcbb7454-dqgm9\" (UID: \"894c1749-fccf-4178-b7a8-6c63e18266f6\") " pod="openstack-operators/openstack-operator-controller-manager-54fcbb7454-dqgm9"
Nov 27 16:17:56 crc kubenswrapper[4707]: I1127 16:17:56.083027 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-n8k5f"
Nov 27 16:17:56 crc kubenswrapper[4707]: I1127 16:17:56.091627 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-54fcbb7454-dqgm9"
Nov 27 16:17:56 crc kubenswrapper[4707]: I1127 16:17:56.527909 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-5d499bf58b-x5k9j" event={"ID":"126e10c6-2740-47fc-8331-a8e4bb6549b8","Type":"ContainerStarted","Data":"0d00b5f8485a669d6680dfe4c098445cb30ab8748664a0af2834d3d86a4bb0ba"}
Nov 27 16:17:56 crc kubenswrapper[4707]: E1127 16:17:56.528576 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"" pod="openstack-operators/cinder-operator-controller-manager-6b7f75547b-sgwpt" podUID="6d78cbb1-56e6-428d-bba4-5d1edbbda363"
Nov 27 16:18:02 crc kubenswrapper[4707]: I1127 16:18:02.875754 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-54fcbb7454-dqgm9"]
Nov 27 16:18:02 crc kubenswrapper[4707]: W1127 16:18:02.928806 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod894c1749_fccf_4178_b7a8_6c63e18266f6.slice/crio-23388709991d23dd9e1c8d593c7bbff331451e8f99328486dd7005dd415521c8 WatchSource:0}: Error finding container 23388709991d23dd9e1c8d593c7bbff331451e8f99328486dd7005dd415521c8: Status 404 returned error can't find the container with id 23388709991d23dd9e1c8d593c7bbff331451e8f99328486dd7005dd415521c8
Nov 27 16:18:03 crc kubenswrapper[4707]: E1127 16:18:03.082714 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/heat-operator-controller-manager-5b77f656f-8qhwz" podUID="a2c3c27e-3e0c-4d4e-a5f7-3deaaae0f660"
Nov 27 16:18:03 crc kubenswrapper[4707]: I1127 16:18:03.573407 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-57988cc5b5-v48r5" event={"ID":"3c3bf501-b545-45a2-b186-2df94990295d","Type":"ContainerStarted","Data":"def257f50084e289b6a9f8859dc103a4a75e5a7cdf68e47e016035b42613b9ed"}
Nov 27 16:18:03 crc kubenswrapper[4707]: I1127 16:18:03.574986 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-66f4dd4bc7-dfw47" event={"ID":"50771ff9-4409-4f41-ad3c-98f730dbff77","Type":"ContainerStarted","Data":"f9e8685b8c77069d56afa0f8e86f684cc2f17e9dbfe589da3e8896b2be57a25f"}
Nov 27 16:18:03 crc kubenswrapper[4707]: I1127 16:18:03.576999 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-64cdc6ff96-blnpn" event={"ID":"7c88f676-b4d3-46b2-aedd-eff62f8f1bfb","Type":"ContainerStarted","Data":"1c72d9d6674dd1085698535484eea89ad8d056a494ff63e20fa6acd625de9843"}
Nov 27 16:18:03 crc kubenswrapper[4707]: I1127 16:18:03.577913 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/octavia-operator-controller-manager-64cdc6ff96-blnpn"
Nov 27 16:18:03 crc kubenswrapper[4707]: I1127 16:18:03.580060 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-5b77f656f-8qhwz" event={"ID":"a2c3c27e-3e0c-4d4e-a5f7-3deaaae0f660","Type":"ContainerStarted","Data":"5be9485df5015d09e183642f3b9e85866577d11dfd8b9cf4a7bd6913be43a0c2"}
Nov 27 16:18:03 crc kubenswrapper[4707]: E1127 16:18:03.580939 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/heat-operator@sha256:2ee37ff474bee3203447df4f326a9279a515e770573153338296dd074722c677\\\"\"" pod="openstack-operators/heat-operator-controller-manager-5b77f656f-8qhwz" podUID="a2c3c27e-3e0c-4d4e-a5f7-3deaaae0f660"
Nov 27 16:18:03 crc kubenswrapper[4707]: I1127 16:18:03.582426 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/octavia-operator-controller-manager-64cdc6ff96-blnpn"
Nov 27 16:18:03 crc kubenswrapper[4707]: I1127 16:18:03.583580 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5cd6c7f4c8-qwb4t" event={"ID":"97bb6c80-6996-4e91-bcdf-0f1c20e72fa3","Type":"ContainerStarted","Data":"6ff217355661c06a470a5a3640a7dc75e7d00cee5fea45578e3cebc4c80f7dea"}
Nov 27 16:18:03 crc kubenswrapper[4707]: I1127 16:18:03.585019 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-v96x7" event={"ID":"6647f986-9d62-4939-907b-fde960b30a37","Type":"ContainerStarted","Data":"0cc953d0426a8d163a1bfc44ca09c4684a912f46a69e849f7b9dcc1df1875cae"}
Nov 27 16:18:03 crc kubenswrapper[4707]: I1127 16:18:03.587608 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-54fcbb7454-dqgm9" event={"ID":"894c1749-fccf-4178-b7a8-6c63e18266f6","Type":"ContainerStarted","Data":"23388709991d23dd9e1c8d593c7bbff331451e8f99328486dd7005dd415521c8"}
Nov 27 16:18:03 crc kubenswrapper[4707]: I1127 16:18:03.598692 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/octavia-operator-controller-manager-64cdc6ff96-blnpn" podStartSLOduration=2.588685815 podStartE2EDuration="24.598673974s" podCreationTimestamp="2025-11-27 16:17:39 +0000 UTC" firstStartedPulling="2025-11-27 16:17:40.654131564 +0000 UTC m=+836.285580332" lastFinishedPulling="2025-11-27 16:18:02.664119713 +0000 UTC m=+858.295568491" observedRunningTime="2025-11-27 16:18:03.594256366 +0000 UTC m=+859.225705134" watchObservedRunningTime="2025-11-27 16:18:03.598673974 +0000 UTC m=+859.230122742"
Nov 27 16:18:03 crc kubenswrapper[4707]: I1127 16:18:03.602005 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-7b64f4fb85-nhkfx" event={"ID":"0267ac3a-4bee-42b9-a506-e2b1e1e3726e","Type":"ContainerStarted","Data":"080cf4802a04303505631a83dfd6c7d9b2122f61ebbbb49138ffb3f6f3c0c4ea"}
Nov 27 16:18:03 crc kubenswrapper[4707]: I1127 16:18:03.602977 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/barbican-operator-controller-manager-7b64f4fb85-nhkfx"
Nov 27 16:18:03 crc kubenswrapper[4707]: I1127 16:18:03.604531 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/barbican-operator-controller-manager-7b64f4fb85-nhkfx"
Nov 27 16:18:03 crc kubenswrapper[4707]: I1127 16:18:03.609485 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-d77b94747-qdh2s" event={"ID":"efa00950-13dd-4e9e-a215-6ebb89006545","Type":"ContainerStarted","Data":"2e619212a2cb21d1ceb4db4437128c3da84e4b454db603366c538a508d8bc621"}
Nov 27 16:18:03 crc kubenswrapper[4707]: I1127 16:18:03.614789 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-v96x7" podStartSLOduration=2.943847174 podStartE2EDuration="24.614774748s" podCreationTimestamp="2025-11-27 16:17:39 +0000 UTC" firstStartedPulling="2025-11-27 16:17:40.830537265 +0000 UTC m=+836.461986033" lastFinishedPulling="2025-11-27 16:18:02.501464829 +0000 UTC m=+858.132913607" observedRunningTime="2025-11-27 16:18:03.613551428 +0000 UTC m=+859.245000196" watchObservedRunningTime="2025-11-27 16:18:03.614774748 +0000 UTC m=+859.246223516"
Nov 27 16:18:03 crc kubenswrapper[4707]: I1127 16:18:03.617388 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-656dcb59d4-hgbtk" event={"ID":"00e99b4b-2bbd-445a-b075-74c47fe30f79","Type":"ContainerStarted","Data":"5b6d40a85eab099f0182f676055dba065744a6873eaebd8c8015dd94cbcd9258"}
Nov 27 16:18:03 crc kubenswrapper[4707]: I1127 16:18:03.618813 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-67cb4dc6d4-5bj45" event={"ID":"8fea437e-0a8c-4836-b23c-56db9c7ea0fc","Type":"ContainerStarted","Data":"7454d03dd4d01b5bbe2724ffddac40448703afd113e1e1a474d0adc1d3168fce"}
Nov 27 16:18:03 crc kubenswrapper[4707]: I1127 16:18:03.620344 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-6fdcddb789-4bqw7" event={"ID":"9377949b-5979-44ff-bd3f-ea1389b4ef6f","Type":"ContainerStarted","Data":"e493c4c866b62a5e434d58661ebab3cec188ce8531b370f3209077c25b17e745"}
Nov 27 16:18:03 crc kubenswrapper[4707]: I1127 16:18:03.621205 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/neutron-operator-controller-manager-6fdcddb789-4bqw7"
Nov 27 16:18:03 crc kubenswrapper[4707]: I1127 16:18:03.623238 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/neutron-operator-controller-manager-6fdcddb789-4bqw7"
Nov 27 16:18:03 crc kubenswrapper[4707]: E1127 16:18:03.637702 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/telemetry-operator-controller-manager-8665cb7d49-w7xqm" podUID="94ba089c-d890-4d57-abc0-258a2b54a6f9"
Nov 27 16:18:03 crc kubenswrapper[4707]: E1127 16:18:03.711935 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/ovn-operator-controller-manager-56897c768d-nbgp6" podUID="6aa8bbf9-4d33-4b2e-b53a-d6a341b3d841"
Nov 27 16:18:03 crc kubenswrapper[4707]: I1127 16:18:03.713988 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/barbican-operator-controller-manager-7b64f4fb85-nhkfx" podStartSLOduration=2.19249159 podStartE2EDuration="24.713976338s" podCreationTimestamp="2025-11-27 16:17:39 +0000 UTC" firstStartedPulling="2025-11-27 16:17:40.127544126 +0000 UTC m=+835.758992904" lastFinishedPulling="2025-11-27 16:18:02.649028874 +0000 UTC m=+858.280477652" observedRunningTime="2025-11-27 16:18:03.70590416 +0000 UTC m=+859.337352928" watchObservedRunningTime="2025-11-27 16:18:03.713976338 +0000 UTC m=+859.345425096"
Nov 27 16:18:03 crc kubenswrapper[4707]: I1127 16:18:03.753457 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/neutron-operator-controller-manager-6fdcddb789-4bqw7" podStartSLOduration=2.533201226 podStartE2EDuration="24.753433335s" podCreationTimestamp="2025-11-27 16:17:39 +0000 UTC" firstStartedPulling="2025-11-27 16:17:40.445459113 +0000 UTC m=+836.076907881" lastFinishedPulling="2025-11-27 16:18:02.665691202 +0000 UTC m=+858.297139990" observedRunningTime="2025-11-27 16:18:03.736955741 +0000 UTC m=+859.368404509" watchObservedRunningTime="2025-11-27 16:18:03.753433335 +0000 UTC m=+859.384882103"
Nov 27 16:18:04 crc kubenswrapper[4707]: I1127 16:18:04.626766 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-5d494799bf-hg8j4" event={"ID":"ac946592-ee39-443e-b64a-980caaace080","Type":"ContainerStarted","Data":"2ac2e607bd64d9698deb768456cfd21eec8cf30f77ffbdfa86eb43197c98b463"}
Nov 27 16:18:04 crc kubenswrapper[4707]: I1127 16:18:04.627031 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/horizon-operator-controller-manager-5d494799bf-hg8j4"
Nov 27 16:18:04 crc kubenswrapper[4707]: I1127 16:18:04.628269 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-589cbd6b5b-cd4lh" event={"ID":"99785491-bcbd-4946-b1a6-a3e08a4394b5","Type":"ContainerStarted","Data":"174996df0971ba9d9b57c7b3ccf383101406875fac8fe712a11eb8a2c78449f1"}
Nov 27 16:18:04 crc kubenswrapper[4707]: I1127 16:18:04.628460 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/glance-operator-controller-manager-589cbd6b5b-cd4lh"
Nov 27 16:18:04 crc kubenswrapper[4707]: I1127 16:18:04.628760 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/horizon-operator-controller-manager-5d494799bf-hg8j4"
Nov 27 16:18:04 crc kubenswrapper[4707]: I1127 16:18:04.629727 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/glance-operator-controller-manager-589cbd6b5b-cd4lh"
Nov 27 16:18:04 crc kubenswrapper[4707]: I1127 16:18:04.629932 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-79556f57fc-q6vcr" event={"ID":"20e41446-8d89-481e-bd9f-48dc14efb82e","Type":"ContainerStarted","Data":"d11f3f6f13438525d3d0de43b372fdfd4d737d53232797d1a7e3fd402a067eab"}
Nov 27 16:18:04 crc kubenswrapper[4707]: I1127 16:18:04.630094 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/nova-operator-controller-manager-79556f57fc-q6vcr"
Nov 27 16:18:04 crc kubenswrapper[4707]: I1127 16:18:04.631532 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-d77b94747-qdh2s" event={"ID":"efa00950-13dd-4e9e-a215-6ebb89006545","Type":"ContainerStarted","Data":"45176cc70a3ae9858a9d4511df764b3e4ccf3a36aa1d5987d26b23bbca507b0e"}
Nov 27 16:18:04 crc kubenswrapper[4707]: I1127 16:18:04.631578 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/swift-operator-controller-manager-d77b94747-qdh2s"
Nov 27 16:18:04 crc kubenswrapper[4707]: I1127 16:18:04.632200 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/nova-operator-controller-manager-79556f57fc-q6vcr"
Nov 27 16:18:04 crc kubenswrapper[4707]: I1127 16:18:04.633080 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-955677c94-f5g2d" event={"ID":"5e9f9859-1f28-4183-b71c-e9459e2746b7","Type":"ContainerStarted","Data":"cb042af645d31ecf0db055539b49c0808e4b68e09158b1a7c74965837375085c"}
Nov 27 16:18:04 crc kubenswrapper[4707]: I1127 16:18:04.633214 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/designate-operator-controller-manager-955677c94-f5g2d"
Nov 27 16:18:04 crc kubenswrapper[4707]: I1127 16:18:04.634713 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/designate-operator-controller-manager-955677c94-f5g2d"
Nov 27 16:18:04 crc kubenswrapper[4707]: I1127 16:18:04.635266 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-57988cc5b5-v48r5" event={"ID":"3c3bf501-b545-45a2-b186-2df94990295d","Type":"ContainerStarted","Data":"b7a0febde020b2a6a27a7a96677a794159e50873abd107c37146383939a50dea"}
Nov 27 16:18:04 crc kubenswrapper[4707]: I1127 16:18:04.635352 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/placement-operator-controller-manager-57988cc5b5-v48r5"
Nov 27 16:18:04 crc kubenswrapper[4707]: I1127 16:18:04.636666 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-66f4dd4bc7-dfw47" event={"ID":"50771ff9-4409-4f41-ad3c-98f730dbff77","Type":"ContainerStarted","Data":"e68f6e61de0ea5805560760b98d0048426b5f1e819bf9e949399e0b0ea6e5d48"}
Nov 27 16:18:04 crc kubenswrapper[4707]: I1127 16:18:04.636785 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/mariadb-operator-controller-manager-66f4dd4bc7-dfw47"
Nov 27 16:18:04 crc kubenswrapper[4707]: I1127 16:18:04.637744 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-56897c768d-nbgp6" event={"ID":"6aa8bbf9-4d33-4b2e-b53a-d6a341b3d841","Type":"ContainerStarted","Data":"118cb1a7a57516aad309068389649e89fb633e56f90380905477ba85c258dd3f"}
Nov 27 16:18:04 crc kubenswrapper[4707]: I1127 16:18:04.639099 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-656dcb59d4-hgbtk" event={"ID":"00e99b4b-2bbd-445a-b075-74c47fe30f79","Type":"ContainerStarted","Data":"3259d93800297f39adcbf5426d852f0fd2240026cec4e881e87264ac2b07d662"}
Nov 27 16:18:04 crc kubenswrapper[4707]: I1127 16:18:04.639190 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/watcher-operator-controller-manager-656dcb59d4-hgbtk"
Nov 27 16:18:04 crc kubenswrapper[4707]: I1127 16:18:04.641006 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-54fcbb7454-dqgm9" event={"ID":"894c1749-fccf-4178-b7a8-6c63e18266f6","Type":"ContainerStarted","Data":"bf3efbd4878cde0d3697646057a3fccbb041897c2a809e7dd960d8ad23682fa2"}
Nov 27 16:18:04 crc kubenswrapper[4707]: I1127 16:18:04.641120 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-manager-54fcbb7454-dqgm9"
Nov 27 16:18:04 crc kubenswrapper[4707]: I1127 16:18:04.642547 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-7b4567c7cf-4cd96" event={"ID":"dc2542ce-f2fd-454b-b47f-92d3bbc93d91","Type":"ContainerStarted","Data":"b874e7df121b3e45d90da0bbbc9f7b076435d284a2552d99892bb288fec50ecc"}
Nov 27 16:18:04 crc kubenswrapper[4707]: I1127 16:18:04.642678 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/keystone-operator-controller-manager-7b4567c7cf-4cd96"
Nov 27 16:18:04 crc kubenswrapper[4707]: I1127 16:18:04.643985 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5cd6c7f4c8-qwb4t" event={"ID":"97bb6c80-6996-4e91-bcdf-0f1c20e72fa3","Type":"ContainerStarted","Data":"54df8351a63473fb43453685682408c1ba182018f77c9c97eaa61403971e2f74"}
Nov 27 16:18:04 crc kubenswrapper[4707]: I1127 16:18:04.644067 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/keystone-operator-controller-manager-7b4567c7cf-4cd96"
Nov 27 16:18:04 crc kubenswrapper[4707]: I1127 16:18:04.644095 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/test-operator-controller-manager-5cd6c7f4c8-qwb4t"
Nov 27 16:18:04 crc kubenswrapper[4707]: I1127 16:18:04.645658 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-5d499bf58b-x5k9j" event={"ID":"126e10c6-2740-47fc-8331-a8e4bb6549b8","Type":"ContainerStarted","Data":"1c65bd6f7c5eb3f9ba797eb155148009280a934460f9e2c2639d529fcef30d18"}
Nov 27 16:18:04 crc kubenswrapper[4707]: I1127 16:18:04.645760 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/manila-operator-controller-manager-5d499bf58b-x5k9j"
Nov 27 16:18:04 crc kubenswrapper[4707]: I1127 16:18:04.646952 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/manila-operator-controller-manager-5d499bf58b-x5k9j"
Nov 27 16:18:04 crc kubenswrapper[4707]: I1127 16:18:04.647205 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-67cb4dc6d4-5bj45" event={"ID":"8fea437e-0a8c-4836-b23c-56db9c7ea0fc","Type":"ContainerStarted","Data":"11ba30e00bb824d604aca8b6a4efda6f51e3c4925ce4c4180cc623e411294d91"}
Nov 27 16:18:04 crc kubenswrapper[4707]: I1127 16:18:04.647409 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ironic-operator-controller-manager-67cb4dc6d4-5bj45"
Nov 27 16:18:04 crc kubenswrapper[4707]: I1127 16:18:04.649639 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-8665cb7d49-w7xqm" event={"ID":"94ba089c-d890-4d57-abc0-258a2b54a6f9","Type":"ContainerStarted","Data":"d38ff78c947af24094f3956e097bd27fa585977c67f5e5d0426b8d6f675f4f65"}
Nov 27 16:18:04 crc kubenswrapper[4707]: I1127 16:18:04.654392 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/horizon-operator-controller-manager-5d494799bf-hg8j4" podStartSLOduration=3.458117019 podStartE2EDuration="25.654376361s" podCreationTimestamp="2025-11-27 16:17:39 +0000 UTC" firstStartedPulling="2025-11-27 16:17:40.444974021 +0000 UTC m=+836.076422789" lastFinishedPulling="2025-11-27 16:18:02.641233333 +0000 UTC m=+858.272682131" observedRunningTime="2025-11-27 16:18:04.648002895 +0000 UTC m=+860.279451663" watchObservedRunningTime="2025-11-27 16:18:04.654376361 +0000 UTC m=+860.285825129"
Nov 27 16:18:04 crc kubenswrapper[4707]: I1127 16:18:04.706773 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/glance-operator-controller-manager-589cbd6b5b-cd4lh" podStartSLOduration=3.32624509 podStartE2EDuration="25.706755574s" podCreationTimestamp="2025-11-27 16:17:39 +0000 UTC" firstStartedPulling="2025-11-27 16:17:40.29139413 +0000 UTC m=+835.922842898" lastFinishedPulling="2025-11-27 16:18:02.671904594 +0000 UTC m=+858.303353382" observedRunningTime="2025-11-27 16:18:04.691192063 +0000 UTC m=+860.322640821" watchObservedRunningTime="2025-11-27 16:18:04.706755574 +0000 UTC m=+860.338204332"
Nov 27 16:18:04 crc kubenswrapper[4707]: I1127 16:18:04.707312 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/placement-operator-controller-manager-57988cc5b5-v48r5" podStartSLOduration=4.544761325 podStartE2EDuration="25.707305197s" podCreationTimestamp="2025-11-27 16:17:39 +0000 UTC" firstStartedPulling="2025-11-27 16:17:40.68989433 +0000 UTC m=+836.321343088" lastFinishedPulling="2025-11-27 16:18:01.852438162 +0000 UTC m=+857.483886960" observedRunningTime="2025-11-27 16:18:04.706161789 +0000 UTC m=+860.337610557" watchObservedRunningTime="2025-11-27 16:18:04.707305197 +0000 UTC m=+860.338753965"
Nov 27 16:18:04 crc kubenswrapper[4707]: I1127 16:18:04.723658 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ironic-operator-controller-manager-67cb4dc6d4-5bj45" podStartSLOduration=4.009879945 podStartE2EDuration="25.723643048s" podCreationTimestamp="2025-11-27 16:17:39 +0000 UTC" firstStartedPulling="2025-11-27 16:17:40.669828729 +0000 UTC m=+836.301277497" lastFinishedPulling="2025-11-27 16:18:02.383591792 +0000 UTC m=+858.015040600" observedRunningTime="2025-11-27 16:18:04.720213414 +0000 UTC m=+860.351662182" watchObservedRunningTime="2025-11-27 16:18:04.723643048 +0000 UTC m=+860.355091816"
Nov 27 16:18:04 crc kubenswrapper[4707]: I1127 16:18:04.773565 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-manager-54fcbb7454-dqgm9" podStartSLOduration=25.77355699 podStartE2EDuration="25.77355699s" podCreationTimestamp="2025-11-27 16:17:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-27 16:18:04.772851863 +0000 UTC m=+860.404300631" watchObservedRunningTime="2025-11-27 16:18:04.77355699 +0000 UTC m=+860.405005758"
Nov 27 16:18:04 crc kubenswrapper[4707]: I1127 16:18:04.793049 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/nova-operator-controller-manager-79556f57fc-q6vcr" podStartSLOduration=3.943426416 podStartE2EDuration="25.793029837s" podCreationTimestamp="2025-11-27 16:17:39 +0000 UTC" firstStartedPulling="2025-11-27 16:17:40.65313619 +0000 UTC m=+836.284584958" lastFinishedPulling="2025-11-27 16:18:02.502739621 +0000 UTC m=+858.134188379" observedRunningTime="2025-11-27 16:18:04.788727382 +0000 UTC m=+860.420176150" watchObservedRunningTime="2025-11-27 16:18:04.793029837 +0000 UTC m=+860.424478605"
Nov 27 16:18:04 crc kubenswrapper[4707]: I1127 16:18:04.824019 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/keystone-operator-controller-manager-7b4567c7cf-4cd96" podStartSLOduration=3.60017899 podStartE2EDuration="25.824000226s" podCreationTimestamp="2025-11-27 16:17:39 +0000 UTC" firstStartedPulling="2025-11-27 16:17:40.445880744 +0000 UTC m=+836.077329512" lastFinishedPulling="2025-11-27 16:18:02.66970198 +0000 UTC m=+858.301150748" observedRunningTime="2025-11-27 16:18:04.814948824 +0000 UTC m=+860.446397582" watchObservedRunningTime="2025-11-27 16:18:04.824000226 +0000 UTC m=+860.455448994"
Nov 27 16:18:04 crc kubenswrapper[4707]: I1127 16:18:04.835336 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/manila-operator-controller-manager-5d499bf58b-x5k9j" podStartSLOduration=3.609350184 podStartE2EDuration="25.835316953s" podCreationTimestamp="2025-11-27 16:17:39 +0000 UTC" firstStartedPulling="2025-11-27 16:17:40.445238108 +0000 UTC m=+836.076686876" lastFinishedPulling="2025-11-27 16:18:02.671204877 +0000 UTC m=+858.302653645" observedRunningTime="2025-11-27 16:18:04.832390071 +0000 UTC m=+860.463838839" watchObservedRunningTime="2025-11-27 16:18:04.835316953 +0000 UTC m=+860.466765721"
Nov 27 16:18:04 crc kubenswrapper[4707]: I1127 16:18:04.858519 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/test-operator-controller-manager-5cd6c7f4c8-qwb4t" podStartSLOduration=6.984659287 podStartE2EDuration="25.858502401s" podCreationTimestamp="2025-11-27 16:17:39 +0000 UTC" firstStartedPulling="2025-11-27 16:17:40.80255653 +0000 UTC m=+836.434005298" lastFinishedPulling="2025-11-27 16:17:59.676399644 +0000 UTC m=+855.307848412" observedRunningTime="2025-11-27 16:18:04.852662768 +0000 UTC m=+860.484111536" watchObservedRunningTime="2025-11-27 16:18:04.858502401 +0000 UTC m=+860.489951169"
Nov 27 16:18:04 crc kubenswrapper[4707]: I1127 16:18:04.874265 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/watcher-operator-controller-manager-656dcb59d4-hgbtk" podStartSLOduration=7.022841703 podStartE2EDuration="25.874250667s" podCreationTimestamp="2025-11-27 16:17:39 +0000 UTC" firstStartedPulling="2025-11-27 16:17:40.811448558 +0000 UTC m=+836.442897326" lastFinishedPulling="2025-11-27 16:17:59.662857522 +0000 UTC m=+855.294306290" observedRunningTime="2025-11-27 16:18:04.867908841 +0000 UTC m=+860.499357609" watchObservedRunningTime="2025-11-27 16:18:04.874250667 +0000 UTC m=+860.505699435"
Nov 27 16:18:04 crc kubenswrapper[4707]: I1127 16:18:04.886766 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/swift-operator-controller-manager-d77b94747-qdh2s" podStartSLOduration=5.880673327 podStartE2EDuration="25.886750253s" podCreationTimestamp="2025-11-27 16:17:39 +0000 UTC" firstStartedPulling="2025-11-27 16:17:40.701953306 +0000 UTC m=+836.333402074" lastFinishedPulling="2025-11-27 16:18:00.708030212 +0000 UTC m=+856.339479000" observedRunningTime="2025-11-27 16:18:04.885721757 +0000 UTC m=+860.517170525" watchObservedRunningTime="2025-11-27 16:18:04.886750253 +0000 UTC m=+860.518199021"
Nov 27 16:18:04 crc kubenswrapper[4707]: I1127 16:18:04.902743 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/designate-operator-controller-manager-955677c94-f5g2d" podStartSLOduration=3.763672184 podStartE2EDuration="25.902724744s" podCreationTimestamp="2025-11-27 16:17:39 +0000 UTC" firstStartedPulling="2025-11-27 16:17:40.445675569 +0000 UTC m=+836.077124337" lastFinishedPulling="2025-11-27 16:18:02.584728129 +0000 UTC m=+858.216176897" observedRunningTime="2025-11-27 16:18:04.897995798 +0000 UTC m=+860.529444586" watchObservedRunningTime="2025-11-27 16:18:04.902724744 +0000 UTC m=+860.534173512"
Nov 27 16:18:04 crc kubenswrapper[4707]: I1127 16:18:04.944789 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/mariadb-operator-controller-manager-66f4dd4bc7-dfw47" podStartSLOduration=4.381744073 podStartE2EDuration="25.944775634s" podCreationTimestamp="2025-11-27 16:17:39 +0000 UTC" firstStartedPulling="2025-11-27 16:17:40.822200761 +0000 UTC m=+836.453649529" lastFinishedPulling="2025-11-27 16:18:02.385232322 +0000 UTC m=+858.016681090" observedRunningTime="2025-11-27 16:18:04.942027627 +0000 UTC m=+860.573476395" watchObservedRunningTime="2025-11-27 16:18:04.944775634 +0000 UTC m=+860.576224402"
Nov 27 16:18:05 crc kubenswrapper[4707]: I1127 16:18:05.665010 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-8665cb7d49-w7xqm" event={"ID":"94ba089c-d890-4d57-abc0-258a2b54a6f9","Type":"ContainerStarted","Data":"adc6e701090905cebced4fdee1ff3550fd1e643affd1979fb7377b213c755977"}
Nov 27 16:18:05 crc kubenswrapper[4707]: I1127 16:18:05.706465 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/telemetry-operator-controller-manager-8665cb7d49-w7xqm" podStartSLOduration=2.426505222 podStartE2EDuration="26.70644851s" podCreationTimestamp="2025-11-27 16:17:39 +0000 UTC" firstStartedPulling="2025-11-27 16:17:40.784849816 +0000 UTC m=+836.416298584" lastFinishedPulling="2025-11-27 16:18:05.064793104 +0000 UTC m=+860.696241872" observedRunningTime="2025-11-27 16:18:05.703024736 +0000 UTC m=+861.334473524" watchObservedRunningTime="2025-11-27 16:18:05.70644851 +0000 UTC m=+861.337897278"
Nov 27 16:18:06 crc kubenswrapper[4707]: I1127 16:18:06.680997 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-5b77f656f-8qhwz" event={"ID":"a2c3c27e-3e0c-4d4e-a5f7-3deaaae0f660","Type":"ContainerStarted","Data":"9898a38c60da162149f7adff54f2238cbc3a498214a5976ec1d2f6557469d925"}
Nov 27 16:18:06 crc kubenswrapper[4707]: I1127 16:18:06.682105 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/heat-operator-controller-manager-5b77f656f-8qhwz"
Nov 27 16:18:06 crc kubenswrapper[4707]: I1127 16:18:06.684552 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-56897c768d-nbgp6" event={"ID":"6aa8bbf9-4d33-4b2e-b53a-d6a341b3d841","Type":"ContainerStarted","Data":"4857fd165c44456e52de3ecf9cb5268136c0ea7dd4e24f902de29e429825850b"}
Nov 27 16:18:06 crc kubenswrapper[4707]: I1127 16:18:06.685541 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/telemetry-operator-controller-manager-8665cb7d49-w7xqm"
Nov 27 16:18:06 crc kubenswrapper[4707]: I1127 16:18:06.707522 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/heat-operator-controller-manager-5b77f656f-8qhwz" podStartSLOduration=2.065748056 podStartE2EDuration="27.707485859s" podCreationTimestamp="2025-11-27 16:17:39 +0000 UTC" firstStartedPulling="2025-11-27 16:17:40.291337018 +0000 UTC m=+835.922785786" lastFinishedPulling="2025-11-27 16:18:05.933074821 +0000 UTC m=+861.564523589" observedRunningTime="2025-11-27 16:18:06.704000474 +0000 UTC m=+862.335449252" watchObservedRunningTime="2025-11-27 16:18:06.707485859 +0000 UTC m=+862.338934667"
Nov 27 16:18:06 crc kubenswrapper[4707]: I1127 16:18:06.731778 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ovn-operator-controller-manager-56897c768d-nbgp6" podStartSLOduration=2.691188366 podStartE2EDuration="27.731714023s" podCreationTimestamp="2025-11-27 16:17:39 +0000 UTC" firstStartedPulling="2025-11-27 16:17:40.653084999 +0000 UTC m=+836.284533767" lastFinishedPulling="2025-11-27 16:18:05.693610646 +0000 UTC m=+861.325059424" observedRunningTime="2025-11-27 16:18:06.726986697 +0000 UTC m=+862.358435475" watchObservedRunningTime="2025-11-27 16:18:06.731714023 +0000 UTC m=+862.363162791"
Nov 27 16:18:07 crc kubenswrapper[4707]: I1127 16:18:07.692199 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ovn-operator-controller-manager-56897c768d-nbgp6"
Nov 27 16:18:09 crc kubenswrapper[4707]: I1127 16:18:09.211923 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/cinder-operator-controller-manager-6b7f75547b-sgwpt"
Nov 27 16:18:09 crc kubenswrapper[4707]: I1127 16:18:09.543977 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ironic-operator-controller-manager-67cb4dc6d4-5bj45"
Nov 27 16:18:09 crc kubenswrapper[4707]: I1127 16:18:09.712407 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-6b7f75547b-sgwpt" event={"ID":"6d78cbb1-56e6-428d-bba4-5d1edbbda363","Type":"ContainerStarted","Data":"d459218f136e64eb21552f55a729b7bcb6f865f4766b6c9dfc558f2d18a5de36"}
Nov 27 16:18:09 crc kubenswrapper[4707]: I1127 16:18:09.718458 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/swift-operator-controller-manager-d77b94747-qdh2s"
Nov 27 16:18:09 crc kubenswrapper[4707]: I1127 16:18:09.740533 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/cinder-operator-controller-manager-6b7f75547b-sgwpt" podStartSLOduration=16.762434369 podStartE2EDuration="30.740516639s" podCreationTimestamp="2025-11-27 16:17:39 +0000 UTC" firstStartedPulling="2025-11-27 16:17:40.300180345 +0000 UTC m=+835.931629103" lastFinishedPulling="2025-11-27 16:17:54.278262565 +0000 UTC m=+849.909711373" observedRunningTime="2025-11-27 16:18:09.733693801 +0000 UTC m=+865.365142579" watchObservedRunningTime="2025-11-27 16:18:09.740516639 +0000 UTC m=+865.371965417"
Nov 27 16:18:09 crc kubenswrapper[4707]: I1127 16:18:09.960543 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/mariadb-operator-controller-manager-66f4dd4bc7-dfw47"
Nov 27 16:18:09 crc kubenswrapper[4707]: I1127 16:18:09.965401 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/placement-operator-controller-manager-57988cc5b5-v48r5"
Nov 27 16:18:10 crc kubenswrapper[4707]: I1127 16:18:10.046661 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/telemetry-operator-controller-manager-8665cb7d49-w7xqm"
Nov 27 16:18:10 crc kubenswrapper[4707]: I1127 16:18:10.061032 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/test-operator-controller-manager-5cd6c7f4c8-qwb4t"
Nov 27 16:18:10 crc kubenswrapper[4707]: I1127 16:18:10.136621 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/watcher-operator-controller-manager-656dcb59d4-hgbtk"
Nov 27 16:18:11 crc kubenswrapper[4707]: I1127 16:18:11.141671 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/88f24787-fedc-4d08-9a8e-16a24f242d02-cert\") pod \"infra-operator-controller-manager-57548d458d-kwlbb\" (UID: \"88f24787-fedc-4d08-9a8e-16a24f242d02\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-kwlbb"
Nov 27 16:18:11 crc kubenswrapper[4707]: I1127 16:18:11.151172 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/88f24787-fedc-4d08-9a8e-16a24f242d02-cert\") pod \"infra-operator-controller-manager-57548d458d-kwlbb\" (UID: \"88f24787-fedc-4d08-9a8e-16a24f242d02\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-kwlbb"
Nov 27 16:18:11 crc kubenswrapper[4707]: I1127 16:18:11.293205 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-vxdsk"
Nov 27 16:18:11 crc kubenswrapper[4707]: I1127 16:18:11.301722 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-57548d458d-kwlbb" Nov 27 16:18:11 crc kubenswrapper[4707]: I1127 16:18:11.346738 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7761d2b0-8cc7-4dc8-a956-df20e2efc081-cert\") pod \"openstack-baremetal-operator-controller-manager-5fcdb54b6brxbww\" (UID: \"7761d2b0-8cc7-4dc8-a956-df20e2efc081\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-5fcdb54b6brxbww" Nov 27 16:18:11 crc kubenswrapper[4707]: I1127 16:18:11.356217 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7761d2b0-8cc7-4dc8-a956-df20e2efc081-cert\") pod \"openstack-baremetal-operator-controller-manager-5fcdb54b6brxbww\" (UID: \"7761d2b0-8cc7-4dc8-a956-df20e2efc081\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-5fcdb54b6brxbww" Nov 27 16:18:11 crc kubenswrapper[4707]: I1127 16:18:11.423785 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-fcf6c" Nov 27 16:18:11 crc kubenswrapper[4707]: I1127 16:18:11.432551 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-5fcdb54b6brxbww" Nov 27 16:18:11 crc kubenswrapper[4707]: I1127 16:18:11.602947 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-57548d458d-kwlbb"] Nov 27 16:18:11 crc kubenswrapper[4707]: W1127 16:18:11.609131 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod88f24787_fedc_4d08_9a8e_16a24f242d02.slice/crio-1579c64099848811c601536983c01d67935a746925eba229deb80dfc9e4e6ca8 WatchSource:0}: Error finding container 1579c64099848811c601536983c01d67935a746925eba229deb80dfc9e4e6ca8: Status 404 returned error can't find the container with id 1579c64099848811c601536983c01d67935a746925eba229deb80dfc9e4e6ca8 Nov 27 16:18:11 crc kubenswrapper[4707]: I1127 16:18:11.730061 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-57548d458d-kwlbb" event={"ID":"88f24787-fedc-4d08-9a8e-16a24f242d02","Type":"ContainerStarted","Data":"1579c64099848811c601536983c01d67935a746925eba229deb80dfc9e4e6ca8"} Nov 27 16:18:11 crc kubenswrapper[4707]: I1127 16:18:11.931367 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-5fcdb54b6brxbww"] Nov 27 16:18:11 crc kubenswrapper[4707]: W1127 16:18:11.941317 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7761d2b0_8cc7_4dc8_a956_df20e2efc081.slice/crio-0c88e8db1ea6ff1341586f48910aca010e931b0a73d589ea70935372ffa191a6 WatchSource:0}: Error finding container 0c88e8db1ea6ff1341586f48910aca010e931b0a73d589ea70935372ffa191a6: Status 404 returned error can't find the container with id 0c88e8db1ea6ff1341586f48910aca010e931b0a73d589ea70935372ffa191a6 Nov 27 16:18:12 crc kubenswrapper[4707]: I1127 
16:18:12.742956 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-5fcdb54b6brxbww" event={"ID":"7761d2b0-8cc7-4dc8-a956-df20e2efc081","Type":"ContainerStarted","Data":"0c88e8db1ea6ff1341586f48910aca010e931b0a73d589ea70935372ffa191a6"} Nov 27 16:18:16 crc kubenswrapper[4707]: I1127 16:18:16.103027 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-manager-54fcbb7454-dqgm9" Nov 27 16:18:19 crc kubenswrapper[4707]: I1127 16:18:19.487886 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/heat-operator-controller-manager-5b77f656f-8qhwz" Nov 27 16:18:19 crc kubenswrapper[4707]: I1127 16:18:19.859354 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ovn-operator-controller-manager-56897c768d-nbgp6" Nov 27 16:18:22 crc kubenswrapper[4707]: I1127 16:18:22.828504 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-57548d458d-kwlbb" event={"ID":"88f24787-fedc-4d08-9a8e-16a24f242d02","Type":"ContainerStarted","Data":"ad8414c84b5865e0f7d8ed17c01f017ef113f7bf88f7c06069a598c242c96863"} Nov 27 16:18:22 crc kubenswrapper[4707]: I1127 16:18:22.832224 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-5fcdb54b6brxbww" event={"ID":"7761d2b0-8cc7-4dc8-a956-df20e2efc081","Type":"ContainerStarted","Data":"208e099ef92189902f937a92b96feb0c9212fd9b38072f2d78d02931239c2f3d"} Nov 27 16:18:23 crc kubenswrapper[4707]: I1127 16:18:23.844209 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-57548d458d-kwlbb" 
event={"ID":"88f24787-fedc-4d08-9a8e-16a24f242d02","Type":"ContainerStarted","Data":"4a8d1f117fda26a037e7fb53fbac479426cc61685c83b3d02f24331d99a56418"} Nov 27 16:18:23 crc kubenswrapper[4707]: I1127 16:18:23.844328 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/infra-operator-controller-manager-57548d458d-kwlbb" Nov 27 16:18:23 crc kubenswrapper[4707]: I1127 16:18:23.847437 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-5fcdb54b6brxbww" event={"ID":"7761d2b0-8cc7-4dc8-a956-df20e2efc081","Type":"ContainerStarted","Data":"5ebb8c815d664308cd8735a8a192be7ad01c6e592e8571040e9348943e9edc69"} Nov 27 16:18:23 crc kubenswrapper[4707]: I1127 16:18:23.847630 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-baremetal-operator-controller-manager-5fcdb54b6brxbww" Nov 27 16:18:23 crc kubenswrapper[4707]: I1127 16:18:23.869096 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/infra-operator-controller-manager-57548d458d-kwlbb" podStartSLOduration=34.017869492 podStartE2EDuration="44.869079296s" podCreationTimestamp="2025-11-27 16:17:39 +0000 UTC" firstStartedPulling="2025-11-27 16:18:11.612474609 +0000 UTC m=+867.243923367" lastFinishedPulling="2025-11-27 16:18:22.463684393 +0000 UTC m=+878.095133171" observedRunningTime="2025-11-27 16:18:23.868038001 +0000 UTC m=+879.499486779" watchObservedRunningTime="2025-11-27 16:18:23.869079296 +0000 UTC m=+879.500528074" Nov 27 16:18:23 crc kubenswrapper[4707]: I1127 16:18:23.921253 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-baremetal-operator-controller-manager-5fcdb54b6brxbww" podStartSLOduration=34.42683657 podStartE2EDuration="44.921226624s" podCreationTimestamp="2025-11-27 16:17:39 +0000 UTC" firstStartedPulling="2025-11-27 16:18:11.946409098 
+0000 UTC m=+867.577857916" lastFinishedPulling="2025-11-27 16:18:22.440799202 +0000 UTC m=+878.072247970" observedRunningTime="2025-11-27 16:18:23.907801485 +0000 UTC m=+879.539250283" watchObservedRunningTime="2025-11-27 16:18:23.921226624 +0000 UTC m=+879.552675432" Nov 27 16:18:31 crc kubenswrapper[4707]: I1127 16:18:31.311355 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/infra-operator-controller-manager-57548d458d-kwlbb" Nov 27 16:18:31 crc kubenswrapper[4707]: I1127 16:18:31.440726 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-baremetal-operator-controller-manager-5fcdb54b6brxbww" Nov 27 16:18:46 crc kubenswrapper[4707]: I1127 16:18:46.951927 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-5nb8x"] Nov 27 16:18:46 crc kubenswrapper[4707]: I1127 16:18:46.959288 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-5nb8x" Nov 27 16:18:46 crc kubenswrapper[4707]: I1127 16:18:46.962772 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns" Nov 27 16:18:46 crc kubenswrapper[4707]: I1127 16:18:46.962860 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dnsmasq-dns-dockercfg-xgps4" Nov 27 16:18:46 crc kubenswrapper[4707]: I1127 16:18:46.962865 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openshift-service-ca.crt" Nov 27 16:18:46 crc kubenswrapper[4707]: I1127 16:18:46.963082 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"kube-root-ca.crt" Nov 27 16:18:46 crc kubenswrapper[4707]: I1127 16:18:46.966670 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-5nb8x"] Nov 27 16:18:47 crc kubenswrapper[4707]: I1127 16:18:47.006672 4707 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/dnsmasq-dns-78dd6ddcc-g4jvb"] Nov 27 16:18:47 crc kubenswrapper[4707]: I1127 16:18:47.008053 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-g4jvb" Nov 27 16:18:47 crc kubenswrapper[4707]: I1127 16:18:47.013932 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-svc" Nov 27 16:18:47 crc kubenswrapper[4707]: I1127 16:18:47.040541 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-g4jvb"] Nov 27 16:18:47 crc kubenswrapper[4707]: I1127 16:18:47.082202 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hxrtl\" (UniqueName: \"kubernetes.io/projected/0cb7bf93-29b6-402c-86c7-24d3a6b1b135-kube-api-access-hxrtl\") pod \"dnsmasq-dns-675f4bcbfc-5nb8x\" (UID: \"0cb7bf93-29b6-402c-86c7-24d3a6b1b135\") " pod="openstack/dnsmasq-dns-675f4bcbfc-5nb8x" Nov 27 16:18:47 crc kubenswrapper[4707]: I1127 16:18:47.082252 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0cb7bf93-29b6-402c-86c7-24d3a6b1b135-config\") pod \"dnsmasq-dns-675f4bcbfc-5nb8x\" (UID: \"0cb7bf93-29b6-402c-86c7-24d3a6b1b135\") " pod="openstack/dnsmasq-dns-675f4bcbfc-5nb8x" Nov 27 16:18:47 crc kubenswrapper[4707]: I1127 16:18:47.184176 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hxrtl\" (UniqueName: \"kubernetes.io/projected/0cb7bf93-29b6-402c-86c7-24d3a6b1b135-kube-api-access-hxrtl\") pod \"dnsmasq-dns-675f4bcbfc-5nb8x\" (UID: \"0cb7bf93-29b6-402c-86c7-24d3a6b1b135\") " pod="openstack/dnsmasq-dns-675f4bcbfc-5nb8x" Nov 27 16:18:47 crc kubenswrapper[4707]: I1127 16:18:47.184282 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/cb1028a6-b188-4015-9504-94a1d1ef7f6d-config\") pod \"dnsmasq-dns-78dd6ddcc-g4jvb\" (UID: \"cb1028a6-b188-4015-9504-94a1d1ef7f6d\") " pod="openstack/dnsmasq-dns-78dd6ddcc-g4jvb" Nov 27 16:18:47 crc kubenswrapper[4707]: I1127 16:18:47.184333 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0cb7bf93-29b6-402c-86c7-24d3a6b1b135-config\") pod \"dnsmasq-dns-675f4bcbfc-5nb8x\" (UID: \"0cb7bf93-29b6-402c-86c7-24d3a6b1b135\") " pod="openstack/dnsmasq-dns-675f4bcbfc-5nb8x" Nov 27 16:18:47 crc kubenswrapper[4707]: I1127 16:18:47.184481 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cb1028a6-b188-4015-9504-94a1d1ef7f6d-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-g4jvb\" (UID: \"cb1028a6-b188-4015-9504-94a1d1ef7f6d\") " pod="openstack/dnsmasq-dns-78dd6ddcc-g4jvb" Nov 27 16:18:47 crc kubenswrapper[4707]: I1127 16:18:47.184660 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mvcgn\" (UniqueName: \"kubernetes.io/projected/cb1028a6-b188-4015-9504-94a1d1ef7f6d-kube-api-access-mvcgn\") pod \"dnsmasq-dns-78dd6ddcc-g4jvb\" (UID: \"cb1028a6-b188-4015-9504-94a1d1ef7f6d\") " pod="openstack/dnsmasq-dns-78dd6ddcc-g4jvb" Nov 27 16:18:47 crc kubenswrapper[4707]: I1127 16:18:47.185103 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0cb7bf93-29b6-402c-86c7-24d3a6b1b135-config\") pod \"dnsmasq-dns-675f4bcbfc-5nb8x\" (UID: \"0cb7bf93-29b6-402c-86c7-24d3a6b1b135\") " pod="openstack/dnsmasq-dns-675f4bcbfc-5nb8x" Nov 27 16:18:47 crc kubenswrapper[4707]: I1127 16:18:47.204760 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hxrtl\" (UniqueName: 
\"kubernetes.io/projected/0cb7bf93-29b6-402c-86c7-24d3a6b1b135-kube-api-access-hxrtl\") pod \"dnsmasq-dns-675f4bcbfc-5nb8x\" (UID: \"0cb7bf93-29b6-402c-86c7-24d3a6b1b135\") " pod="openstack/dnsmasq-dns-675f4bcbfc-5nb8x" Nov 27 16:18:47 crc kubenswrapper[4707]: I1127 16:18:47.286421 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cb1028a6-b188-4015-9504-94a1d1ef7f6d-config\") pod \"dnsmasq-dns-78dd6ddcc-g4jvb\" (UID: \"cb1028a6-b188-4015-9504-94a1d1ef7f6d\") " pod="openstack/dnsmasq-dns-78dd6ddcc-g4jvb" Nov 27 16:18:47 crc kubenswrapper[4707]: I1127 16:18:47.286856 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cb1028a6-b188-4015-9504-94a1d1ef7f6d-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-g4jvb\" (UID: \"cb1028a6-b188-4015-9504-94a1d1ef7f6d\") " pod="openstack/dnsmasq-dns-78dd6ddcc-g4jvb" Nov 27 16:18:47 crc kubenswrapper[4707]: I1127 16:18:47.287071 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mvcgn\" (UniqueName: \"kubernetes.io/projected/cb1028a6-b188-4015-9504-94a1d1ef7f6d-kube-api-access-mvcgn\") pod \"dnsmasq-dns-78dd6ddcc-g4jvb\" (UID: \"cb1028a6-b188-4015-9504-94a1d1ef7f6d\") " pod="openstack/dnsmasq-dns-78dd6ddcc-g4jvb" Nov 27 16:18:47 crc kubenswrapper[4707]: I1127 16:18:47.288241 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cb1028a6-b188-4015-9504-94a1d1ef7f6d-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-g4jvb\" (UID: \"cb1028a6-b188-4015-9504-94a1d1ef7f6d\") " pod="openstack/dnsmasq-dns-78dd6ddcc-g4jvb" Nov 27 16:18:47 crc kubenswrapper[4707]: I1127 16:18:47.288337 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cb1028a6-b188-4015-9504-94a1d1ef7f6d-config\") pod \"dnsmasq-dns-78dd6ddcc-g4jvb\" 
(UID: \"cb1028a6-b188-4015-9504-94a1d1ef7f6d\") " pod="openstack/dnsmasq-dns-78dd6ddcc-g4jvb" Nov 27 16:18:47 crc kubenswrapper[4707]: I1127 16:18:47.299104 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-5nb8x" Nov 27 16:18:47 crc kubenswrapper[4707]: I1127 16:18:47.302960 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mvcgn\" (UniqueName: \"kubernetes.io/projected/cb1028a6-b188-4015-9504-94a1d1ef7f6d-kube-api-access-mvcgn\") pod \"dnsmasq-dns-78dd6ddcc-g4jvb\" (UID: \"cb1028a6-b188-4015-9504-94a1d1ef7f6d\") " pod="openstack/dnsmasq-dns-78dd6ddcc-g4jvb" Nov 27 16:18:47 crc kubenswrapper[4707]: I1127 16:18:47.325241 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-g4jvb" Nov 27 16:18:47 crc kubenswrapper[4707]: I1127 16:18:47.566570 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-5nb8x"] Nov 27 16:18:47 crc kubenswrapper[4707]: W1127 16:18:47.849041 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcb1028a6_b188_4015_9504_94a1d1ef7f6d.slice/crio-e8b257d50607bbfb491e7f03a0ea5cb542348eeb3712e5c6a069edc3dcf76f70 WatchSource:0}: Error finding container e8b257d50607bbfb491e7f03a0ea5cb542348eeb3712e5c6a069edc3dcf76f70: Status 404 returned error can't find the container with id e8b257d50607bbfb491e7f03a0ea5cb542348eeb3712e5c6a069edc3dcf76f70 Nov 27 16:18:47 crc kubenswrapper[4707]: I1127 16:18:47.849416 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-g4jvb"] Nov 27 16:18:48 crc kubenswrapper[4707]: I1127 16:18:48.068804 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-5nb8x" 
event={"ID":"0cb7bf93-29b6-402c-86c7-24d3a6b1b135","Type":"ContainerStarted","Data":"0dfacdec9c6ed930106e6073d237e486bfb76999d87ad2fbb23bb02cd7798467"} Nov 27 16:18:48 crc kubenswrapper[4707]: I1127 16:18:48.071076 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-g4jvb" event={"ID":"cb1028a6-b188-4015-9504-94a1d1ef7f6d","Type":"ContainerStarted","Data":"e8b257d50607bbfb491e7f03a0ea5cb542348eeb3712e5c6a069edc3dcf76f70"} Nov 27 16:18:50 crc kubenswrapper[4707]: I1127 16:18:50.099743 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-5nb8x"] Nov 27 16:18:50 crc kubenswrapper[4707]: I1127 16:18:50.110516 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-wzqgl"] Nov 27 16:18:50 crc kubenswrapper[4707]: I1127 16:18:50.115926 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-wzqgl" Nov 27 16:18:50 crc kubenswrapper[4707]: I1127 16:18:50.119612 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-wzqgl"] Nov 27 16:18:50 crc kubenswrapper[4707]: I1127 16:18:50.244795 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a2e43d9c-f196-474a-ae4b-ff985c9ab550-dns-svc\") pod \"dnsmasq-dns-666b6646f7-wzqgl\" (UID: \"a2e43d9c-f196-474a-ae4b-ff985c9ab550\") " pod="openstack/dnsmasq-dns-666b6646f7-wzqgl" Nov 27 16:18:50 crc kubenswrapper[4707]: I1127 16:18:50.244875 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bbs75\" (UniqueName: \"kubernetes.io/projected/a2e43d9c-f196-474a-ae4b-ff985c9ab550-kube-api-access-bbs75\") pod \"dnsmasq-dns-666b6646f7-wzqgl\" (UID: \"a2e43d9c-f196-474a-ae4b-ff985c9ab550\") " pod="openstack/dnsmasq-dns-666b6646f7-wzqgl" Nov 27 16:18:50 crc kubenswrapper[4707]: 
I1127 16:18:50.244938 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a2e43d9c-f196-474a-ae4b-ff985c9ab550-config\") pod \"dnsmasq-dns-666b6646f7-wzqgl\" (UID: \"a2e43d9c-f196-474a-ae4b-ff985c9ab550\") " pod="openstack/dnsmasq-dns-666b6646f7-wzqgl" Nov 27 16:18:50 crc kubenswrapper[4707]: I1127 16:18:50.345791 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a2e43d9c-f196-474a-ae4b-ff985c9ab550-dns-svc\") pod \"dnsmasq-dns-666b6646f7-wzqgl\" (UID: \"a2e43d9c-f196-474a-ae4b-ff985c9ab550\") " pod="openstack/dnsmasq-dns-666b6646f7-wzqgl" Nov 27 16:18:50 crc kubenswrapper[4707]: I1127 16:18:50.345853 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bbs75\" (UniqueName: \"kubernetes.io/projected/a2e43d9c-f196-474a-ae4b-ff985c9ab550-kube-api-access-bbs75\") pod \"dnsmasq-dns-666b6646f7-wzqgl\" (UID: \"a2e43d9c-f196-474a-ae4b-ff985c9ab550\") " pod="openstack/dnsmasq-dns-666b6646f7-wzqgl" Nov 27 16:18:50 crc kubenswrapper[4707]: I1127 16:18:50.345888 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a2e43d9c-f196-474a-ae4b-ff985c9ab550-config\") pod \"dnsmasq-dns-666b6646f7-wzqgl\" (UID: \"a2e43d9c-f196-474a-ae4b-ff985c9ab550\") " pod="openstack/dnsmasq-dns-666b6646f7-wzqgl" Nov 27 16:18:50 crc kubenswrapper[4707]: I1127 16:18:50.352826 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a2e43d9c-f196-474a-ae4b-ff985c9ab550-config\") pod \"dnsmasq-dns-666b6646f7-wzqgl\" (UID: \"a2e43d9c-f196-474a-ae4b-ff985c9ab550\") " pod="openstack/dnsmasq-dns-666b6646f7-wzqgl" Nov 27 16:18:50 crc kubenswrapper[4707]: I1127 16:18:50.355906 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a2e43d9c-f196-474a-ae4b-ff985c9ab550-dns-svc\") pod \"dnsmasq-dns-666b6646f7-wzqgl\" (UID: \"a2e43d9c-f196-474a-ae4b-ff985c9ab550\") " pod="openstack/dnsmasq-dns-666b6646f7-wzqgl" Nov 27 16:18:50 crc kubenswrapper[4707]: I1127 16:18:50.361755 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-g4jvb"] Nov 27 16:18:50 crc kubenswrapper[4707]: I1127 16:18:50.401681 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-hdvpv"] Nov 27 16:18:50 crc kubenswrapper[4707]: I1127 16:18:50.404095 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-hdvpv" Nov 27 16:18:50 crc kubenswrapper[4707]: I1127 16:18:50.405615 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bbs75\" (UniqueName: \"kubernetes.io/projected/a2e43d9c-f196-474a-ae4b-ff985c9ab550-kube-api-access-bbs75\") pod \"dnsmasq-dns-666b6646f7-wzqgl\" (UID: \"a2e43d9c-f196-474a-ae4b-ff985c9ab550\") " pod="openstack/dnsmasq-dns-666b6646f7-wzqgl" Nov 27 16:18:50 crc kubenswrapper[4707]: I1127 16:18:50.411223 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-hdvpv"] Nov 27 16:18:50 crc kubenswrapper[4707]: I1127 16:18:50.441669 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-wzqgl" Nov 27 16:18:50 crc kubenswrapper[4707]: I1127 16:18:50.549658 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/98f69e7a-a5b6-4358-9699-5a8f83ff8a5f-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-hdvpv\" (UID: \"98f69e7a-a5b6-4358-9699-5a8f83ff8a5f\") " pod="openstack/dnsmasq-dns-57d769cc4f-hdvpv" Nov 27 16:18:50 crc kubenswrapper[4707]: I1127 16:18:50.549707 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zzmnh\" (UniqueName: \"kubernetes.io/projected/98f69e7a-a5b6-4358-9699-5a8f83ff8a5f-kube-api-access-zzmnh\") pod \"dnsmasq-dns-57d769cc4f-hdvpv\" (UID: \"98f69e7a-a5b6-4358-9699-5a8f83ff8a5f\") " pod="openstack/dnsmasq-dns-57d769cc4f-hdvpv" Nov 27 16:18:50 crc kubenswrapper[4707]: I1127 16:18:50.549844 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/98f69e7a-a5b6-4358-9699-5a8f83ff8a5f-config\") pod \"dnsmasq-dns-57d769cc4f-hdvpv\" (UID: \"98f69e7a-a5b6-4358-9699-5a8f83ff8a5f\") " pod="openstack/dnsmasq-dns-57d769cc4f-hdvpv" Nov 27 16:18:50 crc kubenswrapper[4707]: I1127 16:18:50.651323 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/98f69e7a-a5b6-4358-9699-5a8f83ff8a5f-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-hdvpv\" (UID: \"98f69e7a-a5b6-4358-9699-5a8f83ff8a5f\") " pod="openstack/dnsmasq-dns-57d769cc4f-hdvpv" Nov 27 16:18:50 crc kubenswrapper[4707]: I1127 16:18:50.651389 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zzmnh\" (UniqueName: \"kubernetes.io/projected/98f69e7a-a5b6-4358-9699-5a8f83ff8a5f-kube-api-access-zzmnh\") pod \"dnsmasq-dns-57d769cc4f-hdvpv\" (UID: 
\"98f69e7a-a5b6-4358-9699-5a8f83ff8a5f\") " pod="openstack/dnsmasq-dns-57d769cc4f-hdvpv" Nov 27 16:18:50 crc kubenswrapper[4707]: I1127 16:18:50.651444 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/98f69e7a-a5b6-4358-9699-5a8f83ff8a5f-config\") pod \"dnsmasq-dns-57d769cc4f-hdvpv\" (UID: \"98f69e7a-a5b6-4358-9699-5a8f83ff8a5f\") " pod="openstack/dnsmasq-dns-57d769cc4f-hdvpv" Nov 27 16:18:50 crc kubenswrapper[4707]: I1127 16:18:50.652852 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/98f69e7a-a5b6-4358-9699-5a8f83ff8a5f-config\") pod \"dnsmasq-dns-57d769cc4f-hdvpv\" (UID: \"98f69e7a-a5b6-4358-9699-5a8f83ff8a5f\") " pod="openstack/dnsmasq-dns-57d769cc4f-hdvpv" Nov 27 16:18:50 crc kubenswrapper[4707]: I1127 16:18:50.652880 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/98f69e7a-a5b6-4358-9699-5a8f83ff8a5f-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-hdvpv\" (UID: \"98f69e7a-a5b6-4358-9699-5a8f83ff8a5f\") " pod="openstack/dnsmasq-dns-57d769cc4f-hdvpv" Nov 27 16:18:50 crc kubenswrapper[4707]: I1127 16:18:50.697096 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zzmnh\" (UniqueName: \"kubernetes.io/projected/98f69e7a-a5b6-4358-9699-5a8f83ff8a5f-kube-api-access-zzmnh\") pod \"dnsmasq-dns-57d769cc4f-hdvpv\" (UID: \"98f69e7a-a5b6-4358-9699-5a8f83ff8a5f\") " pod="openstack/dnsmasq-dns-57d769cc4f-hdvpv" Nov 27 16:18:50 crc kubenswrapper[4707]: I1127 16:18:50.727589 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-hdvpv" Nov 27 16:18:50 crc kubenswrapper[4707]: I1127 16:18:50.905893 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-wzqgl"] Nov 27 16:18:50 crc kubenswrapper[4707]: W1127 16:18:50.917651 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda2e43d9c_f196_474a_ae4b_ff985c9ab550.slice/crio-508e86d453a5a96ebfba51d7f7b2aadeeb94f10aa7a775ab7002c5ef41ad53e5 WatchSource:0}: Error finding container 508e86d453a5a96ebfba51d7f7b2aadeeb94f10aa7a775ab7002c5ef41ad53e5: Status 404 returned error can't find the container with id 508e86d453a5a96ebfba51d7f7b2aadeeb94f10aa7a775ab7002c5ef41ad53e5 Nov 27 16:18:50 crc kubenswrapper[4707]: I1127 16:18:50.920150 4707 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Nov 27 16:18:51 crc kubenswrapper[4707]: I1127 16:18:51.112843 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-wzqgl" event={"ID":"a2e43d9c-f196-474a-ae4b-ff985c9ab550","Type":"ContainerStarted","Data":"508e86d453a5a96ebfba51d7f7b2aadeeb94f10aa7a775ab7002c5ef41ad53e5"} Nov 27 16:18:51 crc kubenswrapper[4707]: I1127 16:18:51.191258 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-hdvpv"] Nov 27 16:18:51 crc kubenswrapper[4707]: W1127 16:18:51.196790 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod98f69e7a_a5b6_4358_9699_5a8f83ff8a5f.slice/crio-92e0ca44752d223d3b01a36323dc4c4cdf5836fd6b0e354c503a2674fb7c3af9 WatchSource:0}: Error finding container 92e0ca44752d223d3b01a36323dc4c4cdf5836fd6b0e354c503a2674fb7c3af9: Status 404 returned error can't find the container with id 92e0ca44752d223d3b01a36323dc4c4cdf5836fd6b0e354c503a2674fb7c3af9 Nov 27 16:18:51 crc kubenswrapper[4707]: 
I1127 16:18:51.239101 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Nov 27 16:18:51 crc kubenswrapper[4707]: I1127 16:18:51.246433 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Nov 27 16:18:51 crc kubenswrapper[4707]: I1127 16:18:51.249442 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Nov 27 16:18:51 crc kubenswrapper[4707]: I1127 16:18:51.249695 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Nov 27 16:18:51 crc kubenswrapper[4707]: I1127 16:18:51.249816 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Nov 27 16:18:51 crc kubenswrapper[4707]: I1127 16:18:51.250020 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Nov 27 16:18:51 crc kubenswrapper[4707]: I1127 16:18:51.250128 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-wwft8" Nov 27 16:18:51 crc kubenswrapper[4707]: I1127 16:18:51.250707 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Nov 27 16:18:51 crc kubenswrapper[4707]: I1127 16:18:51.258119 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Nov 27 16:18:51 crc kubenswrapper[4707]: I1127 16:18:51.273181 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Nov 27 16:18:51 crc kubenswrapper[4707]: I1127 16:18:51.375031 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/b31a7b86-c43f-4123-a33d-ffba2ee3d015-server-conf\") pod \"rabbitmq-server-0\" (UID: \"b31a7b86-c43f-4123-a33d-ffba2ee3d015\") " pod="openstack/rabbitmq-server-0" Nov 27 
16:18:51 crc kubenswrapper[4707]: I1127 16:18:51.375084 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/b31a7b86-c43f-4123-a33d-ffba2ee3d015-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"b31a7b86-c43f-4123-a33d-ffba2ee3d015\") " pod="openstack/rabbitmq-server-0" Nov 27 16:18:51 crc kubenswrapper[4707]: I1127 16:18:51.375112 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"rabbitmq-server-0\" (UID: \"b31a7b86-c43f-4123-a33d-ffba2ee3d015\") " pod="openstack/rabbitmq-server-0" Nov 27 16:18:51 crc kubenswrapper[4707]: I1127 16:18:51.375137 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/b31a7b86-c43f-4123-a33d-ffba2ee3d015-pod-info\") pod \"rabbitmq-server-0\" (UID: \"b31a7b86-c43f-4123-a33d-ffba2ee3d015\") " pod="openstack/rabbitmq-server-0" Nov 27 16:18:51 crc kubenswrapper[4707]: I1127 16:18:51.375163 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/b31a7b86-c43f-4123-a33d-ffba2ee3d015-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"b31a7b86-c43f-4123-a33d-ffba2ee3d015\") " pod="openstack/rabbitmq-server-0" Nov 27 16:18:51 crc kubenswrapper[4707]: I1127 16:18:51.375192 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/b31a7b86-c43f-4123-a33d-ffba2ee3d015-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"b31a7b86-c43f-4123-a33d-ffba2ee3d015\") " pod="openstack/rabbitmq-server-0" Nov 27 16:18:51 crc kubenswrapper[4707]: I1127 16:18:51.375219 4707 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/b31a7b86-c43f-4123-a33d-ffba2ee3d015-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"b31a7b86-c43f-4123-a33d-ffba2ee3d015\") " pod="openstack/rabbitmq-server-0" Nov 27 16:18:51 crc kubenswrapper[4707]: I1127 16:18:51.375246 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/b31a7b86-c43f-4123-a33d-ffba2ee3d015-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"b31a7b86-c43f-4123-a33d-ffba2ee3d015\") " pod="openstack/rabbitmq-server-0" Nov 27 16:18:51 crc kubenswrapper[4707]: I1127 16:18:51.375269 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q74dh\" (UniqueName: \"kubernetes.io/projected/b31a7b86-c43f-4123-a33d-ffba2ee3d015-kube-api-access-q74dh\") pod \"rabbitmq-server-0\" (UID: \"b31a7b86-c43f-4123-a33d-ffba2ee3d015\") " pod="openstack/rabbitmq-server-0" Nov 27 16:18:51 crc kubenswrapper[4707]: I1127 16:18:51.375287 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/b31a7b86-c43f-4123-a33d-ffba2ee3d015-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"b31a7b86-c43f-4123-a33d-ffba2ee3d015\") " pod="openstack/rabbitmq-server-0" Nov 27 16:18:51 crc kubenswrapper[4707]: I1127 16:18:51.375311 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b31a7b86-c43f-4123-a33d-ffba2ee3d015-config-data\") pod \"rabbitmq-server-0\" (UID: \"b31a7b86-c43f-4123-a33d-ffba2ee3d015\") " pod="openstack/rabbitmq-server-0" Nov 27 16:18:51 crc kubenswrapper[4707]: I1127 16:18:51.477233 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"server-conf\" (UniqueName: \"kubernetes.io/configmap/b31a7b86-c43f-4123-a33d-ffba2ee3d015-server-conf\") pod \"rabbitmq-server-0\" (UID: \"b31a7b86-c43f-4123-a33d-ffba2ee3d015\") " pod="openstack/rabbitmq-server-0" Nov 27 16:18:51 crc kubenswrapper[4707]: I1127 16:18:51.477276 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/b31a7b86-c43f-4123-a33d-ffba2ee3d015-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"b31a7b86-c43f-4123-a33d-ffba2ee3d015\") " pod="openstack/rabbitmq-server-0" Nov 27 16:18:51 crc kubenswrapper[4707]: I1127 16:18:51.477299 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"rabbitmq-server-0\" (UID: \"b31a7b86-c43f-4123-a33d-ffba2ee3d015\") " pod="openstack/rabbitmq-server-0" Nov 27 16:18:51 crc kubenswrapper[4707]: I1127 16:18:51.477319 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/b31a7b86-c43f-4123-a33d-ffba2ee3d015-pod-info\") pod \"rabbitmq-server-0\" (UID: \"b31a7b86-c43f-4123-a33d-ffba2ee3d015\") " pod="openstack/rabbitmq-server-0" Nov 27 16:18:51 crc kubenswrapper[4707]: I1127 16:18:51.477344 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/b31a7b86-c43f-4123-a33d-ffba2ee3d015-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"b31a7b86-c43f-4123-a33d-ffba2ee3d015\") " pod="openstack/rabbitmq-server-0" Nov 27 16:18:51 crc kubenswrapper[4707]: I1127 16:18:51.477388 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/b31a7b86-c43f-4123-a33d-ffba2ee3d015-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"b31a7b86-c43f-4123-a33d-ffba2ee3d015\") " 
pod="openstack/rabbitmq-server-0" Nov 27 16:18:51 crc kubenswrapper[4707]: I1127 16:18:51.477411 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/b31a7b86-c43f-4123-a33d-ffba2ee3d015-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"b31a7b86-c43f-4123-a33d-ffba2ee3d015\") " pod="openstack/rabbitmq-server-0" Nov 27 16:18:51 crc kubenswrapper[4707]: I1127 16:18:51.477435 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/b31a7b86-c43f-4123-a33d-ffba2ee3d015-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"b31a7b86-c43f-4123-a33d-ffba2ee3d015\") " pod="openstack/rabbitmq-server-0" Nov 27 16:18:51 crc kubenswrapper[4707]: I1127 16:18:51.477455 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/b31a7b86-c43f-4123-a33d-ffba2ee3d015-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"b31a7b86-c43f-4123-a33d-ffba2ee3d015\") " pod="openstack/rabbitmq-server-0" Nov 27 16:18:51 crc kubenswrapper[4707]: I1127 16:18:51.477471 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q74dh\" (UniqueName: \"kubernetes.io/projected/b31a7b86-c43f-4123-a33d-ffba2ee3d015-kube-api-access-q74dh\") pod \"rabbitmq-server-0\" (UID: \"b31a7b86-c43f-4123-a33d-ffba2ee3d015\") " pod="openstack/rabbitmq-server-0" Nov 27 16:18:51 crc kubenswrapper[4707]: I1127 16:18:51.477495 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b31a7b86-c43f-4123-a33d-ffba2ee3d015-config-data\") pod \"rabbitmq-server-0\" (UID: \"b31a7b86-c43f-4123-a33d-ffba2ee3d015\") " pod="openstack/rabbitmq-server-0" Nov 27 16:18:51 crc kubenswrapper[4707]: I1127 16:18:51.478404 4707 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b31a7b86-c43f-4123-a33d-ffba2ee3d015-config-data\") pod \"rabbitmq-server-0\" (UID: \"b31a7b86-c43f-4123-a33d-ffba2ee3d015\") " pod="openstack/rabbitmq-server-0" Nov 27 16:18:51 crc kubenswrapper[4707]: I1127 16:18:51.482729 4707 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"rabbitmq-server-0\" (UID: \"b31a7b86-c43f-4123-a33d-ffba2ee3d015\") device mount path \"/mnt/openstack/pv05\"" pod="openstack/rabbitmq-server-0" Nov 27 16:18:51 crc kubenswrapper[4707]: I1127 16:18:51.483247 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/b31a7b86-c43f-4123-a33d-ffba2ee3d015-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"b31a7b86-c43f-4123-a33d-ffba2ee3d015\") " pod="openstack/rabbitmq-server-0" Nov 27 16:18:51 crc kubenswrapper[4707]: I1127 16:18:51.483500 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/b31a7b86-c43f-4123-a33d-ffba2ee3d015-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"b31a7b86-c43f-4123-a33d-ffba2ee3d015\") " pod="openstack/rabbitmq-server-0" Nov 27 16:18:51 crc kubenswrapper[4707]: I1127 16:18:51.484033 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/b31a7b86-c43f-4123-a33d-ffba2ee3d015-server-conf\") pod \"rabbitmq-server-0\" (UID: \"b31a7b86-c43f-4123-a33d-ffba2ee3d015\") " pod="openstack/rabbitmq-server-0" Nov 27 16:18:51 crc kubenswrapper[4707]: I1127 16:18:51.485304 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/b31a7b86-c43f-4123-a33d-ffba2ee3d015-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: 
\"b31a7b86-c43f-4123-a33d-ffba2ee3d015\") " pod="openstack/rabbitmq-server-0" Nov 27 16:18:51 crc kubenswrapper[4707]: I1127 16:18:51.485857 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Nov 27 16:18:51 crc kubenswrapper[4707]: I1127 16:18:51.486025 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/b31a7b86-c43f-4123-a33d-ffba2ee3d015-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"b31a7b86-c43f-4123-a33d-ffba2ee3d015\") " pod="openstack/rabbitmq-server-0" Nov 27 16:18:51 crc kubenswrapper[4707]: I1127 16:18:51.488125 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Nov 27 16:18:51 crc kubenswrapper[4707]: I1127 16:18:51.494097 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/b31a7b86-c43f-4123-a33d-ffba2ee3d015-pod-info\") pod \"rabbitmq-server-0\" (UID: \"b31a7b86-c43f-4123-a33d-ffba2ee3d015\") " pod="openstack/rabbitmq-server-0" Nov 27 16:18:51 crc kubenswrapper[4707]: I1127 16:18:51.494630 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/b31a7b86-c43f-4123-a33d-ffba2ee3d015-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"b31a7b86-c43f-4123-a33d-ffba2ee3d015\") " pod="openstack/rabbitmq-server-0" Nov 27 16:18:51 crc kubenswrapper[4707]: I1127 16:18:51.494887 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Nov 27 16:18:51 crc kubenswrapper[4707]: I1127 16:18:51.495168 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-p26dh" Nov 27 16:18:51 crc kubenswrapper[4707]: I1127 16:18:51.495357 4707 reflector.go:368] Caches populated for *v1.Secret from 
object-"openstack"/"cert-rabbitmq-cell1-svc" Nov 27 16:18:51 crc kubenswrapper[4707]: I1127 16:18:51.495561 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Nov 27 16:18:51 crc kubenswrapper[4707]: I1127 16:18:51.495676 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Nov 27 16:18:51 crc kubenswrapper[4707]: I1127 16:18:51.495767 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Nov 27 16:18:51 crc kubenswrapper[4707]: I1127 16:18:51.495901 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Nov 27 16:18:51 crc kubenswrapper[4707]: I1127 16:18:51.501051 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/b31a7b86-c43f-4123-a33d-ffba2ee3d015-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"b31a7b86-c43f-4123-a33d-ffba2ee3d015\") " pod="openstack/rabbitmq-server-0" Nov 27 16:18:51 crc kubenswrapper[4707]: I1127 16:18:51.505148 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"rabbitmq-server-0\" (UID: \"b31a7b86-c43f-4123-a33d-ffba2ee3d015\") " pod="openstack/rabbitmq-server-0" Nov 27 16:18:51 crc kubenswrapper[4707]: I1127 16:18:51.506685 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Nov 27 16:18:51 crc kubenswrapper[4707]: I1127 16:18:51.506737 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q74dh\" (UniqueName: \"kubernetes.io/projected/b31a7b86-c43f-4123-a33d-ffba2ee3d015-kube-api-access-q74dh\") pod \"rabbitmq-server-0\" (UID: \"b31a7b86-c43f-4123-a33d-ffba2ee3d015\") " pod="openstack/rabbitmq-server-0" Nov 27 16:18:51 crc 
kubenswrapper[4707]: I1127 16:18:51.578804 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Nov 27 16:18:51 crc kubenswrapper[4707]: I1127 16:18:51.579336 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/517a2efb-7c9f-4c93-876b-5962da604ef8-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"517a2efb-7c9f-4c93-876b-5962da604ef8\") " pod="openstack/rabbitmq-cell1-server-0" Nov 27 16:18:51 crc kubenswrapper[4707]: I1127 16:18:51.579416 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/517a2efb-7c9f-4c93-876b-5962da604ef8-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"517a2efb-7c9f-4c93-876b-5962da604ef8\") " pod="openstack/rabbitmq-cell1-server-0" Nov 27 16:18:51 crc kubenswrapper[4707]: I1127 16:18:51.579481 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/517a2efb-7c9f-4c93-876b-5962da604ef8-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"517a2efb-7c9f-4c93-876b-5962da604ef8\") " pod="openstack/rabbitmq-cell1-server-0" Nov 27 16:18:51 crc kubenswrapper[4707]: I1127 16:18:51.579510 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/517a2efb-7c9f-4c93-876b-5962da604ef8-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"517a2efb-7c9f-4c93-876b-5962da604ef8\") " pod="openstack/rabbitmq-cell1-server-0" Nov 27 16:18:51 crc kubenswrapper[4707]: I1127 16:18:51.579537 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: 
\"kubernetes.io/configmap/517a2efb-7c9f-4c93-876b-5962da604ef8-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"517a2efb-7c9f-4c93-876b-5962da604ef8\") " pod="openstack/rabbitmq-cell1-server-0" Nov 27 16:18:51 crc kubenswrapper[4707]: I1127 16:18:51.579565 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/517a2efb-7c9f-4c93-876b-5962da604ef8-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"517a2efb-7c9f-4c93-876b-5962da604ef8\") " pod="openstack/rabbitmq-cell1-server-0" Nov 27 16:18:51 crc kubenswrapper[4707]: I1127 16:18:51.579642 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/517a2efb-7c9f-4c93-876b-5962da604ef8-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"517a2efb-7c9f-4c93-876b-5962da604ef8\") " pod="openstack/rabbitmq-cell1-server-0" Nov 27 16:18:51 crc kubenswrapper[4707]: I1127 16:18:51.579663 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/517a2efb-7c9f-4c93-876b-5962da604ef8-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"517a2efb-7c9f-4c93-876b-5962da604ef8\") " pod="openstack/rabbitmq-cell1-server-0" Nov 27 16:18:51 crc kubenswrapper[4707]: I1127 16:18:51.579688 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"517a2efb-7c9f-4c93-876b-5962da604ef8\") " pod="openstack/rabbitmq-cell1-server-0" Nov 27 16:18:51 crc kubenswrapper[4707]: I1127 16:18:51.579711 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: 
\"kubernetes.io/empty-dir/517a2efb-7c9f-4c93-876b-5962da604ef8-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"517a2efb-7c9f-4c93-876b-5962da604ef8\") " pod="openstack/rabbitmq-cell1-server-0" Nov 27 16:18:51 crc kubenswrapper[4707]: I1127 16:18:51.579751 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hqwl4\" (UniqueName: \"kubernetes.io/projected/517a2efb-7c9f-4c93-876b-5962da604ef8-kube-api-access-hqwl4\") pod \"rabbitmq-cell1-server-0\" (UID: \"517a2efb-7c9f-4c93-876b-5962da604ef8\") " pod="openstack/rabbitmq-cell1-server-0" Nov 27 16:18:51 crc kubenswrapper[4707]: I1127 16:18:51.681437 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/517a2efb-7c9f-4c93-876b-5962da604ef8-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"517a2efb-7c9f-4c93-876b-5962da604ef8\") " pod="openstack/rabbitmq-cell1-server-0" Nov 27 16:18:51 crc kubenswrapper[4707]: I1127 16:18:51.681492 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/517a2efb-7c9f-4c93-876b-5962da604ef8-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"517a2efb-7c9f-4c93-876b-5962da604ef8\") " pod="openstack/rabbitmq-cell1-server-0" Nov 27 16:18:51 crc kubenswrapper[4707]: I1127 16:18:51.681614 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"517a2efb-7c9f-4c93-876b-5962da604ef8\") " pod="openstack/rabbitmq-cell1-server-0" Nov 27 16:18:51 crc kubenswrapper[4707]: I1127 16:18:51.681645 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/517a2efb-7c9f-4c93-876b-5962da604ef8-rabbitmq-plugins\") pod 
\"rabbitmq-cell1-server-0\" (UID: \"517a2efb-7c9f-4c93-876b-5962da604ef8\") " pod="openstack/rabbitmq-cell1-server-0" Nov 27 16:18:51 crc kubenswrapper[4707]: I1127 16:18:51.681685 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hqwl4\" (UniqueName: \"kubernetes.io/projected/517a2efb-7c9f-4c93-876b-5962da604ef8-kube-api-access-hqwl4\") pod \"rabbitmq-cell1-server-0\" (UID: \"517a2efb-7c9f-4c93-876b-5962da604ef8\") " pod="openstack/rabbitmq-cell1-server-0" Nov 27 16:18:51 crc kubenswrapper[4707]: I1127 16:18:51.681740 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/517a2efb-7c9f-4c93-876b-5962da604ef8-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"517a2efb-7c9f-4c93-876b-5962da604ef8\") " pod="openstack/rabbitmq-cell1-server-0" Nov 27 16:18:51 crc kubenswrapper[4707]: I1127 16:18:51.681776 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/517a2efb-7c9f-4c93-876b-5962da604ef8-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"517a2efb-7c9f-4c93-876b-5962da604ef8\") " pod="openstack/rabbitmq-cell1-server-0" Nov 27 16:18:51 crc kubenswrapper[4707]: I1127 16:18:51.681834 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/517a2efb-7c9f-4c93-876b-5962da604ef8-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"517a2efb-7c9f-4c93-876b-5962da604ef8\") " pod="openstack/rabbitmq-cell1-server-0" Nov 27 16:18:51 crc kubenswrapper[4707]: I1127 16:18:51.681855 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/517a2efb-7c9f-4c93-876b-5962da604ef8-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"517a2efb-7c9f-4c93-876b-5962da604ef8\") " 
pod="openstack/rabbitmq-cell1-server-0" Nov 27 16:18:51 crc kubenswrapper[4707]: I1127 16:18:51.681887 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/517a2efb-7c9f-4c93-876b-5962da604ef8-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"517a2efb-7c9f-4c93-876b-5962da604ef8\") " pod="openstack/rabbitmq-cell1-server-0" Nov 27 16:18:51 crc kubenswrapper[4707]: I1127 16:18:51.681906 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/517a2efb-7c9f-4c93-876b-5962da604ef8-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"517a2efb-7c9f-4c93-876b-5962da604ef8\") " pod="openstack/rabbitmq-cell1-server-0" Nov 27 16:18:51 crc kubenswrapper[4707]: I1127 16:18:51.682097 4707 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"517a2efb-7c9f-4c93-876b-5962da604ef8\") device mount path \"/mnt/openstack/pv07\"" pod="openstack/rabbitmq-cell1-server-0" Nov 27 16:18:51 crc kubenswrapper[4707]: I1127 16:18:51.682339 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/517a2efb-7c9f-4c93-876b-5962da604ef8-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"517a2efb-7c9f-4c93-876b-5962da604ef8\") " pod="openstack/rabbitmq-cell1-server-0" Nov 27 16:18:51 crc kubenswrapper[4707]: I1127 16:18:51.682856 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/517a2efb-7c9f-4c93-876b-5962da604ef8-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"517a2efb-7c9f-4c93-876b-5962da604ef8\") " pod="openstack/rabbitmq-cell1-server-0" Nov 27 16:18:51 crc kubenswrapper[4707]: I1127 16:18:51.683309 
4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/517a2efb-7c9f-4c93-876b-5962da604ef8-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"517a2efb-7c9f-4c93-876b-5962da604ef8\") " pod="openstack/rabbitmq-cell1-server-0" Nov 27 16:18:51 crc kubenswrapper[4707]: I1127 16:18:51.683526 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/517a2efb-7c9f-4c93-876b-5962da604ef8-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"517a2efb-7c9f-4c93-876b-5962da604ef8\") " pod="openstack/rabbitmq-cell1-server-0" Nov 27 16:18:51 crc kubenswrapper[4707]: I1127 16:18:51.683981 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/517a2efb-7c9f-4c93-876b-5962da604ef8-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"517a2efb-7c9f-4c93-876b-5962da604ef8\") " pod="openstack/rabbitmq-cell1-server-0" Nov 27 16:18:51 crc kubenswrapper[4707]: I1127 16:18:51.687350 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/517a2efb-7c9f-4c93-876b-5962da604ef8-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"517a2efb-7c9f-4c93-876b-5962da604ef8\") " pod="openstack/rabbitmq-cell1-server-0" Nov 27 16:18:51 crc kubenswrapper[4707]: I1127 16:18:51.687470 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/517a2efb-7c9f-4c93-876b-5962da604ef8-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"517a2efb-7c9f-4c93-876b-5962da604ef8\") " pod="openstack/rabbitmq-cell1-server-0" Nov 27 16:18:51 crc kubenswrapper[4707]: I1127 16:18:51.696692 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: 
\"kubernetes.io/projected/517a2efb-7c9f-4c93-876b-5962da604ef8-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"517a2efb-7c9f-4c93-876b-5962da604ef8\") " pod="openstack/rabbitmq-cell1-server-0" Nov 27 16:18:51 crc kubenswrapper[4707]: I1127 16:18:51.696858 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/517a2efb-7c9f-4c93-876b-5962da604ef8-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"517a2efb-7c9f-4c93-876b-5962da604ef8\") " pod="openstack/rabbitmq-cell1-server-0" Nov 27 16:18:51 crc kubenswrapper[4707]: I1127 16:18:51.698052 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hqwl4\" (UniqueName: \"kubernetes.io/projected/517a2efb-7c9f-4c93-876b-5962da604ef8-kube-api-access-hqwl4\") pod \"rabbitmq-cell1-server-0\" (UID: \"517a2efb-7c9f-4c93-876b-5962da604ef8\") " pod="openstack/rabbitmq-cell1-server-0" Nov 27 16:18:51 crc kubenswrapper[4707]: I1127 16:18:51.701932 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"517a2efb-7c9f-4c93-876b-5962da604ef8\") " pod="openstack/rabbitmq-cell1-server-0" Nov 27 16:18:51 crc kubenswrapper[4707]: I1127 16:18:51.861990 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Nov 27 16:18:52 crc kubenswrapper[4707]: I1127 16:18:52.122148 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-hdvpv" event={"ID":"98f69e7a-a5b6-4358-9699-5a8f83ff8a5f","Type":"ContainerStarted","Data":"92e0ca44752d223d3b01a36323dc4c4cdf5836fd6b0e354c503a2674fb7c3af9"} Nov 27 16:18:53 crc kubenswrapper[4707]: I1127 16:18:53.175073 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-galera-0"] Nov 27 16:18:53 crc kubenswrapper[4707]: I1127 16:18:53.176783 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Nov 27 16:18:53 crc kubenswrapper[4707]: I1127 16:18:53.179939 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-svc" Nov 27 16:18:53 crc kubenswrapper[4707]: I1127 16:18:53.182024 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-scripts" Nov 27 16:18:53 crc kubenswrapper[4707]: I1127 16:18:53.182110 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-dockercfg-8lkw2" Nov 27 16:18:53 crc kubenswrapper[4707]: I1127 16:18:53.182734 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config-data" Nov 27 16:18:53 crc kubenswrapper[4707]: I1127 16:18:53.184360 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Nov 27 16:18:53 crc kubenswrapper[4707]: I1127 16:18:53.188656 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"combined-ca-bundle" Nov 27 16:18:53 crc kubenswrapper[4707]: I1127 16:18:53.313228 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/84e88e9e-3edb-45cd-9973-1447587f7adc-galera-tls-certs\") pod 
\"openstack-galera-0\" (UID: \"84e88e9e-3edb-45cd-9973-1447587f7adc\") " pod="openstack/openstack-galera-0" Nov 27 16:18:53 crc kubenswrapper[4707]: I1127 16:18:53.313300 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/84e88e9e-3edb-45cd-9973-1447587f7adc-config-data-default\") pod \"openstack-galera-0\" (UID: \"84e88e9e-3edb-45cd-9973-1447587f7adc\") " pod="openstack/openstack-galera-0" Nov 27 16:18:53 crc kubenswrapper[4707]: I1127 16:18:53.313333 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/84e88e9e-3edb-45cd-9973-1447587f7adc-config-data-generated\") pod \"openstack-galera-0\" (UID: \"84e88e9e-3edb-45cd-9973-1447587f7adc\") " pod="openstack/openstack-galera-0" Nov 27 16:18:53 crc kubenswrapper[4707]: I1127 16:18:53.313454 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"openstack-galera-0\" (UID: \"84e88e9e-3edb-45cd-9973-1447587f7adc\") " pod="openstack/openstack-galera-0" Nov 27 16:18:53 crc kubenswrapper[4707]: I1127 16:18:53.313518 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q4ppk\" (UniqueName: \"kubernetes.io/projected/84e88e9e-3edb-45cd-9973-1447587f7adc-kube-api-access-q4ppk\") pod \"openstack-galera-0\" (UID: \"84e88e9e-3edb-45cd-9973-1447587f7adc\") " pod="openstack/openstack-galera-0" Nov 27 16:18:53 crc kubenswrapper[4707]: I1127 16:18:53.313540 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/84e88e9e-3edb-45cd-9973-1447587f7adc-operator-scripts\") pod \"openstack-galera-0\" (UID: 
\"84e88e9e-3edb-45cd-9973-1447587f7adc\") " pod="openstack/openstack-galera-0" Nov 27 16:18:53 crc kubenswrapper[4707]: I1127 16:18:53.313584 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/84e88e9e-3edb-45cd-9973-1447587f7adc-kolla-config\") pod \"openstack-galera-0\" (UID: \"84e88e9e-3edb-45cd-9973-1447587f7adc\") " pod="openstack/openstack-galera-0" Nov 27 16:18:53 crc kubenswrapper[4707]: I1127 16:18:53.313608 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/84e88e9e-3edb-45cd-9973-1447587f7adc-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"84e88e9e-3edb-45cd-9973-1447587f7adc\") " pod="openstack/openstack-galera-0" Nov 27 16:18:53 crc kubenswrapper[4707]: I1127 16:18:53.415223 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/84e88e9e-3edb-45cd-9973-1447587f7adc-kolla-config\") pod \"openstack-galera-0\" (UID: \"84e88e9e-3edb-45cd-9973-1447587f7adc\") " pod="openstack/openstack-galera-0" Nov 27 16:18:53 crc kubenswrapper[4707]: I1127 16:18:53.415723 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/84e88e9e-3edb-45cd-9973-1447587f7adc-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"84e88e9e-3edb-45cd-9973-1447587f7adc\") " pod="openstack/openstack-galera-0" Nov 27 16:18:53 crc kubenswrapper[4707]: I1127 16:18:53.415759 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/84e88e9e-3edb-45cd-9973-1447587f7adc-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"84e88e9e-3edb-45cd-9973-1447587f7adc\") " pod="openstack/openstack-galera-0" Nov 27 16:18:53 crc 
kubenswrapper[4707]: I1127 16:18:53.415804 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/84e88e9e-3edb-45cd-9973-1447587f7adc-config-data-default\") pod \"openstack-galera-0\" (UID: \"84e88e9e-3edb-45cd-9973-1447587f7adc\") " pod="openstack/openstack-galera-0" Nov 27 16:18:53 crc kubenswrapper[4707]: I1127 16:18:53.415833 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/84e88e9e-3edb-45cd-9973-1447587f7adc-config-data-generated\") pod \"openstack-galera-0\" (UID: \"84e88e9e-3edb-45cd-9973-1447587f7adc\") " pod="openstack/openstack-galera-0" Nov 27 16:18:53 crc kubenswrapper[4707]: I1127 16:18:53.415885 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"openstack-galera-0\" (UID: \"84e88e9e-3edb-45cd-9973-1447587f7adc\") " pod="openstack/openstack-galera-0" Nov 27 16:18:53 crc kubenswrapper[4707]: I1127 16:18:53.415957 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q4ppk\" (UniqueName: \"kubernetes.io/projected/84e88e9e-3edb-45cd-9973-1447587f7adc-kube-api-access-q4ppk\") pod \"openstack-galera-0\" (UID: \"84e88e9e-3edb-45cd-9973-1447587f7adc\") " pod="openstack/openstack-galera-0" Nov 27 16:18:53 crc kubenswrapper[4707]: I1127 16:18:53.415981 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/84e88e9e-3edb-45cd-9973-1447587f7adc-operator-scripts\") pod \"openstack-galera-0\" (UID: \"84e88e9e-3edb-45cd-9973-1447587f7adc\") " pod="openstack/openstack-galera-0" Nov 27 16:18:53 crc kubenswrapper[4707]: I1127 16:18:53.417012 4707 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume 
\"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"openstack-galera-0\" (UID: \"84e88e9e-3edb-45cd-9973-1447587f7adc\") device mount path \"/mnt/openstack/pv09\"" pod="openstack/openstack-galera-0" Nov 27 16:18:53 crc kubenswrapper[4707]: I1127 16:18:53.418505 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/84e88e9e-3edb-45cd-9973-1447587f7adc-config-data-generated\") pod \"openstack-galera-0\" (UID: \"84e88e9e-3edb-45cd-9973-1447587f7adc\") " pod="openstack/openstack-galera-0" Nov 27 16:18:53 crc kubenswrapper[4707]: I1127 16:18:53.418675 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/84e88e9e-3edb-45cd-9973-1447587f7adc-config-data-default\") pod \"openstack-galera-0\" (UID: \"84e88e9e-3edb-45cd-9973-1447587f7adc\") " pod="openstack/openstack-galera-0" Nov 27 16:18:53 crc kubenswrapper[4707]: I1127 16:18:53.418923 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/84e88e9e-3edb-45cd-9973-1447587f7adc-operator-scripts\") pod \"openstack-galera-0\" (UID: \"84e88e9e-3edb-45cd-9973-1447587f7adc\") " pod="openstack/openstack-galera-0" Nov 27 16:18:53 crc kubenswrapper[4707]: I1127 16:18:53.422931 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/84e88e9e-3edb-45cd-9973-1447587f7adc-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"84e88e9e-3edb-45cd-9973-1447587f7adc\") " pod="openstack/openstack-galera-0" Nov 27 16:18:53 crc kubenswrapper[4707]: I1127 16:18:53.423960 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/84e88e9e-3edb-45cd-9973-1447587f7adc-kolla-config\") pod \"openstack-galera-0\" (UID: 
\"84e88e9e-3edb-45cd-9973-1447587f7adc\") " pod="openstack/openstack-galera-0" Nov 27 16:18:53 crc kubenswrapper[4707]: I1127 16:18:53.429990 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/84e88e9e-3edb-45cd-9973-1447587f7adc-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"84e88e9e-3edb-45cd-9973-1447587f7adc\") " pod="openstack/openstack-galera-0" Nov 27 16:18:53 crc kubenswrapper[4707]: I1127 16:18:53.450215 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q4ppk\" (UniqueName: \"kubernetes.io/projected/84e88e9e-3edb-45cd-9973-1447587f7adc-kube-api-access-q4ppk\") pod \"openstack-galera-0\" (UID: \"84e88e9e-3edb-45cd-9973-1447587f7adc\") " pod="openstack/openstack-galera-0" Nov 27 16:18:53 crc kubenswrapper[4707]: I1127 16:18:53.452082 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"openstack-galera-0\" (UID: \"84e88e9e-3edb-45cd-9973-1447587f7adc\") " pod="openstack/openstack-galera-0" Nov 27 16:18:53 crc kubenswrapper[4707]: I1127 16:18:53.514848 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Nov 27 16:18:54 crc kubenswrapper[4707]: I1127 16:18:54.703087 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-cell1-galera-0"] Nov 27 16:18:54 crc kubenswrapper[4707]: I1127 16:18:54.705644 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-cell1-galera-0" Nov 27 16:18:54 crc kubenswrapper[4707]: I1127 16:18:54.710848 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-cell1-dockercfg-lqd45" Nov 27 16:18:54 crc kubenswrapper[4707]: I1127 16:18:54.711076 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-config-data" Nov 27 16:18:54 crc kubenswrapper[4707]: I1127 16:18:54.711199 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-scripts" Nov 27 16:18:54 crc kubenswrapper[4707]: I1127 16:18:54.713542 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-cell1-svc" Nov 27 16:18:54 crc kubenswrapper[4707]: I1127 16:18:54.720514 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Nov 27 16:18:54 crc kubenswrapper[4707]: I1127 16:18:54.840998 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/e4b42d58-27cb-455f-9994-ae15f433e008-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"e4b42d58-27cb-455f-9994-ae15f433e008\") " pod="openstack/openstack-cell1-galera-0" Nov 27 16:18:54 crc kubenswrapper[4707]: I1127 16:18:54.841525 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e4b42d58-27cb-455f-9994-ae15f433e008-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"e4b42d58-27cb-455f-9994-ae15f433e008\") " pod="openstack/openstack-cell1-galera-0" Nov 27 16:18:54 crc kubenswrapper[4707]: I1127 16:18:54.841620 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/e4b42d58-27cb-455f-9994-ae15f433e008-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"e4b42d58-27cb-455f-9994-ae15f433e008\") " pod="openstack/openstack-cell1-galera-0" Nov 27 16:18:54 crc kubenswrapper[4707]: I1127 16:18:54.841699 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"openstack-cell1-galera-0\" (UID: \"e4b42d58-27cb-455f-9994-ae15f433e008\") " pod="openstack/openstack-cell1-galera-0" Nov 27 16:18:54 crc kubenswrapper[4707]: I1127 16:18:54.841779 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/e4b42d58-27cb-455f-9994-ae15f433e008-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"e4b42d58-27cb-455f-9994-ae15f433e008\") " pod="openstack/openstack-cell1-galera-0" Nov 27 16:18:54 crc kubenswrapper[4707]: I1127 16:18:54.841886 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/e4b42d58-27cb-455f-9994-ae15f433e008-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"e4b42d58-27cb-455f-9994-ae15f433e008\") " pod="openstack/openstack-cell1-galera-0" Nov 27 16:18:54 crc kubenswrapper[4707]: I1127 16:18:54.841977 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mv94l\" (UniqueName: \"kubernetes.io/projected/e4b42d58-27cb-455f-9994-ae15f433e008-kube-api-access-mv94l\") pod \"openstack-cell1-galera-0\" (UID: \"e4b42d58-27cb-455f-9994-ae15f433e008\") " pod="openstack/openstack-cell1-galera-0" Nov 27 16:18:54 crc kubenswrapper[4707]: I1127 16:18:54.842063 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/e4b42d58-27cb-455f-9994-ae15f433e008-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"e4b42d58-27cb-455f-9994-ae15f433e008\") " pod="openstack/openstack-cell1-galera-0" Nov 27 16:18:54 crc kubenswrapper[4707]: I1127 16:18:54.928600 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/memcached-0"] Nov 27 16:18:54 crc kubenswrapper[4707]: I1127 16:18:54.930056 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Nov 27 16:18:54 crc kubenswrapper[4707]: I1127 16:18:54.934914 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"memcached-config-data" Nov 27 16:18:54 crc kubenswrapper[4707]: I1127 16:18:54.935191 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"memcached-memcached-dockercfg-452bl" Nov 27 16:18:54 crc kubenswrapper[4707]: I1127 16:18:54.935363 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-memcached-svc" Nov 27 16:18:54 crc kubenswrapper[4707]: I1127 16:18:54.943024 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Nov 27 16:18:54 crc kubenswrapper[4707]: I1127 16:18:54.943352 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mv94l\" (UniqueName: \"kubernetes.io/projected/e4b42d58-27cb-455f-9994-ae15f433e008-kube-api-access-mv94l\") pod \"openstack-cell1-galera-0\" (UID: \"e4b42d58-27cb-455f-9994-ae15f433e008\") " pod="openstack/openstack-cell1-galera-0" Nov 27 16:18:54 crc kubenswrapper[4707]: I1127 16:18:54.943421 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/e4b42d58-27cb-455f-9994-ae15f433e008-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"e4b42d58-27cb-455f-9994-ae15f433e008\") " pod="openstack/openstack-cell1-galera-0" Nov 27 
16:18:54 crc kubenswrapper[4707]: I1127 16:18:54.943455 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/e4b42d58-27cb-455f-9994-ae15f433e008-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"e4b42d58-27cb-455f-9994-ae15f433e008\") " pod="openstack/openstack-cell1-galera-0" Nov 27 16:18:54 crc kubenswrapper[4707]: I1127 16:18:54.943526 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e4b42d58-27cb-455f-9994-ae15f433e008-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"e4b42d58-27cb-455f-9994-ae15f433e008\") " pod="openstack/openstack-cell1-galera-0" Nov 27 16:18:54 crc kubenswrapper[4707]: I1127 16:18:54.943562 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e4b42d58-27cb-455f-9994-ae15f433e008-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"e4b42d58-27cb-455f-9994-ae15f433e008\") " pod="openstack/openstack-cell1-galera-0" Nov 27 16:18:54 crc kubenswrapper[4707]: I1127 16:18:54.943581 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"openstack-cell1-galera-0\" (UID: \"e4b42d58-27cb-455f-9994-ae15f433e008\") " pod="openstack/openstack-cell1-galera-0" Nov 27 16:18:54 crc kubenswrapper[4707]: I1127 16:18:54.943604 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/e4b42d58-27cb-455f-9994-ae15f433e008-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"e4b42d58-27cb-455f-9994-ae15f433e008\") " pod="openstack/openstack-cell1-galera-0" Nov 27 16:18:54 crc kubenswrapper[4707]: I1127 16:18:54.943652 4707 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/e4b42d58-27cb-455f-9994-ae15f433e008-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"e4b42d58-27cb-455f-9994-ae15f433e008\") " pod="openstack/openstack-cell1-galera-0" Nov 27 16:18:54 crc kubenswrapper[4707]: I1127 16:18:54.944220 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/e4b42d58-27cb-455f-9994-ae15f433e008-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"e4b42d58-27cb-455f-9994-ae15f433e008\") " pod="openstack/openstack-cell1-galera-0" Nov 27 16:18:54 crc kubenswrapper[4707]: I1127 16:18:54.947092 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/e4b42d58-27cb-455f-9994-ae15f433e008-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"e4b42d58-27cb-455f-9994-ae15f433e008\") " pod="openstack/openstack-cell1-galera-0" Nov 27 16:18:54 crc kubenswrapper[4707]: I1127 16:18:54.947657 4707 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"openstack-cell1-galera-0\" (UID: \"e4b42d58-27cb-455f-9994-ae15f433e008\") device mount path \"/mnt/openstack/pv03\"" pod="openstack/openstack-cell1-galera-0" Nov 27 16:18:54 crc kubenswrapper[4707]: I1127 16:18:54.949301 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/e4b42d58-27cb-455f-9994-ae15f433e008-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"e4b42d58-27cb-455f-9994-ae15f433e008\") " pod="openstack/openstack-cell1-galera-0" Nov 27 16:18:54 crc kubenswrapper[4707]: I1127 16:18:54.949671 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" 
(UniqueName: \"kubernetes.io/secret/e4b42d58-27cb-455f-9994-ae15f433e008-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"e4b42d58-27cb-455f-9994-ae15f433e008\") " pod="openstack/openstack-cell1-galera-0" Nov 27 16:18:54 crc kubenswrapper[4707]: I1127 16:18:54.963948 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mv94l\" (UniqueName: \"kubernetes.io/projected/e4b42d58-27cb-455f-9994-ae15f433e008-kube-api-access-mv94l\") pod \"openstack-cell1-galera-0\" (UID: \"e4b42d58-27cb-455f-9994-ae15f433e008\") " pod="openstack/openstack-cell1-galera-0" Nov 27 16:18:54 crc kubenswrapper[4707]: I1127 16:18:54.969546 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e4b42d58-27cb-455f-9994-ae15f433e008-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"e4b42d58-27cb-455f-9994-ae15f433e008\") " pod="openstack/openstack-cell1-galera-0" Nov 27 16:18:54 crc kubenswrapper[4707]: I1127 16:18:54.985512 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e4b42d58-27cb-455f-9994-ae15f433e008-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"e4b42d58-27cb-455f-9994-ae15f433e008\") " pod="openstack/openstack-cell1-galera-0" Nov 27 16:18:54 crc kubenswrapper[4707]: I1127 16:18:54.999515 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"openstack-cell1-galera-0\" (UID: \"e4b42d58-27cb-455f-9994-ae15f433e008\") " pod="openstack/openstack-cell1-galera-0" Nov 27 16:18:55 crc kubenswrapper[4707]: I1127 16:18:55.030031 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-cell1-galera-0" Nov 27 16:18:55 crc kubenswrapper[4707]: I1127 16:18:55.045580 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rj5tj\" (UniqueName: \"kubernetes.io/projected/c5402fa0-f1b7-4561-95f0-cb690caf9b58-kube-api-access-rj5tj\") pod \"memcached-0\" (UID: \"c5402fa0-f1b7-4561-95f0-cb690caf9b58\") " pod="openstack/memcached-0" Nov 27 16:18:55 crc kubenswrapper[4707]: I1127 16:18:55.045645 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/c5402fa0-f1b7-4561-95f0-cb690caf9b58-memcached-tls-certs\") pod \"memcached-0\" (UID: \"c5402fa0-f1b7-4561-95f0-cb690caf9b58\") " pod="openstack/memcached-0" Nov 27 16:18:55 crc kubenswrapper[4707]: I1127 16:18:55.045691 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c5402fa0-f1b7-4561-95f0-cb690caf9b58-config-data\") pod \"memcached-0\" (UID: \"c5402fa0-f1b7-4561-95f0-cb690caf9b58\") " pod="openstack/memcached-0" Nov 27 16:18:55 crc kubenswrapper[4707]: I1127 16:18:55.045739 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/c5402fa0-f1b7-4561-95f0-cb690caf9b58-kolla-config\") pod \"memcached-0\" (UID: \"c5402fa0-f1b7-4561-95f0-cb690caf9b58\") " pod="openstack/memcached-0" Nov 27 16:18:55 crc kubenswrapper[4707]: I1127 16:18:55.045762 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c5402fa0-f1b7-4561-95f0-cb690caf9b58-combined-ca-bundle\") pod \"memcached-0\" (UID: \"c5402fa0-f1b7-4561-95f0-cb690caf9b58\") " pod="openstack/memcached-0" Nov 27 16:18:55 crc kubenswrapper[4707]: I1127 
16:18:55.146999 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/c5402fa0-f1b7-4561-95f0-cb690caf9b58-kolla-config\") pod \"memcached-0\" (UID: \"c5402fa0-f1b7-4561-95f0-cb690caf9b58\") " pod="openstack/memcached-0" Nov 27 16:18:55 crc kubenswrapper[4707]: I1127 16:18:55.147249 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c5402fa0-f1b7-4561-95f0-cb690caf9b58-combined-ca-bundle\") pod \"memcached-0\" (UID: \"c5402fa0-f1b7-4561-95f0-cb690caf9b58\") " pod="openstack/memcached-0" Nov 27 16:18:55 crc kubenswrapper[4707]: I1127 16:18:55.147365 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rj5tj\" (UniqueName: \"kubernetes.io/projected/c5402fa0-f1b7-4561-95f0-cb690caf9b58-kube-api-access-rj5tj\") pod \"memcached-0\" (UID: \"c5402fa0-f1b7-4561-95f0-cb690caf9b58\") " pod="openstack/memcached-0" Nov 27 16:18:55 crc kubenswrapper[4707]: I1127 16:18:55.147473 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/c5402fa0-f1b7-4561-95f0-cb690caf9b58-memcached-tls-certs\") pod \"memcached-0\" (UID: \"c5402fa0-f1b7-4561-95f0-cb690caf9b58\") " pod="openstack/memcached-0" Nov 27 16:18:55 crc kubenswrapper[4707]: I1127 16:18:55.147564 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c5402fa0-f1b7-4561-95f0-cb690caf9b58-config-data\") pod \"memcached-0\" (UID: \"c5402fa0-f1b7-4561-95f0-cb690caf9b58\") " pod="openstack/memcached-0" Nov 27 16:18:55 crc kubenswrapper[4707]: I1127 16:18:55.147873 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/c5402fa0-f1b7-4561-95f0-cb690caf9b58-kolla-config\") pod 
\"memcached-0\" (UID: \"c5402fa0-f1b7-4561-95f0-cb690caf9b58\") " pod="openstack/memcached-0" Nov 27 16:18:55 crc kubenswrapper[4707]: I1127 16:18:55.148301 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c5402fa0-f1b7-4561-95f0-cb690caf9b58-config-data\") pod \"memcached-0\" (UID: \"c5402fa0-f1b7-4561-95f0-cb690caf9b58\") " pod="openstack/memcached-0" Nov 27 16:18:55 crc kubenswrapper[4707]: I1127 16:18:55.152813 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/c5402fa0-f1b7-4561-95f0-cb690caf9b58-memcached-tls-certs\") pod \"memcached-0\" (UID: \"c5402fa0-f1b7-4561-95f0-cb690caf9b58\") " pod="openstack/memcached-0" Nov 27 16:18:55 crc kubenswrapper[4707]: I1127 16:18:55.152935 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c5402fa0-f1b7-4561-95f0-cb690caf9b58-combined-ca-bundle\") pod \"memcached-0\" (UID: \"c5402fa0-f1b7-4561-95f0-cb690caf9b58\") " pod="openstack/memcached-0" Nov 27 16:18:55 crc kubenswrapper[4707]: I1127 16:18:55.171747 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rj5tj\" (UniqueName: \"kubernetes.io/projected/c5402fa0-f1b7-4561-95f0-cb690caf9b58-kube-api-access-rj5tj\") pod \"memcached-0\" (UID: \"c5402fa0-f1b7-4561-95f0-cb690caf9b58\") " pod="openstack/memcached-0" Nov 27 16:18:55 crc kubenswrapper[4707]: I1127 16:18:55.269491 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Nov 27 16:18:56 crc kubenswrapper[4707]: I1127 16:18:56.839801 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Nov 27 16:18:56 crc kubenswrapper[4707]: I1127 16:18:56.840924 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Nov 27 16:18:56 crc kubenswrapper[4707]: I1127 16:18:56.843190 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-ceilometer-dockercfg-jnxrh" Nov 27 16:18:56 crc kubenswrapper[4707]: I1127 16:18:56.852086 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Nov 27 16:18:56 crc kubenswrapper[4707]: I1127 16:18:56.989279 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8cjwd\" (UniqueName: \"kubernetes.io/projected/da629d4e-1b93-4623-a577-419c6dca0f14-kube-api-access-8cjwd\") pod \"kube-state-metrics-0\" (UID: \"da629d4e-1b93-4623-a577-419c6dca0f14\") " pod="openstack/kube-state-metrics-0" Nov 27 16:18:57 crc kubenswrapper[4707]: I1127 16:18:57.091043 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8cjwd\" (UniqueName: \"kubernetes.io/projected/da629d4e-1b93-4623-a577-419c6dca0f14-kube-api-access-8cjwd\") pod \"kube-state-metrics-0\" (UID: \"da629d4e-1b93-4623-a577-419c6dca0f14\") " pod="openstack/kube-state-metrics-0" Nov 27 16:18:57 crc kubenswrapper[4707]: I1127 16:18:57.111390 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8cjwd\" (UniqueName: \"kubernetes.io/projected/da629d4e-1b93-4623-a577-419c6dca0f14-kube-api-access-8cjwd\") pod \"kube-state-metrics-0\" (UID: \"da629d4e-1b93-4623-a577-419c6dca0f14\") " pod="openstack/kube-state-metrics-0" Nov 27 16:18:57 crc kubenswrapper[4707]: I1127 16:18:57.204567 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Nov 27 16:19:00 crc kubenswrapper[4707]: I1127 16:19:00.633891 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-vvkr6"] Nov 27 16:19:00 crc kubenswrapper[4707]: I1127 16:19:00.635122 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-vvkr6" Nov 27 16:19:00 crc kubenswrapper[4707]: I1127 16:19:00.636727 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncontroller-ovncontroller-dockercfg-bl2b5" Nov 27 16:19:00 crc kubenswrapper[4707]: I1127 16:19:00.637077 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovncontroller-ovndbs" Nov 27 16:19:00 crc kubenswrapper[4707]: I1127 16:19:00.637312 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-scripts" Nov 27 16:19:00 crc kubenswrapper[4707]: I1127 16:19:00.649172 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-vvkr6"] Nov 27 16:19:00 crc kubenswrapper[4707]: I1127 16:19:00.694874 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-ovs-7n9c2"] Nov 27 16:19:00 crc kubenswrapper[4707]: I1127 16:19:00.696428 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-ovs-7n9c2" Nov 27 16:19:00 crc kubenswrapper[4707]: I1127 16:19:00.707182 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-7n9c2"] Nov 27 16:19:00 crc kubenswrapper[4707]: I1127 16:19:00.781274 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9639769b-4439-4ffc-b88b-cba953013bff-combined-ca-bundle\") pod \"ovn-controller-vvkr6\" (UID: \"9639769b-4439-4ffc-b88b-cba953013bff\") " pod="openstack/ovn-controller-vvkr6" Nov 27 16:19:00 crc kubenswrapper[4707]: I1127 16:19:00.781546 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/9639769b-4439-4ffc-b88b-cba953013bff-var-run-ovn\") pod \"ovn-controller-vvkr6\" (UID: \"9639769b-4439-4ffc-b88b-cba953013bff\") " pod="openstack/ovn-controller-vvkr6" Nov 27 16:19:00 crc kubenswrapper[4707]: I1127 16:19:00.781716 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/9639769b-4439-4ffc-b88b-cba953013bff-var-run\") pod \"ovn-controller-vvkr6\" (UID: \"9639769b-4439-4ffc-b88b-cba953013bff\") " pod="openstack/ovn-controller-vvkr6" Nov 27 16:19:00 crc kubenswrapper[4707]: I1127 16:19:00.781838 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9639769b-4439-4ffc-b88b-cba953013bff-scripts\") pod \"ovn-controller-vvkr6\" (UID: \"9639769b-4439-4ffc-b88b-cba953013bff\") " pod="openstack/ovn-controller-vvkr6" Nov 27 16:19:00 crc kubenswrapper[4707]: I1127 16:19:00.781904 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: 
\"kubernetes.io/host-path/720e549c-1f41-4fb6-b29f-465ac7e174e3-var-log\") pod \"ovn-controller-ovs-7n9c2\" (UID: \"720e549c-1f41-4fb6-b29f-465ac7e174e3\") " pod="openstack/ovn-controller-ovs-7n9c2" Nov 27 16:19:00 crc kubenswrapper[4707]: I1127 16:19:00.781950 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/720e549c-1f41-4fb6-b29f-465ac7e174e3-var-run\") pod \"ovn-controller-ovs-7n9c2\" (UID: \"720e549c-1f41-4fb6-b29f-465ac7e174e3\") " pod="openstack/ovn-controller-ovs-7n9c2" Nov 27 16:19:00 crc kubenswrapper[4707]: I1127 16:19:00.782012 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/9639769b-4439-4ffc-b88b-cba953013bff-var-log-ovn\") pod \"ovn-controller-vvkr6\" (UID: \"9639769b-4439-4ffc-b88b-cba953013bff\") " pod="openstack/ovn-controller-vvkr6" Nov 27 16:19:00 crc kubenswrapper[4707]: I1127 16:19:00.782042 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/720e549c-1f41-4fb6-b29f-465ac7e174e3-etc-ovs\") pod \"ovn-controller-ovs-7n9c2\" (UID: \"720e549c-1f41-4fb6-b29f-465ac7e174e3\") " pod="openstack/ovn-controller-ovs-7n9c2" Nov 27 16:19:00 crc kubenswrapper[4707]: I1127 16:19:00.782110 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qgscf\" (UniqueName: \"kubernetes.io/projected/9639769b-4439-4ffc-b88b-cba953013bff-kube-api-access-qgscf\") pod \"ovn-controller-vvkr6\" (UID: \"9639769b-4439-4ffc-b88b-cba953013bff\") " pod="openstack/ovn-controller-vvkr6" Nov 27 16:19:00 crc kubenswrapper[4707]: I1127 16:19:00.782184 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/720e549c-1f41-4fb6-b29f-465ac7e174e3-scripts\") pod \"ovn-controller-ovs-7n9c2\" (UID: \"720e549c-1f41-4fb6-b29f-465ac7e174e3\") " pod="openstack/ovn-controller-ovs-7n9c2" Nov 27 16:19:00 crc kubenswrapper[4707]: I1127 16:19:00.782204 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tw6s8\" (UniqueName: \"kubernetes.io/projected/720e549c-1f41-4fb6-b29f-465ac7e174e3-kube-api-access-tw6s8\") pod \"ovn-controller-ovs-7n9c2\" (UID: \"720e549c-1f41-4fb6-b29f-465ac7e174e3\") " pod="openstack/ovn-controller-ovs-7n9c2" Nov 27 16:19:00 crc kubenswrapper[4707]: I1127 16:19:00.782253 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/9639769b-4439-4ffc-b88b-cba953013bff-ovn-controller-tls-certs\") pod \"ovn-controller-vvkr6\" (UID: \"9639769b-4439-4ffc-b88b-cba953013bff\") " pod="openstack/ovn-controller-vvkr6" Nov 27 16:19:00 crc kubenswrapper[4707]: I1127 16:19:00.782337 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/720e549c-1f41-4fb6-b29f-465ac7e174e3-var-lib\") pod \"ovn-controller-ovs-7n9c2\" (UID: \"720e549c-1f41-4fb6-b29f-465ac7e174e3\") " pod="openstack/ovn-controller-ovs-7n9c2" Nov 27 16:19:00 crc kubenswrapper[4707]: I1127 16:19:00.883299 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/9639769b-4439-4ffc-b88b-cba953013bff-ovn-controller-tls-certs\") pod \"ovn-controller-vvkr6\" (UID: \"9639769b-4439-4ffc-b88b-cba953013bff\") " pod="openstack/ovn-controller-vvkr6" Nov 27 16:19:00 crc kubenswrapper[4707]: I1127 16:19:00.883407 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib\" (UniqueName: 
\"kubernetes.io/host-path/720e549c-1f41-4fb6-b29f-465ac7e174e3-var-lib\") pod \"ovn-controller-ovs-7n9c2\" (UID: \"720e549c-1f41-4fb6-b29f-465ac7e174e3\") " pod="openstack/ovn-controller-ovs-7n9c2" Nov 27 16:19:00 crc kubenswrapper[4707]: I1127 16:19:00.883482 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9639769b-4439-4ffc-b88b-cba953013bff-combined-ca-bundle\") pod \"ovn-controller-vvkr6\" (UID: \"9639769b-4439-4ffc-b88b-cba953013bff\") " pod="openstack/ovn-controller-vvkr6" Nov 27 16:19:00 crc kubenswrapper[4707]: I1127 16:19:00.883503 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/9639769b-4439-4ffc-b88b-cba953013bff-var-run-ovn\") pod \"ovn-controller-vvkr6\" (UID: \"9639769b-4439-4ffc-b88b-cba953013bff\") " pod="openstack/ovn-controller-vvkr6" Nov 27 16:19:00 crc kubenswrapper[4707]: I1127 16:19:00.883527 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/9639769b-4439-4ffc-b88b-cba953013bff-var-run\") pod \"ovn-controller-vvkr6\" (UID: \"9639769b-4439-4ffc-b88b-cba953013bff\") " pod="openstack/ovn-controller-vvkr6" Nov 27 16:19:00 crc kubenswrapper[4707]: I1127 16:19:00.883543 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9639769b-4439-4ffc-b88b-cba953013bff-scripts\") pod \"ovn-controller-vvkr6\" (UID: \"9639769b-4439-4ffc-b88b-cba953013bff\") " pod="openstack/ovn-controller-vvkr6" Nov 27 16:19:00 crc kubenswrapper[4707]: I1127 16:19:00.883561 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/720e549c-1f41-4fb6-b29f-465ac7e174e3-var-log\") pod \"ovn-controller-ovs-7n9c2\" (UID: \"720e549c-1f41-4fb6-b29f-465ac7e174e3\") " 
pod="openstack/ovn-controller-ovs-7n9c2" Nov 27 16:19:00 crc kubenswrapper[4707]: I1127 16:19:00.883584 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/720e549c-1f41-4fb6-b29f-465ac7e174e3-var-run\") pod \"ovn-controller-ovs-7n9c2\" (UID: \"720e549c-1f41-4fb6-b29f-465ac7e174e3\") " pod="openstack/ovn-controller-ovs-7n9c2" Nov 27 16:19:00 crc kubenswrapper[4707]: I1127 16:19:00.883609 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/9639769b-4439-4ffc-b88b-cba953013bff-var-log-ovn\") pod \"ovn-controller-vvkr6\" (UID: \"9639769b-4439-4ffc-b88b-cba953013bff\") " pod="openstack/ovn-controller-vvkr6" Nov 27 16:19:00 crc kubenswrapper[4707]: I1127 16:19:00.883628 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/720e549c-1f41-4fb6-b29f-465ac7e174e3-etc-ovs\") pod \"ovn-controller-ovs-7n9c2\" (UID: \"720e549c-1f41-4fb6-b29f-465ac7e174e3\") " pod="openstack/ovn-controller-ovs-7n9c2" Nov 27 16:19:00 crc kubenswrapper[4707]: I1127 16:19:00.883654 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qgscf\" (UniqueName: \"kubernetes.io/projected/9639769b-4439-4ffc-b88b-cba953013bff-kube-api-access-qgscf\") pod \"ovn-controller-vvkr6\" (UID: \"9639769b-4439-4ffc-b88b-cba953013bff\") " pod="openstack/ovn-controller-vvkr6" Nov 27 16:19:00 crc kubenswrapper[4707]: I1127 16:19:00.883681 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/720e549c-1f41-4fb6-b29f-465ac7e174e3-scripts\") pod \"ovn-controller-ovs-7n9c2\" (UID: \"720e549c-1f41-4fb6-b29f-465ac7e174e3\") " pod="openstack/ovn-controller-ovs-7n9c2" Nov 27 16:19:00 crc kubenswrapper[4707]: I1127 16:19:00.883696 4707 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-tw6s8\" (UniqueName: \"kubernetes.io/projected/720e549c-1f41-4fb6-b29f-465ac7e174e3-kube-api-access-tw6s8\") pod \"ovn-controller-ovs-7n9c2\" (UID: \"720e549c-1f41-4fb6-b29f-465ac7e174e3\") " pod="openstack/ovn-controller-ovs-7n9c2" Nov 27 16:19:00 crc kubenswrapper[4707]: I1127 16:19:00.883958 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/720e549c-1f41-4fb6-b29f-465ac7e174e3-var-lib\") pod \"ovn-controller-ovs-7n9c2\" (UID: \"720e549c-1f41-4fb6-b29f-465ac7e174e3\") " pod="openstack/ovn-controller-ovs-7n9c2" Nov 27 16:19:00 crc kubenswrapper[4707]: I1127 16:19:00.884039 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/9639769b-4439-4ffc-b88b-cba953013bff-var-run-ovn\") pod \"ovn-controller-vvkr6\" (UID: \"9639769b-4439-4ffc-b88b-cba953013bff\") " pod="openstack/ovn-controller-vvkr6" Nov 27 16:19:00 crc kubenswrapper[4707]: I1127 16:19:00.884209 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/720e549c-1f41-4fb6-b29f-465ac7e174e3-var-run\") pod \"ovn-controller-ovs-7n9c2\" (UID: \"720e549c-1f41-4fb6-b29f-465ac7e174e3\") " pod="openstack/ovn-controller-ovs-7n9c2" Nov 27 16:19:00 crc kubenswrapper[4707]: I1127 16:19:00.884244 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/720e549c-1f41-4fb6-b29f-465ac7e174e3-etc-ovs\") pod \"ovn-controller-ovs-7n9c2\" (UID: \"720e549c-1f41-4fb6-b29f-465ac7e174e3\") " pod="openstack/ovn-controller-ovs-7n9c2" Nov 27 16:19:00 crc kubenswrapper[4707]: I1127 16:19:00.884297 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/9639769b-4439-4ffc-b88b-cba953013bff-var-run\") pod 
\"ovn-controller-vvkr6\" (UID: \"9639769b-4439-4ffc-b88b-cba953013bff\") " pod="openstack/ovn-controller-vvkr6" Nov 27 16:19:00 crc kubenswrapper[4707]: I1127 16:19:00.884461 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/9639769b-4439-4ffc-b88b-cba953013bff-var-log-ovn\") pod \"ovn-controller-vvkr6\" (UID: \"9639769b-4439-4ffc-b88b-cba953013bff\") " pod="openstack/ovn-controller-vvkr6" Nov 27 16:19:00 crc kubenswrapper[4707]: I1127 16:19:00.884592 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/720e549c-1f41-4fb6-b29f-465ac7e174e3-var-log\") pod \"ovn-controller-ovs-7n9c2\" (UID: \"720e549c-1f41-4fb6-b29f-465ac7e174e3\") " pod="openstack/ovn-controller-ovs-7n9c2" Nov 27 16:19:00 crc kubenswrapper[4707]: I1127 16:19:00.886816 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/720e549c-1f41-4fb6-b29f-465ac7e174e3-scripts\") pod \"ovn-controller-ovs-7n9c2\" (UID: \"720e549c-1f41-4fb6-b29f-465ac7e174e3\") " pod="openstack/ovn-controller-ovs-7n9c2" Nov 27 16:19:00 crc kubenswrapper[4707]: I1127 16:19:00.887229 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9639769b-4439-4ffc-b88b-cba953013bff-scripts\") pod \"ovn-controller-vvkr6\" (UID: \"9639769b-4439-4ffc-b88b-cba953013bff\") " pod="openstack/ovn-controller-vvkr6" Nov 27 16:19:00 crc kubenswrapper[4707]: I1127 16:19:00.897579 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9639769b-4439-4ffc-b88b-cba953013bff-combined-ca-bundle\") pod \"ovn-controller-vvkr6\" (UID: \"9639769b-4439-4ffc-b88b-cba953013bff\") " pod="openstack/ovn-controller-vvkr6" Nov 27 16:19:00 crc kubenswrapper[4707]: I1127 16:19:00.897876 4707 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/9639769b-4439-4ffc-b88b-cba953013bff-ovn-controller-tls-certs\") pod \"ovn-controller-vvkr6\" (UID: \"9639769b-4439-4ffc-b88b-cba953013bff\") " pod="openstack/ovn-controller-vvkr6" Nov 27 16:19:00 crc kubenswrapper[4707]: I1127 16:19:00.901937 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qgscf\" (UniqueName: \"kubernetes.io/projected/9639769b-4439-4ffc-b88b-cba953013bff-kube-api-access-qgscf\") pod \"ovn-controller-vvkr6\" (UID: \"9639769b-4439-4ffc-b88b-cba953013bff\") " pod="openstack/ovn-controller-vvkr6" Nov 27 16:19:00 crc kubenswrapper[4707]: I1127 16:19:00.913638 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tw6s8\" (UniqueName: \"kubernetes.io/projected/720e549c-1f41-4fb6-b29f-465ac7e174e3-kube-api-access-tw6s8\") pod \"ovn-controller-ovs-7n9c2\" (UID: \"720e549c-1f41-4fb6-b29f-465ac7e174e3\") " pod="openstack/ovn-controller-ovs-7n9c2" Nov 27 16:19:00 crc kubenswrapper[4707]: I1127 16:19:00.984901 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-vvkr6" Nov 27 16:19:01 crc kubenswrapper[4707]: I1127 16:19:01.012083 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-7n9c2" Nov 27 16:19:01 crc kubenswrapper[4707]: I1127 16:19:01.079303 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-0"] Nov 27 16:19:01 crc kubenswrapper[4707]: I1127 16:19:01.081424 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-nb-0" Nov 27 16:19:01 crc kubenswrapper[4707]: I1127 16:19:01.083952 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovn-metrics" Nov 27 16:19:01 crc kubenswrapper[4707]: I1127 16:19:01.083992 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-nb-dockercfg-fg2t9" Nov 27 16:19:01 crc kubenswrapper[4707]: I1127 16:19:01.083961 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-config" Nov 27 16:19:01 crc kubenswrapper[4707]: I1127 16:19:01.084283 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-scripts" Nov 27 16:19:01 crc kubenswrapper[4707]: I1127 16:19:01.084415 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-nb-ovndbs" Nov 27 16:19:01 crc kubenswrapper[4707]: I1127 16:19:01.089912 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Nov 27 16:19:01 crc kubenswrapper[4707]: I1127 16:19:01.187475 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/379e0975-7a52-4f96-b931-4c02377d6537-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"379e0975-7a52-4f96-b931-4c02377d6537\") " pod="openstack/ovsdbserver-nb-0" Nov 27 16:19:01 crc kubenswrapper[4707]: I1127 16:19:01.187537 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/379e0975-7a52-4f96-b931-4c02377d6537-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"379e0975-7a52-4f96-b931-4c02377d6537\") " pod="openstack/ovsdbserver-nb-0" Nov 27 16:19:01 crc kubenswrapper[4707]: I1127 16:19:01.187591 4707 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"ovsdbserver-nb-0\" (UID: \"379e0975-7a52-4f96-b931-4c02377d6537\") " pod="openstack/ovsdbserver-nb-0" Nov 27 16:19:01 crc kubenswrapper[4707]: I1127 16:19:01.187702 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qzq7m\" (UniqueName: \"kubernetes.io/projected/379e0975-7a52-4f96-b931-4c02377d6537-kube-api-access-qzq7m\") pod \"ovsdbserver-nb-0\" (UID: \"379e0975-7a52-4f96-b931-4c02377d6537\") " pod="openstack/ovsdbserver-nb-0" Nov 27 16:19:01 crc kubenswrapper[4707]: I1127 16:19:01.187743 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/379e0975-7a52-4f96-b931-4c02377d6537-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"379e0975-7a52-4f96-b931-4c02377d6537\") " pod="openstack/ovsdbserver-nb-0" Nov 27 16:19:01 crc kubenswrapper[4707]: I1127 16:19:01.187801 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/379e0975-7a52-4f96-b931-4c02377d6537-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"379e0975-7a52-4f96-b931-4c02377d6537\") " pod="openstack/ovsdbserver-nb-0" Nov 27 16:19:01 crc kubenswrapper[4707]: I1127 16:19:01.187844 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/379e0975-7a52-4f96-b931-4c02377d6537-config\") pod \"ovsdbserver-nb-0\" (UID: \"379e0975-7a52-4f96-b931-4c02377d6537\") " pod="openstack/ovsdbserver-nb-0" Nov 27 16:19:01 crc kubenswrapper[4707]: I1127 16:19:01.187876 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" 
(UniqueName: \"kubernetes.io/configmap/379e0975-7a52-4f96-b931-4c02377d6537-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"379e0975-7a52-4f96-b931-4c02377d6537\") " pod="openstack/ovsdbserver-nb-0" Nov 27 16:19:01 crc kubenswrapper[4707]: I1127 16:19:01.289665 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/379e0975-7a52-4f96-b931-4c02377d6537-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"379e0975-7a52-4f96-b931-4c02377d6537\") " pod="openstack/ovsdbserver-nb-0" Nov 27 16:19:01 crc kubenswrapper[4707]: I1127 16:19:01.289753 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/379e0975-7a52-4f96-b931-4c02377d6537-config\") pod \"ovsdbserver-nb-0\" (UID: \"379e0975-7a52-4f96-b931-4c02377d6537\") " pod="openstack/ovsdbserver-nb-0" Nov 27 16:19:01 crc kubenswrapper[4707]: I1127 16:19:01.289821 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/379e0975-7a52-4f96-b931-4c02377d6537-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"379e0975-7a52-4f96-b931-4c02377d6537\") " pod="openstack/ovsdbserver-nb-0" Nov 27 16:19:01 crc kubenswrapper[4707]: I1127 16:19:01.290261 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/379e0975-7a52-4f96-b931-4c02377d6537-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"379e0975-7a52-4f96-b931-4c02377d6537\") " pod="openstack/ovsdbserver-nb-0" Nov 27 16:19:01 crc kubenswrapper[4707]: I1127 16:19:01.290878 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/379e0975-7a52-4f96-b931-4c02377d6537-config\") pod \"ovsdbserver-nb-0\" (UID: \"379e0975-7a52-4f96-b931-4c02377d6537\") " pod="openstack/ovsdbserver-nb-0" Nov 27 16:19:01 crc kubenswrapper[4707]: I1127 
16:19:01.290939 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/379e0975-7a52-4f96-b931-4c02377d6537-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"379e0975-7a52-4f96-b931-4c02377d6537\") " pod="openstack/ovsdbserver-nb-0" Nov 27 16:19:01 crc kubenswrapper[4707]: I1127 16:19:01.290977 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/379e0975-7a52-4f96-b931-4c02377d6537-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"379e0975-7a52-4f96-b931-4c02377d6537\") " pod="openstack/ovsdbserver-nb-0" Nov 27 16:19:01 crc kubenswrapper[4707]: I1127 16:19:01.291061 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"ovsdbserver-nb-0\" (UID: \"379e0975-7a52-4f96-b931-4c02377d6537\") " pod="openstack/ovsdbserver-nb-0" Nov 27 16:19:01 crc kubenswrapper[4707]: I1127 16:19:01.291120 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qzq7m\" (UniqueName: \"kubernetes.io/projected/379e0975-7a52-4f96-b931-4c02377d6537-kube-api-access-qzq7m\") pod \"ovsdbserver-nb-0\" (UID: \"379e0975-7a52-4f96-b931-4c02377d6537\") " pod="openstack/ovsdbserver-nb-0" Nov 27 16:19:01 crc kubenswrapper[4707]: I1127 16:19:01.291150 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/379e0975-7a52-4f96-b931-4c02377d6537-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"379e0975-7a52-4f96-b931-4c02377d6537\") " pod="openstack/ovsdbserver-nb-0" Nov 27 16:19:01 crc kubenswrapper[4707]: I1127 16:19:01.291263 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/379e0975-7a52-4f96-b931-4c02377d6537-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"379e0975-7a52-4f96-b931-4c02377d6537\") " pod="openstack/ovsdbserver-nb-0" Nov 27 16:19:01 crc kubenswrapper[4707]: I1127 16:19:01.291944 4707 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"ovsdbserver-nb-0\" (UID: \"379e0975-7a52-4f96-b931-4c02377d6537\") device mount path \"/mnt/openstack/pv12\"" pod="openstack/ovsdbserver-nb-0" Nov 27 16:19:01 crc kubenswrapper[4707]: I1127 16:19:01.297057 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/379e0975-7a52-4f96-b931-4c02377d6537-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"379e0975-7a52-4f96-b931-4c02377d6537\") " pod="openstack/ovsdbserver-nb-0" Nov 27 16:19:01 crc kubenswrapper[4707]: I1127 16:19:01.297344 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/379e0975-7a52-4f96-b931-4c02377d6537-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"379e0975-7a52-4f96-b931-4c02377d6537\") " pod="openstack/ovsdbserver-nb-0" Nov 27 16:19:01 crc kubenswrapper[4707]: I1127 16:19:01.299529 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/379e0975-7a52-4f96-b931-4c02377d6537-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"379e0975-7a52-4f96-b931-4c02377d6537\") " pod="openstack/ovsdbserver-nb-0" Nov 27 16:19:01 crc kubenswrapper[4707]: I1127 16:19:01.319533 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qzq7m\" (UniqueName: \"kubernetes.io/projected/379e0975-7a52-4f96-b931-4c02377d6537-kube-api-access-qzq7m\") pod \"ovsdbserver-nb-0\" (UID: 
\"379e0975-7a52-4f96-b931-4c02377d6537\") " pod="openstack/ovsdbserver-nb-0" Nov 27 16:19:01 crc kubenswrapper[4707]: I1127 16:19:01.323661 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"ovsdbserver-nb-0\" (UID: \"379e0975-7a52-4f96-b931-4c02377d6537\") " pod="openstack/ovsdbserver-nb-0" Nov 27 16:19:01 crc kubenswrapper[4707]: I1127 16:19:01.406958 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Nov 27 16:19:03 crc kubenswrapper[4707]: I1127 16:19:03.623893 4707 patch_prober.go:28] interesting pod/machine-config-daemon-c995m container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 27 16:19:03 crc kubenswrapper[4707]: I1127 16:19:03.624245 4707 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-c995m" podUID="a83beb0d-8dd1-434a-ace2-933f98e3956f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 27 16:19:03 crc kubenswrapper[4707]: I1127 16:19:03.868872 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-0"] Nov 27 16:19:03 crc kubenswrapper[4707]: I1127 16:19:03.870544 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-0" Nov 27 16:19:03 crc kubenswrapper[4707]: I1127 16:19:03.872703 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-sb-dockercfg-tnrpm" Nov 27 16:19:03 crc kubenswrapper[4707]: I1127 16:19:03.872608 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-scripts" Nov 27 16:19:03 crc kubenswrapper[4707]: I1127 16:19:03.872753 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-sb-ovndbs" Nov 27 16:19:03 crc kubenswrapper[4707]: I1127 16:19:03.876278 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-config" Nov 27 16:19:03 crc kubenswrapper[4707]: I1127 16:19:03.884502 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Nov 27 16:19:03 crc kubenswrapper[4707]: I1127 16:19:03.947613 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/d6f4624a-1407-4ff8-bd7f-90f2a0fd6718-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"d6f4624a-1407-4ff8-bd7f-90f2a0fd6718\") " pod="openstack/ovsdbserver-sb-0" Nov 27 16:19:03 crc kubenswrapper[4707]: I1127 16:19:03.947691 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d6f4624a-1407-4ff8-bd7f-90f2a0fd6718-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"d6f4624a-1407-4ff8-bd7f-90f2a0fd6718\") " pod="openstack/ovsdbserver-sb-0" Nov 27 16:19:03 crc kubenswrapper[4707]: I1127 16:19:03.947723 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/d6f4624a-1407-4ff8-bd7f-90f2a0fd6718-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" 
(UID: \"d6f4624a-1407-4ff8-bd7f-90f2a0fd6718\") " pod="openstack/ovsdbserver-sb-0" Nov 27 16:19:03 crc kubenswrapper[4707]: I1127 16:19:03.947747 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d6f4624a-1407-4ff8-bd7f-90f2a0fd6718-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"d6f4624a-1407-4ff8-bd7f-90f2a0fd6718\") " pod="openstack/ovsdbserver-sb-0" Nov 27 16:19:03 crc kubenswrapper[4707]: I1127 16:19:03.947811 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/d6f4624a-1407-4ff8-bd7f-90f2a0fd6718-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"d6f4624a-1407-4ff8-bd7f-90f2a0fd6718\") " pod="openstack/ovsdbserver-sb-0" Nov 27 16:19:03 crc kubenswrapper[4707]: I1127 16:19:03.947916 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-67tpr\" (UniqueName: \"kubernetes.io/projected/d6f4624a-1407-4ff8-bd7f-90f2a0fd6718-kube-api-access-67tpr\") pod \"ovsdbserver-sb-0\" (UID: \"d6f4624a-1407-4ff8-bd7f-90f2a0fd6718\") " pod="openstack/ovsdbserver-sb-0" Nov 27 16:19:03 crc kubenswrapper[4707]: I1127 16:19:03.947951 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"ovsdbserver-sb-0\" (UID: \"d6f4624a-1407-4ff8-bd7f-90f2a0fd6718\") " pod="openstack/ovsdbserver-sb-0" Nov 27 16:19:03 crc kubenswrapper[4707]: I1127 16:19:03.947979 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d6f4624a-1407-4ff8-bd7f-90f2a0fd6718-config\") pod \"ovsdbserver-sb-0\" (UID: \"d6f4624a-1407-4ff8-bd7f-90f2a0fd6718\") " pod="openstack/ovsdbserver-sb-0" Nov 27 16:19:04 crc 
kubenswrapper[4707]: I1127 16:19:04.049440 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/d6f4624a-1407-4ff8-bd7f-90f2a0fd6718-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"d6f4624a-1407-4ff8-bd7f-90f2a0fd6718\") " pod="openstack/ovsdbserver-sb-0" Nov 27 16:19:04 crc kubenswrapper[4707]: I1127 16:19:04.049516 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-67tpr\" (UniqueName: \"kubernetes.io/projected/d6f4624a-1407-4ff8-bd7f-90f2a0fd6718-kube-api-access-67tpr\") pod \"ovsdbserver-sb-0\" (UID: \"d6f4624a-1407-4ff8-bd7f-90f2a0fd6718\") " pod="openstack/ovsdbserver-sb-0" Nov 27 16:19:04 crc kubenswrapper[4707]: I1127 16:19:04.049542 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"ovsdbserver-sb-0\" (UID: \"d6f4624a-1407-4ff8-bd7f-90f2a0fd6718\") " pod="openstack/ovsdbserver-sb-0" Nov 27 16:19:04 crc kubenswrapper[4707]: I1127 16:19:04.049574 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d6f4624a-1407-4ff8-bd7f-90f2a0fd6718-config\") pod \"ovsdbserver-sb-0\" (UID: \"d6f4624a-1407-4ff8-bd7f-90f2a0fd6718\") " pod="openstack/ovsdbserver-sb-0" Nov 27 16:19:04 crc kubenswrapper[4707]: I1127 16:19:04.049614 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/d6f4624a-1407-4ff8-bd7f-90f2a0fd6718-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"d6f4624a-1407-4ff8-bd7f-90f2a0fd6718\") " pod="openstack/ovsdbserver-sb-0" Nov 27 16:19:04 crc kubenswrapper[4707]: I1127 16:19:04.049641 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/d6f4624a-1407-4ff8-bd7f-90f2a0fd6718-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"d6f4624a-1407-4ff8-bd7f-90f2a0fd6718\") " pod="openstack/ovsdbserver-sb-0" Nov 27 16:19:04 crc kubenswrapper[4707]: I1127 16:19:04.049663 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/d6f4624a-1407-4ff8-bd7f-90f2a0fd6718-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"d6f4624a-1407-4ff8-bd7f-90f2a0fd6718\") " pod="openstack/ovsdbserver-sb-0" Nov 27 16:19:04 crc kubenswrapper[4707]: I1127 16:19:04.049683 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d6f4624a-1407-4ff8-bd7f-90f2a0fd6718-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"d6f4624a-1407-4ff8-bd7f-90f2a0fd6718\") " pod="openstack/ovsdbserver-sb-0" Nov 27 16:19:04 crc kubenswrapper[4707]: I1127 16:19:04.049858 4707 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"ovsdbserver-sb-0\" (UID: \"d6f4624a-1407-4ff8-bd7f-90f2a0fd6718\") device mount path \"/mnt/openstack/pv04\"" pod="openstack/ovsdbserver-sb-0" Nov 27 16:19:04 crc kubenswrapper[4707]: I1127 16:19:04.050668 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d6f4624a-1407-4ff8-bd7f-90f2a0fd6718-config\") pod \"ovsdbserver-sb-0\" (UID: \"d6f4624a-1407-4ff8-bd7f-90f2a0fd6718\") " pod="openstack/ovsdbserver-sb-0" Nov 27 16:19:04 crc kubenswrapper[4707]: I1127 16:19:04.050949 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/d6f4624a-1407-4ff8-bd7f-90f2a0fd6718-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"d6f4624a-1407-4ff8-bd7f-90f2a0fd6718\") " 
pod="openstack/ovsdbserver-sb-0" Nov 27 16:19:04 crc kubenswrapper[4707]: I1127 16:19:04.051635 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d6f4624a-1407-4ff8-bd7f-90f2a0fd6718-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"d6f4624a-1407-4ff8-bd7f-90f2a0fd6718\") " pod="openstack/ovsdbserver-sb-0" Nov 27 16:19:04 crc kubenswrapper[4707]: I1127 16:19:04.055930 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/d6f4624a-1407-4ff8-bd7f-90f2a0fd6718-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"d6f4624a-1407-4ff8-bd7f-90f2a0fd6718\") " pod="openstack/ovsdbserver-sb-0" Nov 27 16:19:04 crc kubenswrapper[4707]: I1127 16:19:04.057632 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/d6f4624a-1407-4ff8-bd7f-90f2a0fd6718-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"d6f4624a-1407-4ff8-bd7f-90f2a0fd6718\") " pod="openstack/ovsdbserver-sb-0" Nov 27 16:19:04 crc kubenswrapper[4707]: I1127 16:19:04.058297 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d6f4624a-1407-4ff8-bd7f-90f2a0fd6718-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"d6f4624a-1407-4ff8-bd7f-90f2a0fd6718\") " pod="openstack/ovsdbserver-sb-0" Nov 27 16:19:04 crc kubenswrapper[4707]: I1127 16:19:04.069525 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-67tpr\" (UniqueName: \"kubernetes.io/projected/d6f4624a-1407-4ff8-bd7f-90f2a0fd6718-kube-api-access-67tpr\") pod \"ovsdbserver-sb-0\" (UID: \"d6f4624a-1407-4ff8-bd7f-90f2a0fd6718\") " pod="openstack/ovsdbserver-sb-0" Nov 27 16:19:04 crc kubenswrapper[4707]: I1127 16:19:04.069649 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"ovsdbserver-sb-0\" (UID: \"d6f4624a-1407-4ff8-bd7f-90f2a0fd6718\") " pod="openstack/ovsdbserver-sb-0" Nov 27 16:19:04 crc kubenswrapper[4707]: I1127 16:19:04.211805 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Nov 27 16:19:04 crc kubenswrapper[4707]: E1127 16:19:04.460744 4707 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Nov 27 16:19:04 crc kubenswrapper[4707]: E1127 16:19:04.460998 4707 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:ndfhb5h667h568h584h5f9h58dh565h664h587h597h577h64bh5c4h66fh647hbdh68ch5c5h68dh686h5f7h64hd7hc6h55fh57bh98h57fh87h5fh57fq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-mvcgn,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-78dd6ddcc-g4jvb_openstack(cb1028a6-b188-4015-9504-94a1d1ef7f6d): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Nov 27 16:19:04 crc kubenswrapper[4707]: E1127 16:19:04.462676 4707 pod_workers.go:1301] "Error 
syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-78dd6ddcc-g4jvb" podUID="cb1028a6-b188-4015-9504-94a1d1ef7f6d" Nov 27 16:19:04 crc kubenswrapper[4707]: E1127 16:19:04.523255 4707 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Nov 27 16:19:04 crc kubenswrapper[4707]: E1127 16:19:04.523562 4707 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nffh5bdhf4h5f8h79h55h77h58fh56dh7bh6fh578hbch55dh68h56bhd9h65dh57ch658hc9h566h666h688h58h65dh684h5d7h6ch575h5d6h88q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-hxrtl,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-675f4bcbfc-5nb8x_openstack(0cb7bf93-29b6-402c-86c7-24d3a6b1b135): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Nov 27 16:19:04 crc kubenswrapper[4707]: E1127 16:19:04.525044 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" 
pod="openstack/dnsmasq-dns-675f4bcbfc-5nb8x" podUID="0cb7bf93-29b6-402c-86c7-24d3a6b1b135" Nov 27 16:19:04 crc kubenswrapper[4707]: I1127 16:19:04.996789 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Nov 27 16:19:05 crc kubenswrapper[4707]: W1127 16:19:05.058204 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb31a7b86_c43f_4123_a33d_ffba2ee3d015.slice/crio-df01efea3bad1d8159514b124f5783efe2cc1351406f89eef8a9867fc08130c6 WatchSource:0}: Error finding container df01efea3bad1d8159514b124f5783efe2cc1351406f89eef8a9867fc08130c6: Status 404 returned error can't find the container with id df01efea3bad1d8159514b124f5783efe2cc1351406f89eef8a9867fc08130c6 Nov 27 16:19:05 crc kubenswrapper[4707]: I1127 16:19:05.272965 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"b31a7b86-c43f-4123-a33d-ffba2ee3d015","Type":"ContainerStarted","Data":"df01efea3bad1d8159514b124f5783efe2cc1351406f89eef8a9867fc08130c6"} Nov 27 16:19:05 crc kubenswrapper[4707]: I1127 16:19:05.275408 4707 generic.go:334] "Generic (PLEG): container finished" podID="98f69e7a-a5b6-4358-9699-5a8f83ff8a5f" containerID="7211bdbdadd3cf400172dabfa829ba406fdfc94d367857749a284fe7f21b10a6" exitCode=0 Nov 27 16:19:05 crc kubenswrapper[4707]: I1127 16:19:05.275553 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-hdvpv" event={"ID":"98f69e7a-a5b6-4358-9699-5a8f83ff8a5f","Type":"ContainerDied","Data":"7211bdbdadd3cf400172dabfa829ba406fdfc94d367857749a284fe7f21b10a6"} Nov 27 16:19:05 crc kubenswrapper[4707]: I1127 16:19:05.278275 4707 generic.go:334] "Generic (PLEG): container finished" podID="a2e43d9c-f196-474a-ae4b-ff985c9ab550" containerID="30f2f1a29e0ab7cf89e446b56c356ec4def4a7db9ee7127f65ab83ebc82a31c2" exitCode=0 Nov 27 16:19:05 crc kubenswrapper[4707]: I1127 16:19:05.278469 4707 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-wzqgl" event={"ID":"a2e43d9c-f196-474a-ae4b-ff985c9ab550","Type":"ContainerDied","Data":"30f2f1a29e0ab7cf89e446b56c356ec4def4a7db9ee7127f65ab83ebc82a31c2"} Nov 27 16:19:05 crc kubenswrapper[4707]: I1127 16:19:05.323566 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-vvkr6"] Nov 27 16:19:05 crc kubenswrapper[4707]: W1127 16:19:05.329872 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9639769b_4439_4ffc_b88b_cba953013bff.slice/crio-73e1f8bf79307d072709c936a7213df69beb659c4e73995bf81641fe2a78fdce WatchSource:0}: Error finding container 73e1f8bf79307d072709c936a7213df69beb659c4e73995bf81641fe2a78fdce: Status 404 returned error can't find the container with id 73e1f8bf79307d072709c936a7213df69beb659c4e73995bf81641fe2a78fdce Nov 27 16:19:05 crc kubenswrapper[4707]: I1127 16:19:05.441413 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Nov 27 16:19:05 crc kubenswrapper[4707]: I1127 16:19:05.448217 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Nov 27 16:19:05 crc kubenswrapper[4707]: W1127 16:19:05.451211 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc5402fa0_f1b7_4561_95f0_cb690caf9b58.slice/crio-fa88d65e32f40e332d660682ab628db3521696bc1e32e9ef16dc79ebe5860413 WatchSource:0}: Error finding container fa88d65e32f40e332d660682ab628db3521696bc1e32e9ef16dc79ebe5860413: Status 404 returned error can't find the container with id fa88d65e32f40e332d660682ab628db3521696bc1e32e9ef16dc79ebe5860413 Nov 27 16:19:05 crc kubenswrapper[4707]: I1127 16:19:05.455024 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Nov 27 16:19:05 crc kubenswrapper[4707]: I1127 16:19:05.461454 4707 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Nov 27 16:19:05 crc kubenswrapper[4707]: I1127 16:19:05.466777 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Nov 27 16:19:05 crc kubenswrapper[4707]: W1127 16:19:05.476522 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod84e88e9e_3edb_45cd_9973_1447587f7adc.slice/crio-f846ac9a7a1c7b0801053d65cd999df3c14a6c793f672c52dd8cfe90e3fe40e3 WatchSource:0}: Error finding container f846ac9a7a1c7b0801053d65cd999df3c14a6c793f672c52dd8cfe90e3fe40e3: Status 404 returned error can't find the container with id f846ac9a7a1c7b0801053d65cd999df3c14a6c793f672c52dd8cfe90e3fe40e3 Nov 27 16:19:05 crc kubenswrapper[4707]: W1127 16:19:05.484835 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode4b42d58_27cb_455f_9994_ae15f433e008.slice/crio-46d95a72af5627980eabc902e847abd99f86c45e45bc6e82f6caf4221a9a8949 WatchSource:0}: Error finding container 46d95a72af5627980eabc902e847abd99f86c45e45bc6e82f6caf4221a9a8949: Status 404 returned error can't find the container with id 46d95a72af5627980eabc902e847abd99f86c45e45bc6e82f6caf4221a9a8949 Nov 27 16:19:05 crc kubenswrapper[4707]: I1127 16:19:05.526994 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-g4jvb" Nov 27 16:19:05 crc kubenswrapper[4707]: I1127 16:19:05.564149 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Nov 27 16:19:05 crc kubenswrapper[4707]: I1127 16:19:05.579508 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cb1028a6-b188-4015-9504-94a1d1ef7f6d-dns-svc\") pod \"cb1028a6-b188-4015-9504-94a1d1ef7f6d\" (UID: \"cb1028a6-b188-4015-9504-94a1d1ef7f6d\") " Nov 27 16:19:05 crc kubenswrapper[4707]: I1127 16:19:05.579635 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mvcgn\" (UniqueName: \"kubernetes.io/projected/cb1028a6-b188-4015-9504-94a1d1ef7f6d-kube-api-access-mvcgn\") pod \"cb1028a6-b188-4015-9504-94a1d1ef7f6d\" (UID: \"cb1028a6-b188-4015-9504-94a1d1ef7f6d\") " Nov 27 16:19:05 crc kubenswrapper[4707]: I1127 16:19:05.579659 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cb1028a6-b188-4015-9504-94a1d1ef7f6d-config\") pod \"cb1028a6-b188-4015-9504-94a1d1ef7f6d\" (UID: \"cb1028a6-b188-4015-9504-94a1d1ef7f6d\") " Nov 27 16:19:05 crc kubenswrapper[4707]: I1127 16:19:05.580453 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cb1028a6-b188-4015-9504-94a1d1ef7f6d-config" (OuterVolumeSpecName: "config") pod "cb1028a6-b188-4015-9504-94a1d1ef7f6d" (UID: "cb1028a6-b188-4015-9504-94a1d1ef7f6d"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 16:19:05 crc kubenswrapper[4707]: I1127 16:19:05.580843 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cb1028a6-b188-4015-9504-94a1d1ef7f6d-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "cb1028a6-b188-4015-9504-94a1d1ef7f6d" (UID: "cb1028a6-b188-4015-9504-94a1d1ef7f6d"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 16:19:05 crc kubenswrapper[4707]: I1127 16:19:05.585666 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cb1028a6-b188-4015-9504-94a1d1ef7f6d-kube-api-access-mvcgn" (OuterVolumeSpecName: "kube-api-access-mvcgn") pod "cb1028a6-b188-4015-9504-94a1d1ef7f6d" (UID: "cb1028a6-b188-4015-9504-94a1d1ef7f6d"). InnerVolumeSpecName "kube-api-access-mvcgn". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 16:19:05 crc kubenswrapper[4707]: I1127 16:19:05.640772 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Nov 27 16:19:05 crc kubenswrapper[4707]: I1127 16:19:05.681022 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mvcgn\" (UniqueName: \"kubernetes.io/projected/cb1028a6-b188-4015-9504-94a1d1ef7f6d-kube-api-access-mvcgn\") on node \"crc\" DevicePath \"\"" Nov 27 16:19:05 crc kubenswrapper[4707]: I1127 16:19:05.681047 4707 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cb1028a6-b188-4015-9504-94a1d1ef7f6d-config\") on node \"crc\" DevicePath \"\"" Nov 27 16:19:05 crc kubenswrapper[4707]: I1127 16:19:05.681056 4707 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cb1028a6-b188-4015-9504-94a1d1ef7f6d-dns-svc\") on node \"crc\" DevicePath \"\"" Nov 27 16:19:05 crc kubenswrapper[4707]: I1127 16:19:05.747441 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-5nb8x" Nov 27 16:19:05 crc kubenswrapper[4707]: I1127 16:19:05.884375 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hxrtl\" (UniqueName: \"kubernetes.io/projected/0cb7bf93-29b6-402c-86c7-24d3a6b1b135-kube-api-access-hxrtl\") pod \"0cb7bf93-29b6-402c-86c7-24d3a6b1b135\" (UID: \"0cb7bf93-29b6-402c-86c7-24d3a6b1b135\") " Nov 27 16:19:05 crc kubenswrapper[4707]: I1127 16:19:05.884683 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0cb7bf93-29b6-402c-86c7-24d3a6b1b135-config\") pod \"0cb7bf93-29b6-402c-86c7-24d3a6b1b135\" (UID: \"0cb7bf93-29b6-402c-86c7-24d3a6b1b135\") " Nov 27 16:19:05 crc kubenswrapper[4707]: I1127 16:19:05.885024 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0cb7bf93-29b6-402c-86c7-24d3a6b1b135-config" (OuterVolumeSpecName: "config") pod "0cb7bf93-29b6-402c-86c7-24d3a6b1b135" (UID: "0cb7bf93-29b6-402c-86c7-24d3a6b1b135"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 16:19:05 crc kubenswrapper[4707]: I1127 16:19:05.888748 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0cb7bf93-29b6-402c-86c7-24d3a6b1b135-kube-api-access-hxrtl" (OuterVolumeSpecName: "kube-api-access-hxrtl") pod "0cb7bf93-29b6-402c-86c7-24d3a6b1b135" (UID: "0cb7bf93-29b6-402c-86c7-24d3a6b1b135"). InnerVolumeSpecName "kube-api-access-hxrtl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 16:19:05 crc kubenswrapper[4707]: I1127 16:19:05.986071 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hxrtl\" (UniqueName: \"kubernetes.io/projected/0cb7bf93-29b6-402c-86c7-24d3a6b1b135-kube-api-access-hxrtl\") on node \"crc\" DevicePath \"\"" Nov 27 16:19:05 crc kubenswrapper[4707]: I1127 16:19:05.986110 4707 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0cb7bf93-29b6-402c-86c7-24d3a6b1b135-config\") on node \"crc\" DevicePath \"\"" Nov 27 16:19:06 crc kubenswrapper[4707]: I1127 16:19:06.235244 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-7n9c2"] Nov 27 16:19:06 crc kubenswrapper[4707]: I1127 16:19:06.285988 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"d6f4624a-1407-4ff8-bd7f-90f2a0fd6718","Type":"ContainerStarted","Data":"9b23cae2b3abb658d97d9c6c94fb151949cbacc880f4cde406755667eb8924cc"} Nov 27 16:19:06 crc kubenswrapper[4707]: I1127 16:19:06.287911 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-5nb8x" event={"ID":"0cb7bf93-29b6-402c-86c7-24d3a6b1b135","Type":"ContainerDied","Data":"0dfacdec9c6ed930106e6073d237e486bfb76999d87ad2fbb23bb02cd7798467"} Nov 27 16:19:06 crc kubenswrapper[4707]: I1127 16:19:06.287967 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-5nb8x" Nov 27 16:19:06 crc kubenswrapper[4707]: I1127 16:19:06.290388 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"c5402fa0-f1b7-4561-95f0-cb690caf9b58","Type":"ContainerStarted","Data":"fa88d65e32f40e332d660682ab628db3521696bc1e32e9ef16dc79ebe5860413"} Nov 27 16:19:06 crc kubenswrapper[4707]: I1127 16:19:06.291966 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"84e88e9e-3edb-45cd-9973-1447587f7adc","Type":"ContainerStarted","Data":"f846ac9a7a1c7b0801053d65cd999df3c14a6c793f672c52dd8cfe90e3fe40e3"} Nov 27 16:19:06 crc kubenswrapper[4707]: I1127 16:19:06.292931 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"517a2efb-7c9f-4c93-876b-5962da604ef8","Type":"ContainerStarted","Data":"c53e6c79d3296ea4c3efd8e59f1e7a6e8cc4c6130b53b6eb4b6cb8b7c8cf5785"} Nov 27 16:19:06 crc kubenswrapper[4707]: I1127 16:19:06.293721 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-vvkr6" event={"ID":"9639769b-4439-4ffc-b88b-cba953013bff","Type":"ContainerStarted","Data":"73e1f8bf79307d072709c936a7213df69beb659c4e73995bf81641fe2a78fdce"} Nov 27 16:19:06 crc kubenswrapper[4707]: I1127 16:19:06.295784 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-hdvpv" event={"ID":"98f69e7a-a5b6-4358-9699-5a8f83ff8a5f","Type":"ContainerStarted","Data":"35266a9739204f8671d3b4f0d14938bb35ca21739485bf61dc157dd96a74b4fe"} Nov 27 16:19:06 crc kubenswrapper[4707]: I1127 16:19:06.296288 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-57d769cc4f-hdvpv" Nov 27 16:19:06 crc kubenswrapper[4707]: I1127 16:19:06.297552 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" 
event={"ID":"379e0975-7a52-4f96-b931-4c02377d6537","Type":"ContainerStarted","Data":"fd93f719de7727fde6344801e7122ab3ee5d846360ee24d4fe0ce4c40f6c7f8d"} Nov 27 16:19:06 crc kubenswrapper[4707]: I1127 16:19:06.298613 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"da629d4e-1b93-4623-a577-419c6dca0f14","Type":"ContainerStarted","Data":"74835a2247bf6b616df3b3f8299ac949241a86ff395a5463aeea31a66efc5660"} Nov 27 16:19:06 crc kubenswrapper[4707]: I1127 16:19:06.300237 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-g4jvb" event={"ID":"cb1028a6-b188-4015-9504-94a1d1ef7f6d","Type":"ContainerDied","Data":"e8b257d50607bbfb491e7f03a0ea5cb542348eeb3712e5c6a069edc3dcf76f70"} Nov 27 16:19:06 crc kubenswrapper[4707]: I1127 16:19:06.300292 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-g4jvb" Nov 27 16:19:06 crc kubenswrapper[4707]: I1127 16:19:06.305550 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-wzqgl" event={"ID":"a2e43d9c-f196-474a-ae4b-ff985c9ab550","Type":"ContainerStarted","Data":"b2ff30771a391017a2a5339dc9121383bd5d0d4d2b7b9711f9e09d7f0c0b48ad"} Nov 27 16:19:06 crc kubenswrapper[4707]: I1127 16:19:06.305828 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-666b6646f7-wzqgl" Nov 27 16:19:06 crc kubenswrapper[4707]: I1127 16:19:06.309834 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"e4b42d58-27cb-455f-9994-ae15f433e008","Type":"ContainerStarted","Data":"46d95a72af5627980eabc902e847abd99f86c45e45bc6e82f6caf4221a9a8949"} Nov 27 16:19:06 crc kubenswrapper[4707]: I1127 16:19:06.316682 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-57d769cc4f-hdvpv" podStartSLOduration=2.860476543 
podStartE2EDuration="16.316662521s" podCreationTimestamp="2025-11-27 16:18:50 +0000 UTC" firstStartedPulling="2025-11-27 16:18:51.198748405 +0000 UTC m=+906.830197173" lastFinishedPulling="2025-11-27 16:19:04.654934343 +0000 UTC m=+920.286383151" observedRunningTime="2025-11-27 16:19:06.310276365 +0000 UTC m=+921.941725133" watchObservedRunningTime="2025-11-27 16:19:06.316662521 +0000 UTC m=+921.948111289" Nov 27 16:19:06 crc kubenswrapper[4707]: I1127 16:19:06.327871 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-666b6646f7-wzqgl" podStartSLOduration=2.610815304 podStartE2EDuration="16.327859203s" podCreationTimestamp="2025-11-27 16:18:50 +0000 UTC" firstStartedPulling="2025-11-27 16:18:50.919814624 +0000 UTC m=+906.551263392" lastFinishedPulling="2025-11-27 16:19:04.636858523 +0000 UTC m=+920.268307291" observedRunningTime="2025-11-27 16:19:06.325493225 +0000 UTC m=+921.956941993" watchObservedRunningTime="2025-11-27 16:19:06.327859203 +0000 UTC m=+921.959307961" Nov 27 16:19:06 crc kubenswrapper[4707]: I1127 16:19:06.369751 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-5nb8x"] Nov 27 16:19:06 crc kubenswrapper[4707]: I1127 16:19:06.375948 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-5nb8x"] Nov 27 16:19:06 crc kubenswrapper[4707]: I1127 16:19:06.390122 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-g4jvb"] Nov 27 16:19:06 crc kubenswrapper[4707]: I1127 16:19:06.396505 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-g4jvb"] Nov 27 16:19:06 crc kubenswrapper[4707]: W1127 16:19:06.734159 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod720e549c_1f41_4fb6_b29f_465ac7e174e3.slice/crio-b5a2695adcfa9d7552e453fc4e8c7e7f32ef99563b40fafafcd26d045fd964f5 
WatchSource:0}: Error finding container b5a2695adcfa9d7552e453fc4e8c7e7f32ef99563b40fafafcd26d045fd964f5: Status 404 returned error can't find the container with id b5a2695adcfa9d7552e453fc4e8c7e7f32ef99563b40fafafcd26d045fd964f5 Nov 27 16:19:07 crc kubenswrapper[4707]: I1127 16:19:07.216007 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0cb7bf93-29b6-402c-86c7-24d3a6b1b135" path="/var/lib/kubelet/pods/0cb7bf93-29b6-402c-86c7-24d3a6b1b135/volumes" Nov 27 16:19:07 crc kubenswrapper[4707]: I1127 16:19:07.216746 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cb1028a6-b188-4015-9504-94a1d1ef7f6d" path="/var/lib/kubelet/pods/cb1028a6-b188-4015-9504-94a1d1ef7f6d/volumes" Nov 27 16:19:07 crc kubenswrapper[4707]: I1127 16:19:07.319679 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-7n9c2" event={"ID":"720e549c-1f41-4fb6-b29f-465ac7e174e3","Type":"ContainerStarted","Data":"b5a2695adcfa9d7552e453fc4e8c7e7f32ef99563b40fafafcd26d045fd964f5"} Nov 27 16:19:10 crc kubenswrapper[4707]: I1127 16:19:10.443579 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-666b6646f7-wzqgl" Nov 27 16:19:10 crc kubenswrapper[4707]: I1127 16:19:10.729499 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-57d769cc4f-hdvpv" Nov 27 16:19:10 crc kubenswrapper[4707]: I1127 16:19:10.799082 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-wzqgl"] Nov 27 16:19:11 crc kubenswrapper[4707]: I1127 16:19:11.076105 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-qcjdd"] Nov 27 16:19:11 crc kubenswrapper[4707]: I1127 16:19:11.077641 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-qcjdd" Nov 27 16:19:11 crc kubenswrapper[4707]: I1127 16:19:11.106511 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-qcjdd"] Nov 27 16:19:11 crc kubenswrapper[4707]: I1127 16:19:11.184912 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ce5bfb39-c7b0-4317-b53f-5f4280341da4-catalog-content\") pod \"certified-operators-qcjdd\" (UID: \"ce5bfb39-c7b0-4317-b53f-5f4280341da4\") " pod="openshift-marketplace/certified-operators-qcjdd" Nov 27 16:19:11 crc kubenswrapper[4707]: I1127 16:19:11.184962 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ce5bfb39-c7b0-4317-b53f-5f4280341da4-utilities\") pod \"certified-operators-qcjdd\" (UID: \"ce5bfb39-c7b0-4317-b53f-5f4280341da4\") " pod="openshift-marketplace/certified-operators-qcjdd" Nov 27 16:19:11 crc kubenswrapper[4707]: I1127 16:19:11.185004 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c9dbs\" (UniqueName: \"kubernetes.io/projected/ce5bfb39-c7b0-4317-b53f-5f4280341da4-kube-api-access-c9dbs\") pod \"certified-operators-qcjdd\" (UID: \"ce5bfb39-c7b0-4317-b53f-5f4280341da4\") " pod="openshift-marketplace/certified-operators-qcjdd" Nov 27 16:19:11 crc kubenswrapper[4707]: I1127 16:19:11.286774 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ce5bfb39-c7b0-4317-b53f-5f4280341da4-catalog-content\") pod \"certified-operators-qcjdd\" (UID: \"ce5bfb39-c7b0-4317-b53f-5f4280341da4\") " pod="openshift-marketplace/certified-operators-qcjdd" Nov 27 16:19:11 crc kubenswrapper[4707]: I1127 16:19:11.286829 4707 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ce5bfb39-c7b0-4317-b53f-5f4280341da4-utilities\") pod \"certified-operators-qcjdd\" (UID: \"ce5bfb39-c7b0-4317-b53f-5f4280341da4\") " pod="openshift-marketplace/certified-operators-qcjdd" Nov 27 16:19:11 crc kubenswrapper[4707]: I1127 16:19:11.286874 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c9dbs\" (UniqueName: \"kubernetes.io/projected/ce5bfb39-c7b0-4317-b53f-5f4280341da4-kube-api-access-c9dbs\") pod \"certified-operators-qcjdd\" (UID: \"ce5bfb39-c7b0-4317-b53f-5f4280341da4\") " pod="openshift-marketplace/certified-operators-qcjdd" Nov 27 16:19:11 crc kubenswrapper[4707]: I1127 16:19:11.287274 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ce5bfb39-c7b0-4317-b53f-5f4280341da4-utilities\") pod \"certified-operators-qcjdd\" (UID: \"ce5bfb39-c7b0-4317-b53f-5f4280341da4\") " pod="openshift-marketplace/certified-operators-qcjdd" Nov 27 16:19:11 crc kubenswrapper[4707]: I1127 16:19:11.287722 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ce5bfb39-c7b0-4317-b53f-5f4280341da4-catalog-content\") pod \"certified-operators-qcjdd\" (UID: \"ce5bfb39-c7b0-4317-b53f-5f4280341da4\") " pod="openshift-marketplace/certified-operators-qcjdd" Nov 27 16:19:11 crc kubenswrapper[4707]: I1127 16:19:11.330224 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c9dbs\" (UniqueName: \"kubernetes.io/projected/ce5bfb39-c7b0-4317-b53f-5f4280341da4-kube-api-access-c9dbs\") pod \"certified-operators-qcjdd\" (UID: \"ce5bfb39-c7b0-4317-b53f-5f4280341da4\") " pod="openshift-marketplace/certified-operators-qcjdd" Nov 27 16:19:11 crc kubenswrapper[4707]: I1127 16:19:11.351871 4707 kuberuntime_container.go:808] "Killing container with a grace 
period" pod="openstack/dnsmasq-dns-666b6646f7-wzqgl" podUID="a2e43d9c-f196-474a-ae4b-ff985c9ab550" containerName="dnsmasq-dns" containerID="cri-o://b2ff30771a391017a2a5339dc9121383bd5d0d4d2b7b9711f9e09d7f0c0b48ad" gracePeriod=10 Nov 27 16:19:11 crc kubenswrapper[4707]: I1127 16:19:11.401062 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-qcjdd" Nov 27 16:19:12 crc kubenswrapper[4707]: I1127 16:19:12.361250 4707 generic.go:334] "Generic (PLEG): container finished" podID="a2e43d9c-f196-474a-ae4b-ff985c9ab550" containerID="b2ff30771a391017a2a5339dc9121383bd5d0d4d2b7b9711f9e09d7f0c0b48ad" exitCode=0 Nov 27 16:19:12 crc kubenswrapper[4707]: I1127 16:19:12.361287 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-wzqgl" event={"ID":"a2e43d9c-f196-474a-ae4b-ff985c9ab550","Type":"ContainerDied","Data":"b2ff30771a391017a2a5339dc9121383bd5d0d4d2b7b9711f9e09d7f0c0b48ad"} Nov 27 16:19:14 crc kubenswrapper[4707]: I1127 16:19:14.736046 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-wzqgl" Nov 27 16:19:14 crc kubenswrapper[4707]: I1127 16:19:14.866032 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a2e43d9c-f196-474a-ae4b-ff985c9ab550-dns-svc\") pod \"a2e43d9c-f196-474a-ae4b-ff985c9ab550\" (UID: \"a2e43d9c-f196-474a-ae4b-ff985c9ab550\") " Nov 27 16:19:14 crc kubenswrapper[4707]: I1127 16:19:14.866266 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a2e43d9c-f196-474a-ae4b-ff985c9ab550-config\") pod \"a2e43d9c-f196-474a-ae4b-ff985c9ab550\" (UID: \"a2e43d9c-f196-474a-ae4b-ff985c9ab550\") " Nov 27 16:19:14 crc kubenswrapper[4707]: I1127 16:19:14.866358 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bbs75\" (UniqueName: \"kubernetes.io/projected/a2e43d9c-f196-474a-ae4b-ff985c9ab550-kube-api-access-bbs75\") pod \"a2e43d9c-f196-474a-ae4b-ff985c9ab550\" (UID: \"a2e43d9c-f196-474a-ae4b-ff985c9ab550\") " Nov 27 16:19:14 crc kubenswrapper[4707]: I1127 16:19:14.870556 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a2e43d9c-f196-474a-ae4b-ff985c9ab550-kube-api-access-bbs75" (OuterVolumeSpecName: "kube-api-access-bbs75") pod "a2e43d9c-f196-474a-ae4b-ff985c9ab550" (UID: "a2e43d9c-f196-474a-ae4b-ff985c9ab550"). InnerVolumeSpecName "kube-api-access-bbs75". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 16:19:14 crc kubenswrapper[4707]: I1127 16:19:14.922090 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a2e43d9c-f196-474a-ae4b-ff985c9ab550-config" (OuterVolumeSpecName: "config") pod "a2e43d9c-f196-474a-ae4b-ff985c9ab550" (UID: "a2e43d9c-f196-474a-ae4b-ff985c9ab550"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 16:19:14 crc kubenswrapper[4707]: I1127 16:19:14.930229 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a2e43d9c-f196-474a-ae4b-ff985c9ab550-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "a2e43d9c-f196-474a-ae4b-ff985c9ab550" (UID: "a2e43d9c-f196-474a-ae4b-ff985c9ab550"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 16:19:14 crc kubenswrapper[4707]: I1127 16:19:14.968573 4707 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a2e43d9c-f196-474a-ae4b-ff985c9ab550-config\") on node \"crc\" DevicePath \"\"" Nov 27 16:19:14 crc kubenswrapper[4707]: I1127 16:19:14.968598 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bbs75\" (UniqueName: \"kubernetes.io/projected/a2e43d9c-f196-474a-ae4b-ff985c9ab550-kube-api-access-bbs75\") on node \"crc\" DevicePath \"\"" Nov 27 16:19:14 crc kubenswrapper[4707]: I1127 16:19:14.968609 4707 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a2e43d9c-f196-474a-ae4b-ff985c9ab550-dns-svc\") on node \"crc\" DevicePath \"\"" Nov 27 16:19:15 crc kubenswrapper[4707]: I1127 16:19:15.269873 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-qcjdd"] Nov 27 16:19:15 crc kubenswrapper[4707]: I1127 16:19:15.393889 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-wzqgl" event={"ID":"a2e43d9c-f196-474a-ae4b-ff985c9ab550","Type":"ContainerDied","Data":"508e86d453a5a96ebfba51d7f7b2aadeeb94f10aa7a775ab7002c5ef41ad53e5"} Nov 27 16:19:15 crc kubenswrapper[4707]: I1127 16:19:15.393947 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-wzqgl" Nov 27 16:19:15 crc kubenswrapper[4707]: I1127 16:19:15.393959 4707 scope.go:117] "RemoveContainer" containerID="b2ff30771a391017a2a5339dc9121383bd5d0d4d2b7b9711f9e09d7f0c0b48ad" Nov 27 16:19:15 crc kubenswrapper[4707]: I1127 16:19:15.413568 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-wzqgl"] Nov 27 16:19:15 crc kubenswrapper[4707]: I1127 16:19:15.420718 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-wzqgl"] Nov 27 16:19:15 crc kubenswrapper[4707]: I1127 16:19:15.639952 4707 scope.go:117] "RemoveContainer" containerID="30f2f1a29e0ab7cf89e446b56c356ec4def4a7db9ee7127f65ab83ebc82a31c2" Nov 27 16:19:16 crc kubenswrapper[4707]: I1127 16:19:16.412273 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"84e88e9e-3edb-45cd-9973-1447587f7adc","Type":"ContainerStarted","Data":"6a491e7ad158a98db2fce6b9eb91f7af17ac5850b78f153877f9a12b9804f0a6"} Nov 27 16:19:16 crc kubenswrapper[4707]: I1127 16:19:16.415744 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"e4b42d58-27cb-455f-9994-ae15f433e008","Type":"ContainerStarted","Data":"17fd7c42e1ec8078b3e09afb56c727968eb4d1ad0fb21d7b759d076258ed0524"} Nov 27 16:19:16 crc kubenswrapper[4707]: I1127 16:19:16.419145 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"da629d4e-1b93-4623-a577-419c6dca0f14","Type":"ContainerStarted","Data":"02800a9fdfba6d6e27d3309db4a1bf5af39a66efb48603760535b9c096826729"} Nov 27 16:19:16 crc kubenswrapper[4707]: I1127 16:19:16.419362 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Nov 27 16:19:16 crc kubenswrapper[4707]: I1127 16:19:16.422044 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/ovn-controller-vvkr6" event={"ID":"9639769b-4439-4ffc-b88b-cba953013bff","Type":"ContainerStarted","Data":"efc1e68c6c209577d7f36c8053b7c6820abfa6efc69b4f9853ff38c1238b56df"} Nov 27 16:19:16 crc kubenswrapper[4707]: I1127 16:19:16.422174 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-vvkr6" Nov 27 16:19:16 crc kubenswrapper[4707]: I1127 16:19:16.425996 4707 generic.go:334] "Generic (PLEG): container finished" podID="ce5bfb39-c7b0-4317-b53f-5f4280341da4" containerID="45a0da5484dcb0ad2dbf79fee8866c95c479ba3d7066c3717cf381ef29a3e273" exitCode=0 Nov 27 16:19:16 crc kubenswrapper[4707]: I1127 16:19:16.426083 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qcjdd" event={"ID":"ce5bfb39-c7b0-4317-b53f-5f4280341da4","Type":"ContainerDied","Data":"45a0da5484dcb0ad2dbf79fee8866c95c479ba3d7066c3717cf381ef29a3e273"} Nov 27 16:19:16 crc kubenswrapper[4707]: I1127 16:19:16.426145 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qcjdd" event={"ID":"ce5bfb39-c7b0-4317-b53f-5f4280341da4","Type":"ContainerStarted","Data":"77d0acfa0bcdf6a8c71031139cffecc45e5582836eabaa69b55a6615b2268740"} Nov 27 16:19:16 crc kubenswrapper[4707]: I1127 16:19:16.428192 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"d6f4624a-1407-4ff8-bd7f-90f2a0fd6718","Type":"ContainerStarted","Data":"a737df8021502cb12508bfad6eaa54eef753dd7aed20059520af77f6dc0c694b"} Nov 27 16:19:16 crc kubenswrapper[4707]: I1127 16:19:16.432302 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-7n9c2" event={"ID":"720e549c-1f41-4fb6-b29f-465ac7e174e3","Type":"ContainerStarted","Data":"16679cb9c00ad3a3470182c253075cf3990e15bd6073e0a542b998f5a037e1ed"} Nov 27 16:19:16 crc kubenswrapper[4707]: I1127 16:19:16.433863 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/ovsdbserver-nb-0" event={"ID":"379e0975-7a52-4f96-b931-4c02377d6537","Type":"ContainerStarted","Data":"2cd3ad0ee054cb43bec83cb0c1fcc3602e14609e20ed3f87e35d8e7003891af5"} Nov 27 16:19:16 crc kubenswrapper[4707]: I1127 16:19:16.435872 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"c5402fa0-f1b7-4561-95f0-cb690caf9b58","Type":"ContainerStarted","Data":"242aa7112cde08876465f33b2f6184f253b40581145623e526b22d5bb9403e92"} Nov 27 16:19:16 crc kubenswrapper[4707]: I1127 16:19:16.436102 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/memcached-0" Nov 27 16:19:16 crc kubenswrapper[4707]: I1127 16:19:16.524332 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-vvkr6" podStartSLOduration=7.129593432 podStartE2EDuration="16.524311595s" podCreationTimestamp="2025-11-27 16:19:00 +0000 UTC" firstStartedPulling="2025-11-27 16:19:05.344827644 +0000 UTC m=+920.976276412" lastFinishedPulling="2025-11-27 16:19:14.739545787 +0000 UTC m=+930.370994575" observedRunningTime="2025-11-27 16:19:16.52080302 +0000 UTC m=+932.152251788" watchObservedRunningTime="2025-11-27 16:19:16.524311595 +0000 UTC m=+932.155760363" Nov 27 16:19:16 crc kubenswrapper[4707]: I1127 16:19:16.537146 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=10.195251158 podStartE2EDuration="20.537127726s" podCreationTimestamp="2025-11-27 16:18:56 +0000 UTC" firstStartedPulling="2025-11-27 16:19:05.468140402 +0000 UTC m=+921.099589170" lastFinishedPulling="2025-11-27 16:19:15.81001698 +0000 UTC m=+931.441465738" observedRunningTime="2025-11-27 16:19:16.533001926 +0000 UTC m=+932.164450694" watchObservedRunningTime="2025-11-27 16:19:16.537127726 +0000 UTC m=+932.168576494" Nov 27 16:19:16 crc kubenswrapper[4707]: I1127 16:19:16.553802 4707 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openstack/memcached-0" podStartSLOduration=13.324122582 podStartE2EDuration="22.553785261s" podCreationTimestamp="2025-11-27 16:18:54 +0000 UTC" firstStartedPulling="2025-11-27 16:19:05.453551638 +0000 UTC m=+921.085000406" lastFinishedPulling="2025-11-27 16:19:14.683214317 +0000 UTC m=+930.314663085" observedRunningTime="2025-11-27 16:19:16.546549655 +0000 UTC m=+932.177998423" watchObservedRunningTime="2025-11-27 16:19:16.553785261 +0000 UTC m=+932.185234029" Nov 27 16:19:17 crc kubenswrapper[4707]: I1127 16:19:17.208270 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a2e43d9c-f196-474a-ae4b-ff985c9ab550" path="/var/lib/kubelet/pods/a2e43d9c-f196-474a-ae4b-ff985c9ab550/volumes" Nov 27 16:19:17 crc kubenswrapper[4707]: I1127 16:19:17.446388 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qcjdd" event={"ID":"ce5bfb39-c7b0-4317-b53f-5f4280341da4","Type":"ContainerStarted","Data":"85c29caaf061584ee62365b7df32e7330cdb7cfc8ca04c3ed00217cfab9c1d13"} Nov 27 16:19:17 crc kubenswrapper[4707]: I1127 16:19:17.450866 4707 generic.go:334] "Generic (PLEG): container finished" podID="720e549c-1f41-4fb6-b29f-465ac7e174e3" containerID="16679cb9c00ad3a3470182c253075cf3990e15bd6073e0a542b998f5a037e1ed" exitCode=0 Nov 27 16:19:17 crc kubenswrapper[4707]: I1127 16:19:17.450938 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-7n9c2" event={"ID":"720e549c-1f41-4fb6-b29f-465ac7e174e3","Type":"ContainerDied","Data":"16679cb9c00ad3a3470182c253075cf3990e15bd6073e0a542b998f5a037e1ed"} Nov 27 16:19:17 crc kubenswrapper[4707]: I1127 16:19:17.460287 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"517a2efb-7c9f-4c93-876b-5962da604ef8","Type":"ContainerStarted","Data":"df03a765a612e1232ddf145ad82f405beba655bd4286069693c18c9067e1ee60"} Nov 27 16:19:17 crc kubenswrapper[4707]: I1127 16:19:17.465107 4707 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"b31a7b86-c43f-4123-a33d-ffba2ee3d015","Type":"ContainerStarted","Data":"422f2d77f62c72279e75322c43beedda7bb264d013ad32e8514fa1cdf5f2c05e"} Nov 27 16:19:18 crc kubenswrapper[4707]: I1127 16:19:18.480451 4707 generic.go:334] "Generic (PLEG): container finished" podID="ce5bfb39-c7b0-4317-b53f-5f4280341da4" containerID="85c29caaf061584ee62365b7df32e7330cdb7cfc8ca04c3ed00217cfab9c1d13" exitCode=0 Nov 27 16:19:18 crc kubenswrapper[4707]: I1127 16:19:18.480661 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qcjdd" event={"ID":"ce5bfb39-c7b0-4317-b53f-5f4280341da4","Type":"ContainerDied","Data":"85c29caaf061584ee62365b7df32e7330cdb7cfc8ca04c3ed00217cfab9c1d13"} Nov 27 16:19:18 crc kubenswrapper[4707]: I1127 16:19:18.483705 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-7n9c2" event={"ID":"720e549c-1f41-4fb6-b29f-465ac7e174e3","Type":"ContainerStarted","Data":"4368a594b2e46765d6f4419bb156c2f05ae235f0fe3fcdfc9418e5da349f35c3"} Nov 27 16:19:20 crc kubenswrapper[4707]: I1127 16:19:20.271704 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/memcached-0" Nov 27 16:19:22 crc kubenswrapper[4707]: I1127 16:19:22.525270 4707 generic.go:334] "Generic (PLEG): container finished" podID="84e88e9e-3edb-45cd-9973-1447587f7adc" containerID="6a491e7ad158a98db2fce6b9eb91f7af17ac5850b78f153877f9a12b9804f0a6" exitCode=0 Nov 27 16:19:22 crc kubenswrapper[4707]: I1127 16:19:22.525407 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"84e88e9e-3edb-45cd-9973-1447587f7adc","Type":"ContainerDied","Data":"6a491e7ad158a98db2fce6b9eb91f7af17ac5850b78f153877f9a12b9804f0a6"} Nov 27 16:19:23 crc kubenswrapper[4707]: I1127 16:19:23.535851 4707 generic.go:334] "Generic (PLEG): container finished" 
podID="e4b42d58-27cb-455f-9994-ae15f433e008" containerID="17fd7c42e1ec8078b3e09afb56c727968eb4d1ad0fb21d7b759d076258ed0524" exitCode=0 Nov 27 16:19:23 crc kubenswrapper[4707]: I1127 16:19:23.535902 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"e4b42d58-27cb-455f-9994-ae15f433e008","Type":"ContainerDied","Data":"17fd7c42e1ec8078b3e09afb56c727968eb4d1ad0fb21d7b759d076258ed0524"} Nov 27 16:19:23 crc kubenswrapper[4707]: I1127 16:19:23.920941 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-metrics-9br7v"] Nov 27 16:19:23 crc kubenswrapper[4707]: E1127 16:19:23.921269 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a2e43d9c-f196-474a-ae4b-ff985c9ab550" containerName="init" Nov 27 16:19:23 crc kubenswrapper[4707]: I1127 16:19:23.921288 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="a2e43d9c-f196-474a-ae4b-ff985c9ab550" containerName="init" Nov 27 16:19:23 crc kubenswrapper[4707]: E1127 16:19:23.921326 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a2e43d9c-f196-474a-ae4b-ff985c9ab550" containerName="dnsmasq-dns" Nov 27 16:19:23 crc kubenswrapper[4707]: I1127 16:19:23.921335 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="a2e43d9c-f196-474a-ae4b-ff985c9ab550" containerName="dnsmasq-dns" Nov 27 16:19:23 crc kubenswrapper[4707]: I1127 16:19:23.921570 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="a2e43d9c-f196-474a-ae4b-ff985c9ab550" containerName="dnsmasq-dns" Nov 27 16:19:23 crc kubenswrapper[4707]: I1127 16:19:23.922135 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-metrics-9br7v" Nov 27 16:19:23 crc kubenswrapper[4707]: I1127 16:19:23.924852 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-metrics-config" Nov 27 16:19:23 crc kubenswrapper[4707]: I1127 16:19:23.943524 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-9br7v"] Nov 27 16:19:24 crc kubenswrapper[4707]: I1127 16:19:24.036072 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d951ce68-e4f2-4ead-aaef-b264f721d7a3-config\") pod \"ovn-controller-metrics-9br7v\" (UID: \"d951ce68-e4f2-4ead-aaef-b264f721d7a3\") " pod="openstack/ovn-controller-metrics-9br7v" Nov 27 16:19:24 crc kubenswrapper[4707]: I1127 16:19:24.036168 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/d951ce68-e4f2-4ead-aaef-b264f721d7a3-ovs-rundir\") pod \"ovn-controller-metrics-9br7v\" (UID: \"d951ce68-e4f2-4ead-aaef-b264f721d7a3\") " pod="openstack/ovn-controller-metrics-9br7v" Nov 27 16:19:24 crc kubenswrapper[4707]: I1127 16:19:24.036237 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d951ce68-e4f2-4ead-aaef-b264f721d7a3-combined-ca-bundle\") pod \"ovn-controller-metrics-9br7v\" (UID: \"d951ce68-e4f2-4ead-aaef-b264f721d7a3\") " pod="openstack/ovn-controller-metrics-9br7v" Nov 27 16:19:24 crc kubenswrapper[4707]: I1127 16:19:24.036265 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/d951ce68-e4f2-4ead-aaef-b264f721d7a3-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-9br7v\" (UID: \"d951ce68-e4f2-4ead-aaef-b264f721d7a3\") " 
pod="openstack/ovn-controller-metrics-9br7v" Nov 27 16:19:24 crc kubenswrapper[4707]: I1127 16:19:24.036318 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7kpzn\" (UniqueName: \"kubernetes.io/projected/d951ce68-e4f2-4ead-aaef-b264f721d7a3-kube-api-access-7kpzn\") pod \"ovn-controller-metrics-9br7v\" (UID: \"d951ce68-e4f2-4ead-aaef-b264f721d7a3\") " pod="openstack/ovn-controller-metrics-9br7v" Nov 27 16:19:24 crc kubenswrapper[4707]: I1127 16:19:24.036365 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/d951ce68-e4f2-4ead-aaef-b264f721d7a3-ovn-rundir\") pod \"ovn-controller-metrics-9br7v\" (UID: \"d951ce68-e4f2-4ead-aaef-b264f721d7a3\") " pod="openstack/ovn-controller-metrics-9br7v" Nov 27 16:19:24 crc kubenswrapper[4707]: I1127 16:19:24.092512 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7fd796d7df-lk5vv"] Nov 27 16:19:24 crc kubenswrapper[4707]: I1127 16:19:24.093722 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7fd796d7df-lk5vv" Nov 27 16:19:24 crc kubenswrapper[4707]: I1127 16:19:24.096581 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-nb" Nov 27 16:19:24 crc kubenswrapper[4707]: I1127 16:19:24.103689 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7fd796d7df-lk5vv"] Nov 27 16:19:24 crc kubenswrapper[4707]: I1127 16:19:24.138061 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/d951ce68-e4f2-4ead-aaef-b264f721d7a3-ovn-rundir\") pod \"ovn-controller-metrics-9br7v\" (UID: \"d951ce68-e4f2-4ead-aaef-b264f721d7a3\") " pod="openstack/ovn-controller-metrics-9br7v" Nov 27 16:19:24 crc kubenswrapper[4707]: I1127 16:19:24.138118 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d951ce68-e4f2-4ead-aaef-b264f721d7a3-config\") pod \"ovn-controller-metrics-9br7v\" (UID: \"d951ce68-e4f2-4ead-aaef-b264f721d7a3\") " pod="openstack/ovn-controller-metrics-9br7v" Nov 27 16:19:24 crc kubenswrapper[4707]: I1127 16:19:24.138156 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/d951ce68-e4f2-4ead-aaef-b264f721d7a3-ovs-rundir\") pod \"ovn-controller-metrics-9br7v\" (UID: \"d951ce68-e4f2-4ead-aaef-b264f721d7a3\") " pod="openstack/ovn-controller-metrics-9br7v" Nov 27 16:19:24 crc kubenswrapper[4707]: I1127 16:19:24.138206 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d951ce68-e4f2-4ead-aaef-b264f721d7a3-combined-ca-bundle\") pod \"ovn-controller-metrics-9br7v\" (UID: \"d951ce68-e4f2-4ead-aaef-b264f721d7a3\") " pod="openstack/ovn-controller-metrics-9br7v" Nov 27 16:19:24 crc kubenswrapper[4707]: I1127 16:19:24.138224 4707 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/d951ce68-e4f2-4ead-aaef-b264f721d7a3-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-9br7v\" (UID: \"d951ce68-e4f2-4ead-aaef-b264f721d7a3\") " pod="openstack/ovn-controller-metrics-9br7v" Nov 27 16:19:24 crc kubenswrapper[4707]: I1127 16:19:24.138257 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7kpzn\" (UniqueName: \"kubernetes.io/projected/d951ce68-e4f2-4ead-aaef-b264f721d7a3-kube-api-access-7kpzn\") pod \"ovn-controller-metrics-9br7v\" (UID: \"d951ce68-e4f2-4ead-aaef-b264f721d7a3\") " pod="openstack/ovn-controller-metrics-9br7v" Nov 27 16:19:24 crc kubenswrapper[4707]: I1127 16:19:24.138745 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/d951ce68-e4f2-4ead-aaef-b264f721d7a3-ovs-rundir\") pod \"ovn-controller-metrics-9br7v\" (UID: \"d951ce68-e4f2-4ead-aaef-b264f721d7a3\") " pod="openstack/ovn-controller-metrics-9br7v" Nov 27 16:19:24 crc kubenswrapper[4707]: I1127 16:19:24.138753 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/d951ce68-e4f2-4ead-aaef-b264f721d7a3-ovn-rundir\") pod \"ovn-controller-metrics-9br7v\" (UID: \"d951ce68-e4f2-4ead-aaef-b264f721d7a3\") " pod="openstack/ovn-controller-metrics-9br7v" Nov 27 16:19:24 crc kubenswrapper[4707]: I1127 16:19:24.139258 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d951ce68-e4f2-4ead-aaef-b264f721d7a3-config\") pod \"ovn-controller-metrics-9br7v\" (UID: \"d951ce68-e4f2-4ead-aaef-b264f721d7a3\") " pod="openstack/ovn-controller-metrics-9br7v" Nov 27 16:19:24 crc kubenswrapper[4707]: I1127 16:19:24.145394 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d951ce68-e4f2-4ead-aaef-b264f721d7a3-combined-ca-bundle\") pod \"ovn-controller-metrics-9br7v\" (UID: \"d951ce68-e4f2-4ead-aaef-b264f721d7a3\") " pod="openstack/ovn-controller-metrics-9br7v" Nov 27 16:19:24 crc kubenswrapper[4707]: I1127 16:19:24.146157 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/d951ce68-e4f2-4ead-aaef-b264f721d7a3-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-9br7v\" (UID: \"d951ce68-e4f2-4ead-aaef-b264f721d7a3\") " pod="openstack/ovn-controller-metrics-9br7v" Nov 27 16:19:24 crc kubenswrapper[4707]: I1127 16:19:24.172005 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7kpzn\" (UniqueName: \"kubernetes.io/projected/d951ce68-e4f2-4ead-aaef-b264f721d7a3-kube-api-access-7kpzn\") pod \"ovn-controller-metrics-9br7v\" (UID: \"d951ce68-e4f2-4ead-aaef-b264f721d7a3\") " pod="openstack/ovn-controller-metrics-9br7v" Nov 27 16:19:24 crc kubenswrapper[4707]: I1127 16:19:24.240010 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/73383544-93a5-426e-9d0a-f7626b876bb3-ovsdbserver-nb\") pod \"dnsmasq-dns-7fd796d7df-lk5vv\" (UID: \"73383544-93a5-426e-9d0a-f7626b876bb3\") " pod="openstack/dnsmasq-dns-7fd796d7df-lk5vv" Nov 27 16:19:24 crc kubenswrapper[4707]: I1127 16:19:24.240056 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/73383544-93a5-426e-9d0a-f7626b876bb3-dns-svc\") pod \"dnsmasq-dns-7fd796d7df-lk5vv\" (UID: \"73383544-93a5-426e-9d0a-f7626b876bb3\") " pod="openstack/dnsmasq-dns-7fd796d7df-lk5vv" Nov 27 16:19:24 crc kubenswrapper[4707]: I1127 16:19:24.240103 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"kube-api-access-55s6n\" (UniqueName: \"kubernetes.io/projected/73383544-93a5-426e-9d0a-f7626b876bb3-kube-api-access-55s6n\") pod \"dnsmasq-dns-7fd796d7df-lk5vv\" (UID: \"73383544-93a5-426e-9d0a-f7626b876bb3\") " pod="openstack/dnsmasq-dns-7fd796d7df-lk5vv" Nov 27 16:19:24 crc kubenswrapper[4707]: I1127 16:19:24.240316 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/73383544-93a5-426e-9d0a-f7626b876bb3-config\") pod \"dnsmasq-dns-7fd796d7df-lk5vv\" (UID: \"73383544-93a5-426e-9d0a-f7626b876bb3\") " pod="openstack/dnsmasq-dns-7fd796d7df-lk5vv" Nov 27 16:19:24 crc kubenswrapper[4707]: I1127 16:19:24.244717 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-9br7v" Nov 27 16:19:24 crc kubenswrapper[4707]: I1127 16:19:24.249726 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7fd796d7df-lk5vv"] Nov 27 16:19:24 crc kubenswrapper[4707]: E1127 16:19:24.250332 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[config dns-svc kube-api-access-55s6n ovsdbserver-nb], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack/dnsmasq-dns-7fd796d7df-lk5vv" podUID="73383544-93a5-426e-9d0a-f7626b876bb3" Nov 27 16:19:24 crc kubenswrapper[4707]: I1127 16:19:24.278089 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-k2z6f"] Nov 27 16:19:24 crc kubenswrapper[4707]: I1127 16:19:24.279214 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-k2z6f" Nov 27 16:19:24 crc kubenswrapper[4707]: I1127 16:19:24.282341 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-sb" Nov 27 16:19:24 crc kubenswrapper[4707]: I1127 16:19:24.287111 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-k2z6f"] Nov 27 16:19:24 crc kubenswrapper[4707]: I1127 16:19:24.342427 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/73383544-93a5-426e-9d0a-f7626b876bb3-ovsdbserver-nb\") pod \"dnsmasq-dns-7fd796d7df-lk5vv\" (UID: \"73383544-93a5-426e-9d0a-f7626b876bb3\") " pod="openstack/dnsmasq-dns-7fd796d7df-lk5vv" Nov 27 16:19:24 crc kubenswrapper[4707]: I1127 16:19:24.342694 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/73383544-93a5-426e-9d0a-f7626b876bb3-dns-svc\") pod \"dnsmasq-dns-7fd796d7df-lk5vv\" (UID: \"73383544-93a5-426e-9d0a-f7626b876bb3\") " pod="openstack/dnsmasq-dns-7fd796d7df-lk5vv" Nov 27 16:19:24 crc kubenswrapper[4707]: I1127 16:19:24.343170 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/73383544-93a5-426e-9d0a-f7626b876bb3-ovsdbserver-nb\") pod \"dnsmasq-dns-7fd796d7df-lk5vv\" (UID: \"73383544-93a5-426e-9d0a-f7626b876bb3\") " pod="openstack/dnsmasq-dns-7fd796d7df-lk5vv" Nov 27 16:19:24 crc kubenswrapper[4707]: I1127 16:19:24.343328 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/73383544-93a5-426e-9d0a-f7626b876bb3-dns-svc\") pod \"dnsmasq-dns-7fd796d7df-lk5vv\" (UID: \"73383544-93a5-426e-9d0a-f7626b876bb3\") " pod="openstack/dnsmasq-dns-7fd796d7df-lk5vv" Nov 27 16:19:24 crc kubenswrapper[4707]: I1127 16:19:24.343705 4707 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-55s6n\" (UniqueName: \"kubernetes.io/projected/73383544-93a5-426e-9d0a-f7626b876bb3-kube-api-access-55s6n\") pod \"dnsmasq-dns-7fd796d7df-lk5vv\" (UID: \"73383544-93a5-426e-9d0a-f7626b876bb3\") " pod="openstack/dnsmasq-dns-7fd796d7df-lk5vv" Nov 27 16:19:24 crc kubenswrapper[4707]: I1127 16:19:24.343803 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/73383544-93a5-426e-9d0a-f7626b876bb3-config\") pod \"dnsmasq-dns-7fd796d7df-lk5vv\" (UID: \"73383544-93a5-426e-9d0a-f7626b876bb3\") " pod="openstack/dnsmasq-dns-7fd796d7df-lk5vv" Nov 27 16:19:24 crc kubenswrapper[4707]: I1127 16:19:24.344352 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/73383544-93a5-426e-9d0a-f7626b876bb3-config\") pod \"dnsmasq-dns-7fd796d7df-lk5vv\" (UID: \"73383544-93a5-426e-9d0a-f7626b876bb3\") " pod="openstack/dnsmasq-dns-7fd796d7df-lk5vv" Nov 27 16:19:24 crc kubenswrapper[4707]: I1127 16:19:24.358884 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-55s6n\" (UniqueName: \"kubernetes.io/projected/73383544-93a5-426e-9d0a-f7626b876bb3-kube-api-access-55s6n\") pod \"dnsmasq-dns-7fd796d7df-lk5vv\" (UID: \"73383544-93a5-426e-9d0a-f7626b876bb3\") " pod="openstack/dnsmasq-dns-7fd796d7df-lk5vv" Nov 27 16:19:24 crc kubenswrapper[4707]: I1127 16:19:24.445400 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8782247a-1ddc-4e6e-9042-8c7ade00b0de-ovsdbserver-sb\") pod \"dnsmasq-dns-86db49b7ff-k2z6f\" (UID: \"8782247a-1ddc-4e6e-9042-8c7ade00b0de\") " pod="openstack/dnsmasq-dns-86db49b7ff-k2z6f" Nov 27 16:19:24 crc kubenswrapper[4707]: I1127 16:19:24.445483 4707 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xc7ww\" (UniqueName: \"kubernetes.io/projected/8782247a-1ddc-4e6e-9042-8c7ade00b0de-kube-api-access-xc7ww\") pod \"dnsmasq-dns-86db49b7ff-k2z6f\" (UID: \"8782247a-1ddc-4e6e-9042-8c7ade00b0de\") " pod="openstack/dnsmasq-dns-86db49b7ff-k2z6f" Nov 27 16:19:24 crc kubenswrapper[4707]: I1127 16:19:24.445548 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8782247a-1ddc-4e6e-9042-8c7ade00b0de-ovsdbserver-nb\") pod \"dnsmasq-dns-86db49b7ff-k2z6f\" (UID: \"8782247a-1ddc-4e6e-9042-8c7ade00b0de\") " pod="openstack/dnsmasq-dns-86db49b7ff-k2z6f" Nov 27 16:19:24 crc kubenswrapper[4707]: I1127 16:19:24.445711 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8782247a-1ddc-4e6e-9042-8c7ade00b0de-dns-svc\") pod \"dnsmasq-dns-86db49b7ff-k2z6f\" (UID: \"8782247a-1ddc-4e6e-9042-8c7ade00b0de\") " pod="openstack/dnsmasq-dns-86db49b7ff-k2z6f" Nov 27 16:19:24 crc kubenswrapper[4707]: I1127 16:19:24.445882 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8782247a-1ddc-4e6e-9042-8c7ade00b0de-config\") pod \"dnsmasq-dns-86db49b7ff-k2z6f\" (UID: \"8782247a-1ddc-4e6e-9042-8c7ade00b0de\") " pod="openstack/dnsmasq-dns-86db49b7ff-k2z6f" Nov 27 16:19:24 crc kubenswrapper[4707]: I1127 16:19:24.541855 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7fd796d7df-lk5vv"
Nov 27 16:19:24 crc kubenswrapper[4707]: I1127 16:19:24.546936 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8782247a-1ddc-4e6e-9042-8c7ade00b0de-ovsdbserver-sb\") pod \"dnsmasq-dns-86db49b7ff-k2z6f\" (UID: \"8782247a-1ddc-4e6e-9042-8c7ade00b0de\") " pod="openstack/dnsmasq-dns-86db49b7ff-k2z6f"
Nov 27 16:19:24 crc kubenswrapper[4707]: I1127 16:19:24.546986 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xc7ww\" (UniqueName: \"kubernetes.io/projected/8782247a-1ddc-4e6e-9042-8c7ade00b0de-kube-api-access-xc7ww\") pod \"dnsmasq-dns-86db49b7ff-k2z6f\" (UID: \"8782247a-1ddc-4e6e-9042-8c7ade00b0de\") " pod="openstack/dnsmasq-dns-86db49b7ff-k2z6f"
Nov 27 16:19:24 crc kubenswrapper[4707]: I1127 16:19:24.547019 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8782247a-1ddc-4e6e-9042-8c7ade00b0de-ovsdbserver-nb\") pod \"dnsmasq-dns-86db49b7ff-k2z6f\" (UID: \"8782247a-1ddc-4e6e-9042-8c7ade00b0de\") " pod="openstack/dnsmasq-dns-86db49b7ff-k2z6f"
Nov 27 16:19:24 crc kubenswrapper[4707]: I1127 16:19:24.547053 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8782247a-1ddc-4e6e-9042-8c7ade00b0de-dns-svc\") pod \"dnsmasq-dns-86db49b7ff-k2z6f\" (UID: \"8782247a-1ddc-4e6e-9042-8c7ade00b0de\") " pod="openstack/dnsmasq-dns-86db49b7ff-k2z6f"
Nov 27 16:19:24 crc kubenswrapper[4707]: I1127 16:19:24.547092 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8782247a-1ddc-4e6e-9042-8c7ade00b0de-config\") pod \"dnsmasq-dns-86db49b7ff-k2z6f\" (UID: \"8782247a-1ddc-4e6e-9042-8c7ade00b0de\") " pod="openstack/dnsmasq-dns-86db49b7ff-k2z6f"
Nov 27 16:19:24 crc kubenswrapper[4707]: I1127 16:19:24.547934 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8782247a-1ddc-4e6e-9042-8c7ade00b0de-ovsdbserver-sb\") pod \"dnsmasq-dns-86db49b7ff-k2z6f\" (UID: \"8782247a-1ddc-4e6e-9042-8c7ade00b0de\") " pod="openstack/dnsmasq-dns-86db49b7ff-k2z6f"
Nov 27 16:19:24 crc kubenswrapper[4707]: I1127 16:19:24.547954 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8782247a-1ddc-4e6e-9042-8c7ade00b0de-config\") pod \"dnsmasq-dns-86db49b7ff-k2z6f\" (UID: \"8782247a-1ddc-4e6e-9042-8c7ade00b0de\") " pod="openstack/dnsmasq-dns-86db49b7ff-k2z6f"
Nov 27 16:19:24 crc kubenswrapper[4707]: I1127 16:19:24.548096 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8782247a-1ddc-4e6e-9042-8c7ade00b0de-ovsdbserver-nb\") pod \"dnsmasq-dns-86db49b7ff-k2z6f\" (UID: \"8782247a-1ddc-4e6e-9042-8c7ade00b0de\") " pod="openstack/dnsmasq-dns-86db49b7ff-k2z6f"
Nov 27 16:19:24 crc kubenswrapper[4707]: I1127 16:19:24.549022 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8782247a-1ddc-4e6e-9042-8c7ade00b0de-dns-svc\") pod \"dnsmasq-dns-86db49b7ff-k2z6f\" (UID: \"8782247a-1ddc-4e6e-9042-8c7ade00b0de\") " pod="openstack/dnsmasq-dns-86db49b7ff-k2z6f"
Nov 27 16:19:24 crc kubenswrapper[4707]: I1127 16:19:24.551097 4707 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openstack/dnsmasq-dns-7fd796d7df-lk5vv"
Nov 27 16:19:24 crc kubenswrapper[4707]: I1127 16:19:24.569103 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xc7ww\" (UniqueName: \"kubernetes.io/projected/8782247a-1ddc-4e6e-9042-8c7ade00b0de-kube-api-access-xc7ww\") pod \"dnsmasq-dns-86db49b7ff-k2z6f\" (UID: \"8782247a-1ddc-4e6e-9042-8c7ade00b0de\") " pod="openstack/dnsmasq-dns-86db49b7ff-k2z6f"
Nov 27 16:19:24 crc kubenswrapper[4707]: I1127 16:19:24.596325 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-k2z6f"
Nov 27 16:19:24 crc kubenswrapper[4707]: I1127 16:19:24.648986 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/73383544-93a5-426e-9d0a-f7626b876bb3-config\") pod \"73383544-93a5-426e-9d0a-f7626b876bb3\" (UID: \"73383544-93a5-426e-9d0a-f7626b876bb3\") "
Nov 27 16:19:24 crc kubenswrapper[4707]: I1127 16:19:24.649069 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/73383544-93a5-426e-9d0a-f7626b876bb3-dns-svc\") pod \"73383544-93a5-426e-9d0a-f7626b876bb3\" (UID: \"73383544-93a5-426e-9d0a-f7626b876bb3\") "
Nov 27 16:19:24 crc kubenswrapper[4707]: I1127 16:19:24.649674 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-55s6n\" (UniqueName: \"kubernetes.io/projected/73383544-93a5-426e-9d0a-f7626b876bb3-kube-api-access-55s6n\") pod \"73383544-93a5-426e-9d0a-f7626b876bb3\" (UID: \"73383544-93a5-426e-9d0a-f7626b876bb3\") "
Nov 27 16:19:24 crc kubenswrapper[4707]: I1127 16:19:24.649974 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/73383544-93a5-426e-9d0a-f7626b876bb3-ovsdbserver-nb\") pod \"73383544-93a5-426e-9d0a-f7626b876bb3\" (UID: \"73383544-93a5-426e-9d0a-f7626b876bb3\") "
Nov 27 16:19:24 crc kubenswrapper[4707]: I1127 16:19:24.649612 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/73383544-93a5-426e-9d0a-f7626b876bb3-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "73383544-93a5-426e-9d0a-f7626b876bb3" (UID: "73383544-93a5-426e-9d0a-f7626b876bb3"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 27 16:19:24 crc kubenswrapper[4707]: I1127 16:19:24.649716 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/73383544-93a5-426e-9d0a-f7626b876bb3-config" (OuterVolumeSpecName: "config") pod "73383544-93a5-426e-9d0a-f7626b876bb3" (UID: "73383544-93a5-426e-9d0a-f7626b876bb3"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 27 16:19:24 crc kubenswrapper[4707]: I1127 16:19:24.650487 4707 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/73383544-93a5-426e-9d0a-f7626b876bb3-dns-svc\") on node \"crc\" DevicePath \"\""
Nov 27 16:19:24 crc kubenswrapper[4707]: I1127 16:19:24.650509 4707 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/73383544-93a5-426e-9d0a-f7626b876bb3-config\") on node \"crc\" DevicePath \"\""
Nov 27 16:19:24 crc kubenswrapper[4707]: I1127 16:19:24.650564 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/73383544-93a5-426e-9d0a-f7626b876bb3-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "73383544-93a5-426e-9d0a-f7626b876bb3" (UID: "73383544-93a5-426e-9d0a-f7626b876bb3"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 27 16:19:24 crc kubenswrapper[4707]: I1127 16:19:24.653873 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/73383544-93a5-426e-9d0a-f7626b876bb3-kube-api-access-55s6n" (OuterVolumeSpecName: "kube-api-access-55s6n") pod "73383544-93a5-426e-9d0a-f7626b876bb3" (UID: "73383544-93a5-426e-9d0a-f7626b876bb3"). InnerVolumeSpecName "kube-api-access-55s6n". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 27 16:19:24 crc kubenswrapper[4707]: I1127 16:19:24.752991 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-55s6n\" (UniqueName: \"kubernetes.io/projected/73383544-93a5-426e-9d0a-f7626b876bb3-kube-api-access-55s6n\") on node \"crc\" DevicePath \"\""
Nov 27 16:19:24 crc kubenswrapper[4707]: I1127 16:19:24.753047 4707 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/73383544-93a5-426e-9d0a-f7626b876bb3-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Nov 27 16:19:25 crc kubenswrapper[4707]: I1127 16:19:25.550096 4707 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openstack/dnsmasq-dns-7fd796d7df-lk5vv"
Nov 27 16:19:25 crc kubenswrapper[4707]: I1127 16:19:25.608098 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7fd796d7df-lk5vv"]
Nov 27 16:19:25 crc kubenswrapper[4707]: I1127 16:19:25.624839 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7fd796d7df-lk5vv"]
Nov 27 16:19:26 crc kubenswrapper[4707]: I1127 16:19:26.559457 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"e4b42d58-27cb-455f-9994-ae15f433e008","Type":"ContainerStarted","Data":"8c3e64af97a98dfd26b5e07b0903f91490ce06f60fa28087e05cff14f3f77974"}
Nov 27 16:19:26 crc kubenswrapper[4707]: I1127 16:19:26.563283 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qcjdd" event={"ID":"ce5bfb39-c7b0-4317-b53f-5f4280341da4","Type":"ContainerStarted","Data":"caba6a670a04a1ff1bbbf86d948f36139fcda3d106a948b6c775b9d512946fb7"}
Nov 27 16:19:26 crc kubenswrapper[4707]: I1127 16:19:26.565469 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"d6f4624a-1407-4ff8-bd7f-90f2a0fd6718","Type":"ContainerStarted","Data":"29f7481fc9ad6df07892cc17ca6160720ac0841b48584fa18010626b7ee8e1fa"}
Nov 27 16:19:26 crc kubenswrapper[4707]: I1127 16:19:26.570691 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-7n9c2" event={"ID":"720e549c-1f41-4fb6-b29f-465ac7e174e3","Type":"ContainerStarted","Data":"27e1e4ed92ff894cfe9ca6ac4804b330ce98d15e5161f5941d7af8773b09b80c"}
Nov 27 16:19:26 crc kubenswrapper[4707]: I1127 16:19:26.570741 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-7n9c2"
Nov 27 16:19:26 crc kubenswrapper[4707]: I1127 16:19:26.570782 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-7n9c2"
Nov 27 16:19:26 crc kubenswrapper[4707]: I1127 16:19:26.571239 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-9br7v"]
Nov 27 16:19:26 crc kubenswrapper[4707]: I1127 16:19:26.573447 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"379e0975-7a52-4f96-b931-4c02377d6537","Type":"ContainerStarted","Data":"7588f0d30359d4c7894c5ff2fc3eaa562fde948aa38ab0c4f016440d221954ec"}
Nov 27 16:19:26 crc kubenswrapper[4707]: I1127 16:19:26.575463 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"84e88e9e-3edb-45cd-9973-1447587f7adc","Type":"ContainerStarted","Data":"5a2b46c0bbb1308a557c0530d5a0c89986a6ff5bcc2d5d488ef5af0033a4dc82"}
Nov 27 16:19:26 crc kubenswrapper[4707]: I1127 16:19:26.578617 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-k2z6f"]
Nov 27 16:19:26 crc kubenswrapper[4707]: I1127 16:19:26.610819 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-galera-0" podStartSLOduration=25.773252486 podStartE2EDuration="34.610793653s" podCreationTimestamp="2025-11-27 16:18:52 +0000 UTC" firstStartedPulling="2025-11-27 16:19:05.478661308 +0000 UTC m=+921.110110076" lastFinishedPulling="2025-11-27 16:19:14.316202475 +0000 UTC m=+929.947651243" observedRunningTime="2025-11-27 16:19:26.6061233 +0000 UTC m=+942.237572088" watchObservedRunningTime="2025-11-27 16:19:26.610793653 +0000 UTC m=+942.242242451"
Nov 27 16:19:26 crc kubenswrapper[4707]: I1127 16:19:26.614275 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-cell1-galera-0" podStartSLOduration=24.02970654 podStartE2EDuration="33.614262318s" podCreationTimestamp="2025-11-27 16:18:53 +0000 UTC" firstStartedPulling="2025-11-27 16:19:05.48738744 +0000 UTC m=+921.118836208" lastFinishedPulling="2025-11-27 16:19:15.071943218 +0000 UTC m=+930.703391986" observedRunningTime="2025-11-27 16:19:26.587300122 +0000 UTC m=+942.218748880" watchObservedRunningTime="2025-11-27 16:19:26.614262318 +0000 UTC m=+942.245711126"
Nov 27 16:19:26 crc kubenswrapper[4707]: I1127 16:19:26.642037 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-ovs-7n9c2" podStartSLOduration=18.654615003 podStartE2EDuration="26.642016862s" podCreationTimestamp="2025-11-27 16:19:00 +0000 UTC" firstStartedPulling="2025-11-27 16:19:06.739205223 +0000 UTC m=+922.370653991" lastFinishedPulling="2025-11-27 16:19:14.726607072 +0000 UTC m=+930.358055850" observedRunningTime="2025-11-27 16:19:26.631849785 +0000 UTC m=+942.263298553" watchObservedRunningTime="2025-11-27 16:19:26.642016862 +0000 UTC m=+942.273465630"
Nov 27 16:19:26 crc kubenswrapper[4707]: I1127 16:19:26.663237 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-qcjdd" podStartSLOduration=5.750471473 podStartE2EDuration="15.663219888s" podCreationTimestamp="2025-11-27 16:19:11 +0000 UTC" firstStartedPulling="2025-11-27 16:19:16.428192438 +0000 UTC m=+932.059641206" lastFinishedPulling="2025-11-27 16:19:26.340940853 +0000 UTC m=+941.972389621" observedRunningTime="2025-11-27 16:19:26.655243614 +0000 UTC m=+942.286692402" watchObservedRunningTime="2025-11-27 16:19:26.663219888 +0000 UTC m=+942.294668656"
Nov 27 16:19:26 crc kubenswrapper[4707]: I1127 16:19:26.683307 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-0" podStartSLOduration=6.04270547 podStartE2EDuration="26.683288766s" podCreationTimestamp="2025-11-27 16:19:00 +0000 UTC" firstStartedPulling="2025-11-27 16:19:05.585245239 +0000 UTC m=+921.216694007" lastFinishedPulling="2025-11-27 16:19:26.225828535 +0000 UTC m=+941.857277303" observedRunningTime="2025-11-27 16:19:26.675752033 +0000 UTC m=+942.307200801" watchObservedRunningTime="2025-11-27 16:19:26.683288766 +0000 UTC m=+942.314737534"
Nov 27 16:19:26 crc kubenswrapper[4707]: I1127 16:19:26.701570 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-0" podStartSLOduration=4.193095547 podStartE2EDuration="24.7015454s" podCreationTimestamp="2025-11-27 16:19:02 +0000 UTC" firstStartedPulling="2025-11-27 16:19:05.653847457 +0000 UTC m=+921.285296225" lastFinishedPulling="2025-11-27 16:19:26.16229731 +0000 UTC m=+941.793746078" observedRunningTime="2025-11-27 16:19:26.694684153 +0000 UTC m=+942.326132921" watchObservedRunningTime="2025-11-27 16:19:26.7015454 +0000 UTC m=+942.332994168"
Nov 27 16:19:27 crc kubenswrapper[4707]: I1127 16:19:27.204241 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="73383544-93a5-426e-9d0a-f7626b876bb3" path="/var/lib/kubelet/pods/73383544-93a5-426e-9d0a-f7626b876bb3/volumes"
Nov 27 16:19:27 crc kubenswrapper[4707]: I1127 16:19:27.233696 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0"
Nov 27 16:19:27 crc kubenswrapper[4707]: I1127 16:19:27.244835 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-k2z6f"]
Nov 27 16:19:27 crc kubenswrapper[4707]: I1127 16:19:27.312188 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-698758b865-kdgq7"]
Nov 27 16:19:27 crc kubenswrapper[4707]: I1127 16:19:27.315281 4707 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-kdgq7"
Nov 27 16:19:27 crc kubenswrapper[4707]: I1127 16:19:27.330485 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-698758b865-kdgq7"]
Nov 27 16:19:27 crc kubenswrapper[4707]: I1127 16:19:27.413241 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7f3e04a7-5107-48e4-897a-8a126c0b2911-config\") pod \"dnsmasq-dns-698758b865-kdgq7\" (UID: \"7f3e04a7-5107-48e4-897a-8a126c0b2911\") " pod="openstack/dnsmasq-dns-698758b865-kdgq7"
Nov 27 16:19:27 crc kubenswrapper[4707]: I1127 16:19:27.413542 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xqdcw\" (UniqueName: \"kubernetes.io/projected/7f3e04a7-5107-48e4-897a-8a126c0b2911-kube-api-access-xqdcw\") pod \"dnsmasq-dns-698758b865-kdgq7\" (UID: \"7f3e04a7-5107-48e4-897a-8a126c0b2911\") " pod="openstack/dnsmasq-dns-698758b865-kdgq7"
Nov 27 16:19:27 crc kubenswrapper[4707]: I1127 16:19:27.413569 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7f3e04a7-5107-48e4-897a-8a126c0b2911-ovsdbserver-nb\") pod \"dnsmasq-dns-698758b865-kdgq7\" (UID: \"7f3e04a7-5107-48e4-897a-8a126c0b2911\") " pod="openstack/dnsmasq-dns-698758b865-kdgq7"
Nov 27 16:19:27 crc kubenswrapper[4707]: I1127 16:19:27.413602 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7f3e04a7-5107-48e4-897a-8a126c0b2911-dns-svc\") pod \"dnsmasq-dns-698758b865-kdgq7\" (UID: \"7f3e04a7-5107-48e4-897a-8a126c0b2911\") " pod="openstack/dnsmasq-dns-698758b865-kdgq7"
Nov 27 16:19:27 crc kubenswrapper[4707]: I1127 16:19:27.413898 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7f3e04a7-5107-48e4-897a-8a126c0b2911-ovsdbserver-sb\") pod \"dnsmasq-dns-698758b865-kdgq7\" (UID: \"7f3e04a7-5107-48e4-897a-8a126c0b2911\") " pod="openstack/dnsmasq-dns-698758b865-kdgq7"
Nov 27 16:19:27 crc kubenswrapper[4707]: I1127 16:19:27.515001 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7f3e04a7-5107-48e4-897a-8a126c0b2911-dns-svc\") pod \"dnsmasq-dns-698758b865-kdgq7\" (UID: \"7f3e04a7-5107-48e4-897a-8a126c0b2911\") " pod="openstack/dnsmasq-dns-698758b865-kdgq7"
Nov 27 16:19:27 crc kubenswrapper[4707]: I1127 16:19:27.515121 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7f3e04a7-5107-48e4-897a-8a126c0b2911-ovsdbserver-sb\") pod \"dnsmasq-dns-698758b865-kdgq7\" (UID: \"7f3e04a7-5107-48e4-897a-8a126c0b2911\") " pod="openstack/dnsmasq-dns-698758b865-kdgq7"
Nov 27 16:19:27 crc kubenswrapper[4707]: I1127 16:19:27.515151 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7f3e04a7-5107-48e4-897a-8a126c0b2911-config\") pod \"dnsmasq-dns-698758b865-kdgq7\" (UID: \"7f3e04a7-5107-48e4-897a-8a126c0b2911\") " pod="openstack/dnsmasq-dns-698758b865-kdgq7"
Nov 27 16:19:27 crc kubenswrapper[4707]: I1127 16:19:27.515174 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xqdcw\" (UniqueName: \"kubernetes.io/projected/7f3e04a7-5107-48e4-897a-8a126c0b2911-kube-api-access-xqdcw\") pod \"dnsmasq-dns-698758b865-kdgq7\" (UID: \"7f3e04a7-5107-48e4-897a-8a126c0b2911\") " pod="openstack/dnsmasq-dns-698758b865-kdgq7"
Nov 27 16:19:27 crc kubenswrapper[4707]: I1127 16:19:27.515198 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7f3e04a7-5107-48e4-897a-8a126c0b2911-ovsdbserver-nb\") pod \"dnsmasq-dns-698758b865-kdgq7\" (UID: \"7f3e04a7-5107-48e4-897a-8a126c0b2911\") " pod="openstack/dnsmasq-dns-698758b865-kdgq7"
Nov 27 16:19:27 crc kubenswrapper[4707]: I1127 16:19:27.515942 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7f3e04a7-5107-48e4-897a-8a126c0b2911-dns-svc\") pod \"dnsmasq-dns-698758b865-kdgq7\" (UID: \"7f3e04a7-5107-48e4-897a-8a126c0b2911\") " pod="openstack/dnsmasq-dns-698758b865-kdgq7"
Nov 27 16:19:27 crc kubenswrapper[4707]: I1127 16:19:27.516035 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7f3e04a7-5107-48e4-897a-8a126c0b2911-config\") pod \"dnsmasq-dns-698758b865-kdgq7\" (UID: \"7f3e04a7-5107-48e4-897a-8a126c0b2911\") " pod="openstack/dnsmasq-dns-698758b865-kdgq7"
Nov 27 16:19:27 crc kubenswrapper[4707]: I1127 16:19:27.516129 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7f3e04a7-5107-48e4-897a-8a126c0b2911-ovsdbserver-nb\") pod \"dnsmasq-dns-698758b865-kdgq7\" (UID: \"7f3e04a7-5107-48e4-897a-8a126c0b2911\") " pod="openstack/dnsmasq-dns-698758b865-kdgq7"
Nov 27 16:19:27 crc kubenswrapper[4707]: I1127 16:19:27.516448 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7f3e04a7-5107-48e4-897a-8a126c0b2911-ovsdbserver-sb\") pod \"dnsmasq-dns-698758b865-kdgq7\" (UID: \"7f3e04a7-5107-48e4-897a-8a126c0b2911\") " pod="openstack/dnsmasq-dns-698758b865-kdgq7"
Nov 27 16:19:27 crc kubenswrapper[4707]: I1127 16:19:27.542475 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xqdcw\" (UniqueName: \"kubernetes.io/projected/7f3e04a7-5107-48e4-897a-8a126c0b2911-kube-api-access-xqdcw\") pod \"dnsmasq-dns-698758b865-kdgq7\" (UID: \"7f3e04a7-5107-48e4-897a-8a126c0b2911\") " pod="openstack/dnsmasq-dns-698758b865-kdgq7"
Nov 27 16:19:27 crc kubenswrapper[4707]: I1127 16:19:27.583566 4707 generic.go:334] "Generic (PLEG): container finished" podID="8782247a-1ddc-4e6e-9042-8c7ade00b0de" containerID="518e320d4e87cf89aae54333f75cd06a806c1c30c6423122a8977eb41312e75d" exitCode=0
Nov 27 16:19:27 crc kubenswrapper[4707]: I1127 16:19:27.583621 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-k2z6f" event={"ID":"8782247a-1ddc-4e6e-9042-8c7ade00b0de","Type":"ContainerDied","Data":"518e320d4e87cf89aae54333f75cd06a806c1c30c6423122a8977eb41312e75d"}
Nov 27 16:19:27 crc kubenswrapper[4707]: I1127 16:19:27.584062 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-k2z6f" event={"ID":"8782247a-1ddc-4e6e-9042-8c7ade00b0de","Type":"ContainerStarted","Data":"55b08755188015a5196dac6130fc5161bad8c5b82847d463d19d6c7e50ca6dcb"}
Nov 27 16:19:27 crc kubenswrapper[4707]: I1127 16:19:27.585695 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-9br7v" event={"ID":"d951ce68-e4f2-4ead-aaef-b264f721d7a3","Type":"ContainerStarted","Data":"c1f3b0482a427a10386866eef7c98eed2f6f0302369675c44d2c15a511378eb9"}
Nov 27 16:19:27 crc kubenswrapper[4707]: I1127 16:19:27.585742 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-9br7v" event={"ID":"d951ce68-e4f2-4ead-aaef-b264f721d7a3","Type":"ContainerStarted","Data":"d2e4a8d40b9c795d70e8d7c51c0d4ae6b640cf7576ddc805d0c35e45017388ef"}
Nov 27 16:19:27 crc kubenswrapper[4707]: I1127 16:19:27.622980 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-metrics-9br7v" podStartSLOduration=4.62295689 podStartE2EDuration="4.62295689s" podCreationTimestamp="2025-11-27 16:19:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-27 16:19:27.616751329 +0000 UTC m=+943.248200097" watchObservedRunningTime="2025-11-27 16:19:27.62295689 +0000 UTC m=+943.254405658"
Nov 27 16:19:27 crc kubenswrapper[4707]: I1127 16:19:27.644278 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-kdgq7"
Nov 27 16:19:27 crc kubenswrapper[4707]: E1127 16:19:27.840514 4707 log.go:32] "CreateContainer in sandbox from runtime service failed" err=<
Nov 27 16:19:27 crc kubenswrapper[4707]: 	rpc error: code = Unknown desc = container create failed: mount `/var/lib/kubelet/pods/8782247a-1ddc-4e6e-9042-8c7ade00b0de/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory
Nov 27 16:19:27 crc kubenswrapper[4707]: > podSandboxID="55b08755188015a5196dac6130fc5161bad8c5b82847d463d19d6c7e50ca6dcb"
Nov 27 16:19:27 crc kubenswrapper[4707]: E1127 16:19:27.840993 4707 kuberuntime_manager.go:1274] "Unhandled Error" err=<
Nov 27 16:19:27 crc kubenswrapper[4707]: 	container &Container{Name:dnsmasq-dns,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n599h5cbh7ch5d4h66fh676hdbh546h95h88h5ffh55ch7fhch57ch687hddhc7h5fdh57dh674h56fh64ch98h9bh557h55dh646h54ch54fh5c4h597q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovsdbserver-nb,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/ovsdbserver-nb,SubPath:ovsdbserver-nb,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovsdbserver-sb,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/ovsdbserver-sb,SubPath:ovsdbserver-sb,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-xc7ww,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Port:{0 5353 },Host:,},GRPC:nil,},InitialDelaySeconds:3,TimeoutSeconds:5,PeriodSeconds:3,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Port:{0 5353 },Host:,},GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-86db49b7ff-k2z6f_openstack(8782247a-1ddc-4e6e-9042-8c7ade00b0de): CreateContainerError: container create failed: mount `/var/lib/kubelet/pods/8782247a-1ddc-4e6e-9042-8c7ade00b0de/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory
Nov 27 16:19:27 crc kubenswrapper[4707]: > logger="UnhandledError"
Nov 27 16:19:27 crc kubenswrapper[4707]: E1127 16:19:27.842408 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"dnsmasq-dns\" with CreateContainerError: \"container create failed: mount `/var/lib/kubelet/pods/8782247a-1ddc-4e6e-9042-8c7ade00b0de/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory\\n\"" pod="openstack/dnsmasq-dns-86db49b7ff-k2z6f" podUID="8782247a-1ddc-4e6e-9042-8c7ade00b0de"
Nov 27 16:19:28 crc kubenswrapper[4707]: I1127 16:19:28.145796 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-698758b865-kdgq7"]
Nov 27 16:19:28 crc kubenswrapper[4707]: I1127 16:19:28.212433 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-0"
Nov 27 16:19:28 crc kubenswrapper[4707]: I1127 16:19:28.296803 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-0"
Nov 27 16:19:28 crc kubenswrapper[4707]: I1127 16:19:28.341107 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-storage-0"]
Nov 27 16:19:28 crc kubenswrapper[4707]: I1127 16:19:28.349353 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-storage-0"
Nov 27 16:19:28 crc kubenswrapper[4707]: I1127 16:19:28.356411 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-files"
Nov 27 16:19:28 crc kubenswrapper[4707]: I1127 16:19:28.356419 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-conf"
Nov 27 16:19:28 crc kubenswrapper[4707]: I1127 16:19:28.357423 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-swift-dockercfg-j2xsd"
Nov 27 16:19:28 crc kubenswrapper[4707]: I1127 16:19:28.357527 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-storage-config-data"
Nov 27 16:19:28 crc kubenswrapper[4707]: I1127 16:19:28.367142 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"]
Nov 27 16:19:28 crc kubenswrapper[4707]: I1127 16:19:28.407535 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-0"
Nov 27 16:19:28 crc kubenswrapper[4707]: I1127 16:19:28.435579 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"swift-storage-0\" (UID: \"4d67e130-f1e2-4fe8-9647-8725402a1cdd\") " pod="openstack/swift-storage-0"
Nov 27 16:19:28 crc kubenswrapper[4707]: I1127 16:19:28.435656 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/4d67e130-f1e2-4fe8-9647-8725402a1cdd-cache\") pod \"swift-storage-0\" (UID: \"4d67e130-f1e2-4fe8-9647-8725402a1cdd\") " pod="openstack/swift-storage-0"
Nov 27 16:19:28 crc kubenswrapper[4707]: I1127 16:19:28.435699 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vp4j9\" (UniqueName: \"kubernetes.io/projected/4d67e130-f1e2-4fe8-9647-8725402a1cdd-kube-api-access-vp4j9\") pod \"swift-storage-0\" (UID: \"4d67e130-f1e2-4fe8-9647-8725402a1cdd\") " pod="openstack/swift-storage-0"
Nov 27 16:19:28 crc kubenswrapper[4707]: I1127 16:19:28.435716 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/4d67e130-f1e2-4fe8-9647-8725402a1cdd-lock\") pod \"swift-storage-0\" (UID: \"4d67e130-f1e2-4fe8-9647-8725402a1cdd\") " pod="openstack/swift-storage-0"
Nov 27 16:19:28 crc kubenswrapper[4707]: I1127 16:19:28.435740 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/4d67e130-f1e2-4fe8-9647-8725402a1cdd-etc-swift\") pod \"swift-storage-0\" (UID: \"4d67e130-f1e2-4fe8-9647-8725402a1cdd\") " pod="openstack/swift-storage-0"
Nov 27 16:19:28 crc kubenswrapper[4707]: I1127 16:19:28.450913 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-0"
Nov 27 16:19:28 crc kubenswrapper[4707]: I1127 16:19:28.536356 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/4d67e130-f1e2-4fe8-9647-8725402a1cdd-lock\") pod \"swift-storage-0\" (UID: \"4d67e130-f1e2-4fe8-9647-8725402a1cdd\") " pod="openstack/swift-storage-0"
Nov 27 16:19:28 crc kubenswrapper[4707]: I1127 16:19:28.536455 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vp4j9\" (UniqueName: \"kubernetes.io/projected/4d67e130-f1e2-4fe8-9647-8725402a1cdd-kube-api-access-vp4j9\") pod \"swift-storage-0\" (UID: \"4d67e130-f1e2-4fe8-9647-8725402a1cdd\") " pod="openstack/swift-storage-0"
Nov 27 16:19:28 crc kubenswrapper[4707]: I1127 16:19:28.536512 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/4d67e130-f1e2-4fe8-9647-8725402a1cdd-etc-swift\") pod \"swift-storage-0\" (UID: \"4d67e130-f1e2-4fe8-9647-8725402a1cdd\") " pod="openstack/swift-storage-0"
Nov 27 16:19:28 crc kubenswrapper[4707]: I1127 16:19:28.536600 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"swift-storage-0\" (UID: \"4d67e130-f1e2-4fe8-9647-8725402a1cdd\") " pod="openstack/swift-storage-0"
Nov 27 16:19:28 crc kubenswrapper[4707]: I1127 16:19:28.536711 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/4d67e130-f1e2-4fe8-9647-8725402a1cdd-cache\") pod \"swift-storage-0\" (UID: \"4d67e130-f1e2-4fe8-9647-8725402a1cdd\") " pod="openstack/swift-storage-0"
Nov 27 16:19:28 crc kubenswrapper[4707]: E1127 16:19:28.536774 4707 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found
Nov 27 16:19:28 crc kubenswrapper[4707]: E1127 16:19:28.536804 4707 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found
Nov 27 16:19:28 crc kubenswrapper[4707]: E1127 16:19:28.536877 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/4d67e130-f1e2-4fe8-9647-8725402a1cdd-etc-swift podName:4d67e130-f1e2-4fe8-9647-8725402a1cdd nodeName:}" failed. No retries permitted until 2025-11-27 16:19:29.036854888 +0000 UTC m=+944.668303746 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/4d67e130-f1e2-4fe8-9647-8725402a1cdd-etc-swift") pod "swift-storage-0" (UID: "4d67e130-f1e2-4fe8-9647-8725402a1cdd") : configmap "swift-ring-files" not found
Nov 27 16:19:28 crc kubenswrapper[4707]: I1127 16:19:28.536925 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/4d67e130-f1e2-4fe8-9647-8725402a1cdd-lock\") pod \"swift-storage-0\" (UID: \"4d67e130-f1e2-4fe8-9647-8725402a1cdd\") " pod="openstack/swift-storage-0"
Nov 27 16:19:28 crc kubenswrapper[4707]: I1127 16:19:28.537117 4707 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"swift-storage-0\" (UID: \"4d67e130-f1e2-4fe8-9647-8725402a1cdd\") device mount path \"/mnt/openstack/pv08\"" pod="openstack/swift-storage-0"
Nov 27 16:19:28 crc kubenswrapper[4707]: I1127 16:19:28.537309 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/4d67e130-f1e2-4fe8-9647-8725402a1cdd-cache\") pod \"swift-storage-0\" (UID: \"4d67e130-f1e2-4fe8-9647-8725402a1cdd\") " pod="openstack/swift-storage-0"
Nov 27 16:19:28 crc kubenswrapper[4707]: I1127 16:19:28.556770 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vp4j9\" (UniqueName: \"kubernetes.io/projected/4d67e130-f1e2-4fe8-9647-8725402a1cdd-kube-api-access-vp4j9\") pod \"swift-storage-0\" (UID: \"4d67e130-f1e2-4fe8-9647-8725402a1cdd\") " pod="openstack/swift-storage-0"
Nov 27 16:19:28 crc kubenswrapper[4707]: I1127 16:19:28.561643 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"swift-storage-0\" (UID: \"4d67e130-f1e2-4fe8-9647-8725402a1cdd\") " pod="openstack/swift-storage-0"
Nov 27 16:19:28 crc kubenswrapper[4707]: I1127 16:19:28.595833 4707 generic.go:334] "Generic (PLEG): container finished" podID="7f3e04a7-5107-48e4-897a-8a126c0b2911" containerID="6629abc42a630e4279c596f6faccb548cccb0fd2a0a73f19a940a1791e96b9c1" exitCode=0
Nov 27 16:19:28 crc kubenswrapper[4707]: I1127 16:19:28.595890 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-kdgq7" event={"ID":"7f3e04a7-5107-48e4-897a-8a126c0b2911","Type":"ContainerDied","Data":"6629abc42a630e4279c596f6faccb548cccb0fd2a0a73f19a940a1791e96b9c1"}
Nov 27 16:19:28 crc kubenswrapper[4707]: I1127 16:19:28.595947 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-kdgq7" event={"ID":"7f3e04a7-5107-48e4-897a-8a126c0b2911","Type":"ContainerStarted","Data":"71c60cd83198d2a103ec7104b0220cc38f55260931a79f3ccd2b52d2edbcb24b"}
Nov 27 16:19:28 crc kubenswrapper[4707]: I1127 16:19:28.596431 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-0"
Nov 27 16:19:28 crc kubenswrapper[4707]: I1127 16:19:28.596461 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-0"
Nov 27 16:19:28 crc kubenswrapper[4707]: I1127 16:19:28.669987 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-0"
Nov 27 16:19:28 crc kubenswrapper[4707]: I1127 16:19:28.687125 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-0"
Nov 27 16:19:28 crc kubenswrapper[4707]: I1127 16:19:28.997290 4707 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-k2z6f"
Nov 27 16:19:29 crc kubenswrapper[4707]: I1127 16:19:29.049333 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8782247a-1ddc-4e6e-9042-8c7ade00b0de-ovsdbserver-sb\") pod \"8782247a-1ddc-4e6e-9042-8c7ade00b0de\" (UID: \"8782247a-1ddc-4e6e-9042-8c7ade00b0de\") "
Nov 27 16:19:29 crc kubenswrapper[4707]: I1127 16:19:29.049440 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8782247a-1ddc-4e6e-9042-8c7ade00b0de-dns-svc\") pod \"8782247a-1ddc-4e6e-9042-8c7ade00b0de\" (UID: \"8782247a-1ddc-4e6e-9042-8c7ade00b0de\") "
Nov 27 16:19:29 crc kubenswrapper[4707]: I1127 16:19:29.049463 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8782247a-1ddc-4e6e-9042-8c7ade00b0de-ovsdbserver-nb\") pod \"8782247a-1ddc-4e6e-9042-8c7ade00b0de\" (UID: \"8782247a-1ddc-4e6e-9042-8c7ade00b0de\") "
Nov 27 16:19:29 crc kubenswrapper[4707]: I1127 16:19:29.049564 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8782247a-1ddc-4e6e-9042-8c7ade00b0de-config\") pod \"8782247a-1ddc-4e6e-9042-8c7ade00b0de\" (UID: \"8782247a-1ddc-4e6e-9042-8c7ade00b0de\") "
Nov 27 16:19:29 crc kubenswrapper[4707]: I1127 16:19:29.049615 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xc7ww\" (UniqueName: \"kubernetes.io/projected/8782247a-1ddc-4e6e-9042-8c7ade00b0de-kube-api-access-xc7ww\") pod \"8782247a-1ddc-4e6e-9042-8c7ade00b0de\" (UID: \"8782247a-1ddc-4e6e-9042-8c7ade00b0de\") "
Nov 27 16:19:29 crc kubenswrapper[4707]: I1127 16:19:29.049893 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName:
\"kubernetes.io/projected/4d67e130-f1e2-4fe8-9647-8725402a1cdd-etc-swift\") pod \"swift-storage-0\" (UID: \"4d67e130-f1e2-4fe8-9647-8725402a1cdd\") " pod="openstack/swift-storage-0" Nov 27 16:19:29 crc kubenswrapper[4707]: E1127 16:19:29.050083 4707 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Nov 27 16:19:29 crc kubenswrapper[4707]: E1127 16:19:29.050102 4707 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Nov 27 16:19:29 crc kubenswrapper[4707]: E1127 16:19:29.050151 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/4d67e130-f1e2-4fe8-9647-8725402a1cdd-etc-swift podName:4d67e130-f1e2-4fe8-9647-8725402a1cdd nodeName:}" failed. No retries permitted until 2025-11-27 16:19:30.050136776 +0000 UTC m=+945.681585544 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/4d67e130-f1e2-4fe8-9647-8725402a1cdd-etc-swift") pod "swift-storage-0" (UID: "4d67e130-f1e2-4fe8-9647-8725402a1cdd") : configmap "swift-ring-files" not found Nov 27 16:19:29 crc kubenswrapper[4707]: I1127 16:19:29.056195 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8782247a-1ddc-4e6e-9042-8c7ade00b0de-kube-api-access-xc7ww" (OuterVolumeSpecName: "kube-api-access-xc7ww") pod "8782247a-1ddc-4e6e-9042-8c7ade00b0de" (UID: "8782247a-1ddc-4e6e-9042-8c7ade00b0de"). InnerVolumeSpecName "kube-api-access-xc7ww". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 16:19:29 crc kubenswrapper[4707]: I1127 16:19:29.098677 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8782247a-1ddc-4e6e-9042-8c7ade00b0de-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "8782247a-1ddc-4e6e-9042-8c7ade00b0de" (UID: "8782247a-1ddc-4e6e-9042-8c7ade00b0de"). 
InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 16:19:29 crc kubenswrapper[4707]: I1127 16:19:29.101788 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8782247a-1ddc-4e6e-9042-8c7ade00b0de-config" (OuterVolumeSpecName: "config") pod "8782247a-1ddc-4e6e-9042-8c7ade00b0de" (UID: "8782247a-1ddc-4e6e-9042-8c7ade00b0de"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 16:19:29 crc kubenswrapper[4707]: I1127 16:19:29.102704 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8782247a-1ddc-4e6e-9042-8c7ade00b0de-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "8782247a-1ddc-4e6e-9042-8c7ade00b0de" (UID: "8782247a-1ddc-4e6e-9042-8c7ade00b0de"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 16:19:29 crc kubenswrapper[4707]: I1127 16:19:29.108898 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8782247a-1ddc-4e6e-9042-8c7ade00b0de-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "8782247a-1ddc-4e6e-9042-8c7ade00b0de" (UID: "8782247a-1ddc-4e6e-9042-8c7ade00b0de"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 16:19:29 crc kubenswrapper[4707]: I1127 16:19:29.151869 4707 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8782247a-1ddc-4e6e-9042-8c7ade00b0de-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Nov 27 16:19:29 crc kubenswrapper[4707]: I1127 16:19:29.151898 4707 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8782247a-1ddc-4e6e-9042-8c7ade00b0de-dns-svc\") on node \"crc\" DevicePath \"\"" Nov 27 16:19:29 crc kubenswrapper[4707]: I1127 16:19:29.151907 4707 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8782247a-1ddc-4e6e-9042-8c7ade00b0de-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Nov 27 16:19:29 crc kubenswrapper[4707]: I1127 16:19:29.151918 4707 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8782247a-1ddc-4e6e-9042-8c7ade00b0de-config\") on node \"crc\" DevicePath \"\"" Nov 27 16:19:29 crc kubenswrapper[4707]: I1127 16:19:29.151927 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xc7ww\" (UniqueName: \"kubernetes.io/projected/8782247a-1ddc-4e6e-9042-8c7ade00b0de-kube-api-access-xc7ww\") on node \"crc\" DevicePath \"\"" Nov 27 16:19:29 crc kubenswrapper[4707]: I1127 16:19:29.241641 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-northd-0"] Nov 27 16:19:29 crc kubenswrapper[4707]: E1127 16:19:29.242223 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8782247a-1ddc-4e6e-9042-8c7ade00b0de" containerName="init" Nov 27 16:19:29 crc kubenswrapper[4707]: I1127 16:19:29.242239 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="8782247a-1ddc-4e6e-9042-8c7ade00b0de" containerName="init" Nov 27 16:19:29 crc kubenswrapper[4707]: I1127 16:19:29.245085 4707 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="8782247a-1ddc-4e6e-9042-8c7ade00b0de" containerName="init" Nov 27 16:19:29 crc kubenswrapper[4707]: I1127 16:19:29.245970 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Nov 27 16:19:29 crc kubenswrapper[4707]: I1127 16:19:29.249902 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-config" Nov 27 16:19:29 crc kubenswrapper[4707]: I1127 16:19:29.250968 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovnnorthd-ovndbs" Nov 27 16:19:29 crc kubenswrapper[4707]: I1127 16:19:29.251091 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovnnorthd-ovnnorthd-dockercfg-4n6vt" Nov 27 16:19:29 crc kubenswrapper[4707]: I1127 16:19:29.251257 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-scripts" Nov 27 16:19:29 crc kubenswrapper[4707]: I1127 16:19:29.270833 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Nov 27 16:19:29 crc kubenswrapper[4707]: I1127 16:19:29.362831 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/bf62ccfb-9c5b-4c2c-a4d7-5cb9add147f2-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"bf62ccfb-9c5b-4c2c-a4d7-5cb9add147f2\") " pod="openstack/ovn-northd-0" Nov 27 16:19:29 crc kubenswrapper[4707]: I1127 16:19:29.363068 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/bf62ccfb-9c5b-4c2c-a4d7-5cb9add147f2-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"bf62ccfb-9c5b-4c2c-a4d7-5cb9add147f2\") " pod="openstack/ovn-northd-0" Nov 27 16:19:29 crc kubenswrapper[4707]: I1127 16:19:29.363131 4707 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bf62ccfb-9c5b-4c2c-a4d7-5cb9add147f2-config\") pod \"ovn-northd-0\" (UID: \"bf62ccfb-9c5b-4c2c-a4d7-5cb9add147f2\") " pod="openstack/ovn-northd-0" Nov 27 16:19:29 crc kubenswrapper[4707]: I1127 16:19:29.363346 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/bf62ccfb-9c5b-4c2c-a4d7-5cb9add147f2-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"bf62ccfb-9c5b-4c2c-a4d7-5cb9add147f2\") " pod="openstack/ovn-northd-0" Nov 27 16:19:29 crc kubenswrapper[4707]: I1127 16:19:29.363461 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fvbd9\" (UniqueName: \"kubernetes.io/projected/bf62ccfb-9c5b-4c2c-a4d7-5cb9add147f2-kube-api-access-fvbd9\") pod \"ovn-northd-0\" (UID: \"bf62ccfb-9c5b-4c2c-a4d7-5cb9add147f2\") " pod="openstack/ovn-northd-0" Nov 27 16:19:29 crc kubenswrapper[4707]: I1127 16:19:29.363496 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bf62ccfb-9c5b-4c2c-a4d7-5cb9add147f2-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"bf62ccfb-9c5b-4c2c-a4d7-5cb9add147f2\") " pod="openstack/ovn-northd-0" Nov 27 16:19:29 crc kubenswrapper[4707]: I1127 16:19:29.363805 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/bf62ccfb-9c5b-4c2c-a4d7-5cb9add147f2-scripts\") pod \"ovn-northd-0\" (UID: \"bf62ccfb-9c5b-4c2c-a4d7-5cb9add147f2\") " pod="openstack/ovn-northd-0" Nov 27 16:19:29 crc kubenswrapper[4707]: E1127 16:19:29.403716 4707 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: 
[\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8782247a_1ddc_4e6e_9042_8c7ade00b0de.slice/crio-55b08755188015a5196dac6130fc5161bad8c5b82847d463d19d6c7e50ca6dcb\": RecentStats: unable to find data in memory cache]" Nov 27 16:19:29 crc kubenswrapper[4707]: I1127 16:19:29.465877 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bf62ccfb-9c5b-4c2c-a4d7-5cb9add147f2-config\") pod \"ovn-northd-0\" (UID: \"bf62ccfb-9c5b-4c2c-a4d7-5cb9add147f2\") " pod="openstack/ovn-northd-0" Nov 27 16:19:29 crc kubenswrapper[4707]: I1127 16:19:29.465926 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/bf62ccfb-9c5b-4c2c-a4d7-5cb9add147f2-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"bf62ccfb-9c5b-4c2c-a4d7-5cb9add147f2\") " pod="openstack/ovn-northd-0" Nov 27 16:19:29 crc kubenswrapper[4707]: I1127 16:19:29.466380 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/bf62ccfb-9c5b-4c2c-a4d7-5cb9add147f2-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"bf62ccfb-9c5b-4c2c-a4d7-5cb9add147f2\") " pod="openstack/ovn-northd-0" Nov 27 16:19:29 crc kubenswrapper[4707]: I1127 16:19:29.466425 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fvbd9\" (UniqueName: \"kubernetes.io/projected/bf62ccfb-9c5b-4c2c-a4d7-5cb9add147f2-kube-api-access-fvbd9\") pod \"ovn-northd-0\" (UID: \"bf62ccfb-9c5b-4c2c-a4d7-5cb9add147f2\") " pod="openstack/ovn-northd-0" Nov 27 16:19:29 crc kubenswrapper[4707]: I1127 16:19:29.466463 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bf62ccfb-9c5b-4c2c-a4d7-5cb9add147f2-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"bf62ccfb-9c5b-4c2c-a4d7-5cb9add147f2\") " pod="openstack/ovn-northd-0" 
Nov 27 16:19:29 crc kubenswrapper[4707]: I1127 16:19:29.467148 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/bf62ccfb-9c5b-4c2c-a4d7-5cb9add147f2-scripts\") pod \"ovn-northd-0\" (UID: \"bf62ccfb-9c5b-4c2c-a4d7-5cb9add147f2\") " pod="openstack/ovn-northd-0" Nov 27 16:19:29 crc kubenswrapper[4707]: I1127 16:19:29.467203 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/bf62ccfb-9c5b-4c2c-a4d7-5cb9add147f2-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"bf62ccfb-9c5b-4c2c-a4d7-5cb9add147f2\") " pod="openstack/ovn-northd-0" Nov 27 16:19:29 crc kubenswrapper[4707]: I1127 16:19:29.467252 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/bf62ccfb-9c5b-4c2c-a4d7-5cb9add147f2-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"bf62ccfb-9c5b-4c2c-a4d7-5cb9add147f2\") " pod="openstack/ovn-northd-0" Nov 27 16:19:29 crc kubenswrapper[4707]: I1127 16:19:29.467276 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bf62ccfb-9c5b-4c2c-a4d7-5cb9add147f2-config\") pod \"ovn-northd-0\" (UID: \"bf62ccfb-9c5b-4c2c-a4d7-5cb9add147f2\") " pod="openstack/ovn-northd-0" Nov 27 16:19:29 crc kubenswrapper[4707]: I1127 16:19:29.467894 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/bf62ccfb-9c5b-4c2c-a4d7-5cb9add147f2-scripts\") pod \"ovn-northd-0\" (UID: \"bf62ccfb-9c5b-4c2c-a4d7-5cb9add147f2\") " pod="openstack/ovn-northd-0" Nov 27 16:19:29 crc kubenswrapper[4707]: I1127 16:19:29.471616 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-northd-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/bf62ccfb-9c5b-4c2c-a4d7-5cb9add147f2-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"bf62ccfb-9c5b-4c2c-a4d7-5cb9add147f2\") " pod="openstack/ovn-northd-0" Nov 27 16:19:29 crc kubenswrapper[4707]: I1127 16:19:29.482439 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bf62ccfb-9c5b-4c2c-a4d7-5cb9add147f2-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"bf62ccfb-9c5b-4c2c-a4d7-5cb9add147f2\") " pod="openstack/ovn-northd-0" Nov 27 16:19:29 crc kubenswrapper[4707]: I1127 16:19:29.483632 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fvbd9\" (UniqueName: \"kubernetes.io/projected/bf62ccfb-9c5b-4c2c-a4d7-5cb9add147f2-kube-api-access-fvbd9\") pod \"ovn-northd-0\" (UID: \"bf62ccfb-9c5b-4c2c-a4d7-5cb9add147f2\") " pod="openstack/ovn-northd-0" Nov 27 16:19:29 crc kubenswrapper[4707]: I1127 16:19:29.483859 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/bf62ccfb-9c5b-4c2c-a4d7-5cb9add147f2-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"bf62ccfb-9c5b-4c2c-a4d7-5cb9add147f2\") " pod="openstack/ovn-northd-0" Nov 27 16:19:29 crc kubenswrapper[4707]: I1127 16:19:29.576786 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-northd-0" Nov 27 16:19:29 crc kubenswrapper[4707]: I1127 16:19:29.605851 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-kdgq7" event={"ID":"7f3e04a7-5107-48e4-897a-8a126c0b2911","Type":"ContainerStarted","Data":"d95331cb09f33591d1df0d76198fbd67a3b71fc251fe031ec1da54f833d0221b"} Nov 27 16:19:29 crc kubenswrapper[4707]: I1127 16:19:29.606411 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-698758b865-kdgq7" Nov 27 16:19:29 crc kubenswrapper[4707]: I1127 16:19:29.610036 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-k2z6f" Nov 27 16:19:29 crc kubenswrapper[4707]: I1127 16:19:29.610092 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-k2z6f" event={"ID":"8782247a-1ddc-4e6e-9042-8c7ade00b0de","Type":"ContainerDied","Data":"55b08755188015a5196dac6130fc5161bad8c5b82847d463d19d6c7e50ca6dcb"} Nov 27 16:19:29 crc kubenswrapper[4707]: I1127 16:19:29.610129 4707 scope.go:117] "RemoveContainer" containerID="518e320d4e87cf89aae54333f75cd06a806c1c30c6423122a8977eb41312e75d" Nov 27 16:19:29 crc kubenswrapper[4707]: I1127 16:19:29.628967 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-698758b865-kdgq7" podStartSLOduration=2.628947417 podStartE2EDuration="2.628947417s" podCreationTimestamp="2025-11-27 16:19:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-27 16:19:29.624754065 +0000 UTC m=+945.256202833" watchObservedRunningTime="2025-11-27 16:19:29.628947417 +0000 UTC m=+945.260396185" Nov 27 16:19:29 crc kubenswrapper[4707]: I1127 16:19:29.697415 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-k2z6f"] Nov 27 16:19:29 crc kubenswrapper[4707]: 
I1127 16:19:29.702501 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-k2z6f"] Nov 27 16:19:30 crc kubenswrapper[4707]: I1127 16:19:30.039899 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Nov 27 16:19:30 crc kubenswrapper[4707]: W1127 16:19:30.051086 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbf62ccfb_9c5b_4c2c_a4d7_5cb9add147f2.slice/crio-f277432f4cc56e1c969646788e12516af6ffeb4c78eef2027a6a2a738e4491d2 WatchSource:0}: Error finding container f277432f4cc56e1c969646788e12516af6ffeb4c78eef2027a6a2a738e4491d2: Status 404 returned error can't find the container with id f277432f4cc56e1c969646788e12516af6ffeb4c78eef2027a6a2a738e4491d2 Nov 27 16:19:30 crc kubenswrapper[4707]: I1127 16:19:30.086488 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/4d67e130-f1e2-4fe8-9647-8725402a1cdd-etc-swift\") pod \"swift-storage-0\" (UID: \"4d67e130-f1e2-4fe8-9647-8725402a1cdd\") " pod="openstack/swift-storage-0" Nov 27 16:19:30 crc kubenswrapper[4707]: E1127 16:19:30.086675 4707 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Nov 27 16:19:30 crc kubenswrapper[4707]: E1127 16:19:30.086706 4707 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Nov 27 16:19:30 crc kubenswrapper[4707]: E1127 16:19:30.086783 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/4d67e130-f1e2-4fe8-9647-8725402a1cdd-etc-swift podName:4d67e130-f1e2-4fe8-9647-8725402a1cdd nodeName:}" failed. No retries permitted until 2025-11-27 16:19:32.086759506 +0000 UTC m=+947.718208314 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/4d67e130-f1e2-4fe8-9647-8725402a1cdd-etc-swift") pod "swift-storage-0" (UID: "4d67e130-f1e2-4fe8-9647-8725402a1cdd") : configmap "swift-ring-files" not found Nov 27 16:19:30 crc kubenswrapper[4707]: I1127 16:19:30.619252 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"bf62ccfb-9c5b-4c2c-a4d7-5cb9add147f2","Type":"ContainerStarted","Data":"f277432f4cc56e1c969646788e12516af6ffeb4c78eef2027a6a2a738e4491d2"} Nov 27 16:19:31 crc kubenswrapper[4707]: I1127 16:19:31.206096 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8782247a-1ddc-4e6e-9042-8c7ade00b0de" path="/var/lib/kubelet/pods/8782247a-1ddc-4e6e-9042-8c7ade00b0de/volumes" Nov 27 16:19:31 crc kubenswrapper[4707]: I1127 16:19:31.401665 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-qcjdd" Nov 27 16:19:31 crc kubenswrapper[4707]: I1127 16:19:31.401713 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-qcjdd" Nov 27 16:19:31 crc kubenswrapper[4707]: I1127 16:19:31.463196 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-qcjdd" Nov 27 16:19:31 crc kubenswrapper[4707]: I1127 16:19:31.629309 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"bf62ccfb-9c5b-4c2c-a4d7-5cb9add147f2","Type":"ContainerStarted","Data":"1017d37ada9fe4c7c23aa01405c418b95ebc716c4ee27f12a8752d83f8a2d9f0"} Nov 27 16:19:31 crc kubenswrapper[4707]: I1127 16:19:31.710921 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-qcjdd" Nov 27 16:19:31 crc kubenswrapper[4707]: I1127 16:19:31.766386 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/certified-operators-qcjdd"] Nov 27 16:19:32 crc kubenswrapper[4707]: I1127 16:19:32.121141 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/4d67e130-f1e2-4fe8-9647-8725402a1cdd-etc-swift\") pod \"swift-storage-0\" (UID: \"4d67e130-f1e2-4fe8-9647-8725402a1cdd\") " pod="openstack/swift-storage-0" Nov 27 16:19:32 crc kubenswrapper[4707]: E1127 16:19:32.121502 4707 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Nov 27 16:19:32 crc kubenswrapper[4707]: E1127 16:19:32.121543 4707 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Nov 27 16:19:32 crc kubenswrapper[4707]: E1127 16:19:32.121639 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/4d67e130-f1e2-4fe8-9647-8725402a1cdd-etc-swift podName:4d67e130-f1e2-4fe8-9647-8725402a1cdd nodeName:}" failed. No retries permitted until 2025-11-27 16:19:36.121611975 +0000 UTC m=+951.753060773 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/4d67e130-f1e2-4fe8-9647-8725402a1cdd-etc-swift") pod "swift-storage-0" (UID: "4d67e130-f1e2-4fe8-9647-8725402a1cdd") : configmap "swift-ring-files" not found Nov 27 16:19:32 crc kubenswrapper[4707]: I1127 16:19:32.329909 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-vbngp"] Nov 27 16:19:32 crc kubenswrapper[4707]: I1127 16:19:32.331128 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-vbngp" Nov 27 16:19:32 crc kubenswrapper[4707]: I1127 16:19:32.333881 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Nov 27 16:19:32 crc kubenswrapper[4707]: I1127 16:19:32.333984 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-config-data" Nov 27 16:19:32 crc kubenswrapper[4707]: I1127 16:19:32.334108 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-scripts" Nov 27 16:19:32 crc kubenswrapper[4707]: I1127 16:19:32.347804 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-vbngp"] Nov 27 16:19:32 crc kubenswrapper[4707]: I1127 16:19:32.426023 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/26d4145c-3144-4e1f-99ce-08d64f8b20be-swiftconf\") pod \"swift-ring-rebalance-vbngp\" (UID: \"26d4145c-3144-4e1f-99ce-08d64f8b20be\") " pod="openstack/swift-ring-rebalance-vbngp" Nov 27 16:19:32 crc kubenswrapper[4707]: I1127 16:19:32.426310 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/26d4145c-3144-4e1f-99ce-08d64f8b20be-combined-ca-bundle\") pod \"swift-ring-rebalance-vbngp\" (UID: \"26d4145c-3144-4e1f-99ce-08d64f8b20be\") " pod="openstack/swift-ring-rebalance-vbngp" Nov 27 16:19:32 crc kubenswrapper[4707]: I1127 16:19:32.426364 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/26d4145c-3144-4e1f-99ce-08d64f8b20be-ring-data-devices\") pod \"swift-ring-rebalance-vbngp\" (UID: \"26d4145c-3144-4e1f-99ce-08d64f8b20be\") " pod="openstack/swift-ring-rebalance-vbngp" Nov 27 16:19:32 crc kubenswrapper[4707]: I1127 
16:19:32.426431 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/26d4145c-3144-4e1f-99ce-08d64f8b20be-dispersionconf\") pod \"swift-ring-rebalance-vbngp\" (UID: \"26d4145c-3144-4e1f-99ce-08d64f8b20be\") " pod="openstack/swift-ring-rebalance-vbngp" Nov 27 16:19:32 crc kubenswrapper[4707]: I1127 16:19:32.426641 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ktfrq\" (UniqueName: \"kubernetes.io/projected/26d4145c-3144-4e1f-99ce-08d64f8b20be-kube-api-access-ktfrq\") pod \"swift-ring-rebalance-vbngp\" (UID: \"26d4145c-3144-4e1f-99ce-08d64f8b20be\") " pod="openstack/swift-ring-rebalance-vbngp" Nov 27 16:19:32 crc kubenswrapper[4707]: I1127 16:19:32.426705 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/26d4145c-3144-4e1f-99ce-08d64f8b20be-scripts\") pod \"swift-ring-rebalance-vbngp\" (UID: \"26d4145c-3144-4e1f-99ce-08d64f8b20be\") " pod="openstack/swift-ring-rebalance-vbngp" Nov 27 16:19:32 crc kubenswrapper[4707]: I1127 16:19:32.426738 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/26d4145c-3144-4e1f-99ce-08d64f8b20be-etc-swift\") pod \"swift-ring-rebalance-vbngp\" (UID: \"26d4145c-3144-4e1f-99ce-08d64f8b20be\") " pod="openstack/swift-ring-rebalance-vbngp" Nov 27 16:19:32 crc kubenswrapper[4707]: I1127 16:19:32.527786 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ktfrq\" (UniqueName: \"kubernetes.io/projected/26d4145c-3144-4e1f-99ce-08d64f8b20be-kube-api-access-ktfrq\") pod \"swift-ring-rebalance-vbngp\" (UID: \"26d4145c-3144-4e1f-99ce-08d64f8b20be\") " pod="openstack/swift-ring-rebalance-vbngp" Nov 27 16:19:32 crc kubenswrapper[4707]: 
I1127 16:19:32.527844 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/26d4145c-3144-4e1f-99ce-08d64f8b20be-scripts\") pod \"swift-ring-rebalance-vbngp\" (UID: \"26d4145c-3144-4e1f-99ce-08d64f8b20be\") " pod="openstack/swift-ring-rebalance-vbngp" Nov 27 16:19:32 crc kubenswrapper[4707]: I1127 16:19:32.527875 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/26d4145c-3144-4e1f-99ce-08d64f8b20be-etc-swift\") pod \"swift-ring-rebalance-vbngp\" (UID: \"26d4145c-3144-4e1f-99ce-08d64f8b20be\") " pod="openstack/swift-ring-rebalance-vbngp" Nov 27 16:19:32 crc kubenswrapper[4707]: I1127 16:19:32.527929 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/26d4145c-3144-4e1f-99ce-08d64f8b20be-swiftconf\") pod \"swift-ring-rebalance-vbngp\" (UID: \"26d4145c-3144-4e1f-99ce-08d64f8b20be\") " pod="openstack/swift-ring-rebalance-vbngp" Nov 27 16:19:32 crc kubenswrapper[4707]: I1127 16:19:32.528015 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/26d4145c-3144-4e1f-99ce-08d64f8b20be-combined-ca-bundle\") pod \"swift-ring-rebalance-vbngp\" (UID: \"26d4145c-3144-4e1f-99ce-08d64f8b20be\") " pod="openstack/swift-ring-rebalance-vbngp" Nov 27 16:19:32 crc kubenswrapper[4707]: I1127 16:19:32.528036 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/26d4145c-3144-4e1f-99ce-08d64f8b20be-ring-data-devices\") pod \"swift-ring-rebalance-vbngp\" (UID: \"26d4145c-3144-4e1f-99ce-08d64f8b20be\") " pod="openstack/swift-ring-rebalance-vbngp" Nov 27 16:19:32 crc kubenswrapper[4707]: I1127 16:19:32.528059 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/26d4145c-3144-4e1f-99ce-08d64f8b20be-dispersionconf\") pod \"swift-ring-rebalance-vbngp\" (UID: \"26d4145c-3144-4e1f-99ce-08d64f8b20be\") " pod="openstack/swift-ring-rebalance-vbngp" Nov 27 16:19:32 crc kubenswrapper[4707]: I1127 16:19:32.528957 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/26d4145c-3144-4e1f-99ce-08d64f8b20be-ring-data-devices\") pod \"swift-ring-rebalance-vbngp\" (UID: \"26d4145c-3144-4e1f-99ce-08d64f8b20be\") " pod="openstack/swift-ring-rebalance-vbngp" Nov 27 16:19:32 crc kubenswrapper[4707]: I1127 16:19:32.529132 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/26d4145c-3144-4e1f-99ce-08d64f8b20be-scripts\") pod \"swift-ring-rebalance-vbngp\" (UID: \"26d4145c-3144-4e1f-99ce-08d64f8b20be\") " pod="openstack/swift-ring-rebalance-vbngp" Nov 27 16:19:32 crc kubenswrapper[4707]: I1127 16:19:32.529410 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/26d4145c-3144-4e1f-99ce-08d64f8b20be-etc-swift\") pod \"swift-ring-rebalance-vbngp\" (UID: \"26d4145c-3144-4e1f-99ce-08d64f8b20be\") " pod="openstack/swift-ring-rebalance-vbngp" Nov 27 16:19:32 crc kubenswrapper[4707]: I1127 16:19:32.532362 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/26d4145c-3144-4e1f-99ce-08d64f8b20be-dispersionconf\") pod \"swift-ring-rebalance-vbngp\" (UID: \"26d4145c-3144-4e1f-99ce-08d64f8b20be\") " pod="openstack/swift-ring-rebalance-vbngp" Nov 27 16:19:32 crc kubenswrapper[4707]: I1127 16:19:32.533905 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/26d4145c-3144-4e1f-99ce-08d64f8b20be-combined-ca-bundle\") pod 
\"swift-ring-rebalance-vbngp\" (UID: \"26d4145c-3144-4e1f-99ce-08d64f8b20be\") " pod="openstack/swift-ring-rebalance-vbngp" Nov 27 16:19:32 crc kubenswrapper[4707]: I1127 16:19:32.534590 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/26d4145c-3144-4e1f-99ce-08d64f8b20be-swiftconf\") pod \"swift-ring-rebalance-vbngp\" (UID: \"26d4145c-3144-4e1f-99ce-08d64f8b20be\") " pod="openstack/swift-ring-rebalance-vbngp" Nov 27 16:19:32 crc kubenswrapper[4707]: I1127 16:19:32.544748 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ktfrq\" (UniqueName: \"kubernetes.io/projected/26d4145c-3144-4e1f-99ce-08d64f8b20be-kube-api-access-ktfrq\") pod \"swift-ring-rebalance-vbngp\" (UID: \"26d4145c-3144-4e1f-99ce-08d64f8b20be\") " pod="openstack/swift-ring-rebalance-vbngp" Nov 27 16:19:32 crc kubenswrapper[4707]: I1127 16:19:32.638319 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"bf62ccfb-9c5b-4c2c-a4d7-5cb9add147f2","Type":"ContainerStarted","Data":"0677aa00b62195b52d08d9d64beef82bb000273b2b26166921a5063055e7eb48"} Nov 27 16:19:32 crc kubenswrapper[4707]: I1127 16:19:32.655492 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-vbngp" Nov 27 16:19:33 crc kubenswrapper[4707]: I1127 16:19:33.149480 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-northd-0" podStartSLOduration=2.954480392 podStartE2EDuration="4.149465883s" podCreationTimestamp="2025-11-27 16:19:29 +0000 UTC" firstStartedPulling="2025-11-27 16:19:30.053834646 +0000 UTC m=+945.685283414" lastFinishedPulling="2025-11-27 16:19:31.248820137 +0000 UTC m=+946.880268905" observedRunningTime="2025-11-27 16:19:32.669184437 +0000 UTC m=+948.300633215" watchObservedRunningTime="2025-11-27 16:19:33.149465883 +0000 UTC m=+948.780914651" Nov 27 16:19:33 crc kubenswrapper[4707]: I1127 16:19:33.152364 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-vbngp"] Nov 27 16:19:33 crc kubenswrapper[4707]: W1127 16:19:33.166770 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod26d4145c_3144_4e1f_99ce_08d64f8b20be.slice/crio-962622b3b6a3211fdef3610dcb44ce8b04124f34a0a4967d59f358deb68b93c6 WatchSource:0}: Error finding container 962622b3b6a3211fdef3610dcb44ce8b04124f34a0a4967d59f358deb68b93c6: Status 404 returned error can't find the container with id 962622b3b6a3211fdef3610dcb44ce8b04124f34a0a4967d59f358deb68b93c6 Nov 27 16:19:33 crc kubenswrapper[4707]: I1127 16:19:33.516321 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-galera-0" Nov 27 16:19:33 crc kubenswrapper[4707]: I1127 16:19:33.516446 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-galera-0" Nov 27 16:19:33 crc kubenswrapper[4707]: I1127 16:19:33.624154 4707 patch_prober.go:28] interesting pod/machine-config-daemon-c995m container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get 
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 27 16:19:33 crc kubenswrapper[4707]: I1127 16:19:33.624236 4707 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-c995m" podUID="a83beb0d-8dd1-434a-ace2-933f98e3956f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 27 16:19:33 crc kubenswrapper[4707]: I1127 16:19:33.637100 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-galera-0" Nov 27 16:19:33 crc kubenswrapper[4707]: I1127 16:19:33.649523 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-vbngp" event={"ID":"26d4145c-3144-4e1f-99ce-08d64f8b20be","Type":"ContainerStarted","Data":"962622b3b6a3211fdef3610dcb44ce8b04124f34a0a4967d59f358deb68b93c6"} Nov 27 16:19:33 crc kubenswrapper[4707]: I1127 16:19:33.649645 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-qcjdd" podUID="ce5bfb39-c7b0-4317-b53f-5f4280341da4" containerName="registry-server" containerID="cri-o://caba6a670a04a1ff1bbbf86d948f36139fcda3d106a948b6c775b9d512946fb7" gracePeriod=2 Nov 27 16:19:33 crc kubenswrapper[4707]: I1127 16:19:33.650157 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-northd-0" Nov 27 16:19:33 crc kubenswrapper[4707]: I1127 16:19:33.782649 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-galera-0" Nov 27 16:19:34 crc kubenswrapper[4707]: I1127 16:19:34.140927 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-qcjdd" Nov 27 16:19:34 crc kubenswrapper[4707]: I1127 16:19:34.259410 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ce5bfb39-c7b0-4317-b53f-5f4280341da4-catalog-content\") pod \"ce5bfb39-c7b0-4317-b53f-5f4280341da4\" (UID: \"ce5bfb39-c7b0-4317-b53f-5f4280341da4\") " Nov 27 16:19:34 crc kubenswrapper[4707]: I1127 16:19:34.259634 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c9dbs\" (UniqueName: \"kubernetes.io/projected/ce5bfb39-c7b0-4317-b53f-5f4280341da4-kube-api-access-c9dbs\") pod \"ce5bfb39-c7b0-4317-b53f-5f4280341da4\" (UID: \"ce5bfb39-c7b0-4317-b53f-5f4280341da4\") " Nov 27 16:19:34 crc kubenswrapper[4707]: I1127 16:19:34.259689 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ce5bfb39-c7b0-4317-b53f-5f4280341da4-utilities\") pod \"ce5bfb39-c7b0-4317-b53f-5f4280341da4\" (UID: \"ce5bfb39-c7b0-4317-b53f-5f4280341da4\") " Nov 27 16:19:34 crc kubenswrapper[4707]: I1127 16:19:34.260541 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ce5bfb39-c7b0-4317-b53f-5f4280341da4-utilities" (OuterVolumeSpecName: "utilities") pod "ce5bfb39-c7b0-4317-b53f-5f4280341da4" (UID: "ce5bfb39-c7b0-4317-b53f-5f4280341da4"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 27 16:19:34 crc kubenswrapper[4707]: I1127 16:19:34.260847 4707 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ce5bfb39-c7b0-4317-b53f-5f4280341da4-utilities\") on node \"crc\" DevicePath \"\"" Nov 27 16:19:34 crc kubenswrapper[4707]: I1127 16:19:34.290939 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ce5bfb39-c7b0-4317-b53f-5f4280341da4-kube-api-access-c9dbs" (OuterVolumeSpecName: "kube-api-access-c9dbs") pod "ce5bfb39-c7b0-4317-b53f-5f4280341da4" (UID: "ce5bfb39-c7b0-4317-b53f-5f4280341da4"). InnerVolumeSpecName "kube-api-access-c9dbs". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 16:19:34 crc kubenswrapper[4707]: I1127 16:19:34.312898 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ce5bfb39-c7b0-4317-b53f-5f4280341da4-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ce5bfb39-c7b0-4317-b53f-5f4280341da4" (UID: "ce5bfb39-c7b0-4317-b53f-5f4280341da4"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 27 16:19:34 crc kubenswrapper[4707]: I1127 16:19:34.362504 4707 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ce5bfb39-c7b0-4317-b53f-5f4280341da4-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 27 16:19:34 crc kubenswrapper[4707]: I1127 16:19:34.362557 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c9dbs\" (UniqueName: \"kubernetes.io/projected/ce5bfb39-c7b0-4317-b53f-5f4280341da4-kube-api-access-c9dbs\") on node \"crc\" DevicePath \"\"" Nov 27 16:19:34 crc kubenswrapper[4707]: I1127 16:19:34.659682 4707 generic.go:334] "Generic (PLEG): container finished" podID="ce5bfb39-c7b0-4317-b53f-5f4280341da4" containerID="caba6a670a04a1ff1bbbf86d948f36139fcda3d106a948b6c775b9d512946fb7" exitCode=0 Nov 27 16:19:34 crc kubenswrapper[4707]: I1127 16:19:34.659787 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-qcjdd" Nov 27 16:19:34 crc kubenswrapper[4707]: I1127 16:19:34.659790 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qcjdd" event={"ID":"ce5bfb39-c7b0-4317-b53f-5f4280341da4","Type":"ContainerDied","Data":"caba6a670a04a1ff1bbbf86d948f36139fcda3d106a948b6c775b9d512946fb7"} Nov 27 16:19:34 crc kubenswrapper[4707]: I1127 16:19:34.659850 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qcjdd" event={"ID":"ce5bfb39-c7b0-4317-b53f-5f4280341da4","Type":"ContainerDied","Data":"77d0acfa0bcdf6a8c71031139cffecc45e5582836eabaa69b55a6615b2268740"} Nov 27 16:19:34 crc kubenswrapper[4707]: I1127 16:19:34.659877 4707 scope.go:117] "RemoveContainer" containerID="caba6a670a04a1ff1bbbf86d948f36139fcda3d106a948b6c775b9d512946fb7" Nov 27 16:19:34 crc kubenswrapper[4707]: I1127 16:19:34.705466 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/certified-operators-qcjdd"] Nov 27 16:19:34 crc kubenswrapper[4707]: I1127 16:19:34.708982 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-qcjdd"] Nov 27 16:19:34 crc kubenswrapper[4707]: I1127 16:19:34.991763 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-create-9dglp"] Nov 27 16:19:34 crc kubenswrapper[4707]: E1127 16:19:34.992076 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ce5bfb39-c7b0-4317-b53f-5f4280341da4" containerName="extract-utilities" Nov 27 16:19:34 crc kubenswrapper[4707]: I1127 16:19:34.992093 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="ce5bfb39-c7b0-4317-b53f-5f4280341da4" containerName="extract-utilities" Nov 27 16:19:34 crc kubenswrapper[4707]: E1127 16:19:34.992108 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ce5bfb39-c7b0-4317-b53f-5f4280341da4" containerName="registry-server" Nov 27 16:19:34 crc kubenswrapper[4707]: I1127 16:19:34.992116 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="ce5bfb39-c7b0-4317-b53f-5f4280341da4" containerName="registry-server" Nov 27 16:19:34 crc kubenswrapper[4707]: E1127 16:19:34.992125 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ce5bfb39-c7b0-4317-b53f-5f4280341da4" containerName="extract-content" Nov 27 16:19:34 crc kubenswrapper[4707]: I1127 16:19:34.992134 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="ce5bfb39-c7b0-4317-b53f-5f4280341da4" containerName="extract-content" Nov 27 16:19:34 crc kubenswrapper[4707]: I1127 16:19:34.992284 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="ce5bfb39-c7b0-4317-b53f-5f4280341da4" containerName="registry-server" Nov 27 16:19:34 crc kubenswrapper[4707]: I1127 16:19:34.993202 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-9dglp" Nov 27 16:19:35 crc kubenswrapper[4707]: I1127 16:19:35.000654 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-9dglp"] Nov 27 16:19:35 crc kubenswrapper[4707]: I1127 16:19:35.031034 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-cell1-galera-0" Nov 27 16:19:35 crc kubenswrapper[4707]: I1127 16:19:35.031075 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-cell1-galera-0" Nov 27 16:19:35 crc kubenswrapper[4707]: I1127 16:19:35.069126 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-b279-account-create-update-ndbrb"] Nov 27 16:19:35 crc kubenswrapper[4707]: I1127 16:19:35.070203 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-b279-account-create-update-ndbrb" Nov 27 16:19:35 crc kubenswrapper[4707]: I1127 16:19:35.072639 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret" Nov 27 16:19:35 crc kubenswrapper[4707]: I1127 16:19:35.074712 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-b279-account-create-update-ndbrb"] Nov 27 16:19:35 crc kubenswrapper[4707]: I1127 16:19:35.075198 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hrcmp\" (UniqueName: \"kubernetes.io/projected/4f5ca70a-f09d-4baf-a71e-21c7aeefb867-kube-api-access-hrcmp\") pod \"keystone-db-create-9dglp\" (UID: \"4f5ca70a-f09d-4baf-a71e-21c7aeefb867\") " pod="openstack/keystone-db-create-9dglp" Nov 27 16:19:35 crc kubenswrapper[4707]: I1127 16:19:35.075403 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4f5ca70a-f09d-4baf-a71e-21c7aeefb867-operator-scripts\") pod 
\"keystone-db-create-9dglp\" (UID: \"4f5ca70a-f09d-4baf-a71e-21c7aeefb867\") " pod="openstack/keystone-db-create-9dglp" Nov 27 16:19:35 crc kubenswrapper[4707]: I1127 16:19:35.126661 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-cell1-galera-0" Nov 27 16:19:35 crc kubenswrapper[4707]: I1127 16:19:35.166401 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-create-dr4hp"] Nov 27 16:19:35 crc kubenswrapper[4707]: I1127 16:19:35.167506 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-dr4hp" Nov 27 16:19:35 crc kubenswrapper[4707]: I1127 16:19:35.176851 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4f5ca70a-f09d-4baf-a71e-21c7aeefb867-operator-scripts\") pod \"keystone-db-create-9dglp\" (UID: \"4f5ca70a-f09d-4baf-a71e-21c7aeefb867\") " pod="openstack/keystone-db-create-9dglp" Nov 27 16:19:35 crc kubenswrapper[4707]: I1127 16:19:35.176906 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dtcxm\" (UniqueName: \"kubernetes.io/projected/5844334a-ffad-42d6-a2cb-b714fe85f90f-kube-api-access-dtcxm\") pod \"keystone-b279-account-create-update-ndbrb\" (UID: \"5844334a-ffad-42d6-a2cb-b714fe85f90f\") " pod="openstack/keystone-b279-account-create-update-ndbrb" Nov 27 16:19:35 crc kubenswrapper[4707]: I1127 16:19:35.176957 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5844334a-ffad-42d6-a2cb-b714fe85f90f-operator-scripts\") pod \"keystone-b279-account-create-update-ndbrb\" (UID: \"5844334a-ffad-42d6-a2cb-b714fe85f90f\") " pod="openstack/keystone-b279-account-create-update-ndbrb" Nov 27 16:19:35 crc kubenswrapper[4707]: I1127 16:19:35.177006 4707 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hrcmp\" (UniqueName: \"kubernetes.io/projected/4f5ca70a-f09d-4baf-a71e-21c7aeefb867-kube-api-access-hrcmp\") pod \"keystone-db-create-9dglp\" (UID: \"4f5ca70a-f09d-4baf-a71e-21c7aeefb867\") " pod="openstack/keystone-db-create-9dglp" Nov 27 16:19:35 crc kubenswrapper[4707]: I1127 16:19:35.177879 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4f5ca70a-f09d-4baf-a71e-21c7aeefb867-operator-scripts\") pod \"keystone-db-create-9dglp\" (UID: \"4f5ca70a-f09d-4baf-a71e-21c7aeefb867\") " pod="openstack/keystone-db-create-9dglp" Nov 27 16:19:35 crc kubenswrapper[4707]: I1127 16:19:35.182456 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-dr4hp"] Nov 27 16:19:35 crc kubenswrapper[4707]: I1127 16:19:35.193209 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hrcmp\" (UniqueName: \"kubernetes.io/projected/4f5ca70a-f09d-4baf-a71e-21c7aeefb867-kube-api-access-hrcmp\") pod \"keystone-db-create-9dglp\" (UID: \"4f5ca70a-f09d-4baf-a71e-21c7aeefb867\") " pod="openstack/keystone-db-create-9dglp" Nov 27 16:19:35 crc kubenswrapper[4707]: I1127 16:19:35.214210 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ce5bfb39-c7b0-4317-b53f-5f4280341da4" path="/var/lib/kubelet/pods/ce5bfb39-c7b0-4317-b53f-5f4280341da4/volumes" Nov 27 16:19:35 crc kubenswrapper[4707]: I1127 16:19:35.265852 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-8a5e-account-create-update-knzqk"] Nov 27 16:19:35 crc kubenswrapper[4707]: I1127 16:19:35.266884 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-8a5e-account-create-update-knzqk" Nov 27 16:19:35 crc kubenswrapper[4707]: I1127 16:19:35.270715 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-db-secret" Nov 27 16:19:35 crc kubenswrapper[4707]: I1127 16:19:35.277322 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-8a5e-account-create-update-knzqk"] Nov 27 16:19:35 crc kubenswrapper[4707]: I1127 16:19:35.278235 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dtcxm\" (UniqueName: \"kubernetes.io/projected/5844334a-ffad-42d6-a2cb-b714fe85f90f-kube-api-access-dtcxm\") pod \"keystone-b279-account-create-update-ndbrb\" (UID: \"5844334a-ffad-42d6-a2cb-b714fe85f90f\") " pod="openstack/keystone-b279-account-create-update-ndbrb" Nov 27 16:19:35 crc kubenswrapper[4707]: I1127 16:19:35.278324 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5844334a-ffad-42d6-a2cb-b714fe85f90f-operator-scripts\") pod \"keystone-b279-account-create-update-ndbrb\" (UID: \"5844334a-ffad-42d6-a2cb-b714fe85f90f\") " pod="openstack/keystone-b279-account-create-update-ndbrb" Nov 27 16:19:35 crc kubenswrapper[4707]: I1127 16:19:35.278414 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/252b2a8c-9a8b-4883-9ae7-27fee7760d5f-operator-scripts\") pod \"placement-db-create-dr4hp\" (UID: \"252b2a8c-9a8b-4883-9ae7-27fee7760d5f\") " pod="openstack/placement-db-create-dr4hp" Nov 27 16:19:35 crc kubenswrapper[4707]: I1127 16:19:35.278445 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-84nkg\" (UniqueName: \"kubernetes.io/projected/252b2a8c-9a8b-4883-9ae7-27fee7760d5f-kube-api-access-84nkg\") pod 
\"placement-db-create-dr4hp\" (UID: \"252b2a8c-9a8b-4883-9ae7-27fee7760d5f\") " pod="openstack/placement-db-create-dr4hp" Nov 27 16:19:35 crc kubenswrapper[4707]: I1127 16:19:35.279327 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5844334a-ffad-42d6-a2cb-b714fe85f90f-operator-scripts\") pod \"keystone-b279-account-create-update-ndbrb\" (UID: \"5844334a-ffad-42d6-a2cb-b714fe85f90f\") " pod="openstack/keystone-b279-account-create-update-ndbrb" Nov 27 16:19:35 crc kubenswrapper[4707]: I1127 16:19:35.296315 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dtcxm\" (UniqueName: \"kubernetes.io/projected/5844334a-ffad-42d6-a2cb-b714fe85f90f-kube-api-access-dtcxm\") pod \"keystone-b279-account-create-update-ndbrb\" (UID: \"5844334a-ffad-42d6-a2cb-b714fe85f90f\") " pod="openstack/keystone-b279-account-create-update-ndbrb" Nov 27 16:19:35 crc kubenswrapper[4707]: I1127 16:19:35.352087 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-9dglp" Nov 27 16:19:35 crc kubenswrapper[4707]: I1127 16:19:35.380378 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8dcab406-6c2d-497a-aad3-86162558c506-operator-scripts\") pod \"placement-8a5e-account-create-update-knzqk\" (UID: \"8dcab406-6c2d-497a-aad3-86162558c506\") " pod="openstack/placement-8a5e-account-create-update-knzqk" Nov 27 16:19:35 crc kubenswrapper[4707]: I1127 16:19:35.380428 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/252b2a8c-9a8b-4883-9ae7-27fee7760d5f-operator-scripts\") pod \"placement-db-create-dr4hp\" (UID: \"252b2a8c-9a8b-4883-9ae7-27fee7760d5f\") " pod="openstack/placement-db-create-dr4hp" Nov 27 16:19:35 crc kubenswrapper[4707]: I1127 16:19:35.380453 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-84nkg\" (UniqueName: \"kubernetes.io/projected/252b2a8c-9a8b-4883-9ae7-27fee7760d5f-kube-api-access-84nkg\") pod \"placement-db-create-dr4hp\" (UID: \"252b2a8c-9a8b-4883-9ae7-27fee7760d5f\") " pod="openstack/placement-db-create-dr4hp" Nov 27 16:19:35 crc kubenswrapper[4707]: I1127 16:19:35.380514 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fnwws\" (UniqueName: \"kubernetes.io/projected/8dcab406-6c2d-497a-aad3-86162558c506-kube-api-access-fnwws\") pod \"placement-8a5e-account-create-update-knzqk\" (UID: \"8dcab406-6c2d-497a-aad3-86162558c506\") " pod="openstack/placement-8a5e-account-create-update-knzqk" Nov 27 16:19:35 crc kubenswrapper[4707]: I1127 16:19:35.381102 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/252b2a8c-9a8b-4883-9ae7-27fee7760d5f-operator-scripts\") pod 
\"placement-db-create-dr4hp\" (UID: \"252b2a8c-9a8b-4883-9ae7-27fee7760d5f\") " pod="openstack/placement-db-create-dr4hp" Nov 27 16:19:35 crc kubenswrapper[4707]: I1127 16:19:35.389227 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-b279-account-create-update-ndbrb" Nov 27 16:19:35 crc kubenswrapper[4707]: I1127 16:19:35.409399 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-84nkg\" (UniqueName: \"kubernetes.io/projected/252b2a8c-9a8b-4883-9ae7-27fee7760d5f-kube-api-access-84nkg\") pod \"placement-db-create-dr4hp\" (UID: \"252b2a8c-9a8b-4883-9ae7-27fee7760d5f\") " pod="openstack/placement-db-create-dr4hp" Nov 27 16:19:35 crc kubenswrapper[4707]: I1127 16:19:35.482157 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fnwws\" (UniqueName: \"kubernetes.io/projected/8dcab406-6c2d-497a-aad3-86162558c506-kube-api-access-fnwws\") pod \"placement-8a5e-account-create-update-knzqk\" (UID: \"8dcab406-6c2d-497a-aad3-86162558c506\") " pod="openstack/placement-8a5e-account-create-update-knzqk" Nov 27 16:19:35 crc kubenswrapper[4707]: I1127 16:19:35.482268 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8dcab406-6c2d-497a-aad3-86162558c506-operator-scripts\") pod \"placement-8a5e-account-create-update-knzqk\" (UID: \"8dcab406-6c2d-497a-aad3-86162558c506\") " pod="openstack/placement-8a5e-account-create-update-knzqk" Nov 27 16:19:35 crc kubenswrapper[4707]: I1127 16:19:35.482937 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8dcab406-6c2d-497a-aad3-86162558c506-operator-scripts\") pod \"placement-8a5e-account-create-update-knzqk\" (UID: \"8dcab406-6c2d-497a-aad3-86162558c506\") " pod="openstack/placement-8a5e-account-create-update-knzqk" Nov 27 16:19:35 crc 
kubenswrapper[4707]: I1127 16:19:35.486980 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-dr4hp" Nov 27 16:19:35 crc kubenswrapper[4707]: I1127 16:19:35.501913 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fnwws\" (UniqueName: \"kubernetes.io/projected/8dcab406-6c2d-497a-aad3-86162558c506-kube-api-access-fnwws\") pod \"placement-8a5e-account-create-update-knzqk\" (UID: \"8dcab406-6c2d-497a-aad3-86162558c506\") " pod="openstack/placement-8a5e-account-create-update-knzqk" Nov 27 16:19:35 crc kubenswrapper[4707]: I1127 16:19:35.593671 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-8a5e-account-create-update-knzqk" Nov 27 16:19:35 crc kubenswrapper[4707]: I1127 16:19:35.735792 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-cell1-galera-0" Nov 27 16:19:36 crc kubenswrapper[4707]: I1127 16:19:36.193036 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/4d67e130-f1e2-4fe8-9647-8725402a1cdd-etc-swift\") pod \"swift-storage-0\" (UID: \"4d67e130-f1e2-4fe8-9647-8725402a1cdd\") " pod="openstack/swift-storage-0" Nov 27 16:19:36 crc kubenswrapper[4707]: E1127 16:19:36.198387 4707 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Nov 27 16:19:36 crc kubenswrapper[4707]: E1127 16:19:36.198433 4707 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Nov 27 16:19:36 crc kubenswrapper[4707]: E1127 16:19:36.198497 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/4d67e130-f1e2-4fe8-9647-8725402a1cdd-etc-swift podName:4d67e130-f1e2-4fe8-9647-8725402a1cdd nodeName:}" failed. 
No retries permitted until 2025-11-27 16:19:44.198472226 +0000 UTC m=+959.829920994 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/4d67e130-f1e2-4fe8-9647-8725402a1cdd-etc-swift") pod "swift-storage-0" (UID: "4d67e130-f1e2-4fe8-9647-8725402a1cdd") : configmap "swift-ring-files" not found Nov 27 16:19:37 crc kubenswrapper[4707]: I1127 16:19:37.626114 4707 scope.go:117] "RemoveContainer" containerID="85c29caaf061584ee62365b7df32e7330cdb7cfc8ca04c3ed00217cfab9c1d13" Nov 27 16:19:37 crc kubenswrapper[4707]: I1127 16:19:37.646595 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-698758b865-kdgq7" Nov 27 16:19:37 crc kubenswrapper[4707]: I1127 16:19:37.789884 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-hdvpv"] Nov 27 16:19:37 crc kubenswrapper[4707]: I1127 16:19:37.790097 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-57d769cc4f-hdvpv" podUID="98f69e7a-a5b6-4358-9699-5a8f83ff8a5f" containerName="dnsmasq-dns" containerID="cri-o://35266a9739204f8671d3b4f0d14938bb35ca21739485bf61dc157dd96a74b4fe" gracePeriod=10 Nov 27 16:19:37 crc kubenswrapper[4707]: I1127 16:19:37.800595 4707 scope.go:117] "RemoveContainer" containerID="45a0da5484dcb0ad2dbf79fee8866c95c479ba3d7066c3717cf381ef29a3e273" Nov 27 16:19:37 crc kubenswrapper[4707]: I1127 16:19:37.944693 4707 scope.go:117] "RemoveContainer" containerID="caba6a670a04a1ff1bbbf86d948f36139fcda3d106a948b6c775b9d512946fb7" Nov 27 16:19:37 crc kubenswrapper[4707]: E1127 16:19:37.948538 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"caba6a670a04a1ff1bbbf86d948f36139fcda3d106a948b6c775b9d512946fb7\": container with ID starting with caba6a670a04a1ff1bbbf86d948f36139fcda3d106a948b6c775b9d512946fb7 not found: ID does not exist" 
containerID="caba6a670a04a1ff1bbbf86d948f36139fcda3d106a948b6c775b9d512946fb7" Nov 27 16:19:37 crc kubenswrapper[4707]: I1127 16:19:37.948577 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"caba6a670a04a1ff1bbbf86d948f36139fcda3d106a948b6c775b9d512946fb7"} err="failed to get container status \"caba6a670a04a1ff1bbbf86d948f36139fcda3d106a948b6c775b9d512946fb7\": rpc error: code = NotFound desc = could not find container \"caba6a670a04a1ff1bbbf86d948f36139fcda3d106a948b6c775b9d512946fb7\": container with ID starting with caba6a670a04a1ff1bbbf86d948f36139fcda3d106a948b6c775b9d512946fb7 not found: ID does not exist" Nov 27 16:19:37 crc kubenswrapper[4707]: I1127 16:19:37.948601 4707 scope.go:117] "RemoveContainer" containerID="85c29caaf061584ee62365b7df32e7330cdb7cfc8ca04c3ed00217cfab9c1d13" Nov 27 16:19:37 crc kubenswrapper[4707]: E1127 16:19:37.957460 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"85c29caaf061584ee62365b7df32e7330cdb7cfc8ca04c3ed00217cfab9c1d13\": container with ID starting with 85c29caaf061584ee62365b7df32e7330cdb7cfc8ca04c3ed00217cfab9c1d13 not found: ID does not exist" containerID="85c29caaf061584ee62365b7df32e7330cdb7cfc8ca04c3ed00217cfab9c1d13" Nov 27 16:19:37 crc kubenswrapper[4707]: I1127 16:19:37.957498 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"85c29caaf061584ee62365b7df32e7330cdb7cfc8ca04c3ed00217cfab9c1d13"} err="failed to get container status \"85c29caaf061584ee62365b7df32e7330cdb7cfc8ca04c3ed00217cfab9c1d13\": rpc error: code = NotFound desc = could not find container \"85c29caaf061584ee62365b7df32e7330cdb7cfc8ca04c3ed00217cfab9c1d13\": container with ID starting with 85c29caaf061584ee62365b7df32e7330cdb7cfc8ca04c3ed00217cfab9c1d13 not found: ID does not exist" Nov 27 16:19:37 crc kubenswrapper[4707]: I1127 16:19:37.957518 4707 scope.go:117] 
"RemoveContainer" containerID="45a0da5484dcb0ad2dbf79fee8866c95c479ba3d7066c3717cf381ef29a3e273" Nov 27 16:19:37 crc kubenswrapper[4707]: E1127 16:19:37.961582 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"45a0da5484dcb0ad2dbf79fee8866c95c479ba3d7066c3717cf381ef29a3e273\": container with ID starting with 45a0da5484dcb0ad2dbf79fee8866c95c479ba3d7066c3717cf381ef29a3e273 not found: ID does not exist" containerID="45a0da5484dcb0ad2dbf79fee8866c95c479ba3d7066c3717cf381ef29a3e273" Nov 27 16:19:37 crc kubenswrapper[4707]: I1127 16:19:37.961622 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"45a0da5484dcb0ad2dbf79fee8866c95c479ba3d7066c3717cf381ef29a3e273"} err="failed to get container status \"45a0da5484dcb0ad2dbf79fee8866c95c479ba3d7066c3717cf381ef29a3e273\": rpc error: code = NotFound desc = could not find container \"45a0da5484dcb0ad2dbf79fee8866c95c479ba3d7066c3717cf381ef29a3e273\": container with ID starting with 45a0da5484dcb0ad2dbf79fee8866c95c479ba3d7066c3717cf381ef29a3e273 not found: ID does not exist" Nov 27 16:19:38 crc kubenswrapper[4707]: I1127 16:19:38.408356 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-hdvpv" Nov 27 16:19:38 crc kubenswrapper[4707]: I1127 16:19:38.448673 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zzmnh\" (UniqueName: \"kubernetes.io/projected/98f69e7a-a5b6-4358-9699-5a8f83ff8a5f-kube-api-access-zzmnh\") pod \"98f69e7a-a5b6-4358-9699-5a8f83ff8a5f\" (UID: \"98f69e7a-a5b6-4358-9699-5a8f83ff8a5f\") " Nov 27 16:19:38 crc kubenswrapper[4707]: I1127 16:19:38.448831 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/98f69e7a-a5b6-4358-9699-5a8f83ff8a5f-dns-svc\") pod \"98f69e7a-a5b6-4358-9699-5a8f83ff8a5f\" (UID: \"98f69e7a-a5b6-4358-9699-5a8f83ff8a5f\") " Nov 27 16:19:38 crc kubenswrapper[4707]: I1127 16:19:38.448968 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/98f69e7a-a5b6-4358-9699-5a8f83ff8a5f-config\") pod \"98f69e7a-a5b6-4358-9699-5a8f83ff8a5f\" (UID: \"98f69e7a-a5b6-4358-9699-5a8f83ff8a5f\") " Nov 27 16:19:38 crc kubenswrapper[4707]: I1127 16:19:38.450104 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-8a5e-account-create-update-knzqk"] Nov 27 16:19:38 crc kubenswrapper[4707]: W1127 16:19:38.452081 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8dcab406_6c2d_497a_aad3_86162558c506.slice/crio-17450d1eff86fa28413fffd45c50e8b66d1610907cccf5edb033509c8ac4d9d3 WatchSource:0}: Error finding container 17450d1eff86fa28413fffd45c50e8b66d1610907cccf5edb033509c8ac4d9d3: Status 404 returned error can't find the container with id 17450d1eff86fa28413fffd45c50e8b66d1610907cccf5edb033509c8ac4d9d3 Nov 27 16:19:38 crc kubenswrapper[4707]: I1127 16:19:38.456548 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/98f69e7a-a5b6-4358-9699-5a8f83ff8a5f-kube-api-access-zzmnh" (OuterVolumeSpecName: "kube-api-access-zzmnh") pod "98f69e7a-a5b6-4358-9699-5a8f83ff8a5f" (UID: "98f69e7a-a5b6-4358-9699-5a8f83ff8a5f"). InnerVolumeSpecName "kube-api-access-zzmnh". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 16:19:38 crc kubenswrapper[4707]: I1127 16:19:38.496932 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/98f69e7a-a5b6-4358-9699-5a8f83ff8a5f-config" (OuterVolumeSpecName: "config") pod "98f69e7a-a5b6-4358-9699-5a8f83ff8a5f" (UID: "98f69e7a-a5b6-4358-9699-5a8f83ff8a5f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 16:19:38 crc kubenswrapper[4707]: I1127 16:19:38.514338 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/98f69e7a-a5b6-4358-9699-5a8f83ff8a5f-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "98f69e7a-a5b6-4358-9699-5a8f83ff8a5f" (UID: "98f69e7a-a5b6-4358-9699-5a8f83ff8a5f"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 16:19:38 crc kubenswrapper[4707]: I1127 16:19:38.549909 4707 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/98f69e7a-a5b6-4358-9699-5a8f83ff8a5f-config\") on node \"crc\" DevicePath \"\"" Nov 27 16:19:38 crc kubenswrapper[4707]: I1127 16:19:38.551144 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zzmnh\" (UniqueName: \"kubernetes.io/projected/98f69e7a-a5b6-4358-9699-5a8f83ff8a5f-kube-api-access-zzmnh\") on node \"crc\" DevicePath \"\"" Nov 27 16:19:38 crc kubenswrapper[4707]: I1127 16:19:38.551159 4707 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/98f69e7a-a5b6-4358-9699-5a8f83ff8a5f-dns-svc\") on node \"crc\" DevicePath \"\"" Nov 27 16:19:38 crc kubenswrapper[4707]: I1127 16:19:38.551527 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-dr4hp"] Nov 27 16:19:38 crc kubenswrapper[4707]: I1127 16:19:38.560328 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-9dglp"] Nov 27 16:19:38 crc kubenswrapper[4707]: I1127 16:19:38.569924 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-b279-account-create-update-ndbrb"] Nov 27 16:19:38 crc kubenswrapper[4707]: I1127 16:19:38.718447 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-8a5e-account-create-update-knzqk" event={"ID":"8dcab406-6c2d-497a-aad3-86162558c506","Type":"ContainerStarted","Data":"ebbdaa5f9b26cac2e63eed7ff9d651633ee19f0ea2720c58ef299eba83859bd5"} Nov 27 16:19:38 crc kubenswrapper[4707]: I1127 16:19:38.718486 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-8a5e-account-create-update-knzqk" event={"ID":"8dcab406-6c2d-497a-aad3-86162558c506","Type":"ContainerStarted","Data":"17450d1eff86fa28413fffd45c50e8b66d1610907cccf5edb033509c8ac4d9d3"} 
Nov 27 16:19:38 crc kubenswrapper[4707]: I1127 16:19:38.720775 4707 generic.go:334] "Generic (PLEG): container finished" podID="98f69e7a-a5b6-4358-9699-5a8f83ff8a5f" containerID="35266a9739204f8671d3b4f0d14938bb35ca21739485bf61dc157dd96a74b4fe" exitCode=0 Nov 27 16:19:38 crc kubenswrapper[4707]: I1127 16:19:38.720818 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-hdvpv" event={"ID":"98f69e7a-a5b6-4358-9699-5a8f83ff8a5f","Type":"ContainerDied","Data":"35266a9739204f8671d3b4f0d14938bb35ca21739485bf61dc157dd96a74b4fe"} Nov 27 16:19:38 crc kubenswrapper[4707]: I1127 16:19:38.720834 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-hdvpv" event={"ID":"98f69e7a-a5b6-4358-9699-5a8f83ff8a5f","Type":"ContainerDied","Data":"92e0ca44752d223d3b01a36323dc4c4cdf5836fd6b0e354c503a2674fb7c3af9"} Nov 27 16:19:38 crc kubenswrapper[4707]: I1127 16:19:38.720850 4707 scope.go:117] "RemoveContainer" containerID="35266a9739204f8671d3b4f0d14938bb35ca21739485bf61dc157dd96a74b4fe" Nov 27 16:19:38 crc kubenswrapper[4707]: I1127 16:19:38.720923 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-hdvpv" Nov 27 16:19:38 crc kubenswrapper[4707]: I1127 16:19:38.732822 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-9dglp" event={"ID":"4f5ca70a-f09d-4baf-a71e-21c7aeefb867","Type":"ContainerStarted","Data":"7f7e1053888c9270e40a5f69e0f5023906503e0c60ed7a7b58ae3cc7e7f935f2"} Nov 27 16:19:38 crc kubenswrapper[4707]: I1127 16:19:38.732854 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-9dglp" event={"ID":"4f5ca70a-f09d-4baf-a71e-21c7aeefb867","Type":"ContainerStarted","Data":"aa832471ae6b288595c90c677643e523be7e88ee3aa760a040c4e5f3df24192c"} Nov 27 16:19:38 crc kubenswrapper[4707]: I1127 16:19:38.734311 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-vbngp" event={"ID":"26d4145c-3144-4e1f-99ce-08d64f8b20be","Type":"ContainerStarted","Data":"de655b1cbf8d621b297ad01f2772c2fef583b458c6521d3d3ac16d2832fd70a5"} Nov 27 16:19:38 crc kubenswrapper[4707]: I1127 16:19:38.740219 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-dr4hp" event={"ID":"252b2a8c-9a8b-4883-9ae7-27fee7760d5f","Type":"ContainerStarted","Data":"5e1de213dd3c7c73426120936e25287d7a184b8da2b003cb8f010b618bb86cd7"} Nov 27 16:19:38 crc kubenswrapper[4707]: I1127 16:19:38.742213 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-b279-account-create-update-ndbrb" event={"ID":"5844334a-ffad-42d6-a2cb-b714fe85f90f","Type":"ContainerStarted","Data":"ba9a259536aae6798fd19ed2bf858fd052d228770158ec34f6f925586de70624"} Nov 27 16:19:38 crc kubenswrapper[4707]: I1127 16:19:38.742463 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-8a5e-account-create-update-knzqk" podStartSLOduration=3.7424430109999998 podStartE2EDuration="3.742443011s" podCreationTimestamp="2025-11-27 16:19:35 +0000 UTC" firstStartedPulling="0001-01-01 
00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-27 16:19:38.740683929 +0000 UTC m=+954.372132727" watchObservedRunningTime="2025-11-27 16:19:38.742443011 +0000 UTC m=+954.373891779" Nov 27 16:19:38 crc kubenswrapper[4707]: I1127 16:19:38.758354 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-create-dr4hp" podStartSLOduration=3.758331918 podStartE2EDuration="3.758331918s" podCreationTimestamp="2025-11-27 16:19:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-27 16:19:38.757323713 +0000 UTC m=+954.388772491" watchObservedRunningTime="2025-11-27 16:19:38.758331918 +0000 UTC m=+954.389780686" Nov 27 16:19:38 crc kubenswrapper[4707]: I1127 16:19:38.761231 4707 scope.go:117] "RemoveContainer" containerID="7211bdbdadd3cf400172dabfa829ba406fdfc94d367857749a284fe7f21b10a6" Nov 27 16:19:38 crc kubenswrapper[4707]: I1127 16:19:38.776548 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-create-9dglp" podStartSLOduration=4.77653008 podStartE2EDuration="4.77653008s" podCreationTimestamp="2025-11-27 16:19:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-27 16:19:38.773032155 +0000 UTC m=+954.404480923" watchObservedRunningTime="2025-11-27 16:19:38.77653008 +0000 UTC m=+954.407978848" Nov 27 16:19:38 crc kubenswrapper[4707]: I1127 16:19:38.784502 4707 scope.go:117] "RemoveContainer" containerID="35266a9739204f8671d3b4f0d14938bb35ca21739485bf61dc157dd96a74b4fe" Nov 27 16:19:38 crc kubenswrapper[4707]: E1127 16:19:38.788458 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"35266a9739204f8671d3b4f0d14938bb35ca21739485bf61dc157dd96a74b4fe\": container with ID 
starting with 35266a9739204f8671d3b4f0d14938bb35ca21739485bf61dc157dd96a74b4fe not found: ID does not exist" containerID="35266a9739204f8671d3b4f0d14938bb35ca21739485bf61dc157dd96a74b4fe" Nov 27 16:19:38 crc kubenswrapper[4707]: I1127 16:19:38.788491 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"35266a9739204f8671d3b4f0d14938bb35ca21739485bf61dc157dd96a74b4fe"} err="failed to get container status \"35266a9739204f8671d3b4f0d14938bb35ca21739485bf61dc157dd96a74b4fe\": rpc error: code = NotFound desc = could not find container \"35266a9739204f8671d3b4f0d14938bb35ca21739485bf61dc157dd96a74b4fe\": container with ID starting with 35266a9739204f8671d3b4f0d14938bb35ca21739485bf61dc157dd96a74b4fe not found: ID does not exist" Nov 27 16:19:38 crc kubenswrapper[4707]: I1127 16:19:38.788512 4707 scope.go:117] "RemoveContainer" containerID="7211bdbdadd3cf400172dabfa829ba406fdfc94d367857749a284fe7f21b10a6" Nov 27 16:19:38 crc kubenswrapper[4707]: E1127 16:19:38.788839 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7211bdbdadd3cf400172dabfa829ba406fdfc94d367857749a284fe7f21b10a6\": container with ID starting with 7211bdbdadd3cf400172dabfa829ba406fdfc94d367857749a284fe7f21b10a6 not found: ID does not exist" containerID="7211bdbdadd3cf400172dabfa829ba406fdfc94d367857749a284fe7f21b10a6" Nov 27 16:19:38 crc kubenswrapper[4707]: I1127 16:19:38.788861 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7211bdbdadd3cf400172dabfa829ba406fdfc94d367857749a284fe7f21b10a6"} err="failed to get container status \"7211bdbdadd3cf400172dabfa829ba406fdfc94d367857749a284fe7f21b10a6\": rpc error: code = NotFound desc = could not find container \"7211bdbdadd3cf400172dabfa829ba406fdfc94d367857749a284fe7f21b10a6\": container with ID starting with 7211bdbdadd3cf400172dabfa829ba406fdfc94d367857749a284fe7f21b10a6 not found: 
ID does not exist" Nov 27 16:19:38 crc kubenswrapper[4707]: I1127 16:19:38.796421 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-ring-rebalance-vbngp" podStartSLOduration=2.123229287 podStartE2EDuration="6.796402843s" podCreationTimestamp="2025-11-27 16:19:32 +0000 UTC" firstStartedPulling="2025-11-27 16:19:33.170204368 +0000 UTC m=+948.801653136" lastFinishedPulling="2025-11-27 16:19:37.843377924 +0000 UTC m=+953.474826692" observedRunningTime="2025-11-27 16:19:38.792289503 +0000 UTC m=+954.423738271" watchObservedRunningTime="2025-11-27 16:19:38.796402843 +0000 UTC m=+954.427851611" Nov 27 16:19:38 crc kubenswrapper[4707]: I1127 16:19:38.815987 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-hdvpv"] Nov 27 16:19:38 crc kubenswrapper[4707]: I1127 16:19:38.821546 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-hdvpv"] Nov 27 16:19:38 crc kubenswrapper[4707]: I1127 16:19:38.857434 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-b279-account-create-update-ndbrb" podStartSLOduration=3.857414006 podStartE2EDuration="3.857414006s" podCreationTimestamp="2025-11-27 16:19:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-27 16:19:38.834388607 +0000 UTC m=+954.465837365" watchObservedRunningTime="2025-11-27 16:19:38.857414006 +0000 UTC m=+954.488862774" Nov 27 16:19:39 crc kubenswrapper[4707]: I1127 16:19:39.209897 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="98f69e7a-a5b6-4358-9699-5a8f83ff8a5f" path="/var/lib/kubelet/pods/98f69e7a-a5b6-4358-9699-5a8f83ff8a5f/volumes" Nov 27 16:19:39 crc kubenswrapper[4707]: I1127 16:19:39.761805 4707 generic.go:334] "Generic (PLEG): container finished" podID="8dcab406-6c2d-497a-aad3-86162558c506" 
containerID="ebbdaa5f9b26cac2e63eed7ff9d651633ee19f0ea2720c58ef299eba83859bd5" exitCode=0 Nov 27 16:19:39 crc kubenswrapper[4707]: I1127 16:19:39.762242 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-8a5e-account-create-update-knzqk" event={"ID":"8dcab406-6c2d-497a-aad3-86162558c506","Type":"ContainerDied","Data":"ebbdaa5f9b26cac2e63eed7ff9d651633ee19f0ea2720c58ef299eba83859bd5"} Nov 27 16:19:39 crc kubenswrapper[4707]: I1127 16:19:39.766311 4707 generic.go:334] "Generic (PLEG): container finished" podID="4f5ca70a-f09d-4baf-a71e-21c7aeefb867" containerID="7f7e1053888c9270e40a5f69e0f5023906503e0c60ed7a7b58ae3cc7e7f935f2" exitCode=0 Nov 27 16:19:39 crc kubenswrapper[4707]: I1127 16:19:39.766450 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-9dglp" event={"ID":"4f5ca70a-f09d-4baf-a71e-21c7aeefb867","Type":"ContainerDied","Data":"7f7e1053888c9270e40a5f69e0f5023906503e0c60ed7a7b58ae3cc7e7f935f2"} Nov 27 16:19:39 crc kubenswrapper[4707]: I1127 16:19:39.770611 4707 generic.go:334] "Generic (PLEG): container finished" podID="252b2a8c-9a8b-4883-9ae7-27fee7760d5f" containerID="3579e385d9a10937a03ea4bc515cf6b23a2e7a9f2485752294c492450ea8dd58" exitCode=0 Nov 27 16:19:39 crc kubenswrapper[4707]: I1127 16:19:39.770676 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-dr4hp" event={"ID":"252b2a8c-9a8b-4883-9ae7-27fee7760d5f","Type":"ContainerDied","Data":"3579e385d9a10937a03ea4bc515cf6b23a2e7a9f2485752294c492450ea8dd58"} Nov 27 16:19:39 crc kubenswrapper[4707]: I1127 16:19:39.772330 4707 generic.go:334] "Generic (PLEG): container finished" podID="5844334a-ffad-42d6-a2cb-b714fe85f90f" containerID="c4c452ab3786d1b744f5004cfd9c13fa3ab7ef5a04611203494ecd7d442544e1" exitCode=0 Nov 27 16:19:39 crc kubenswrapper[4707]: I1127 16:19:39.772401 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-b279-account-create-update-ndbrb" 
event={"ID":"5844334a-ffad-42d6-a2cb-b714fe85f90f","Type":"ContainerDied","Data":"c4c452ab3786d1b744f5004cfd9c13fa3ab7ef5a04611203494ecd7d442544e1"} Nov 27 16:19:40 crc kubenswrapper[4707]: I1127 16:19:40.449111 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-create-4g7sb"] Nov 27 16:19:40 crc kubenswrapper[4707]: E1127 16:19:40.449456 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="98f69e7a-a5b6-4358-9699-5a8f83ff8a5f" containerName="init" Nov 27 16:19:40 crc kubenswrapper[4707]: I1127 16:19:40.449468 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="98f69e7a-a5b6-4358-9699-5a8f83ff8a5f" containerName="init" Nov 27 16:19:40 crc kubenswrapper[4707]: E1127 16:19:40.449482 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="98f69e7a-a5b6-4358-9699-5a8f83ff8a5f" containerName="dnsmasq-dns" Nov 27 16:19:40 crc kubenswrapper[4707]: I1127 16:19:40.449488 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="98f69e7a-a5b6-4358-9699-5a8f83ff8a5f" containerName="dnsmasq-dns" Nov 27 16:19:40 crc kubenswrapper[4707]: I1127 16:19:40.449668 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="98f69e7a-a5b6-4358-9699-5a8f83ff8a5f" containerName="dnsmasq-dns" Nov 27 16:19:40 crc kubenswrapper[4707]: I1127 16:19:40.450134 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-4g7sb" Nov 27 16:19:40 crc kubenswrapper[4707]: I1127 16:19:40.461046 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-4g7sb"] Nov 27 16:19:40 crc kubenswrapper[4707]: I1127 16:19:40.490489 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jtpp4\" (UniqueName: \"kubernetes.io/projected/62dcf83c-a8a2-4362-b412-8243d19bd711-kube-api-access-jtpp4\") pod \"glance-db-create-4g7sb\" (UID: \"62dcf83c-a8a2-4362-b412-8243d19bd711\") " pod="openstack/glance-db-create-4g7sb" Nov 27 16:19:40 crc kubenswrapper[4707]: I1127 16:19:40.490598 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/62dcf83c-a8a2-4362-b412-8243d19bd711-operator-scripts\") pod \"glance-db-create-4g7sb\" (UID: \"62dcf83c-a8a2-4362-b412-8243d19bd711\") " pod="openstack/glance-db-create-4g7sb" Nov 27 16:19:40 crc kubenswrapper[4707]: I1127 16:19:40.544066 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-1a8e-account-create-update-qhr8n"] Nov 27 16:19:40 crc kubenswrapper[4707]: I1127 16:19:40.545196 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-1a8e-account-create-update-qhr8n" Nov 27 16:19:40 crc kubenswrapper[4707]: I1127 16:19:40.547940 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-db-secret" Nov 27 16:19:40 crc kubenswrapper[4707]: I1127 16:19:40.556567 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-1a8e-account-create-update-qhr8n"] Nov 27 16:19:40 crc kubenswrapper[4707]: I1127 16:19:40.591784 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/62d0aaa3-bc36-4408-ae01-14d362740486-operator-scripts\") pod \"glance-1a8e-account-create-update-qhr8n\" (UID: \"62d0aaa3-bc36-4408-ae01-14d362740486\") " pod="openstack/glance-1a8e-account-create-update-qhr8n" Nov 27 16:19:40 crc kubenswrapper[4707]: I1127 16:19:40.591842 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/62dcf83c-a8a2-4362-b412-8243d19bd711-operator-scripts\") pod \"glance-db-create-4g7sb\" (UID: \"62dcf83c-a8a2-4362-b412-8243d19bd711\") " pod="openstack/glance-db-create-4g7sb" Nov 27 16:19:40 crc kubenswrapper[4707]: I1127 16:19:40.591922 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g2588\" (UniqueName: \"kubernetes.io/projected/62d0aaa3-bc36-4408-ae01-14d362740486-kube-api-access-g2588\") pod \"glance-1a8e-account-create-update-qhr8n\" (UID: \"62d0aaa3-bc36-4408-ae01-14d362740486\") " pod="openstack/glance-1a8e-account-create-update-qhr8n" Nov 27 16:19:40 crc kubenswrapper[4707]: I1127 16:19:40.591970 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jtpp4\" (UniqueName: \"kubernetes.io/projected/62dcf83c-a8a2-4362-b412-8243d19bd711-kube-api-access-jtpp4\") pod \"glance-db-create-4g7sb\" (UID: 
\"62dcf83c-a8a2-4362-b412-8243d19bd711\") " pod="openstack/glance-db-create-4g7sb" Nov 27 16:19:40 crc kubenswrapper[4707]: I1127 16:19:40.592701 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/62dcf83c-a8a2-4362-b412-8243d19bd711-operator-scripts\") pod \"glance-db-create-4g7sb\" (UID: \"62dcf83c-a8a2-4362-b412-8243d19bd711\") " pod="openstack/glance-db-create-4g7sb" Nov 27 16:19:40 crc kubenswrapper[4707]: I1127 16:19:40.614521 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jtpp4\" (UniqueName: \"kubernetes.io/projected/62dcf83c-a8a2-4362-b412-8243d19bd711-kube-api-access-jtpp4\") pod \"glance-db-create-4g7sb\" (UID: \"62dcf83c-a8a2-4362-b412-8243d19bd711\") " pod="openstack/glance-db-create-4g7sb" Nov 27 16:19:40 crc kubenswrapper[4707]: I1127 16:19:40.693182 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g2588\" (UniqueName: \"kubernetes.io/projected/62d0aaa3-bc36-4408-ae01-14d362740486-kube-api-access-g2588\") pod \"glance-1a8e-account-create-update-qhr8n\" (UID: \"62d0aaa3-bc36-4408-ae01-14d362740486\") " pod="openstack/glance-1a8e-account-create-update-qhr8n" Nov 27 16:19:40 crc kubenswrapper[4707]: I1127 16:19:40.693312 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/62d0aaa3-bc36-4408-ae01-14d362740486-operator-scripts\") pod \"glance-1a8e-account-create-update-qhr8n\" (UID: \"62d0aaa3-bc36-4408-ae01-14d362740486\") " pod="openstack/glance-1a8e-account-create-update-qhr8n" Nov 27 16:19:40 crc kubenswrapper[4707]: I1127 16:19:40.694188 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/62d0aaa3-bc36-4408-ae01-14d362740486-operator-scripts\") pod \"glance-1a8e-account-create-update-qhr8n\" (UID: 
\"62d0aaa3-bc36-4408-ae01-14d362740486\") " pod="openstack/glance-1a8e-account-create-update-qhr8n" Nov 27 16:19:40 crc kubenswrapper[4707]: I1127 16:19:40.716510 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g2588\" (UniqueName: \"kubernetes.io/projected/62d0aaa3-bc36-4408-ae01-14d362740486-kube-api-access-g2588\") pod \"glance-1a8e-account-create-update-qhr8n\" (UID: \"62d0aaa3-bc36-4408-ae01-14d362740486\") " pod="openstack/glance-1a8e-account-create-update-qhr8n" Nov 27 16:19:40 crc kubenswrapper[4707]: I1127 16:19:40.770681 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-4g7sb" Nov 27 16:19:40 crc kubenswrapper[4707]: I1127 16:19:40.898402 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-1a8e-account-create-update-qhr8n" Nov 27 16:19:41 crc kubenswrapper[4707]: I1127 16:19:41.275440 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-8a5e-account-create-update-knzqk" Nov 27 16:19:41 crc kubenswrapper[4707]: I1127 16:19:41.309025 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fnwws\" (UniqueName: \"kubernetes.io/projected/8dcab406-6c2d-497a-aad3-86162558c506-kube-api-access-fnwws\") pod \"8dcab406-6c2d-497a-aad3-86162558c506\" (UID: \"8dcab406-6c2d-497a-aad3-86162558c506\") " Nov 27 16:19:41 crc kubenswrapper[4707]: I1127 16:19:41.309095 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8dcab406-6c2d-497a-aad3-86162558c506-operator-scripts\") pod \"8dcab406-6c2d-497a-aad3-86162558c506\" (UID: \"8dcab406-6c2d-497a-aad3-86162558c506\") " Nov 27 16:19:41 crc kubenswrapper[4707]: I1127 16:19:41.310342 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8dcab406-6c2d-497a-aad3-86162558c506-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "8dcab406-6c2d-497a-aad3-86162558c506" (UID: "8dcab406-6c2d-497a-aad3-86162558c506"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 16:19:41 crc kubenswrapper[4707]: I1127 16:19:41.321631 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8dcab406-6c2d-497a-aad3-86162558c506-kube-api-access-fnwws" (OuterVolumeSpecName: "kube-api-access-fnwws") pod "8dcab406-6c2d-497a-aad3-86162558c506" (UID: "8dcab406-6c2d-497a-aad3-86162558c506"). InnerVolumeSpecName "kube-api-access-fnwws". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 16:19:41 crc kubenswrapper[4707]: I1127 16:19:41.410765 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fnwws\" (UniqueName: \"kubernetes.io/projected/8dcab406-6c2d-497a-aad3-86162558c506-kube-api-access-fnwws\") on node \"crc\" DevicePath \"\"" Nov 27 16:19:41 crc kubenswrapper[4707]: I1127 16:19:41.411029 4707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8dcab406-6c2d-497a-aad3-86162558c506-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 27 16:19:41 crc kubenswrapper[4707]: I1127 16:19:41.437236 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-b279-account-create-update-ndbrb" Nov 27 16:19:41 crc kubenswrapper[4707]: I1127 16:19:41.442025 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-dr4hp" Nov 27 16:19:41 crc kubenswrapper[4707]: I1127 16:19:41.454154 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-9dglp" Nov 27 16:19:41 crc kubenswrapper[4707]: I1127 16:19:41.511804 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/252b2a8c-9a8b-4883-9ae7-27fee7760d5f-operator-scripts\") pod \"252b2a8c-9a8b-4883-9ae7-27fee7760d5f\" (UID: \"252b2a8c-9a8b-4883-9ae7-27fee7760d5f\") " Nov 27 16:19:41 crc kubenswrapper[4707]: I1127 16:19:41.512152 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5844334a-ffad-42d6-a2cb-b714fe85f90f-operator-scripts\") pod \"5844334a-ffad-42d6-a2cb-b714fe85f90f\" (UID: \"5844334a-ffad-42d6-a2cb-b714fe85f90f\") " Nov 27 16:19:41 crc kubenswrapper[4707]: I1127 16:19:41.512246 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hrcmp\" (UniqueName: \"kubernetes.io/projected/4f5ca70a-f09d-4baf-a71e-21c7aeefb867-kube-api-access-hrcmp\") pod \"4f5ca70a-f09d-4baf-a71e-21c7aeefb867\" (UID: \"4f5ca70a-f09d-4baf-a71e-21c7aeefb867\") " Nov 27 16:19:41 crc kubenswrapper[4707]: I1127 16:19:41.513646 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dtcxm\" (UniqueName: \"kubernetes.io/projected/5844334a-ffad-42d6-a2cb-b714fe85f90f-kube-api-access-dtcxm\") pod \"5844334a-ffad-42d6-a2cb-b714fe85f90f\" (UID: \"5844334a-ffad-42d6-a2cb-b714fe85f90f\") " Nov 27 16:19:41 crc kubenswrapper[4707]: I1127 16:19:41.513749 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4f5ca70a-f09d-4baf-a71e-21c7aeefb867-operator-scripts\") pod \"4f5ca70a-f09d-4baf-a71e-21c7aeefb867\" (UID: \"4f5ca70a-f09d-4baf-a71e-21c7aeefb867\") " Nov 27 16:19:41 crc kubenswrapper[4707]: I1127 16:19:41.513867 4707 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-84nkg\" (UniqueName: \"kubernetes.io/projected/252b2a8c-9a8b-4883-9ae7-27fee7760d5f-kube-api-access-84nkg\") pod \"252b2a8c-9a8b-4883-9ae7-27fee7760d5f\" (UID: \"252b2a8c-9a8b-4883-9ae7-27fee7760d5f\") " Nov 27 16:19:41 crc kubenswrapper[4707]: I1127 16:19:41.516103 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/252b2a8c-9a8b-4883-9ae7-27fee7760d5f-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "252b2a8c-9a8b-4883-9ae7-27fee7760d5f" (UID: "252b2a8c-9a8b-4883-9ae7-27fee7760d5f"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 16:19:41 crc kubenswrapper[4707]: I1127 16:19:41.516450 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4f5ca70a-f09d-4baf-a71e-21c7aeefb867-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "4f5ca70a-f09d-4baf-a71e-21c7aeefb867" (UID: "4f5ca70a-f09d-4baf-a71e-21c7aeefb867"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 16:19:41 crc kubenswrapper[4707]: I1127 16:19:41.516539 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5844334a-ffad-42d6-a2cb-b714fe85f90f-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "5844334a-ffad-42d6-a2cb-b714fe85f90f" (UID: "5844334a-ffad-42d6-a2cb-b714fe85f90f"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 16:19:41 crc kubenswrapper[4707]: I1127 16:19:41.518196 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4f5ca70a-f09d-4baf-a71e-21c7aeefb867-kube-api-access-hrcmp" (OuterVolumeSpecName: "kube-api-access-hrcmp") pod "4f5ca70a-f09d-4baf-a71e-21c7aeefb867" (UID: "4f5ca70a-f09d-4baf-a71e-21c7aeefb867"). 
InnerVolumeSpecName "kube-api-access-hrcmp". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 16:19:41 crc kubenswrapper[4707]: I1127 16:19:41.518700 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/252b2a8c-9a8b-4883-9ae7-27fee7760d5f-kube-api-access-84nkg" (OuterVolumeSpecName: "kube-api-access-84nkg") pod "252b2a8c-9a8b-4883-9ae7-27fee7760d5f" (UID: "252b2a8c-9a8b-4883-9ae7-27fee7760d5f"). InnerVolumeSpecName "kube-api-access-84nkg". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 16:19:41 crc kubenswrapper[4707]: I1127 16:19:41.520205 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5844334a-ffad-42d6-a2cb-b714fe85f90f-kube-api-access-dtcxm" (OuterVolumeSpecName: "kube-api-access-dtcxm") pod "5844334a-ffad-42d6-a2cb-b714fe85f90f" (UID: "5844334a-ffad-42d6-a2cb-b714fe85f90f"). InnerVolumeSpecName "kube-api-access-dtcxm". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 16:19:41 crc kubenswrapper[4707]: I1127 16:19:41.527059 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-4g7sb"] Nov 27 16:19:41 crc kubenswrapper[4707]: I1127 16:19:41.616346 4707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/252b2a8c-9a8b-4883-9ae7-27fee7760d5f-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 27 16:19:41 crc kubenswrapper[4707]: I1127 16:19:41.616399 4707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5844334a-ffad-42d6-a2cb-b714fe85f90f-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 27 16:19:41 crc kubenswrapper[4707]: I1127 16:19:41.616410 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hrcmp\" (UniqueName: \"kubernetes.io/projected/4f5ca70a-f09d-4baf-a71e-21c7aeefb867-kube-api-access-hrcmp\") on node \"crc\" 
DevicePath \"\"" Nov 27 16:19:41 crc kubenswrapper[4707]: I1127 16:19:41.616421 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dtcxm\" (UniqueName: \"kubernetes.io/projected/5844334a-ffad-42d6-a2cb-b714fe85f90f-kube-api-access-dtcxm\") on node \"crc\" DevicePath \"\"" Nov 27 16:19:41 crc kubenswrapper[4707]: I1127 16:19:41.616432 4707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4f5ca70a-f09d-4baf-a71e-21c7aeefb867-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 27 16:19:41 crc kubenswrapper[4707]: I1127 16:19:41.616441 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-84nkg\" (UniqueName: \"kubernetes.io/projected/252b2a8c-9a8b-4883-9ae7-27fee7760d5f-kube-api-access-84nkg\") on node \"crc\" DevicePath \"\"" Nov 27 16:19:41 crc kubenswrapper[4707]: I1127 16:19:41.696440 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-1a8e-account-create-update-qhr8n"] Nov 27 16:19:41 crc kubenswrapper[4707]: I1127 16:19:41.786256 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-1a8e-account-create-update-qhr8n" event={"ID":"62d0aaa3-bc36-4408-ae01-14d362740486","Type":"ContainerStarted","Data":"33ecd12d150246c9b85ad78ac4e283e01861953cb6655c905984bc5e1842c10f"} Nov 27 16:19:41 crc kubenswrapper[4707]: I1127 16:19:41.787718 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-4g7sb" event={"ID":"62dcf83c-a8a2-4362-b412-8243d19bd711","Type":"ContainerStarted","Data":"507c61a4ac7e9bb2559135e7ef3b743a40b7c972c7d32f87c7cd7db9c9a5d70b"} Nov 27 16:19:41 crc kubenswrapper[4707]: I1127 16:19:41.787737 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-4g7sb" event={"ID":"62dcf83c-a8a2-4362-b412-8243d19bd711","Type":"ContainerStarted","Data":"d8d9c49c3f20dfbed2b083869fb255f0bd86058223726799d4833a4d8f74c256"} Nov 27 16:19:41 crc 
kubenswrapper[4707]: I1127 16:19:41.789830 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-dr4hp" event={"ID":"252b2a8c-9a8b-4883-9ae7-27fee7760d5f","Type":"ContainerDied","Data":"5e1de213dd3c7c73426120936e25287d7a184b8da2b003cb8f010b618bb86cd7"} Nov 27 16:19:41 crc kubenswrapper[4707]: I1127 16:19:41.789848 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5e1de213dd3c7c73426120936e25287d7a184b8da2b003cb8f010b618bb86cd7" Nov 27 16:19:41 crc kubenswrapper[4707]: I1127 16:19:41.789901 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-dr4hp" Nov 27 16:19:41 crc kubenswrapper[4707]: I1127 16:19:41.792988 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-b279-account-create-update-ndbrb" event={"ID":"5844334a-ffad-42d6-a2cb-b714fe85f90f","Type":"ContainerDied","Data":"ba9a259536aae6798fd19ed2bf858fd052d228770158ec34f6f925586de70624"} Nov 27 16:19:41 crc kubenswrapper[4707]: I1127 16:19:41.793030 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ba9a259536aae6798fd19ed2bf858fd052d228770158ec34f6f925586de70624" Nov 27 16:19:41 crc kubenswrapper[4707]: I1127 16:19:41.793117 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-b279-account-create-update-ndbrb" Nov 27 16:19:41 crc kubenswrapper[4707]: I1127 16:19:41.803888 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-create-4g7sb" podStartSLOduration=1.803869296 podStartE2EDuration="1.803869296s" podCreationTimestamp="2025-11-27 16:19:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-27 16:19:41.802221686 +0000 UTC m=+957.433670464" watchObservedRunningTime="2025-11-27 16:19:41.803869296 +0000 UTC m=+957.435318064" Nov 27 16:19:41 crc kubenswrapper[4707]: I1127 16:19:41.805011 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-8a5e-account-create-update-knzqk" event={"ID":"8dcab406-6c2d-497a-aad3-86162558c506","Type":"ContainerDied","Data":"17450d1eff86fa28413fffd45c50e8b66d1610907cccf5edb033509c8ac4d9d3"} Nov 27 16:19:41 crc kubenswrapper[4707]: I1127 16:19:41.805044 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="17450d1eff86fa28413fffd45c50e8b66d1610907cccf5edb033509c8ac4d9d3" Nov 27 16:19:41 crc kubenswrapper[4707]: I1127 16:19:41.805113 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-8a5e-account-create-update-knzqk" Nov 27 16:19:41 crc kubenswrapper[4707]: I1127 16:19:41.808095 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-9dglp" event={"ID":"4f5ca70a-f09d-4baf-a71e-21c7aeefb867","Type":"ContainerDied","Data":"aa832471ae6b288595c90c677643e523be7e88ee3aa760a040c4e5f3df24192c"} Nov 27 16:19:41 crc kubenswrapper[4707]: I1127 16:19:41.808126 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="aa832471ae6b288595c90c677643e523be7e88ee3aa760a040c4e5f3df24192c" Nov 27 16:19:41 crc kubenswrapper[4707]: I1127 16:19:41.808187 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-9dglp" Nov 27 16:19:42 crc kubenswrapper[4707]: I1127 16:19:42.817435 4707 generic.go:334] "Generic (PLEG): container finished" podID="62dcf83c-a8a2-4362-b412-8243d19bd711" containerID="507c61a4ac7e9bb2559135e7ef3b743a40b7c972c7d32f87c7cd7db9c9a5d70b" exitCode=0 Nov 27 16:19:42 crc kubenswrapper[4707]: I1127 16:19:42.817620 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-4g7sb" event={"ID":"62dcf83c-a8a2-4362-b412-8243d19bd711","Type":"ContainerDied","Data":"507c61a4ac7e9bb2559135e7ef3b743a40b7c972c7d32f87c7cd7db9c9a5d70b"} Nov 27 16:19:42 crc kubenswrapper[4707]: I1127 16:19:42.818808 4707 generic.go:334] "Generic (PLEG): container finished" podID="62d0aaa3-bc36-4408-ae01-14d362740486" containerID="01234a10d3b06021338fa24e1eb201a928dace72113bfba0d7e0f34ba6abcb40" exitCode=0 Nov 27 16:19:42 crc kubenswrapper[4707]: I1127 16:19:42.818848 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-1a8e-account-create-update-qhr8n" event={"ID":"62d0aaa3-bc36-4408-ae01-14d362740486","Type":"ContainerDied","Data":"01234a10d3b06021338fa24e1eb201a928dace72113bfba0d7e0f34ba6abcb40"} Nov 27 16:19:44 crc kubenswrapper[4707]: I1127 16:19:44.261194 
4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/4d67e130-f1e2-4fe8-9647-8725402a1cdd-etc-swift\") pod \"swift-storage-0\" (UID: \"4d67e130-f1e2-4fe8-9647-8725402a1cdd\") " pod="openstack/swift-storage-0" Nov 27 16:19:44 crc kubenswrapper[4707]: E1127 16:19:44.261385 4707 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Nov 27 16:19:44 crc kubenswrapper[4707]: E1127 16:19:44.261612 4707 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Nov 27 16:19:44 crc kubenswrapper[4707]: E1127 16:19:44.261668 4707 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/4d67e130-f1e2-4fe8-9647-8725402a1cdd-etc-swift podName:4d67e130-f1e2-4fe8-9647-8725402a1cdd nodeName:}" failed. No retries permitted until 2025-11-27 16:20:00.261649706 +0000 UTC m=+975.893098474 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/4d67e130-f1e2-4fe8-9647-8725402a1cdd-etc-swift") pod "swift-storage-0" (UID: "4d67e130-f1e2-4fe8-9647-8725402a1cdd") : configmap "swift-ring-files" not found Nov 27 16:19:44 crc kubenswrapper[4707]: I1127 16:19:44.355005 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-1a8e-account-create-update-qhr8n" Nov 27 16:19:44 crc kubenswrapper[4707]: I1127 16:19:44.365035 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-4g7sb" Nov 27 16:19:44 crc kubenswrapper[4707]: I1127 16:19:44.463615 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g2588\" (UniqueName: \"kubernetes.io/projected/62d0aaa3-bc36-4408-ae01-14d362740486-kube-api-access-g2588\") pod \"62d0aaa3-bc36-4408-ae01-14d362740486\" (UID: \"62d0aaa3-bc36-4408-ae01-14d362740486\") " Nov 27 16:19:44 crc kubenswrapper[4707]: I1127 16:19:44.463724 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/62d0aaa3-bc36-4408-ae01-14d362740486-operator-scripts\") pod \"62d0aaa3-bc36-4408-ae01-14d362740486\" (UID: \"62d0aaa3-bc36-4408-ae01-14d362740486\") " Nov 27 16:19:44 crc kubenswrapper[4707]: I1127 16:19:44.464806 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/62d0aaa3-bc36-4408-ae01-14d362740486-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "62d0aaa3-bc36-4408-ae01-14d362740486" (UID: "62d0aaa3-bc36-4408-ae01-14d362740486"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 16:19:44 crc kubenswrapper[4707]: I1127 16:19:44.468399 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/62d0aaa3-bc36-4408-ae01-14d362740486-kube-api-access-g2588" (OuterVolumeSpecName: "kube-api-access-g2588") pod "62d0aaa3-bc36-4408-ae01-14d362740486" (UID: "62d0aaa3-bc36-4408-ae01-14d362740486"). InnerVolumeSpecName "kube-api-access-g2588". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 16:19:44 crc kubenswrapper[4707]: I1127 16:19:44.565743 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jtpp4\" (UniqueName: \"kubernetes.io/projected/62dcf83c-a8a2-4362-b412-8243d19bd711-kube-api-access-jtpp4\") pod \"62dcf83c-a8a2-4362-b412-8243d19bd711\" (UID: \"62dcf83c-a8a2-4362-b412-8243d19bd711\") " Nov 27 16:19:44 crc kubenswrapper[4707]: I1127 16:19:44.565990 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/62dcf83c-a8a2-4362-b412-8243d19bd711-operator-scripts\") pod \"62dcf83c-a8a2-4362-b412-8243d19bd711\" (UID: \"62dcf83c-a8a2-4362-b412-8243d19bd711\") " Nov 27 16:19:44 crc kubenswrapper[4707]: I1127 16:19:44.566799 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/62dcf83c-a8a2-4362-b412-8243d19bd711-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "62dcf83c-a8a2-4362-b412-8243d19bd711" (UID: "62dcf83c-a8a2-4362-b412-8243d19bd711"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 16:19:44 crc kubenswrapper[4707]: I1127 16:19:44.566957 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g2588\" (UniqueName: \"kubernetes.io/projected/62d0aaa3-bc36-4408-ae01-14d362740486-kube-api-access-g2588\") on node \"crc\" DevicePath \"\"" Nov 27 16:19:44 crc kubenswrapper[4707]: I1127 16:19:44.566990 4707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/62d0aaa3-bc36-4408-ae01-14d362740486-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 27 16:19:44 crc kubenswrapper[4707]: I1127 16:19:44.570577 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/62dcf83c-a8a2-4362-b412-8243d19bd711-kube-api-access-jtpp4" (OuterVolumeSpecName: "kube-api-access-jtpp4") pod "62dcf83c-a8a2-4362-b412-8243d19bd711" (UID: "62dcf83c-a8a2-4362-b412-8243d19bd711"). InnerVolumeSpecName "kube-api-access-jtpp4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 16:19:44 crc kubenswrapper[4707]: I1127 16:19:44.641041 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-northd-0" Nov 27 16:19:44 crc kubenswrapper[4707]: I1127 16:19:44.670190 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jtpp4\" (UniqueName: \"kubernetes.io/projected/62dcf83c-a8a2-4362-b412-8243d19bd711-kube-api-access-jtpp4\") on node \"crc\" DevicePath \"\"" Nov 27 16:19:44 crc kubenswrapper[4707]: I1127 16:19:44.670228 4707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/62dcf83c-a8a2-4362-b412-8243d19bd711-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 27 16:19:44 crc kubenswrapper[4707]: I1127 16:19:44.840885 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-1a8e-account-create-update-qhr8n" event={"ID":"62d0aaa3-bc36-4408-ae01-14d362740486","Type":"ContainerDied","Data":"33ecd12d150246c9b85ad78ac4e283e01861953cb6655c905984bc5e1842c10f"} Nov 27 16:19:44 crc kubenswrapper[4707]: I1127 16:19:44.840959 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="33ecd12d150246c9b85ad78ac4e283e01861953cb6655c905984bc5e1842c10f" Nov 27 16:19:44 crc kubenswrapper[4707]: I1127 16:19:44.840912 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-1a8e-account-create-update-qhr8n" Nov 27 16:19:44 crc kubenswrapper[4707]: I1127 16:19:44.846849 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-4g7sb" event={"ID":"62dcf83c-a8a2-4362-b412-8243d19bd711","Type":"ContainerDied","Data":"d8d9c49c3f20dfbed2b083869fb255f0bd86058223726799d4833a4d8f74c256"} Nov 27 16:19:44 crc kubenswrapper[4707]: I1127 16:19:44.846906 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d8d9c49c3f20dfbed2b083869fb255f0bd86058223726799d4833a4d8f74c256" Nov 27 16:19:44 crc kubenswrapper[4707]: I1127 16:19:44.846922 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-4g7sb" Nov 27 16:19:45 crc kubenswrapper[4707]: I1127 16:19:45.776285 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-sync-kdv4k"] Nov 27 16:19:45 crc kubenswrapper[4707]: E1127 16:19:45.776752 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8dcab406-6c2d-497a-aad3-86162558c506" containerName="mariadb-account-create-update" Nov 27 16:19:45 crc kubenswrapper[4707]: I1127 16:19:45.776775 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="8dcab406-6c2d-497a-aad3-86162558c506" containerName="mariadb-account-create-update" Nov 27 16:19:45 crc kubenswrapper[4707]: E1127 16:19:45.776784 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5844334a-ffad-42d6-a2cb-b714fe85f90f" containerName="mariadb-account-create-update" Nov 27 16:19:45 crc kubenswrapper[4707]: I1127 16:19:45.776792 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="5844334a-ffad-42d6-a2cb-b714fe85f90f" containerName="mariadb-account-create-update" Nov 27 16:19:45 crc kubenswrapper[4707]: E1127 16:19:45.776811 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4f5ca70a-f09d-4baf-a71e-21c7aeefb867" 
containerName="mariadb-database-create" Nov 27 16:19:45 crc kubenswrapper[4707]: I1127 16:19:45.776820 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="4f5ca70a-f09d-4baf-a71e-21c7aeefb867" containerName="mariadb-database-create" Nov 27 16:19:45 crc kubenswrapper[4707]: E1127 16:19:45.776841 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="62dcf83c-a8a2-4362-b412-8243d19bd711" containerName="mariadb-database-create" Nov 27 16:19:45 crc kubenswrapper[4707]: I1127 16:19:45.776849 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="62dcf83c-a8a2-4362-b412-8243d19bd711" containerName="mariadb-database-create" Nov 27 16:19:45 crc kubenswrapper[4707]: E1127 16:19:45.776869 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="252b2a8c-9a8b-4883-9ae7-27fee7760d5f" containerName="mariadb-database-create" Nov 27 16:19:45 crc kubenswrapper[4707]: I1127 16:19:45.776877 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="252b2a8c-9a8b-4883-9ae7-27fee7760d5f" containerName="mariadb-database-create" Nov 27 16:19:45 crc kubenswrapper[4707]: E1127 16:19:45.776889 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="62d0aaa3-bc36-4408-ae01-14d362740486" containerName="mariadb-account-create-update" Nov 27 16:19:45 crc kubenswrapper[4707]: I1127 16:19:45.776896 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="62d0aaa3-bc36-4408-ae01-14d362740486" containerName="mariadb-account-create-update" Nov 27 16:19:45 crc kubenswrapper[4707]: I1127 16:19:45.777108 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="8dcab406-6c2d-497a-aad3-86162558c506" containerName="mariadb-account-create-update" Nov 27 16:19:45 crc kubenswrapper[4707]: I1127 16:19:45.777127 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="62d0aaa3-bc36-4408-ae01-14d362740486" containerName="mariadb-account-create-update" Nov 27 16:19:45 crc kubenswrapper[4707]: I1127 16:19:45.777144 4707 
memory_manager.go:354] "RemoveStaleState removing state" podUID="4f5ca70a-f09d-4baf-a71e-21c7aeefb867" containerName="mariadb-database-create" Nov 27 16:19:45 crc kubenswrapper[4707]: I1127 16:19:45.777159 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="252b2a8c-9a8b-4883-9ae7-27fee7760d5f" containerName="mariadb-database-create" Nov 27 16:19:45 crc kubenswrapper[4707]: I1127 16:19:45.777170 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="62dcf83c-a8a2-4362-b412-8243d19bd711" containerName="mariadb-database-create" Nov 27 16:19:45 crc kubenswrapper[4707]: I1127 16:19:45.777183 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="5844334a-ffad-42d6-a2cb-b714fe85f90f" containerName="mariadb-account-create-update" Nov 27 16:19:45 crc kubenswrapper[4707]: I1127 16:19:45.777753 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-kdv4k" Nov 27 16:19:45 crc kubenswrapper[4707]: I1127 16:19:45.781749 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-config-data" Nov 27 16:19:45 crc kubenswrapper[4707]: I1127 16:19:45.783440 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-wtmvr" Nov 27 16:19:45 crc kubenswrapper[4707]: I1127 16:19:45.795892 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-kdv4k"] Nov 27 16:19:45 crc kubenswrapper[4707]: I1127 16:19:45.828011 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nzznc\" (UniqueName: \"kubernetes.io/projected/09fb5cc3-1c55-459c-aa89-63f13e8f97f4-kube-api-access-nzznc\") pod \"glance-db-sync-kdv4k\" (UID: \"09fb5cc3-1c55-459c-aa89-63f13e8f97f4\") " pod="openstack/glance-db-sync-kdv4k" Nov 27 16:19:45 crc kubenswrapper[4707]: I1127 16:19:45.828055 4707 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/09fb5cc3-1c55-459c-aa89-63f13e8f97f4-config-data\") pod \"glance-db-sync-kdv4k\" (UID: \"09fb5cc3-1c55-459c-aa89-63f13e8f97f4\") " pod="openstack/glance-db-sync-kdv4k" Nov 27 16:19:45 crc kubenswrapper[4707]: I1127 16:19:45.828085 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/09fb5cc3-1c55-459c-aa89-63f13e8f97f4-combined-ca-bundle\") pod \"glance-db-sync-kdv4k\" (UID: \"09fb5cc3-1c55-459c-aa89-63f13e8f97f4\") " pod="openstack/glance-db-sync-kdv4k" Nov 27 16:19:45 crc kubenswrapper[4707]: I1127 16:19:45.828137 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/09fb5cc3-1c55-459c-aa89-63f13e8f97f4-db-sync-config-data\") pod \"glance-db-sync-kdv4k\" (UID: \"09fb5cc3-1c55-459c-aa89-63f13e8f97f4\") " pod="openstack/glance-db-sync-kdv4k" Nov 27 16:19:45 crc kubenswrapper[4707]: I1127 16:19:45.856780 4707 generic.go:334] "Generic (PLEG): container finished" podID="26d4145c-3144-4e1f-99ce-08d64f8b20be" containerID="de655b1cbf8d621b297ad01f2772c2fef583b458c6521d3d3ac16d2832fd70a5" exitCode=0 Nov 27 16:19:45 crc kubenswrapper[4707]: I1127 16:19:45.856834 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-vbngp" event={"ID":"26d4145c-3144-4e1f-99ce-08d64f8b20be","Type":"ContainerDied","Data":"de655b1cbf8d621b297ad01f2772c2fef583b458c6521d3d3ac16d2832fd70a5"} Nov 27 16:19:45 crc kubenswrapper[4707]: I1127 16:19:45.929753 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nzznc\" (UniqueName: \"kubernetes.io/projected/09fb5cc3-1c55-459c-aa89-63f13e8f97f4-kube-api-access-nzznc\") pod \"glance-db-sync-kdv4k\" (UID: \"09fb5cc3-1c55-459c-aa89-63f13e8f97f4\") 
" pod="openstack/glance-db-sync-kdv4k" Nov 27 16:19:45 crc kubenswrapper[4707]: I1127 16:19:45.930119 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/09fb5cc3-1c55-459c-aa89-63f13e8f97f4-config-data\") pod \"glance-db-sync-kdv4k\" (UID: \"09fb5cc3-1c55-459c-aa89-63f13e8f97f4\") " pod="openstack/glance-db-sync-kdv4k" Nov 27 16:19:45 crc kubenswrapper[4707]: I1127 16:19:45.930158 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/09fb5cc3-1c55-459c-aa89-63f13e8f97f4-combined-ca-bundle\") pod \"glance-db-sync-kdv4k\" (UID: \"09fb5cc3-1c55-459c-aa89-63f13e8f97f4\") " pod="openstack/glance-db-sync-kdv4k" Nov 27 16:19:45 crc kubenswrapper[4707]: I1127 16:19:45.930230 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/09fb5cc3-1c55-459c-aa89-63f13e8f97f4-db-sync-config-data\") pod \"glance-db-sync-kdv4k\" (UID: \"09fb5cc3-1c55-459c-aa89-63f13e8f97f4\") " pod="openstack/glance-db-sync-kdv4k" Nov 27 16:19:45 crc kubenswrapper[4707]: I1127 16:19:45.933272 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/09fb5cc3-1c55-459c-aa89-63f13e8f97f4-config-data\") pod \"glance-db-sync-kdv4k\" (UID: \"09fb5cc3-1c55-459c-aa89-63f13e8f97f4\") " pod="openstack/glance-db-sync-kdv4k" Nov 27 16:19:45 crc kubenswrapper[4707]: I1127 16:19:45.935991 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/09fb5cc3-1c55-459c-aa89-63f13e8f97f4-db-sync-config-data\") pod \"glance-db-sync-kdv4k\" (UID: \"09fb5cc3-1c55-459c-aa89-63f13e8f97f4\") " pod="openstack/glance-db-sync-kdv4k" Nov 27 16:19:45 crc kubenswrapper[4707]: I1127 16:19:45.937707 4707 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/09fb5cc3-1c55-459c-aa89-63f13e8f97f4-combined-ca-bundle\") pod \"glance-db-sync-kdv4k\" (UID: \"09fb5cc3-1c55-459c-aa89-63f13e8f97f4\") " pod="openstack/glance-db-sync-kdv4k" Nov 27 16:19:45 crc kubenswrapper[4707]: I1127 16:19:45.949205 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nzznc\" (UniqueName: \"kubernetes.io/projected/09fb5cc3-1c55-459c-aa89-63f13e8f97f4-kube-api-access-nzznc\") pod \"glance-db-sync-kdv4k\" (UID: \"09fb5cc3-1c55-459c-aa89-63f13e8f97f4\") " pod="openstack/glance-db-sync-kdv4k" Nov 27 16:19:46 crc kubenswrapper[4707]: I1127 16:19:46.096476 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-kdv4k" Nov 27 16:19:46 crc kubenswrapper[4707]: I1127 16:19:46.241709 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-57z78"] Nov 27 16:19:46 crc kubenswrapper[4707]: I1127 16:19:46.243419 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-57z78" Nov 27 16:19:46 crc kubenswrapper[4707]: I1127 16:19:46.257806 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-57z78"] Nov 27 16:19:46 crc kubenswrapper[4707]: I1127 16:19:46.345352 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4qdm4\" (UniqueName: \"kubernetes.io/projected/83c9f54d-8957-42f2-9b0a-24b4cc72a2a7-kube-api-access-4qdm4\") pod \"redhat-marketplace-57z78\" (UID: \"83c9f54d-8957-42f2-9b0a-24b4cc72a2a7\") " pod="openshift-marketplace/redhat-marketplace-57z78" Nov 27 16:19:46 crc kubenswrapper[4707]: I1127 16:19:46.345431 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/83c9f54d-8957-42f2-9b0a-24b4cc72a2a7-utilities\") pod \"redhat-marketplace-57z78\" (UID: \"83c9f54d-8957-42f2-9b0a-24b4cc72a2a7\") " pod="openshift-marketplace/redhat-marketplace-57z78" Nov 27 16:19:46 crc kubenswrapper[4707]: I1127 16:19:46.345623 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/83c9f54d-8957-42f2-9b0a-24b4cc72a2a7-catalog-content\") pod \"redhat-marketplace-57z78\" (UID: \"83c9f54d-8957-42f2-9b0a-24b4cc72a2a7\") " pod="openshift-marketplace/redhat-marketplace-57z78" Nov 27 16:19:46 crc kubenswrapper[4707]: I1127 16:19:46.446864 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/83c9f54d-8957-42f2-9b0a-24b4cc72a2a7-catalog-content\") pod \"redhat-marketplace-57z78\" (UID: \"83c9f54d-8957-42f2-9b0a-24b4cc72a2a7\") " pod="openshift-marketplace/redhat-marketplace-57z78" Nov 27 16:19:46 crc kubenswrapper[4707]: I1127 16:19:46.446937 4707 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-4qdm4\" (UniqueName: \"kubernetes.io/projected/83c9f54d-8957-42f2-9b0a-24b4cc72a2a7-kube-api-access-4qdm4\") pod \"redhat-marketplace-57z78\" (UID: \"83c9f54d-8957-42f2-9b0a-24b4cc72a2a7\") " pod="openshift-marketplace/redhat-marketplace-57z78" Nov 27 16:19:46 crc kubenswrapper[4707]: I1127 16:19:46.446970 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/83c9f54d-8957-42f2-9b0a-24b4cc72a2a7-utilities\") pod \"redhat-marketplace-57z78\" (UID: \"83c9f54d-8957-42f2-9b0a-24b4cc72a2a7\") " pod="openshift-marketplace/redhat-marketplace-57z78" Nov 27 16:19:46 crc kubenswrapper[4707]: I1127 16:19:46.447334 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/83c9f54d-8957-42f2-9b0a-24b4cc72a2a7-catalog-content\") pod \"redhat-marketplace-57z78\" (UID: \"83c9f54d-8957-42f2-9b0a-24b4cc72a2a7\") " pod="openshift-marketplace/redhat-marketplace-57z78" Nov 27 16:19:46 crc kubenswrapper[4707]: I1127 16:19:46.447334 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/83c9f54d-8957-42f2-9b0a-24b4cc72a2a7-utilities\") pod \"redhat-marketplace-57z78\" (UID: \"83c9f54d-8957-42f2-9b0a-24b4cc72a2a7\") " pod="openshift-marketplace/redhat-marketplace-57z78" Nov 27 16:19:46 crc kubenswrapper[4707]: I1127 16:19:46.479283 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4qdm4\" (UniqueName: \"kubernetes.io/projected/83c9f54d-8957-42f2-9b0a-24b4cc72a2a7-kube-api-access-4qdm4\") pod \"redhat-marketplace-57z78\" (UID: \"83c9f54d-8957-42f2-9b0a-24b4cc72a2a7\") " pod="openshift-marketplace/redhat-marketplace-57z78" Nov 27 16:19:46 crc kubenswrapper[4707]: I1127 16:19:46.571657 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-57z78" Nov 27 16:19:46 crc kubenswrapper[4707]: I1127 16:19:46.727839 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-kdv4k"] Nov 27 16:19:46 crc kubenswrapper[4707]: W1127 16:19:46.734742 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod09fb5cc3_1c55_459c_aa89_63f13e8f97f4.slice/crio-a90b31025fffaab4aef0e5edd894d0033035c4ce7bf7c5cfea449e8bf94d16a3 WatchSource:0}: Error finding container a90b31025fffaab4aef0e5edd894d0033035c4ce7bf7c5cfea449e8bf94d16a3: Status 404 returned error can't find the container with id a90b31025fffaab4aef0e5edd894d0033035c4ce7bf7c5cfea449e8bf94d16a3 Nov 27 16:19:46 crc kubenswrapper[4707]: I1127 16:19:46.866336 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-kdv4k" event={"ID":"09fb5cc3-1c55-459c-aa89-63f13e8f97f4","Type":"ContainerStarted","Data":"a90b31025fffaab4aef0e5edd894d0033035c4ce7bf7c5cfea449e8bf94d16a3"} Nov 27 16:19:47 crc kubenswrapper[4707]: W1127 16:19:47.073644 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod83c9f54d_8957_42f2_9b0a_24b4cc72a2a7.slice/crio-43991da9f7d31b02b9c057e3ef1d769990c6b3dfe3b2f8278bd03293d68fc6ea WatchSource:0}: Error finding container 43991da9f7d31b02b9c057e3ef1d769990c6b3dfe3b2f8278bd03293d68fc6ea: Status 404 returned error can't find the container with id 43991da9f7d31b02b9c057e3ef1d769990c6b3dfe3b2f8278bd03293d68fc6ea Nov 27 16:19:47 crc kubenswrapper[4707]: I1127 16:19:47.093281 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-57z78"] Nov 27 16:19:47 crc kubenswrapper[4707]: I1127 16:19:47.265859 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-vbngp" Nov 27 16:19:47 crc kubenswrapper[4707]: I1127 16:19:47.390209 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/26d4145c-3144-4e1f-99ce-08d64f8b20be-ring-data-devices\") pod \"26d4145c-3144-4e1f-99ce-08d64f8b20be\" (UID: \"26d4145c-3144-4e1f-99ce-08d64f8b20be\") " Nov 27 16:19:47 crc kubenswrapper[4707]: I1127 16:19:47.390743 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/26d4145c-3144-4e1f-99ce-08d64f8b20be-combined-ca-bundle\") pod \"26d4145c-3144-4e1f-99ce-08d64f8b20be\" (UID: \"26d4145c-3144-4e1f-99ce-08d64f8b20be\") " Nov 27 16:19:47 crc kubenswrapper[4707]: I1127 16:19:47.390965 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/26d4145c-3144-4e1f-99ce-08d64f8b20be-scripts\") pod \"26d4145c-3144-4e1f-99ce-08d64f8b20be\" (UID: \"26d4145c-3144-4e1f-99ce-08d64f8b20be\") " Nov 27 16:19:47 crc kubenswrapper[4707]: I1127 16:19:47.391083 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/26d4145c-3144-4e1f-99ce-08d64f8b20be-etc-swift\") pod \"26d4145c-3144-4e1f-99ce-08d64f8b20be\" (UID: \"26d4145c-3144-4e1f-99ce-08d64f8b20be\") " Nov 27 16:19:47 crc kubenswrapper[4707]: I1127 16:19:47.391213 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/26d4145c-3144-4e1f-99ce-08d64f8b20be-dispersionconf\") pod \"26d4145c-3144-4e1f-99ce-08d64f8b20be\" (UID: \"26d4145c-3144-4e1f-99ce-08d64f8b20be\") " Nov 27 16:19:47 crc kubenswrapper[4707]: I1127 16:19:47.391342 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: 
\"kubernetes.io/secret/26d4145c-3144-4e1f-99ce-08d64f8b20be-swiftconf\") pod \"26d4145c-3144-4e1f-99ce-08d64f8b20be\" (UID: \"26d4145c-3144-4e1f-99ce-08d64f8b20be\") " Nov 27 16:19:47 crc kubenswrapper[4707]: I1127 16:19:47.391451 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ktfrq\" (UniqueName: \"kubernetes.io/projected/26d4145c-3144-4e1f-99ce-08d64f8b20be-kube-api-access-ktfrq\") pod \"26d4145c-3144-4e1f-99ce-08d64f8b20be\" (UID: \"26d4145c-3144-4e1f-99ce-08d64f8b20be\") " Nov 27 16:19:47 crc kubenswrapper[4707]: I1127 16:19:47.391008 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/26d4145c-3144-4e1f-99ce-08d64f8b20be-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "26d4145c-3144-4e1f-99ce-08d64f8b20be" (UID: "26d4145c-3144-4e1f-99ce-08d64f8b20be"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 16:19:47 crc kubenswrapper[4707]: I1127 16:19:47.392907 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/26d4145c-3144-4e1f-99ce-08d64f8b20be-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "26d4145c-3144-4e1f-99ce-08d64f8b20be" (UID: "26d4145c-3144-4e1f-99ce-08d64f8b20be"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 27 16:19:47 crc kubenswrapper[4707]: I1127 16:19:47.397820 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/26d4145c-3144-4e1f-99ce-08d64f8b20be-kube-api-access-ktfrq" (OuterVolumeSpecName: "kube-api-access-ktfrq") pod "26d4145c-3144-4e1f-99ce-08d64f8b20be" (UID: "26d4145c-3144-4e1f-99ce-08d64f8b20be"). InnerVolumeSpecName "kube-api-access-ktfrq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 16:19:47 crc kubenswrapper[4707]: I1127 16:19:47.398608 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/26d4145c-3144-4e1f-99ce-08d64f8b20be-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "26d4145c-3144-4e1f-99ce-08d64f8b20be" (UID: "26d4145c-3144-4e1f-99ce-08d64f8b20be"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 16:19:47 crc kubenswrapper[4707]: I1127 16:19:47.417393 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/26d4145c-3144-4e1f-99ce-08d64f8b20be-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "26d4145c-3144-4e1f-99ce-08d64f8b20be" (UID: "26d4145c-3144-4e1f-99ce-08d64f8b20be"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 16:19:47 crc kubenswrapper[4707]: I1127 16:19:47.417778 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/26d4145c-3144-4e1f-99ce-08d64f8b20be-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "26d4145c-3144-4e1f-99ce-08d64f8b20be" (UID: "26d4145c-3144-4e1f-99ce-08d64f8b20be"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 16:19:47 crc kubenswrapper[4707]: I1127 16:19:47.426225 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/26d4145c-3144-4e1f-99ce-08d64f8b20be-scripts" (OuterVolumeSpecName: "scripts") pod "26d4145c-3144-4e1f-99ce-08d64f8b20be" (UID: "26d4145c-3144-4e1f-99ce-08d64f8b20be"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 16:19:47 crc kubenswrapper[4707]: I1127 16:19:47.494969 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/26d4145c-3144-4e1f-99ce-08d64f8b20be-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 27 16:19:47 crc kubenswrapper[4707]: I1127 16:19:47.495043 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/26d4145c-3144-4e1f-99ce-08d64f8b20be-scripts\") on node \"crc\" DevicePath \"\"" Nov 27 16:19:47 crc kubenswrapper[4707]: I1127 16:19:47.495053 4707 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/26d4145c-3144-4e1f-99ce-08d64f8b20be-etc-swift\") on node \"crc\" DevicePath \"\"" Nov 27 16:19:47 crc kubenswrapper[4707]: I1127 16:19:47.495061 4707 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/26d4145c-3144-4e1f-99ce-08d64f8b20be-dispersionconf\") on node \"crc\" DevicePath \"\"" Nov 27 16:19:47 crc kubenswrapper[4707]: I1127 16:19:47.495070 4707 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/26d4145c-3144-4e1f-99ce-08d64f8b20be-swiftconf\") on node \"crc\" DevicePath \"\"" Nov 27 16:19:47 crc kubenswrapper[4707]: I1127 16:19:47.495079 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ktfrq\" (UniqueName: \"kubernetes.io/projected/26d4145c-3144-4e1f-99ce-08d64f8b20be-kube-api-access-ktfrq\") on node \"crc\" DevicePath \"\"" Nov 27 16:19:47 crc kubenswrapper[4707]: I1127 16:19:47.495117 4707 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/26d4145c-3144-4e1f-99ce-08d64f8b20be-ring-data-devices\") on node \"crc\" DevicePath \"\"" Nov 27 16:19:47 crc kubenswrapper[4707]: I1127 16:19:47.876112 4707 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-vbngp" event={"ID":"26d4145c-3144-4e1f-99ce-08d64f8b20be","Type":"ContainerDied","Data":"962622b3b6a3211fdef3610dcb44ce8b04124f34a0a4967d59f358deb68b93c6"} Nov 27 16:19:47 crc kubenswrapper[4707]: I1127 16:19:47.876172 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="962622b3b6a3211fdef3610dcb44ce8b04124f34a0a4967d59f358deb68b93c6" Nov 27 16:19:47 crc kubenswrapper[4707]: I1127 16:19:47.876246 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-vbngp" Nov 27 16:19:47 crc kubenswrapper[4707]: I1127 16:19:47.883465 4707 generic.go:334] "Generic (PLEG): container finished" podID="83c9f54d-8957-42f2-9b0a-24b4cc72a2a7" containerID="29c2ffd9a15e55284540a9d1c0e190a93ba2cece0d7c499b437ae711000839bc" exitCode=0 Nov 27 16:19:47 crc kubenswrapper[4707]: I1127 16:19:47.883595 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-57z78" event={"ID":"83c9f54d-8957-42f2-9b0a-24b4cc72a2a7","Type":"ContainerDied","Data":"29c2ffd9a15e55284540a9d1c0e190a93ba2cece0d7c499b437ae711000839bc"} Nov 27 16:19:47 crc kubenswrapper[4707]: I1127 16:19:47.883852 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-57z78" event={"ID":"83c9f54d-8957-42f2-9b0a-24b4cc72a2a7","Type":"ContainerStarted","Data":"43991da9f7d31b02b9c057e3ef1d769990c6b3dfe3b2f8278bd03293d68fc6ea"} Nov 27 16:19:48 crc kubenswrapper[4707]: I1127 16:19:48.901117 4707 generic.go:334] "Generic (PLEG): container finished" podID="b31a7b86-c43f-4123-a33d-ffba2ee3d015" containerID="422f2d77f62c72279e75322c43beedda7bb264d013ad32e8514fa1cdf5f2c05e" exitCode=0 Nov 27 16:19:48 crc kubenswrapper[4707]: I1127 16:19:48.901156 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" 
event={"ID":"b31a7b86-c43f-4123-a33d-ffba2ee3d015","Type":"ContainerDied","Data":"422f2d77f62c72279e75322c43beedda7bb264d013ad32e8514fa1cdf5f2c05e"} Nov 27 16:19:49 crc kubenswrapper[4707]: I1127 16:19:49.212689 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-t7wqv"] Nov 27 16:19:49 crc kubenswrapper[4707]: E1127 16:19:49.213313 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="26d4145c-3144-4e1f-99ce-08d64f8b20be" containerName="swift-ring-rebalance" Nov 27 16:19:49 crc kubenswrapper[4707]: I1127 16:19:49.213332 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="26d4145c-3144-4e1f-99ce-08d64f8b20be" containerName="swift-ring-rebalance" Nov 27 16:19:49 crc kubenswrapper[4707]: I1127 16:19:49.213562 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="26d4145c-3144-4e1f-99ce-08d64f8b20be" containerName="swift-ring-rebalance" Nov 27 16:19:49 crc kubenswrapper[4707]: I1127 16:19:49.214914 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-t7wqv" Nov 27 16:19:49 crc kubenswrapper[4707]: I1127 16:19:49.227306 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-t7wqv"] Nov 27 16:19:49 crc kubenswrapper[4707]: I1127 16:19:49.358037 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/79a68d54-eb44-474e-9c5f-1c2283a6d410-catalog-content\") pod \"redhat-operators-t7wqv\" (UID: \"79a68d54-eb44-474e-9c5f-1c2283a6d410\") " pod="openshift-marketplace/redhat-operators-t7wqv" Nov 27 16:19:49 crc kubenswrapper[4707]: I1127 16:19:49.358115 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/79a68d54-eb44-474e-9c5f-1c2283a6d410-utilities\") pod \"redhat-operators-t7wqv\" (UID: \"79a68d54-eb44-474e-9c5f-1c2283a6d410\") " pod="openshift-marketplace/redhat-operators-t7wqv" Nov 27 16:19:49 crc kubenswrapper[4707]: I1127 16:19:49.358174 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rbl22\" (UniqueName: \"kubernetes.io/projected/79a68d54-eb44-474e-9c5f-1c2283a6d410-kube-api-access-rbl22\") pod \"redhat-operators-t7wqv\" (UID: \"79a68d54-eb44-474e-9c5f-1c2283a6d410\") " pod="openshift-marketplace/redhat-operators-t7wqv" Nov 27 16:19:49 crc kubenswrapper[4707]: I1127 16:19:49.459941 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/79a68d54-eb44-474e-9c5f-1c2283a6d410-catalog-content\") pod \"redhat-operators-t7wqv\" (UID: \"79a68d54-eb44-474e-9c5f-1c2283a6d410\") " pod="openshift-marketplace/redhat-operators-t7wqv" Nov 27 16:19:49 crc kubenswrapper[4707]: I1127 16:19:49.460020 4707 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/79a68d54-eb44-474e-9c5f-1c2283a6d410-utilities\") pod \"redhat-operators-t7wqv\" (UID: \"79a68d54-eb44-474e-9c5f-1c2283a6d410\") " pod="openshift-marketplace/redhat-operators-t7wqv" Nov 27 16:19:49 crc kubenswrapper[4707]: I1127 16:19:49.460060 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rbl22\" (UniqueName: \"kubernetes.io/projected/79a68d54-eb44-474e-9c5f-1c2283a6d410-kube-api-access-rbl22\") pod \"redhat-operators-t7wqv\" (UID: \"79a68d54-eb44-474e-9c5f-1c2283a6d410\") " pod="openshift-marketplace/redhat-operators-t7wqv" Nov 27 16:19:49 crc kubenswrapper[4707]: I1127 16:19:49.460506 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/79a68d54-eb44-474e-9c5f-1c2283a6d410-catalog-content\") pod \"redhat-operators-t7wqv\" (UID: \"79a68d54-eb44-474e-9c5f-1c2283a6d410\") " pod="openshift-marketplace/redhat-operators-t7wqv" Nov 27 16:19:49 crc kubenswrapper[4707]: I1127 16:19:49.460598 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/79a68d54-eb44-474e-9c5f-1c2283a6d410-utilities\") pod \"redhat-operators-t7wqv\" (UID: \"79a68d54-eb44-474e-9c5f-1c2283a6d410\") " pod="openshift-marketplace/redhat-operators-t7wqv" Nov 27 16:19:49 crc kubenswrapper[4707]: I1127 16:19:49.475710 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rbl22\" (UniqueName: \"kubernetes.io/projected/79a68d54-eb44-474e-9c5f-1c2283a6d410-kube-api-access-rbl22\") pod \"redhat-operators-t7wqv\" (UID: \"79a68d54-eb44-474e-9c5f-1c2283a6d410\") " pod="openshift-marketplace/redhat-operators-t7wqv" Nov 27 16:19:49 crc kubenswrapper[4707]: I1127 16:19:49.573298 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-t7wqv" Nov 27 16:19:49 crc kubenswrapper[4707]: I1127 16:19:49.916244 4707 generic.go:334] "Generic (PLEG): container finished" podID="83c9f54d-8957-42f2-9b0a-24b4cc72a2a7" containerID="e079369df42d8158284ca1209b91d297a62f0ae1b6896e9351cb1a521d756f69" exitCode=0 Nov 27 16:19:49 crc kubenswrapper[4707]: I1127 16:19:49.916650 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-57z78" event={"ID":"83c9f54d-8957-42f2-9b0a-24b4cc72a2a7","Type":"ContainerDied","Data":"e079369df42d8158284ca1209b91d297a62f0ae1b6896e9351cb1a521d756f69"} Nov 27 16:19:49 crc kubenswrapper[4707]: I1127 16:19:49.918916 4707 generic.go:334] "Generic (PLEG): container finished" podID="517a2efb-7c9f-4c93-876b-5962da604ef8" containerID="df03a765a612e1232ddf145ad82f405beba655bd4286069693c18c9067e1ee60" exitCode=0 Nov 27 16:19:49 crc kubenswrapper[4707]: I1127 16:19:49.918978 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"517a2efb-7c9f-4c93-876b-5962da604ef8","Type":"ContainerDied","Data":"df03a765a612e1232ddf145ad82f405beba655bd4286069693c18c9067e1ee60"} Nov 27 16:19:49 crc kubenswrapper[4707]: I1127 16:19:49.925276 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"b31a7b86-c43f-4123-a33d-ffba2ee3d015","Type":"ContainerStarted","Data":"9be83be108fd0cd2b2af945d1754e6348621417599bb74d5879c690e62c8885e"} Nov 27 16:19:49 crc kubenswrapper[4707]: I1127 16:19:49.925578 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Nov 27 16:19:49 crc kubenswrapper[4707]: I1127 16:19:49.977972 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=50.430759555 podStartE2EDuration="59.977953874s" podCreationTimestamp="2025-11-27 16:18:50 +0000 UTC" 
firstStartedPulling="2025-11-27 16:19:05.060087102 +0000 UTC m=+920.691535870" lastFinishedPulling="2025-11-27 16:19:14.607281371 +0000 UTC m=+930.238730189" observedRunningTime="2025-11-27 16:19:49.962822096 +0000 UTC m=+965.594270864" watchObservedRunningTime="2025-11-27 16:19:49.977953874 +0000 UTC m=+965.609402642" Nov 27 16:19:50 crc kubenswrapper[4707]: I1127 16:19:50.044436 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-t7wqv"] Nov 27 16:19:50 crc kubenswrapper[4707]: I1127 16:19:50.934472 4707 generic.go:334] "Generic (PLEG): container finished" podID="79a68d54-eb44-474e-9c5f-1c2283a6d410" containerID="23fbd7e5fba10d6f58e7a1cfbe5368c5020340b14aee0b7f90f255c0f0d7e164" exitCode=0 Nov 27 16:19:50 crc kubenswrapper[4707]: I1127 16:19:50.934565 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-t7wqv" event={"ID":"79a68d54-eb44-474e-9c5f-1c2283a6d410","Type":"ContainerDied","Data":"23fbd7e5fba10d6f58e7a1cfbe5368c5020340b14aee0b7f90f255c0f0d7e164"} Nov 27 16:19:50 crc kubenswrapper[4707]: I1127 16:19:50.934875 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-t7wqv" event={"ID":"79a68d54-eb44-474e-9c5f-1c2283a6d410","Type":"ContainerStarted","Data":"d250762f4c9305264cbd0bacb82d5c46ef04492b886e79ea4b850484d3090b47"} Nov 27 16:19:50 crc kubenswrapper[4707]: I1127 16:19:50.940882 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"517a2efb-7c9f-4c93-876b-5962da604ef8","Type":"ContainerStarted","Data":"a5155d0fa33d5e609f9f6c6cbec47aa989b67e73bf27a2ce322263b0929f14da"} Nov 27 16:19:50 crc kubenswrapper[4707]: I1127 16:19:50.941474 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Nov 27 16:19:50 crc kubenswrapper[4707]: I1127 16:19:50.999846 4707 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=51.61807786 podStartE2EDuration="1m0.999829627s" podCreationTimestamp="2025-11-27 16:18:50 +0000 UTC" firstStartedPulling="2025-11-27 16:19:05.487968144 +0000 UTC m=+921.119416912" lastFinishedPulling="2025-11-27 16:19:14.869719911 +0000 UTC m=+930.501168679" observedRunningTime="2025-11-27 16:19:50.997384627 +0000 UTC m=+966.628833415" watchObservedRunningTime="2025-11-27 16:19:50.999829627 +0000 UTC m=+966.631278395" Nov 27 16:19:51 crc kubenswrapper[4707]: I1127 16:19:51.034497 4707 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-vvkr6" podUID="9639769b-4439-4ffc-b88b-cba953013bff" containerName="ovn-controller" probeResult="failure" output=< Nov 27 16:19:51 crc kubenswrapper[4707]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Nov 27 16:19:51 crc kubenswrapper[4707]: > Nov 27 16:19:51 crc kubenswrapper[4707]: I1127 16:19:51.062327 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-7n9c2" Nov 27 16:19:51 crc kubenswrapper[4707]: I1127 16:19:51.950274 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-t7wqv" event={"ID":"79a68d54-eb44-474e-9c5f-1c2283a6d410","Type":"ContainerStarted","Data":"04e82ab673f3cde8a4a330c0d929bd2a93337465221967c34bece03584562343"} Nov 27 16:19:51 crc kubenswrapper[4707]: I1127 16:19:51.953156 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-57z78" event={"ID":"83c9f54d-8957-42f2-9b0a-24b4cc72a2a7","Type":"ContainerStarted","Data":"09a79b4ffc908cce27f180f6e1f536d89eb45ef0a847673cb27885e581f466b2"} Nov 27 16:19:51 crc kubenswrapper[4707]: I1127 16:19:51.993467 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-57z78" podStartSLOduration=3.037870568 
podStartE2EDuration="5.993449421s" podCreationTimestamp="2025-11-27 16:19:46 +0000 UTC" firstStartedPulling="2025-11-27 16:19:47.885662668 +0000 UTC m=+963.517111446" lastFinishedPulling="2025-11-27 16:19:50.841241531 +0000 UTC m=+966.472690299" observedRunningTime="2025-11-27 16:19:51.989345042 +0000 UTC m=+967.620793810" watchObservedRunningTime="2025-11-27 16:19:51.993449421 +0000 UTC m=+967.624898189" Nov 27 16:19:52 crc kubenswrapper[4707]: I1127 16:19:52.966502 4707 generic.go:334] "Generic (PLEG): container finished" podID="79a68d54-eb44-474e-9c5f-1c2283a6d410" containerID="04e82ab673f3cde8a4a330c0d929bd2a93337465221967c34bece03584562343" exitCode=0 Nov 27 16:19:52 crc kubenswrapper[4707]: I1127 16:19:52.966564 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-t7wqv" event={"ID":"79a68d54-eb44-474e-9c5f-1c2283a6d410","Type":"ContainerDied","Data":"04e82ab673f3cde8a4a330c0d929bd2a93337465221967c34bece03584562343"} Nov 27 16:19:56 crc kubenswrapper[4707]: I1127 16:19:56.039544 4707 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-vvkr6" podUID="9639769b-4439-4ffc-b88b-cba953013bff" containerName="ovn-controller" probeResult="failure" output=< Nov 27 16:19:56 crc kubenswrapper[4707]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Nov 27 16:19:56 crc kubenswrapper[4707]: > Nov 27 16:19:56 crc kubenswrapper[4707]: I1127 16:19:56.071664 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-7n9c2" Nov 27 16:19:56 crc kubenswrapper[4707]: I1127 16:19:56.269293 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-vvkr6-config-psjmw"] Nov 27 16:19:56 crc kubenswrapper[4707]: I1127 16:19:56.270552 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-vvkr6-config-psjmw" Nov 27 16:19:56 crc kubenswrapper[4707]: I1127 16:19:56.272211 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Nov 27 16:19:56 crc kubenswrapper[4707]: I1127 16:19:56.280056 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-vvkr6-config-psjmw"] Nov 27 16:19:56 crc kubenswrapper[4707]: I1127 16:19:56.410589 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/c5e84399-e321-4a51-b22f-eecd3d68b744-var-run\") pod \"ovn-controller-vvkr6-config-psjmw\" (UID: \"c5e84399-e321-4a51-b22f-eecd3d68b744\") " pod="openstack/ovn-controller-vvkr6-config-psjmw" Nov 27 16:19:56 crc kubenswrapper[4707]: I1127 16:19:56.410729 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/c5e84399-e321-4a51-b22f-eecd3d68b744-additional-scripts\") pod \"ovn-controller-vvkr6-config-psjmw\" (UID: \"c5e84399-e321-4a51-b22f-eecd3d68b744\") " pod="openstack/ovn-controller-vvkr6-config-psjmw" Nov 27 16:19:56 crc kubenswrapper[4707]: I1127 16:19:56.411008 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cztbv\" (UniqueName: \"kubernetes.io/projected/c5e84399-e321-4a51-b22f-eecd3d68b744-kube-api-access-cztbv\") pod \"ovn-controller-vvkr6-config-psjmw\" (UID: \"c5e84399-e321-4a51-b22f-eecd3d68b744\") " pod="openstack/ovn-controller-vvkr6-config-psjmw" Nov 27 16:19:56 crc kubenswrapper[4707]: I1127 16:19:56.411126 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c5e84399-e321-4a51-b22f-eecd3d68b744-scripts\") pod \"ovn-controller-vvkr6-config-psjmw\" (UID: 
\"c5e84399-e321-4a51-b22f-eecd3d68b744\") " pod="openstack/ovn-controller-vvkr6-config-psjmw" Nov 27 16:19:56 crc kubenswrapper[4707]: I1127 16:19:56.411155 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/c5e84399-e321-4a51-b22f-eecd3d68b744-var-run-ovn\") pod \"ovn-controller-vvkr6-config-psjmw\" (UID: \"c5e84399-e321-4a51-b22f-eecd3d68b744\") " pod="openstack/ovn-controller-vvkr6-config-psjmw" Nov 27 16:19:56 crc kubenswrapper[4707]: I1127 16:19:56.411180 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/c5e84399-e321-4a51-b22f-eecd3d68b744-var-log-ovn\") pod \"ovn-controller-vvkr6-config-psjmw\" (UID: \"c5e84399-e321-4a51-b22f-eecd3d68b744\") " pod="openstack/ovn-controller-vvkr6-config-psjmw" Nov 27 16:19:56 crc kubenswrapper[4707]: I1127 16:19:56.512968 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/c5e84399-e321-4a51-b22f-eecd3d68b744-var-run\") pod \"ovn-controller-vvkr6-config-psjmw\" (UID: \"c5e84399-e321-4a51-b22f-eecd3d68b744\") " pod="openstack/ovn-controller-vvkr6-config-psjmw" Nov 27 16:19:56 crc kubenswrapper[4707]: I1127 16:19:56.513036 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/c5e84399-e321-4a51-b22f-eecd3d68b744-additional-scripts\") pod \"ovn-controller-vvkr6-config-psjmw\" (UID: \"c5e84399-e321-4a51-b22f-eecd3d68b744\") " pod="openstack/ovn-controller-vvkr6-config-psjmw" Nov 27 16:19:56 crc kubenswrapper[4707]: I1127 16:19:56.513108 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cztbv\" (UniqueName: \"kubernetes.io/projected/c5e84399-e321-4a51-b22f-eecd3d68b744-kube-api-access-cztbv\") pod 
\"ovn-controller-vvkr6-config-psjmw\" (UID: \"c5e84399-e321-4a51-b22f-eecd3d68b744\") " pod="openstack/ovn-controller-vvkr6-config-psjmw" Nov 27 16:19:56 crc kubenswrapper[4707]: I1127 16:19:56.513143 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c5e84399-e321-4a51-b22f-eecd3d68b744-scripts\") pod \"ovn-controller-vvkr6-config-psjmw\" (UID: \"c5e84399-e321-4a51-b22f-eecd3d68b744\") " pod="openstack/ovn-controller-vvkr6-config-psjmw" Nov 27 16:19:56 crc kubenswrapper[4707]: I1127 16:19:56.513159 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/c5e84399-e321-4a51-b22f-eecd3d68b744-var-run-ovn\") pod \"ovn-controller-vvkr6-config-psjmw\" (UID: \"c5e84399-e321-4a51-b22f-eecd3d68b744\") " pod="openstack/ovn-controller-vvkr6-config-psjmw" Nov 27 16:19:56 crc kubenswrapper[4707]: I1127 16:19:56.513175 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/c5e84399-e321-4a51-b22f-eecd3d68b744-var-log-ovn\") pod \"ovn-controller-vvkr6-config-psjmw\" (UID: \"c5e84399-e321-4a51-b22f-eecd3d68b744\") " pod="openstack/ovn-controller-vvkr6-config-psjmw" Nov 27 16:19:56 crc kubenswrapper[4707]: I1127 16:19:56.513406 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/c5e84399-e321-4a51-b22f-eecd3d68b744-var-log-ovn\") pod \"ovn-controller-vvkr6-config-psjmw\" (UID: \"c5e84399-e321-4a51-b22f-eecd3d68b744\") " pod="openstack/ovn-controller-vvkr6-config-psjmw" Nov 27 16:19:56 crc kubenswrapper[4707]: I1127 16:19:56.513554 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/c5e84399-e321-4a51-b22f-eecd3d68b744-var-run-ovn\") pod \"ovn-controller-vvkr6-config-psjmw\" (UID: 
\"c5e84399-e321-4a51-b22f-eecd3d68b744\") " pod="openstack/ovn-controller-vvkr6-config-psjmw" Nov 27 16:19:56 crc kubenswrapper[4707]: I1127 16:19:56.514167 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/c5e84399-e321-4a51-b22f-eecd3d68b744-additional-scripts\") pod \"ovn-controller-vvkr6-config-psjmw\" (UID: \"c5e84399-e321-4a51-b22f-eecd3d68b744\") " pod="openstack/ovn-controller-vvkr6-config-psjmw" Nov 27 16:19:56 crc kubenswrapper[4707]: I1127 16:19:56.514255 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/c5e84399-e321-4a51-b22f-eecd3d68b744-var-run\") pod \"ovn-controller-vvkr6-config-psjmw\" (UID: \"c5e84399-e321-4a51-b22f-eecd3d68b744\") " pod="openstack/ovn-controller-vvkr6-config-psjmw" Nov 27 16:19:56 crc kubenswrapper[4707]: I1127 16:19:56.517077 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c5e84399-e321-4a51-b22f-eecd3d68b744-scripts\") pod \"ovn-controller-vvkr6-config-psjmw\" (UID: \"c5e84399-e321-4a51-b22f-eecd3d68b744\") " pod="openstack/ovn-controller-vvkr6-config-psjmw" Nov 27 16:19:56 crc kubenswrapper[4707]: I1127 16:19:56.534500 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cztbv\" (UniqueName: \"kubernetes.io/projected/c5e84399-e321-4a51-b22f-eecd3d68b744-kube-api-access-cztbv\") pod \"ovn-controller-vvkr6-config-psjmw\" (UID: \"c5e84399-e321-4a51-b22f-eecd3d68b744\") " pod="openstack/ovn-controller-vvkr6-config-psjmw" Nov 27 16:19:56 crc kubenswrapper[4707]: I1127 16:19:56.572571 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-57z78" Nov 27 16:19:56 crc kubenswrapper[4707]: I1127 16:19:56.572786 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openshift-marketplace/redhat-marketplace-57z78" Nov 27 16:19:56 crc kubenswrapper[4707]: I1127 16:19:56.585783 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-vvkr6-config-psjmw" Nov 27 16:19:56 crc kubenswrapper[4707]: I1127 16:19:56.676248 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-57z78" Nov 27 16:19:57 crc kubenswrapper[4707]: I1127 16:19:57.049795 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-57z78" Nov 27 16:19:57 crc kubenswrapper[4707]: I1127 16:19:57.806676 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-57z78"] Nov 27 16:19:59 crc kubenswrapper[4707]: I1127 16:19:59.025139 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-57z78" podUID="83c9f54d-8957-42f2-9b0a-24b4cc72a2a7" containerName="registry-server" containerID="cri-o://09a79b4ffc908cce27f180f6e1f536d89eb45ef0a847673cb27885e581f466b2" gracePeriod=2 Nov 27 16:20:00 crc kubenswrapper[4707]: I1127 16:20:00.292930 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/4d67e130-f1e2-4fe8-9647-8725402a1cdd-etc-swift\") pod \"swift-storage-0\" (UID: \"4d67e130-f1e2-4fe8-9647-8725402a1cdd\") " pod="openstack/swift-storage-0" Nov 27 16:20:00 crc kubenswrapper[4707]: I1127 16:20:00.305545 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/4d67e130-f1e2-4fe8-9647-8725402a1cdd-etc-swift\") pod \"swift-storage-0\" (UID: \"4d67e130-f1e2-4fe8-9647-8725402a1cdd\") " pod="openstack/swift-storage-0" Nov 27 16:20:00 crc kubenswrapper[4707]: I1127 16:20:00.467723 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-storage-0" Nov 27 16:20:01 crc kubenswrapper[4707]: I1127 16:20:01.058822 4707 generic.go:334] "Generic (PLEG): container finished" podID="83c9f54d-8957-42f2-9b0a-24b4cc72a2a7" containerID="09a79b4ffc908cce27f180f6e1f536d89eb45ef0a847673cb27885e581f466b2" exitCode=0 Nov 27 16:20:01 crc kubenswrapper[4707]: I1127 16:20:01.059025 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-57z78" event={"ID":"83c9f54d-8957-42f2-9b0a-24b4cc72a2a7","Type":"ContainerDied","Data":"09a79b4ffc908cce27f180f6e1f536d89eb45ef0a847673cb27885e581f466b2"} Nov 27 16:20:01 crc kubenswrapper[4707]: I1127 16:20:01.098143 4707 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-vvkr6" podUID="9639769b-4439-4ffc-b88b-cba953013bff" containerName="ovn-controller" probeResult="failure" output=< Nov 27 16:20:01 crc kubenswrapper[4707]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Nov 27 16:20:01 crc kubenswrapper[4707]: > Nov 27 16:20:01 crc kubenswrapper[4707]: I1127 16:20:01.583676 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Nov 27 16:20:01 crc kubenswrapper[4707]: I1127 16:20:01.866200 4707 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-cell1-server-0" podUID="517a2efb-7c9f-4c93-876b-5962da604ef8" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.96:5671: connect: connection refused" Nov 27 16:20:01 crc kubenswrapper[4707]: I1127 16:20:01.966381 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-db-create-j98v9"] Nov 27 16:20:01 crc kubenswrapper[4707]: I1127 16:20:01.967395 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-db-create-j98v9" Nov 27 16:20:01 crc kubenswrapper[4707]: I1127 16:20:01.975401 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-db-create-j98v9"] Nov 27 16:20:02 crc kubenswrapper[4707]: I1127 16:20:02.074576 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-create-h4wdg"] Nov 27 16:20:02 crc kubenswrapper[4707]: I1127 16:20:02.075622 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-h4wdg" Nov 27 16:20:02 crc kubenswrapper[4707]: I1127 16:20:02.088834 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-h4wdg"] Nov 27 16:20:02 crc kubenswrapper[4707]: I1127 16:20:02.093594 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-336f-account-create-update-gtvgd"] Nov 27 16:20:02 crc kubenswrapper[4707]: I1127 16:20:02.097316 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-336f-account-create-update-gtvgd" Nov 27 16:20:02 crc kubenswrapper[4707]: I1127 16:20:02.099577 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-db-secret" Nov 27 16:20:02 crc kubenswrapper[4707]: I1127 16:20:02.110423 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-336f-account-create-update-gtvgd"] Nov 27 16:20:02 crc kubenswrapper[4707]: I1127 16:20:02.136748 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/82a48402-5e5b-4427-b4f3-d28d7eb0e61e-operator-scripts\") pod \"heat-db-create-j98v9\" (UID: \"82a48402-5e5b-4427-b4f3-d28d7eb0e61e\") " pod="openstack/heat-db-create-j98v9" Nov 27 16:20:02 crc kubenswrapper[4707]: I1127 16:20:02.136831 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dflvh\" 
(UniqueName: \"kubernetes.io/projected/82a48402-5e5b-4427-b4f3-d28d7eb0e61e-kube-api-access-dflvh\") pod \"heat-db-create-j98v9\" (UID: \"82a48402-5e5b-4427-b4f3-d28d7eb0e61e\") " pod="openstack/heat-db-create-j98v9" Nov 27 16:20:02 crc kubenswrapper[4707]: I1127 16:20:02.172212 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-create-74xmp"] Nov 27 16:20:02 crc kubenswrapper[4707]: I1127 16:20:02.173213 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-74xmp" Nov 27 16:20:02 crc kubenswrapper[4707]: I1127 16:20:02.181959 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-6f3e-account-create-update-95dd4"] Nov 27 16:20:02 crc kubenswrapper[4707]: I1127 16:20:02.183483 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-6f3e-account-create-update-95dd4" Nov 27 16:20:02 crc kubenswrapper[4707]: I1127 16:20:02.186813 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-db-secret" Nov 27 16:20:02 crc kubenswrapper[4707]: I1127 16:20:02.188323 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-74xmp"] Nov 27 16:20:02 crc kubenswrapper[4707]: I1127 16:20:02.197154 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-6f3e-account-create-update-95dd4"] Nov 27 16:20:02 crc kubenswrapper[4707]: I1127 16:20:02.238772 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ed7891f1-2643-4a2a-938b-80c0f25dac7c-operator-scripts\") pod \"heat-336f-account-create-update-gtvgd\" (UID: \"ed7891f1-2643-4a2a-938b-80c0f25dac7c\") " pod="openstack/heat-336f-account-create-update-gtvgd" Nov 27 16:20:02 crc kubenswrapper[4707]: I1127 16:20:02.239467 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kube-api-access-22nv5\" (UniqueName: \"kubernetes.io/projected/5bef91e6-e5d4-4bef-bab3-f3c3239e8cbb-kube-api-access-22nv5\") pod \"cinder-db-create-h4wdg\" (UID: \"5bef91e6-e5d4-4bef-bab3-f3c3239e8cbb\") " pod="openstack/cinder-db-create-h4wdg" Nov 27 16:20:02 crc kubenswrapper[4707]: I1127 16:20:02.239615 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dflvh\" (UniqueName: \"kubernetes.io/projected/82a48402-5e5b-4427-b4f3-d28d7eb0e61e-kube-api-access-dflvh\") pod \"heat-db-create-j98v9\" (UID: \"82a48402-5e5b-4427-b4f3-d28d7eb0e61e\") " pod="openstack/heat-db-create-j98v9" Nov 27 16:20:02 crc kubenswrapper[4707]: I1127 16:20:02.239808 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5bef91e6-e5d4-4bef-bab3-f3c3239e8cbb-operator-scripts\") pod \"cinder-db-create-h4wdg\" (UID: \"5bef91e6-e5d4-4bef-bab3-f3c3239e8cbb\") " pod="openstack/cinder-db-create-h4wdg" Nov 27 16:20:02 crc kubenswrapper[4707]: I1127 16:20:02.240181 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vkc7s\" (UniqueName: \"kubernetes.io/projected/ed7891f1-2643-4a2a-938b-80c0f25dac7c-kube-api-access-vkc7s\") pod \"heat-336f-account-create-update-gtvgd\" (UID: \"ed7891f1-2643-4a2a-938b-80c0f25dac7c\") " pod="openstack/heat-336f-account-create-update-gtvgd" Nov 27 16:20:02 crc kubenswrapper[4707]: I1127 16:20:02.240719 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/82a48402-5e5b-4427-b4f3-d28d7eb0e61e-operator-scripts\") pod \"heat-db-create-j98v9\" (UID: \"82a48402-5e5b-4427-b4f3-d28d7eb0e61e\") " pod="openstack/heat-db-create-j98v9" Nov 27 16:20:02 crc kubenswrapper[4707]: I1127 16:20:02.241419 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/82a48402-5e5b-4427-b4f3-d28d7eb0e61e-operator-scripts\") pod \"heat-db-create-j98v9\" (UID: \"82a48402-5e5b-4427-b4f3-d28d7eb0e61e\") " pod="openstack/heat-db-create-j98v9" Nov 27 16:20:02 crc kubenswrapper[4707]: I1127 16:20:02.243069 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-sync-ktrrp"] Nov 27 16:20:02 crc kubenswrapper[4707]: I1127 16:20:02.244026 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-ktrrp" Nov 27 16:20:02 crc kubenswrapper[4707]: I1127 16:20:02.245889 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Nov 27 16:20:02 crc kubenswrapper[4707]: I1127 16:20:02.245966 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Nov 27 16:20:02 crc kubenswrapper[4707]: I1127 16:20:02.246141 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-drvd4" Nov 27 16:20:02 crc kubenswrapper[4707]: I1127 16:20:02.251015 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Nov 27 16:20:02 crc kubenswrapper[4707]: I1127 16:20:02.257195 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-ktrrp"] Nov 27 16:20:02 crc kubenswrapper[4707]: I1127 16:20:02.271207 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dflvh\" (UniqueName: \"kubernetes.io/projected/82a48402-5e5b-4427-b4f3-d28d7eb0e61e-kube-api-access-dflvh\") pod \"heat-db-create-j98v9\" (UID: \"82a48402-5e5b-4427-b4f3-d28d7eb0e61e\") " pod="openstack/heat-db-create-j98v9" Nov 27 16:20:02 crc kubenswrapper[4707]: I1127 16:20:02.289658 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-db-create-j98v9" Nov 27 16:20:02 crc kubenswrapper[4707]: I1127 16:20:02.301862 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-b0e5-account-create-update-k86pp"] Nov 27 16:20:02 crc kubenswrapper[4707]: I1127 16:20:02.303202 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-b0e5-account-create-update-k86pp" Nov 27 16:20:02 crc kubenswrapper[4707]: I1127 16:20:02.306676 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-db-secret" Nov 27 16:20:02 crc kubenswrapper[4707]: I1127 16:20:02.317088 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-b0e5-account-create-update-k86pp"] Nov 27 16:20:02 crc kubenswrapper[4707]: I1127 16:20:02.341974 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ed7891f1-2643-4a2a-938b-80c0f25dac7c-operator-scripts\") pod \"heat-336f-account-create-update-gtvgd\" (UID: \"ed7891f1-2643-4a2a-938b-80c0f25dac7c\") " pod="openstack/heat-336f-account-create-update-gtvgd" Nov 27 16:20:02 crc kubenswrapper[4707]: I1127 16:20:02.342024 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2d232067-8a9f-4c42-a934-60560ad7d65c-operator-scripts\") pod \"barbican-db-create-74xmp\" (UID: \"2d232067-8a9f-4c42-a934-60560ad7d65c\") " pod="openstack/barbican-db-create-74xmp" Nov 27 16:20:02 crc kubenswrapper[4707]: I1127 16:20:02.342057 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-22nv5\" (UniqueName: \"kubernetes.io/projected/5bef91e6-e5d4-4bef-bab3-f3c3239e8cbb-kube-api-access-22nv5\") pod \"cinder-db-create-h4wdg\" (UID: \"5bef91e6-e5d4-4bef-bab3-f3c3239e8cbb\") " pod="openstack/cinder-db-create-h4wdg" Nov 27 16:20:02 crc 
kubenswrapper[4707]: I1127 16:20:02.342355 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/24172997-039c-47c2-b908-087dec03273f-operator-scripts\") pod \"barbican-6f3e-account-create-update-95dd4\" (UID: \"24172997-039c-47c2-b908-087dec03273f\") " pod="openstack/barbican-6f3e-account-create-update-95dd4" Nov 27 16:20:02 crc kubenswrapper[4707]: I1127 16:20:02.342438 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-79rhx\" (UniqueName: \"kubernetes.io/projected/24172997-039c-47c2-b908-087dec03273f-kube-api-access-79rhx\") pod \"barbican-6f3e-account-create-update-95dd4\" (UID: \"24172997-039c-47c2-b908-087dec03273f\") " pod="openstack/barbican-6f3e-account-create-update-95dd4" Nov 27 16:20:02 crc kubenswrapper[4707]: I1127 16:20:02.342487 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c2xnk\" (UniqueName: \"kubernetes.io/projected/2d232067-8a9f-4c42-a934-60560ad7d65c-kube-api-access-c2xnk\") pod \"barbican-db-create-74xmp\" (UID: \"2d232067-8a9f-4c42-a934-60560ad7d65c\") " pod="openstack/barbican-db-create-74xmp" Nov 27 16:20:02 crc kubenswrapper[4707]: I1127 16:20:02.342545 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5bef91e6-e5d4-4bef-bab3-f3c3239e8cbb-operator-scripts\") pod \"cinder-db-create-h4wdg\" (UID: \"5bef91e6-e5d4-4bef-bab3-f3c3239e8cbb\") " pod="openstack/cinder-db-create-h4wdg" Nov 27 16:20:02 crc kubenswrapper[4707]: I1127 16:20:02.342585 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vkc7s\" (UniqueName: \"kubernetes.io/projected/ed7891f1-2643-4a2a-938b-80c0f25dac7c-kube-api-access-vkc7s\") pod \"heat-336f-account-create-update-gtvgd\" (UID: 
\"ed7891f1-2643-4a2a-938b-80c0f25dac7c\") " pod="openstack/heat-336f-account-create-update-gtvgd" Nov 27 16:20:02 crc kubenswrapper[4707]: I1127 16:20:02.342855 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ed7891f1-2643-4a2a-938b-80c0f25dac7c-operator-scripts\") pod \"heat-336f-account-create-update-gtvgd\" (UID: \"ed7891f1-2643-4a2a-938b-80c0f25dac7c\") " pod="openstack/heat-336f-account-create-update-gtvgd" Nov 27 16:20:02 crc kubenswrapper[4707]: I1127 16:20:02.343207 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5bef91e6-e5d4-4bef-bab3-f3c3239e8cbb-operator-scripts\") pod \"cinder-db-create-h4wdg\" (UID: \"5bef91e6-e5d4-4bef-bab3-f3c3239e8cbb\") " pod="openstack/cinder-db-create-h4wdg" Nov 27 16:20:02 crc kubenswrapper[4707]: I1127 16:20:02.358706 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vkc7s\" (UniqueName: \"kubernetes.io/projected/ed7891f1-2643-4a2a-938b-80c0f25dac7c-kube-api-access-vkc7s\") pod \"heat-336f-account-create-update-gtvgd\" (UID: \"ed7891f1-2643-4a2a-938b-80c0f25dac7c\") " pod="openstack/heat-336f-account-create-update-gtvgd" Nov 27 16:20:02 crc kubenswrapper[4707]: I1127 16:20:02.362047 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-22nv5\" (UniqueName: \"kubernetes.io/projected/5bef91e6-e5d4-4bef-bab3-f3c3239e8cbb-kube-api-access-22nv5\") pod \"cinder-db-create-h4wdg\" (UID: \"5bef91e6-e5d4-4bef-bab3-f3c3239e8cbb\") " pod="openstack/cinder-db-create-h4wdg" Nov 27 16:20:02 crc kubenswrapper[4707]: I1127 16:20:02.391070 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-h4wdg" Nov 27 16:20:02 crc kubenswrapper[4707]: I1127 16:20:02.413034 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-336f-account-create-update-gtvgd" Nov 27 16:20:02 crc kubenswrapper[4707]: I1127 16:20:02.444847 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a6c5bb46-81d2-4478-8a44-42c46ddaaffa-combined-ca-bundle\") pod \"keystone-db-sync-ktrrp\" (UID: \"a6c5bb46-81d2-4478-8a44-42c46ddaaffa\") " pod="openstack/keystone-db-sync-ktrrp" Nov 27 16:20:02 crc kubenswrapper[4707]: I1127 16:20:02.444937 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a6c5bb46-81d2-4478-8a44-42c46ddaaffa-config-data\") pod \"keystone-db-sync-ktrrp\" (UID: \"a6c5bb46-81d2-4478-8a44-42c46ddaaffa\") " pod="openstack/keystone-db-sync-ktrrp" Nov 27 16:20:02 crc kubenswrapper[4707]: I1127 16:20:02.444981 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a7fcab9a-5390-407b-afe4-753ec3be0120-operator-scripts\") pod \"cinder-b0e5-account-create-update-k86pp\" (UID: \"a7fcab9a-5390-407b-afe4-753ec3be0120\") " pod="openstack/cinder-b0e5-account-create-update-k86pp" Nov 27 16:20:02 crc kubenswrapper[4707]: I1127 16:20:02.445035 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kzjhb\" (UniqueName: \"kubernetes.io/projected/a6c5bb46-81d2-4478-8a44-42c46ddaaffa-kube-api-access-kzjhb\") pod \"keystone-db-sync-ktrrp\" (UID: \"a6c5bb46-81d2-4478-8a44-42c46ddaaffa\") " pod="openstack/keystone-db-sync-ktrrp" Nov 27 16:20:02 crc kubenswrapper[4707]: I1127 16:20:02.445103 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2d232067-8a9f-4c42-a934-60560ad7d65c-operator-scripts\") pod 
\"barbican-db-create-74xmp\" (UID: \"2d232067-8a9f-4c42-a934-60560ad7d65c\") " pod="openstack/barbican-db-create-74xmp" Nov 27 16:20:02 crc kubenswrapper[4707]: I1127 16:20:02.445221 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/24172997-039c-47c2-b908-087dec03273f-operator-scripts\") pod \"barbican-6f3e-account-create-update-95dd4\" (UID: \"24172997-039c-47c2-b908-087dec03273f\") " pod="openstack/barbican-6f3e-account-create-update-95dd4" Nov 27 16:20:02 crc kubenswrapper[4707]: I1127 16:20:02.445302 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hpkhg\" (UniqueName: \"kubernetes.io/projected/a7fcab9a-5390-407b-afe4-753ec3be0120-kube-api-access-hpkhg\") pod \"cinder-b0e5-account-create-update-k86pp\" (UID: \"a7fcab9a-5390-407b-afe4-753ec3be0120\") " pod="openstack/cinder-b0e5-account-create-update-k86pp" Nov 27 16:20:02 crc kubenswrapper[4707]: I1127 16:20:02.445487 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-79rhx\" (UniqueName: \"kubernetes.io/projected/24172997-039c-47c2-b908-087dec03273f-kube-api-access-79rhx\") pod \"barbican-6f3e-account-create-update-95dd4\" (UID: \"24172997-039c-47c2-b908-087dec03273f\") " pod="openstack/barbican-6f3e-account-create-update-95dd4" Nov 27 16:20:02 crc kubenswrapper[4707]: I1127 16:20:02.445597 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c2xnk\" (UniqueName: \"kubernetes.io/projected/2d232067-8a9f-4c42-a934-60560ad7d65c-kube-api-access-c2xnk\") pod \"barbican-db-create-74xmp\" (UID: \"2d232067-8a9f-4c42-a934-60560ad7d65c\") " pod="openstack/barbican-db-create-74xmp" Nov 27 16:20:02 crc kubenswrapper[4707]: I1127 16:20:02.446500 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/2d232067-8a9f-4c42-a934-60560ad7d65c-operator-scripts\") pod \"barbican-db-create-74xmp\" (UID: \"2d232067-8a9f-4c42-a934-60560ad7d65c\") " pod="openstack/barbican-db-create-74xmp" Nov 27 16:20:02 crc kubenswrapper[4707]: I1127 16:20:02.446873 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/24172997-039c-47c2-b908-087dec03273f-operator-scripts\") pod \"barbican-6f3e-account-create-update-95dd4\" (UID: \"24172997-039c-47c2-b908-087dec03273f\") " pod="openstack/barbican-6f3e-account-create-update-95dd4" Nov 27 16:20:02 crc kubenswrapper[4707]: I1127 16:20:02.465517 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-79rhx\" (UniqueName: \"kubernetes.io/projected/24172997-039c-47c2-b908-087dec03273f-kube-api-access-79rhx\") pod \"barbican-6f3e-account-create-update-95dd4\" (UID: \"24172997-039c-47c2-b908-087dec03273f\") " pod="openstack/barbican-6f3e-account-create-update-95dd4" Nov 27 16:20:02 crc kubenswrapper[4707]: I1127 16:20:02.465730 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c2xnk\" (UniqueName: \"kubernetes.io/projected/2d232067-8a9f-4c42-a934-60560ad7d65c-kube-api-access-c2xnk\") pod \"barbican-db-create-74xmp\" (UID: \"2d232067-8a9f-4c42-a934-60560ad7d65c\") " pod="openstack/barbican-db-create-74xmp" Nov 27 16:20:02 crc kubenswrapper[4707]: I1127 16:20:02.486503 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-74xmp" Nov 27 16:20:02 crc kubenswrapper[4707]: I1127 16:20:02.499050 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-6f3e-account-create-update-95dd4" Nov 27 16:20:02 crc kubenswrapper[4707]: I1127 16:20:02.517497 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-a71a-account-create-update-tqs56"] Nov 27 16:20:02 crc kubenswrapper[4707]: I1127 16:20:02.518692 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-a71a-account-create-update-tqs56" Nov 27 16:20:02 crc kubenswrapper[4707]: I1127 16:20:02.522770 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-db-secret" Nov 27 16:20:02 crc kubenswrapper[4707]: I1127 16:20:02.526749 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-a71a-account-create-update-tqs56"] Nov 27 16:20:02 crc kubenswrapper[4707]: I1127 16:20:02.547595 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hpkhg\" (UniqueName: \"kubernetes.io/projected/a7fcab9a-5390-407b-afe4-753ec3be0120-kube-api-access-hpkhg\") pod \"cinder-b0e5-account-create-update-k86pp\" (UID: \"a7fcab9a-5390-407b-afe4-753ec3be0120\") " pod="openstack/cinder-b0e5-account-create-update-k86pp" Nov 27 16:20:02 crc kubenswrapper[4707]: I1127 16:20:02.547665 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6jqw7\" (UniqueName: \"kubernetes.io/projected/f0b458d9-3955-421a-bfa6-30fca174692a-kube-api-access-6jqw7\") pod \"neutron-a71a-account-create-update-tqs56\" (UID: \"f0b458d9-3955-421a-bfa6-30fca174692a\") " pod="openstack/neutron-a71a-account-create-update-tqs56" Nov 27 16:20:02 crc kubenswrapper[4707]: I1127 16:20:02.547757 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a6c5bb46-81d2-4478-8a44-42c46ddaaffa-combined-ca-bundle\") pod \"keystone-db-sync-ktrrp\" (UID: 
\"a6c5bb46-81d2-4478-8a44-42c46ddaaffa\") " pod="openstack/keystone-db-sync-ktrrp" Nov 27 16:20:02 crc kubenswrapper[4707]: I1127 16:20:02.547817 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a6c5bb46-81d2-4478-8a44-42c46ddaaffa-config-data\") pod \"keystone-db-sync-ktrrp\" (UID: \"a6c5bb46-81d2-4478-8a44-42c46ddaaffa\") " pod="openstack/keystone-db-sync-ktrrp" Nov 27 16:20:02 crc kubenswrapper[4707]: I1127 16:20:02.547845 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a7fcab9a-5390-407b-afe4-753ec3be0120-operator-scripts\") pod \"cinder-b0e5-account-create-update-k86pp\" (UID: \"a7fcab9a-5390-407b-afe4-753ec3be0120\") " pod="openstack/cinder-b0e5-account-create-update-k86pp" Nov 27 16:20:02 crc kubenswrapper[4707]: I1127 16:20:02.547884 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kzjhb\" (UniqueName: \"kubernetes.io/projected/a6c5bb46-81d2-4478-8a44-42c46ddaaffa-kube-api-access-kzjhb\") pod \"keystone-db-sync-ktrrp\" (UID: \"a6c5bb46-81d2-4478-8a44-42c46ddaaffa\") " pod="openstack/keystone-db-sync-ktrrp" Nov 27 16:20:02 crc kubenswrapper[4707]: I1127 16:20:02.548783 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f0b458d9-3955-421a-bfa6-30fca174692a-operator-scripts\") pod \"neutron-a71a-account-create-update-tqs56\" (UID: \"f0b458d9-3955-421a-bfa6-30fca174692a\") " pod="openstack/neutron-a71a-account-create-update-tqs56" Nov 27 16:20:02 crc kubenswrapper[4707]: I1127 16:20:02.550327 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a7fcab9a-5390-407b-afe4-753ec3be0120-operator-scripts\") pod \"cinder-b0e5-account-create-update-k86pp\" (UID: 
\"a7fcab9a-5390-407b-afe4-753ec3be0120\") " pod="openstack/cinder-b0e5-account-create-update-k86pp" Nov 27 16:20:02 crc kubenswrapper[4707]: I1127 16:20:02.551281 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a6c5bb46-81d2-4478-8a44-42c46ddaaffa-combined-ca-bundle\") pod \"keystone-db-sync-ktrrp\" (UID: \"a6c5bb46-81d2-4478-8a44-42c46ddaaffa\") " pod="openstack/keystone-db-sync-ktrrp" Nov 27 16:20:02 crc kubenswrapper[4707]: I1127 16:20:02.552865 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a6c5bb46-81d2-4478-8a44-42c46ddaaffa-config-data\") pod \"keystone-db-sync-ktrrp\" (UID: \"a6c5bb46-81d2-4478-8a44-42c46ddaaffa\") " pod="openstack/keystone-db-sync-ktrrp" Nov 27 16:20:02 crc kubenswrapper[4707]: I1127 16:20:02.575768 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kzjhb\" (UniqueName: \"kubernetes.io/projected/a6c5bb46-81d2-4478-8a44-42c46ddaaffa-kube-api-access-kzjhb\") pod \"keystone-db-sync-ktrrp\" (UID: \"a6c5bb46-81d2-4478-8a44-42c46ddaaffa\") " pod="openstack/keystone-db-sync-ktrrp" Nov 27 16:20:02 crc kubenswrapper[4707]: I1127 16:20:02.579150 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hpkhg\" (UniqueName: \"kubernetes.io/projected/a7fcab9a-5390-407b-afe4-753ec3be0120-kube-api-access-hpkhg\") pod \"cinder-b0e5-account-create-update-k86pp\" (UID: \"a7fcab9a-5390-407b-afe4-753ec3be0120\") " pod="openstack/cinder-b0e5-account-create-update-k86pp" Nov 27 16:20:02 crc kubenswrapper[4707]: I1127 16:20:02.590410 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-create-7757b"] Nov 27 16:20:02 crc kubenswrapper[4707]: I1127 16:20:02.591394 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-7757b" Nov 27 16:20:02 crc kubenswrapper[4707]: I1127 16:20:02.619519 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-b0e5-account-create-update-k86pp" Nov 27 16:20:02 crc kubenswrapper[4707]: I1127 16:20:02.628876 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-7757b"] Nov 27 16:20:02 crc kubenswrapper[4707]: I1127 16:20:02.650183 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ddwb5\" (UniqueName: \"kubernetes.io/projected/92423d7b-7233-418c-b1f2-5516a6c7c2a3-kube-api-access-ddwb5\") pod \"neutron-db-create-7757b\" (UID: \"92423d7b-7233-418c-b1f2-5516a6c7c2a3\") " pod="openstack/neutron-db-create-7757b" Nov 27 16:20:02 crc kubenswrapper[4707]: I1127 16:20:02.650263 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/92423d7b-7233-418c-b1f2-5516a6c7c2a3-operator-scripts\") pod \"neutron-db-create-7757b\" (UID: \"92423d7b-7233-418c-b1f2-5516a6c7c2a3\") " pod="openstack/neutron-db-create-7757b" Nov 27 16:20:02 crc kubenswrapper[4707]: I1127 16:20:02.650309 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f0b458d9-3955-421a-bfa6-30fca174692a-operator-scripts\") pod \"neutron-a71a-account-create-update-tqs56\" (UID: \"f0b458d9-3955-421a-bfa6-30fca174692a\") " pod="openstack/neutron-a71a-account-create-update-tqs56" Nov 27 16:20:02 crc kubenswrapper[4707]: I1127 16:20:02.650362 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6jqw7\" (UniqueName: \"kubernetes.io/projected/f0b458d9-3955-421a-bfa6-30fca174692a-kube-api-access-6jqw7\") pod \"neutron-a71a-account-create-update-tqs56\" (UID: 
\"f0b458d9-3955-421a-bfa6-30fca174692a\") " pod="openstack/neutron-a71a-account-create-update-tqs56" Nov 27 16:20:02 crc kubenswrapper[4707]: I1127 16:20:02.651490 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f0b458d9-3955-421a-bfa6-30fca174692a-operator-scripts\") pod \"neutron-a71a-account-create-update-tqs56\" (UID: \"f0b458d9-3955-421a-bfa6-30fca174692a\") " pod="openstack/neutron-a71a-account-create-update-tqs56" Nov 27 16:20:02 crc kubenswrapper[4707]: I1127 16:20:02.688842 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6jqw7\" (UniqueName: \"kubernetes.io/projected/f0b458d9-3955-421a-bfa6-30fca174692a-kube-api-access-6jqw7\") pod \"neutron-a71a-account-create-update-tqs56\" (UID: \"f0b458d9-3955-421a-bfa6-30fca174692a\") " pod="openstack/neutron-a71a-account-create-update-tqs56" Nov 27 16:20:02 crc kubenswrapper[4707]: I1127 16:20:02.751766 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/92423d7b-7233-418c-b1f2-5516a6c7c2a3-operator-scripts\") pod \"neutron-db-create-7757b\" (UID: \"92423d7b-7233-418c-b1f2-5516a6c7c2a3\") " pod="openstack/neutron-db-create-7757b" Nov 27 16:20:02 crc kubenswrapper[4707]: I1127 16:20:02.751876 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ddwb5\" (UniqueName: \"kubernetes.io/projected/92423d7b-7233-418c-b1f2-5516a6c7c2a3-kube-api-access-ddwb5\") pod \"neutron-db-create-7757b\" (UID: \"92423d7b-7233-418c-b1f2-5516a6c7c2a3\") " pod="openstack/neutron-db-create-7757b" Nov 27 16:20:02 crc kubenswrapper[4707]: I1127 16:20:02.752862 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/92423d7b-7233-418c-b1f2-5516a6c7c2a3-operator-scripts\") pod \"neutron-db-create-7757b\" (UID: 
\"92423d7b-7233-418c-b1f2-5516a6c7c2a3\") " pod="openstack/neutron-db-create-7757b" Nov 27 16:20:02 crc kubenswrapper[4707]: I1127 16:20:02.766478 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ddwb5\" (UniqueName: \"kubernetes.io/projected/92423d7b-7233-418c-b1f2-5516a6c7c2a3-kube-api-access-ddwb5\") pod \"neutron-db-create-7757b\" (UID: \"92423d7b-7233-418c-b1f2-5516a6c7c2a3\") " pod="openstack/neutron-db-create-7757b" Nov 27 16:20:02 crc kubenswrapper[4707]: I1127 16:20:02.837773 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-a71a-account-create-update-tqs56" Nov 27 16:20:02 crc kubenswrapper[4707]: I1127 16:20:02.857790 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-ktrrp" Nov 27 16:20:02 crc kubenswrapper[4707]: I1127 16:20:02.910239 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-7757b" Nov 27 16:20:03 crc kubenswrapper[4707]: I1127 16:20:03.623775 4707 patch_prober.go:28] interesting pod/machine-config-daemon-c995m container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 27 16:20:03 crc kubenswrapper[4707]: I1127 16:20:03.624649 4707 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-c995m" podUID="a83beb0d-8dd1-434a-ace2-933f98e3956f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 27 16:20:03 crc kubenswrapper[4707]: I1127 16:20:03.624782 4707 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-c995m" Nov 27 16:20:03 crc 
kubenswrapper[4707]: I1127 16:20:03.625495 4707 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"8aa0ec55553e2030c537e5b750cef10ee68d7cb3cbe0ae6f95e1e594b84cdc37"} pod="openshift-machine-config-operator/machine-config-daemon-c995m" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Nov 27 16:20:03 crc kubenswrapper[4707]: I1127 16:20:03.625624 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-c995m" podUID="a83beb0d-8dd1-434a-ace2-933f98e3956f" containerName="machine-config-daemon" containerID="cri-o://8aa0ec55553e2030c537e5b750cef10ee68d7cb3cbe0ae6f95e1e594b84cdc37" gracePeriod=600 Nov 27 16:20:04 crc kubenswrapper[4707]: I1127 16:20:04.970172 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-57z78" Nov 27 16:20:05 crc kubenswrapper[4707]: I1127 16:20:04.992416 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/83c9f54d-8957-42f2-9b0a-24b4cc72a2a7-utilities\") pod \"83c9f54d-8957-42f2-9b0a-24b4cc72a2a7\" (UID: \"83c9f54d-8957-42f2-9b0a-24b4cc72a2a7\") " Nov 27 16:20:05 crc kubenswrapper[4707]: I1127 16:20:04.992539 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4qdm4\" (UniqueName: \"kubernetes.io/projected/83c9f54d-8957-42f2-9b0a-24b4cc72a2a7-kube-api-access-4qdm4\") pod \"83c9f54d-8957-42f2-9b0a-24b4cc72a2a7\" (UID: \"83c9f54d-8957-42f2-9b0a-24b4cc72a2a7\") " Nov 27 16:20:05 crc kubenswrapper[4707]: I1127 16:20:04.994171 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/83c9f54d-8957-42f2-9b0a-24b4cc72a2a7-catalog-content\") pod 
\"83c9f54d-8957-42f2-9b0a-24b4cc72a2a7\" (UID: \"83c9f54d-8957-42f2-9b0a-24b4cc72a2a7\") " Nov 27 16:20:05 crc kubenswrapper[4707]: I1127 16:20:04.994675 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/83c9f54d-8957-42f2-9b0a-24b4cc72a2a7-utilities" (OuterVolumeSpecName: "utilities") pod "83c9f54d-8957-42f2-9b0a-24b4cc72a2a7" (UID: "83c9f54d-8957-42f2-9b0a-24b4cc72a2a7"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 27 16:20:05 crc kubenswrapper[4707]: I1127 16:20:05.001789 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/83c9f54d-8957-42f2-9b0a-24b4cc72a2a7-kube-api-access-4qdm4" (OuterVolumeSpecName: "kube-api-access-4qdm4") pod "83c9f54d-8957-42f2-9b0a-24b4cc72a2a7" (UID: "83c9f54d-8957-42f2-9b0a-24b4cc72a2a7"). InnerVolumeSpecName "kube-api-access-4qdm4". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 16:20:05 crc kubenswrapper[4707]: I1127 16:20:05.009995 4707 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/83c9f54d-8957-42f2-9b0a-24b4cc72a2a7-utilities\") on node \"crc\" DevicePath \"\"" Nov 27 16:20:05 crc kubenswrapper[4707]: I1127 16:20:05.010563 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4qdm4\" (UniqueName: \"kubernetes.io/projected/83c9f54d-8957-42f2-9b0a-24b4cc72a2a7-kube-api-access-4qdm4\") on node \"crc\" DevicePath \"\"" Nov 27 16:20:05 crc kubenswrapper[4707]: I1127 16:20:05.011050 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/83c9f54d-8957-42f2-9b0a-24b4cc72a2a7-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "83c9f54d-8957-42f2-9b0a-24b4cc72a2a7" (UID: "83c9f54d-8957-42f2-9b0a-24b4cc72a2a7"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 27 16:20:05 crc kubenswrapper[4707]: I1127 16:20:05.112514 4707 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/83c9f54d-8957-42f2-9b0a-24b4cc72a2a7-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 27 16:20:05 crc kubenswrapper[4707]: I1127 16:20:05.126164 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-57z78" event={"ID":"83c9f54d-8957-42f2-9b0a-24b4cc72a2a7","Type":"ContainerDied","Data":"43991da9f7d31b02b9c057e3ef1d769990c6b3dfe3b2f8278bd03293d68fc6ea"} Nov 27 16:20:05 crc kubenswrapper[4707]: I1127 16:20:05.126214 4707 scope.go:117] "RemoveContainer" containerID="09a79b4ffc908cce27f180f6e1f536d89eb45ef0a847673cb27885e581f466b2" Nov 27 16:20:05 crc kubenswrapper[4707]: I1127 16:20:05.126343 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-57z78" Nov 27 16:20:05 crc kubenswrapper[4707]: I1127 16:20:05.159137 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-vvkr6-config-psjmw"] Nov 27 16:20:05 crc kubenswrapper[4707]: I1127 16:20:05.159497 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-c995m" event={"ID":"a83beb0d-8dd1-434a-ace2-933f98e3956f","Type":"ContainerDied","Data":"8aa0ec55553e2030c537e5b750cef10ee68d7cb3cbe0ae6f95e1e594b84cdc37"} Nov 27 16:20:05 crc kubenswrapper[4707]: I1127 16:20:05.159383 4707 generic.go:334] "Generic (PLEG): container finished" podID="a83beb0d-8dd1-434a-ace2-933f98e3956f" containerID="8aa0ec55553e2030c537e5b750cef10ee68d7cb3cbe0ae6f95e1e594b84cdc37" exitCode=0 Nov 27 16:20:05 crc kubenswrapper[4707]: I1127 16:20:05.225478 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-57z78"] Nov 27 16:20:05 crc kubenswrapper[4707]: 
I1127 16:20:05.225518 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-57z78"] Nov 27 16:20:05 crc kubenswrapper[4707]: W1127 16:20:05.271112 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc5e84399_e321_4a51_b22f_eecd3d68b744.slice/crio-694d08ef09f6f96203b6a1aebcbc75fc83d666e428a5ccf43c854ed56dab7ce3 WatchSource:0}: Error finding container 694d08ef09f6f96203b6a1aebcbc75fc83d666e428a5ccf43c854ed56dab7ce3: Status 404 returned error can't find the container with id 694d08ef09f6f96203b6a1aebcbc75fc83d666e428a5ccf43c854ed56dab7ce3 Nov 27 16:20:05 crc kubenswrapper[4707]: I1127 16:20:05.381566 4707 scope.go:117] "RemoveContainer" containerID="e079369df42d8158284ca1209b91d297a62f0ae1b6896e9351cb1a521d756f69" Nov 27 16:20:05 crc kubenswrapper[4707]: I1127 16:20:05.506316 4707 scope.go:117] "RemoveContainer" containerID="29c2ffd9a15e55284540a9d1c0e190a93ba2cece0d7c499b437ae711000839bc" Nov 27 16:20:05 crc kubenswrapper[4707]: I1127 16:20:05.575651 4707 scope.go:117] "RemoveContainer" containerID="596e4dd118b814e12fbaccbb4655af72f02c2baaf706c8463cf822841fdaa729" Nov 27 16:20:05 crc kubenswrapper[4707]: I1127 16:20:05.690097 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Nov 27 16:20:05 crc kubenswrapper[4707]: W1127 16:20:05.705070 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4d67e130_f1e2_4fe8_9647_8725402a1cdd.slice/crio-a23e4d0af55aa4b31bf53a1d6695ceb6dc800afb05db9e8d1ef89339bf4d5043 WatchSource:0}: Error finding container a23e4d0af55aa4b31bf53a1d6695ceb6dc800afb05db9e8d1ef89339bf4d5043: Status 404 returned error can't find the container with id a23e4d0af55aa4b31bf53a1d6695ceb6dc800afb05db9e8d1ef89339bf4d5043 Nov 27 16:20:06 crc kubenswrapper[4707]: I1127 16:20:06.038851 4707 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-vvkr6" Nov 27 16:20:06 crc kubenswrapper[4707]: I1127 16:20:06.118140 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-74xmp"] Nov 27 16:20:06 crc kubenswrapper[4707]: W1127 16:20:06.120070 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda7fcab9a_5390_407b_afe4_753ec3be0120.slice/crio-91fd193d46db027a3df5b4cb3ba49a8c8ec59b367881e09e1f7824ba4edac393 WatchSource:0}: Error finding container 91fd193d46db027a3df5b4cb3ba49a8c8ec59b367881e09e1f7824ba4edac393: Status 404 returned error can't find the container with id 91fd193d46db027a3df5b4cb3ba49a8c8ec59b367881e09e1f7824ba4edac393 Nov 27 16:20:06 crc kubenswrapper[4707]: W1127 16:20:06.124231 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod92423d7b_7233_418c_b1f2_5516a6c7c2a3.slice/crio-e6dfeda7cc0a3bf057cc0cea2954be40af624c15ec2d2bb37805252b8249ff35 WatchSource:0}: Error finding container e6dfeda7cc0a3bf057cc0cea2954be40af624c15ec2d2bb37805252b8249ff35: Status 404 returned error can't find the container with id e6dfeda7cc0a3bf057cc0cea2954be40af624c15ec2d2bb37805252b8249ff35 Nov 27 16:20:06 crc kubenswrapper[4707]: I1127 16:20:06.142573 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-b0e5-account-create-update-k86pp"] Nov 27 16:20:06 crc kubenswrapper[4707]: I1127 16:20:06.151711 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-7757b"] Nov 27 16:20:06 crc kubenswrapper[4707]: I1127 16:20:06.158810 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-6f3e-account-create-update-95dd4"] Nov 27 16:20:06 crc kubenswrapper[4707]: I1127 16:20:06.172540 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/heat-336f-account-create-update-gtvgd"] Nov 27 16:20:06 crc kubenswrapper[4707]: I1127 16:20:06.189959 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-ktrrp"] Nov 27 16:20:06 crc kubenswrapper[4707]: I1127 16:20:06.191966 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-74xmp" event={"ID":"2d232067-8a9f-4c42-a934-60560ad7d65c","Type":"ContainerStarted","Data":"1aa0879c54958d0ab61362538d09d87508511feb46a406d62f24ee927e148a05"} Nov 27 16:20:06 crc kubenswrapper[4707]: I1127 16:20:06.196522 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-b0e5-account-create-update-k86pp" event={"ID":"a7fcab9a-5390-407b-afe4-753ec3be0120","Type":"ContainerStarted","Data":"91fd193d46db027a3df5b4cb3ba49a8c8ec59b367881e09e1f7824ba4edac393"} Nov 27 16:20:06 crc kubenswrapper[4707]: I1127 16:20:06.225218 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-db-create-j98v9"] Nov 27 16:20:06 crc kubenswrapper[4707]: I1127 16:20:06.230145 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-7757b" event={"ID":"92423d7b-7233-418c-b1f2-5516a6c7c2a3","Type":"ContainerStarted","Data":"e6dfeda7cc0a3bf057cc0cea2954be40af624c15ec2d2bb37805252b8249ff35"} Nov 27 16:20:06 crc kubenswrapper[4707]: I1127 16:20:06.237983 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-a71a-account-create-update-tqs56"] Nov 27 16:20:06 crc kubenswrapper[4707]: I1127 16:20:06.243868 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"4d67e130-f1e2-4fe8-9647-8725402a1cdd","Type":"ContainerStarted","Data":"a23e4d0af55aa4b31bf53a1d6695ceb6dc800afb05db9e8d1ef89339bf4d5043"} Nov 27 16:20:06 crc kubenswrapper[4707]: I1127 16:20:06.250008 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-c995m" 
event={"ID":"a83beb0d-8dd1-434a-ace2-933f98e3956f","Type":"ContainerStarted","Data":"a80c79e17bd42a14677ea7ec5718ee1c93082c9c4030211d42f0b9a8e6591e20"} Nov 27 16:20:06 crc kubenswrapper[4707]: I1127 16:20:06.254578 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-vvkr6-config-psjmw" event={"ID":"c5e84399-e321-4a51-b22f-eecd3d68b744","Type":"ContainerStarted","Data":"879c1f13cbf8a83e5d4ee2dcc71894cb59b883cd2f050b66804241c0411b4167"} Nov 27 16:20:06 crc kubenswrapper[4707]: I1127 16:20:06.254604 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-vvkr6-config-psjmw" event={"ID":"c5e84399-e321-4a51-b22f-eecd3d68b744","Type":"ContainerStarted","Data":"694d08ef09f6f96203b6a1aebcbc75fc83d666e428a5ccf43c854ed56dab7ce3"} Nov 27 16:20:06 crc kubenswrapper[4707]: I1127 16:20:06.258206 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-336f-account-create-update-gtvgd" event={"ID":"ed7891f1-2643-4a2a-938b-80c0f25dac7c","Type":"ContainerStarted","Data":"0e8535f703128786f9e7cb232a47766c83ed1b995d3f6bbf24c98157c8f00229"} Nov 27 16:20:06 crc kubenswrapper[4707]: I1127 16:20:06.258845 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-h4wdg"] Nov 27 16:20:06 crc kubenswrapper[4707]: I1127 16:20:06.268541 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-6f3e-account-create-update-95dd4" event={"ID":"24172997-039c-47c2-b908-087dec03273f","Type":"ContainerStarted","Data":"cb79e3bfa4cdf739cf5640487e1d6569924adee2940c024ceb44b387aa7afcfc"} Nov 27 16:20:06 crc kubenswrapper[4707]: I1127 16:20:06.287983 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-t7wqv" event={"ID":"79a68d54-eb44-474e-9c5f-1c2283a6d410","Type":"ContainerStarted","Data":"b3f1f00775fcc967ec5968ba887a84b5220ba498fa01b89e91df02c37b036ec2"} Nov 27 16:20:06 crc kubenswrapper[4707]: I1127 16:20:06.303538 4707 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-vvkr6-config-psjmw" podStartSLOduration=10.303523278 podStartE2EDuration="10.303523278s" podCreationTimestamp="2025-11-27 16:19:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-27 16:20:06.300880744 +0000 UTC m=+981.932329512" watchObservedRunningTime="2025-11-27 16:20:06.303523278 +0000 UTC m=+981.934972046" Nov 27 16:20:06 crc kubenswrapper[4707]: I1127 16:20:06.320267 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-t7wqv" podStartSLOduration=3.004352485 podStartE2EDuration="17.320236114s" podCreationTimestamp="2025-11-27 16:19:49 +0000 UTC" firstStartedPulling="2025-11-27 16:19:50.936028215 +0000 UTC m=+966.567476973" lastFinishedPulling="2025-11-27 16:20:05.251911834 +0000 UTC m=+980.883360602" observedRunningTime="2025-11-27 16:20:06.31510358 +0000 UTC m=+981.946552348" watchObservedRunningTime="2025-11-27 16:20:06.320236114 +0000 UTC m=+981.951684912" Nov 27 16:20:07 crc kubenswrapper[4707]: I1127 16:20:07.206255 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="83c9f54d-8957-42f2-9b0a-24b4cc72a2a7" path="/var/lib/kubelet/pods/83c9f54d-8957-42f2-9b0a-24b4cc72a2a7/volumes" Nov 27 16:20:07 crc kubenswrapper[4707]: I1127 16:20:07.344326 4707 generic.go:334] "Generic (PLEG): container finished" podID="2d232067-8a9f-4c42-a934-60560ad7d65c" containerID="8391a045d787fbef1ee1469d40cf848820d1aee0bcb66251f8664f5d9ff77e8b" exitCode=0 Nov 27 16:20:07 crc kubenswrapper[4707]: I1127 16:20:07.344400 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-74xmp" event={"ID":"2d232067-8a9f-4c42-a934-60560ad7d65c","Type":"ContainerDied","Data":"8391a045d787fbef1ee1469d40cf848820d1aee0bcb66251f8664f5d9ff77e8b"} Nov 27 16:20:07 crc kubenswrapper[4707]: I1127 
16:20:07.347477 4707 generic.go:334] "Generic (PLEG): container finished" podID="24172997-039c-47c2-b908-087dec03273f" containerID="02af4020f961083759e82e6bf1c4afc8f5d8931d14de9d0fe97a32a4c76df59d" exitCode=0 Nov 27 16:20:07 crc kubenswrapper[4707]: I1127 16:20:07.347522 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-6f3e-account-create-update-95dd4" event={"ID":"24172997-039c-47c2-b908-087dec03273f","Type":"ContainerDied","Data":"02af4020f961083759e82e6bf1c4afc8f5d8931d14de9d0fe97a32a4c76df59d"} Nov 27 16:20:07 crc kubenswrapper[4707]: I1127 16:20:07.348998 4707 generic.go:334] "Generic (PLEG): container finished" podID="5bef91e6-e5d4-4bef-bab3-f3c3239e8cbb" containerID="5c6ab1ed28c867d7cd8dc1309c3bdd74c8abca787c49cf0f050d4af0806195b9" exitCode=0 Nov 27 16:20:07 crc kubenswrapper[4707]: I1127 16:20:07.349035 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-h4wdg" event={"ID":"5bef91e6-e5d4-4bef-bab3-f3c3239e8cbb","Type":"ContainerDied","Data":"5c6ab1ed28c867d7cd8dc1309c3bdd74c8abca787c49cf0f050d4af0806195b9"} Nov 27 16:20:07 crc kubenswrapper[4707]: I1127 16:20:07.349050 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-h4wdg" event={"ID":"5bef91e6-e5d4-4bef-bab3-f3c3239e8cbb","Type":"ContainerStarted","Data":"eb692f5b891d763d176169dc13fdba7caad2a4c87266350a135be50f0daa1f0b"} Nov 27 16:20:07 crc kubenswrapper[4707]: I1127 16:20:07.350313 4707 generic.go:334] "Generic (PLEG): container finished" podID="82a48402-5e5b-4427-b4f3-d28d7eb0e61e" containerID="ca393106b78acc7aea102559bc24dc04a67e20c03b0fdf075acac63412827afd" exitCode=0 Nov 27 16:20:07 crc kubenswrapper[4707]: I1127 16:20:07.350351 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-create-j98v9" event={"ID":"82a48402-5e5b-4427-b4f3-d28d7eb0e61e","Type":"ContainerDied","Data":"ca393106b78acc7aea102559bc24dc04a67e20c03b0fdf075acac63412827afd"} Nov 27 16:20:07 crc 
kubenswrapper[4707]: I1127 16:20:07.350367 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-create-j98v9" event={"ID":"82a48402-5e5b-4427-b4f3-d28d7eb0e61e","Type":"ContainerStarted","Data":"eefc1430071805dbeb5dfa91842df677db4877873e04b7e0fc6049f37b8c78ae"} Nov 27 16:20:07 crc kubenswrapper[4707]: I1127 16:20:07.354981 4707 generic.go:334] "Generic (PLEG): container finished" podID="92423d7b-7233-418c-b1f2-5516a6c7c2a3" containerID="d1d9eeabeb86b57cf9580ba9008e25b2c8f38628b857cac062a81dc81853676f" exitCode=0 Nov 27 16:20:07 crc kubenswrapper[4707]: I1127 16:20:07.355024 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-7757b" event={"ID":"92423d7b-7233-418c-b1f2-5516a6c7c2a3","Type":"ContainerDied","Data":"d1d9eeabeb86b57cf9580ba9008e25b2c8f38628b857cac062a81dc81853676f"} Nov 27 16:20:07 crc kubenswrapper[4707]: I1127 16:20:07.361019 4707 generic.go:334] "Generic (PLEG): container finished" podID="f0b458d9-3955-421a-bfa6-30fca174692a" containerID="9fd33f95f4dd2b21d7cb3472a03f5e3aacded216d66201c5195eac855a4eceea" exitCode=0 Nov 27 16:20:07 crc kubenswrapper[4707]: I1127 16:20:07.361069 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-a71a-account-create-update-tqs56" event={"ID":"f0b458d9-3955-421a-bfa6-30fca174692a","Type":"ContainerDied","Data":"9fd33f95f4dd2b21d7cb3472a03f5e3aacded216d66201c5195eac855a4eceea"} Nov 27 16:20:07 crc kubenswrapper[4707]: I1127 16:20:07.361091 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-a71a-account-create-update-tqs56" event={"ID":"f0b458d9-3955-421a-bfa6-30fca174692a","Type":"ContainerStarted","Data":"0e09697260f567a9b05ce73e09219655ef86eb4970c75e61c48928665d35cc17"} Nov 27 16:20:07 crc kubenswrapper[4707]: I1127 16:20:07.362355 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-ktrrp" 
event={"ID":"a6c5bb46-81d2-4478-8a44-42c46ddaaffa","Type":"ContainerStarted","Data":"ba2ef2ff00eec2c5a5232895e19e0e65cb090969deeec0636244f305897b6049"} Nov 27 16:20:07 crc kubenswrapper[4707]: I1127 16:20:07.365866 4707 generic.go:334] "Generic (PLEG): container finished" podID="c5e84399-e321-4a51-b22f-eecd3d68b744" containerID="879c1f13cbf8a83e5d4ee2dcc71894cb59b883cd2f050b66804241c0411b4167" exitCode=0 Nov 27 16:20:07 crc kubenswrapper[4707]: I1127 16:20:07.365941 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-vvkr6-config-psjmw" event={"ID":"c5e84399-e321-4a51-b22f-eecd3d68b744","Type":"ContainerDied","Data":"879c1f13cbf8a83e5d4ee2dcc71894cb59b883cd2f050b66804241c0411b4167"} Nov 27 16:20:07 crc kubenswrapper[4707]: I1127 16:20:07.367399 4707 generic.go:334] "Generic (PLEG): container finished" podID="ed7891f1-2643-4a2a-938b-80c0f25dac7c" containerID="0c77e1e676d1eda2676c31f70f9fa0e22adc040d185d4a1eb3328f049783e768" exitCode=0 Nov 27 16:20:07 crc kubenswrapper[4707]: I1127 16:20:07.367468 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-336f-account-create-update-gtvgd" event={"ID":"ed7891f1-2643-4a2a-938b-80c0f25dac7c","Type":"ContainerDied","Data":"0c77e1e676d1eda2676c31f70f9fa0e22adc040d185d4a1eb3328f049783e768"} Nov 27 16:20:07 crc kubenswrapper[4707]: I1127 16:20:07.374199 4707 generic.go:334] "Generic (PLEG): container finished" podID="a7fcab9a-5390-407b-afe4-753ec3be0120" containerID="85b3642d337b8c6304faa955e1fc471d88aecc0647076f13e6822c3bed85ebd8" exitCode=0 Nov 27 16:20:07 crc kubenswrapper[4707]: I1127 16:20:07.374266 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-b0e5-account-create-update-k86pp" event={"ID":"a7fcab9a-5390-407b-afe4-753ec3be0120","Type":"ContainerDied","Data":"85b3642d337b8c6304faa955e1fc471d88aecc0647076f13e6822c3bed85ebd8"} Nov 27 16:20:07 crc kubenswrapper[4707]: I1127 16:20:07.378318 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/glance-db-sync-kdv4k" event={"ID":"09fb5cc3-1c55-459c-aa89-63f13e8f97f4","Type":"ContainerStarted","Data":"4ee16c740caac328bd7fff69ecf2e731609e164c15d991981d2a81d2b9d40c1c"} Nov 27 16:20:07 crc kubenswrapper[4707]: I1127 16:20:07.475562 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-sync-kdv4k" podStartSLOduration=4.146288646 podStartE2EDuration="22.475544101s" podCreationTimestamp="2025-11-27 16:19:45 +0000 UTC" firstStartedPulling="2025-11-27 16:19:46.737345583 +0000 UTC m=+962.368794351" lastFinishedPulling="2025-11-27 16:20:05.066601038 +0000 UTC m=+980.698049806" observedRunningTime="2025-11-27 16:20:07.469990296 +0000 UTC m=+983.101439064" watchObservedRunningTime="2025-11-27 16:20:07.475544101 +0000 UTC m=+983.106992869" Nov 27 16:20:08 crc kubenswrapper[4707]: I1127 16:20:08.393209 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"4d67e130-f1e2-4fe8-9647-8725402a1cdd","Type":"ContainerStarted","Data":"b70b20cad5e5da42d093b1fc4f894b6f0dbe6258716c2225b7cf19fe3e6114e9"} Nov 27 16:20:08 crc kubenswrapper[4707]: I1127 16:20:08.393484 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"4d67e130-f1e2-4fe8-9647-8725402a1cdd","Type":"ContainerStarted","Data":"dbb5a2cd4215b0b053c24a1d78eea7adc8a171ba40436f1d98edfc89d2d7484b"} Nov 27 16:20:08 crc kubenswrapper[4707]: I1127 16:20:08.393522 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"4d67e130-f1e2-4fe8-9647-8725402a1cdd","Type":"ContainerStarted","Data":"adbe8fad27e06918f5fb246ed79c9526dfc09e4fb41b81f3c6a10c8c4c2b93f4"} Nov 27 16:20:08 crc kubenswrapper[4707]: I1127 16:20:08.846492 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-7757b" Nov 27 16:20:08 crc kubenswrapper[4707]: I1127 16:20:08.882665 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ddwb5\" (UniqueName: \"kubernetes.io/projected/92423d7b-7233-418c-b1f2-5516a6c7c2a3-kube-api-access-ddwb5\") pod \"92423d7b-7233-418c-b1f2-5516a6c7c2a3\" (UID: \"92423d7b-7233-418c-b1f2-5516a6c7c2a3\") " Nov 27 16:20:08 crc kubenswrapper[4707]: I1127 16:20:08.882841 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/92423d7b-7233-418c-b1f2-5516a6c7c2a3-operator-scripts\") pod \"92423d7b-7233-418c-b1f2-5516a6c7c2a3\" (UID: \"92423d7b-7233-418c-b1f2-5516a6c7c2a3\") " Nov 27 16:20:08 crc kubenswrapper[4707]: I1127 16:20:08.883471 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/92423d7b-7233-418c-b1f2-5516a6c7c2a3-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "92423d7b-7233-418c-b1f2-5516a6c7c2a3" (UID: "92423d7b-7233-418c-b1f2-5516a6c7c2a3"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 16:20:08 crc kubenswrapper[4707]: I1127 16:20:08.909709 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/92423d7b-7233-418c-b1f2-5516a6c7c2a3-kube-api-access-ddwb5" (OuterVolumeSpecName: "kube-api-access-ddwb5") pod "92423d7b-7233-418c-b1f2-5516a6c7c2a3" (UID: "92423d7b-7233-418c-b1f2-5516a6c7c2a3"). InnerVolumeSpecName "kube-api-access-ddwb5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 16:20:08 crc kubenswrapper[4707]: I1127 16:20:08.985321 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ddwb5\" (UniqueName: \"kubernetes.io/projected/92423d7b-7233-418c-b1f2-5516a6c7c2a3-kube-api-access-ddwb5\") on node \"crc\" DevicePath \"\"" Nov 27 16:20:08 crc kubenswrapper[4707]: I1127 16:20:08.985353 4707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/92423d7b-7233-418c-b1f2-5516a6c7c2a3-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 27 16:20:09 crc kubenswrapper[4707]: I1127 16:20:09.202571 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-b0e5-account-create-update-k86pp" Nov 27 16:20:09 crc kubenswrapper[4707]: I1127 16:20:09.205208 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-h4wdg" Nov 27 16:20:09 crc kubenswrapper[4707]: I1127 16:20:09.225206 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-a71a-account-create-update-tqs56" Nov 27 16:20:09 crc kubenswrapper[4707]: I1127 16:20:09.226095 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-6f3e-account-create-update-95dd4" Nov 27 16:20:09 crc kubenswrapper[4707]: I1127 16:20:09.226198 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-74xmp" Nov 27 16:20:09 crc kubenswrapper[4707]: I1127 16:20:09.244081 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-vvkr6-config-psjmw" Nov 27 16:20:09 crc kubenswrapper[4707]: I1127 16:20:09.252013 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-db-create-j98v9" Nov 27 16:20:09 crc kubenswrapper[4707]: I1127 16:20:09.257799 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-336f-account-create-update-gtvgd" Nov 27 16:20:09 crc kubenswrapper[4707]: I1127 16:20:09.288757 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/82a48402-5e5b-4427-b4f3-d28d7eb0e61e-operator-scripts\") pod \"82a48402-5e5b-4427-b4f3-d28d7eb0e61e\" (UID: \"82a48402-5e5b-4427-b4f3-d28d7eb0e61e\") " Nov 27 16:20:09 crc kubenswrapper[4707]: I1127 16:20:09.288797 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-22nv5\" (UniqueName: \"kubernetes.io/projected/5bef91e6-e5d4-4bef-bab3-f3c3239e8cbb-kube-api-access-22nv5\") pod \"5bef91e6-e5d4-4bef-bab3-f3c3239e8cbb\" (UID: \"5bef91e6-e5d4-4bef-bab3-f3c3239e8cbb\") " Nov 27 16:20:09 crc kubenswrapper[4707]: I1127 16:20:09.288827 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/c5e84399-e321-4a51-b22f-eecd3d68b744-additional-scripts\") pod \"c5e84399-e321-4a51-b22f-eecd3d68b744\" (UID: \"c5e84399-e321-4a51-b22f-eecd3d68b744\") " Nov 27 16:20:09 crc kubenswrapper[4707]: I1127 16:20:09.288851 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hpkhg\" (UniqueName: \"kubernetes.io/projected/a7fcab9a-5390-407b-afe4-753ec3be0120-kube-api-access-hpkhg\") pod \"a7fcab9a-5390-407b-afe4-753ec3be0120\" (UID: \"a7fcab9a-5390-407b-afe4-753ec3be0120\") " Nov 27 16:20:09 crc kubenswrapper[4707]: I1127 16:20:09.290021 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/82a48402-5e5b-4427-b4f3-d28d7eb0e61e-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod 
"82a48402-5e5b-4427-b4f3-d28d7eb0e61e" (UID: "82a48402-5e5b-4427-b4f3-d28d7eb0e61e"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 16:20:09 crc kubenswrapper[4707]: I1127 16:20:09.290483 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a7fcab9a-5390-407b-afe4-753ec3be0120-operator-scripts\") pod \"a7fcab9a-5390-407b-afe4-753ec3be0120\" (UID: \"a7fcab9a-5390-407b-afe4-753ec3be0120\") " Nov 27 16:20:09 crc kubenswrapper[4707]: I1127 16:20:09.290537 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c2xnk\" (UniqueName: \"kubernetes.io/projected/2d232067-8a9f-4c42-a934-60560ad7d65c-kube-api-access-c2xnk\") pod \"2d232067-8a9f-4c42-a934-60560ad7d65c\" (UID: \"2d232067-8a9f-4c42-a934-60560ad7d65c\") " Nov 27 16:20:09 crc kubenswrapper[4707]: I1127 16:20:09.290584 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c5e84399-e321-4a51-b22f-eecd3d68b744-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "c5e84399-e321-4a51-b22f-eecd3d68b744" (UID: "c5e84399-e321-4a51-b22f-eecd3d68b744"). InnerVolumeSpecName "additional-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 16:20:09 crc kubenswrapper[4707]: I1127 16:20:09.290597 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c5e84399-e321-4a51-b22f-eecd3d68b744-scripts\") pod \"c5e84399-e321-4a51-b22f-eecd3d68b744\" (UID: \"c5e84399-e321-4a51-b22f-eecd3d68b744\") " Nov 27 16:20:09 crc kubenswrapper[4707]: I1127 16:20:09.290615 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f0b458d9-3955-421a-bfa6-30fca174692a-operator-scripts\") pod \"f0b458d9-3955-421a-bfa6-30fca174692a\" (UID: \"f0b458d9-3955-421a-bfa6-30fca174692a\") " Nov 27 16:20:09 crc kubenswrapper[4707]: I1127 16:20:09.290848 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2d232067-8a9f-4c42-a934-60560ad7d65c-operator-scripts\") pod \"2d232067-8a9f-4c42-a934-60560ad7d65c\" (UID: \"2d232067-8a9f-4c42-a934-60560ad7d65c\") " Nov 27 16:20:09 crc kubenswrapper[4707]: I1127 16:20:09.290881 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ed7891f1-2643-4a2a-938b-80c0f25dac7c-operator-scripts\") pod \"ed7891f1-2643-4a2a-938b-80c0f25dac7c\" (UID: \"ed7891f1-2643-4a2a-938b-80c0f25dac7c\") " Nov 27 16:20:09 crc kubenswrapper[4707]: I1127 16:20:09.292172 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6jqw7\" (UniqueName: \"kubernetes.io/projected/f0b458d9-3955-421a-bfa6-30fca174692a-kube-api-access-6jqw7\") pod \"f0b458d9-3955-421a-bfa6-30fca174692a\" (UID: \"f0b458d9-3955-421a-bfa6-30fca174692a\") " Nov 27 16:20:09 crc kubenswrapper[4707]: I1127 16:20:09.292459 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-79rhx\" (UniqueName: \"kubernetes.io/projected/24172997-039c-47c2-b908-087dec03273f-kube-api-access-79rhx\") pod \"24172997-039c-47c2-b908-087dec03273f\" (UID: \"24172997-039c-47c2-b908-087dec03273f\") " Nov 27 16:20:09 crc kubenswrapper[4707]: I1127 16:20:09.292487 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dflvh\" (UniqueName: \"kubernetes.io/projected/82a48402-5e5b-4427-b4f3-d28d7eb0e61e-kube-api-access-dflvh\") pod \"82a48402-5e5b-4427-b4f3-d28d7eb0e61e\" (UID: \"82a48402-5e5b-4427-b4f3-d28d7eb0e61e\") " Nov 27 16:20:09 crc kubenswrapper[4707]: I1127 16:20:09.292516 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/24172997-039c-47c2-b908-087dec03273f-operator-scripts\") pod \"24172997-039c-47c2-b908-087dec03273f\" (UID: \"24172997-039c-47c2-b908-087dec03273f\") " Nov 27 16:20:09 crc kubenswrapper[4707]: I1127 16:20:09.292541 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vkc7s\" (UniqueName: \"kubernetes.io/projected/ed7891f1-2643-4a2a-938b-80c0f25dac7c-kube-api-access-vkc7s\") pod \"ed7891f1-2643-4a2a-938b-80c0f25dac7c\" (UID: \"ed7891f1-2643-4a2a-938b-80c0f25dac7c\") " Nov 27 16:20:09 crc kubenswrapper[4707]: I1127 16:20:09.292570 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/c5e84399-e321-4a51-b22f-eecd3d68b744-var-run\") pod \"c5e84399-e321-4a51-b22f-eecd3d68b744\" (UID: \"c5e84399-e321-4a51-b22f-eecd3d68b744\") " Nov 27 16:20:09 crc kubenswrapper[4707]: I1127 16:20:09.292586 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cztbv\" (UniqueName: \"kubernetes.io/projected/c5e84399-e321-4a51-b22f-eecd3d68b744-kube-api-access-cztbv\") pod \"c5e84399-e321-4a51-b22f-eecd3d68b744\" (UID: 
\"c5e84399-e321-4a51-b22f-eecd3d68b744\") " Nov 27 16:20:09 crc kubenswrapper[4707]: I1127 16:20:09.292627 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/c5e84399-e321-4a51-b22f-eecd3d68b744-var-log-ovn\") pod \"c5e84399-e321-4a51-b22f-eecd3d68b744\" (UID: \"c5e84399-e321-4a51-b22f-eecd3d68b744\") " Nov 27 16:20:09 crc kubenswrapper[4707]: I1127 16:20:09.292651 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/c5e84399-e321-4a51-b22f-eecd3d68b744-var-run-ovn\") pod \"c5e84399-e321-4a51-b22f-eecd3d68b744\" (UID: \"c5e84399-e321-4a51-b22f-eecd3d68b744\") " Nov 27 16:20:09 crc kubenswrapper[4707]: I1127 16:20:09.292679 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5bef91e6-e5d4-4bef-bab3-f3c3239e8cbb-operator-scripts\") pod \"5bef91e6-e5d4-4bef-bab3-f3c3239e8cbb\" (UID: \"5bef91e6-e5d4-4bef-bab3-f3c3239e8cbb\") " Nov 27 16:20:09 crc kubenswrapper[4707]: I1127 16:20:09.293473 4707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/82a48402-5e5b-4427-b4f3-d28d7eb0e61e-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 27 16:20:09 crc kubenswrapper[4707]: I1127 16:20:09.293490 4707 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/c5e84399-e321-4a51-b22f-eecd3d68b744-additional-scripts\") on node \"crc\" DevicePath \"\"" Nov 27 16:20:09 crc kubenswrapper[4707]: I1127 16:20:09.292047 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c5e84399-e321-4a51-b22f-eecd3d68b744-scripts" (OuterVolumeSpecName: "scripts") pod "c5e84399-e321-4a51-b22f-eecd3d68b744" (UID: "c5e84399-e321-4a51-b22f-eecd3d68b744"). 
InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 16:20:09 crc kubenswrapper[4707]: I1127 16:20:09.292231 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a7fcab9a-5390-407b-afe4-753ec3be0120-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "a7fcab9a-5390-407b-afe4-753ec3be0120" (UID: "a7fcab9a-5390-407b-afe4-753ec3be0120"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 16:20:09 crc kubenswrapper[4707]: I1127 16:20:09.293593 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f0b458d9-3955-421a-bfa6-30fca174692a-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "f0b458d9-3955-421a-bfa6-30fca174692a" (UID: "f0b458d9-3955-421a-bfa6-30fca174692a"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 16:20:09 crc kubenswrapper[4707]: I1127 16:20:09.294094 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c5e84399-e321-4a51-b22f-eecd3d68b744-var-run" (OuterVolumeSpecName: "var-run") pod "c5e84399-e321-4a51-b22f-eecd3d68b744" (UID: "c5e84399-e321-4a51-b22f-eecd3d68b744"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 27 16:20:09 crc kubenswrapper[4707]: I1127 16:20:09.294780 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c5e84399-e321-4a51-b22f-eecd3d68b744-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "c5e84399-e321-4a51-b22f-eecd3d68b744" (UID: "c5e84399-e321-4a51-b22f-eecd3d68b744"). InnerVolumeSpecName "var-run-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 27 16:20:09 crc kubenswrapper[4707]: I1127 16:20:09.294806 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c5e84399-e321-4a51-b22f-eecd3d68b744-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "c5e84399-e321-4a51-b22f-eecd3d68b744" (UID: "c5e84399-e321-4a51-b22f-eecd3d68b744"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 27 16:20:09 crc kubenswrapper[4707]: I1127 16:20:09.295152 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2d232067-8a9f-4c42-a934-60560ad7d65c-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "2d232067-8a9f-4c42-a934-60560ad7d65c" (UID: "2d232067-8a9f-4c42-a934-60560ad7d65c"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 16:20:09 crc kubenswrapper[4707]: I1127 16:20:09.295352 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a7fcab9a-5390-407b-afe4-753ec3be0120-kube-api-access-hpkhg" (OuterVolumeSpecName: "kube-api-access-hpkhg") pod "a7fcab9a-5390-407b-afe4-753ec3be0120" (UID: "a7fcab9a-5390-407b-afe4-753ec3be0120"). InnerVolumeSpecName "kube-api-access-hpkhg". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 16:20:09 crc kubenswrapper[4707]: I1127 16:20:09.295675 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/24172997-039c-47c2-b908-087dec03273f-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "24172997-039c-47c2-b908-087dec03273f" (UID: "24172997-039c-47c2-b908-087dec03273f"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 16:20:09 crc kubenswrapper[4707]: I1127 16:20:09.295760 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5bef91e6-e5d4-4bef-bab3-f3c3239e8cbb-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "5bef91e6-e5d4-4bef-bab3-f3c3239e8cbb" (UID: "5bef91e6-e5d4-4bef-bab3-f3c3239e8cbb"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 16:20:09 crc kubenswrapper[4707]: I1127 16:20:09.296344 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5bef91e6-e5d4-4bef-bab3-f3c3239e8cbb-kube-api-access-22nv5" (OuterVolumeSpecName: "kube-api-access-22nv5") pod "5bef91e6-e5d4-4bef-bab3-f3c3239e8cbb" (UID: "5bef91e6-e5d4-4bef-bab3-f3c3239e8cbb"). InnerVolumeSpecName "kube-api-access-22nv5". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 16:20:09 crc kubenswrapper[4707]: I1127 16:20:09.296865 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ed7891f1-2643-4a2a-938b-80c0f25dac7c-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "ed7891f1-2643-4a2a-938b-80c0f25dac7c" (UID: "ed7891f1-2643-4a2a-938b-80c0f25dac7c"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 16:20:09 crc kubenswrapper[4707]: I1127 16:20:09.297381 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/24172997-039c-47c2-b908-087dec03273f-kube-api-access-79rhx" (OuterVolumeSpecName: "kube-api-access-79rhx") pod "24172997-039c-47c2-b908-087dec03273f" (UID: "24172997-039c-47c2-b908-087dec03273f"). InnerVolumeSpecName "kube-api-access-79rhx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 16:20:09 crc kubenswrapper[4707]: I1127 16:20:09.298182 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ed7891f1-2643-4a2a-938b-80c0f25dac7c-kube-api-access-vkc7s" (OuterVolumeSpecName: "kube-api-access-vkc7s") pod "ed7891f1-2643-4a2a-938b-80c0f25dac7c" (UID: "ed7891f1-2643-4a2a-938b-80c0f25dac7c"). InnerVolumeSpecName "kube-api-access-vkc7s". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 16:20:09 crc kubenswrapper[4707]: I1127 16:20:09.303672 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f0b458d9-3955-421a-bfa6-30fca174692a-kube-api-access-6jqw7" (OuterVolumeSpecName: "kube-api-access-6jqw7") pod "f0b458d9-3955-421a-bfa6-30fca174692a" (UID: "f0b458d9-3955-421a-bfa6-30fca174692a"). InnerVolumeSpecName "kube-api-access-6jqw7". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 16:20:09 crc kubenswrapper[4707]: I1127 16:20:09.303705 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/82a48402-5e5b-4427-b4f3-d28d7eb0e61e-kube-api-access-dflvh" (OuterVolumeSpecName: "kube-api-access-dflvh") pod "82a48402-5e5b-4427-b4f3-d28d7eb0e61e" (UID: "82a48402-5e5b-4427-b4f3-d28d7eb0e61e"). InnerVolumeSpecName "kube-api-access-dflvh". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 16:20:09 crc kubenswrapper[4707]: I1127 16:20:09.303753 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c5e84399-e321-4a51-b22f-eecd3d68b744-kube-api-access-cztbv" (OuterVolumeSpecName: "kube-api-access-cztbv") pod "c5e84399-e321-4a51-b22f-eecd3d68b744" (UID: "c5e84399-e321-4a51-b22f-eecd3d68b744"). InnerVolumeSpecName "kube-api-access-cztbv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 16:20:09 crc kubenswrapper[4707]: I1127 16:20:09.308020 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2d232067-8a9f-4c42-a934-60560ad7d65c-kube-api-access-c2xnk" (OuterVolumeSpecName: "kube-api-access-c2xnk") pod "2d232067-8a9f-4c42-a934-60560ad7d65c" (UID: "2d232067-8a9f-4c42-a934-60560ad7d65c"). InnerVolumeSpecName "kube-api-access-c2xnk". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 16:20:09 crc kubenswrapper[4707]: I1127 16:20:09.394997 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c5e84399-e321-4a51-b22f-eecd3d68b744-scripts\") on node \"crc\" DevicePath \"\"" Nov 27 16:20:09 crc kubenswrapper[4707]: I1127 16:20:09.395238 4707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f0b458d9-3955-421a-bfa6-30fca174692a-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 27 16:20:09 crc kubenswrapper[4707]: I1127 16:20:09.395251 4707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2d232067-8a9f-4c42-a934-60560ad7d65c-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 27 16:20:09 crc kubenswrapper[4707]: I1127 16:20:09.395261 4707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ed7891f1-2643-4a2a-938b-80c0f25dac7c-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 27 16:20:09 crc kubenswrapper[4707]: I1127 16:20:09.395272 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6jqw7\" (UniqueName: \"kubernetes.io/projected/f0b458d9-3955-421a-bfa6-30fca174692a-kube-api-access-6jqw7\") on node \"crc\" DevicePath \"\"" Nov 27 16:20:09 crc kubenswrapper[4707]: I1127 16:20:09.395282 4707 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access-79rhx\" (UniqueName: \"kubernetes.io/projected/24172997-039c-47c2-b908-087dec03273f-kube-api-access-79rhx\") on node \"crc\" DevicePath \"\"" Nov 27 16:20:09 crc kubenswrapper[4707]: I1127 16:20:09.395291 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dflvh\" (UniqueName: \"kubernetes.io/projected/82a48402-5e5b-4427-b4f3-d28d7eb0e61e-kube-api-access-dflvh\") on node \"crc\" DevicePath \"\"" Nov 27 16:20:09 crc kubenswrapper[4707]: I1127 16:20:09.395397 4707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/24172997-039c-47c2-b908-087dec03273f-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 27 16:20:09 crc kubenswrapper[4707]: I1127 16:20:09.395409 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vkc7s\" (UniqueName: \"kubernetes.io/projected/ed7891f1-2643-4a2a-938b-80c0f25dac7c-kube-api-access-vkc7s\") on node \"crc\" DevicePath \"\"" Nov 27 16:20:09 crc kubenswrapper[4707]: I1127 16:20:09.395417 4707 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/c5e84399-e321-4a51-b22f-eecd3d68b744-var-run\") on node \"crc\" DevicePath \"\"" Nov 27 16:20:09 crc kubenswrapper[4707]: I1127 16:20:09.395426 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cztbv\" (UniqueName: \"kubernetes.io/projected/c5e84399-e321-4a51-b22f-eecd3d68b744-kube-api-access-cztbv\") on node \"crc\" DevicePath \"\"" Nov 27 16:20:09 crc kubenswrapper[4707]: I1127 16:20:09.395436 4707 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/c5e84399-e321-4a51-b22f-eecd3d68b744-var-log-ovn\") on node \"crc\" DevicePath \"\"" Nov 27 16:20:09 crc kubenswrapper[4707]: I1127 16:20:09.395444 4707 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: 
\"kubernetes.io/host-path/c5e84399-e321-4a51-b22f-eecd3d68b744-var-run-ovn\") on node \"crc\" DevicePath \"\"" Nov 27 16:20:09 crc kubenswrapper[4707]: I1127 16:20:09.395452 4707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5bef91e6-e5d4-4bef-bab3-f3c3239e8cbb-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 27 16:20:09 crc kubenswrapper[4707]: I1127 16:20:09.395460 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-22nv5\" (UniqueName: \"kubernetes.io/projected/5bef91e6-e5d4-4bef-bab3-f3c3239e8cbb-kube-api-access-22nv5\") on node \"crc\" DevicePath \"\"" Nov 27 16:20:09 crc kubenswrapper[4707]: I1127 16:20:09.395469 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hpkhg\" (UniqueName: \"kubernetes.io/projected/a7fcab9a-5390-407b-afe4-753ec3be0120-kube-api-access-hpkhg\") on node \"crc\" DevicePath \"\"" Nov 27 16:20:09 crc kubenswrapper[4707]: I1127 16:20:09.395476 4707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a7fcab9a-5390-407b-afe4-753ec3be0120-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 27 16:20:09 crc kubenswrapper[4707]: I1127 16:20:09.395485 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c2xnk\" (UniqueName: \"kubernetes.io/projected/2d232067-8a9f-4c42-a934-60560ad7d65c-kube-api-access-c2xnk\") on node \"crc\" DevicePath \"\"" Nov 27 16:20:09 crc kubenswrapper[4707]: I1127 16:20:09.416888 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-7757b" event={"ID":"92423d7b-7233-418c-b1f2-5516a6c7c2a3","Type":"ContainerDied","Data":"e6dfeda7cc0a3bf057cc0cea2954be40af624c15ec2d2bb37805252b8249ff35"} Nov 27 16:20:09 crc kubenswrapper[4707]: I1127 16:20:09.416932 4707 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="e6dfeda7cc0a3bf057cc0cea2954be40af624c15ec2d2bb37805252b8249ff35" Nov 27 16:20:09 crc kubenswrapper[4707]: I1127 16:20:09.417008 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-7757b" Nov 27 16:20:09 crc kubenswrapper[4707]: I1127 16:20:09.430440 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-vvkr6-config-psjmw"] Nov 27 16:20:09 crc kubenswrapper[4707]: I1127 16:20:09.432204 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-h4wdg" event={"ID":"5bef91e6-e5d4-4bef-bab3-f3c3239e8cbb","Type":"ContainerDied","Data":"eb692f5b891d763d176169dc13fdba7caad2a4c87266350a135be50f0daa1f0b"} Nov 27 16:20:09 crc kubenswrapper[4707]: I1127 16:20:09.432234 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="eb692f5b891d763d176169dc13fdba7caad2a4c87266350a135be50f0daa1f0b" Nov 27 16:20:09 crc kubenswrapper[4707]: I1127 16:20:09.432278 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-h4wdg" Nov 27 16:20:09 crc kubenswrapper[4707]: I1127 16:20:09.437442 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-b0e5-account-create-update-k86pp" event={"ID":"a7fcab9a-5390-407b-afe4-753ec3be0120","Type":"ContainerDied","Data":"91fd193d46db027a3df5b4cb3ba49a8c8ec59b367881e09e1f7824ba4edac393"} Nov 27 16:20:09 crc kubenswrapper[4707]: I1127 16:20:09.437485 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="91fd193d46db027a3df5b4cb3ba49a8c8ec59b367881e09e1f7824ba4edac393" Nov 27 16:20:09 crc kubenswrapper[4707]: I1127 16:20:09.437535 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-b0e5-account-create-update-k86pp" Nov 27 16:20:09 crc kubenswrapper[4707]: I1127 16:20:09.437946 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-vvkr6-config-psjmw"] Nov 27 16:20:09 crc kubenswrapper[4707]: I1127 16:20:09.440567 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-6f3e-account-create-update-95dd4" Nov 27 16:20:09 crc kubenswrapper[4707]: I1127 16:20:09.440654 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-6f3e-account-create-update-95dd4" event={"ID":"24172997-039c-47c2-b908-087dec03273f","Type":"ContainerDied","Data":"cb79e3bfa4cdf739cf5640487e1d6569924adee2940c024ceb44b387aa7afcfc"} Nov 27 16:20:09 crc kubenswrapper[4707]: I1127 16:20:09.440674 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cb79e3bfa4cdf739cf5640487e1d6569924adee2940c024ceb44b387aa7afcfc" Nov 27 16:20:09 crc kubenswrapper[4707]: I1127 16:20:09.444734 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-create-j98v9" Nov 27 16:20:09 crc kubenswrapper[4707]: I1127 16:20:09.444823 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-create-j98v9" event={"ID":"82a48402-5e5b-4427-b4f3-d28d7eb0e61e","Type":"ContainerDied","Data":"eefc1430071805dbeb5dfa91842df677db4877873e04b7e0fc6049f37b8c78ae"} Nov 27 16:20:09 crc kubenswrapper[4707]: I1127 16:20:09.444851 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="eefc1430071805dbeb5dfa91842df677db4877873e04b7e0fc6049f37b8c78ae" Nov 27 16:20:09 crc kubenswrapper[4707]: I1127 16:20:09.449352 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-a71a-account-create-update-tqs56" Nov 27 16:20:09 crc kubenswrapper[4707]: I1127 16:20:09.449365 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-a71a-account-create-update-tqs56" event={"ID":"f0b458d9-3955-421a-bfa6-30fca174692a","Type":"ContainerDied","Data":"0e09697260f567a9b05ce73e09219655ef86eb4970c75e61c48928665d35cc17"} Nov 27 16:20:09 crc kubenswrapper[4707]: I1127 16:20:09.449401 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0e09697260f567a9b05ce73e09219655ef86eb4970c75e61c48928665d35cc17" Nov 27 16:20:09 crc kubenswrapper[4707]: I1127 16:20:09.455243 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="694d08ef09f6f96203b6a1aebcbc75fc83d666e428a5ccf43c854ed56dab7ce3" Nov 27 16:20:09 crc kubenswrapper[4707]: I1127 16:20:09.455612 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-vvkr6-config-psjmw" Nov 27 16:20:09 crc kubenswrapper[4707]: I1127 16:20:09.469194 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-336f-account-create-update-gtvgd" event={"ID":"ed7891f1-2643-4a2a-938b-80c0f25dac7c","Type":"ContainerDied","Data":"0e8535f703128786f9e7cb232a47766c83ed1b995d3f6bbf24c98157c8f00229"} Nov 27 16:20:09 crc kubenswrapper[4707]: I1127 16:20:09.469231 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0e8535f703128786f9e7cb232a47766c83ed1b995d3f6bbf24c98157c8f00229" Nov 27 16:20:09 crc kubenswrapper[4707]: I1127 16:20:09.469309 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-336f-account-create-update-gtvgd" Nov 27 16:20:09 crc kubenswrapper[4707]: I1127 16:20:09.470661 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-74xmp" event={"ID":"2d232067-8a9f-4c42-a934-60560ad7d65c","Type":"ContainerDied","Data":"1aa0879c54958d0ab61362538d09d87508511feb46a406d62f24ee927e148a05"} Nov 27 16:20:09 crc kubenswrapper[4707]: I1127 16:20:09.470694 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1aa0879c54958d0ab61362538d09d87508511feb46a406d62f24ee927e148a05" Nov 27 16:20:09 crc kubenswrapper[4707]: I1127 16:20:09.470745 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-74xmp" Nov 27 16:20:09 crc kubenswrapper[4707]: I1127 16:20:09.482823 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"4d67e130-f1e2-4fe8-9647-8725402a1cdd","Type":"ContainerStarted","Data":"5678acc654e96e7cd9eebfb352386310f9dd7dda1e4fc1432cc386c88c45e4bd"} Nov 27 16:20:09 crc kubenswrapper[4707]: I1127 16:20:09.573499 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-t7wqv" Nov 27 16:20:09 crc kubenswrapper[4707]: I1127 16:20:09.573545 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-t7wqv" Nov 27 16:20:10 crc kubenswrapper[4707]: I1127 16:20:10.632263 4707 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-t7wqv" podUID="79a68d54-eb44-474e-9c5f-1c2283a6d410" containerName="registry-server" probeResult="failure" output=< Nov 27 16:20:10 crc kubenswrapper[4707]: timeout: failed to connect service ":50051" within 1s Nov 27 16:20:10 crc kubenswrapper[4707]: > Nov 27 16:20:11 crc kubenswrapper[4707]: I1127 16:20:11.205876 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod 
volumes dir" podUID="c5e84399-e321-4a51-b22f-eecd3d68b744" path="/var/lib/kubelet/pods/c5e84399-e321-4a51-b22f-eecd3d68b744/volumes" Nov 27 16:20:11 crc kubenswrapper[4707]: I1127 16:20:11.863378 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Nov 27 16:20:14 crc kubenswrapper[4707]: I1127 16:20:14.531165 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"4d67e130-f1e2-4fe8-9647-8725402a1cdd","Type":"ContainerStarted","Data":"0daaec30214cbb3c4ffc3df8b5eeffeef77127b67cd90c4a36d25bc07b50f2c8"} Nov 27 16:20:14 crc kubenswrapper[4707]: I1127 16:20:14.531632 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"4d67e130-f1e2-4fe8-9647-8725402a1cdd","Type":"ContainerStarted","Data":"99c43bd682b318b4257c1fcb741d56ff3919eafcdc0cb1305f5046e4e15b8eb8"} Nov 27 16:20:14 crc kubenswrapper[4707]: I1127 16:20:14.531643 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"4d67e130-f1e2-4fe8-9647-8725402a1cdd","Type":"ContainerStarted","Data":"bffd0d9d6e450abe1d5d6c38d6b5911db38bef94279ed9af44f30680547540ca"} Nov 27 16:20:14 crc kubenswrapper[4707]: I1127 16:20:14.532980 4707 generic.go:334] "Generic (PLEG): container finished" podID="09fb5cc3-1c55-459c-aa89-63f13e8f97f4" containerID="4ee16c740caac328bd7fff69ecf2e731609e164c15d991981d2a81d2b9d40c1c" exitCode=0 Nov 27 16:20:14 crc kubenswrapper[4707]: I1127 16:20:14.533051 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-kdv4k" event={"ID":"09fb5cc3-1c55-459c-aa89-63f13e8f97f4","Type":"ContainerDied","Data":"4ee16c740caac328bd7fff69ecf2e731609e164c15d991981d2a81d2b9d40c1c"} Nov 27 16:20:14 crc kubenswrapper[4707]: I1127 16:20:14.535224 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-ktrrp" 
event={"ID":"a6c5bb46-81d2-4478-8a44-42c46ddaaffa","Type":"ContainerStarted","Data":"4abe399b9e6ebf2086362330961b71743ba3799fce812e482717eaa3ffa4e1b2"} Nov 27 16:20:14 crc kubenswrapper[4707]: I1127 16:20:14.566948 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-sync-ktrrp" podStartSLOduration=5.018453389 podStartE2EDuration="12.566932287s" podCreationTimestamp="2025-11-27 16:20:02 +0000 UTC" firstStartedPulling="2025-11-27 16:20:06.256133986 +0000 UTC m=+981.887582754" lastFinishedPulling="2025-11-27 16:20:13.804612884 +0000 UTC m=+989.436061652" observedRunningTime="2025-11-27 16:20:14.562068819 +0000 UTC m=+990.193517587" watchObservedRunningTime="2025-11-27 16:20:14.566932287 +0000 UTC m=+990.198381055" Nov 27 16:20:15 crc kubenswrapper[4707]: I1127 16:20:15.563680 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"4d67e130-f1e2-4fe8-9647-8725402a1cdd","Type":"ContainerStarted","Data":"71620d60aad4b08a6dd57464fbb7aa8a60d1712831c3a674cfbbc655954ad9db"} Nov 27 16:20:16 crc kubenswrapper[4707]: I1127 16:20:16.309529 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-kdv4k" Nov 27 16:20:16 crc kubenswrapper[4707]: I1127 16:20:16.442566 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/09fb5cc3-1c55-459c-aa89-63f13e8f97f4-config-data\") pod \"09fb5cc3-1c55-459c-aa89-63f13e8f97f4\" (UID: \"09fb5cc3-1c55-459c-aa89-63f13e8f97f4\") " Nov 27 16:20:16 crc kubenswrapper[4707]: I1127 16:20:16.442637 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/09fb5cc3-1c55-459c-aa89-63f13e8f97f4-combined-ca-bundle\") pod \"09fb5cc3-1c55-459c-aa89-63f13e8f97f4\" (UID: \"09fb5cc3-1c55-459c-aa89-63f13e8f97f4\") " Nov 27 16:20:16 crc kubenswrapper[4707]: I1127 16:20:16.442713 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzznc\" (UniqueName: \"kubernetes.io/projected/09fb5cc3-1c55-459c-aa89-63f13e8f97f4-kube-api-access-nzznc\") pod \"09fb5cc3-1c55-459c-aa89-63f13e8f97f4\" (UID: \"09fb5cc3-1c55-459c-aa89-63f13e8f97f4\") " Nov 27 16:20:16 crc kubenswrapper[4707]: I1127 16:20:16.442751 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/09fb5cc3-1c55-459c-aa89-63f13e8f97f4-db-sync-config-data\") pod \"09fb5cc3-1c55-459c-aa89-63f13e8f97f4\" (UID: \"09fb5cc3-1c55-459c-aa89-63f13e8f97f4\") " Nov 27 16:20:16 crc kubenswrapper[4707]: I1127 16:20:16.450490 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09fb5cc3-1c55-459c-aa89-63f13e8f97f4-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "09fb5cc3-1c55-459c-aa89-63f13e8f97f4" (UID: "09fb5cc3-1c55-459c-aa89-63f13e8f97f4"). InnerVolumeSpecName "db-sync-config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 16:20:16 crc kubenswrapper[4707]: I1127 16:20:16.450609 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09fb5cc3-1c55-459c-aa89-63f13e8f97f4-kube-api-access-nzznc" (OuterVolumeSpecName: "kube-api-access-nzznc") pod "09fb5cc3-1c55-459c-aa89-63f13e8f97f4" (UID: "09fb5cc3-1c55-459c-aa89-63f13e8f97f4"). InnerVolumeSpecName "kube-api-access-nzznc". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 16:20:16 crc kubenswrapper[4707]: I1127 16:20:16.475382 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09fb5cc3-1c55-459c-aa89-63f13e8f97f4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "09fb5cc3-1c55-459c-aa89-63f13e8f97f4" (UID: "09fb5cc3-1c55-459c-aa89-63f13e8f97f4"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 16:20:16 crc kubenswrapper[4707]: I1127 16:20:16.497628 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09fb5cc3-1c55-459c-aa89-63f13e8f97f4-config-data" (OuterVolumeSpecName: "config-data") pod "09fb5cc3-1c55-459c-aa89-63f13e8f97f4" (UID: "09fb5cc3-1c55-459c-aa89-63f13e8f97f4"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 16:20:16 crc kubenswrapper[4707]: I1127 16:20:16.544325 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/09fb5cc3-1c55-459c-aa89-63f13e8f97f4-config-data\") on node \"crc\" DevicePath \"\"" Nov 27 16:20:16 crc kubenswrapper[4707]: I1127 16:20:16.544349 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/09fb5cc3-1c55-459c-aa89-63f13e8f97f4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 27 16:20:16 crc kubenswrapper[4707]: I1127 16:20:16.544359 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzznc\" (UniqueName: \"kubernetes.io/projected/09fb5cc3-1c55-459c-aa89-63f13e8f97f4-kube-api-access-nzznc\") on node \"crc\" DevicePath \"\"" Nov 27 16:20:16 crc kubenswrapper[4707]: I1127 16:20:16.544383 4707 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/09fb5cc3-1c55-459c-aa89-63f13e8f97f4-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Nov 27 16:20:16 crc kubenswrapper[4707]: I1127 16:20:16.591232 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"4d67e130-f1e2-4fe8-9647-8725402a1cdd","Type":"ContainerStarted","Data":"c48bf6714877ec2d76642118c45d3c7d28258b5106af839e27251a3bffacf92a"} Nov 27 16:20:16 crc kubenswrapper[4707]: I1127 16:20:16.595033 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-kdv4k" event={"ID":"09fb5cc3-1c55-459c-aa89-63f13e8f97f4","Type":"ContainerDied","Data":"a90b31025fffaab4aef0e5edd894d0033035c4ce7bf7c5cfea449e8bf94d16a3"} Nov 27 16:20:16 crc kubenswrapper[4707]: I1127 16:20:16.595057 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a90b31025fffaab4aef0e5edd894d0033035c4ce7bf7c5cfea449e8bf94d16a3" Nov 27 16:20:16 crc 
kubenswrapper[4707]: I1127 16:20:16.595102 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-kdv4k" Nov 27 16:20:17 crc kubenswrapper[4707]: I1127 16:20:17.008705 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5b946c75cc-b9t7n"] Nov 27 16:20:17 crc kubenswrapper[4707]: E1127 16:20:17.009337 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c5e84399-e321-4a51-b22f-eecd3d68b744" containerName="ovn-config" Nov 27 16:20:17 crc kubenswrapper[4707]: I1127 16:20:17.009354 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="c5e84399-e321-4a51-b22f-eecd3d68b744" containerName="ovn-config" Nov 27 16:20:17 crc kubenswrapper[4707]: E1127 16:20:17.009436 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="83c9f54d-8957-42f2-9b0a-24b4cc72a2a7" containerName="registry-server" Nov 27 16:20:17 crc kubenswrapper[4707]: I1127 16:20:17.009444 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="83c9f54d-8957-42f2-9b0a-24b4cc72a2a7" containerName="registry-server" Nov 27 16:20:17 crc kubenswrapper[4707]: E1127 16:20:17.009460 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f0b458d9-3955-421a-bfa6-30fca174692a" containerName="mariadb-account-create-update" Nov 27 16:20:17 crc kubenswrapper[4707]: I1127 16:20:17.009468 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="f0b458d9-3955-421a-bfa6-30fca174692a" containerName="mariadb-account-create-update" Nov 27 16:20:17 crc kubenswrapper[4707]: E1127 16:20:17.009476 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="83c9f54d-8957-42f2-9b0a-24b4cc72a2a7" containerName="extract-content" Nov 27 16:20:17 crc kubenswrapper[4707]: I1127 16:20:17.009481 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="83c9f54d-8957-42f2-9b0a-24b4cc72a2a7" containerName="extract-content" Nov 27 16:20:17 crc kubenswrapper[4707]: E1127 16:20:17.009489 4707 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="09fb5cc3-1c55-459c-aa89-63f13e8f97f4" containerName="glance-db-sync" Nov 27 16:20:17 crc kubenswrapper[4707]: I1127 16:20:17.009494 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="09fb5cc3-1c55-459c-aa89-63f13e8f97f4" containerName="glance-db-sync" Nov 27 16:20:17 crc kubenswrapper[4707]: E1127 16:20:17.009504 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ed7891f1-2643-4a2a-938b-80c0f25dac7c" containerName="mariadb-account-create-update" Nov 27 16:20:17 crc kubenswrapper[4707]: I1127 16:20:17.009511 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="ed7891f1-2643-4a2a-938b-80c0f25dac7c" containerName="mariadb-account-create-update" Nov 27 16:20:17 crc kubenswrapper[4707]: E1127 16:20:17.009525 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="24172997-039c-47c2-b908-087dec03273f" containerName="mariadb-account-create-update" Nov 27 16:20:17 crc kubenswrapper[4707]: I1127 16:20:17.009531 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="24172997-039c-47c2-b908-087dec03273f" containerName="mariadb-account-create-update" Nov 27 16:20:17 crc kubenswrapper[4707]: E1127 16:20:17.009538 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="83c9f54d-8957-42f2-9b0a-24b4cc72a2a7" containerName="extract-utilities" Nov 27 16:20:17 crc kubenswrapper[4707]: I1127 16:20:17.009545 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="83c9f54d-8957-42f2-9b0a-24b4cc72a2a7" containerName="extract-utilities" Nov 27 16:20:17 crc kubenswrapper[4707]: E1127 16:20:17.009555 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a7fcab9a-5390-407b-afe4-753ec3be0120" containerName="mariadb-account-create-update" Nov 27 16:20:17 crc kubenswrapper[4707]: I1127 16:20:17.009561 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="a7fcab9a-5390-407b-afe4-753ec3be0120" 
containerName="mariadb-account-create-update" Nov 27 16:20:17 crc kubenswrapper[4707]: E1127 16:20:17.009571 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2d232067-8a9f-4c42-a934-60560ad7d65c" containerName="mariadb-database-create" Nov 27 16:20:17 crc kubenswrapper[4707]: I1127 16:20:17.009577 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="2d232067-8a9f-4c42-a934-60560ad7d65c" containerName="mariadb-database-create" Nov 27 16:20:17 crc kubenswrapper[4707]: E1127 16:20:17.009588 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5bef91e6-e5d4-4bef-bab3-f3c3239e8cbb" containerName="mariadb-database-create" Nov 27 16:20:17 crc kubenswrapper[4707]: I1127 16:20:17.009594 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="5bef91e6-e5d4-4bef-bab3-f3c3239e8cbb" containerName="mariadb-database-create" Nov 27 16:20:17 crc kubenswrapper[4707]: E1127 16:20:17.009606 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="82a48402-5e5b-4427-b4f3-d28d7eb0e61e" containerName="mariadb-database-create" Nov 27 16:20:17 crc kubenswrapper[4707]: I1127 16:20:17.009612 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="82a48402-5e5b-4427-b4f3-d28d7eb0e61e" containerName="mariadb-database-create" Nov 27 16:20:17 crc kubenswrapper[4707]: E1127 16:20:17.009622 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="92423d7b-7233-418c-b1f2-5516a6c7c2a3" containerName="mariadb-database-create" Nov 27 16:20:17 crc kubenswrapper[4707]: I1127 16:20:17.009628 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="92423d7b-7233-418c-b1f2-5516a6c7c2a3" containerName="mariadb-database-create" Nov 27 16:20:17 crc kubenswrapper[4707]: I1127 16:20:17.009773 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="c5e84399-e321-4a51-b22f-eecd3d68b744" containerName="ovn-config" Nov 27 16:20:17 crc kubenswrapper[4707]: I1127 16:20:17.009792 4707 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="24172997-039c-47c2-b908-087dec03273f" containerName="mariadb-account-create-update" Nov 27 16:20:17 crc kubenswrapper[4707]: I1127 16:20:17.009802 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="5bef91e6-e5d4-4bef-bab3-f3c3239e8cbb" containerName="mariadb-database-create" Nov 27 16:20:17 crc kubenswrapper[4707]: I1127 16:20:17.009812 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="ed7891f1-2643-4a2a-938b-80c0f25dac7c" containerName="mariadb-account-create-update" Nov 27 16:20:17 crc kubenswrapper[4707]: I1127 16:20:17.009822 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="83c9f54d-8957-42f2-9b0a-24b4cc72a2a7" containerName="registry-server" Nov 27 16:20:17 crc kubenswrapper[4707]: I1127 16:20:17.009829 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="92423d7b-7233-418c-b1f2-5516a6c7c2a3" containerName="mariadb-database-create" Nov 27 16:20:17 crc kubenswrapper[4707]: I1127 16:20:17.009836 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="f0b458d9-3955-421a-bfa6-30fca174692a" containerName="mariadb-account-create-update" Nov 27 16:20:17 crc kubenswrapper[4707]: I1127 16:20:17.009843 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="09fb5cc3-1c55-459c-aa89-63f13e8f97f4" containerName="glance-db-sync" Nov 27 16:20:17 crc kubenswrapper[4707]: I1127 16:20:17.009854 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="a7fcab9a-5390-407b-afe4-753ec3be0120" containerName="mariadb-account-create-update" Nov 27 16:20:17 crc kubenswrapper[4707]: I1127 16:20:17.009865 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="82a48402-5e5b-4427-b4f3-d28d7eb0e61e" containerName="mariadb-database-create" Nov 27 16:20:17 crc kubenswrapper[4707]: I1127 16:20:17.009874 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="2d232067-8a9f-4c42-a934-60560ad7d65c" 
containerName="mariadb-database-create" Nov 27 16:20:17 crc kubenswrapper[4707]: I1127 16:20:17.010736 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5b946c75cc-b9t7n" Nov 27 16:20:17 crc kubenswrapper[4707]: I1127 16:20:17.034607 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5b946c75cc-b9t7n"] Nov 27 16:20:17 crc kubenswrapper[4707]: I1127 16:20:17.055729 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fd4a36f0-fb55-45b1-a4d0-fd45fd3de731-config\") pod \"dnsmasq-dns-5b946c75cc-b9t7n\" (UID: \"fd4a36f0-fb55-45b1-a4d0-fd45fd3de731\") " pod="openstack/dnsmasq-dns-5b946c75cc-b9t7n" Nov 27 16:20:17 crc kubenswrapper[4707]: I1127 16:20:17.055799 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fd4a36f0-fb55-45b1-a4d0-fd45fd3de731-ovsdbserver-nb\") pod \"dnsmasq-dns-5b946c75cc-b9t7n\" (UID: \"fd4a36f0-fb55-45b1-a4d0-fd45fd3de731\") " pod="openstack/dnsmasq-dns-5b946c75cc-b9t7n" Nov 27 16:20:17 crc kubenswrapper[4707]: I1127 16:20:17.055831 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fd4a36f0-fb55-45b1-a4d0-fd45fd3de731-dns-svc\") pod \"dnsmasq-dns-5b946c75cc-b9t7n\" (UID: \"fd4a36f0-fb55-45b1-a4d0-fd45fd3de731\") " pod="openstack/dnsmasq-dns-5b946c75cc-b9t7n" Nov 27 16:20:17 crc kubenswrapper[4707]: I1127 16:20:17.055849 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t7kr7\" (UniqueName: \"kubernetes.io/projected/fd4a36f0-fb55-45b1-a4d0-fd45fd3de731-kube-api-access-t7kr7\") pod \"dnsmasq-dns-5b946c75cc-b9t7n\" (UID: \"fd4a36f0-fb55-45b1-a4d0-fd45fd3de731\") " 
pod="openstack/dnsmasq-dns-5b946c75cc-b9t7n" Nov 27 16:20:17 crc kubenswrapper[4707]: I1127 16:20:17.055875 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fd4a36f0-fb55-45b1-a4d0-fd45fd3de731-ovsdbserver-sb\") pod \"dnsmasq-dns-5b946c75cc-b9t7n\" (UID: \"fd4a36f0-fb55-45b1-a4d0-fd45fd3de731\") " pod="openstack/dnsmasq-dns-5b946c75cc-b9t7n" Nov 27 16:20:17 crc kubenswrapper[4707]: I1127 16:20:17.157046 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fd4a36f0-fb55-45b1-a4d0-fd45fd3de731-ovsdbserver-nb\") pod \"dnsmasq-dns-5b946c75cc-b9t7n\" (UID: \"fd4a36f0-fb55-45b1-a4d0-fd45fd3de731\") " pod="openstack/dnsmasq-dns-5b946c75cc-b9t7n" Nov 27 16:20:17 crc kubenswrapper[4707]: I1127 16:20:17.157115 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fd4a36f0-fb55-45b1-a4d0-fd45fd3de731-dns-svc\") pod \"dnsmasq-dns-5b946c75cc-b9t7n\" (UID: \"fd4a36f0-fb55-45b1-a4d0-fd45fd3de731\") " pod="openstack/dnsmasq-dns-5b946c75cc-b9t7n" Nov 27 16:20:17 crc kubenswrapper[4707]: I1127 16:20:17.157151 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t7kr7\" (UniqueName: \"kubernetes.io/projected/fd4a36f0-fb55-45b1-a4d0-fd45fd3de731-kube-api-access-t7kr7\") pod \"dnsmasq-dns-5b946c75cc-b9t7n\" (UID: \"fd4a36f0-fb55-45b1-a4d0-fd45fd3de731\") " pod="openstack/dnsmasq-dns-5b946c75cc-b9t7n" Nov 27 16:20:17 crc kubenswrapper[4707]: I1127 16:20:17.157189 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fd4a36f0-fb55-45b1-a4d0-fd45fd3de731-ovsdbserver-sb\") pod \"dnsmasq-dns-5b946c75cc-b9t7n\" (UID: \"fd4a36f0-fb55-45b1-a4d0-fd45fd3de731\") " 
pod="openstack/dnsmasq-dns-5b946c75cc-b9t7n" Nov 27 16:20:17 crc kubenswrapper[4707]: I1127 16:20:17.157273 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fd4a36f0-fb55-45b1-a4d0-fd45fd3de731-config\") pod \"dnsmasq-dns-5b946c75cc-b9t7n\" (UID: \"fd4a36f0-fb55-45b1-a4d0-fd45fd3de731\") " pod="openstack/dnsmasq-dns-5b946c75cc-b9t7n" Nov 27 16:20:17 crc kubenswrapper[4707]: I1127 16:20:17.158673 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fd4a36f0-fb55-45b1-a4d0-fd45fd3de731-ovsdbserver-sb\") pod \"dnsmasq-dns-5b946c75cc-b9t7n\" (UID: \"fd4a36f0-fb55-45b1-a4d0-fd45fd3de731\") " pod="openstack/dnsmasq-dns-5b946c75cc-b9t7n" Nov 27 16:20:17 crc kubenswrapper[4707]: I1127 16:20:17.158903 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fd4a36f0-fb55-45b1-a4d0-fd45fd3de731-dns-svc\") pod \"dnsmasq-dns-5b946c75cc-b9t7n\" (UID: \"fd4a36f0-fb55-45b1-a4d0-fd45fd3de731\") " pod="openstack/dnsmasq-dns-5b946c75cc-b9t7n" Nov 27 16:20:17 crc kubenswrapper[4707]: I1127 16:20:17.158984 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fd4a36f0-fb55-45b1-a4d0-fd45fd3de731-config\") pod \"dnsmasq-dns-5b946c75cc-b9t7n\" (UID: \"fd4a36f0-fb55-45b1-a4d0-fd45fd3de731\") " pod="openstack/dnsmasq-dns-5b946c75cc-b9t7n" Nov 27 16:20:17 crc kubenswrapper[4707]: I1127 16:20:17.159312 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fd4a36f0-fb55-45b1-a4d0-fd45fd3de731-ovsdbserver-nb\") pod \"dnsmasq-dns-5b946c75cc-b9t7n\" (UID: \"fd4a36f0-fb55-45b1-a4d0-fd45fd3de731\") " pod="openstack/dnsmasq-dns-5b946c75cc-b9t7n" Nov 27 16:20:17 crc kubenswrapper[4707]: I1127 16:20:17.172049 4707 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t7kr7\" (UniqueName: \"kubernetes.io/projected/fd4a36f0-fb55-45b1-a4d0-fd45fd3de731-kube-api-access-t7kr7\") pod \"dnsmasq-dns-5b946c75cc-b9t7n\" (UID: \"fd4a36f0-fb55-45b1-a4d0-fd45fd3de731\") " pod="openstack/dnsmasq-dns-5b946c75cc-b9t7n" Nov 27 16:20:17 crc kubenswrapper[4707]: I1127 16:20:17.332979 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5b946c75cc-b9t7n" Nov 27 16:20:17 crc kubenswrapper[4707]: I1127 16:20:17.608850 4707 generic.go:334] "Generic (PLEG): container finished" podID="a6c5bb46-81d2-4478-8a44-42c46ddaaffa" containerID="4abe399b9e6ebf2086362330961b71743ba3799fce812e482717eaa3ffa4e1b2" exitCode=0 Nov 27 16:20:17 crc kubenswrapper[4707]: I1127 16:20:17.609260 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-ktrrp" event={"ID":"a6c5bb46-81d2-4478-8a44-42c46ddaaffa","Type":"ContainerDied","Data":"4abe399b9e6ebf2086362330961b71743ba3799fce812e482717eaa3ffa4e1b2"} Nov 27 16:20:17 crc kubenswrapper[4707]: I1127 16:20:17.635553 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"4d67e130-f1e2-4fe8-9647-8725402a1cdd","Type":"ContainerStarted","Data":"231354df167c66bde816e511b4e401d434b5677339fb299ebab2e0c03d083579"} Nov 27 16:20:17 crc kubenswrapper[4707]: I1127 16:20:17.635594 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"4d67e130-f1e2-4fe8-9647-8725402a1cdd","Type":"ContainerStarted","Data":"57ff0f13c3db214868661c3e31a18df6756e9ffdeba67a8b112b8b355edda8bd"} Nov 27 16:20:17 crc kubenswrapper[4707]: I1127 16:20:17.635606 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"4d67e130-f1e2-4fe8-9647-8725402a1cdd","Type":"ContainerStarted","Data":"7de44615199151b0fc92906c4aeb2c1c375a862c6f6c20e683c271e3f268761c"} Nov 27 16:20:17 crc 
kubenswrapper[4707]: I1127 16:20:17.635614 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"4d67e130-f1e2-4fe8-9647-8725402a1cdd","Type":"ContainerStarted","Data":"3847640f864b0fd6770d9bc9b850268aef9803654237a3d9c5b846d181b71447"} Nov 27 16:20:17 crc kubenswrapper[4707]: I1127 16:20:17.797063 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5b946c75cc-b9t7n"] Nov 27 16:20:18 crc kubenswrapper[4707]: I1127 16:20:18.671479 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"4d67e130-f1e2-4fe8-9647-8725402a1cdd","Type":"ContainerStarted","Data":"cd1d76308bd7552939730a562a89745b408cb3b15d7bf1d735ec3f5e885d53c6"} Nov 27 16:20:18 crc kubenswrapper[4707]: I1127 16:20:18.671848 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"4d67e130-f1e2-4fe8-9647-8725402a1cdd","Type":"ContainerStarted","Data":"b94be8bb48a60fdff2cd14bb3180cc0372dc80f616ea0a2c1f73bb9a1ecf58c2"} Nov 27 16:20:18 crc kubenswrapper[4707]: I1127 16:20:18.674883 4707 generic.go:334] "Generic (PLEG): container finished" podID="fd4a36f0-fb55-45b1-a4d0-fd45fd3de731" containerID="47d0da891837cb78986aba7318980eddd5df284106643499b77ff5c7f29fc7e3" exitCode=0 Nov 27 16:20:18 crc kubenswrapper[4707]: I1127 16:20:18.674964 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b946c75cc-b9t7n" event={"ID":"fd4a36f0-fb55-45b1-a4d0-fd45fd3de731","Type":"ContainerDied","Data":"47d0da891837cb78986aba7318980eddd5df284106643499b77ff5c7f29fc7e3"} Nov 27 16:20:18 crc kubenswrapper[4707]: I1127 16:20:18.675011 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b946c75cc-b9t7n" event={"ID":"fd4a36f0-fb55-45b1-a4d0-fd45fd3de731","Type":"ContainerStarted","Data":"1a8b12b23932df4458f8ddd825a8655127aedd0e2fd2bd0838922c52f662c124"} Nov 27 16:20:18 crc kubenswrapper[4707]: I1127 16:20:18.720013 
4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-storage-0" podStartSLOduration=41.173162861 podStartE2EDuration="51.719994791s" podCreationTimestamp="2025-11-27 16:19:27 +0000 UTC" firstStartedPulling="2025-11-27 16:20:05.716168909 +0000 UTC m=+981.347617677" lastFinishedPulling="2025-11-27 16:20:16.263000839 +0000 UTC m=+991.894449607" observedRunningTime="2025-11-27 16:20:18.715285896 +0000 UTC m=+994.346734684" watchObservedRunningTime="2025-11-27 16:20:18.719994791 +0000 UTC m=+994.351443569" Nov 27 16:20:18 crc kubenswrapper[4707]: I1127 16:20:18.959752 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-ktrrp" Nov 27 16:20:19 crc kubenswrapper[4707]: I1127 16:20:19.019343 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5b946c75cc-b9t7n"] Nov 27 16:20:19 crc kubenswrapper[4707]: I1127 16:20:19.048710 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-74f6bcbc87-vfk4s"] Nov 27 16:20:19 crc kubenswrapper[4707]: E1127 16:20:19.049014 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a6c5bb46-81d2-4478-8a44-42c46ddaaffa" containerName="keystone-db-sync" Nov 27 16:20:19 crc kubenswrapper[4707]: I1127 16:20:19.049031 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="a6c5bb46-81d2-4478-8a44-42c46ddaaffa" containerName="keystone-db-sync" Nov 27 16:20:19 crc kubenswrapper[4707]: I1127 16:20:19.049206 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="a6c5bb46-81d2-4478-8a44-42c46ddaaffa" containerName="keystone-db-sync" Nov 27 16:20:19 crc kubenswrapper[4707]: I1127 16:20:19.050229 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-74f6bcbc87-vfk4s" Nov 27 16:20:19 crc kubenswrapper[4707]: I1127 16:20:19.052969 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-swift-storage-0" Nov 27 16:20:19 crc kubenswrapper[4707]: I1127 16:20:19.062665 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-74f6bcbc87-vfk4s"] Nov 27 16:20:19 crc kubenswrapper[4707]: I1127 16:20:19.096716 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kzjhb\" (UniqueName: \"kubernetes.io/projected/a6c5bb46-81d2-4478-8a44-42c46ddaaffa-kube-api-access-kzjhb\") pod \"a6c5bb46-81d2-4478-8a44-42c46ddaaffa\" (UID: \"a6c5bb46-81d2-4478-8a44-42c46ddaaffa\") " Nov 27 16:20:19 crc kubenswrapper[4707]: I1127 16:20:19.096874 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a6c5bb46-81d2-4478-8a44-42c46ddaaffa-config-data\") pod \"a6c5bb46-81d2-4478-8a44-42c46ddaaffa\" (UID: \"a6c5bb46-81d2-4478-8a44-42c46ddaaffa\") " Nov 27 16:20:19 crc kubenswrapper[4707]: I1127 16:20:19.096898 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a6c5bb46-81d2-4478-8a44-42c46ddaaffa-combined-ca-bundle\") pod \"a6c5bb46-81d2-4478-8a44-42c46ddaaffa\" (UID: \"a6c5bb46-81d2-4478-8a44-42c46ddaaffa\") " Nov 27 16:20:19 crc kubenswrapper[4707]: I1127 16:20:19.100934 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a6c5bb46-81d2-4478-8a44-42c46ddaaffa-kube-api-access-kzjhb" (OuterVolumeSpecName: "kube-api-access-kzjhb") pod "a6c5bb46-81d2-4478-8a44-42c46ddaaffa" (UID: "a6c5bb46-81d2-4478-8a44-42c46ddaaffa"). InnerVolumeSpecName "kube-api-access-kzjhb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 16:20:19 crc kubenswrapper[4707]: I1127 16:20:19.120282 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a6c5bb46-81d2-4478-8a44-42c46ddaaffa-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a6c5bb46-81d2-4478-8a44-42c46ddaaffa" (UID: "a6c5bb46-81d2-4478-8a44-42c46ddaaffa"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 16:20:19 crc kubenswrapper[4707]: I1127 16:20:19.139572 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a6c5bb46-81d2-4478-8a44-42c46ddaaffa-config-data" (OuterVolumeSpecName: "config-data") pod "a6c5bb46-81d2-4478-8a44-42c46ddaaffa" (UID: "a6c5bb46-81d2-4478-8a44-42c46ddaaffa"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 16:20:19 crc kubenswrapper[4707]: I1127 16:20:19.198346 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c87e3212-fa52-4ef1-b9b0-6f4f3819da16-dns-swift-storage-0\") pod \"dnsmasq-dns-74f6bcbc87-vfk4s\" (UID: \"c87e3212-fa52-4ef1-b9b0-6f4f3819da16\") " pod="openstack/dnsmasq-dns-74f6bcbc87-vfk4s" Nov 27 16:20:19 crc kubenswrapper[4707]: I1127 16:20:19.198441 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c87e3212-fa52-4ef1-b9b0-6f4f3819da16-ovsdbserver-nb\") pod \"dnsmasq-dns-74f6bcbc87-vfk4s\" (UID: \"c87e3212-fa52-4ef1-b9b0-6f4f3819da16\") " pod="openstack/dnsmasq-dns-74f6bcbc87-vfk4s" Nov 27 16:20:19 crc kubenswrapper[4707]: I1127 16:20:19.198481 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/c87e3212-fa52-4ef1-b9b0-6f4f3819da16-config\") pod \"dnsmasq-dns-74f6bcbc87-vfk4s\" (UID: \"c87e3212-fa52-4ef1-b9b0-6f4f3819da16\") " pod="openstack/dnsmasq-dns-74f6bcbc87-vfk4s" Nov 27 16:20:19 crc kubenswrapper[4707]: I1127 16:20:19.198648 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c87e3212-fa52-4ef1-b9b0-6f4f3819da16-ovsdbserver-sb\") pod \"dnsmasq-dns-74f6bcbc87-vfk4s\" (UID: \"c87e3212-fa52-4ef1-b9b0-6f4f3819da16\") " pod="openstack/dnsmasq-dns-74f6bcbc87-vfk4s" Nov 27 16:20:19 crc kubenswrapper[4707]: I1127 16:20:19.198696 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c87e3212-fa52-4ef1-b9b0-6f4f3819da16-dns-svc\") pod \"dnsmasq-dns-74f6bcbc87-vfk4s\" (UID: \"c87e3212-fa52-4ef1-b9b0-6f4f3819da16\") " pod="openstack/dnsmasq-dns-74f6bcbc87-vfk4s" Nov 27 16:20:19 crc kubenswrapper[4707]: I1127 16:20:19.198856 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lvs9v\" (UniqueName: \"kubernetes.io/projected/c87e3212-fa52-4ef1-b9b0-6f4f3819da16-kube-api-access-lvs9v\") pod \"dnsmasq-dns-74f6bcbc87-vfk4s\" (UID: \"c87e3212-fa52-4ef1-b9b0-6f4f3819da16\") " pod="openstack/dnsmasq-dns-74f6bcbc87-vfk4s" Nov 27 16:20:19 crc kubenswrapper[4707]: I1127 16:20:19.198985 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a6c5bb46-81d2-4478-8a44-42c46ddaaffa-config-data\") on node \"crc\" DevicePath \"\"" Nov 27 16:20:19 crc kubenswrapper[4707]: I1127 16:20:19.199001 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a6c5bb46-81d2-4478-8a44-42c46ddaaffa-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 27 16:20:19 crc 
kubenswrapper[4707]: I1127 16:20:19.199014 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kzjhb\" (UniqueName: \"kubernetes.io/projected/a6c5bb46-81d2-4478-8a44-42c46ddaaffa-kube-api-access-kzjhb\") on node \"crc\" DevicePath \"\"" Nov 27 16:20:19 crc kubenswrapper[4707]: I1127 16:20:19.301085 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c87e3212-fa52-4ef1-b9b0-6f4f3819da16-ovsdbserver-nb\") pod \"dnsmasq-dns-74f6bcbc87-vfk4s\" (UID: \"c87e3212-fa52-4ef1-b9b0-6f4f3819da16\") " pod="openstack/dnsmasq-dns-74f6bcbc87-vfk4s" Nov 27 16:20:19 crc kubenswrapper[4707]: I1127 16:20:19.301203 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c87e3212-fa52-4ef1-b9b0-6f4f3819da16-config\") pod \"dnsmasq-dns-74f6bcbc87-vfk4s\" (UID: \"c87e3212-fa52-4ef1-b9b0-6f4f3819da16\") " pod="openstack/dnsmasq-dns-74f6bcbc87-vfk4s" Nov 27 16:20:19 crc kubenswrapper[4707]: I1127 16:20:19.302223 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c87e3212-fa52-4ef1-b9b0-6f4f3819da16-ovsdbserver-nb\") pod \"dnsmasq-dns-74f6bcbc87-vfk4s\" (UID: \"c87e3212-fa52-4ef1-b9b0-6f4f3819da16\") " pod="openstack/dnsmasq-dns-74f6bcbc87-vfk4s" Nov 27 16:20:19 crc kubenswrapper[4707]: I1127 16:20:19.302227 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c87e3212-fa52-4ef1-b9b0-6f4f3819da16-ovsdbserver-sb\") pod \"dnsmasq-dns-74f6bcbc87-vfk4s\" (UID: \"c87e3212-fa52-4ef1-b9b0-6f4f3819da16\") " pod="openstack/dnsmasq-dns-74f6bcbc87-vfk4s" Nov 27 16:20:19 crc kubenswrapper[4707]: I1127 16:20:19.302304 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/c87e3212-fa52-4ef1-b9b0-6f4f3819da16-dns-svc\") pod \"dnsmasq-dns-74f6bcbc87-vfk4s\" (UID: \"c87e3212-fa52-4ef1-b9b0-6f4f3819da16\") " pod="openstack/dnsmasq-dns-74f6bcbc87-vfk4s" Nov 27 16:20:19 crc kubenswrapper[4707]: I1127 16:20:19.302331 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c87e3212-fa52-4ef1-b9b0-6f4f3819da16-config\") pod \"dnsmasq-dns-74f6bcbc87-vfk4s\" (UID: \"c87e3212-fa52-4ef1-b9b0-6f4f3819da16\") " pod="openstack/dnsmasq-dns-74f6bcbc87-vfk4s" Nov 27 16:20:19 crc kubenswrapper[4707]: I1127 16:20:19.302489 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lvs9v\" (UniqueName: \"kubernetes.io/projected/c87e3212-fa52-4ef1-b9b0-6f4f3819da16-kube-api-access-lvs9v\") pod \"dnsmasq-dns-74f6bcbc87-vfk4s\" (UID: \"c87e3212-fa52-4ef1-b9b0-6f4f3819da16\") " pod="openstack/dnsmasq-dns-74f6bcbc87-vfk4s" Nov 27 16:20:19 crc kubenswrapper[4707]: I1127 16:20:19.302575 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c87e3212-fa52-4ef1-b9b0-6f4f3819da16-dns-swift-storage-0\") pod \"dnsmasq-dns-74f6bcbc87-vfk4s\" (UID: \"c87e3212-fa52-4ef1-b9b0-6f4f3819da16\") " pod="openstack/dnsmasq-dns-74f6bcbc87-vfk4s" Nov 27 16:20:19 crc kubenswrapper[4707]: I1127 16:20:19.303428 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c87e3212-fa52-4ef1-b9b0-6f4f3819da16-ovsdbserver-sb\") pod \"dnsmasq-dns-74f6bcbc87-vfk4s\" (UID: \"c87e3212-fa52-4ef1-b9b0-6f4f3819da16\") " pod="openstack/dnsmasq-dns-74f6bcbc87-vfk4s" Nov 27 16:20:19 crc kubenswrapper[4707]: I1127 16:20:19.303446 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: 
\"kubernetes.io/configmap/c87e3212-fa52-4ef1-b9b0-6f4f3819da16-dns-swift-storage-0\") pod \"dnsmasq-dns-74f6bcbc87-vfk4s\" (UID: \"c87e3212-fa52-4ef1-b9b0-6f4f3819da16\") " pod="openstack/dnsmasq-dns-74f6bcbc87-vfk4s" Nov 27 16:20:19 crc kubenswrapper[4707]: I1127 16:20:19.303544 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c87e3212-fa52-4ef1-b9b0-6f4f3819da16-dns-svc\") pod \"dnsmasq-dns-74f6bcbc87-vfk4s\" (UID: \"c87e3212-fa52-4ef1-b9b0-6f4f3819da16\") " pod="openstack/dnsmasq-dns-74f6bcbc87-vfk4s" Nov 27 16:20:19 crc kubenswrapper[4707]: I1127 16:20:19.328181 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lvs9v\" (UniqueName: \"kubernetes.io/projected/c87e3212-fa52-4ef1-b9b0-6f4f3819da16-kube-api-access-lvs9v\") pod \"dnsmasq-dns-74f6bcbc87-vfk4s\" (UID: \"c87e3212-fa52-4ef1-b9b0-6f4f3819da16\") " pod="openstack/dnsmasq-dns-74f6bcbc87-vfk4s" Nov 27 16:20:19 crc kubenswrapper[4707]: I1127 16:20:19.364783 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-74f6bcbc87-vfk4s" Nov 27 16:20:19 crc kubenswrapper[4707]: I1127 16:20:19.647446 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-t7wqv" Nov 27 16:20:19 crc kubenswrapper[4707]: I1127 16:20:19.691506 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-ktrrp" Nov 27 16:20:19 crc kubenswrapper[4707]: I1127 16:20:19.691789 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-ktrrp" event={"ID":"a6c5bb46-81d2-4478-8a44-42c46ddaaffa","Type":"ContainerDied","Data":"ba2ef2ff00eec2c5a5232895e19e0e65cb090969deeec0636244f305897b6049"} Nov 27 16:20:19 crc kubenswrapper[4707]: I1127 16:20:19.691842 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ba2ef2ff00eec2c5a5232895e19e0e65cb090969deeec0636244f305897b6049" Nov 27 16:20:19 crc kubenswrapper[4707]: I1127 16:20:19.696542 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b946c75cc-b9t7n" event={"ID":"fd4a36f0-fb55-45b1-a4d0-fd45fd3de731","Type":"ContainerStarted","Data":"c1f304ef8d212425af3cd04fe0a9d34bc617248b79ce4237464f406c73d90b4a"} Nov 27 16:20:19 crc kubenswrapper[4707]: I1127 16:20:19.696757 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5b946c75cc-b9t7n" Nov 27 16:20:19 crc kubenswrapper[4707]: I1127 16:20:19.711316 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-t7wqv" Nov 27 16:20:19 crc kubenswrapper[4707]: I1127 16:20:19.727190 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5b946c75cc-b9t7n" podStartSLOduration=3.727173536 podStartE2EDuration="3.727173536s" podCreationTimestamp="2025-11-27 16:20:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-27 16:20:19.726542971 +0000 UTC m=+995.357991749" watchObservedRunningTime="2025-11-27 16:20:19.727173536 +0000 UTC m=+995.358622304" Nov 27 16:20:19 crc kubenswrapper[4707]: I1127 16:20:19.868049 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-74f6bcbc87-vfk4s"] 
Nov 27 16:20:19 crc kubenswrapper[4707]: I1127 16:20:19.917515 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-74f6bcbc87-vfk4s"] Nov 27 16:20:19 crc kubenswrapper[4707]: I1127 16:20:19.946424 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-t7wqv"] Nov 27 16:20:19 crc kubenswrapper[4707]: I1127 16:20:19.957627 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-847c4cc679-ntngj"] Nov 27 16:20:19 crc kubenswrapper[4707]: I1127 16:20:19.960157 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-847c4cc679-ntngj" Nov 27 16:20:19 crc kubenswrapper[4707]: I1127 16:20:19.964691 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-847c4cc679-ntngj"] Nov 27 16:20:19 crc kubenswrapper[4707]: I1127 16:20:19.973005 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-x8n5z"] Nov 27 16:20:19 crc kubenswrapper[4707]: I1127 16:20:19.974059 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-x8n5z" Nov 27 16:20:19 crc kubenswrapper[4707]: I1127 16:20:19.976727 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Nov 27 16:20:19 crc kubenswrapper[4707]: I1127 16:20:19.976874 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Nov 27 16:20:19 crc kubenswrapper[4707]: I1127 16:20:19.977063 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Nov 27 16:20:19 crc kubenswrapper[4707]: I1127 16:20:19.977160 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-drvd4" Nov 27 16:20:19 crc kubenswrapper[4707]: I1127 16:20:19.977305 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Nov 27 16:20:20 crc kubenswrapper[4707]: I1127 16:20:20.027413 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-x8n5z"] Nov 27 16:20:20 crc kubenswrapper[4707]: I1127 16:20:20.052759 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-db-sync-5k4zv"] Nov 27 16:20:20 crc kubenswrapper[4707]: I1127 16:20:20.053785 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-db-sync-5k4zv" Nov 27 16:20:20 crc kubenswrapper[4707]: I1127 16:20:20.055473 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-config-data" Nov 27 16:20:20 crc kubenswrapper[4707]: I1127 16:20:20.055736 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-heat-dockercfg-s8xhq" Nov 27 16:20:20 crc kubenswrapper[4707]: I1127 16:20:20.070050 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-db-sync-5k4zv"] Nov 27 16:20:20 crc kubenswrapper[4707]: I1127 16:20:20.126403 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/06bc7709-2c0c-4b87-8e4f-11e2e1a1726c-scripts\") pod \"keystone-bootstrap-x8n5z\" (UID: \"06bc7709-2c0c-4b87-8e4f-11e2e1a1726c\") " pod="openstack/keystone-bootstrap-x8n5z" Nov 27 16:20:20 crc kubenswrapper[4707]: I1127 16:20:20.126441 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ddm5r\" (UniqueName: \"kubernetes.io/projected/06bc7709-2c0c-4b87-8e4f-11e2e1a1726c-kube-api-access-ddm5r\") pod \"keystone-bootstrap-x8n5z\" (UID: \"06bc7709-2c0c-4b87-8e4f-11e2e1a1726c\") " pod="openstack/keystone-bootstrap-x8n5z" Nov 27 16:20:20 crc kubenswrapper[4707]: I1127 16:20:20.126491 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a90b9ea4-6cc9-48e1-b89c-bfa648577c70-dns-swift-storage-0\") pod \"dnsmasq-dns-847c4cc679-ntngj\" (UID: \"a90b9ea4-6cc9-48e1-b89c-bfa648577c70\") " pod="openstack/dnsmasq-dns-847c4cc679-ntngj" Nov 27 16:20:20 crc kubenswrapper[4707]: I1127 16:20:20.126543 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: 
\"kubernetes.io/secret/06bc7709-2c0c-4b87-8e4f-11e2e1a1726c-fernet-keys\") pod \"keystone-bootstrap-x8n5z\" (UID: \"06bc7709-2c0c-4b87-8e4f-11e2e1a1726c\") " pod="openstack/keystone-bootstrap-x8n5z" Nov 27 16:20:20 crc kubenswrapper[4707]: I1127 16:20:20.126566 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/06bc7709-2c0c-4b87-8e4f-11e2e1a1726c-credential-keys\") pod \"keystone-bootstrap-x8n5z\" (UID: \"06bc7709-2c0c-4b87-8e4f-11e2e1a1726c\") " pod="openstack/keystone-bootstrap-x8n5z" Nov 27 16:20:20 crc kubenswrapper[4707]: I1127 16:20:20.126619 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a90b9ea4-6cc9-48e1-b89c-bfa648577c70-ovsdbserver-nb\") pod \"dnsmasq-dns-847c4cc679-ntngj\" (UID: \"a90b9ea4-6cc9-48e1-b89c-bfa648577c70\") " pod="openstack/dnsmasq-dns-847c4cc679-ntngj" Nov 27 16:20:20 crc kubenswrapper[4707]: I1127 16:20:20.126644 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a90b9ea4-6cc9-48e1-b89c-bfa648577c70-dns-svc\") pod \"dnsmasq-dns-847c4cc679-ntngj\" (UID: \"a90b9ea4-6cc9-48e1-b89c-bfa648577c70\") " pod="openstack/dnsmasq-dns-847c4cc679-ntngj" Nov 27 16:20:20 crc kubenswrapper[4707]: I1127 16:20:20.126661 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/06bc7709-2c0c-4b87-8e4f-11e2e1a1726c-config-data\") pod \"keystone-bootstrap-x8n5z\" (UID: \"06bc7709-2c0c-4b87-8e4f-11e2e1a1726c\") " pod="openstack/keystone-bootstrap-x8n5z" Nov 27 16:20:20 crc kubenswrapper[4707]: I1127 16:20:20.126692 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vwz72\" (UniqueName: 
\"kubernetes.io/projected/a90b9ea4-6cc9-48e1-b89c-bfa648577c70-kube-api-access-vwz72\") pod \"dnsmasq-dns-847c4cc679-ntngj\" (UID: \"a90b9ea4-6cc9-48e1-b89c-bfa648577c70\") " pod="openstack/dnsmasq-dns-847c4cc679-ntngj" Nov 27 16:20:20 crc kubenswrapper[4707]: I1127 16:20:20.126721 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a90b9ea4-6cc9-48e1-b89c-bfa648577c70-config\") pod \"dnsmasq-dns-847c4cc679-ntngj\" (UID: \"a90b9ea4-6cc9-48e1-b89c-bfa648577c70\") " pod="openstack/dnsmasq-dns-847c4cc679-ntngj" Nov 27 16:20:20 crc kubenswrapper[4707]: I1127 16:20:20.126740 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a90b9ea4-6cc9-48e1-b89c-bfa648577c70-ovsdbserver-sb\") pod \"dnsmasq-dns-847c4cc679-ntngj\" (UID: \"a90b9ea4-6cc9-48e1-b89c-bfa648577c70\") " pod="openstack/dnsmasq-dns-847c4cc679-ntngj" Nov 27 16:20:20 crc kubenswrapper[4707]: I1127 16:20:20.126777 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/06bc7709-2c0c-4b87-8e4f-11e2e1a1726c-combined-ca-bundle\") pod \"keystone-bootstrap-x8n5z\" (UID: \"06bc7709-2c0c-4b87-8e4f-11e2e1a1726c\") " pod="openstack/keystone-bootstrap-x8n5z" Nov 27 16:20:20 crc kubenswrapper[4707]: I1127 16:20:20.144009 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-sync-gb8rd"] Nov 27 16:20:20 crc kubenswrapper[4707]: I1127 16:20:20.145323 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-gb8rd" Nov 27 16:20:20 crc kubenswrapper[4707]: I1127 16:20:20.149212 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Nov 27 16:20:20 crc kubenswrapper[4707]: I1127 16:20:20.149483 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Nov 27 16:20:20 crc kubenswrapper[4707]: I1127 16:20:20.149871 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-89cbx" Nov 27 16:20:20 crc kubenswrapper[4707]: I1127 16:20:20.156494 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-gb8rd"] Nov 27 16:20:20 crc kubenswrapper[4707]: I1127 16:20:20.211603 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Nov 27 16:20:20 crc kubenswrapper[4707]: I1127 16:20:20.213642 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Nov 27 16:20:20 crc kubenswrapper[4707]: I1127 16:20:20.216092 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Nov 27 16:20:20 crc kubenswrapper[4707]: I1127 16:20:20.216301 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Nov 27 16:20:20 crc kubenswrapper[4707]: I1127 16:20:20.228493 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Nov 27 16:20:20 crc kubenswrapper[4707]: I1127 16:20:20.229147 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a90b9ea4-6cc9-48e1-b89c-bfa648577c70-ovsdbserver-sb\") pod \"dnsmasq-dns-847c4cc679-ntngj\" (UID: \"a90b9ea4-6cc9-48e1-b89c-bfa648577c70\") " pod="openstack/dnsmasq-dns-847c4cc679-ntngj" Nov 27 16:20:20 crc kubenswrapper[4707]: I1127 16:20:20.229192 4707 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1b8ce5ef-c2af-4e15-9677-8e878b96c4de-scripts\") pod \"cinder-db-sync-gb8rd\" (UID: \"1b8ce5ef-c2af-4e15-9677-8e878b96c4de\") " pod="openstack/cinder-db-sync-gb8rd" Nov 27 16:20:20 crc kubenswrapper[4707]: I1127 16:20:20.229226 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/06bc7709-2c0c-4b87-8e4f-11e2e1a1726c-combined-ca-bundle\") pod \"keystone-bootstrap-x8n5z\" (UID: \"06bc7709-2c0c-4b87-8e4f-11e2e1a1726c\") " pod="openstack/keystone-bootstrap-x8n5z" Nov 27 16:20:20 crc kubenswrapper[4707]: I1127 16:20:20.229257 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p5j7f\" (UniqueName: \"kubernetes.io/projected/1c445b1d-7e63-48cc-83ee-c4841074701c-kube-api-access-p5j7f\") pod \"heat-db-sync-5k4zv\" (UID: \"1c445b1d-7e63-48cc-83ee-c4841074701c\") " pod="openstack/heat-db-sync-5k4zv" Nov 27 16:20:20 crc kubenswrapper[4707]: I1127 16:20:20.229289 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/1b8ce5ef-c2af-4e15-9677-8e878b96c4de-etc-machine-id\") pod \"cinder-db-sync-gb8rd\" (UID: \"1b8ce5ef-c2af-4e15-9677-8e878b96c4de\") " pod="openstack/cinder-db-sync-gb8rd" Nov 27 16:20:20 crc kubenswrapper[4707]: I1127 16:20:20.229313 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/1b8ce5ef-c2af-4e15-9677-8e878b96c4de-db-sync-config-data\") pod \"cinder-db-sync-gb8rd\" (UID: \"1b8ce5ef-c2af-4e15-9677-8e878b96c4de\") " pod="openstack/cinder-db-sync-gb8rd" Nov 27 16:20:20 crc kubenswrapper[4707]: I1127 16:20:20.229333 4707 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1c445b1d-7e63-48cc-83ee-c4841074701c-config-data\") pod \"heat-db-sync-5k4zv\" (UID: \"1c445b1d-7e63-48cc-83ee-c4841074701c\") " pod="openstack/heat-db-sync-5k4zv" Nov 27 16:20:20 crc kubenswrapper[4707]: I1127 16:20:20.229357 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/06bc7709-2c0c-4b87-8e4f-11e2e1a1726c-scripts\") pod \"keystone-bootstrap-x8n5z\" (UID: \"06bc7709-2c0c-4b87-8e4f-11e2e1a1726c\") " pod="openstack/keystone-bootstrap-x8n5z" Nov 27 16:20:20 crc kubenswrapper[4707]: I1127 16:20:20.229383 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ddm5r\" (UniqueName: \"kubernetes.io/projected/06bc7709-2c0c-4b87-8e4f-11e2e1a1726c-kube-api-access-ddm5r\") pod \"keystone-bootstrap-x8n5z\" (UID: \"06bc7709-2c0c-4b87-8e4f-11e2e1a1726c\") " pod="openstack/keystone-bootstrap-x8n5z" Nov 27 16:20:20 crc kubenswrapper[4707]: I1127 16:20:20.229415 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a90b9ea4-6cc9-48e1-b89c-bfa648577c70-dns-swift-storage-0\") pod \"dnsmasq-dns-847c4cc679-ntngj\" (UID: \"a90b9ea4-6cc9-48e1-b89c-bfa648577c70\") " pod="openstack/dnsmasq-dns-847c4cc679-ntngj" Nov 27 16:20:20 crc kubenswrapper[4707]: I1127 16:20:20.229447 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/06bc7709-2c0c-4b87-8e4f-11e2e1a1726c-fernet-keys\") pod \"keystone-bootstrap-x8n5z\" (UID: \"06bc7709-2c0c-4b87-8e4f-11e2e1a1726c\") " pod="openstack/keystone-bootstrap-x8n5z" Nov 27 16:20:20 crc kubenswrapper[4707]: I1127 16:20:20.229467 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: 
\"kubernetes.io/secret/06bc7709-2c0c-4b87-8e4f-11e2e1a1726c-credential-keys\") pod \"keystone-bootstrap-x8n5z\" (UID: \"06bc7709-2c0c-4b87-8e4f-11e2e1a1726c\") " pod="openstack/keystone-bootstrap-x8n5z" Nov 27 16:20:20 crc kubenswrapper[4707]: I1127 16:20:20.229497 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a90b9ea4-6cc9-48e1-b89c-bfa648577c70-ovsdbserver-nb\") pod \"dnsmasq-dns-847c4cc679-ntngj\" (UID: \"a90b9ea4-6cc9-48e1-b89c-bfa648577c70\") " pod="openstack/dnsmasq-dns-847c4cc679-ntngj" Nov 27 16:20:20 crc kubenswrapper[4707]: I1127 16:20:20.229518 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ts9jh\" (UniqueName: \"kubernetes.io/projected/1b8ce5ef-c2af-4e15-9677-8e878b96c4de-kube-api-access-ts9jh\") pod \"cinder-db-sync-gb8rd\" (UID: \"1b8ce5ef-c2af-4e15-9677-8e878b96c4de\") " pod="openstack/cinder-db-sync-gb8rd" Nov 27 16:20:20 crc kubenswrapper[4707]: I1127 16:20:20.229535 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a90b9ea4-6cc9-48e1-b89c-bfa648577c70-dns-svc\") pod \"dnsmasq-dns-847c4cc679-ntngj\" (UID: \"a90b9ea4-6cc9-48e1-b89c-bfa648577c70\") " pod="openstack/dnsmasq-dns-847c4cc679-ntngj" Nov 27 16:20:20 crc kubenswrapper[4707]: I1127 16:20:20.229554 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/06bc7709-2c0c-4b87-8e4f-11e2e1a1726c-config-data\") pod \"keystone-bootstrap-x8n5z\" (UID: \"06bc7709-2c0c-4b87-8e4f-11e2e1a1726c\") " pod="openstack/keystone-bootstrap-x8n5z" Nov 27 16:20:20 crc kubenswrapper[4707]: I1127 16:20:20.229571 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vwz72\" (UniqueName: 
\"kubernetes.io/projected/a90b9ea4-6cc9-48e1-b89c-bfa648577c70-kube-api-access-vwz72\") pod \"dnsmasq-dns-847c4cc679-ntngj\" (UID: \"a90b9ea4-6cc9-48e1-b89c-bfa648577c70\") " pod="openstack/dnsmasq-dns-847c4cc679-ntngj" Nov 27 16:20:20 crc kubenswrapper[4707]: I1127 16:20:20.229593 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1b8ce5ef-c2af-4e15-9677-8e878b96c4de-config-data\") pod \"cinder-db-sync-gb8rd\" (UID: \"1b8ce5ef-c2af-4e15-9677-8e878b96c4de\") " pod="openstack/cinder-db-sync-gb8rd" Nov 27 16:20:20 crc kubenswrapper[4707]: I1127 16:20:20.229615 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1b8ce5ef-c2af-4e15-9677-8e878b96c4de-combined-ca-bundle\") pod \"cinder-db-sync-gb8rd\" (UID: \"1b8ce5ef-c2af-4e15-9677-8e878b96c4de\") " pod="openstack/cinder-db-sync-gb8rd" Nov 27 16:20:20 crc kubenswrapper[4707]: I1127 16:20:20.229632 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a90b9ea4-6cc9-48e1-b89c-bfa648577c70-config\") pod \"dnsmasq-dns-847c4cc679-ntngj\" (UID: \"a90b9ea4-6cc9-48e1-b89c-bfa648577c70\") " pod="openstack/dnsmasq-dns-847c4cc679-ntngj" Nov 27 16:20:20 crc kubenswrapper[4707]: I1127 16:20:20.229649 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c445b1d-7e63-48cc-83ee-c4841074701c-combined-ca-bundle\") pod \"heat-db-sync-5k4zv\" (UID: \"1c445b1d-7e63-48cc-83ee-c4841074701c\") " pod="openstack/heat-db-sync-5k4zv" Nov 27 16:20:20 crc kubenswrapper[4707]: I1127 16:20:20.231893 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/a90b9ea4-6cc9-48e1-b89c-bfa648577c70-ovsdbserver-sb\") pod \"dnsmasq-dns-847c4cc679-ntngj\" (UID: \"a90b9ea4-6cc9-48e1-b89c-bfa648577c70\") " pod="openstack/dnsmasq-dns-847c4cc679-ntngj" Nov 27 16:20:20 crc kubenswrapper[4707]: I1127 16:20:20.232974 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a90b9ea4-6cc9-48e1-b89c-bfa648577c70-ovsdbserver-nb\") pod \"dnsmasq-dns-847c4cc679-ntngj\" (UID: \"a90b9ea4-6cc9-48e1-b89c-bfa648577c70\") " pod="openstack/dnsmasq-dns-847c4cc679-ntngj" Nov 27 16:20:20 crc kubenswrapper[4707]: I1127 16:20:20.235044 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/06bc7709-2c0c-4b87-8e4f-11e2e1a1726c-config-data\") pod \"keystone-bootstrap-x8n5z\" (UID: \"06bc7709-2c0c-4b87-8e4f-11e2e1a1726c\") " pod="openstack/keystone-bootstrap-x8n5z" Nov 27 16:20:20 crc kubenswrapper[4707]: I1127 16:20:20.235639 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a90b9ea4-6cc9-48e1-b89c-bfa648577c70-dns-svc\") pod \"dnsmasq-dns-847c4cc679-ntngj\" (UID: \"a90b9ea4-6cc9-48e1-b89c-bfa648577c70\") " pod="openstack/dnsmasq-dns-847c4cc679-ntngj" Nov 27 16:20:20 crc kubenswrapper[4707]: I1127 16:20:20.236325 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/06bc7709-2c0c-4b87-8e4f-11e2e1a1726c-fernet-keys\") pod \"keystone-bootstrap-x8n5z\" (UID: \"06bc7709-2c0c-4b87-8e4f-11e2e1a1726c\") " pod="openstack/keystone-bootstrap-x8n5z" Nov 27 16:20:20 crc kubenswrapper[4707]: I1127 16:20:20.236706 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/06bc7709-2c0c-4b87-8e4f-11e2e1a1726c-combined-ca-bundle\") pod \"keystone-bootstrap-x8n5z\" (UID: 
\"06bc7709-2c0c-4b87-8e4f-11e2e1a1726c\") " pod="openstack/keystone-bootstrap-x8n5z" Nov 27 16:20:20 crc kubenswrapper[4707]: I1127 16:20:20.236954 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a90b9ea4-6cc9-48e1-b89c-bfa648577c70-dns-swift-storage-0\") pod \"dnsmasq-dns-847c4cc679-ntngj\" (UID: \"a90b9ea4-6cc9-48e1-b89c-bfa648577c70\") " pod="openstack/dnsmasq-dns-847c4cc679-ntngj" Nov 27 16:20:20 crc kubenswrapper[4707]: I1127 16:20:20.238751 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a90b9ea4-6cc9-48e1-b89c-bfa648577c70-config\") pod \"dnsmasq-dns-847c4cc679-ntngj\" (UID: \"a90b9ea4-6cc9-48e1-b89c-bfa648577c70\") " pod="openstack/dnsmasq-dns-847c4cc679-ntngj" Nov 27 16:20:20 crc kubenswrapper[4707]: I1127 16:20:20.259979 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ddm5r\" (UniqueName: \"kubernetes.io/projected/06bc7709-2c0c-4b87-8e4f-11e2e1a1726c-kube-api-access-ddm5r\") pod \"keystone-bootstrap-x8n5z\" (UID: \"06bc7709-2c0c-4b87-8e4f-11e2e1a1726c\") " pod="openstack/keystone-bootstrap-x8n5z" Nov 27 16:20:20 crc kubenswrapper[4707]: I1127 16:20:20.260045 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-sync-ftcdx"] Nov 27 16:20:20 crc kubenswrapper[4707]: I1127 16:20:20.262037 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vwz72\" (UniqueName: \"kubernetes.io/projected/a90b9ea4-6cc9-48e1-b89c-bfa648577c70-kube-api-access-vwz72\") pod \"dnsmasq-dns-847c4cc679-ntngj\" (UID: \"a90b9ea4-6cc9-48e1-b89c-bfa648577c70\") " pod="openstack/dnsmasq-dns-847c4cc679-ntngj" Nov 27 16:20:20 crc kubenswrapper[4707]: I1127 16:20:20.262297 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: 
\"kubernetes.io/secret/06bc7709-2c0c-4b87-8e4f-11e2e1a1726c-credential-keys\") pod \"keystone-bootstrap-x8n5z\" (UID: \"06bc7709-2c0c-4b87-8e4f-11e2e1a1726c\") " pod="openstack/keystone-bootstrap-x8n5z" Nov 27 16:20:20 crc kubenswrapper[4707]: I1127 16:20:20.266519 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-ftcdx" Nov 27 16:20:20 crc kubenswrapper[4707]: I1127 16:20:20.268778 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Nov 27 16:20:20 crc kubenswrapper[4707]: I1127 16:20:20.269139 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Nov 27 16:20:20 crc kubenswrapper[4707]: I1127 16:20:20.269343 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-qc5rx" Nov 27 16:20:20 crc kubenswrapper[4707]: I1127 16:20:20.276521 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-ftcdx"] Nov 27 16:20:20 crc kubenswrapper[4707]: I1127 16:20:20.287082 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/06bc7709-2c0c-4b87-8e4f-11e2e1a1726c-scripts\") pod \"keystone-bootstrap-x8n5z\" (UID: \"06bc7709-2c0c-4b87-8e4f-11e2e1a1726c\") " pod="openstack/keystone-bootstrap-x8n5z" Nov 27 16:20:20 crc kubenswrapper[4707]: I1127 16:20:20.320481 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-sync-qzltn"] Nov 27 16:20:20 crc kubenswrapper[4707]: I1127 16:20:20.321592 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-qzltn" Nov 27 16:20:20 crc kubenswrapper[4707]: I1127 16:20:20.328954 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Nov 27 16:20:20 crc kubenswrapper[4707]: I1127 16:20:20.329168 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-gcgkn" Nov 27 16:20:20 crc kubenswrapper[4707]: I1127 16:20:20.330550 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a49cd9fc-1364-454e-af11-bbf64e43e56d-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"a49cd9fc-1364-454e-af11-bbf64e43e56d\") " pod="openstack/ceilometer-0" Nov 27 16:20:20 crc kubenswrapper[4707]: I1127 16:20:20.330574 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a49cd9fc-1364-454e-af11-bbf64e43e56d-log-httpd\") pod \"ceilometer-0\" (UID: \"a49cd9fc-1364-454e-af11-bbf64e43e56d\") " pod="openstack/ceilometer-0" Nov 27 16:20:20 crc kubenswrapper[4707]: I1127 16:20:20.330593 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a49cd9fc-1364-454e-af11-bbf64e43e56d-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"a49cd9fc-1364-454e-af11-bbf64e43e56d\") " pod="openstack/ceilometer-0" Nov 27 16:20:20 crc kubenswrapper[4707]: I1127 16:20:20.330624 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ts9jh\" (UniqueName: \"kubernetes.io/projected/1b8ce5ef-c2af-4e15-9677-8e878b96c4de-kube-api-access-ts9jh\") pod \"cinder-db-sync-gb8rd\" (UID: \"1b8ce5ef-c2af-4e15-9677-8e878b96c4de\") " pod="openstack/cinder-db-sync-gb8rd" Nov 27 16:20:20 crc kubenswrapper[4707]: I1127 16:20:20.330655 4707 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1b8ce5ef-c2af-4e15-9677-8e878b96c4de-config-data\") pod \"cinder-db-sync-gb8rd\" (UID: \"1b8ce5ef-c2af-4e15-9677-8e878b96c4de\") " pod="openstack/cinder-db-sync-gb8rd" Nov 27 16:20:20 crc kubenswrapper[4707]: I1127 16:20:20.330673 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-spsnj\" (UniqueName: \"kubernetes.io/projected/a49cd9fc-1364-454e-af11-bbf64e43e56d-kube-api-access-spsnj\") pod \"ceilometer-0\" (UID: \"a49cd9fc-1364-454e-af11-bbf64e43e56d\") " pod="openstack/ceilometer-0" Nov 27 16:20:20 crc kubenswrapper[4707]: I1127 16:20:20.330694 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1b8ce5ef-c2af-4e15-9677-8e878b96c4de-combined-ca-bundle\") pod \"cinder-db-sync-gb8rd\" (UID: \"1b8ce5ef-c2af-4e15-9677-8e878b96c4de\") " pod="openstack/cinder-db-sync-gb8rd" Nov 27 16:20:20 crc kubenswrapper[4707]: I1127 16:20:20.330714 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c445b1d-7e63-48cc-83ee-c4841074701c-combined-ca-bundle\") pod \"heat-db-sync-5k4zv\" (UID: \"1c445b1d-7e63-48cc-83ee-c4841074701c\") " pod="openstack/heat-db-sync-5k4zv" Nov 27 16:20:20 crc kubenswrapper[4707]: I1127 16:20:20.330733 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a49cd9fc-1364-454e-af11-bbf64e43e56d-run-httpd\") pod \"ceilometer-0\" (UID: \"a49cd9fc-1364-454e-af11-bbf64e43e56d\") " pod="openstack/ceilometer-0" Nov 27 16:20:20 crc kubenswrapper[4707]: I1127 16:20:20.330749 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/1b8ce5ef-c2af-4e15-9677-8e878b96c4de-scripts\") pod \"cinder-db-sync-gb8rd\" (UID: \"1b8ce5ef-c2af-4e15-9677-8e878b96c4de\") " pod="openstack/cinder-db-sync-gb8rd" Nov 27 16:20:20 crc kubenswrapper[4707]: I1127 16:20:20.330776 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p5j7f\" (UniqueName: \"kubernetes.io/projected/1c445b1d-7e63-48cc-83ee-c4841074701c-kube-api-access-p5j7f\") pod \"heat-db-sync-5k4zv\" (UID: \"1c445b1d-7e63-48cc-83ee-c4841074701c\") " pod="openstack/heat-db-sync-5k4zv" Nov 27 16:20:20 crc kubenswrapper[4707]: I1127 16:20:20.330804 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/1b8ce5ef-c2af-4e15-9677-8e878b96c4de-etc-machine-id\") pod \"cinder-db-sync-gb8rd\" (UID: \"1b8ce5ef-c2af-4e15-9677-8e878b96c4de\") " pod="openstack/cinder-db-sync-gb8rd" Nov 27 16:20:20 crc kubenswrapper[4707]: I1127 16:20:20.330836 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/1b8ce5ef-c2af-4e15-9677-8e878b96c4de-db-sync-config-data\") pod \"cinder-db-sync-gb8rd\" (UID: \"1b8ce5ef-c2af-4e15-9677-8e878b96c4de\") " pod="openstack/cinder-db-sync-gb8rd" Nov 27 16:20:20 crc kubenswrapper[4707]: I1127 16:20:20.330852 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1c445b1d-7e63-48cc-83ee-c4841074701c-config-data\") pod \"heat-db-sync-5k4zv\" (UID: \"1c445b1d-7e63-48cc-83ee-c4841074701c\") " pod="openstack/heat-db-sync-5k4zv" Nov 27 16:20:20 crc kubenswrapper[4707]: I1127 16:20:20.330873 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a49cd9fc-1364-454e-af11-bbf64e43e56d-scripts\") pod \"ceilometer-0\" (UID: 
\"a49cd9fc-1364-454e-af11-bbf64e43e56d\") " pod="openstack/ceilometer-0" Nov 27 16:20:20 crc kubenswrapper[4707]: I1127 16:20:20.330899 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a49cd9fc-1364-454e-af11-bbf64e43e56d-config-data\") pod \"ceilometer-0\" (UID: \"a49cd9fc-1364-454e-af11-bbf64e43e56d\") " pod="openstack/ceilometer-0" Nov 27 16:20:20 crc kubenswrapper[4707]: I1127 16:20:20.331594 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/1b8ce5ef-c2af-4e15-9677-8e878b96c4de-etc-machine-id\") pod \"cinder-db-sync-gb8rd\" (UID: \"1b8ce5ef-c2af-4e15-9677-8e878b96c4de\") " pod="openstack/cinder-db-sync-gb8rd" Nov 27 16:20:20 crc kubenswrapper[4707]: I1127 16:20:20.338114 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1c445b1d-7e63-48cc-83ee-c4841074701c-config-data\") pod \"heat-db-sync-5k4zv\" (UID: \"1c445b1d-7e63-48cc-83ee-c4841074701c\") " pod="openstack/heat-db-sync-5k4zv" Nov 27 16:20:20 crc kubenswrapper[4707]: I1127 16:20:20.338185 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/1b8ce5ef-c2af-4e15-9677-8e878b96c4de-db-sync-config-data\") pod \"cinder-db-sync-gb8rd\" (UID: \"1b8ce5ef-c2af-4e15-9677-8e878b96c4de\") " pod="openstack/cinder-db-sync-gb8rd" Nov 27 16:20:20 crc kubenswrapper[4707]: I1127 16:20:20.347254 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1b8ce5ef-c2af-4e15-9677-8e878b96c4de-config-data\") pod \"cinder-db-sync-gb8rd\" (UID: \"1b8ce5ef-c2af-4e15-9677-8e878b96c4de\") " pod="openstack/cinder-db-sync-gb8rd" Nov 27 16:20:20 crc kubenswrapper[4707]: I1127 16:20:20.350180 4707 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c445b1d-7e63-48cc-83ee-c4841074701c-combined-ca-bundle\") pod \"heat-db-sync-5k4zv\" (UID: \"1c445b1d-7e63-48cc-83ee-c4841074701c\") " pod="openstack/heat-db-sync-5k4zv" Nov 27 16:20:20 crc kubenswrapper[4707]: I1127 16:20:20.350773 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1b8ce5ef-c2af-4e15-9677-8e878b96c4de-combined-ca-bundle\") pod \"cinder-db-sync-gb8rd\" (UID: \"1b8ce5ef-c2af-4e15-9677-8e878b96c4de\") " pod="openstack/cinder-db-sync-gb8rd" Nov 27 16:20:20 crc kubenswrapper[4707]: I1127 16:20:20.350975 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1b8ce5ef-c2af-4e15-9677-8e878b96c4de-scripts\") pod \"cinder-db-sync-gb8rd\" (UID: \"1b8ce5ef-c2af-4e15-9677-8e878b96c4de\") " pod="openstack/cinder-db-sync-gb8rd" Nov 27 16:20:20 crc kubenswrapper[4707]: I1127 16:20:20.359483 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-qzltn"] Nov 27 16:20:20 crc kubenswrapper[4707]: I1127 16:20:20.360625 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p5j7f\" (UniqueName: \"kubernetes.io/projected/1c445b1d-7e63-48cc-83ee-c4841074701c-kube-api-access-p5j7f\") pod \"heat-db-sync-5k4zv\" (UID: \"1c445b1d-7e63-48cc-83ee-c4841074701c\") " pod="openstack/heat-db-sync-5k4zv" Nov 27 16:20:20 crc kubenswrapper[4707]: I1127 16:20:20.365059 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-sync-hmbzj"] Nov 27 16:20:20 crc kubenswrapper[4707]: I1127 16:20:20.374737 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-hmbzj" Nov 27 16:20:20 crc kubenswrapper[4707]: I1127 16:20:20.384481 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ts9jh\" (UniqueName: \"kubernetes.io/projected/1b8ce5ef-c2af-4e15-9677-8e878b96c4de-kube-api-access-ts9jh\") pod \"cinder-db-sync-gb8rd\" (UID: \"1b8ce5ef-c2af-4e15-9677-8e878b96c4de\") " pod="openstack/cinder-db-sync-gb8rd" Nov 27 16:20:20 crc kubenswrapper[4707]: I1127 16:20:20.384970 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-vw7rm" Nov 27 16:20:20 crc kubenswrapper[4707]: I1127 16:20:20.385161 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Nov 27 16:20:20 crc kubenswrapper[4707]: I1127 16:20:20.385261 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Nov 27 16:20:20 crc kubenswrapper[4707]: I1127 16:20:20.386066 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-847c4cc679-ntngj" Nov 27 16:20:20 crc kubenswrapper[4707]: I1127 16:20:20.390731 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-x8n5z" Nov 27 16:20:20 crc kubenswrapper[4707]: I1127 16:20:20.411168 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-db-sync-5k4zv" Nov 27 16:20:20 crc kubenswrapper[4707]: I1127 16:20:20.435669 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/ffe6108f-9182-4d5b-b877-977da419fc7c-config\") pod \"neutron-db-sync-ftcdx\" (UID: \"ffe6108f-9182-4d5b-b877-977da419fc7c\") " pod="openstack/neutron-db-sync-ftcdx" Nov 27 16:20:20 crc kubenswrapper[4707]: I1127 16:20:20.436001 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x2dnn\" (UniqueName: \"kubernetes.io/projected/525c317f-60e5-4359-bdd4-62caf9f54b38-kube-api-access-x2dnn\") pod \"barbican-db-sync-qzltn\" (UID: \"525c317f-60e5-4359-bdd4-62caf9f54b38\") " pod="openstack/barbican-db-sync-qzltn" Nov 27 16:20:20 crc kubenswrapper[4707]: I1127 16:20:20.436042 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a49cd9fc-1364-454e-af11-bbf64e43e56d-scripts\") pod \"ceilometer-0\" (UID: \"a49cd9fc-1364-454e-af11-bbf64e43e56d\") " pod="openstack/ceilometer-0" Nov 27 16:20:20 crc kubenswrapper[4707]: I1127 16:20:20.436090 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a49cd9fc-1364-454e-af11-bbf64e43e56d-config-data\") pod \"ceilometer-0\" (UID: \"a49cd9fc-1364-454e-af11-bbf64e43e56d\") " pod="openstack/ceilometer-0" Nov 27 16:20:20 crc kubenswrapper[4707]: I1127 16:20:20.436170 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ffe6108f-9182-4d5b-b877-977da419fc7c-combined-ca-bundle\") pod \"neutron-db-sync-ftcdx\" (UID: \"ffe6108f-9182-4d5b-b877-977da419fc7c\") " pod="openstack/neutron-db-sync-ftcdx" Nov 27 16:20:20 crc kubenswrapper[4707]: I1127 16:20:20.436195 4707 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a49cd9fc-1364-454e-af11-bbf64e43e56d-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"a49cd9fc-1364-454e-af11-bbf64e43e56d\") " pod="openstack/ceilometer-0" Nov 27 16:20:20 crc kubenswrapper[4707]: I1127 16:20:20.436216 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tq749\" (UniqueName: \"kubernetes.io/projected/ffe6108f-9182-4d5b-b877-977da419fc7c-kube-api-access-tq749\") pod \"neutron-db-sync-ftcdx\" (UID: \"ffe6108f-9182-4d5b-b877-977da419fc7c\") " pod="openstack/neutron-db-sync-ftcdx" Nov 27 16:20:20 crc kubenswrapper[4707]: I1127 16:20:20.436240 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a49cd9fc-1364-454e-af11-bbf64e43e56d-log-httpd\") pod \"ceilometer-0\" (UID: \"a49cd9fc-1364-454e-af11-bbf64e43e56d\") " pod="openstack/ceilometer-0" Nov 27 16:20:20 crc kubenswrapper[4707]: I1127 16:20:20.436259 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a49cd9fc-1364-454e-af11-bbf64e43e56d-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"a49cd9fc-1364-454e-af11-bbf64e43e56d\") " pod="openstack/ceilometer-0" Nov 27 16:20:20 crc kubenswrapper[4707]: I1127 16:20:20.436323 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-spsnj\" (UniqueName: \"kubernetes.io/projected/a49cd9fc-1364-454e-af11-bbf64e43e56d-kube-api-access-spsnj\") pod \"ceilometer-0\" (UID: \"a49cd9fc-1364-454e-af11-bbf64e43e56d\") " pod="openstack/ceilometer-0" Nov 27 16:20:20 crc kubenswrapper[4707]: I1127 16:20:20.436359 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/a49cd9fc-1364-454e-af11-bbf64e43e56d-run-httpd\") pod \"ceilometer-0\" (UID: \"a49cd9fc-1364-454e-af11-bbf64e43e56d\") " pod="openstack/ceilometer-0" Nov 27 16:20:20 crc kubenswrapper[4707]: I1127 16:20:20.458335 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/525c317f-60e5-4359-bdd4-62caf9f54b38-combined-ca-bundle\") pod \"barbican-db-sync-qzltn\" (UID: \"525c317f-60e5-4359-bdd4-62caf9f54b38\") " pod="openstack/barbican-db-sync-qzltn" Nov 27 16:20:20 crc kubenswrapper[4707]: I1127 16:20:20.458407 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/525c317f-60e5-4359-bdd4-62caf9f54b38-db-sync-config-data\") pod \"barbican-db-sync-qzltn\" (UID: \"525c317f-60e5-4359-bdd4-62caf9f54b38\") " pod="openstack/barbican-db-sync-qzltn" Nov 27 16:20:20 crc kubenswrapper[4707]: I1127 16:20:20.445466 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-hmbzj"] Nov 27 16:20:20 crc kubenswrapper[4707]: I1127 16:20:20.446334 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a49cd9fc-1364-454e-af11-bbf64e43e56d-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"a49cd9fc-1364-454e-af11-bbf64e43e56d\") " pod="openstack/ceilometer-0" Nov 27 16:20:20 crc kubenswrapper[4707]: I1127 16:20:20.446461 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a49cd9fc-1364-454e-af11-bbf64e43e56d-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"a49cd9fc-1364-454e-af11-bbf64e43e56d\") " pod="openstack/ceilometer-0" Nov 27 16:20:20 crc kubenswrapper[4707]: I1127 16:20:20.443225 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" 
(UniqueName: \"kubernetes.io/empty-dir/a49cd9fc-1364-454e-af11-bbf64e43e56d-run-httpd\") pod \"ceilometer-0\" (UID: \"a49cd9fc-1364-454e-af11-bbf64e43e56d\") " pod="openstack/ceilometer-0" Nov 27 16:20:20 crc kubenswrapper[4707]: I1127 16:20:20.443524 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a49cd9fc-1364-454e-af11-bbf64e43e56d-log-httpd\") pod \"ceilometer-0\" (UID: \"a49cd9fc-1364-454e-af11-bbf64e43e56d\") " pod="openstack/ceilometer-0" Nov 27 16:20:20 crc kubenswrapper[4707]: I1127 16:20:20.461963 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a49cd9fc-1364-454e-af11-bbf64e43e56d-config-data\") pod \"ceilometer-0\" (UID: \"a49cd9fc-1364-454e-af11-bbf64e43e56d\") " pod="openstack/ceilometer-0" Nov 27 16:20:20 crc kubenswrapper[4707]: I1127 16:20:20.467601 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a49cd9fc-1364-454e-af11-bbf64e43e56d-scripts\") pod \"ceilometer-0\" (UID: \"a49cd9fc-1364-454e-af11-bbf64e43e56d\") " pod="openstack/ceilometer-0" Nov 27 16:20:20 crc kubenswrapper[4707]: I1127 16:20:20.476616 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-spsnj\" (UniqueName: \"kubernetes.io/projected/a49cd9fc-1364-454e-af11-bbf64e43e56d-kube-api-access-spsnj\") pod \"ceilometer-0\" (UID: \"a49cd9fc-1364-454e-af11-bbf64e43e56d\") " pod="openstack/ceilometer-0" Nov 27 16:20:20 crc kubenswrapper[4707]: I1127 16:20:20.487344 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-gb8rd" Nov 27 16:20:20 crc kubenswrapper[4707]: I1127 16:20:20.493493 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-847c4cc679-ntngj"] Nov 27 16:20:20 crc kubenswrapper[4707]: I1127 16:20:20.521232 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Nov 27 16:20:20 crc kubenswrapper[4707]: I1127 16:20:20.534717 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-785d8bcb8c-cbftg"] Nov 27 16:20:20 crc kubenswrapper[4707]: I1127 16:20:20.537159 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-785d8bcb8c-cbftg" Nov 27 16:20:20 crc kubenswrapper[4707]: I1127 16:20:20.550167 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-785d8bcb8c-cbftg"] Nov 27 16:20:20 crc kubenswrapper[4707]: I1127 16:20:20.576739 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ffe6108f-9182-4d5b-b877-977da419fc7c-combined-ca-bundle\") pod \"neutron-db-sync-ftcdx\" (UID: \"ffe6108f-9182-4d5b-b877-977da419fc7c\") " pod="openstack/neutron-db-sync-ftcdx" Nov 27 16:20:20 crc kubenswrapper[4707]: I1127 16:20:20.576781 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-82f4b\" (UniqueName: \"kubernetes.io/projected/6661447e-fc28-4c9a-bcd1-66e15b0ca3fd-kube-api-access-82f4b\") pod \"placement-db-sync-hmbzj\" (UID: \"6661447e-fc28-4c9a-bcd1-66e15b0ca3fd\") " pod="openstack/placement-db-sync-hmbzj" Nov 27 16:20:20 crc kubenswrapper[4707]: I1127 16:20:20.576805 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tq749\" (UniqueName: \"kubernetes.io/projected/ffe6108f-9182-4d5b-b877-977da419fc7c-kube-api-access-tq749\") pod \"neutron-db-sync-ftcdx\" 
(UID: \"ffe6108f-9182-4d5b-b877-977da419fc7c\") " pod="openstack/neutron-db-sync-ftcdx" Nov 27 16:20:20 crc kubenswrapper[4707]: I1127 16:20:20.576864 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/525c317f-60e5-4359-bdd4-62caf9f54b38-combined-ca-bundle\") pod \"barbican-db-sync-qzltn\" (UID: \"525c317f-60e5-4359-bdd4-62caf9f54b38\") " pod="openstack/barbican-db-sync-qzltn" Nov 27 16:20:20 crc kubenswrapper[4707]: I1127 16:20:20.576884 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6661447e-fc28-4c9a-bcd1-66e15b0ca3fd-config-data\") pod \"placement-db-sync-hmbzj\" (UID: \"6661447e-fc28-4c9a-bcd1-66e15b0ca3fd\") " pod="openstack/placement-db-sync-hmbzj" Nov 27 16:20:20 crc kubenswrapper[4707]: I1127 16:20:20.576903 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/525c317f-60e5-4359-bdd4-62caf9f54b38-db-sync-config-data\") pod \"barbican-db-sync-qzltn\" (UID: \"525c317f-60e5-4359-bdd4-62caf9f54b38\") " pod="openstack/barbican-db-sync-qzltn" Nov 27 16:20:20 crc kubenswrapper[4707]: I1127 16:20:20.576938 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/ffe6108f-9182-4d5b-b877-977da419fc7c-config\") pod \"neutron-db-sync-ftcdx\" (UID: \"ffe6108f-9182-4d5b-b877-977da419fc7c\") " pod="openstack/neutron-db-sync-ftcdx" Nov 27 16:20:20 crc kubenswrapper[4707]: I1127 16:20:20.576966 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6661447e-fc28-4c9a-bcd1-66e15b0ca3fd-logs\") pod \"placement-db-sync-hmbzj\" (UID: \"6661447e-fc28-4c9a-bcd1-66e15b0ca3fd\") " pod="openstack/placement-db-sync-hmbzj" Nov 27 16:20:20 crc 
kubenswrapper[4707]: I1127 16:20:20.576990 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x2dnn\" (UniqueName: \"kubernetes.io/projected/525c317f-60e5-4359-bdd4-62caf9f54b38-kube-api-access-x2dnn\") pod \"barbican-db-sync-qzltn\" (UID: \"525c317f-60e5-4359-bdd4-62caf9f54b38\") " pod="openstack/barbican-db-sync-qzltn" Nov 27 16:20:20 crc kubenswrapper[4707]: I1127 16:20:20.577022 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6661447e-fc28-4c9a-bcd1-66e15b0ca3fd-combined-ca-bundle\") pod \"placement-db-sync-hmbzj\" (UID: \"6661447e-fc28-4c9a-bcd1-66e15b0ca3fd\") " pod="openstack/placement-db-sync-hmbzj" Nov 27 16:20:20 crc kubenswrapper[4707]: I1127 16:20:20.577037 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6661447e-fc28-4c9a-bcd1-66e15b0ca3fd-scripts\") pod \"placement-db-sync-hmbzj\" (UID: \"6661447e-fc28-4c9a-bcd1-66e15b0ca3fd\") " pod="openstack/placement-db-sync-hmbzj" Nov 27 16:20:20 crc kubenswrapper[4707]: I1127 16:20:20.584984 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/ffe6108f-9182-4d5b-b877-977da419fc7c-config\") pod \"neutron-db-sync-ftcdx\" (UID: \"ffe6108f-9182-4d5b-b877-977da419fc7c\") " pod="openstack/neutron-db-sync-ftcdx" Nov 27 16:20:20 crc kubenswrapper[4707]: I1127 16:20:20.598032 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ffe6108f-9182-4d5b-b877-977da419fc7c-combined-ca-bundle\") pod \"neutron-db-sync-ftcdx\" (UID: \"ffe6108f-9182-4d5b-b877-977da419fc7c\") " pod="openstack/neutron-db-sync-ftcdx" Nov 27 16:20:20 crc kubenswrapper[4707]: I1127 16:20:20.598039 4707 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/525c317f-60e5-4359-bdd4-62caf9f54b38-combined-ca-bundle\") pod \"barbican-db-sync-qzltn\" (UID: \"525c317f-60e5-4359-bdd4-62caf9f54b38\") " pod="openstack/barbican-db-sync-qzltn" Nov 27 16:20:20 crc kubenswrapper[4707]: I1127 16:20:20.601144 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/525c317f-60e5-4359-bdd4-62caf9f54b38-db-sync-config-data\") pod \"barbican-db-sync-qzltn\" (UID: \"525c317f-60e5-4359-bdd4-62caf9f54b38\") " pod="openstack/barbican-db-sync-qzltn" Nov 27 16:20:20 crc kubenswrapper[4707]: I1127 16:20:20.646076 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tq749\" (UniqueName: \"kubernetes.io/projected/ffe6108f-9182-4d5b-b877-977da419fc7c-kube-api-access-tq749\") pod \"neutron-db-sync-ftcdx\" (UID: \"ffe6108f-9182-4d5b-b877-977da419fc7c\") " pod="openstack/neutron-db-sync-ftcdx" Nov 27 16:20:20 crc kubenswrapper[4707]: I1127 16:20:20.647860 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x2dnn\" (UniqueName: \"kubernetes.io/projected/525c317f-60e5-4359-bdd4-62caf9f54b38-kube-api-access-x2dnn\") pod \"barbican-db-sync-qzltn\" (UID: \"525c317f-60e5-4359-bdd4-62caf9f54b38\") " pod="openstack/barbican-db-sync-qzltn" Nov 27 16:20:20 crc kubenswrapper[4707]: I1127 16:20:20.679998 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b4ad731f-8b0b-4bd5-bf46-2f9a0a3a0b82-ovsdbserver-nb\") pod \"dnsmasq-dns-785d8bcb8c-cbftg\" (UID: \"b4ad731f-8b0b-4bd5-bf46-2f9a0a3a0b82\") " pod="openstack/dnsmasq-dns-785d8bcb8c-cbftg" Nov 27 16:20:20 crc kubenswrapper[4707]: I1127 16:20:20.680072 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/6661447e-fc28-4c9a-bcd1-66e15b0ca3fd-config-data\") pod \"placement-db-sync-hmbzj\" (UID: \"6661447e-fc28-4c9a-bcd1-66e15b0ca3fd\") " pod="openstack/placement-db-sync-hmbzj" Nov 27 16:20:20 crc kubenswrapper[4707]: I1127 16:20:20.680117 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t2hd7\" (UniqueName: \"kubernetes.io/projected/b4ad731f-8b0b-4bd5-bf46-2f9a0a3a0b82-kube-api-access-t2hd7\") pod \"dnsmasq-dns-785d8bcb8c-cbftg\" (UID: \"b4ad731f-8b0b-4bd5-bf46-2f9a0a3a0b82\") " pod="openstack/dnsmasq-dns-785d8bcb8c-cbftg" Nov 27 16:20:20 crc kubenswrapper[4707]: I1127 16:20:20.680151 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b4ad731f-8b0b-4bd5-bf46-2f9a0a3a0b82-config\") pod \"dnsmasq-dns-785d8bcb8c-cbftg\" (UID: \"b4ad731f-8b0b-4bd5-bf46-2f9a0a3a0b82\") " pod="openstack/dnsmasq-dns-785d8bcb8c-cbftg" Nov 27 16:20:20 crc kubenswrapper[4707]: I1127 16:20:20.680167 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b4ad731f-8b0b-4bd5-bf46-2f9a0a3a0b82-dns-swift-storage-0\") pod \"dnsmasq-dns-785d8bcb8c-cbftg\" (UID: \"b4ad731f-8b0b-4bd5-bf46-2f9a0a3a0b82\") " pod="openstack/dnsmasq-dns-785d8bcb8c-cbftg" Nov 27 16:20:20 crc kubenswrapper[4707]: I1127 16:20:20.680185 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6661447e-fc28-4c9a-bcd1-66e15b0ca3fd-logs\") pod \"placement-db-sync-hmbzj\" (UID: \"6661447e-fc28-4c9a-bcd1-66e15b0ca3fd\") " pod="openstack/placement-db-sync-hmbzj" Nov 27 16:20:20 crc kubenswrapper[4707]: I1127 16:20:20.680221 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/b4ad731f-8b0b-4bd5-bf46-2f9a0a3a0b82-ovsdbserver-sb\") pod \"dnsmasq-dns-785d8bcb8c-cbftg\" (UID: \"b4ad731f-8b0b-4bd5-bf46-2f9a0a3a0b82\") " pod="openstack/dnsmasq-dns-785d8bcb8c-cbftg" Nov 27 16:20:20 crc kubenswrapper[4707]: I1127 16:20:20.680252 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b4ad731f-8b0b-4bd5-bf46-2f9a0a3a0b82-dns-svc\") pod \"dnsmasq-dns-785d8bcb8c-cbftg\" (UID: \"b4ad731f-8b0b-4bd5-bf46-2f9a0a3a0b82\") " pod="openstack/dnsmasq-dns-785d8bcb8c-cbftg" Nov 27 16:20:20 crc kubenswrapper[4707]: I1127 16:20:20.680304 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6661447e-fc28-4c9a-bcd1-66e15b0ca3fd-combined-ca-bundle\") pod \"placement-db-sync-hmbzj\" (UID: \"6661447e-fc28-4c9a-bcd1-66e15b0ca3fd\") " pod="openstack/placement-db-sync-hmbzj" Nov 27 16:20:20 crc kubenswrapper[4707]: I1127 16:20:20.680328 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6661447e-fc28-4c9a-bcd1-66e15b0ca3fd-scripts\") pod \"placement-db-sync-hmbzj\" (UID: \"6661447e-fc28-4c9a-bcd1-66e15b0ca3fd\") " pod="openstack/placement-db-sync-hmbzj" Nov 27 16:20:20 crc kubenswrapper[4707]: I1127 16:20:20.680396 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-82f4b\" (UniqueName: \"kubernetes.io/projected/6661447e-fc28-4c9a-bcd1-66e15b0ca3fd-kube-api-access-82f4b\") pod \"placement-db-sync-hmbzj\" (UID: \"6661447e-fc28-4c9a-bcd1-66e15b0ca3fd\") " pod="openstack/placement-db-sync-hmbzj" Nov 27 16:20:20 crc kubenswrapper[4707]: I1127 16:20:20.681154 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6661447e-fc28-4c9a-bcd1-66e15b0ca3fd-logs\") pod 
\"placement-db-sync-hmbzj\" (UID: \"6661447e-fc28-4c9a-bcd1-66e15b0ca3fd\") " pod="openstack/placement-db-sync-hmbzj" Nov 27 16:20:20 crc kubenswrapper[4707]: I1127 16:20:20.704491 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6661447e-fc28-4c9a-bcd1-66e15b0ca3fd-scripts\") pod \"placement-db-sync-hmbzj\" (UID: \"6661447e-fc28-4c9a-bcd1-66e15b0ca3fd\") " pod="openstack/placement-db-sync-hmbzj" Nov 27 16:20:20 crc kubenswrapper[4707]: I1127 16:20:20.714509 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6661447e-fc28-4c9a-bcd1-66e15b0ca3fd-config-data\") pod \"placement-db-sync-hmbzj\" (UID: \"6661447e-fc28-4c9a-bcd1-66e15b0ca3fd\") " pod="openstack/placement-db-sync-hmbzj" Nov 27 16:20:20 crc kubenswrapper[4707]: I1127 16:20:20.724861 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-82f4b\" (UniqueName: \"kubernetes.io/projected/6661447e-fc28-4c9a-bcd1-66e15b0ca3fd-kube-api-access-82f4b\") pod \"placement-db-sync-hmbzj\" (UID: \"6661447e-fc28-4c9a-bcd1-66e15b0ca3fd\") " pod="openstack/placement-db-sync-hmbzj" Nov 27 16:20:20 crc kubenswrapper[4707]: I1127 16:20:20.744982 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6661447e-fc28-4c9a-bcd1-66e15b0ca3fd-combined-ca-bundle\") pod \"placement-db-sync-hmbzj\" (UID: \"6661447e-fc28-4c9a-bcd1-66e15b0ca3fd\") " pod="openstack/placement-db-sync-hmbzj" Nov 27 16:20:20 crc kubenswrapper[4707]: I1127 16:20:20.745899 4707 generic.go:334] "Generic (PLEG): container finished" podID="c87e3212-fa52-4ef1-b9b0-6f4f3819da16" containerID="4df8ec7964dd306d4e19347fe32402348629055010cc9ed0f67309c584c11a0f" exitCode=0 Nov 27 16:20:20 crc kubenswrapper[4707]: I1127 16:20:20.746128 4707 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack/dnsmasq-dns-5b946c75cc-b9t7n" podUID="fd4a36f0-fb55-45b1-a4d0-fd45fd3de731" containerName="dnsmasq-dns" containerID="cri-o://c1f304ef8d212425af3cd04fe0a9d34bc617248b79ce4237464f406c73d90b4a" gracePeriod=10 Nov 27 16:20:20 crc kubenswrapper[4707]: I1127 16:20:20.746478 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74f6bcbc87-vfk4s" event={"ID":"c87e3212-fa52-4ef1-b9b0-6f4f3819da16","Type":"ContainerDied","Data":"4df8ec7964dd306d4e19347fe32402348629055010cc9ed0f67309c584c11a0f"} Nov 27 16:20:20 crc kubenswrapper[4707]: I1127 16:20:20.746506 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74f6bcbc87-vfk4s" event={"ID":"c87e3212-fa52-4ef1-b9b0-6f4f3819da16","Type":"ContainerStarted","Data":"daab5021e42de8fd5916f6dc15509c89e39b4f5ca7a0898380e3bbde3a92cf7f"} Nov 27 16:20:20 crc kubenswrapper[4707]: I1127 16:20:20.746611 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-t7wqv" podUID="79a68d54-eb44-474e-9c5f-1c2283a6d410" containerName="registry-server" containerID="cri-o://b3f1f00775fcc967ec5968ba887a84b5220ba498fa01b89e91df02c37b036ec2" gracePeriod=2 Nov 27 16:20:20 crc kubenswrapper[4707]: I1127 16:20:20.781479 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b4ad731f-8b0b-4bd5-bf46-2f9a0a3a0b82-ovsdbserver-nb\") pod \"dnsmasq-dns-785d8bcb8c-cbftg\" (UID: \"b4ad731f-8b0b-4bd5-bf46-2f9a0a3a0b82\") " pod="openstack/dnsmasq-dns-785d8bcb8c-cbftg" Nov 27 16:20:20 crc kubenswrapper[4707]: I1127 16:20:20.781546 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t2hd7\" (UniqueName: \"kubernetes.io/projected/b4ad731f-8b0b-4bd5-bf46-2f9a0a3a0b82-kube-api-access-t2hd7\") pod \"dnsmasq-dns-785d8bcb8c-cbftg\" (UID: \"b4ad731f-8b0b-4bd5-bf46-2f9a0a3a0b82\") " 
pod="openstack/dnsmasq-dns-785d8bcb8c-cbftg" Nov 27 16:20:20 crc kubenswrapper[4707]: I1127 16:20:20.781563 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b4ad731f-8b0b-4bd5-bf46-2f9a0a3a0b82-dns-swift-storage-0\") pod \"dnsmasq-dns-785d8bcb8c-cbftg\" (UID: \"b4ad731f-8b0b-4bd5-bf46-2f9a0a3a0b82\") " pod="openstack/dnsmasq-dns-785d8bcb8c-cbftg" Nov 27 16:20:20 crc kubenswrapper[4707]: I1127 16:20:20.781599 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b4ad731f-8b0b-4bd5-bf46-2f9a0a3a0b82-config\") pod \"dnsmasq-dns-785d8bcb8c-cbftg\" (UID: \"b4ad731f-8b0b-4bd5-bf46-2f9a0a3a0b82\") " pod="openstack/dnsmasq-dns-785d8bcb8c-cbftg" Nov 27 16:20:20 crc kubenswrapper[4707]: I1127 16:20:20.781628 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b4ad731f-8b0b-4bd5-bf46-2f9a0a3a0b82-ovsdbserver-sb\") pod \"dnsmasq-dns-785d8bcb8c-cbftg\" (UID: \"b4ad731f-8b0b-4bd5-bf46-2f9a0a3a0b82\") " pod="openstack/dnsmasq-dns-785d8bcb8c-cbftg" Nov 27 16:20:20 crc kubenswrapper[4707]: I1127 16:20:20.781645 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b4ad731f-8b0b-4bd5-bf46-2f9a0a3a0b82-dns-svc\") pod \"dnsmasq-dns-785d8bcb8c-cbftg\" (UID: \"b4ad731f-8b0b-4bd5-bf46-2f9a0a3a0b82\") " pod="openstack/dnsmasq-dns-785d8bcb8c-cbftg" Nov 27 16:20:20 crc kubenswrapper[4707]: I1127 16:20:20.782635 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b4ad731f-8b0b-4bd5-bf46-2f9a0a3a0b82-dns-svc\") pod \"dnsmasq-dns-785d8bcb8c-cbftg\" (UID: \"b4ad731f-8b0b-4bd5-bf46-2f9a0a3a0b82\") " pod="openstack/dnsmasq-dns-785d8bcb8c-cbftg" Nov 27 16:20:20 crc kubenswrapper[4707]: I1127 
16:20:20.782884 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b4ad731f-8b0b-4bd5-bf46-2f9a0a3a0b82-dns-swift-storage-0\") pod \"dnsmasq-dns-785d8bcb8c-cbftg\" (UID: \"b4ad731f-8b0b-4bd5-bf46-2f9a0a3a0b82\") " pod="openstack/dnsmasq-dns-785d8bcb8c-cbftg" Nov 27 16:20:20 crc kubenswrapper[4707]: I1127 16:20:20.783275 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b4ad731f-8b0b-4bd5-bf46-2f9a0a3a0b82-config\") pod \"dnsmasq-dns-785d8bcb8c-cbftg\" (UID: \"b4ad731f-8b0b-4bd5-bf46-2f9a0a3a0b82\") " pod="openstack/dnsmasq-dns-785d8bcb8c-cbftg" Nov 27 16:20:20 crc kubenswrapper[4707]: I1127 16:20:20.783583 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b4ad731f-8b0b-4bd5-bf46-2f9a0a3a0b82-ovsdbserver-sb\") pod \"dnsmasq-dns-785d8bcb8c-cbftg\" (UID: \"b4ad731f-8b0b-4bd5-bf46-2f9a0a3a0b82\") " pod="openstack/dnsmasq-dns-785d8bcb8c-cbftg" Nov 27 16:20:20 crc kubenswrapper[4707]: I1127 16:20:20.783606 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b4ad731f-8b0b-4bd5-bf46-2f9a0a3a0b82-ovsdbserver-nb\") pod \"dnsmasq-dns-785d8bcb8c-cbftg\" (UID: \"b4ad731f-8b0b-4bd5-bf46-2f9a0a3a0b82\") " pod="openstack/dnsmasq-dns-785d8bcb8c-cbftg" Nov 27 16:20:20 crc kubenswrapper[4707]: E1127 16:20:20.839076 4707 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc87e3212_fa52_4ef1_b9b0_6f4f3819da16.slice/crio-conmon-4df8ec7964dd306d4e19347fe32402348629055010cc9ed0f67309c584c11a0f.scope\": RecentStats: unable to find data in memory cache]" Nov 27 16:20:20 crc kubenswrapper[4707]: I1127 16:20:20.847128 4707 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openstack/neutron-db-sync-ftcdx" Nov 27 16:20:20 crc kubenswrapper[4707]: I1127 16:20:20.849857 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t2hd7\" (UniqueName: \"kubernetes.io/projected/b4ad731f-8b0b-4bd5-bf46-2f9a0a3a0b82-kube-api-access-t2hd7\") pod \"dnsmasq-dns-785d8bcb8c-cbftg\" (UID: \"b4ad731f-8b0b-4bd5-bf46-2f9a0a3a0b82\") " pod="openstack/dnsmasq-dns-785d8bcb8c-cbftg" Nov 27 16:20:20 crc kubenswrapper[4707]: I1127 16:20:20.895208 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-qzltn" Nov 27 16:20:20 crc kubenswrapper[4707]: I1127 16:20:20.977940 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-hmbzj" Nov 27 16:20:21 crc kubenswrapper[4707]: I1127 16:20:21.077189 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Nov 27 16:20:21 crc kubenswrapper[4707]: I1127 16:20:21.078999 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Nov 27 16:20:21 crc kubenswrapper[4707]: I1127 16:20:21.089319 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Nov 27 16:20:21 crc kubenswrapper[4707]: I1127 16:20:21.089552 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Nov 27 16:20:21 crc kubenswrapper[4707]: I1127 16:20:21.089673 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Nov 27 16:20:21 crc kubenswrapper[4707]: I1127 16:20:21.091587 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-wtmvr" Nov 27 16:20:21 crc kubenswrapper[4707]: I1127 16:20:21.099157 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-785d8bcb8c-cbftg" Nov 27 16:20:21 crc kubenswrapper[4707]: I1127 16:20:21.100201 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tkmx8\" (UniqueName: \"kubernetes.io/projected/16d52263-b283-45be-9cb8-c3e921dba158-kube-api-access-tkmx8\") pod \"glance-default-internal-api-0\" (UID: \"16d52263-b283-45be-9cb8-c3e921dba158\") " pod="openstack/glance-default-internal-api-0" Nov 27 16:20:21 crc kubenswrapper[4707]: I1127 16:20:21.100252 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/16d52263-b283-45be-9cb8-c3e921dba158-logs\") pod \"glance-default-internal-api-0\" (UID: \"16d52263-b283-45be-9cb8-c3e921dba158\") " pod="openstack/glance-default-internal-api-0" Nov 27 16:20:21 crc kubenswrapper[4707]: I1127 16:20:21.100286 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/16d52263-b283-45be-9cb8-c3e921dba158-config-data\") pod \"glance-default-internal-api-0\" (UID: \"16d52263-b283-45be-9cb8-c3e921dba158\") " pod="openstack/glance-default-internal-api-0" Nov 27 16:20:21 crc kubenswrapper[4707]: I1127 16:20:21.100303 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/16d52263-b283-45be-9cb8-c3e921dba158-scripts\") pod \"glance-default-internal-api-0\" (UID: \"16d52263-b283-45be-9cb8-c3e921dba158\") " pod="openstack/glance-default-internal-api-0" Nov 27 16:20:21 crc kubenswrapper[4707]: I1127 16:20:21.100320 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/16d52263-b283-45be-9cb8-c3e921dba158-internal-tls-certs\") pod \"glance-default-internal-api-0\" 
(UID: \"16d52263-b283-45be-9cb8-c3e921dba158\") " pod="openstack/glance-default-internal-api-0" Nov 27 16:20:21 crc kubenswrapper[4707]: I1127 16:20:21.100359 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/16d52263-b283-45be-9cb8-c3e921dba158-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"16d52263-b283-45be-9cb8-c3e921dba158\") " pod="openstack/glance-default-internal-api-0" Nov 27 16:20:21 crc kubenswrapper[4707]: I1127 16:20:21.100403 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-internal-api-0\" (UID: \"16d52263-b283-45be-9cb8-c3e921dba158\") " pod="openstack/glance-default-internal-api-0" Nov 27 16:20:21 crc kubenswrapper[4707]: I1127 16:20:21.100447 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/16d52263-b283-45be-9cb8-c3e921dba158-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"16d52263-b283-45be-9cb8-c3e921dba158\") " pod="openstack/glance-default-internal-api-0" Nov 27 16:20:21 crc kubenswrapper[4707]: I1127 16:20:21.125327 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Nov 27 16:20:21 crc kubenswrapper[4707]: I1127 16:20:21.157478 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Nov 27 16:20:21 crc kubenswrapper[4707]: I1127 16:20:21.159231 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Nov 27 16:20:21 crc kubenswrapper[4707]: I1127 16:20:21.164407 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Nov 27 16:20:21 crc kubenswrapper[4707]: I1127 16:20:21.165497 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Nov 27 16:20:21 crc kubenswrapper[4707]: I1127 16:20:21.165721 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Nov 27 16:20:21 crc kubenswrapper[4707]: I1127 16:20:21.201797 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/16d52263-b283-45be-9cb8-c3e921dba158-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"16d52263-b283-45be-9cb8-c3e921dba158\") " pod="openstack/glance-default-internal-api-0" Nov 27 16:20:21 crc kubenswrapper[4707]: I1127 16:20:21.201870 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tkmx8\" (UniqueName: \"kubernetes.io/projected/16d52263-b283-45be-9cb8-c3e921dba158-kube-api-access-tkmx8\") pod \"glance-default-internal-api-0\" (UID: \"16d52263-b283-45be-9cb8-c3e921dba158\") " pod="openstack/glance-default-internal-api-0" Nov 27 16:20:21 crc kubenswrapper[4707]: I1127 16:20:21.201930 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/16d52263-b283-45be-9cb8-c3e921dba158-logs\") pod \"glance-default-internal-api-0\" (UID: \"16d52263-b283-45be-9cb8-c3e921dba158\") " pod="openstack/glance-default-internal-api-0" Nov 27 16:20:21 crc kubenswrapper[4707]: I1127 16:20:21.201968 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/16d52263-b283-45be-9cb8-c3e921dba158-config-data\") pod 
\"glance-default-internal-api-0\" (UID: \"16d52263-b283-45be-9cb8-c3e921dba158\") " pod="openstack/glance-default-internal-api-0" Nov 27 16:20:21 crc kubenswrapper[4707]: I1127 16:20:21.201987 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/16d52263-b283-45be-9cb8-c3e921dba158-scripts\") pod \"glance-default-internal-api-0\" (UID: \"16d52263-b283-45be-9cb8-c3e921dba158\") " pod="openstack/glance-default-internal-api-0" Nov 27 16:20:21 crc kubenswrapper[4707]: I1127 16:20:21.202024 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/16d52263-b283-45be-9cb8-c3e921dba158-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"16d52263-b283-45be-9cb8-c3e921dba158\") " pod="openstack/glance-default-internal-api-0" Nov 27 16:20:21 crc kubenswrapper[4707]: I1127 16:20:21.202059 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/16d52263-b283-45be-9cb8-c3e921dba158-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"16d52263-b283-45be-9cb8-c3e921dba158\") " pod="openstack/glance-default-internal-api-0" Nov 27 16:20:21 crc kubenswrapper[4707]: I1127 16:20:21.202119 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-internal-api-0\" (UID: \"16d52263-b283-45be-9cb8-c3e921dba158\") " pod="openstack/glance-default-internal-api-0" Nov 27 16:20:21 crc kubenswrapper[4707]: I1127 16:20:21.202648 4707 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-internal-api-0\" (UID: \"16d52263-b283-45be-9cb8-c3e921dba158\") device mount path 
\"/mnt/openstack/pv10\"" pod="openstack/glance-default-internal-api-0" Nov 27 16:20:21 crc kubenswrapper[4707]: I1127 16:20:21.215991 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/16d52263-b283-45be-9cb8-c3e921dba158-config-data\") pod \"glance-default-internal-api-0\" (UID: \"16d52263-b283-45be-9cb8-c3e921dba158\") " pod="openstack/glance-default-internal-api-0" Nov 27 16:20:21 crc kubenswrapper[4707]: I1127 16:20:21.216255 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/16d52263-b283-45be-9cb8-c3e921dba158-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"16d52263-b283-45be-9cb8-c3e921dba158\") " pod="openstack/glance-default-internal-api-0" Nov 27 16:20:21 crc kubenswrapper[4707]: I1127 16:20:21.216910 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/16d52263-b283-45be-9cb8-c3e921dba158-logs\") pod \"glance-default-internal-api-0\" (UID: \"16d52263-b283-45be-9cb8-c3e921dba158\") " pod="openstack/glance-default-internal-api-0" Nov 27 16:20:21 crc kubenswrapper[4707]: I1127 16:20:21.222633 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/16d52263-b283-45be-9cb8-c3e921dba158-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"16d52263-b283-45be-9cb8-c3e921dba158\") " pod="openstack/glance-default-internal-api-0" Nov 27 16:20:21 crc kubenswrapper[4707]: I1127 16:20:21.226642 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/16d52263-b283-45be-9cb8-c3e921dba158-scripts\") pod \"glance-default-internal-api-0\" (UID: \"16d52263-b283-45be-9cb8-c3e921dba158\") " pod="openstack/glance-default-internal-api-0" Nov 27 16:20:21 crc kubenswrapper[4707]: I1127 16:20:21.229785 
4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/16d52263-b283-45be-9cb8-c3e921dba158-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"16d52263-b283-45be-9cb8-c3e921dba158\") " pod="openstack/glance-default-internal-api-0" Nov 27 16:20:21 crc kubenswrapper[4707]: I1127 16:20:21.242656 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tkmx8\" (UniqueName: \"kubernetes.io/projected/16d52263-b283-45be-9cb8-c3e921dba158-kube-api-access-tkmx8\") pod \"glance-default-internal-api-0\" (UID: \"16d52263-b283-45be-9cb8-c3e921dba158\") " pod="openstack/glance-default-internal-api-0" Nov 27 16:20:21 crc kubenswrapper[4707]: I1127 16:20:21.270223 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-internal-api-0\" (UID: \"16d52263-b283-45be-9cb8-c3e921dba158\") " pod="openstack/glance-default-internal-api-0" Nov 27 16:20:21 crc kubenswrapper[4707]: I1127 16:20:21.303808 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7wzl2\" (UniqueName: \"kubernetes.io/projected/e032a3b3-8bee-42e1-a645-17b1151c85d1-kube-api-access-7wzl2\") pod \"glance-default-external-api-0\" (UID: \"e032a3b3-8bee-42e1-a645-17b1151c85d1\") " pod="openstack/glance-default-external-api-0" Nov 27 16:20:21 crc kubenswrapper[4707]: I1127 16:20:21.304110 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e032a3b3-8bee-42e1-a645-17b1151c85d1-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"e032a3b3-8bee-42e1-a645-17b1151c85d1\") " pod="openstack/glance-default-external-api-0" Nov 27 16:20:21 crc kubenswrapper[4707]: I1127 16:20:21.304252 4707 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e032a3b3-8bee-42e1-a645-17b1151c85d1-logs\") pod \"glance-default-external-api-0\" (UID: \"e032a3b3-8bee-42e1-a645-17b1151c85d1\") " pod="openstack/glance-default-external-api-0" Nov 27 16:20:21 crc kubenswrapper[4707]: I1127 16:20:21.304303 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e032a3b3-8bee-42e1-a645-17b1151c85d1-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"e032a3b3-8bee-42e1-a645-17b1151c85d1\") " pod="openstack/glance-default-external-api-0" Nov 27 16:20:21 crc kubenswrapper[4707]: I1127 16:20:21.304435 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-external-api-0\" (UID: \"e032a3b3-8bee-42e1-a645-17b1151c85d1\") " pod="openstack/glance-default-external-api-0" Nov 27 16:20:21 crc kubenswrapper[4707]: I1127 16:20:21.304569 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e032a3b3-8bee-42e1-a645-17b1151c85d1-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"e032a3b3-8bee-42e1-a645-17b1151c85d1\") " pod="openstack/glance-default-external-api-0" Nov 27 16:20:21 crc kubenswrapper[4707]: I1127 16:20:21.304591 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e032a3b3-8bee-42e1-a645-17b1151c85d1-config-data\") pod \"glance-default-external-api-0\" (UID: \"e032a3b3-8bee-42e1-a645-17b1151c85d1\") " pod="openstack/glance-default-external-api-0" Nov 27 16:20:21 crc kubenswrapper[4707]: I1127 16:20:21.304618 4707 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e032a3b3-8bee-42e1-a645-17b1151c85d1-scripts\") pod \"glance-default-external-api-0\" (UID: \"e032a3b3-8bee-42e1-a645-17b1151c85d1\") " pod="openstack/glance-default-external-api-0" Nov 27 16:20:21 crc kubenswrapper[4707]: I1127 16:20:21.395181 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-847c4cc679-ntngj"] Nov 27 16:20:21 crc kubenswrapper[4707]: I1127 16:20:21.405859 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e032a3b3-8bee-42e1-a645-17b1151c85d1-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"e032a3b3-8bee-42e1-a645-17b1151c85d1\") " pod="openstack/glance-default-external-api-0" Nov 27 16:20:21 crc kubenswrapper[4707]: I1127 16:20:21.405896 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e032a3b3-8bee-42e1-a645-17b1151c85d1-config-data\") pod \"glance-default-external-api-0\" (UID: \"e032a3b3-8bee-42e1-a645-17b1151c85d1\") " pod="openstack/glance-default-external-api-0" Nov 27 16:20:21 crc kubenswrapper[4707]: I1127 16:20:21.405916 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e032a3b3-8bee-42e1-a645-17b1151c85d1-scripts\") pod \"glance-default-external-api-0\" (UID: \"e032a3b3-8bee-42e1-a645-17b1151c85d1\") " pod="openstack/glance-default-external-api-0" Nov 27 16:20:21 crc kubenswrapper[4707]: I1127 16:20:21.406008 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7wzl2\" (UniqueName: \"kubernetes.io/projected/e032a3b3-8bee-42e1-a645-17b1151c85d1-kube-api-access-7wzl2\") pod \"glance-default-external-api-0\" (UID: \"e032a3b3-8bee-42e1-a645-17b1151c85d1\") " 
pod="openstack/glance-default-external-api-0" Nov 27 16:20:21 crc kubenswrapper[4707]: I1127 16:20:21.406031 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e032a3b3-8bee-42e1-a645-17b1151c85d1-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"e032a3b3-8bee-42e1-a645-17b1151c85d1\") " pod="openstack/glance-default-external-api-0" Nov 27 16:20:21 crc kubenswrapper[4707]: I1127 16:20:21.406062 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e032a3b3-8bee-42e1-a645-17b1151c85d1-logs\") pod \"glance-default-external-api-0\" (UID: \"e032a3b3-8bee-42e1-a645-17b1151c85d1\") " pod="openstack/glance-default-external-api-0" Nov 27 16:20:21 crc kubenswrapper[4707]: I1127 16:20:21.406085 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e032a3b3-8bee-42e1-a645-17b1151c85d1-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"e032a3b3-8bee-42e1-a645-17b1151c85d1\") " pod="openstack/glance-default-external-api-0" Nov 27 16:20:21 crc kubenswrapper[4707]: I1127 16:20:21.406106 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-external-api-0\" (UID: \"e032a3b3-8bee-42e1-a645-17b1151c85d1\") " pod="openstack/glance-default-external-api-0" Nov 27 16:20:21 crc kubenswrapper[4707]: I1127 16:20:21.406349 4707 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-external-api-0\" (UID: \"e032a3b3-8bee-42e1-a645-17b1151c85d1\") device mount path \"/mnt/openstack/pv11\"" pod="openstack/glance-default-external-api-0" Nov 27 16:20:21 crc 
kubenswrapper[4707]: I1127 16:20:21.407022 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e032a3b3-8bee-42e1-a645-17b1151c85d1-logs\") pod \"glance-default-external-api-0\" (UID: \"e032a3b3-8bee-42e1-a645-17b1151c85d1\") " pod="openstack/glance-default-external-api-0" Nov 27 16:20:21 crc kubenswrapper[4707]: I1127 16:20:21.407247 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e032a3b3-8bee-42e1-a645-17b1151c85d1-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"e032a3b3-8bee-42e1-a645-17b1151c85d1\") " pod="openstack/glance-default-external-api-0" Nov 27 16:20:21 crc kubenswrapper[4707]: I1127 16:20:21.409713 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e032a3b3-8bee-42e1-a645-17b1151c85d1-config-data\") pod \"glance-default-external-api-0\" (UID: \"e032a3b3-8bee-42e1-a645-17b1151c85d1\") " pod="openstack/glance-default-external-api-0" Nov 27 16:20:21 crc kubenswrapper[4707]: I1127 16:20:21.413451 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e032a3b3-8bee-42e1-a645-17b1151c85d1-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"e032a3b3-8bee-42e1-a645-17b1151c85d1\") " pod="openstack/glance-default-external-api-0" Nov 27 16:20:21 crc kubenswrapper[4707]: I1127 16:20:21.424806 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e032a3b3-8bee-42e1-a645-17b1151c85d1-scripts\") pod \"glance-default-external-api-0\" (UID: \"e032a3b3-8bee-42e1-a645-17b1151c85d1\") " pod="openstack/glance-default-external-api-0" Nov 27 16:20:21 crc kubenswrapper[4707]: I1127 16:20:21.434259 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-7wzl2\" (UniqueName: \"kubernetes.io/projected/e032a3b3-8bee-42e1-a645-17b1151c85d1-kube-api-access-7wzl2\") pod \"glance-default-external-api-0\" (UID: \"e032a3b3-8bee-42e1-a645-17b1151c85d1\") " pod="openstack/glance-default-external-api-0" Nov 27 16:20:21 crc kubenswrapper[4707]: I1127 16:20:21.436162 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-db-sync-5k4zv"] Nov 27 16:20:21 crc kubenswrapper[4707]: I1127 16:20:21.440847 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e032a3b3-8bee-42e1-a645-17b1151c85d1-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"e032a3b3-8bee-42e1-a645-17b1151c85d1\") " pod="openstack/glance-default-external-api-0" Nov 27 16:20:21 crc kubenswrapper[4707]: I1127 16:20:21.443560 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Nov 27 16:20:21 crc kubenswrapper[4707]: I1127 16:20:21.450687 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-external-api-0\" (UID: \"e032a3b3-8bee-42e1-a645-17b1151c85d1\") " pod="openstack/glance-default-external-api-0" Nov 27 16:20:21 crc kubenswrapper[4707]: I1127 16:20:21.486311 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Nov 27 16:20:21 crc kubenswrapper[4707]: I1127 16:20:21.567233 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-t7wqv" Nov 27 16:20:21 crc kubenswrapper[4707]: I1127 16:20:21.587070 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5b946c75cc-b9t7n" Nov 27 16:20:21 crc kubenswrapper[4707]: I1127 16:20:21.592035 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-x8n5z"] Nov 27 16:20:21 crc kubenswrapper[4707]: W1127 16:20:21.600954 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod06bc7709_2c0c_4b87_8e4f_11e2e1a1726c.slice/crio-e43132b4e965544d73e344ec669d3ba2607b5ed33d165e79f338e1fb98976dba WatchSource:0}: Error finding container e43132b4e965544d73e344ec669d3ba2607b5ed33d165e79f338e1fb98976dba: Status 404 returned error can't find the container with id e43132b4e965544d73e344ec669d3ba2607b5ed33d165e79f338e1fb98976dba Nov 27 16:20:21 crc kubenswrapper[4707]: I1127 16:20:21.609283 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rbl22\" (UniqueName: \"kubernetes.io/projected/79a68d54-eb44-474e-9c5f-1c2283a6d410-kube-api-access-rbl22\") pod \"79a68d54-eb44-474e-9c5f-1c2283a6d410\" (UID: \"79a68d54-eb44-474e-9c5f-1c2283a6d410\") " Nov 27 16:20:21 crc kubenswrapper[4707]: I1127 16:20:21.609587 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/79a68d54-eb44-474e-9c5f-1c2283a6d410-utilities\") pod \"79a68d54-eb44-474e-9c5f-1c2283a6d410\" (UID: \"79a68d54-eb44-474e-9c5f-1c2283a6d410\") " Nov 27 16:20:21 crc kubenswrapper[4707]: I1127 16:20:21.615816 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/79a68d54-eb44-474e-9c5f-1c2283a6d410-utilities" (OuterVolumeSpecName: "utilities") pod "79a68d54-eb44-474e-9c5f-1c2283a6d410" (UID: "79a68d54-eb44-474e-9c5f-1c2283a6d410"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 27 16:20:21 crc kubenswrapper[4707]: I1127 16:20:21.621749 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/79a68d54-eb44-474e-9c5f-1c2283a6d410-kube-api-access-rbl22" (OuterVolumeSpecName: "kube-api-access-rbl22") pod "79a68d54-eb44-474e-9c5f-1c2283a6d410" (UID: "79a68d54-eb44-474e-9c5f-1c2283a6d410"). InnerVolumeSpecName "kube-api-access-rbl22". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 16:20:21 crc kubenswrapper[4707]: I1127 16:20:21.622037 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-74f6bcbc87-vfk4s" Nov 27 16:20:21 crc kubenswrapper[4707]: I1127 16:20:21.712459 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t7kr7\" (UniqueName: \"kubernetes.io/projected/fd4a36f0-fb55-45b1-a4d0-fd45fd3de731-kube-api-access-t7kr7\") pod \"fd4a36f0-fb55-45b1-a4d0-fd45fd3de731\" (UID: \"fd4a36f0-fb55-45b1-a4d0-fd45fd3de731\") " Nov 27 16:20:21 crc kubenswrapper[4707]: I1127 16:20:21.712512 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/79a68d54-eb44-474e-9c5f-1c2283a6d410-catalog-content\") pod \"79a68d54-eb44-474e-9c5f-1c2283a6d410\" (UID: \"79a68d54-eb44-474e-9c5f-1c2283a6d410\") " Nov 27 16:20:21 crc kubenswrapper[4707]: I1127 16:20:21.712545 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fd4a36f0-fb55-45b1-a4d0-fd45fd3de731-ovsdbserver-sb\") pod \"fd4a36f0-fb55-45b1-a4d0-fd45fd3de731\" (UID: \"fd4a36f0-fb55-45b1-a4d0-fd45fd3de731\") " Nov 27 16:20:21 crc kubenswrapper[4707]: I1127 16:20:21.712614 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/c87e3212-fa52-4ef1-b9b0-6f4f3819da16-ovsdbserver-nb\") pod \"c87e3212-fa52-4ef1-b9b0-6f4f3819da16\" (UID: \"c87e3212-fa52-4ef1-b9b0-6f4f3819da16\") " Nov 27 16:20:21 crc kubenswrapper[4707]: I1127 16:20:21.712687 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c87e3212-fa52-4ef1-b9b0-6f4f3819da16-dns-svc\") pod \"c87e3212-fa52-4ef1-b9b0-6f4f3819da16\" (UID: \"c87e3212-fa52-4ef1-b9b0-6f4f3819da16\") " Nov 27 16:20:21 crc kubenswrapper[4707]: I1127 16:20:21.712722 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c87e3212-fa52-4ef1-b9b0-6f4f3819da16-ovsdbserver-sb\") pod \"c87e3212-fa52-4ef1-b9b0-6f4f3819da16\" (UID: \"c87e3212-fa52-4ef1-b9b0-6f4f3819da16\") " Nov 27 16:20:21 crc kubenswrapper[4707]: I1127 16:20:21.712739 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lvs9v\" (UniqueName: \"kubernetes.io/projected/c87e3212-fa52-4ef1-b9b0-6f4f3819da16-kube-api-access-lvs9v\") pod \"c87e3212-fa52-4ef1-b9b0-6f4f3819da16\" (UID: \"c87e3212-fa52-4ef1-b9b0-6f4f3819da16\") " Nov 27 16:20:21 crc kubenswrapper[4707]: I1127 16:20:21.712783 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fd4a36f0-fb55-45b1-a4d0-fd45fd3de731-config\") pod \"fd4a36f0-fb55-45b1-a4d0-fd45fd3de731\" (UID: \"fd4a36f0-fb55-45b1-a4d0-fd45fd3de731\") " Nov 27 16:20:21 crc kubenswrapper[4707]: I1127 16:20:21.712808 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fd4a36f0-fb55-45b1-a4d0-fd45fd3de731-dns-svc\") pod \"fd4a36f0-fb55-45b1-a4d0-fd45fd3de731\" (UID: \"fd4a36f0-fb55-45b1-a4d0-fd45fd3de731\") " Nov 27 16:20:21 crc kubenswrapper[4707]: I1127 16:20:21.712827 4707 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c87e3212-fa52-4ef1-b9b0-6f4f3819da16-config\") pod \"c87e3212-fa52-4ef1-b9b0-6f4f3819da16\" (UID: \"c87e3212-fa52-4ef1-b9b0-6f4f3819da16\") " Nov 27 16:20:21 crc kubenswrapper[4707]: I1127 16:20:21.712844 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fd4a36f0-fb55-45b1-a4d0-fd45fd3de731-ovsdbserver-nb\") pod \"fd4a36f0-fb55-45b1-a4d0-fd45fd3de731\" (UID: \"fd4a36f0-fb55-45b1-a4d0-fd45fd3de731\") " Nov 27 16:20:21 crc kubenswrapper[4707]: I1127 16:20:21.712874 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c87e3212-fa52-4ef1-b9b0-6f4f3819da16-dns-swift-storage-0\") pod \"c87e3212-fa52-4ef1-b9b0-6f4f3819da16\" (UID: \"c87e3212-fa52-4ef1-b9b0-6f4f3819da16\") " Nov 27 16:20:21 crc kubenswrapper[4707]: I1127 16:20:21.713177 4707 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/79a68d54-eb44-474e-9c5f-1c2283a6d410-utilities\") on node \"crc\" DevicePath \"\"" Nov 27 16:20:21 crc kubenswrapper[4707]: I1127 16:20:21.713188 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rbl22\" (UniqueName: \"kubernetes.io/projected/79a68d54-eb44-474e-9c5f-1c2283a6d410-kube-api-access-rbl22\") on node \"crc\" DevicePath \"\"" Nov 27 16:20:21 crc kubenswrapper[4707]: I1127 16:20:21.744974 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c87e3212-fa52-4ef1-b9b0-6f4f3819da16-kube-api-access-lvs9v" (OuterVolumeSpecName: "kube-api-access-lvs9v") pod "c87e3212-fa52-4ef1-b9b0-6f4f3819da16" (UID: "c87e3212-fa52-4ef1-b9b0-6f4f3819da16"). InnerVolumeSpecName "kube-api-access-lvs9v". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 16:20:21 crc kubenswrapper[4707]: I1127 16:20:21.766393 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fd4a36f0-fb55-45b1-a4d0-fd45fd3de731-kube-api-access-t7kr7" (OuterVolumeSpecName: "kube-api-access-t7kr7") pod "fd4a36f0-fb55-45b1-a4d0-fd45fd3de731" (UID: "fd4a36f0-fb55-45b1-a4d0-fd45fd3de731"). InnerVolumeSpecName "kube-api-access-t7kr7". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 16:20:21 crc kubenswrapper[4707]: I1127 16:20:21.776295 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74f6bcbc87-vfk4s" event={"ID":"c87e3212-fa52-4ef1-b9b0-6f4f3819da16","Type":"ContainerDied","Data":"daab5021e42de8fd5916f6dc15509c89e39b4f5ca7a0898380e3bbde3a92cf7f"} Nov 27 16:20:21 crc kubenswrapper[4707]: I1127 16:20:21.776349 4707 scope.go:117] "RemoveContainer" containerID="4df8ec7964dd306d4e19347fe32402348629055010cc9ed0f67309c584c11a0f" Nov 27 16:20:21 crc kubenswrapper[4707]: I1127 16:20:21.776501 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-74f6bcbc87-vfk4s" Nov 27 16:20:21 crc kubenswrapper[4707]: I1127 16:20:21.779731 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c87e3212-fa52-4ef1-b9b0-6f4f3819da16-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "c87e3212-fa52-4ef1-b9b0-6f4f3819da16" (UID: "c87e3212-fa52-4ef1-b9b0-6f4f3819da16"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 16:20:21 crc kubenswrapper[4707]: I1127 16:20:21.781496 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c87e3212-fa52-4ef1-b9b0-6f4f3819da16-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "c87e3212-fa52-4ef1-b9b0-6f4f3819da16" (UID: "c87e3212-fa52-4ef1-b9b0-6f4f3819da16"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 16:20:21 crc kubenswrapper[4707]: I1127 16:20:21.802411 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-5k4zv" event={"ID":"1c445b1d-7e63-48cc-83ee-c4841074701c","Type":"ContainerStarted","Data":"093212c8f9087466fb96d6056d794b2a38801f2e8dabba563d000a26d1beec2b"} Nov 27 16:20:21 crc kubenswrapper[4707]: I1127 16:20:21.805011 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-gb8rd"] Nov 27 16:20:21 crc kubenswrapper[4707]: I1127 16:20:21.818813 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t7kr7\" (UniqueName: \"kubernetes.io/projected/fd4a36f0-fb55-45b1-a4d0-fd45fd3de731-kube-api-access-t7kr7\") on node \"crc\" DevicePath \"\"" Nov 27 16:20:21 crc kubenswrapper[4707]: I1127 16:20:21.818838 4707 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c87e3212-fa52-4ef1-b9b0-6f4f3819da16-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Nov 27 16:20:21 crc kubenswrapper[4707]: I1127 16:20:21.818847 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lvs9v\" (UniqueName: \"kubernetes.io/projected/c87e3212-fa52-4ef1-b9b0-6f4f3819da16-kube-api-access-lvs9v\") on node \"crc\" DevicePath \"\"" Nov 27 16:20:21 crc kubenswrapper[4707]: I1127 16:20:21.818857 4707 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: 
\"kubernetes.io/configmap/c87e3212-fa52-4ef1-b9b0-6f4f3819da16-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Nov 27 16:20:21 crc kubenswrapper[4707]: I1127 16:20:21.826882 4707 generic.go:334] "Generic (PLEG): container finished" podID="79a68d54-eb44-474e-9c5f-1c2283a6d410" containerID="b3f1f00775fcc967ec5968ba887a84b5220ba498fa01b89e91df02c37b036ec2" exitCode=0 Nov 27 16:20:21 crc kubenswrapper[4707]: I1127 16:20:21.827071 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-t7wqv" Nov 27 16:20:21 crc kubenswrapper[4707]: I1127 16:20:21.828145 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-t7wqv" event={"ID":"79a68d54-eb44-474e-9c5f-1c2283a6d410","Type":"ContainerDied","Data":"b3f1f00775fcc967ec5968ba887a84b5220ba498fa01b89e91df02c37b036ec2"} Nov 27 16:20:21 crc kubenswrapper[4707]: I1127 16:20:21.828268 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-t7wqv" event={"ID":"79a68d54-eb44-474e-9c5f-1c2283a6d410","Type":"ContainerDied","Data":"d250762f4c9305264cbd0bacb82d5c46ef04492b886e79ea4b850484d3090b47"} Nov 27 16:20:21 crc kubenswrapper[4707]: I1127 16:20:21.830599 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Nov 27 16:20:21 crc kubenswrapper[4707]: I1127 16:20:21.838106 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c87e3212-fa52-4ef1-b9b0-6f4f3819da16-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "c87e3212-fa52-4ef1-b9b0-6f4f3819da16" (UID: "c87e3212-fa52-4ef1-b9b0-6f4f3819da16"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 16:20:21 crc kubenswrapper[4707]: I1127 16:20:21.848558 4707 generic.go:334] "Generic (PLEG): container finished" podID="a90b9ea4-6cc9-48e1-b89c-bfa648577c70" containerID="20582c685e946e5e15d651ac03aaefbafe1b7e3b28a1e9d07c4357546053706d" exitCode=0 Nov 27 16:20:21 crc kubenswrapper[4707]: I1127 16:20:21.848635 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-847c4cc679-ntngj" event={"ID":"a90b9ea4-6cc9-48e1-b89c-bfa648577c70","Type":"ContainerDied","Data":"20582c685e946e5e15d651ac03aaefbafe1b7e3b28a1e9d07c4357546053706d"} Nov 27 16:20:21 crc kubenswrapper[4707]: I1127 16:20:21.848661 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-847c4cc679-ntngj" event={"ID":"a90b9ea4-6cc9-48e1-b89c-bfa648577c70","Type":"ContainerStarted","Data":"407b4c5bc0fcc0181c897cf7b68c735ed65c35473fcdee2d73ca299771621929"} Nov 27 16:20:21 crc kubenswrapper[4707]: I1127 16:20:21.851306 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-qzltn"] Nov 27 16:20:21 crc kubenswrapper[4707]: I1127 16:20:21.853307 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-x8n5z" event={"ID":"06bc7709-2c0c-4b87-8e4f-11e2e1a1726c","Type":"ContainerStarted","Data":"e43132b4e965544d73e344ec669d3ba2607b5ed33d165e79f338e1fb98976dba"} Nov 27 16:20:21 crc kubenswrapper[4707]: I1127 16:20:21.869712 4707 generic.go:334] "Generic (PLEG): container finished" podID="fd4a36f0-fb55-45b1-a4d0-fd45fd3de731" containerID="c1f304ef8d212425af3cd04fe0a9d34bc617248b79ce4237464f406c73d90b4a" exitCode=0 Nov 27 16:20:21 crc kubenswrapper[4707]: I1127 16:20:21.869755 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b946c75cc-b9t7n" event={"ID":"fd4a36f0-fb55-45b1-a4d0-fd45fd3de731","Type":"ContainerDied","Data":"c1f304ef8d212425af3cd04fe0a9d34bc617248b79ce4237464f406c73d90b4a"} Nov 27 16:20:21 crc 
kubenswrapper[4707]: I1127 16:20:21.869782 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b946c75cc-b9t7n" event={"ID":"fd4a36f0-fb55-45b1-a4d0-fd45fd3de731","Type":"ContainerDied","Data":"1a8b12b23932df4458f8ddd825a8655127aedd0e2fd2bd0838922c52f662c124"} Nov 27 16:20:21 crc kubenswrapper[4707]: I1127 16:20:21.869839 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5b946c75cc-b9t7n" Nov 27 16:20:21 crc kubenswrapper[4707]: I1127 16:20:21.879595 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-hmbzj"] Nov 27 16:20:21 crc kubenswrapper[4707]: W1127 16:20:21.896677 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod525c317f_60e5_4359_bdd4_62caf9f54b38.slice/crio-440d28a5d4b638c2070005a98e116276f229010931d81c83ebb36f02a46d5f81 WatchSource:0}: Error finding container 440d28a5d4b638c2070005a98e116276f229010931d81c83ebb36f02a46d5f81: Status 404 returned error can't find the container with id 440d28a5d4b638c2070005a98e116276f229010931d81c83ebb36f02a46d5f81 Nov 27 16:20:21 crc kubenswrapper[4707]: I1127 16:20:21.898721 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-ftcdx"] Nov 27 16:20:21 crc kubenswrapper[4707]: I1127 16:20:21.911341 4707 scope.go:117] "RemoveContainer" containerID="b3f1f00775fcc967ec5968ba887a84b5220ba498fa01b89e91df02c37b036ec2" Nov 27 16:20:21 crc kubenswrapper[4707]: I1127 16:20:21.912102 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fd4a36f0-fb55-45b1-a4d0-fd45fd3de731-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "fd4a36f0-fb55-45b1-a4d0-fd45fd3de731" (UID: "fd4a36f0-fb55-45b1-a4d0-fd45fd3de731"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 16:20:21 crc kubenswrapper[4707]: I1127 16:20:21.914441 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fd4a36f0-fb55-45b1-a4d0-fd45fd3de731-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "fd4a36f0-fb55-45b1-a4d0-fd45fd3de731" (UID: "fd4a36f0-fb55-45b1-a4d0-fd45fd3de731"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 16:20:21 crc kubenswrapper[4707]: I1127 16:20:21.920561 4707 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c87e3212-fa52-4ef1-b9b0-6f4f3819da16-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Nov 27 16:20:21 crc kubenswrapper[4707]: I1127 16:20:21.920579 4707 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fd4a36f0-fb55-45b1-a4d0-fd45fd3de731-dns-svc\") on node \"crc\" DevicePath \"\"" Nov 27 16:20:21 crc kubenswrapper[4707]: I1127 16:20:21.920607 4707 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fd4a36f0-fb55-45b1-a4d0-fd45fd3de731-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Nov 27 16:20:21 crc kubenswrapper[4707]: I1127 16:20:21.940005 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/79a68d54-eb44-474e-9c5f-1c2283a6d410-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "79a68d54-eb44-474e-9c5f-1c2283a6d410" (UID: "79a68d54-eb44-474e-9c5f-1c2283a6d410"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 27 16:20:21 crc kubenswrapper[4707]: I1127 16:20:21.954388 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c87e3212-fa52-4ef1-b9b0-6f4f3819da16-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "c87e3212-fa52-4ef1-b9b0-6f4f3819da16" (UID: "c87e3212-fa52-4ef1-b9b0-6f4f3819da16"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 16:20:21 crc kubenswrapper[4707]: I1127 16:20:21.954678 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c87e3212-fa52-4ef1-b9b0-6f4f3819da16-config" (OuterVolumeSpecName: "config") pod "c87e3212-fa52-4ef1-b9b0-6f4f3819da16" (UID: "c87e3212-fa52-4ef1-b9b0-6f4f3819da16"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 16:20:21 crc kubenswrapper[4707]: I1127 16:20:21.964268 4707 scope.go:117] "RemoveContainer" containerID="04e82ab673f3cde8a4a330c0d929bd2a93337465221967c34bece03584562343" Nov 27 16:20:21 crc kubenswrapper[4707]: I1127 16:20:21.989231 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-785d8bcb8c-cbftg"] Nov 27 16:20:22 crc kubenswrapper[4707]: I1127 16:20:22.003511 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fd4a36f0-fb55-45b1-a4d0-fd45fd3de731-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "fd4a36f0-fb55-45b1-a4d0-fd45fd3de731" (UID: "fd4a36f0-fb55-45b1-a4d0-fd45fd3de731"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 16:20:22 crc kubenswrapper[4707]: I1127 16:20:22.018667 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fd4a36f0-fb55-45b1-a4d0-fd45fd3de731-config" (OuterVolumeSpecName: "config") pod "fd4a36f0-fb55-45b1-a4d0-fd45fd3de731" (UID: "fd4a36f0-fb55-45b1-a4d0-fd45fd3de731"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 16:20:22 crc kubenswrapper[4707]: I1127 16:20:22.021645 4707 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fd4a36f0-fb55-45b1-a4d0-fd45fd3de731-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Nov 27 16:20:22 crc kubenswrapper[4707]: I1127 16:20:22.021674 4707 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c87e3212-fa52-4ef1-b9b0-6f4f3819da16-dns-svc\") on node \"crc\" DevicePath \"\"" Nov 27 16:20:22 crc kubenswrapper[4707]: I1127 16:20:22.021682 4707 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fd4a36f0-fb55-45b1-a4d0-fd45fd3de731-config\") on node \"crc\" DevicePath \"\"" Nov 27 16:20:22 crc kubenswrapper[4707]: I1127 16:20:22.021690 4707 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c87e3212-fa52-4ef1-b9b0-6f4f3819da16-config\") on node \"crc\" DevicePath \"\"" Nov 27 16:20:22 crc kubenswrapper[4707]: I1127 16:20:22.021700 4707 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/79a68d54-eb44-474e-9c5f-1c2283a6d410-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 27 16:20:22 crc kubenswrapper[4707]: I1127 16:20:22.039340 4707 scope.go:117] "RemoveContainer" containerID="23fbd7e5fba10d6f58e7a1cfbe5368c5020340b14aee0b7f90f255c0f0d7e164" Nov 27 16:20:22 crc kubenswrapper[4707]: I1127 
16:20:22.092685 4707 scope.go:117] "RemoveContainer" containerID="b3f1f00775fcc967ec5968ba887a84b5220ba498fa01b89e91df02c37b036ec2" Nov 27 16:20:22 crc kubenswrapper[4707]: E1127 16:20:22.093691 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b3f1f00775fcc967ec5968ba887a84b5220ba498fa01b89e91df02c37b036ec2\": container with ID starting with b3f1f00775fcc967ec5968ba887a84b5220ba498fa01b89e91df02c37b036ec2 not found: ID does not exist" containerID="b3f1f00775fcc967ec5968ba887a84b5220ba498fa01b89e91df02c37b036ec2" Nov 27 16:20:22 crc kubenswrapper[4707]: I1127 16:20:22.093789 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b3f1f00775fcc967ec5968ba887a84b5220ba498fa01b89e91df02c37b036ec2"} err="failed to get container status \"b3f1f00775fcc967ec5968ba887a84b5220ba498fa01b89e91df02c37b036ec2\": rpc error: code = NotFound desc = could not find container \"b3f1f00775fcc967ec5968ba887a84b5220ba498fa01b89e91df02c37b036ec2\": container with ID starting with b3f1f00775fcc967ec5968ba887a84b5220ba498fa01b89e91df02c37b036ec2 not found: ID does not exist" Nov 27 16:20:22 crc kubenswrapper[4707]: I1127 16:20:22.093866 4707 scope.go:117] "RemoveContainer" containerID="04e82ab673f3cde8a4a330c0d929bd2a93337465221967c34bece03584562343" Nov 27 16:20:22 crc kubenswrapper[4707]: E1127 16:20:22.094549 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"04e82ab673f3cde8a4a330c0d929bd2a93337465221967c34bece03584562343\": container with ID starting with 04e82ab673f3cde8a4a330c0d929bd2a93337465221967c34bece03584562343 not found: ID does not exist" containerID="04e82ab673f3cde8a4a330c0d929bd2a93337465221967c34bece03584562343" Nov 27 16:20:22 crc kubenswrapper[4707]: I1127 16:20:22.094652 4707 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"04e82ab673f3cde8a4a330c0d929bd2a93337465221967c34bece03584562343"} err="failed to get container status \"04e82ab673f3cde8a4a330c0d929bd2a93337465221967c34bece03584562343\": rpc error: code = NotFound desc = could not find container \"04e82ab673f3cde8a4a330c0d929bd2a93337465221967c34bece03584562343\": container with ID starting with 04e82ab673f3cde8a4a330c0d929bd2a93337465221967c34bece03584562343 not found: ID does not exist" Nov 27 16:20:22 crc kubenswrapper[4707]: I1127 16:20:22.094725 4707 scope.go:117] "RemoveContainer" containerID="23fbd7e5fba10d6f58e7a1cfbe5368c5020340b14aee0b7f90f255c0f0d7e164" Nov 27 16:20:22 crc kubenswrapper[4707]: E1127 16:20:22.095079 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"23fbd7e5fba10d6f58e7a1cfbe5368c5020340b14aee0b7f90f255c0f0d7e164\": container with ID starting with 23fbd7e5fba10d6f58e7a1cfbe5368c5020340b14aee0b7f90f255c0f0d7e164 not found: ID does not exist" containerID="23fbd7e5fba10d6f58e7a1cfbe5368c5020340b14aee0b7f90f255c0f0d7e164" Nov 27 16:20:22 crc kubenswrapper[4707]: I1127 16:20:22.095120 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"23fbd7e5fba10d6f58e7a1cfbe5368c5020340b14aee0b7f90f255c0f0d7e164"} err="failed to get container status \"23fbd7e5fba10d6f58e7a1cfbe5368c5020340b14aee0b7f90f255c0f0d7e164\": rpc error: code = NotFound desc = could not find container \"23fbd7e5fba10d6f58e7a1cfbe5368c5020340b14aee0b7f90f255c0f0d7e164\": container with ID starting with 23fbd7e5fba10d6f58e7a1cfbe5368c5020340b14aee0b7f90f255c0f0d7e164 not found: ID does not exist" Nov 27 16:20:22 crc kubenswrapper[4707]: I1127 16:20:22.095147 4707 scope.go:117] "RemoveContainer" containerID="c1f304ef8d212425af3cd04fe0a9d34bc617248b79ce4237464f406c73d90b4a" Nov 27 16:20:22 crc kubenswrapper[4707]: I1127 16:20:22.162314 4707 scope.go:117] "RemoveContainer" 
containerID="47d0da891837cb78986aba7318980eddd5df284106643499b77ff5c7f29fc7e3" Nov 27 16:20:22 crc kubenswrapper[4707]: I1127 16:20:22.167265 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-74f6bcbc87-vfk4s"] Nov 27 16:20:22 crc kubenswrapper[4707]: I1127 16:20:22.177924 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-74f6bcbc87-vfk4s"] Nov 27 16:20:22 crc kubenswrapper[4707]: I1127 16:20:22.188267 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Nov 27 16:20:22 crc kubenswrapper[4707]: I1127 16:20:22.232731 4707 scope.go:117] "RemoveContainer" containerID="c1f304ef8d212425af3cd04fe0a9d34bc617248b79ce4237464f406c73d90b4a" Nov 27 16:20:22 crc kubenswrapper[4707]: E1127 16:20:22.234105 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c1f304ef8d212425af3cd04fe0a9d34bc617248b79ce4237464f406c73d90b4a\": container with ID starting with c1f304ef8d212425af3cd04fe0a9d34bc617248b79ce4237464f406c73d90b4a not found: ID does not exist" containerID="c1f304ef8d212425af3cd04fe0a9d34bc617248b79ce4237464f406c73d90b4a" Nov 27 16:20:22 crc kubenswrapper[4707]: I1127 16:20:22.234156 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c1f304ef8d212425af3cd04fe0a9d34bc617248b79ce4237464f406c73d90b4a"} err="failed to get container status \"c1f304ef8d212425af3cd04fe0a9d34bc617248b79ce4237464f406c73d90b4a\": rpc error: code = NotFound desc = could not find container \"c1f304ef8d212425af3cd04fe0a9d34bc617248b79ce4237464f406c73d90b4a\": container with ID starting with c1f304ef8d212425af3cd04fe0a9d34bc617248b79ce4237464f406c73d90b4a not found: ID does not exist" Nov 27 16:20:22 crc kubenswrapper[4707]: I1127 16:20:22.234180 4707 scope.go:117] "RemoveContainer" containerID="47d0da891837cb78986aba7318980eddd5df284106643499b77ff5c7f29fc7e3" Nov 27 
16:20:22 crc kubenswrapper[4707]: E1127 16:20:22.235071 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"47d0da891837cb78986aba7318980eddd5df284106643499b77ff5c7f29fc7e3\": container with ID starting with 47d0da891837cb78986aba7318980eddd5df284106643499b77ff5c7f29fc7e3 not found: ID does not exist" containerID="47d0da891837cb78986aba7318980eddd5df284106643499b77ff5c7f29fc7e3" Nov 27 16:20:22 crc kubenswrapper[4707]: I1127 16:20:22.235125 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"47d0da891837cb78986aba7318980eddd5df284106643499b77ff5c7f29fc7e3"} err="failed to get container status \"47d0da891837cb78986aba7318980eddd5df284106643499b77ff5c7f29fc7e3\": rpc error: code = NotFound desc = could not find container \"47d0da891837cb78986aba7318980eddd5df284106643499b77ff5c7f29fc7e3\": container with ID starting with 47d0da891837cb78986aba7318980eddd5df284106643499b77ff5c7f29fc7e3 not found: ID does not exist" Nov 27 16:20:22 crc kubenswrapper[4707]: I1127 16:20:22.373789 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-t7wqv"] Nov 27 16:20:22 crc kubenswrapper[4707]: I1127 16:20:22.389422 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-847c4cc679-ntngj" Nov 27 16:20:22 crc kubenswrapper[4707]: I1127 16:20:22.406431 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-t7wqv"] Nov 27 16:20:22 crc kubenswrapper[4707]: I1127 16:20:22.417437 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5b946c75cc-b9t7n"] Nov 27 16:20:22 crc kubenswrapper[4707]: I1127 16:20:22.424764 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5b946c75cc-b9t7n"] Nov 27 16:20:22 crc kubenswrapper[4707]: I1127 16:20:22.543506 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a90b9ea4-6cc9-48e1-b89c-bfa648577c70-ovsdbserver-nb\") pod \"a90b9ea4-6cc9-48e1-b89c-bfa648577c70\" (UID: \"a90b9ea4-6cc9-48e1-b89c-bfa648577c70\") " Nov 27 16:20:22 crc kubenswrapper[4707]: I1127 16:20:22.543561 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a90b9ea4-6cc9-48e1-b89c-bfa648577c70-config\") pod \"a90b9ea4-6cc9-48e1-b89c-bfa648577c70\" (UID: \"a90b9ea4-6cc9-48e1-b89c-bfa648577c70\") " Nov 27 16:20:22 crc kubenswrapper[4707]: I1127 16:20:22.543737 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vwz72\" (UniqueName: \"kubernetes.io/projected/a90b9ea4-6cc9-48e1-b89c-bfa648577c70-kube-api-access-vwz72\") pod \"a90b9ea4-6cc9-48e1-b89c-bfa648577c70\" (UID: \"a90b9ea4-6cc9-48e1-b89c-bfa648577c70\") " Nov 27 16:20:22 crc kubenswrapper[4707]: I1127 16:20:22.543773 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a90b9ea4-6cc9-48e1-b89c-bfa648577c70-dns-swift-storage-0\") pod \"a90b9ea4-6cc9-48e1-b89c-bfa648577c70\" (UID: \"a90b9ea4-6cc9-48e1-b89c-bfa648577c70\") " Nov 27 
16:20:22 crc kubenswrapper[4707]: I1127 16:20:22.543807 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a90b9ea4-6cc9-48e1-b89c-bfa648577c70-dns-svc\") pod \"a90b9ea4-6cc9-48e1-b89c-bfa648577c70\" (UID: \"a90b9ea4-6cc9-48e1-b89c-bfa648577c70\") " Nov 27 16:20:22 crc kubenswrapper[4707]: I1127 16:20:22.544227 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a90b9ea4-6cc9-48e1-b89c-bfa648577c70-ovsdbserver-sb\") pod \"a90b9ea4-6cc9-48e1-b89c-bfa648577c70\" (UID: \"a90b9ea4-6cc9-48e1-b89c-bfa648577c70\") " Nov 27 16:20:22 crc kubenswrapper[4707]: I1127 16:20:22.552205 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a90b9ea4-6cc9-48e1-b89c-bfa648577c70-kube-api-access-vwz72" (OuterVolumeSpecName: "kube-api-access-vwz72") pod "a90b9ea4-6cc9-48e1-b89c-bfa648577c70" (UID: "a90b9ea4-6cc9-48e1-b89c-bfa648577c70"). InnerVolumeSpecName "kube-api-access-vwz72". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 16:20:22 crc kubenswrapper[4707]: I1127 16:20:22.566646 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a90b9ea4-6cc9-48e1-b89c-bfa648577c70-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "a90b9ea4-6cc9-48e1-b89c-bfa648577c70" (UID: "a90b9ea4-6cc9-48e1-b89c-bfa648577c70"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 16:20:22 crc kubenswrapper[4707]: I1127 16:20:22.568815 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a90b9ea4-6cc9-48e1-b89c-bfa648577c70-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "a90b9ea4-6cc9-48e1-b89c-bfa648577c70" (UID: "a90b9ea4-6cc9-48e1-b89c-bfa648577c70"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 16:20:22 crc kubenswrapper[4707]: I1127 16:20:22.602886 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a90b9ea4-6cc9-48e1-b89c-bfa648577c70-config" (OuterVolumeSpecName: "config") pod "a90b9ea4-6cc9-48e1-b89c-bfa648577c70" (UID: "a90b9ea4-6cc9-48e1-b89c-bfa648577c70"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 16:20:22 crc kubenswrapper[4707]: I1127 16:20:22.605006 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a90b9ea4-6cc9-48e1-b89c-bfa648577c70-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "a90b9ea4-6cc9-48e1-b89c-bfa648577c70" (UID: "a90b9ea4-6cc9-48e1-b89c-bfa648577c70"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 16:20:22 crc kubenswrapper[4707]: I1127 16:20:22.610166 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a90b9ea4-6cc9-48e1-b89c-bfa648577c70-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "a90b9ea4-6cc9-48e1-b89c-bfa648577c70" (UID: "a90b9ea4-6cc9-48e1-b89c-bfa648577c70"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 16:20:22 crc kubenswrapper[4707]: I1127 16:20:22.646271 4707 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a90b9ea4-6cc9-48e1-b89c-bfa648577c70-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Nov 27 16:20:22 crc kubenswrapper[4707]: I1127 16:20:22.646340 4707 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a90b9ea4-6cc9-48e1-b89c-bfa648577c70-dns-svc\") on node \"crc\" DevicePath \"\"" Nov 27 16:20:22 crc kubenswrapper[4707]: I1127 16:20:22.646355 4707 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a90b9ea4-6cc9-48e1-b89c-bfa648577c70-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Nov 27 16:20:22 crc kubenswrapper[4707]: I1127 16:20:22.646382 4707 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a90b9ea4-6cc9-48e1-b89c-bfa648577c70-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Nov 27 16:20:22 crc kubenswrapper[4707]: I1127 16:20:22.646395 4707 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a90b9ea4-6cc9-48e1-b89c-bfa648577c70-config\") on node \"crc\" DevicePath \"\"" Nov 27 16:20:22 crc kubenswrapper[4707]: I1127 16:20:22.646404 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vwz72\" (UniqueName: \"kubernetes.io/projected/a90b9ea4-6cc9-48e1-b89c-bfa648577c70-kube-api-access-vwz72\") on node \"crc\" DevicePath \"\"" Nov 27 16:20:22 crc kubenswrapper[4707]: I1127 16:20:22.708060 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Nov 27 16:20:22 crc kubenswrapper[4707]: I1127 16:20:22.767051 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Nov 27 16:20:22 crc 
kubenswrapper[4707]: I1127 16:20:22.787732 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Nov 27 16:20:22 crc kubenswrapper[4707]: I1127 16:20:22.908418 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-ftcdx" event={"ID":"ffe6108f-9182-4d5b-b877-977da419fc7c","Type":"ContainerStarted","Data":"4cebbc21162ddd05ea2b220c368d5661cfb0b3e37d8da5c199a0c260c2bd13f1"} Nov 27 16:20:22 crc kubenswrapper[4707]: I1127 16:20:22.908463 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-ftcdx" event={"ID":"ffe6108f-9182-4d5b-b877-977da419fc7c","Type":"ContainerStarted","Data":"0f86c145fdacbecaa283ec4e928ba252fb7bf494456439ef725921ab23b36402"} Nov 27 16:20:22 crc kubenswrapper[4707]: I1127 16:20:22.926973 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"e032a3b3-8bee-42e1-a645-17b1151c85d1","Type":"ContainerStarted","Data":"4901ebd5e0f1778a95e5a43e91cf0edcb633327fb5dfa7eff1c4e4a8c2ce611f"} Nov 27 16:20:22 crc kubenswrapper[4707]: I1127 16:20:22.933201 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-qzltn" event={"ID":"525c317f-60e5-4359-bdd4-62caf9f54b38","Type":"ContainerStarted","Data":"440d28a5d4b638c2070005a98e116276f229010931d81c83ebb36f02a46d5f81"} Nov 27 16:20:22 crc kubenswrapper[4707]: I1127 16:20:22.941153 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-sync-ftcdx" podStartSLOduration=2.940040903 podStartE2EDuration="2.940040903s" podCreationTimestamp="2025-11-27 16:20:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-27 16:20:22.933466023 +0000 UTC m=+998.564914791" watchObservedRunningTime="2025-11-27 16:20:22.940040903 +0000 UTC m=+998.571489671" Nov 27 16:20:22 crc kubenswrapper[4707]: I1127 
16:20:22.942248 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a49cd9fc-1364-454e-af11-bbf64e43e56d","Type":"ContainerStarted","Data":"c05d454925458f5498885f54d66c0dac45a32892f7880cdb10728ab6946c6a89"} Nov 27 16:20:22 crc kubenswrapper[4707]: I1127 16:20:22.951223 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-847c4cc679-ntngj" event={"ID":"a90b9ea4-6cc9-48e1-b89c-bfa648577c70","Type":"ContainerDied","Data":"407b4c5bc0fcc0181c897cf7b68c735ed65c35473fcdee2d73ca299771621929"} Nov 27 16:20:22 crc kubenswrapper[4707]: I1127 16:20:22.951480 4707 scope.go:117] "RemoveContainer" containerID="20582c685e946e5e15d651ac03aaefbafe1b7e3b28a1e9d07c4357546053706d" Nov 27 16:20:22 crc kubenswrapper[4707]: I1127 16:20:22.951486 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-847c4cc679-ntngj" Nov 27 16:20:22 crc kubenswrapper[4707]: I1127 16:20:22.956980 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-x8n5z" event={"ID":"06bc7709-2c0c-4b87-8e4f-11e2e1a1726c","Type":"ContainerStarted","Data":"712557d6b5ee3c01c306eeb1e653fa4c69f6c8eec74fff38fbaa551d2cf27f5c"} Nov 27 16:20:22 crc kubenswrapper[4707]: I1127 16:20:22.969350 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-gb8rd" event={"ID":"1b8ce5ef-c2af-4e15-9677-8e878b96c4de","Type":"ContainerStarted","Data":"8b3b6fef5c666be483eb6a325fc48ed52bafcabaf875b0753de1763448512eb8"} Nov 27 16:20:22 crc kubenswrapper[4707]: I1127 16:20:22.976006 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-hmbzj" event={"ID":"6661447e-fc28-4c9a-bcd1-66e15b0ca3fd","Type":"ContainerStarted","Data":"2f0f8b8b2b5b36a15b7e8a4c8da3f4e812cca11f2f3d832159c3c34b339e4bb0"} Nov 27 16:20:22 crc kubenswrapper[4707]: I1127 16:20:22.977882 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/keystone-bootstrap-x8n5z" podStartSLOduration=3.977724189 podStartE2EDuration="3.977724189s" podCreationTimestamp="2025-11-27 16:20:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-27 16:20:22.971580279 +0000 UTC m=+998.603029047" watchObservedRunningTime="2025-11-27 16:20:22.977724189 +0000 UTC m=+998.609172947" Nov 27 16:20:22 crc kubenswrapper[4707]: I1127 16:20:22.982822 4707 generic.go:334] "Generic (PLEG): container finished" podID="b4ad731f-8b0b-4bd5-bf46-2f9a0a3a0b82" containerID="caa589fec436214639fa34128f727e3785d722dfd159296bfe4a33bbbc85191d" exitCode=0 Nov 27 16:20:22 crc kubenswrapper[4707]: I1127 16:20:22.982864 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-785d8bcb8c-cbftg" event={"ID":"b4ad731f-8b0b-4bd5-bf46-2f9a0a3a0b82","Type":"ContainerDied","Data":"caa589fec436214639fa34128f727e3785d722dfd159296bfe4a33bbbc85191d"} Nov 27 16:20:22 crc kubenswrapper[4707]: I1127 16:20:22.982889 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-785d8bcb8c-cbftg" event={"ID":"b4ad731f-8b0b-4bd5-bf46-2f9a0a3a0b82","Type":"ContainerStarted","Data":"fdcf821ca62a141bc378b0d6287b60f1ed9d2e669ce1e27371313ac31de366be"} Nov 27 16:20:23 crc kubenswrapper[4707]: I1127 16:20:23.059873 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-847c4cc679-ntngj"] Nov 27 16:20:23 crc kubenswrapper[4707]: I1127 16:20:23.073806 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-847c4cc679-ntngj"] Nov 27 16:20:23 crc kubenswrapper[4707]: I1127 16:20:23.223635 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="79a68d54-eb44-474e-9c5f-1c2283a6d410" path="/var/lib/kubelet/pods/79a68d54-eb44-474e-9c5f-1c2283a6d410/volumes" Nov 27 16:20:23 crc kubenswrapper[4707]: I1127 16:20:23.233503 4707 kubelet_volumes.go:163] "Cleaned up 
orphaned pod volumes dir" podUID="a90b9ea4-6cc9-48e1-b89c-bfa648577c70" path="/var/lib/kubelet/pods/a90b9ea4-6cc9-48e1-b89c-bfa648577c70/volumes" Nov 27 16:20:23 crc kubenswrapper[4707]: I1127 16:20:23.234280 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c87e3212-fa52-4ef1-b9b0-6f4f3819da16" path="/var/lib/kubelet/pods/c87e3212-fa52-4ef1-b9b0-6f4f3819da16/volumes" Nov 27 16:20:23 crc kubenswrapper[4707]: I1127 16:20:23.234754 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fd4a36f0-fb55-45b1-a4d0-fd45fd3de731" path="/var/lib/kubelet/pods/fd4a36f0-fb55-45b1-a4d0-fd45fd3de731/volumes" Nov 27 16:20:23 crc kubenswrapper[4707]: I1127 16:20:23.313029 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Nov 27 16:20:24 crc kubenswrapper[4707]: I1127 16:20:24.009137 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-785d8bcb8c-cbftg" event={"ID":"b4ad731f-8b0b-4bd5-bf46-2f9a0a3a0b82","Type":"ContainerStarted","Data":"7d261fcd6ef13459b3a61a1ccfa03a9eb1612fb9e8ff130111490a131872ecca"} Nov 27 16:20:24 crc kubenswrapper[4707]: I1127 16:20:24.009230 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-785d8bcb8c-cbftg" Nov 27 16:20:24 crc kubenswrapper[4707]: I1127 16:20:24.013711 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"16d52263-b283-45be-9cb8-c3e921dba158","Type":"ContainerStarted","Data":"d626d4a978f4a2ddab13d233a163c205deee5db91bc66c7632ee34fd9f580b09"} Nov 27 16:20:24 crc kubenswrapper[4707]: I1127 16:20:24.017207 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"e032a3b3-8bee-42e1-a645-17b1151c85d1","Type":"ContainerStarted","Data":"e4bd368daa5956eba42a3efa5d84ccee2790b82b04a1662b63f9b270595bd24a"} Nov 27 16:20:24 crc kubenswrapper[4707]: I1127 16:20:24.034806 
4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-785d8bcb8c-cbftg" podStartSLOduration=4.034786786 podStartE2EDuration="4.034786786s" podCreationTimestamp="2025-11-27 16:20:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-27 16:20:24.024890495 +0000 UTC m=+999.656339263" watchObservedRunningTime="2025-11-27 16:20:24.034786786 +0000 UTC m=+999.666235554" Nov 27 16:20:25 crc kubenswrapper[4707]: I1127 16:20:25.031141 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"e032a3b3-8bee-42e1-a645-17b1151c85d1","Type":"ContainerStarted","Data":"3e5b6e94305e2dba2a8d8db42053becfe3c1d0a386bf4f648c16cec75bff48f3"} Nov 27 16:20:25 crc kubenswrapper[4707]: I1127 16:20:25.031527 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="e032a3b3-8bee-42e1-a645-17b1151c85d1" containerName="glance-log" containerID="cri-o://e4bd368daa5956eba42a3efa5d84ccee2790b82b04a1662b63f9b270595bd24a" gracePeriod=30 Nov 27 16:20:25 crc kubenswrapper[4707]: I1127 16:20:25.032011 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="e032a3b3-8bee-42e1-a645-17b1151c85d1" containerName="glance-httpd" containerID="cri-o://3e5b6e94305e2dba2a8d8db42053becfe3c1d0a386bf4f648c16cec75bff48f3" gracePeriod=30 Nov 27 16:20:25 crc kubenswrapper[4707]: I1127 16:20:25.038576 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"16d52263-b283-45be-9cb8-c3e921dba158","Type":"ContainerStarted","Data":"f9c410da6adcacbec16bd4ead875bc2e39f4b0a65bfe6c4191403a1e940304c9"} Nov 27 16:20:25 crc kubenswrapper[4707]: I1127 16:20:25.038626 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/glance-default-internal-api-0" event={"ID":"16d52263-b283-45be-9cb8-c3e921dba158","Type":"ContainerStarted","Data":"a8d0e99fc180c7ca27debe514006dca82b4074e304a1a4d99f2fe79de29667a5"} Nov 27 16:20:25 crc kubenswrapper[4707]: I1127 16:20:25.038719 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="16d52263-b283-45be-9cb8-c3e921dba158" containerName="glance-log" containerID="cri-o://a8d0e99fc180c7ca27debe514006dca82b4074e304a1a4d99f2fe79de29667a5" gracePeriod=30 Nov 27 16:20:25 crc kubenswrapper[4707]: I1127 16:20:25.038754 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="16d52263-b283-45be-9cb8-c3e921dba158" containerName="glance-httpd" containerID="cri-o://f9c410da6adcacbec16bd4ead875bc2e39f4b0a65bfe6c4191403a1e940304c9" gracePeriod=30 Nov 27 16:20:25 crc kubenswrapper[4707]: I1127 16:20:25.067287 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=5.067270177 podStartE2EDuration="5.067270177s" podCreationTimestamp="2025-11-27 16:20:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-27 16:20:25.057743775 +0000 UTC m=+1000.689192543" watchObservedRunningTime="2025-11-27 16:20:25.067270177 +0000 UTC m=+1000.698718945" Nov 27 16:20:25 crc kubenswrapper[4707]: I1127 16:20:25.084020 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=5.083999343 podStartE2EDuration="5.083999343s" podCreationTimestamp="2025-11-27 16:20:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-27 16:20:25.078655963 +0000 UTC m=+1000.710104731" 
watchObservedRunningTime="2025-11-27 16:20:25.083999343 +0000 UTC m=+1000.715448111" Nov 27 16:20:26 crc kubenswrapper[4707]: I1127 16:20:26.048346 4707 generic.go:334] "Generic (PLEG): container finished" podID="06bc7709-2c0c-4b87-8e4f-11e2e1a1726c" containerID="712557d6b5ee3c01c306eeb1e653fa4c69f6c8eec74fff38fbaa551d2cf27f5c" exitCode=0 Nov 27 16:20:26 crc kubenswrapper[4707]: I1127 16:20:26.048509 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-x8n5z" event={"ID":"06bc7709-2c0c-4b87-8e4f-11e2e1a1726c","Type":"ContainerDied","Data":"712557d6b5ee3c01c306eeb1e653fa4c69f6c8eec74fff38fbaa551d2cf27f5c"} Nov 27 16:20:26 crc kubenswrapper[4707]: I1127 16:20:26.053117 4707 generic.go:334] "Generic (PLEG): container finished" podID="16d52263-b283-45be-9cb8-c3e921dba158" containerID="f9c410da6adcacbec16bd4ead875bc2e39f4b0a65bfe6c4191403a1e940304c9" exitCode=143 Nov 27 16:20:26 crc kubenswrapper[4707]: I1127 16:20:26.053150 4707 generic.go:334] "Generic (PLEG): container finished" podID="16d52263-b283-45be-9cb8-c3e921dba158" containerID="a8d0e99fc180c7ca27debe514006dca82b4074e304a1a4d99f2fe79de29667a5" exitCode=143 Nov 27 16:20:26 crc kubenswrapper[4707]: I1127 16:20:26.053199 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"16d52263-b283-45be-9cb8-c3e921dba158","Type":"ContainerDied","Data":"f9c410da6adcacbec16bd4ead875bc2e39f4b0a65bfe6c4191403a1e940304c9"} Nov 27 16:20:26 crc kubenswrapper[4707]: I1127 16:20:26.053226 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"16d52263-b283-45be-9cb8-c3e921dba158","Type":"ContainerDied","Data":"a8d0e99fc180c7ca27debe514006dca82b4074e304a1a4d99f2fe79de29667a5"} Nov 27 16:20:26 crc kubenswrapper[4707]: I1127 16:20:26.055644 4707 generic.go:334] "Generic (PLEG): container finished" podID="e032a3b3-8bee-42e1-a645-17b1151c85d1" 
containerID="3e5b6e94305e2dba2a8d8db42053becfe3c1d0a386bf4f648c16cec75bff48f3" exitCode=0 Nov 27 16:20:26 crc kubenswrapper[4707]: I1127 16:20:26.055669 4707 generic.go:334] "Generic (PLEG): container finished" podID="e032a3b3-8bee-42e1-a645-17b1151c85d1" containerID="e4bd368daa5956eba42a3efa5d84ccee2790b82b04a1662b63f9b270595bd24a" exitCode=143 Nov 27 16:20:26 crc kubenswrapper[4707]: I1127 16:20:26.055690 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"e032a3b3-8bee-42e1-a645-17b1151c85d1","Type":"ContainerDied","Data":"3e5b6e94305e2dba2a8d8db42053becfe3c1d0a386bf4f648c16cec75bff48f3"} Nov 27 16:20:26 crc kubenswrapper[4707]: I1127 16:20:26.055712 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"e032a3b3-8bee-42e1-a645-17b1151c85d1","Type":"ContainerDied","Data":"e4bd368daa5956eba42a3efa5d84ccee2790b82b04a1662b63f9b270595bd24a"} Nov 27 16:20:28 crc kubenswrapper[4707]: I1127 16:20:28.816690 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-x8n5z" Nov 27 16:20:28 crc kubenswrapper[4707]: I1127 16:20:28.888278 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/06bc7709-2c0c-4b87-8e4f-11e2e1a1726c-combined-ca-bundle\") pod \"06bc7709-2c0c-4b87-8e4f-11e2e1a1726c\" (UID: \"06bc7709-2c0c-4b87-8e4f-11e2e1a1726c\") " Nov 27 16:20:28 crc kubenswrapper[4707]: I1127 16:20:28.888387 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/06bc7709-2c0c-4b87-8e4f-11e2e1a1726c-credential-keys\") pod \"06bc7709-2c0c-4b87-8e4f-11e2e1a1726c\" (UID: \"06bc7709-2c0c-4b87-8e4f-11e2e1a1726c\") " Nov 27 16:20:28 crc kubenswrapper[4707]: I1127 16:20:28.888442 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ddm5r\" (UniqueName: \"kubernetes.io/projected/06bc7709-2c0c-4b87-8e4f-11e2e1a1726c-kube-api-access-ddm5r\") pod \"06bc7709-2c0c-4b87-8e4f-11e2e1a1726c\" (UID: \"06bc7709-2c0c-4b87-8e4f-11e2e1a1726c\") " Nov 27 16:20:28 crc kubenswrapper[4707]: I1127 16:20:28.888521 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/06bc7709-2c0c-4b87-8e4f-11e2e1a1726c-scripts\") pod \"06bc7709-2c0c-4b87-8e4f-11e2e1a1726c\" (UID: \"06bc7709-2c0c-4b87-8e4f-11e2e1a1726c\") " Nov 27 16:20:28 crc kubenswrapper[4707]: I1127 16:20:28.888586 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/06bc7709-2c0c-4b87-8e4f-11e2e1a1726c-config-data\") pod \"06bc7709-2c0c-4b87-8e4f-11e2e1a1726c\" (UID: \"06bc7709-2c0c-4b87-8e4f-11e2e1a1726c\") " Nov 27 16:20:28 crc kubenswrapper[4707]: I1127 16:20:28.888615 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" 
(UniqueName: \"kubernetes.io/secret/06bc7709-2c0c-4b87-8e4f-11e2e1a1726c-fernet-keys\") pod \"06bc7709-2c0c-4b87-8e4f-11e2e1a1726c\" (UID: \"06bc7709-2c0c-4b87-8e4f-11e2e1a1726c\") " Nov 27 16:20:28 crc kubenswrapper[4707]: I1127 16:20:28.894402 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/06bc7709-2c0c-4b87-8e4f-11e2e1a1726c-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "06bc7709-2c0c-4b87-8e4f-11e2e1a1726c" (UID: "06bc7709-2c0c-4b87-8e4f-11e2e1a1726c"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 16:20:28 crc kubenswrapper[4707]: I1127 16:20:28.894532 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/06bc7709-2c0c-4b87-8e4f-11e2e1a1726c-scripts" (OuterVolumeSpecName: "scripts") pod "06bc7709-2c0c-4b87-8e4f-11e2e1a1726c" (UID: "06bc7709-2c0c-4b87-8e4f-11e2e1a1726c"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 16:20:28 crc kubenswrapper[4707]: I1127 16:20:28.894831 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/06bc7709-2c0c-4b87-8e4f-11e2e1a1726c-kube-api-access-ddm5r" (OuterVolumeSpecName: "kube-api-access-ddm5r") pod "06bc7709-2c0c-4b87-8e4f-11e2e1a1726c" (UID: "06bc7709-2c0c-4b87-8e4f-11e2e1a1726c"). InnerVolumeSpecName "kube-api-access-ddm5r". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 16:20:28 crc kubenswrapper[4707]: I1127 16:20:28.894922 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/06bc7709-2c0c-4b87-8e4f-11e2e1a1726c-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "06bc7709-2c0c-4b87-8e4f-11e2e1a1726c" (UID: "06bc7709-2c0c-4b87-8e4f-11e2e1a1726c"). InnerVolumeSpecName "credential-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 16:20:28 crc kubenswrapper[4707]: I1127 16:20:28.942359 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/06bc7709-2c0c-4b87-8e4f-11e2e1a1726c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "06bc7709-2c0c-4b87-8e4f-11e2e1a1726c" (UID: "06bc7709-2c0c-4b87-8e4f-11e2e1a1726c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 16:20:28 crc kubenswrapper[4707]: I1127 16:20:28.949757 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/06bc7709-2c0c-4b87-8e4f-11e2e1a1726c-config-data" (OuterVolumeSpecName: "config-data") pod "06bc7709-2c0c-4b87-8e4f-11e2e1a1726c" (UID: "06bc7709-2c0c-4b87-8e4f-11e2e1a1726c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 16:20:28 crc kubenswrapper[4707]: I1127 16:20:28.990455 4707 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/06bc7709-2c0c-4b87-8e4f-11e2e1a1726c-credential-keys\") on node \"crc\" DevicePath \"\"" Nov 27 16:20:28 crc kubenswrapper[4707]: I1127 16:20:28.990749 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ddm5r\" (UniqueName: \"kubernetes.io/projected/06bc7709-2c0c-4b87-8e4f-11e2e1a1726c-kube-api-access-ddm5r\") on node \"crc\" DevicePath \"\"" Nov 27 16:20:28 crc kubenswrapper[4707]: I1127 16:20:28.990762 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/06bc7709-2c0c-4b87-8e4f-11e2e1a1726c-scripts\") on node \"crc\" DevicePath \"\"" Nov 27 16:20:28 crc kubenswrapper[4707]: I1127 16:20:28.990771 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/06bc7709-2c0c-4b87-8e4f-11e2e1a1726c-config-data\") on node \"crc\" DevicePath \"\"" Nov 27 
16:20:28 crc kubenswrapper[4707]: I1127 16:20:28.990779 4707 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/06bc7709-2c0c-4b87-8e4f-11e2e1a1726c-fernet-keys\") on node \"crc\" DevicePath \"\"" Nov 27 16:20:28 crc kubenswrapper[4707]: I1127 16:20:28.990788 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/06bc7709-2c0c-4b87-8e4f-11e2e1a1726c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 27 16:20:29 crc kubenswrapper[4707]: I1127 16:20:29.072480 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-b989k"] Nov 27 16:20:29 crc kubenswrapper[4707]: E1127 16:20:29.072879 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fd4a36f0-fb55-45b1-a4d0-fd45fd3de731" containerName="dnsmasq-dns" Nov 27 16:20:29 crc kubenswrapper[4707]: I1127 16:20:29.072892 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="fd4a36f0-fb55-45b1-a4d0-fd45fd3de731" containerName="dnsmasq-dns" Nov 27 16:20:29 crc kubenswrapper[4707]: E1127 16:20:29.072906 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="79a68d54-eb44-474e-9c5f-1c2283a6d410" containerName="registry-server" Nov 27 16:20:29 crc kubenswrapper[4707]: I1127 16:20:29.072914 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="79a68d54-eb44-474e-9c5f-1c2283a6d410" containerName="registry-server" Nov 27 16:20:29 crc kubenswrapper[4707]: E1127 16:20:29.072934 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fd4a36f0-fb55-45b1-a4d0-fd45fd3de731" containerName="init" Nov 27 16:20:29 crc kubenswrapper[4707]: I1127 16:20:29.072942 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="fd4a36f0-fb55-45b1-a4d0-fd45fd3de731" containerName="init" Nov 27 16:20:29 crc kubenswrapper[4707]: E1127 16:20:29.072961 4707 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="a90b9ea4-6cc9-48e1-b89c-bfa648577c70" containerName="init" Nov 27 16:20:29 crc kubenswrapper[4707]: I1127 16:20:29.072968 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="a90b9ea4-6cc9-48e1-b89c-bfa648577c70" containerName="init" Nov 27 16:20:29 crc kubenswrapper[4707]: E1127 16:20:29.072987 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="06bc7709-2c0c-4b87-8e4f-11e2e1a1726c" containerName="keystone-bootstrap" Nov 27 16:20:29 crc kubenswrapper[4707]: I1127 16:20:29.072995 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="06bc7709-2c0c-4b87-8e4f-11e2e1a1726c" containerName="keystone-bootstrap" Nov 27 16:20:29 crc kubenswrapper[4707]: E1127 16:20:29.073009 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c87e3212-fa52-4ef1-b9b0-6f4f3819da16" containerName="init" Nov 27 16:20:29 crc kubenswrapper[4707]: I1127 16:20:29.073018 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="c87e3212-fa52-4ef1-b9b0-6f4f3819da16" containerName="init" Nov 27 16:20:29 crc kubenswrapper[4707]: E1127 16:20:29.073038 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="79a68d54-eb44-474e-9c5f-1c2283a6d410" containerName="extract-content" Nov 27 16:20:29 crc kubenswrapper[4707]: I1127 16:20:29.073045 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="79a68d54-eb44-474e-9c5f-1c2283a6d410" containerName="extract-content" Nov 27 16:20:29 crc kubenswrapper[4707]: E1127 16:20:29.073063 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="79a68d54-eb44-474e-9c5f-1c2283a6d410" containerName="extract-utilities" Nov 27 16:20:29 crc kubenswrapper[4707]: I1127 16:20:29.073072 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="79a68d54-eb44-474e-9c5f-1c2283a6d410" containerName="extract-utilities" Nov 27 16:20:29 crc kubenswrapper[4707]: I1127 16:20:29.073270 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="fd4a36f0-fb55-45b1-a4d0-fd45fd3de731" 
containerName="dnsmasq-dns" Nov 27 16:20:29 crc kubenswrapper[4707]: I1127 16:20:29.073285 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="a90b9ea4-6cc9-48e1-b89c-bfa648577c70" containerName="init" Nov 27 16:20:29 crc kubenswrapper[4707]: I1127 16:20:29.073296 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="c87e3212-fa52-4ef1-b9b0-6f4f3819da16" containerName="init" Nov 27 16:20:29 crc kubenswrapper[4707]: I1127 16:20:29.073319 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="06bc7709-2c0c-4b87-8e4f-11e2e1a1726c" containerName="keystone-bootstrap" Nov 27 16:20:29 crc kubenswrapper[4707]: I1127 16:20:29.073341 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="79a68d54-eb44-474e-9c5f-1c2283a6d410" containerName="registry-server" Nov 27 16:20:29 crc kubenswrapper[4707]: I1127 16:20:29.076182 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-b989k" Nov 27 16:20:29 crc kubenswrapper[4707]: I1127 16:20:29.086194 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-b989k"] Nov 27 16:20:29 crc kubenswrapper[4707]: I1127 16:20:29.091908 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-x8n5z" event={"ID":"06bc7709-2c0c-4b87-8e4f-11e2e1a1726c","Type":"ContainerDied","Data":"e43132b4e965544d73e344ec669d3ba2607b5ed33d165e79f338e1fb98976dba"} Nov 27 16:20:29 crc kubenswrapper[4707]: I1127 16:20:29.091949 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e43132b4e965544d73e344ec669d3ba2607b5ed33d165e79f338e1fb98976dba" Nov 27 16:20:29 crc kubenswrapper[4707]: I1127 16:20:29.092019 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-x8n5z" Nov 27 16:20:29 crc kubenswrapper[4707]: I1127 16:20:29.193316 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7fjkb\" (UniqueName: \"kubernetes.io/projected/d6b92c04-fcc0-4d96-8200-3edd228dd326-kube-api-access-7fjkb\") pod \"community-operators-b989k\" (UID: \"d6b92c04-fcc0-4d96-8200-3edd228dd326\") " pod="openshift-marketplace/community-operators-b989k" Nov 27 16:20:29 crc kubenswrapper[4707]: I1127 16:20:29.193399 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d6b92c04-fcc0-4d96-8200-3edd228dd326-utilities\") pod \"community-operators-b989k\" (UID: \"d6b92c04-fcc0-4d96-8200-3edd228dd326\") " pod="openshift-marketplace/community-operators-b989k" Nov 27 16:20:29 crc kubenswrapper[4707]: I1127 16:20:29.193450 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d6b92c04-fcc0-4d96-8200-3edd228dd326-catalog-content\") pod \"community-operators-b989k\" (UID: \"d6b92c04-fcc0-4d96-8200-3edd228dd326\") " pod="openshift-marketplace/community-operators-b989k" Nov 27 16:20:29 crc kubenswrapper[4707]: I1127 16:20:29.295089 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d6b92c04-fcc0-4d96-8200-3edd228dd326-utilities\") pod \"community-operators-b989k\" (UID: \"d6b92c04-fcc0-4d96-8200-3edd228dd326\") " pod="openshift-marketplace/community-operators-b989k" Nov 27 16:20:29 crc kubenswrapper[4707]: I1127 16:20:29.295137 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d6b92c04-fcc0-4d96-8200-3edd228dd326-catalog-content\") pod \"community-operators-b989k\" (UID: 
\"d6b92c04-fcc0-4d96-8200-3edd228dd326\") " pod="openshift-marketplace/community-operators-b989k" Nov 27 16:20:29 crc kubenswrapper[4707]: I1127 16:20:29.295306 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7fjkb\" (UniqueName: \"kubernetes.io/projected/d6b92c04-fcc0-4d96-8200-3edd228dd326-kube-api-access-7fjkb\") pod \"community-operators-b989k\" (UID: \"d6b92c04-fcc0-4d96-8200-3edd228dd326\") " pod="openshift-marketplace/community-operators-b989k" Nov 27 16:20:29 crc kubenswrapper[4707]: I1127 16:20:29.295797 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d6b92c04-fcc0-4d96-8200-3edd228dd326-utilities\") pod \"community-operators-b989k\" (UID: \"d6b92c04-fcc0-4d96-8200-3edd228dd326\") " pod="openshift-marketplace/community-operators-b989k" Nov 27 16:20:29 crc kubenswrapper[4707]: I1127 16:20:29.304745 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d6b92c04-fcc0-4d96-8200-3edd228dd326-catalog-content\") pod \"community-operators-b989k\" (UID: \"d6b92c04-fcc0-4d96-8200-3edd228dd326\") " pod="openshift-marketplace/community-operators-b989k" Nov 27 16:20:29 crc kubenswrapper[4707]: I1127 16:20:29.317599 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7fjkb\" (UniqueName: \"kubernetes.io/projected/d6b92c04-fcc0-4d96-8200-3edd228dd326-kube-api-access-7fjkb\") pod \"community-operators-b989k\" (UID: \"d6b92c04-fcc0-4d96-8200-3edd228dd326\") " pod="openshift-marketplace/community-operators-b989k" Nov 27 16:20:29 crc kubenswrapper[4707]: I1127 16:20:29.406235 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-b989k" Nov 27 16:20:29 crc kubenswrapper[4707]: I1127 16:20:29.901261 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-x8n5z"] Nov 27 16:20:29 crc kubenswrapper[4707]: I1127 16:20:29.910487 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-x8n5z"] Nov 27 16:20:30 crc kubenswrapper[4707]: I1127 16:20:30.059302 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-fh47h"] Nov 27 16:20:30 crc kubenswrapper[4707]: I1127 16:20:30.060262 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-fh47h" Nov 27 16:20:30 crc kubenswrapper[4707]: I1127 16:20:30.062271 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Nov 27 16:20:30 crc kubenswrapper[4707]: I1127 16:20:30.062703 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Nov 27 16:20:30 crc kubenswrapper[4707]: I1127 16:20:30.062774 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Nov 27 16:20:30 crc kubenswrapper[4707]: I1127 16:20:30.063820 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Nov 27 16:20:30 crc kubenswrapper[4707]: I1127 16:20:30.065124 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-drvd4" Nov 27 16:20:30 crc kubenswrapper[4707]: I1127 16:20:30.116179 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/4db357ae-5dd8-47c3-8f13-fd888df4fd42-fernet-keys\") pod \"keystone-bootstrap-fh47h\" (UID: \"4db357ae-5dd8-47c3-8f13-fd888df4fd42\") " pod="openstack/keystone-bootstrap-fh47h" Nov 27 16:20:30 crc kubenswrapper[4707]: I1127 
16:20:30.116281 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4db357ae-5dd8-47c3-8f13-fd888df4fd42-scripts\") pod \"keystone-bootstrap-fh47h\" (UID: \"4db357ae-5dd8-47c3-8f13-fd888df4fd42\") " pod="openstack/keystone-bootstrap-fh47h" Nov 27 16:20:30 crc kubenswrapper[4707]: I1127 16:20:30.116310 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6cccf\" (UniqueName: \"kubernetes.io/projected/4db357ae-5dd8-47c3-8f13-fd888df4fd42-kube-api-access-6cccf\") pod \"keystone-bootstrap-fh47h\" (UID: \"4db357ae-5dd8-47c3-8f13-fd888df4fd42\") " pod="openstack/keystone-bootstrap-fh47h" Nov 27 16:20:30 crc kubenswrapper[4707]: I1127 16:20:30.116412 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/4db357ae-5dd8-47c3-8f13-fd888df4fd42-credential-keys\") pod \"keystone-bootstrap-fh47h\" (UID: \"4db357ae-5dd8-47c3-8f13-fd888df4fd42\") " pod="openstack/keystone-bootstrap-fh47h" Nov 27 16:20:30 crc kubenswrapper[4707]: I1127 16:20:30.116433 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4db357ae-5dd8-47c3-8f13-fd888df4fd42-combined-ca-bundle\") pod \"keystone-bootstrap-fh47h\" (UID: \"4db357ae-5dd8-47c3-8f13-fd888df4fd42\") " pod="openstack/keystone-bootstrap-fh47h" Nov 27 16:20:30 crc kubenswrapper[4707]: I1127 16:20:30.116503 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4db357ae-5dd8-47c3-8f13-fd888df4fd42-config-data\") pod \"keystone-bootstrap-fh47h\" (UID: \"4db357ae-5dd8-47c3-8f13-fd888df4fd42\") " pod="openstack/keystone-bootstrap-fh47h" Nov 27 16:20:30 crc kubenswrapper[4707]: I1127 
16:20:30.116723 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-fh47h"] Nov 27 16:20:30 crc kubenswrapper[4707]: I1127 16:20:30.217751 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/4db357ae-5dd8-47c3-8f13-fd888df4fd42-fernet-keys\") pod \"keystone-bootstrap-fh47h\" (UID: \"4db357ae-5dd8-47c3-8f13-fd888df4fd42\") " pod="openstack/keystone-bootstrap-fh47h" Nov 27 16:20:30 crc kubenswrapper[4707]: I1127 16:20:30.218272 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4db357ae-5dd8-47c3-8f13-fd888df4fd42-scripts\") pod \"keystone-bootstrap-fh47h\" (UID: \"4db357ae-5dd8-47c3-8f13-fd888df4fd42\") " pod="openstack/keystone-bootstrap-fh47h" Nov 27 16:20:30 crc kubenswrapper[4707]: I1127 16:20:30.218329 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6cccf\" (UniqueName: \"kubernetes.io/projected/4db357ae-5dd8-47c3-8f13-fd888df4fd42-kube-api-access-6cccf\") pod \"keystone-bootstrap-fh47h\" (UID: \"4db357ae-5dd8-47c3-8f13-fd888df4fd42\") " pod="openstack/keystone-bootstrap-fh47h" Nov 27 16:20:30 crc kubenswrapper[4707]: I1127 16:20:30.218399 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/4db357ae-5dd8-47c3-8f13-fd888df4fd42-credential-keys\") pod \"keystone-bootstrap-fh47h\" (UID: \"4db357ae-5dd8-47c3-8f13-fd888df4fd42\") " pod="openstack/keystone-bootstrap-fh47h" Nov 27 16:20:30 crc kubenswrapper[4707]: I1127 16:20:30.218419 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4db357ae-5dd8-47c3-8f13-fd888df4fd42-combined-ca-bundle\") pod \"keystone-bootstrap-fh47h\" (UID: \"4db357ae-5dd8-47c3-8f13-fd888df4fd42\") " 
pod="openstack/keystone-bootstrap-fh47h" Nov 27 16:20:30 crc kubenswrapper[4707]: I1127 16:20:30.218490 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4db357ae-5dd8-47c3-8f13-fd888df4fd42-config-data\") pod \"keystone-bootstrap-fh47h\" (UID: \"4db357ae-5dd8-47c3-8f13-fd888df4fd42\") " pod="openstack/keystone-bootstrap-fh47h" Nov 27 16:20:30 crc kubenswrapper[4707]: I1127 16:20:30.222067 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4db357ae-5dd8-47c3-8f13-fd888df4fd42-scripts\") pod \"keystone-bootstrap-fh47h\" (UID: \"4db357ae-5dd8-47c3-8f13-fd888df4fd42\") " pod="openstack/keystone-bootstrap-fh47h" Nov 27 16:20:30 crc kubenswrapper[4707]: I1127 16:20:30.222490 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/4db357ae-5dd8-47c3-8f13-fd888df4fd42-credential-keys\") pod \"keystone-bootstrap-fh47h\" (UID: \"4db357ae-5dd8-47c3-8f13-fd888df4fd42\") " pod="openstack/keystone-bootstrap-fh47h" Nov 27 16:20:30 crc kubenswrapper[4707]: I1127 16:20:30.226290 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/4db357ae-5dd8-47c3-8f13-fd888df4fd42-fernet-keys\") pod \"keystone-bootstrap-fh47h\" (UID: \"4db357ae-5dd8-47c3-8f13-fd888df4fd42\") " pod="openstack/keystone-bootstrap-fh47h" Nov 27 16:20:30 crc kubenswrapper[4707]: I1127 16:20:30.235538 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4db357ae-5dd8-47c3-8f13-fd888df4fd42-config-data\") pod \"keystone-bootstrap-fh47h\" (UID: \"4db357ae-5dd8-47c3-8f13-fd888df4fd42\") " pod="openstack/keystone-bootstrap-fh47h" Nov 27 16:20:30 crc kubenswrapper[4707]: I1127 16:20:30.235847 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"kube-api-access-6cccf\" (UniqueName: \"kubernetes.io/projected/4db357ae-5dd8-47c3-8f13-fd888df4fd42-kube-api-access-6cccf\") pod \"keystone-bootstrap-fh47h\" (UID: \"4db357ae-5dd8-47c3-8f13-fd888df4fd42\") " pod="openstack/keystone-bootstrap-fh47h" Nov 27 16:20:30 crc kubenswrapper[4707]: I1127 16:20:30.239301 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4db357ae-5dd8-47c3-8f13-fd888df4fd42-combined-ca-bundle\") pod \"keystone-bootstrap-fh47h\" (UID: \"4db357ae-5dd8-47c3-8f13-fd888df4fd42\") " pod="openstack/keystone-bootstrap-fh47h" Nov 27 16:20:30 crc kubenswrapper[4707]: I1127 16:20:30.386655 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-fh47h" Nov 27 16:20:31 crc kubenswrapper[4707]: I1127 16:20:31.101493 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-785d8bcb8c-cbftg" Nov 27 16:20:31 crc kubenswrapper[4707]: I1127 16:20:31.216920 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="06bc7709-2c0c-4b87-8e4f-11e2e1a1726c" path="/var/lib/kubelet/pods/06bc7709-2c0c-4b87-8e4f-11e2e1a1726c/volumes" Nov 27 16:20:31 crc kubenswrapper[4707]: I1127 16:20:31.217451 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-698758b865-kdgq7"] Nov 27 16:20:31 crc kubenswrapper[4707]: I1127 16:20:31.217710 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-698758b865-kdgq7" podUID="7f3e04a7-5107-48e4-897a-8a126c0b2911" containerName="dnsmasq-dns" containerID="cri-o://d95331cb09f33591d1df0d76198fbd67a3b71fc251fe031ec1da54f833d0221b" gracePeriod=10 Nov 27 16:20:32 crc kubenswrapper[4707]: I1127 16:20:32.191590 4707 generic.go:334] "Generic (PLEG): container finished" podID="7f3e04a7-5107-48e4-897a-8a126c0b2911" 
containerID="d95331cb09f33591d1df0d76198fbd67a3b71fc251fe031ec1da54f833d0221b" exitCode=0 Nov 27 16:20:32 crc kubenswrapper[4707]: I1127 16:20:32.191878 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-kdgq7" event={"ID":"7f3e04a7-5107-48e4-897a-8a126c0b2911","Type":"ContainerDied","Data":"d95331cb09f33591d1df0d76198fbd67a3b71fc251fe031ec1da54f833d0221b"} Nov 27 16:20:32 crc kubenswrapper[4707]: I1127 16:20:32.645333 4707 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-698758b865-kdgq7" podUID="7f3e04a7-5107-48e4-897a-8a126c0b2911" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.109:5353: connect: connection refused" Nov 27 16:20:42 crc kubenswrapper[4707]: I1127 16:20:42.646462 4707 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-698758b865-kdgq7" podUID="7f3e04a7-5107-48e4-897a-8a126c0b2911" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.109:5353: i/o timeout" Nov 27 16:20:47 crc kubenswrapper[4707]: I1127 16:20:47.515461 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Nov 27 16:20:47 crc kubenswrapper[4707]: I1127 16:20:47.518520 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Nov 27 16:20:47 crc kubenswrapper[4707]: I1127 16:20:47.647464 4707 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-698758b865-kdgq7" podUID="7f3e04a7-5107-48e4-897a-8a126c0b2911" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.109:5353: i/o timeout" Nov 27 16:20:47 crc kubenswrapper[4707]: I1127 16:20:47.647781 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-698758b865-kdgq7" Nov 27 16:20:47 crc kubenswrapper[4707]: I1127 16:20:47.681387 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/16d52263-b283-45be-9cb8-c3e921dba158-httpd-run\") pod \"16d52263-b283-45be-9cb8-c3e921dba158\" (UID: \"16d52263-b283-45be-9cb8-c3e921dba158\") " Nov 27 16:20:47 crc kubenswrapper[4707]: I1127 16:20:47.681457 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7wzl2\" (UniqueName: \"kubernetes.io/projected/e032a3b3-8bee-42e1-a645-17b1151c85d1-kube-api-access-7wzl2\") pod \"e032a3b3-8bee-42e1-a645-17b1151c85d1\" (UID: \"e032a3b3-8bee-42e1-a645-17b1151c85d1\") " Nov 27 16:20:47 crc kubenswrapper[4707]: I1127 16:20:47.681482 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e032a3b3-8bee-42e1-a645-17b1151c85d1-scripts\") pod \"e032a3b3-8bee-42e1-a645-17b1151c85d1\" (UID: \"e032a3b3-8bee-42e1-a645-17b1151c85d1\") " Nov 27 16:20:47 crc kubenswrapper[4707]: I1127 16:20:47.681522 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/16d52263-b283-45be-9cb8-c3e921dba158-internal-tls-certs\") pod \"16d52263-b283-45be-9cb8-c3e921dba158\" (UID: \"16d52263-b283-45be-9cb8-c3e921dba158\") " Nov 27 16:20:47 crc 
kubenswrapper[4707]: I1127 16:20:47.681553 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e032a3b3-8bee-42e1-a645-17b1151c85d1-logs\") pod \"e032a3b3-8bee-42e1-a645-17b1151c85d1\" (UID: \"e032a3b3-8bee-42e1-a645-17b1151c85d1\") " Nov 27 16:20:47 crc kubenswrapper[4707]: I1127 16:20:47.681637 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"16d52263-b283-45be-9cb8-c3e921dba158\" (UID: \"16d52263-b283-45be-9cb8-c3e921dba158\") " Nov 27 16:20:47 crc kubenswrapper[4707]: I1127 16:20:47.681686 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/16d52263-b283-45be-9cb8-c3e921dba158-logs\") pod \"16d52263-b283-45be-9cb8-c3e921dba158\" (UID: \"16d52263-b283-45be-9cb8-c3e921dba158\") " Nov 27 16:20:47 crc kubenswrapper[4707]: I1127 16:20:47.681740 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e032a3b3-8bee-42e1-a645-17b1151c85d1-httpd-run\") pod \"e032a3b3-8bee-42e1-a645-17b1151c85d1\" (UID: \"e032a3b3-8bee-42e1-a645-17b1151c85d1\") " Nov 27 16:20:47 crc kubenswrapper[4707]: I1127 16:20:47.681769 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e032a3b3-8bee-42e1-a645-17b1151c85d1-config-data\") pod \"e032a3b3-8bee-42e1-a645-17b1151c85d1\" (UID: \"e032a3b3-8bee-42e1-a645-17b1151c85d1\") " Nov 27 16:20:47 crc kubenswrapper[4707]: I1127 16:20:47.681793 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e032a3b3-8bee-42e1-a645-17b1151c85d1-public-tls-certs\") pod \"e032a3b3-8bee-42e1-a645-17b1151c85d1\" (UID: 
\"e032a3b3-8bee-42e1-a645-17b1151c85d1\") " Nov 27 16:20:47 crc kubenswrapper[4707]: I1127 16:20:47.681817 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e032a3b3-8bee-42e1-a645-17b1151c85d1-combined-ca-bundle\") pod \"e032a3b3-8bee-42e1-a645-17b1151c85d1\" (UID: \"e032a3b3-8bee-42e1-a645-17b1151c85d1\") " Nov 27 16:20:47 crc kubenswrapper[4707]: I1127 16:20:47.681848 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tkmx8\" (UniqueName: \"kubernetes.io/projected/16d52263-b283-45be-9cb8-c3e921dba158-kube-api-access-tkmx8\") pod \"16d52263-b283-45be-9cb8-c3e921dba158\" (UID: \"16d52263-b283-45be-9cb8-c3e921dba158\") " Nov 27 16:20:47 crc kubenswrapper[4707]: I1127 16:20:47.681877 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/16d52263-b283-45be-9cb8-c3e921dba158-combined-ca-bundle\") pod \"16d52263-b283-45be-9cb8-c3e921dba158\" (UID: \"16d52263-b283-45be-9cb8-c3e921dba158\") " Nov 27 16:20:47 crc kubenswrapper[4707]: I1127 16:20:47.681899 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/16d52263-b283-45be-9cb8-c3e921dba158-config-data\") pod \"16d52263-b283-45be-9cb8-c3e921dba158\" (UID: \"16d52263-b283-45be-9cb8-c3e921dba158\") " Nov 27 16:20:47 crc kubenswrapper[4707]: I1127 16:20:47.681930 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/16d52263-b283-45be-9cb8-c3e921dba158-scripts\") pod \"16d52263-b283-45be-9cb8-c3e921dba158\" (UID: \"16d52263-b283-45be-9cb8-c3e921dba158\") " Nov 27 16:20:47 crc kubenswrapper[4707]: I1127 16:20:47.681964 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage11-crc\") pod \"e032a3b3-8bee-42e1-a645-17b1151c85d1\" (UID: \"e032a3b3-8bee-42e1-a645-17b1151c85d1\") " Nov 27 16:20:47 crc kubenswrapper[4707]: I1127 16:20:47.682188 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/16d52263-b283-45be-9cb8-c3e921dba158-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "16d52263-b283-45be-9cb8-c3e921dba158" (UID: "16d52263-b283-45be-9cb8-c3e921dba158"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 27 16:20:47 crc kubenswrapper[4707]: I1127 16:20:47.682710 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/16d52263-b283-45be-9cb8-c3e921dba158-logs" (OuterVolumeSpecName: "logs") pod "16d52263-b283-45be-9cb8-c3e921dba158" (UID: "16d52263-b283-45be-9cb8-c3e921dba158"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 27 16:20:47 crc kubenswrapper[4707]: I1127 16:20:47.682862 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e032a3b3-8bee-42e1-a645-17b1151c85d1-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "e032a3b3-8bee-42e1-a645-17b1151c85d1" (UID: "e032a3b3-8bee-42e1-a645-17b1151c85d1"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 27 16:20:47 crc kubenswrapper[4707]: I1127 16:20:47.682974 4707 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/16d52263-b283-45be-9cb8-c3e921dba158-httpd-run\") on node \"crc\" DevicePath \"\"" Nov 27 16:20:47 crc kubenswrapper[4707]: I1127 16:20:47.682991 4707 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/16d52263-b283-45be-9cb8-c3e921dba158-logs\") on node \"crc\" DevicePath \"\"" Nov 27 16:20:47 crc kubenswrapper[4707]: I1127 16:20:47.682999 4707 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e032a3b3-8bee-42e1-a645-17b1151c85d1-httpd-run\") on node \"crc\" DevicePath \"\"" Nov 27 16:20:47 crc kubenswrapper[4707]: I1127 16:20:47.692652 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e032a3b3-8bee-42e1-a645-17b1151c85d1-logs" (OuterVolumeSpecName: "logs") pod "e032a3b3-8bee-42e1-a645-17b1151c85d1" (UID: "e032a3b3-8bee-42e1-a645-17b1151c85d1"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 27 16:20:47 crc kubenswrapper[4707]: I1127 16:20:47.693574 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/16d52263-b283-45be-9cb8-c3e921dba158-kube-api-access-tkmx8" (OuterVolumeSpecName: "kube-api-access-tkmx8") pod "16d52263-b283-45be-9cb8-c3e921dba158" (UID: "16d52263-b283-45be-9cb8-c3e921dba158"). InnerVolumeSpecName "kube-api-access-tkmx8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 16:20:47 crc kubenswrapper[4707]: I1127 16:20:47.696157 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage11-crc" (OuterVolumeSpecName: "glance") pod "e032a3b3-8bee-42e1-a645-17b1151c85d1" (UID: "e032a3b3-8bee-42e1-a645-17b1151c85d1"). InnerVolumeSpecName "local-storage11-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Nov 27 16:20:47 crc kubenswrapper[4707]: I1127 16:20:47.698594 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage10-crc" (OuterVolumeSpecName: "glance") pod "16d52263-b283-45be-9cb8-c3e921dba158" (UID: "16d52263-b283-45be-9cb8-c3e921dba158"). InnerVolumeSpecName "local-storage10-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Nov 27 16:20:47 crc kubenswrapper[4707]: I1127 16:20:47.700532 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e032a3b3-8bee-42e1-a645-17b1151c85d1-scripts" (OuterVolumeSpecName: "scripts") pod "e032a3b3-8bee-42e1-a645-17b1151c85d1" (UID: "e032a3b3-8bee-42e1-a645-17b1151c85d1"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 16:20:47 crc kubenswrapper[4707]: I1127 16:20:47.703598 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e032a3b3-8bee-42e1-a645-17b1151c85d1-kube-api-access-7wzl2" (OuterVolumeSpecName: "kube-api-access-7wzl2") pod "e032a3b3-8bee-42e1-a645-17b1151c85d1" (UID: "e032a3b3-8bee-42e1-a645-17b1151c85d1"). InnerVolumeSpecName "kube-api-access-7wzl2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 16:20:47 crc kubenswrapper[4707]: I1127 16:20:47.728570 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/16d52263-b283-45be-9cb8-c3e921dba158-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "16d52263-b283-45be-9cb8-c3e921dba158" (UID: "16d52263-b283-45be-9cb8-c3e921dba158"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 16:20:47 crc kubenswrapper[4707]: I1127 16:20:47.733454 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e032a3b3-8bee-42e1-a645-17b1151c85d1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e032a3b3-8bee-42e1-a645-17b1151c85d1" (UID: "e032a3b3-8bee-42e1-a645-17b1151c85d1"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 16:20:47 crc kubenswrapper[4707]: I1127 16:20:47.736659 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/16d52263-b283-45be-9cb8-c3e921dba158-scripts" (OuterVolumeSpecName: "scripts") pod "16d52263-b283-45be-9cb8-c3e921dba158" (UID: "16d52263-b283-45be-9cb8-c3e921dba158"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 16:20:47 crc kubenswrapper[4707]: I1127 16:20:47.759875 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/16d52263-b283-45be-9cb8-c3e921dba158-config-data" (OuterVolumeSpecName: "config-data") pod "16d52263-b283-45be-9cb8-c3e921dba158" (UID: "16d52263-b283-45be-9cb8-c3e921dba158"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 16:20:47 crc kubenswrapper[4707]: I1127 16:20:47.761201 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e032a3b3-8bee-42e1-a645-17b1151c85d1-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "e032a3b3-8bee-42e1-a645-17b1151c85d1" (UID: "e032a3b3-8bee-42e1-a645-17b1151c85d1"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 16:20:47 crc kubenswrapper[4707]: I1127 16:20:47.764458 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e032a3b3-8bee-42e1-a645-17b1151c85d1-config-data" (OuterVolumeSpecName: "config-data") pod "e032a3b3-8bee-42e1-a645-17b1151c85d1" (UID: "e032a3b3-8bee-42e1-a645-17b1151c85d1"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 16:20:47 crc kubenswrapper[4707]: I1127 16:20:47.764473 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/16d52263-b283-45be-9cb8-c3e921dba158-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "16d52263-b283-45be-9cb8-c3e921dba158" (UID: "16d52263-b283-45be-9cb8-c3e921dba158"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 16:20:47 crc kubenswrapper[4707]: E1127 16:20:47.783507 4707 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-placement-api:current-podified" Nov 27 16:20:47 crc kubenswrapper[4707]: E1127 16:20:47.783641 4707 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:placement-db-sync,Image:quay.io/podified-antelope-centos9/openstack-placement-api:current-podified,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:true,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:logs,ReadOnly:false,MountPath:/var/log/placement,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:false,MountPath:/var/lib/openstack/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:placement-dbsync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-82f4b,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev
/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42482,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod placement-db-sync-hmbzj_openstack(6661447e-fc28-4c9a-bcd1-66e15b0ca3fd): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Nov 27 16:20:47 crc kubenswrapper[4707]: E1127 16:20:47.784974 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"placement-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/placement-db-sync-hmbzj" podUID="6661447e-fc28-4c9a-bcd1-66e15b0ca3fd" Nov 27 16:20:47 crc kubenswrapper[4707]: I1127 16:20:47.785445 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e032a3b3-8bee-42e1-a645-17b1151c85d1-config-data\") on node \"crc\" DevicePath \"\"" Nov 27 16:20:47 crc kubenswrapper[4707]: I1127 16:20:47.785477 4707 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e032a3b3-8bee-42e1-a645-17b1151c85d1-public-tls-certs\") on node \"crc\" DevicePath \"\"" Nov 27 16:20:47 crc kubenswrapper[4707]: I1127 16:20:47.785489 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e032a3b3-8bee-42e1-a645-17b1151c85d1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 27 16:20:47 crc kubenswrapper[4707]: I1127 16:20:47.785499 4707 reconciler_common.go:293] 
"Volume detached for volume \"kube-api-access-tkmx8\" (UniqueName: \"kubernetes.io/projected/16d52263-b283-45be-9cb8-c3e921dba158-kube-api-access-tkmx8\") on node \"crc\" DevicePath \"\"" Nov 27 16:20:47 crc kubenswrapper[4707]: I1127 16:20:47.785510 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/16d52263-b283-45be-9cb8-c3e921dba158-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 27 16:20:47 crc kubenswrapper[4707]: I1127 16:20:47.785518 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/16d52263-b283-45be-9cb8-c3e921dba158-config-data\") on node \"crc\" DevicePath \"\"" Nov 27 16:20:47 crc kubenswrapper[4707]: I1127 16:20:47.785528 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/16d52263-b283-45be-9cb8-c3e921dba158-scripts\") on node \"crc\" DevicePath \"\"" Nov 27 16:20:47 crc kubenswrapper[4707]: I1127 16:20:47.785552 4707 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" " Nov 27 16:20:47 crc kubenswrapper[4707]: I1127 16:20:47.785561 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7wzl2\" (UniqueName: \"kubernetes.io/projected/e032a3b3-8bee-42e1-a645-17b1151c85d1-kube-api-access-7wzl2\") on node \"crc\" DevicePath \"\"" Nov 27 16:20:47 crc kubenswrapper[4707]: I1127 16:20:47.785570 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e032a3b3-8bee-42e1-a645-17b1151c85d1-scripts\") on node \"crc\" DevicePath \"\"" Nov 27 16:20:47 crc kubenswrapper[4707]: I1127 16:20:47.785578 4707 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/16d52263-b283-45be-9cb8-c3e921dba158-internal-tls-certs\") on 
node \"crc\" DevicePath \"\"" Nov 27 16:20:47 crc kubenswrapper[4707]: I1127 16:20:47.785586 4707 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e032a3b3-8bee-42e1-a645-17b1151c85d1-logs\") on node \"crc\" DevicePath \"\"" Nov 27 16:20:47 crc kubenswrapper[4707]: I1127 16:20:47.785598 4707 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" " Nov 27 16:20:47 crc kubenswrapper[4707]: I1127 16:20:47.801816 4707 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage11-crc" (UniqueName: "kubernetes.io/local-volume/local-storage11-crc") on node "crc" Nov 27 16:20:47 crc kubenswrapper[4707]: I1127 16:20:47.805741 4707 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage10-crc" (UniqueName: "kubernetes.io/local-volume/local-storage10-crc") on node "crc" Nov 27 16:20:47 crc kubenswrapper[4707]: I1127 16:20:47.806478 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-kdgq7" Nov 27 16:20:47 crc kubenswrapper[4707]: I1127 16:20:47.887321 4707 reconciler_common.go:293] "Volume detached for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" DevicePath \"\"" Nov 27 16:20:47 crc kubenswrapper[4707]: I1127 16:20:47.887403 4707 reconciler_common.go:293] "Volume detached for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" DevicePath \"\"" Nov 27 16:20:47 crc kubenswrapper[4707]: I1127 16:20:47.988696 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7f3e04a7-5107-48e4-897a-8a126c0b2911-config\") pod \"7f3e04a7-5107-48e4-897a-8a126c0b2911\" (UID: \"7f3e04a7-5107-48e4-897a-8a126c0b2911\") " Nov 27 16:20:47 crc kubenswrapper[4707]: I1127 16:20:47.989773 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xqdcw\" (UniqueName: \"kubernetes.io/projected/7f3e04a7-5107-48e4-897a-8a126c0b2911-kube-api-access-xqdcw\") pod \"7f3e04a7-5107-48e4-897a-8a126c0b2911\" (UID: \"7f3e04a7-5107-48e4-897a-8a126c0b2911\") " Nov 27 16:20:47 crc kubenswrapper[4707]: I1127 16:20:47.989886 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7f3e04a7-5107-48e4-897a-8a126c0b2911-ovsdbserver-nb\") pod \"7f3e04a7-5107-48e4-897a-8a126c0b2911\" (UID: \"7f3e04a7-5107-48e4-897a-8a126c0b2911\") " Nov 27 16:20:47 crc kubenswrapper[4707]: I1127 16:20:47.990066 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7f3e04a7-5107-48e4-897a-8a126c0b2911-ovsdbserver-sb\") pod \"7f3e04a7-5107-48e4-897a-8a126c0b2911\" (UID: \"7f3e04a7-5107-48e4-897a-8a126c0b2911\") " Nov 27 16:20:47 crc 
kubenswrapper[4707]: I1127 16:20:47.990182 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7f3e04a7-5107-48e4-897a-8a126c0b2911-dns-svc\") pod \"7f3e04a7-5107-48e4-897a-8a126c0b2911\" (UID: \"7f3e04a7-5107-48e4-897a-8a126c0b2911\") " Nov 27 16:20:47 crc kubenswrapper[4707]: I1127 16:20:47.992912 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7f3e04a7-5107-48e4-897a-8a126c0b2911-kube-api-access-xqdcw" (OuterVolumeSpecName: "kube-api-access-xqdcw") pod "7f3e04a7-5107-48e4-897a-8a126c0b2911" (UID: "7f3e04a7-5107-48e4-897a-8a126c0b2911"). InnerVolumeSpecName "kube-api-access-xqdcw". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 16:20:48 crc kubenswrapper[4707]: I1127 16:20:48.026493 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7f3e04a7-5107-48e4-897a-8a126c0b2911-config" (OuterVolumeSpecName: "config") pod "7f3e04a7-5107-48e4-897a-8a126c0b2911" (UID: "7f3e04a7-5107-48e4-897a-8a126c0b2911"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 16:20:48 crc kubenswrapper[4707]: I1127 16:20:48.041723 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7f3e04a7-5107-48e4-897a-8a126c0b2911-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "7f3e04a7-5107-48e4-897a-8a126c0b2911" (UID: "7f3e04a7-5107-48e4-897a-8a126c0b2911"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 16:20:48 crc kubenswrapper[4707]: I1127 16:20:48.044283 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7f3e04a7-5107-48e4-897a-8a126c0b2911-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "7f3e04a7-5107-48e4-897a-8a126c0b2911" (UID: "7f3e04a7-5107-48e4-897a-8a126c0b2911"). 
InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 16:20:48 crc kubenswrapper[4707]: I1127 16:20:48.048104 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7f3e04a7-5107-48e4-897a-8a126c0b2911-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "7f3e04a7-5107-48e4-897a-8a126c0b2911" (UID: "7f3e04a7-5107-48e4-897a-8a126c0b2911"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 16:20:48 crc kubenswrapper[4707]: I1127 16:20:48.092914 4707 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7f3e04a7-5107-48e4-897a-8a126c0b2911-config\") on node \"crc\" DevicePath \"\"" Nov 27 16:20:48 crc kubenswrapper[4707]: I1127 16:20:48.092974 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xqdcw\" (UniqueName: \"kubernetes.io/projected/7f3e04a7-5107-48e4-897a-8a126c0b2911-kube-api-access-xqdcw\") on node \"crc\" DevicePath \"\"" Nov 27 16:20:48 crc kubenswrapper[4707]: I1127 16:20:48.092995 4707 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7f3e04a7-5107-48e4-897a-8a126c0b2911-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Nov 27 16:20:48 crc kubenswrapper[4707]: I1127 16:20:48.093016 4707 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7f3e04a7-5107-48e4-897a-8a126c0b2911-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Nov 27 16:20:48 crc kubenswrapper[4707]: I1127 16:20:48.093036 4707 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7f3e04a7-5107-48e4-897a-8a126c0b2911-dns-svc\") on node \"crc\" DevicePath \"\"" Nov 27 16:20:48 crc kubenswrapper[4707]: I1127 16:20:48.356577 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/glance-default-internal-api-0" event={"ID":"16d52263-b283-45be-9cb8-c3e921dba158","Type":"ContainerDied","Data":"d626d4a978f4a2ddab13d233a163c205deee5db91bc66c7632ee34fd9f580b09"} Nov 27 16:20:48 crc kubenswrapper[4707]: I1127 16:20:48.356791 4707 scope.go:117] "RemoveContainer" containerID="f9c410da6adcacbec16bd4ead875bc2e39f4b0a65bfe6c4191403a1e940304c9" Nov 27 16:20:48 crc kubenswrapper[4707]: I1127 16:20:48.356844 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Nov 27 16:20:48 crc kubenswrapper[4707]: I1127 16:20:48.361289 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"e032a3b3-8bee-42e1-a645-17b1151c85d1","Type":"ContainerDied","Data":"4901ebd5e0f1778a95e5a43e91cf0edcb633327fb5dfa7eff1c4e4a8c2ce611f"} Nov 27 16:20:48 crc kubenswrapper[4707]: I1127 16:20:48.361313 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Nov 27 16:20:48 crc kubenswrapper[4707]: I1127 16:20:48.364498 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-kdgq7" Nov 27 16:20:48 crc kubenswrapper[4707]: I1127 16:20:48.364501 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-kdgq7" event={"ID":"7f3e04a7-5107-48e4-897a-8a126c0b2911","Type":"ContainerDied","Data":"71c60cd83198d2a103ec7104b0220cc38f55260931a79f3ccd2b52d2edbcb24b"} Nov 27 16:20:48 crc kubenswrapper[4707]: E1127 16:20:48.377483 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"placement-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-placement-api:current-podified\\\"\"" pod="openstack/placement-db-sync-hmbzj" podUID="6661447e-fc28-4c9a-bcd1-66e15b0ca3fd" Nov 27 16:20:48 crc kubenswrapper[4707]: I1127 16:20:48.459666 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Nov 27 16:20:48 crc kubenswrapper[4707]: I1127 16:20:48.486476 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Nov 27 16:20:48 crc kubenswrapper[4707]: I1127 16:20:48.495004 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Nov 27 16:20:48 crc kubenswrapper[4707]: I1127 16:20:48.505906 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Nov 27 16:20:48 crc kubenswrapper[4707]: I1127 16:20:48.513173 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Nov 27 16:20:48 crc kubenswrapper[4707]: E1127 16:20:48.513572 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e032a3b3-8bee-42e1-a645-17b1151c85d1" containerName="glance-httpd" Nov 27 16:20:48 crc kubenswrapper[4707]: I1127 16:20:48.513588 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="e032a3b3-8bee-42e1-a645-17b1151c85d1" containerName="glance-httpd" Nov 
27 16:20:48 crc kubenswrapper[4707]: E1127 16:20:48.513597 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7f3e04a7-5107-48e4-897a-8a126c0b2911" containerName="dnsmasq-dns" Nov 27 16:20:48 crc kubenswrapper[4707]: I1127 16:20:48.513607 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="7f3e04a7-5107-48e4-897a-8a126c0b2911" containerName="dnsmasq-dns" Nov 27 16:20:48 crc kubenswrapper[4707]: E1127 16:20:48.513624 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="16d52263-b283-45be-9cb8-c3e921dba158" containerName="glance-log" Nov 27 16:20:48 crc kubenswrapper[4707]: I1127 16:20:48.513629 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="16d52263-b283-45be-9cb8-c3e921dba158" containerName="glance-log" Nov 27 16:20:48 crc kubenswrapper[4707]: E1127 16:20:48.513645 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="16d52263-b283-45be-9cb8-c3e921dba158" containerName="glance-httpd" Nov 27 16:20:48 crc kubenswrapper[4707]: I1127 16:20:48.513651 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="16d52263-b283-45be-9cb8-c3e921dba158" containerName="glance-httpd" Nov 27 16:20:48 crc kubenswrapper[4707]: E1127 16:20:48.513664 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7f3e04a7-5107-48e4-897a-8a126c0b2911" containerName="init" Nov 27 16:20:48 crc kubenswrapper[4707]: I1127 16:20:48.513670 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="7f3e04a7-5107-48e4-897a-8a126c0b2911" containerName="init" Nov 27 16:20:48 crc kubenswrapper[4707]: E1127 16:20:48.513680 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e032a3b3-8bee-42e1-a645-17b1151c85d1" containerName="glance-log" Nov 27 16:20:48 crc kubenswrapper[4707]: I1127 16:20:48.513686 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="e032a3b3-8bee-42e1-a645-17b1151c85d1" containerName="glance-log" Nov 27 16:20:48 crc kubenswrapper[4707]: I1127 16:20:48.513856 4707 
memory_manager.go:354] "RemoveStaleState removing state" podUID="e032a3b3-8bee-42e1-a645-17b1151c85d1" containerName="glance-log" Nov 27 16:20:48 crc kubenswrapper[4707]: I1127 16:20:48.513871 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="e032a3b3-8bee-42e1-a645-17b1151c85d1" containerName="glance-httpd" Nov 27 16:20:48 crc kubenswrapper[4707]: I1127 16:20:48.513885 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="7f3e04a7-5107-48e4-897a-8a126c0b2911" containerName="dnsmasq-dns" Nov 27 16:20:48 crc kubenswrapper[4707]: I1127 16:20:48.513895 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="16d52263-b283-45be-9cb8-c3e921dba158" containerName="glance-httpd" Nov 27 16:20:48 crc kubenswrapper[4707]: I1127 16:20:48.513902 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="16d52263-b283-45be-9cb8-c3e921dba158" containerName="glance-log" Nov 27 16:20:48 crc kubenswrapper[4707]: I1127 16:20:48.514814 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Nov 27 16:20:48 crc kubenswrapper[4707]: I1127 16:20:48.518637 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Nov 27 16:20:48 crc kubenswrapper[4707]: I1127 16:20:48.524759 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Nov 27 16:20:48 crc kubenswrapper[4707]: I1127 16:20:48.527644 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-wtmvr" Nov 27 16:20:48 crc kubenswrapper[4707]: I1127 16:20:48.527846 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Nov 27 16:20:48 crc kubenswrapper[4707]: I1127 16:20:48.531041 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-698758b865-kdgq7"] Nov 27 16:20:48 crc kubenswrapper[4707]: I1127 16:20:48.538694 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-698758b865-kdgq7"] Nov 27 16:20:48 crc kubenswrapper[4707]: I1127 16:20:48.545473 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Nov 27 16:20:48 crc kubenswrapper[4707]: I1127 16:20:48.553429 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Nov 27 16:20:48 crc kubenswrapper[4707]: I1127 16:20:48.554893 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Nov 27 16:20:48 crc kubenswrapper[4707]: I1127 16:20:48.557026 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Nov 27 16:20:48 crc kubenswrapper[4707]: I1127 16:20:48.557249 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Nov 27 16:20:48 crc kubenswrapper[4707]: I1127 16:20:48.570268 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Nov 27 16:20:48 crc kubenswrapper[4707]: I1127 16:20:48.609012 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/96ce5955-aed6-45e9-8a8a-4cb59d3a511d-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"96ce5955-aed6-45e9-8a8a-4cb59d3a511d\") " pod="openstack/glance-default-internal-api-0" Nov 27 16:20:48 crc kubenswrapper[4707]: I1127 16:20:48.609079 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/96ce5955-aed6-45e9-8a8a-4cb59d3a511d-logs\") pod \"glance-default-internal-api-0\" (UID: \"96ce5955-aed6-45e9-8a8a-4cb59d3a511d\") " pod="openstack/glance-default-internal-api-0" Nov 27 16:20:48 crc kubenswrapper[4707]: I1127 16:20:48.609187 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/96ce5955-aed6-45e9-8a8a-4cb59d3a511d-scripts\") pod \"glance-default-internal-api-0\" (UID: \"96ce5955-aed6-45e9-8a8a-4cb59d3a511d\") " pod="openstack/glance-default-internal-api-0" Nov 27 16:20:48 crc kubenswrapper[4707]: I1127 16:20:48.609264 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/96ce5955-aed6-45e9-8a8a-4cb59d3a511d-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"96ce5955-aed6-45e9-8a8a-4cb59d3a511d\") " pod="openstack/glance-default-internal-api-0" Nov 27 16:20:48 crc kubenswrapper[4707]: I1127 16:20:48.609305 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-844vh\" (UniqueName: \"kubernetes.io/projected/96ce5955-aed6-45e9-8a8a-4cb59d3a511d-kube-api-access-844vh\") pod \"glance-default-internal-api-0\" (UID: \"96ce5955-aed6-45e9-8a8a-4cb59d3a511d\") " pod="openstack/glance-default-internal-api-0" Nov 27 16:20:48 crc kubenswrapper[4707]: I1127 16:20:48.609338 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-internal-api-0\" (UID: \"96ce5955-aed6-45e9-8a8a-4cb59d3a511d\") " pod="openstack/glance-default-internal-api-0" Nov 27 16:20:48 crc kubenswrapper[4707]: I1127 16:20:48.609398 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/96ce5955-aed6-45e9-8a8a-4cb59d3a511d-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"96ce5955-aed6-45e9-8a8a-4cb59d3a511d\") " pod="openstack/glance-default-internal-api-0" Nov 27 16:20:48 crc kubenswrapper[4707]: I1127 16:20:48.609476 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/96ce5955-aed6-45e9-8a8a-4cb59d3a511d-config-data\") pod \"glance-default-internal-api-0\" (UID: \"96ce5955-aed6-45e9-8a8a-4cb59d3a511d\") " pod="openstack/glance-default-internal-api-0" Nov 27 16:20:48 crc kubenswrapper[4707]: I1127 16:20:48.711685 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5798d94c-5634-4d92-b36d-cab9231adfc4-logs\") pod \"glance-default-external-api-0\" (UID: \"5798d94c-5634-4d92-b36d-cab9231adfc4\") " pod="openstack/glance-default-external-api-0" Nov 27 16:20:48 crc kubenswrapper[4707]: I1127 16:20:48.711765 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/96ce5955-aed6-45e9-8a8a-4cb59d3a511d-scripts\") pod \"glance-default-internal-api-0\" (UID: \"96ce5955-aed6-45e9-8a8a-4cb59d3a511d\") " pod="openstack/glance-default-internal-api-0" Nov 27 16:20:48 crc kubenswrapper[4707]: I1127 16:20:48.711812 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-external-api-0\" (UID: \"5798d94c-5634-4d92-b36d-cab9231adfc4\") " pod="openstack/glance-default-external-api-0" Nov 27 16:20:48 crc kubenswrapper[4707]: I1127 16:20:48.711855 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/5798d94c-5634-4d92-b36d-cab9231adfc4-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"5798d94c-5634-4d92-b36d-cab9231adfc4\") " pod="openstack/glance-default-external-api-0" Nov 27 16:20:48 crc kubenswrapper[4707]: I1127 16:20:48.711991 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5798d94c-5634-4d92-b36d-cab9231adfc4-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"5798d94c-5634-4d92-b36d-cab9231adfc4\") " pod="openstack/glance-default-external-api-0" Nov 27 16:20:48 crc kubenswrapper[4707]: I1127 16:20:48.712106 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/96ce5955-aed6-45e9-8a8a-4cb59d3a511d-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"96ce5955-aed6-45e9-8a8a-4cb59d3a511d\") " pod="openstack/glance-default-internal-api-0" Nov 27 16:20:48 crc kubenswrapper[4707]: I1127 16:20:48.712815 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-844vh\" (UniqueName: \"kubernetes.io/projected/96ce5955-aed6-45e9-8a8a-4cb59d3a511d-kube-api-access-844vh\") pod \"glance-default-internal-api-0\" (UID: \"96ce5955-aed6-45e9-8a8a-4cb59d3a511d\") " pod="openstack/glance-default-internal-api-0" Nov 27 16:20:48 crc kubenswrapper[4707]: I1127 16:20:48.712879 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-internal-api-0\" (UID: \"96ce5955-aed6-45e9-8a8a-4cb59d3a511d\") " pod="openstack/glance-default-internal-api-0" Nov 27 16:20:48 crc kubenswrapper[4707]: I1127 16:20:48.712922 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/96ce5955-aed6-45e9-8a8a-4cb59d3a511d-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"96ce5955-aed6-45e9-8a8a-4cb59d3a511d\") " pod="openstack/glance-default-internal-api-0" Nov 27 16:20:48 crc kubenswrapper[4707]: I1127 16:20:48.712972 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/96ce5955-aed6-45e9-8a8a-4cb59d3a511d-config-data\") pod \"glance-default-internal-api-0\" (UID: \"96ce5955-aed6-45e9-8a8a-4cb59d3a511d\") " pod="openstack/glance-default-internal-api-0" Nov 27 16:20:48 crc kubenswrapper[4707]: I1127 16:20:48.713072 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/5798d94c-5634-4d92-b36d-cab9231adfc4-scripts\") pod \"glance-default-external-api-0\" (UID: \"5798d94c-5634-4d92-b36d-cab9231adfc4\") " pod="openstack/glance-default-external-api-0" Nov 27 16:20:48 crc kubenswrapper[4707]: I1127 16:20:48.713114 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5798d94c-5634-4d92-b36d-cab9231adfc4-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"5798d94c-5634-4d92-b36d-cab9231adfc4\") " pod="openstack/glance-default-external-api-0" Nov 27 16:20:48 crc kubenswrapper[4707]: I1127 16:20:48.713168 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5798d94c-5634-4d92-b36d-cab9231adfc4-config-data\") pod \"glance-default-external-api-0\" (UID: \"5798d94c-5634-4d92-b36d-cab9231adfc4\") " pod="openstack/glance-default-external-api-0" Nov 27 16:20:48 crc kubenswrapper[4707]: I1127 16:20:48.713243 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rhddt\" (UniqueName: \"kubernetes.io/projected/5798d94c-5634-4d92-b36d-cab9231adfc4-kube-api-access-rhddt\") pod \"glance-default-external-api-0\" (UID: \"5798d94c-5634-4d92-b36d-cab9231adfc4\") " pod="openstack/glance-default-external-api-0" Nov 27 16:20:48 crc kubenswrapper[4707]: I1127 16:20:48.713501 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/96ce5955-aed6-45e9-8a8a-4cb59d3a511d-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"96ce5955-aed6-45e9-8a8a-4cb59d3a511d\") " pod="openstack/glance-default-internal-api-0" Nov 27 16:20:48 crc kubenswrapper[4707]: I1127 16:20:48.713556 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/96ce5955-aed6-45e9-8a8a-4cb59d3a511d-logs\") pod \"glance-default-internal-api-0\" (UID: \"96ce5955-aed6-45e9-8a8a-4cb59d3a511d\") " pod="openstack/glance-default-internal-api-0" Nov 27 16:20:48 crc kubenswrapper[4707]: I1127 16:20:48.714290 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/96ce5955-aed6-45e9-8a8a-4cb59d3a511d-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"96ce5955-aed6-45e9-8a8a-4cb59d3a511d\") " pod="openstack/glance-default-internal-api-0" Nov 27 16:20:48 crc kubenswrapper[4707]: I1127 16:20:48.714858 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/96ce5955-aed6-45e9-8a8a-4cb59d3a511d-logs\") pod \"glance-default-internal-api-0\" (UID: \"96ce5955-aed6-45e9-8a8a-4cb59d3a511d\") " pod="openstack/glance-default-internal-api-0" Nov 27 16:20:48 crc kubenswrapper[4707]: I1127 16:20:48.715504 4707 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-internal-api-0\" (UID: \"96ce5955-aed6-45e9-8a8a-4cb59d3a511d\") device mount path \"/mnt/openstack/pv10\"" pod="openstack/glance-default-internal-api-0" Nov 27 16:20:48 crc kubenswrapper[4707]: I1127 16:20:48.716955 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/96ce5955-aed6-45e9-8a8a-4cb59d3a511d-config-data\") pod \"glance-default-internal-api-0\" (UID: \"96ce5955-aed6-45e9-8a8a-4cb59d3a511d\") " pod="openstack/glance-default-internal-api-0" Nov 27 16:20:48 crc kubenswrapper[4707]: I1127 16:20:48.718017 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/96ce5955-aed6-45e9-8a8a-4cb59d3a511d-internal-tls-certs\") pod 
\"glance-default-internal-api-0\" (UID: \"96ce5955-aed6-45e9-8a8a-4cb59d3a511d\") " pod="openstack/glance-default-internal-api-0" Nov 27 16:20:48 crc kubenswrapper[4707]: I1127 16:20:48.723746 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/96ce5955-aed6-45e9-8a8a-4cb59d3a511d-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"96ce5955-aed6-45e9-8a8a-4cb59d3a511d\") " pod="openstack/glance-default-internal-api-0" Nov 27 16:20:48 crc kubenswrapper[4707]: I1127 16:20:48.734219 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-844vh\" (UniqueName: \"kubernetes.io/projected/96ce5955-aed6-45e9-8a8a-4cb59d3a511d-kube-api-access-844vh\") pod \"glance-default-internal-api-0\" (UID: \"96ce5955-aed6-45e9-8a8a-4cb59d3a511d\") " pod="openstack/glance-default-internal-api-0" Nov 27 16:20:48 crc kubenswrapper[4707]: I1127 16:20:48.734970 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/96ce5955-aed6-45e9-8a8a-4cb59d3a511d-scripts\") pod \"glance-default-internal-api-0\" (UID: \"96ce5955-aed6-45e9-8a8a-4cb59d3a511d\") " pod="openstack/glance-default-internal-api-0" Nov 27 16:20:48 crc kubenswrapper[4707]: I1127 16:20:48.748410 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-internal-api-0\" (UID: \"96ce5955-aed6-45e9-8a8a-4cb59d3a511d\") " pod="openstack/glance-default-internal-api-0" Nov 27 16:20:48 crc kubenswrapper[4707]: I1127 16:20:48.815430 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5798d94c-5634-4d92-b36d-cab9231adfc4-logs\") pod \"glance-default-external-api-0\" (UID: \"5798d94c-5634-4d92-b36d-cab9231adfc4\") " 
pod="openstack/glance-default-external-api-0" Nov 27 16:20:48 crc kubenswrapper[4707]: I1127 16:20:48.815489 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-external-api-0\" (UID: \"5798d94c-5634-4d92-b36d-cab9231adfc4\") " pod="openstack/glance-default-external-api-0" Nov 27 16:20:48 crc kubenswrapper[4707]: I1127 16:20:48.815514 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/5798d94c-5634-4d92-b36d-cab9231adfc4-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"5798d94c-5634-4d92-b36d-cab9231adfc4\") " pod="openstack/glance-default-external-api-0" Nov 27 16:20:48 crc kubenswrapper[4707]: I1127 16:20:48.815539 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5798d94c-5634-4d92-b36d-cab9231adfc4-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"5798d94c-5634-4d92-b36d-cab9231adfc4\") " pod="openstack/glance-default-external-api-0" Nov 27 16:20:48 crc kubenswrapper[4707]: I1127 16:20:48.815583 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5798d94c-5634-4d92-b36d-cab9231adfc4-scripts\") pod \"glance-default-external-api-0\" (UID: \"5798d94c-5634-4d92-b36d-cab9231adfc4\") " pod="openstack/glance-default-external-api-0" Nov 27 16:20:48 crc kubenswrapper[4707]: I1127 16:20:48.815601 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5798d94c-5634-4d92-b36d-cab9231adfc4-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"5798d94c-5634-4d92-b36d-cab9231adfc4\") " pod="openstack/glance-default-external-api-0" Nov 27 16:20:48 crc kubenswrapper[4707]: 
I1127 16:20:48.815627 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5798d94c-5634-4d92-b36d-cab9231adfc4-config-data\") pod \"glance-default-external-api-0\" (UID: \"5798d94c-5634-4d92-b36d-cab9231adfc4\") " pod="openstack/glance-default-external-api-0" Nov 27 16:20:48 crc kubenswrapper[4707]: I1127 16:20:48.815646 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rhddt\" (UniqueName: \"kubernetes.io/projected/5798d94c-5634-4d92-b36d-cab9231adfc4-kube-api-access-rhddt\") pod \"glance-default-external-api-0\" (UID: \"5798d94c-5634-4d92-b36d-cab9231adfc4\") " pod="openstack/glance-default-external-api-0" Nov 27 16:20:48 crc kubenswrapper[4707]: I1127 16:20:48.816392 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5798d94c-5634-4d92-b36d-cab9231adfc4-logs\") pod \"glance-default-external-api-0\" (UID: \"5798d94c-5634-4d92-b36d-cab9231adfc4\") " pod="openstack/glance-default-external-api-0" Nov 27 16:20:48 crc kubenswrapper[4707]: I1127 16:20:48.816666 4707 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-external-api-0\" (UID: \"5798d94c-5634-4d92-b36d-cab9231adfc4\") device mount path \"/mnt/openstack/pv11\"" pod="openstack/glance-default-external-api-0" Nov 27 16:20:48 crc kubenswrapper[4707]: I1127 16:20:48.818615 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/5798d94c-5634-4d92-b36d-cab9231adfc4-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"5798d94c-5634-4d92-b36d-cab9231adfc4\") " pod="openstack/glance-default-external-api-0" Nov 27 16:20:48 crc kubenswrapper[4707]: I1127 16:20:48.822631 4707 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5798d94c-5634-4d92-b36d-cab9231adfc4-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"5798d94c-5634-4d92-b36d-cab9231adfc4\") " pod="openstack/glance-default-external-api-0" Nov 27 16:20:48 crc kubenswrapper[4707]: I1127 16:20:48.823050 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5798d94c-5634-4d92-b36d-cab9231adfc4-scripts\") pod \"glance-default-external-api-0\" (UID: \"5798d94c-5634-4d92-b36d-cab9231adfc4\") " pod="openstack/glance-default-external-api-0" Nov 27 16:20:48 crc kubenswrapper[4707]: I1127 16:20:48.823989 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5798d94c-5634-4d92-b36d-cab9231adfc4-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"5798d94c-5634-4d92-b36d-cab9231adfc4\") " pod="openstack/glance-default-external-api-0" Nov 27 16:20:48 crc kubenswrapper[4707]: I1127 16:20:48.830471 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5798d94c-5634-4d92-b36d-cab9231adfc4-config-data\") pod \"glance-default-external-api-0\" (UID: \"5798d94c-5634-4d92-b36d-cab9231adfc4\") " pod="openstack/glance-default-external-api-0" Nov 27 16:20:48 crc kubenswrapper[4707]: I1127 16:20:48.830972 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Nov 27 16:20:48 crc kubenswrapper[4707]: I1127 16:20:48.831013 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rhddt\" (UniqueName: \"kubernetes.io/projected/5798d94c-5634-4d92-b36d-cab9231adfc4-kube-api-access-rhddt\") pod \"glance-default-external-api-0\" (UID: \"5798d94c-5634-4d92-b36d-cab9231adfc4\") " pod="openstack/glance-default-external-api-0" Nov 27 16:20:48 crc kubenswrapper[4707]: I1127 16:20:48.847712 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-external-api-0\" (UID: \"5798d94c-5634-4d92-b36d-cab9231adfc4\") " pod="openstack/glance-default-external-api-0" Nov 27 16:20:48 crc kubenswrapper[4707]: I1127 16:20:48.876982 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Nov 27 16:20:49 crc kubenswrapper[4707]: I1127 16:20:49.205932 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="16d52263-b283-45be-9cb8-c3e921dba158" path="/var/lib/kubelet/pods/16d52263-b283-45be-9cb8-c3e921dba158/volumes" Nov 27 16:20:49 crc kubenswrapper[4707]: I1127 16:20:49.207284 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7f3e04a7-5107-48e4-897a-8a126c0b2911" path="/var/lib/kubelet/pods/7f3e04a7-5107-48e4-897a-8a126c0b2911/volumes" Nov 27 16:20:49 crc kubenswrapper[4707]: I1127 16:20:49.208204 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e032a3b3-8bee-42e1-a645-17b1151c85d1" path="/var/lib/kubelet/pods/e032a3b3-8bee-42e1-a645-17b1151c85d1/volumes" Nov 27 16:20:49 crc kubenswrapper[4707]: E1127 16:20:49.269246 4707 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" 
image="quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified" Nov 27 16:20:49 crc kubenswrapper[4707]: E1127 16:20:49.269714 4707 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:cinder-db-sync,Image:quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_set_configs && /usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-machine-id,ReadOnly:true,MountPath:/etc/machine-id,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/config-data/merged,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/cinder/cinder.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-ts9jh,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},Live
nessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:nil,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cinder-db-sync-gb8rd_openstack(1b8ce5ef-c2af-4e15-9677-8e878b96c4de): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Nov 27 16:20:49 crc kubenswrapper[4707]: E1127 16:20:49.271206 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/cinder-db-sync-gb8rd" podUID="1b8ce5ef-c2af-4e15-9677-8e878b96c4de" Nov 27 16:20:49 crc kubenswrapper[4707]: I1127 16:20:49.376142 4707 generic.go:334] "Generic (PLEG): container finished" podID="ffe6108f-9182-4d5b-b877-977da419fc7c" containerID="4cebbc21162ddd05ea2b220c368d5661cfb0b3e37d8da5c199a0c260c2bd13f1" exitCode=0 Nov 27 16:20:49 crc kubenswrapper[4707]: I1127 16:20:49.376189 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-ftcdx" event={"ID":"ffe6108f-9182-4d5b-b877-977da419fc7c","Type":"ContainerDied","Data":"4cebbc21162ddd05ea2b220c368d5661cfb0b3e37d8da5c199a0c260c2bd13f1"} Nov 27 16:20:49 crc kubenswrapper[4707]: E1127 16:20:49.378263 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified\\\"\"" 
pod="openstack/cinder-db-sync-gb8rd" podUID="1b8ce5ef-c2af-4e15-9677-8e878b96c4de" Nov 27 16:20:49 crc kubenswrapper[4707]: I1127 16:20:49.558344 4707 scope.go:117] "RemoveContainer" containerID="a8d0e99fc180c7ca27debe514006dca82b4074e304a1a4d99f2fe79de29667a5" Nov 27 16:20:49 crc kubenswrapper[4707]: E1127 16:20:49.565022 4707 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-heat-engine:current-podified" Nov 27 16:20:49 crc kubenswrapper[4707]: E1127 16:20:49.565178 4707 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:heat-db-sync,Image:quay.io/podified-antelope-centos9/openstack-heat-engine:current-podified,Command:[/bin/bash],Args:[-c /usr/bin/heat-manage --config-dir /etc/heat/heat.conf.d db_sync],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:true,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/heat/heat.conf.d/00-default.conf,SubPath:00-default.conf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:false,MountPath:/etc/heat/heat.conf.d/01-custom.conf,SubPath:01-custom.conf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-p5j7f,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,R
ecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42418,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*42418,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod heat-db-sync-5k4zv_openstack(1c445b1d-7e63-48cc-83ee-c4841074701c): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Nov 27 16:20:49 crc kubenswrapper[4707]: E1127 16:20:49.566719 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/heat-db-sync-5k4zv" podUID="1c445b1d-7e63-48cc-83ee-c4841074701c" Nov 27 16:20:49 crc kubenswrapper[4707]: I1127 16:20:49.700231 4707 scope.go:117] "RemoveContainer" containerID="3e5b6e94305e2dba2a8d8db42053becfe3c1d0a386bf4f648c16cec75bff48f3" Nov 27 16:20:49 crc kubenswrapper[4707]: I1127 16:20:49.799474 4707 scope.go:117] "RemoveContainer" containerID="e4bd368daa5956eba42a3efa5d84ccee2790b82b04a1662b63f9b270595bd24a" Nov 27 16:20:49 crc kubenswrapper[4707]: I1127 16:20:49.817749 4707 scope.go:117] "RemoveContainer" containerID="d95331cb09f33591d1df0d76198fbd67a3b71fc251fe031ec1da54f833d0221b" Nov 27 16:20:49 crc kubenswrapper[4707]: I1127 16:20:49.838993 4707 scope.go:117] "RemoveContainer" containerID="6629abc42a630e4279c596f6faccb548cccb0fd2a0a73f19a940a1791e96b9c1" Nov 27 16:20:50 crc kubenswrapper[4707]: I1127 16:20:50.010191 4707 kubelet.go:2428] 
"SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-b989k"] Nov 27 16:20:50 crc kubenswrapper[4707]: W1127 16:20:50.017230 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4db357ae_5dd8_47c3_8f13_fd888df4fd42.slice/crio-a5a1ee29436217e2797b88d153a2ba14fcb5d7d85ed7f5809ca2ccc362fe12bc WatchSource:0}: Error finding container a5a1ee29436217e2797b88d153a2ba14fcb5d7d85ed7f5809ca2ccc362fe12bc: Status 404 returned error can't find the container with id a5a1ee29436217e2797b88d153a2ba14fcb5d7d85ed7f5809ca2ccc362fe12bc Nov 27 16:20:50 crc kubenswrapper[4707]: I1127 16:20:50.021241 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-fh47h"] Nov 27 16:20:50 crc kubenswrapper[4707]: W1127 16:20:50.024310 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd6b92c04_fcc0_4d96_8200_3edd228dd326.slice/crio-e2b58e44c11c0f995e0795d487900394cac55f97077b81f3d4a2da271a18e5e1 WatchSource:0}: Error finding container e2b58e44c11c0f995e0795d487900394cac55f97077b81f3d4a2da271a18e5e1: Status 404 returned error can't find the container with id e2b58e44c11c0f995e0795d487900394cac55f97077b81f3d4a2da271a18e5e1 Nov 27 16:20:50 crc kubenswrapper[4707]: I1127 16:20:50.024729 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Nov 27 16:20:50 crc kubenswrapper[4707]: I1127 16:20:50.227038 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Nov 27 16:20:50 crc kubenswrapper[4707]: I1127 16:20:50.327587 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Nov 27 16:20:50 crc kubenswrapper[4707]: W1127 16:20:50.333164 4707 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5798d94c_5634_4d92_b36d_cab9231adfc4.slice/crio-8df54de512afb59c274207d4ddd2f1c9bfcd66a123c7c4fed094458644fc75c2 WatchSource:0}: Error finding container 8df54de512afb59c274207d4ddd2f1c9bfcd66a123c7c4fed094458644fc75c2: Status 404 returned error can't find the container with id 8df54de512afb59c274207d4ddd2f1c9bfcd66a123c7c4fed094458644fc75c2 Nov 27 16:20:50 crc kubenswrapper[4707]: I1127 16:20:50.386980 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a49cd9fc-1364-454e-af11-bbf64e43e56d","Type":"ContainerStarted","Data":"b682eef24c68d41197dc9690bb0f26acd52e3ebac8dc6099ed14141946c0dfb2"} Nov 27 16:20:50 crc kubenswrapper[4707]: I1127 16:20:50.392659 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"96ce5955-aed6-45e9-8a8a-4cb59d3a511d","Type":"ContainerStarted","Data":"a9c5c5b6863ea3e3e0c3efd9b28f1d4bda63f7c1ea4fae2d6fa6be9fa25684e5"} Nov 27 16:20:50 crc kubenswrapper[4707]: I1127 16:20:50.395987 4707 generic.go:334] "Generic (PLEG): container finished" podID="d6b92c04-fcc0-4d96-8200-3edd228dd326" containerID="129a40af8cdd0bacb79f954ed904d65a5c28255ba595b13607d0c1c02176c4a7" exitCode=0 Nov 27 16:20:50 crc kubenswrapper[4707]: I1127 16:20:50.396157 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-b989k" event={"ID":"d6b92c04-fcc0-4d96-8200-3edd228dd326","Type":"ContainerDied","Data":"129a40af8cdd0bacb79f954ed904d65a5c28255ba595b13607d0c1c02176c4a7"} Nov 27 16:20:50 crc kubenswrapper[4707]: I1127 16:20:50.398122 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-b989k" event={"ID":"d6b92c04-fcc0-4d96-8200-3edd228dd326","Type":"ContainerStarted","Data":"e2b58e44c11c0f995e0795d487900394cac55f97077b81f3d4a2da271a18e5e1"} Nov 27 16:20:50 crc kubenswrapper[4707]: I1127 16:20:50.402925 4707 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-fh47h" event={"ID":"4db357ae-5dd8-47c3-8f13-fd888df4fd42","Type":"ContainerStarted","Data":"550123f8283cb9092f7db11db6ceecbd9443975883983e2c8b5d913c7e618bd1"} Nov 27 16:20:50 crc kubenswrapper[4707]: I1127 16:20:50.403063 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-fh47h" event={"ID":"4db357ae-5dd8-47c3-8f13-fd888df4fd42","Type":"ContainerStarted","Data":"a5a1ee29436217e2797b88d153a2ba14fcb5d7d85ed7f5809ca2ccc362fe12bc"} Nov 27 16:20:50 crc kubenswrapper[4707]: I1127 16:20:50.406995 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"5798d94c-5634-4d92-b36d-cab9231adfc4","Type":"ContainerStarted","Data":"8df54de512afb59c274207d4ddd2f1c9bfcd66a123c7c4fed094458644fc75c2"} Nov 27 16:20:50 crc kubenswrapper[4707]: I1127 16:20:50.409962 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-qzltn" event={"ID":"525c317f-60e5-4359-bdd4-62caf9f54b38","Type":"ContainerStarted","Data":"672c7b06415373aac80def8df626058032b536c97c6101380abddd2763491cbe"} Nov 27 16:20:50 crc kubenswrapper[4707]: E1127 16:20:50.423094 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-heat-engine:current-podified\\\"\"" pod="openstack/heat-db-sync-5k4zv" podUID="1c445b1d-7e63-48cc-83ee-c4841074701c" Nov 27 16:20:50 crc kubenswrapper[4707]: I1127 16:20:50.439501 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-sync-qzltn" podStartSLOduration=2.791816199 podStartE2EDuration="30.43947783s" podCreationTimestamp="2025-11-27 16:20:20 +0000 UTC" firstStartedPulling="2025-11-27 16:20:21.911329914 +0000 UTC m=+997.542778682" lastFinishedPulling="2025-11-27 16:20:49.558991535 +0000 
UTC m=+1025.190440313" observedRunningTime="2025-11-27 16:20:50.436841376 +0000 UTC m=+1026.068290154" watchObservedRunningTime="2025-11-27 16:20:50.43947783 +0000 UTC m=+1026.070926598" Nov 27 16:20:50 crc kubenswrapper[4707]: I1127 16:20:50.493695 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-fh47h" podStartSLOduration=20.493677388 podStartE2EDuration="20.493677388s" podCreationTimestamp="2025-11-27 16:20:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-27 16:20:50.489958088 +0000 UTC m=+1026.121406856" watchObservedRunningTime="2025-11-27 16:20:50.493677388 +0000 UTC m=+1026.125126156" Nov 27 16:20:50 crc kubenswrapper[4707]: I1127 16:20:50.765326 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-ftcdx" Nov 27 16:20:50 crc kubenswrapper[4707]: I1127 16:20:50.860998 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tq749\" (UniqueName: \"kubernetes.io/projected/ffe6108f-9182-4d5b-b877-977da419fc7c-kube-api-access-tq749\") pod \"ffe6108f-9182-4d5b-b877-977da419fc7c\" (UID: \"ffe6108f-9182-4d5b-b877-977da419fc7c\") " Nov 27 16:20:50 crc kubenswrapper[4707]: I1127 16:20:50.861301 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/ffe6108f-9182-4d5b-b877-977da419fc7c-config\") pod \"ffe6108f-9182-4d5b-b877-977da419fc7c\" (UID: \"ffe6108f-9182-4d5b-b877-977da419fc7c\") " Nov 27 16:20:50 crc kubenswrapper[4707]: I1127 16:20:50.861333 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ffe6108f-9182-4d5b-b877-977da419fc7c-combined-ca-bundle\") pod \"ffe6108f-9182-4d5b-b877-977da419fc7c\" (UID: \"ffe6108f-9182-4d5b-b877-977da419fc7c\") " 
Nov 27 16:20:50 crc kubenswrapper[4707]: I1127 16:20:50.890038 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ffe6108f-9182-4d5b-b877-977da419fc7c-kube-api-access-tq749" (OuterVolumeSpecName: "kube-api-access-tq749") pod "ffe6108f-9182-4d5b-b877-977da419fc7c" (UID: "ffe6108f-9182-4d5b-b877-977da419fc7c"). InnerVolumeSpecName "kube-api-access-tq749". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 16:20:50 crc kubenswrapper[4707]: I1127 16:20:50.902585 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ffe6108f-9182-4d5b-b877-977da419fc7c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ffe6108f-9182-4d5b-b877-977da419fc7c" (UID: "ffe6108f-9182-4d5b-b877-977da419fc7c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 16:20:50 crc kubenswrapper[4707]: I1127 16:20:50.903025 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ffe6108f-9182-4d5b-b877-977da419fc7c-config" (OuterVolumeSpecName: "config") pod "ffe6108f-9182-4d5b-b877-977da419fc7c" (UID: "ffe6108f-9182-4d5b-b877-977da419fc7c"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 16:20:50 crc kubenswrapper[4707]: I1127 16:20:50.963109 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tq749\" (UniqueName: \"kubernetes.io/projected/ffe6108f-9182-4d5b-b877-977da419fc7c-kube-api-access-tq749\") on node \"crc\" DevicePath \"\"" Nov 27 16:20:50 crc kubenswrapper[4707]: I1127 16:20:50.963144 4707 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/ffe6108f-9182-4d5b-b877-977da419fc7c-config\") on node \"crc\" DevicePath \"\"" Nov 27 16:20:50 crc kubenswrapper[4707]: I1127 16:20:50.963159 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ffe6108f-9182-4d5b-b877-977da419fc7c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 27 16:20:51 crc kubenswrapper[4707]: I1127 16:20:51.425411 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"5798d94c-5634-4d92-b36d-cab9231adfc4","Type":"ContainerStarted","Data":"62bdd03ab195ad3612cdfb77283c1c12d9f5af412bb3d68830b00c6e0c23d85c"} Nov 27 16:20:51 crc kubenswrapper[4707]: I1127 16:20:51.425653 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"5798d94c-5634-4d92-b36d-cab9231adfc4","Type":"ContainerStarted","Data":"81d2c1c2557632dd94e53405c847f196729c846144d2ec8a8ef1580865599241"} Nov 27 16:20:51 crc kubenswrapper[4707]: I1127 16:20:51.432067 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"96ce5955-aed6-45e9-8a8a-4cb59d3a511d","Type":"ContainerStarted","Data":"ddb0e96230fc02b7f5dd232f2685f6135a1035250566123f6bef903bf2011d33"} Nov 27 16:20:51 crc kubenswrapper[4707]: I1127 16:20:51.432104 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" 
event={"ID":"96ce5955-aed6-45e9-8a8a-4cb59d3a511d","Type":"ContainerStarted","Data":"a2457030a9961af5854c83e5a20f52d3a0a20b2683fc3cd24f97b345bd69a877"} Nov 27 16:20:51 crc kubenswrapper[4707]: I1127 16:20:51.441800 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-ftcdx" event={"ID":"ffe6108f-9182-4d5b-b877-977da419fc7c","Type":"ContainerDied","Data":"0f86c145fdacbecaa283ec4e928ba252fb7bf494456439ef725921ab23b36402"} Nov 27 16:20:51 crc kubenswrapper[4707]: I1127 16:20:51.441860 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0f86c145fdacbecaa283ec4e928ba252fb7bf494456439ef725921ab23b36402" Nov 27 16:20:51 crc kubenswrapper[4707]: I1127 16:20:51.442926 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-ftcdx" Nov 27 16:20:51 crc kubenswrapper[4707]: I1127 16:20:51.460559 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=3.460543833 podStartE2EDuration="3.460543833s" podCreationTimestamp="2025-11-27 16:20:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-27 16:20:51.451157305 +0000 UTC m=+1027.082606063" watchObservedRunningTime="2025-11-27 16:20:51.460543833 +0000 UTC m=+1027.091992601" Nov 27 16:20:51 crc kubenswrapper[4707]: I1127 16:20:51.485165 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=3.4851425320000002 podStartE2EDuration="3.485142532s" podCreationTimestamp="2025-11-27 16:20:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-27 16:20:51.475011795 +0000 UTC m=+1027.106460563" watchObservedRunningTime="2025-11-27 16:20:51.485142532 +0000 UTC 
m=+1027.116591300" Nov 27 16:20:51 crc kubenswrapper[4707]: I1127 16:20:51.659723 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-55f844cf75-tnnrf"] Nov 27 16:20:51 crc kubenswrapper[4707]: E1127 16:20:51.660275 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ffe6108f-9182-4d5b-b877-977da419fc7c" containerName="neutron-db-sync" Nov 27 16:20:51 crc kubenswrapper[4707]: I1127 16:20:51.660289 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="ffe6108f-9182-4d5b-b877-977da419fc7c" containerName="neutron-db-sync" Nov 27 16:20:51 crc kubenswrapper[4707]: I1127 16:20:51.660462 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="ffe6108f-9182-4d5b-b877-977da419fc7c" containerName="neutron-db-sync" Nov 27 16:20:51 crc kubenswrapper[4707]: I1127 16:20:51.661266 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-55f844cf75-tnnrf" Nov 27 16:20:51 crc kubenswrapper[4707]: I1127 16:20:51.696223 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-55f844cf75-tnnrf"] Nov 27 16:20:51 crc kubenswrapper[4707]: I1127 16:20:51.742996 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-784ccffcb8-pjrzr"] Nov 27 16:20:51 crc kubenswrapper[4707]: I1127 16:20:51.744275 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-784ccffcb8-pjrzr" Nov 27 16:20:51 crc kubenswrapper[4707]: I1127 16:20:51.749441 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-ovndbs" Nov 27 16:20:51 crc kubenswrapper[4707]: I1127 16:20:51.749695 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-qc5rx" Nov 27 16:20:51 crc kubenswrapper[4707]: I1127 16:20:51.749824 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Nov 27 16:20:51 crc kubenswrapper[4707]: I1127 16:20:51.750022 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Nov 27 16:20:51 crc kubenswrapper[4707]: I1127 16:20:51.783404 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4ntt5\" (UniqueName: \"kubernetes.io/projected/0f3d64e0-6a2f-49d8-9f92-9ea645023db5-kube-api-access-4ntt5\") pod \"dnsmasq-dns-55f844cf75-tnnrf\" (UID: \"0f3d64e0-6a2f-49d8-9f92-9ea645023db5\") " pod="openstack/dnsmasq-dns-55f844cf75-tnnrf" Nov 27 16:20:51 crc kubenswrapper[4707]: I1127 16:20:51.783483 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0f3d64e0-6a2f-49d8-9f92-9ea645023db5-config\") pod \"dnsmasq-dns-55f844cf75-tnnrf\" (UID: \"0f3d64e0-6a2f-49d8-9f92-9ea645023db5\") " pod="openstack/dnsmasq-dns-55f844cf75-tnnrf" Nov 27 16:20:51 crc kubenswrapper[4707]: I1127 16:20:51.783507 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0f3d64e0-6a2f-49d8-9f92-9ea645023db5-ovsdbserver-nb\") pod \"dnsmasq-dns-55f844cf75-tnnrf\" (UID: \"0f3d64e0-6a2f-49d8-9f92-9ea645023db5\") " pod="openstack/dnsmasq-dns-55f844cf75-tnnrf" Nov 27 16:20:51 crc kubenswrapper[4707]: I1127 
16:20:51.783543 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0f3d64e0-6a2f-49d8-9f92-9ea645023db5-ovsdbserver-sb\") pod \"dnsmasq-dns-55f844cf75-tnnrf\" (UID: \"0f3d64e0-6a2f-49d8-9f92-9ea645023db5\") " pod="openstack/dnsmasq-dns-55f844cf75-tnnrf" Nov 27 16:20:51 crc kubenswrapper[4707]: I1127 16:20:51.783688 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0f3d64e0-6a2f-49d8-9f92-9ea645023db5-dns-svc\") pod \"dnsmasq-dns-55f844cf75-tnnrf\" (UID: \"0f3d64e0-6a2f-49d8-9f92-9ea645023db5\") " pod="openstack/dnsmasq-dns-55f844cf75-tnnrf" Nov 27 16:20:51 crc kubenswrapper[4707]: I1127 16:20:51.783736 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/0f3d64e0-6a2f-49d8-9f92-9ea645023db5-dns-swift-storage-0\") pod \"dnsmasq-dns-55f844cf75-tnnrf\" (UID: \"0f3d64e0-6a2f-49d8-9f92-9ea645023db5\") " pod="openstack/dnsmasq-dns-55f844cf75-tnnrf" Nov 27 16:20:51 crc kubenswrapper[4707]: I1127 16:20:51.791768 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-784ccffcb8-pjrzr"] Nov 27 16:20:51 crc kubenswrapper[4707]: I1127 16:20:51.885651 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4ntt5\" (UniqueName: \"kubernetes.io/projected/0f3d64e0-6a2f-49d8-9f92-9ea645023db5-kube-api-access-4ntt5\") pod \"dnsmasq-dns-55f844cf75-tnnrf\" (UID: \"0f3d64e0-6a2f-49d8-9f92-9ea645023db5\") " pod="openstack/dnsmasq-dns-55f844cf75-tnnrf" Nov 27 16:20:51 crc kubenswrapper[4707]: I1127 16:20:51.885931 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0f3d64e0-6a2f-49d8-9f92-9ea645023db5-config\") pod 
\"dnsmasq-dns-55f844cf75-tnnrf\" (UID: \"0f3d64e0-6a2f-49d8-9f92-9ea645023db5\") " pod="openstack/dnsmasq-dns-55f844cf75-tnnrf" Nov 27 16:20:51 crc kubenswrapper[4707]: I1127 16:20:51.885950 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0f3d64e0-6a2f-49d8-9f92-9ea645023db5-ovsdbserver-nb\") pod \"dnsmasq-dns-55f844cf75-tnnrf\" (UID: \"0f3d64e0-6a2f-49d8-9f92-9ea645023db5\") " pod="openstack/dnsmasq-dns-55f844cf75-tnnrf" Nov 27 16:20:51 crc kubenswrapper[4707]: I1127 16:20:51.885981 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0f3d64e0-6a2f-49d8-9f92-9ea645023db5-ovsdbserver-sb\") pod \"dnsmasq-dns-55f844cf75-tnnrf\" (UID: \"0f3d64e0-6a2f-49d8-9f92-9ea645023db5\") " pod="openstack/dnsmasq-dns-55f844cf75-tnnrf" Nov 27 16:20:51 crc kubenswrapper[4707]: I1127 16:20:51.886006 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/0c553d55-f2dd-404c-bb41-379922a29a20-ovndb-tls-certs\") pod \"neutron-784ccffcb8-pjrzr\" (UID: \"0c553d55-f2dd-404c-bb41-379922a29a20\") " pod="openstack/neutron-784ccffcb8-pjrzr" Nov 27 16:20:51 crc kubenswrapper[4707]: I1127 16:20:51.886030 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l8d5h\" (UniqueName: \"kubernetes.io/projected/0c553d55-f2dd-404c-bb41-379922a29a20-kube-api-access-l8d5h\") pod \"neutron-784ccffcb8-pjrzr\" (UID: \"0c553d55-f2dd-404c-bb41-379922a29a20\") " pod="openstack/neutron-784ccffcb8-pjrzr" Nov 27 16:20:51 crc kubenswrapper[4707]: I1127 16:20:51.886071 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/0c553d55-f2dd-404c-bb41-379922a29a20-httpd-config\") pod 
\"neutron-784ccffcb8-pjrzr\" (UID: \"0c553d55-f2dd-404c-bb41-379922a29a20\") " pod="openstack/neutron-784ccffcb8-pjrzr" Nov 27 16:20:51 crc kubenswrapper[4707]: I1127 16:20:51.886086 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0c553d55-f2dd-404c-bb41-379922a29a20-combined-ca-bundle\") pod \"neutron-784ccffcb8-pjrzr\" (UID: \"0c553d55-f2dd-404c-bb41-379922a29a20\") " pod="openstack/neutron-784ccffcb8-pjrzr" Nov 27 16:20:51 crc kubenswrapper[4707]: I1127 16:20:51.886109 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0f3d64e0-6a2f-49d8-9f92-9ea645023db5-dns-svc\") pod \"dnsmasq-dns-55f844cf75-tnnrf\" (UID: \"0f3d64e0-6a2f-49d8-9f92-9ea645023db5\") " pod="openstack/dnsmasq-dns-55f844cf75-tnnrf" Nov 27 16:20:51 crc kubenswrapper[4707]: I1127 16:20:51.886138 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/0c553d55-f2dd-404c-bb41-379922a29a20-config\") pod \"neutron-784ccffcb8-pjrzr\" (UID: \"0c553d55-f2dd-404c-bb41-379922a29a20\") " pod="openstack/neutron-784ccffcb8-pjrzr" Nov 27 16:20:51 crc kubenswrapper[4707]: I1127 16:20:51.886158 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/0f3d64e0-6a2f-49d8-9f92-9ea645023db5-dns-swift-storage-0\") pod \"dnsmasq-dns-55f844cf75-tnnrf\" (UID: \"0f3d64e0-6a2f-49d8-9f92-9ea645023db5\") " pod="openstack/dnsmasq-dns-55f844cf75-tnnrf" Nov 27 16:20:51 crc kubenswrapper[4707]: I1127 16:20:51.886975 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/0f3d64e0-6a2f-49d8-9f92-9ea645023db5-dns-swift-storage-0\") pod \"dnsmasq-dns-55f844cf75-tnnrf\" (UID: 
\"0f3d64e0-6a2f-49d8-9f92-9ea645023db5\") " pod="openstack/dnsmasq-dns-55f844cf75-tnnrf" Nov 27 16:20:51 crc kubenswrapper[4707]: I1127 16:20:51.887916 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0f3d64e0-6a2f-49d8-9f92-9ea645023db5-config\") pod \"dnsmasq-dns-55f844cf75-tnnrf\" (UID: \"0f3d64e0-6a2f-49d8-9f92-9ea645023db5\") " pod="openstack/dnsmasq-dns-55f844cf75-tnnrf" Nov 27 16:20:51 crc kubenswrapper[4707]: I1127 16:20:51.888899 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0f3d64e0-6a2f-49d8-9f92-9ea645023db5-ovsdbserver-nb\") pod \"dnsmasq-dns-55f844cf75-tnnrf\" (UID: \"0f3d64e0-6a2f-49d8-9f92-9ea645023db5\") " pod="openstack/dnsmasq-dns-55f844cf75-tnnrf" Nov 27 16:20:51 crc kubenswrapper[4707]: I1127 16:20:51.890125 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0f3d64e0-6a2f-49d8-9f92-9ea645023db5-ovsdbserver-sb\") pod \"dnsmasq-dns-55f844cf75-tnnrf\" (UID: \"0f3d64e0-6a2f-49d8-9f92-9ea645023db5\") " pod="openstack/dnsmasq-dns-55f844cf75-tnnrf" Nov 27 16:20:51 crc kubenswrapper[4707]: I1127 16:20:51.890301 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0f3d64e0-6a2f-49d8-9f92-9ea645023db5-dns-svc\") pod \"dnsmasq-dns-55f844cf75-tnnrf\" (UID: \"0f3d64e0-6a2f-49d8-9f92-9ea645023db5\") " pod="openstack/dnsmasq-dns-55f844cf75-tnnrf" Nov 27 16:20:51 crc kubenswrapper[4707]: I1127 16:20:51.904378 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4ntt5\" (UniqueName: \"kubernetes.io/projected/0f3d64e0-6a2f-49d8-9f92-9ea645023db5-kube-api-access-4ntt5\") pod \"dnsmasq-dns-55f844cf75-tnnrf\" (UID: \"0f3d64e0-6a2f-49d8-9f92-9ea645023db5\") " pod="openstack/dnsmasq-dns-55f844cf75-tnnrf" Nov 27 16:20:51 crc 
kubenswrapper[4707]: I1127 16:20:51.980243 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-55f844cf75-tnnrf" Nov 27 16:20:51 crc kubenswrapper[4707]: I1127 16:20:51.987322 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/0c553d55-f2dd-404c-bb41-379922a29a20-httpd-config\") pod \"neutron-784ccffcb8-pjrzr\" (UID: \"0c553d55-f2dd-404c-bb41-379922a29a20\") " pod="openstack/neutron-784ccffcb8-pjrzr" Nov 27 16:20:51 crc kubenswrapper[4707]: I1127 16:20:51.987360 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0c553d55-f2dd-404c-bb41-379922a29a20-combined-ca-bundle\") pod \"neutron-784ccffcb8-pjrzr\" (UID: \"0c553d55-f2dd-404c-bb41-379922a29a20\") " pod="openstack/neutron-784ccffcb8-pjrzr" Nov 27 16:20:51 crc kubenswrapper[4707]: I1127 16:20:51.987414 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/0c553d55-f2dd-404c-bb41-379922a29a20-config\") pod \"neutron-784ccffcb8-pjrzr\" (UID: \"0c553d55-f2dd-404c-bb41-379922a29a20\") " pod="openstack/neutron-784ccffcb8-pjrzr" Nov 27 16:20:51 crc kubenswrapper[4707]: I1127 16:20:51.987531 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/0c553d55-f2dd-404c-bb41-379922a29a20-ovndb-tls-certs\") pod \"neutron-784ccffcb8-pjrzr\" (UID: \"0c553d55-f2dd-404c-bb41-379922a29a20\") " pod="openstack/neutron-784ccffcb8-pjrzr" Nov 27 16:20:51 crc kubenswrapper[4707]: I1127 16:20:51.987559 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l8d5h\" (UniqueName: \"kubernetes.io/projected/0c553d55-f2dd-404c-bb41-379922a29a20-kube-api-access-l8d5h\") pod \"neutron-784ccffcb8-pjrzr\" (UID: 
\"0c553d55-f2dd-404c-bb41-379922a29a20\") " pod="openstack/neutron-784ccffcb8-pjrzr" Nov 27 16:20:51 crc kubenswrapper[4707]: I1127 16:20:51.990831 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/0c553d55-f2dd-404c-bb41-379922a29a20-httpd-config\") pod \"neutron-784ccffcb8-pjrzr\" (UID: \"0c553d55-f2dd-404c-bb41-379922a29a20\") " pod="openstack/neutron-784ccffcb8-pjrzr" Nov 27 16:20:51 crc kubenswrapper[4707]: I1127 16:20:51.991016 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0c553d55-f2dd-404c-bb41-379922a29a20-combined-ca-bundle\") pod \"neutron-784ccffcb8-pjrzr\" (UID: \"0c553d55-f2dd-404c-bb41-379922a29a20\") " pod="openstack/neutron-784ccffcb8-pjrzr" Nov 27 16:20:51 crc kubenswrapper[4707]: I1127 16:20:51.991907 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/0c553d55-f2dd-404c-bb41-379922a29a20-ovndb-tls-certs\") pod \"neutron-784ccffcb8-pjrzr\" (UID: \"0c553d55-f2dd-404c-bb41-379922a29a20\") " pod="openstack/neutron-784ccffcb8-pjrzr" Nov 27 16:20:51 crc kubenswrapper[4707]: I1127 16:20:51.999050 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/0c553d55-f2dd-404c-bb41-379922a29a20-config\") pod \"neutron-784ccffcb8-pjrzr\" (UID: \"0c553d55-f2dd-404c-bb41-379922a29a20\") " pod="openstack/neutron-784ccffcb8-pjrzr" Nov 27 16:20:52 crc kubenswrapper[4707]: I1127 16:20:52.009756 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l8d5h\" (UniqueName: \"kubernetes.io/projected/0c553d55-f2dd-404c-bb41-379922a29a20-kube-api-access-l8d5h\") pod \"neutron-784ccffcb8-pjrzr\" (UID: \"0c553d55-f2dd-404c-bb41-379922a29a20\") " pod="openstack/neutron-784ccffcb8-pjrzr" Nov 27 16:20:52 crc kubenswrapper[4707]: I1127 
16:20:52.083024 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-784ccffcb8-pjrzr" Nov 27 16:20:52 crc kubenswrapper[4707]: I1127 16:20:52.452075 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a49cd9fc-1364-454e-af11-bbf64e43e56d","Type":"ContainerStarted","Data":"350e127fe2aab6ee8a8903f23ab9c4569f76d251ed26bad9e0a532fba42eed78"} Nov 27 16:20:52 crc kubenswrapper[4707]: I1127 16:20:52.519979 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-55f844cf75-tnnrf"] Nov 27 16:20:52 crc kubenswrapper[4707]: W1127 16:20:52.527230 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0f3d64e0_6a2f_49d8_9f92_9ea645023db5.slice/crio-06d3cd6a7e9b415fd70045baf9c296878d01e5fdc901d5c3d6afa7c94b5716cf WatchSource:0}: Error finding container 06d3cd6a7e9b415fd70045baf9c296878d01e5fdc901d5c3d6afa7c94b5716cf: Status 404 returned error can't find the container with id 06d3cd6a7e9b415fd70045baf9c296878d01e5fdc901d5c3d6afa7c94b5716cf Nov 27 16:20:52 crc kubenswrapper[4707]: I1127 16:20:52.647815 4707 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-698758b865-kdgq7" podUID="7f3e04a7-5107-48e4-897a-8a126c0b2911" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.109:5353: i/o timeout" Nov 27 16:20:52 crc kubenswrapper[4707]: I1127 16:20:52.699974 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-784ccffcb8-pjrzr"] Nov 27 16:20:52 crc kubenswrapper[4707]: W1127 16:20:52.730569 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0c553d55_f2dd_404c_bb41_379922a29a20.slice/crio-3f2651b3b95d241d6cd589498ba66ab0a091f48da4bf3df45eb293072e249c10 WatchSource:0}: Error finding container 
3f2651b3b95d241d6cd589498ba66ab0a091f48da4bf3df45eb293072e249c10: Status 404 returned error can't find the container with id 3f2651b3b95d241d6cd589498ba66ab0a091f48da4bf3df45eb293072e249c10 Nov 27 16:20:53 crc kubenswrapper[4707]: I1127 16:20:53.461129 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-784ccffcb8-pjrzr" event={"ID":"0c553d55-f2dd-404c-bb41-379922a29a20","Type":"ContainerStarted","Data":"a15ab2bc546039cc92636f8003591e2da2898c85e5d58bab75f5ae9b2adeb5cb"} Nov 27 16:20:53 crc kubenswrapper[4707]: I1127 16:20:53.461428 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-784ccffcb8-pjrzr" event={"ID":"0c553d55-f2dd-404c-bb41-379922a29a20","Type":"ContainerStarted","Data":"8ffbf91e00c6dac33fbfd7b6f8ac9e2bcbaac4bc89db47e3a51003fcbca41aa2"} Nov 27 16:20:53 crc kubenswrapper[4707]: I1127 16:20:53.461439 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-784ccffcb8-pjrzr" event={"ID":"0c553d55-f2dd-404c-bb41-379922a29a20","Type":"ContainerStarted","Data":"3f2651b3b95d241d6cd589498ba66ab0a091f48da4bf3df45eb293072e249c10"} Nov 27 16:20:53 crc kubenswrapper[4707]: I1127 16:20:53.461935 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-784ccffcb8-pjrzr" Nov 27 16:20:53 crc kubenswrapper[4707]: I1127 16:20:53.464390 4707 generic.go:334] "Generic (PLEG): container finished" podID="525c317f-60e5-4359-bdd4-62caf9f54b38" containerID="672c7b06415373aac80def8df626058032b536c97c6101380abddd2763491cbe" exitCode=0 Nov 27 16:20:53 crc kubenswrapper[4707]: I1127 16:20:53.464434 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-qzltn" event={"ID":"525c317f-60e5-4359-bdd4-62caf9f54b38","Type":"ContainerDied","Data":"672c7b06415373aac80def8df626058032b536c97c6101380abddd2763491cbe"} Nov 27 16:20:53 crc kubenswrapper[4707]: I1127 16:20:53.470095 4707 generic.go:334] "Generic (PLEG): container finished" 
podID="0f3d64e0-6a2f-49d8-9f92-9ea645023db5" containerID="f322e48250a951f6df738ce98dad974d7f8a61257d38fd29f1d379f305ab05bd" exitCode=0 Nov 27 16:20:53 crc kubenswrapper[4707]: I1127 16:20:53.470147 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55f844cf75-tnnrf" event={"ID":"0f3d64e0-6a2f-49d8-9f92-9ea645023db5","Type":"ContainerDied","Data":"f322e48250a951f6df738ce98dad974d7f8a61257d38fd29f1d379f305ab05bd"} Nov 27 16:20:53 crc kubenswrapper[4707]: I1127 16:20:53.470171 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55f844cf75-tnnrf" event={"ID":"0f3d64e0-6a2f-49d8-9f92-9ea645023db5","Type":"ContainerStarted","Data":"06d3cd6a7e9b415fd70045baf9c296878d01e5fdc901d5c3d6afa7c94b5716cf"} Nov 27 16:20:53 crc kubenswrapper[4707]: I1127 16:20:53.483996 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-784ccffcb8-pjrzr" podStartSLOduration=2.483976304 podStartE2EDuration="2.483976304s" podCreationTimestamp="2025-11-27 16:20:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-27 16:20:53.476544933 +0000 UTC m=+1029.107993711" watchObservedRunningTime="2025-11-27 16:20:53.483976304 +0000 UTC m=+1029.115425062" Nov 27 16:20:53 crc kubenswrapper[4707]: I1127 16:20:53.484822 4707 generic.go:334] "Generic (PLEG): container finished" podID="4db357ae-5dd8-47c3-8f13-fd888df4fd42" containerID="550123f8283cb9092f7db11db6ceecbd9443975883983e2c8b5d913c7e618bd1" exitCode=0 Nov 27 16:20:53 crc kubenswrapper[4707]: I1127 16:20:53.484864 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-fh47h" event={"ID":"4db357ae-5dd8-47c3-8f13-fd888df4fd42","Type":"ContainerDied","Data":"550123f8283cb9092f7db11db6ceecbd9443975883983e2c8b5d913c7e618bd1"} Nov 27 16:20:54 crc kubenswrapper[4707]: I1127 16:20:54.069859 4707 kubelet.go:2421] "SyncLoop ADD" 
source="api" pods=["openstack/neutron-69588f8b9-6plc2"] Nov 27 16:20:54 crc kubenswrapper[4707]: I1127 16:20:54.071525 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-69588f8b9-6plc2" Nov 27 16:20:54 crc kubenswrapper[4707]: I1127 16:20:54.077195 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-69588f8b9-6plc2"] Nov 27 16:20:54 crc kubenswrapper[4707]: I1127 16:20:54.078332 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-public-svc" Nov 27 16:20:54 crc kubenswrapper[4707]: I1127 16:20:54.078721 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-internal-svc" Nov 27 16:20:54 crc kubenswrapper[4707]: I1127 16:20:54.141362 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5de06db5-fa17-40e5-a08d-9b7f139b08ed-internal-tls-certs\") pod \"neutron-69588f8b9-6plc2\" (UID: \"5de06db5-fa17-40e5-a08d-9b7f139b08ed\") " pod="openstack/neutron-69588f8b9-6plc2" Nov 27 16:20:54 crc kubenswrapper[4707]: I1127 16:20:54.141456 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h9hgq\" (UniqueName: \"kubernetes.io/projected/5de06db5-fa17-40e5-a08d-9b7f139b08ed-kube-api-access-h9hgq\") pod \"neutron-69588f8b9-6plc2\" (UID: \"5de06db5-fa17-40e5-a08d-9b7f139b08ed\") " pod="openstack/neutron-69588f8b9-6plc2" Nov 27 16:20:54 crc kubenswrapper[4707]: I1127 16:20:54.141523 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/5de06db5-fa17-40e5-a08d-9b7f139b08ed-config\") pod \"neutron-69588f8b9-6plc2\" (UID: \"5de06db5-fa17-40e5-a08d-9b7f139b08ed\") " pod="openstack/neutron-69588f8b9-6plc2" Nov 27 16:20:54 crc kubenswrapper[4707]: I1127 16:20:54.141547 4707 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5de06db5-fa17-40e5-a08d-9b7f139b08ed-combined-ca-bundle\") pod \"neutron-69588f8b9-6plc2\" (UID: \"5de06db5-fa17-40e5-a08d-9b7f139b08ed\") " pod="openstack/neutron-69588f8b9-6plc2"
Nov 27 16:20:54 crc kubenswrapper[4707]: I1127 16:20:54.141618 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/5de06db5-fa17-40e5-a08d-9b7f139b08ed-ovndb-tls-certs\") pod \"neutron-69588f8b9-6plc2\" (UID: \"5de06db5-fa17-40e5-a08d-9b7f139b08ed\") " pod="openstack/neutron-69588f8b9-6plc2"
Nov 27 16:20:54 crc kubenswrapper[4707]: I1127 16:20:54.141705 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5de06db5-fa17-40e5-a08d-9b7f139b08ed-public-tls-certs\") pod \"neutron-69588f8b9-6plc2\" (UID: \"5de06db5-fa17-40e5-a08d-9b7f139b08ed\") " pod="openstack/neutron-69588f8b9-6plc2"
Nov 27 16:20:54 crc kubenswrapper[4707]: I1127 16:20:54.141770 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/5de06db5-fa17-40e5-a08d-9b7f139b08ed-httpd-config\") pod \"neutron-69588f8b9-6plc2\" (UID: \"5de06db5-fa17-40e5-a08d-9b7f139b08ed\") " pod="openstack/neutron-69588f8b9-6plc2"
Nov 27 16:20:54 crc kubenswrapper[4707]: I1127 16:20:54.243310 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/5de06db5-fa17-40e5-a08d-9b7f139b08ed-httpd-config\") pod \"neutron-69588f8b9-6plc2\" (UID: \"5de06db5-fa17-40e5-a08d-9b7f139b08ed\") " pod="openstack/neutron-69588f8b9-6plc2"
Nov 27 16:20:54 crc kubenswrapper[4707]: I1127 16:20:54.243400 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5de06db5-fa17-40e5-a08d-9b7f139b08ed-internal-tls-certs\") pod \"neutron-69588f8b9-6plc2\" (UID: \"5de06db5-fa17-40e5-a08d-9b7f139b08ed\") " pod="openstack/neutron-69588f8b9-6plc2"
Nov 27 16:20:54 crc kubenswrapper[4707]: I1127 16:20:54.243450 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h9hgq\" (UniqueName: \"kubernetes.io/projected/5de06db5-fa17-40e5-a08d-9b7f139b08ed-kube-api-access-h9hgq\") pod \"neutron-69588f8b9-6plc2\" (UID: \"5de06db5-fa17-40e5-a08d-9b7f139b08ed\") " pod="openstack/neutron-69588f8b9-6plc2"
Nov 27 16:20:54 crc kubenswrapper[4707]: I1127 16:20:54.243509 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/5de06db5-fa17-40e5-a08d-9b7f139b08ed-config\") pod \"neutron-69588f8b9-6plc2\" (UID: \"5de06db5-fa17-40e5-a08d-9b7f139b08ed\") " pod="openstack/neutron-69588f8b9-6plc2"
Nov 27 16:20:54 crc kubenswrapper[4707]: I1127 16:20:54.243525 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5de06db5-fa17-40e5-a08d-9b7f139b08ed-combined-ca-bundle\") pod \"neutron-69588f8b9-6plc2\" (UID: \"5de06db5-fa17-40e5-a08d-9b7f139b08ed\") " pod="openstack/neutron-69588f8b9-6plc2"
Nov 27 16:20:54 crc kubenswrapper[4707]: I1127 16:20:54.243558 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/5de06db5-fa17-40e5-a08d-9b7f139b08ed-ovndb-tls-certs\") pod \"neutron-69588f8b9-6plc2\" (UID: \"5de06db5-fa17-40e5-a08d-9b7f139b08ed\") " pod="openstack/neutron-69588f8b9-6plc2"
Nov 27 16:20:54 crc kubenswrapper[4707]: I1127 16:20:54.243596 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5de06db5-fa17-40e5-a08d-9b7f139b08ed-public-tls-certs\") pod \"neutron-69588f8b9-6plc2\" (UID: \"5de06db5-fa17-40e5-a08d-9b7f139b08ed\") " pod="openstack/neutron-69588f8b9-6plc2"
Nov 27 16:20:54 crc kubenswrapper[4707]: I1127 16:20:54.248645 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/5de06db5-fa17-40e5-a08d-9b7f139b08ed-httpd-config\") pod \"neutron-69588f8b9-6plc2\" (UID: \"5de06db5-fa17-40e5-a08d-9b7f139b08ed\") " pod="openstack/neutron-69588f8b9-6plc2"
Nov 27 16:20:54 crc kubenswrapper[4707]: I1127 16:20:54.249092 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5de06db5-fa17-40e5-a08d-9b7f139b08ed-internal-tls-certs\") pod \"neutron-69588f8b9-6plc2\" (UID: \"5de06db5-fa17-40e5-a08d-9b7f139b08ed\") " pod="openstack/neutron-69588f8b9-6plc2"
Nov 27 16:20:54 crc kubenswrapper[4707]: I1127 16:20:54.250045 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5de06db5-fa17-40e5-a08d-9b7f139b08ed-combined-ca-bundle\") pod \"neutron-69588f8b9-6plc2\" (UID: \"5de06db5-fa17-40e5-a08d-9b7f139b08ed\") " pod="openstack/neutron-69588f8b9-6plc2"
Nov 27 16:20:54 crc kubenswrapper[4707]: I1127 16:20:54.250698 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/5de06db5-fa17-40e5-a08d-9b7f139b08ed-ovndb-tls-certs\") pod \"neutron-69588f8b9-6plc2\" (UID: \"5de06db5-fa17-40e5-a08d-9b7f139b08ed\") " pod="openstack/neutron-69588f8b9-6plc2"
Nov 27 16:20:54 crc kubenswrapper[4707]: I1127 16:20:54.251073 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/5de06db5-fa17-40e5-a08d-9b7f139b08ed-config\") pod \"neutron-69588f8b9-6plc2\" (UID: \"5de06db5-fa17-40e5-a08d-9b7f139b08ed\") " pod="openstack/neutron-69588f8b9-6plc2"
Nov 27 16:20:54 crc kubenswrapper[4707]: I1127 16:20:54.251713 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5de06db5-fa17-40e5-a08d-9b7f139b08ed-public-tls-certs\") pod \"neutron-69588f8b9-6plc2\" (UID: \"5de06db5-fa17-40e5-a08d-9b7f139b08ed\") " pod="openstack/neutron-69588f8b9-6plc2"
Nov 27 16:20:54 crc kubenswrapper[4707]: I1127 16:20:54.269192 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h9hgq\" (UniqueName: \"kubernetes.io/projected/5de06db5-fa17-40e5-a08d-9b7f139b08ed-kube-api-access-h9hgq\") pod \"neutron-69588f8b9-6plc2\" (UID: \"5de06db5-fa17-40e5-a08d-9b7f139b08ed\") " pod="openstack/neutron-69588f8b9-6plc2"
Nov 27 16:20:54 crc kubenswrapper[4707]: I1127 16:20:54.393053 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-69588f8b9-6plc2"
Nov 27 16:20:54 crc kubenswrapper[4707]: I1127 16:20:54.510306 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55f844cf75-tnnrf" event={"ID":"0f3d64e0-6a2f-49d8-9f92-9ea645023db5","Type":"ContainerStarted","Data":"e3ee6089f2545043c5b9019c8751c5b067980cc5243bc0f7e5c9b3dfcc5bab80"}
Nov 27 16:20:54 crc kubenswrapper[4707]: I1127 16:20:54.543257 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-55f844cf75-tnnrf" podStartSLOduration=3.5432374859999998 podStartE2EDuration="3.543237486s" podCreationTimestamp="2025-11-27 16:20:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-27 16:20:54.535797105 +0000 UTC m=+1030.167245873" watchObservedRunningTime="2025-11-27 16:20:54.543237486 +0000 UTC m=+1030.174686254"
Nov 27 16:20:55 crc kubenswrapper[4707]: I1127 16:20:55.517717 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-55f844cf75-tnnrf"
Nov 27 16:20:56 crc kubenswrapper[4707]: I1127 16:20:56.815939 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-qzltn"
Nov 27 16:20:56 crc kubenswrapper[4707]: I1127 16:20:56.822692 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-fh47h"
Nov 27 16:20:56 crc kubenswrapper[4707]: I1127 16:20:56.906595 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4db357ae-5dd8-47c3-8f13-fd888df4fd42-config-data\") pod \"4db357ae-5dd8-47c3-8f13-fd888df4fd42\" (UID: \"4db357ae-5dd8-47c3-8f13-fd888df4fd42\") "
Nov 27 16:20:56 crc kubenswrapper[4707]: I1127 16:20:56.906723 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6cccf\" (UniqueName: \"kubernetes.io/projected/4db357ae-5dd8-47c3-8f13-fd888df4fd42-kube-api-access-6cccf\") pod \"4db357ae-5dd8-47c3-8f13-fd888df4fd42\" (UID: \"4db357ae-5dd8-47c3-8f13-fd888df4fd42\") "
Nov 27 16:20:56 crc kubenswrapper[4707]: I1127 16:20:56.906829 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/525c317f-60e5-4359-bdd4-62caf9f54b38-db-sync-config-data\") pod \"525c317f-60e5-4359-bdd4-62caf9f54b38\" (UID: \"525c317f-60e5-4359-bdd4-62caf9f54b38\") "
Nov 27 16:20:56 crc kubenswrapper[4707]: I1127 16:20:56.906918 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/4db357ae-5dd8-47c3-8f13-fd888df4fd42-credential-keys\") pod \"4db357ae-5dd8-47c3-8f13-fd888df4fd42\" (UID: \"4db357ae-5dd8-47c3-8f13-fd888df4fd42\") "
Nov 27 16:20:56 crc kubenswrapper[4707]: I1127 16:20:56.907083 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4db357ae-5dd8-47c3-8f13-fd888df4fd42-combined-ca-bundle\") pod \"4db357ae-5dd8-47c3-8f13-fd888df4fd42\" (UID: \"4db357ae-5dd8-47c3-8f13-fd888df4fd42\") "
Nov 27 16:20:56 crc kubenswrapper[4707]: I1127 16:20:56.907125 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/4db357ae-5dd8-47c3-8f13-fd888df4fd42-fernet-keys\") pod \"4db357ae-5dd8-47c3-8f13-fd888df4fd42\" (UID: \"4db357ae-5dd8-47c3-8f13-fd888df4fd42\") "
Nov 27 16:20:56 crc kubenswrapper[4707]: I1127 16:20:56.907153 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4db357ae-5dd8-47c3-8f13-fd888df4fd42-scripts\") pod \"4db357ae-5dd8-47c3-8f13-fd888df4fd42\" (UID: \"4db357ae-5dd8-47c3-8f13-fd888df4fd42\") "
Nov 27 16:20:56 crc kubenswrapper[4707]: I1127 16:20:56.907185 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2dnn\" (UniqueName: \"kubernetes.io/projected/525c317f-60e5-4359-bdd4-62caf9f54b38-kube-api-access-x2dnn\") pod \"525c317f-60e5-4359-bdd4-62caf9f54b38\" (UID: \"525c317f-60e5-4359-bdd4-62caf9f54b38\") "
Nov 27 16:20:56 crc kubenswrapper[4707]: I1127 16:20:56.907295 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/525c317f-60e5-4359-bdd4-62caf9f54b38-combined-ca-bundle\") pod \"525c317f-60e5-4359-bdd4-62caf9f54b38\" (UID: \"525c317f-60e5-4359-bdd4-62caf9f54b38\") "
Nov 27 16:20:56 crc kubenswrapper[4707]: I1127 16:20:56.919049 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/525c317f-60e5-4359-bdd4-62caf9f54b38-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "525c317f-60e5-4359-bdd4-62caf9f54b38" (UID: "525c317f-60e5-4359-bdd4-62caf9f54b38"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 27 16:20:56 crc kubenswrapper[4707]: I1127 16:20:56.919878 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4db357ae-5dd8-47c3-8f13-fd888df4fd42-kube-api-access-6cccf" (OuterVolumeSpecName: "kube-api-access-6cccf") pod "4db357ae-5dd8-47c3-8f13-fd888df4fd42" (UID: "4db357ae-5dd8-47c3-8f13-fd888df4fd42"). InnerVolumeSpecName "kube-api-access-6cccf". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 27 16:20:56 crc kubenswrapper[4707]: I1127 16:20:56.920948 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4db357ae-5dd8-47c3-8f13-fd888df4fd42-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "4db357ae-5dd8-47c3-8f13-fd888df4fd42" (UID: "4db357ae-5dd8-47c3-8f13-fd888df4fd42"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 27 16:20:56 crc kubenswrapper[4707]: I1127 16:20:56.921533 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4db357ae-5dd8-47c3-8f13-fd888df4fd42-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "4db357ae-5dd8-47c3-8f13-fd888df4fd42" (UID: "4db357ae-5dd8-47c3-8f13-fd888df4fd42"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 27 16:20:56 crc kubenswrapper[4707]: I1127 16:20:56.926685 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/525c317f-60e5-4359-bdd4-62caf9f54b38-kube-api-access-x2dnn" (OuterVolumeSpecName: "kube-api-access-x2dnn") pod "525c317f-60e5-4359-bdd4-62caf9f54b38" (UID: "525c317f-60e5-4359-bdd4-62caf9f54b38"). InnerVolumeSpecName "kube-api-access-x2dnn". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 27 16:20:56 crc kubenswrapper[4707]: I1127 16:20:56.930915 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4db357ae-5dd8-47c3-8f13-fd888df4fd42-scripts" (OuterVolumeSpecName: "scripts") pod "4db357ae-5dd8-47c3-8f13-fd888df4fd42" (UID: "4db357ae-5dd8-47c3-8f13-fd888df4fd42"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 27 16:20:56 crc kubenswrapper[4707]: I1127 16:20:56.959643 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4db357ae-5dd8-47c3-8f13-fd888df4fd42-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4db357ae-5dd8-47c3-8f13-fd888df4fd42" (UID: "4db357ae-5dd8-47c3-8f13-fd888df4fd42"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 27 16:20:56 crc kubenswrapper[4707]: I1127 16:20:56.965884 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4db357ae-5dd8-47c3-8f13-fd888df4fd42-config-data" (OuterVolumeSpecName: "config-data") pod "4db357ae-5dd8-47c3-8f13-fd888df4fd42" (UID: "4db357ae-5dd8-47c3-8f13-fd888df4fd42"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 27 16:20:56 crc kubenswrapper[4707]: I1127 16:20:56.971936 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/525c317f-60e5-4359-bdd4-62caf9f54b38-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "525c317f-60e5-4359-bdd4-62caf9f54b38" (UID: "525c317f-60e5-4359-bdd4-62caf9f54b38"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 27 16:20:57 crc kubenswrapper[4707]: I1127 16:20:57.010864 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4db357ae-5dd8-47c3-8f13-fd888df4fd42-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Nov 27 16:20:57 crc kubenswrapper[4707]: I1127 16:20:57.010938 4707 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/4db357ae-5dd8-47c3-8f13-fd888df4fd42-fernet-keys\") on node \"crc\" DevicePath \"\""
Nov 27 16:20:57 crc kubenswrapper[4707]: I1127 16:20:57.010951 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4db357ae-5dd8-47c3-8f13-fd888df4fd42-scripts\") on node \"crc\" DevicePath \"\""
Nov 27 16:20:57 crc kubenswrapper[4707]: I1127 16:20:57.010965 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2dnn\" (UniqueName: \"kubernetes.io/projected/525c317f-60e5-4359-bdd4-62caf9f54b38-kube-api-access-x2dnn\") on node \"crc\" DevicePath \"\""
Nov 27 16:20:57 crc kubenswrapper[4707]: I1127 16:20:57.010981 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/525c317f-60e5-4359-bdd4-62caf9f54b38-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Nov 27 16:20:57 crc kubenswrapper[4707]: I1127 16:20:57.010994 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4db357ae-5dd8-47c3-8f13-fd888df4fd42-config-data\") on node \"crc\" DevicePath \"\""
Nov 27 16:20:57 crc kubenswrapper[4707]: I1127 16:20:57.011009 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6cccf\" (UniqueName: \"kubernetes.io/projected/4db357ae-5dd8-47c3-8f13-fd888df4fd42-kube-api-access-6cccf\") on node \"crc\" DevicePath \"\""
Nov 27 16:20:57 crc kubenswrapper[4707]: I1127 16:20:57.011021 4707 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/525c317f-60e5-4359-bdd4-62caf9f54b38-db-sync-config-data\") on node \"crc\" DevicePath \"\""
Nov 27 16:20:57 crc kubenswrapper[4707]: I1127 16:20:57.011032 4707 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/4db357ae-5dd8-47c3-8f13-fd888df4fd42-credential-keys\") on node \"crc\" DevicePath \"\""
Nov 27 16:20:57 crc kubenswrapper[4707]: I1127 16:20:57.542774 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-fh47h" event={"ID":"4db357ae-5dd8-47c3-8f13-fd888df4fd42","Type":"ContainerDied","Data":"a5a1ee29436217e2797b88d153a2ba14fcb5d7d85ed7f5809ca2ccc362fe12bc"}
Nov 27 16:20:57 crc kubenswrapper[4707]: I1127 16:20:57.542825 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a5a1ee29436217e2797b88d153a2ba14fcb5d7d85ed7f5809ca2ccc362fe12bc"
Nov 27 16:20:57 crc kubenswrapper[4707]: I1127 16:20:57.542937 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-fh47h"
Nov 27 16:20:57 crc kubenswrapper[4707]: I1127 16:20:57.545168 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-qzltn" event={"ID":"525c317f-60e5-4359-bdd4-62caf9f54b38","Type":"ContainerDied","Data":"440d28a5d4b638c2070005a98e116276f229010931d81c83ebb36f02a46d5f81"}
Nov 27 16:20:57 crc kubenswrapper[4707]: I1127 16:20:57.545230 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="440d28a5d4b638c2070005a98e116276f229010931d81c83ebb36f02a46d5f81"
Nov 27 16:20:57 crc kubenswrapper[4707]: I1127 16:20:57.545251 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-qzltn"
Nov 27 16:20:58 crc kubenswrapper[4707]: I1127 16:20:58.042556 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-85c75486d4-pmkdv"]
Nov 27 16:20:58 crc kubenswrapper[4707]: E1127 16:20:58.043044 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4db357ae-5dd8-47c3-8f13-fd888df4fd42" containerName="keystone-bootstrap"
Nov 27 16:20:58 crc kubenswrapper[4707]: I1127 16:20:58.043058 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="4db357ae-5dd8-47c3-8f13-fd888df4fd42" containerName="keystone-bootstrap"
Nov 27 16:20:58 crc kubenswrapper[4707]: E1127 16:20:58.043085 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="525c317f-60e5-4359-bdd4-62caf9f54b38" containerName="barbican-db-sync"
Nov 27 16:20:58 crc kubenswrapper[4707]: I1127 16:20:58.043091 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="525c317f-60e5-4359-bdd4-62caf9f54b38" containerName="barbican-db-sync"
Nov 27 16:20:58 crc kubenswrapper[4707]: I1127 16:20:58.043308 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="4db357ae-5dd8-47c3-8f13-fd888df4fd42" containerName="keystone-bootstrap"
Nov 27 16:20:58 crc kubenswrapper[4707]: I1127 16:20:58.043333 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="525c317f-60e5-4359-bdd4-62caf9f54b38" containerName="barbican-db-sync"
Nov 27 16:20:58 crc kubenswrapper[4707]: I1127 16:20:58.044101 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-85c75486d4-pmkdv"
Nov 27 16:20:58 crc kubenswrapper[4707]: I1127 16:20:58.045957 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone"
Nov 27 16:20:58 crc kubenswrapper[4707]: I1127 16:20:58.046856 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-internal-svc"
Nov 27 16:20:58 crc kubenswrapper[4707]: I1127 16:20:58.046975 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-public-svc"
Nov 27 16:20:58 crc kubenswrapper[4707]: I1127 16:20:58.047274 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data"
Nov 27 16:20:58 crc kubenswrapper[4707]: I1127 16:20:58.047431 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-drvd4"
Nov 27 16:20:58 crc kubenswrapper[4707]: I1127 16:20:58.047564 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts"
Nov 27 16:20:58 crc kubenswrapper[4707]: I1127 16:20:58.055555 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-85c75486d4-pmkdv"]
Nov 27 16:20:58 crc kubenswrapper[4707]: I1127 16:20:58.108294 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-worker-567875797-q64rz"]
Nov 27 16:20:58 crc kubenswrapper[4707]: I1127 16:20:58.111222 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-567875797-q64rz"
Nov 27 16:20:58 crc kubenswrapper[4707]: I1127 16:20:58.115723 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data"
Nov 27 16:20:58 crc kubenswrapper[4707]: I1127 16:20:58.115957 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-worker-config-data"
Nov 27 16:20:58 crc kubenswrapper[4707]: I1127 16:20:58.116287 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-gcgkn"
Nov 27 16:20:58 crc kubenswrapper[4707]: I1127 16:20:58.131980 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-567875797-q64rz"]
Nov 27 16:20:58 crc kubenswrapper[4707]: I1127 16:20:58.133293 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v8x65\" (UniqueName: \"kubernetes.io/projected/af1b5c91-e184-49fe-9ad9-83f047d5123d-kube-api-access-v8x65\") pod \"keystone-85c75486d4-pmkdv\" (UID: \"af1b5c91-e184-49fe-9ad9-83f047d5123d\") " pod="openstack/keystone-85c75486d4-pmkdv"
Nov 27 16:20:58 crc kubenswrapper[4707]: I1127 16:20:58.133343 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/af1b5c91-e184-49fe-9ad9-83f047d5123d-internal-tls-certs\") pod \"keystone-85c75486d4-pmkdv\" (UID: \"af1b5c91-e184-49fe-9ad9-83f047d5123d\") " pod="openstack/keystone-85c75486d4-pmkdv"
Nov 27 16:20:58 crc kubenswrapper[4707]: I1127 16:20:58.133432 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/af1b5c91-e184-49fe-9ad9-83f047d5123d-scripts\") pod \"keystone-85c75486d4-pmkdv\" (UID: \"af1b5c91-e184-49fe-9ad9-83f047d5123d\") " pod="openstack/keystone-85c75486d4-pmkdv"
Nov 27 16:20:58 crc kubenswrapper[4707]: I1127 16:20:58.133497 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/af1b5c91-e184-49fe-9ad9-83f047d5123d-config-data\") pod \"keystone-85c75486d4-pmkdv\" (UID: \"af1b5c91-e184-49fe-9ad9-83f047d5123d\") " pod="openstack/keystone-85c75486d4-pmkdv"
Nov 27 16:20:58 crc kubenswrapper[4707]: I1127 16:20:58.133518 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/af1b5c91-e184-49fe-9ad9-83f047d5123d-public-tls-certs\") pod \"keystone-85c75486d4-pmkdv\" (UID: \"af1b5c91-e184-49fe-9ad9-83f047d5123d\") " pod="openstack/keystone-85c75486d4-pmkdv"
Nov 27 16:20:58 crc kubenswrapper[4707]: I1127 16:20:58.133543 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/af1b5c91-e184-49fe-9ad9-83f047d5123d-credential-keys\") pod \"keystone-85c75486d4-pmkdv\" (UID: \"af1b5c91-e184-49fe-9ad9-83f047d5123d\") " pod="openstack/keystone-85c75486d4-pmkdv"
Nov 27 16:20:58 crc kubenswrapper[4707]: I1127 16:20:58.133602 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/af1b5c91-e184-49fe-9ad9-83f047d5123d-fernet-keys\") pod \"keystone-85c75486d4-pmkdv\" (UID: \"af1b5c91-e184-49fe-9ad9-83f047d5123d\") " pod="openstack/keystone-85c75486d4-pmkdv"
Nov 27 16:20:58 crc kubenswrapper[4707]: I1127 16:20:58.133676 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/af1b5c91-e184-49fe-9ad9-83f047d5123d-combined-ca-bundle\") pod \"keystone-85c75486d4-pmkdv\" (UID: \"af1b5c91-e184-49fe-9ad9-83f047d5123d\") " pod="openstack/keystone-85c75486d4-pmkdv"
Nov 27 16:20:58 crc kubenswrapper[4707]: I1127 16:20:58.173781 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-keystone-listener-6c948b485d-2wcq8"]
Nov 27 16:20:58 crc kubenswrapper[4707]: I1127 16:20:58.183612 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-6c948b485d-2wcq8"
Nov 27 16:20:58 crc kubenswrapper[4707]: I1127 16:20:58.190708 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-keystone-listener-config-data"
Nov 27 16:20:58 crc kubenswrapper[4707]: I1127 16:20:58.242305 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d1df02f7-1e71-4fae-a6df-cb3c83460a7e-config-data\") pod \"barbican-keystone-listener-6c948b485d-2wcq8\" (UID: \"d1df02f7-1e71-4fae-a6df-cb3c83460a7e\") " pod="openstack/barbican-keystone-listener-6c948b485d-2wcq8"
Nov 27 16:20:58 crc kubenswrapper[4707]: I1127 16:20:58.251072 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/af1b5c91-e184-49fe-9ad9-83f047d5123d-config-data\") pod \"keystone-85c75486d4-pmkdv\" (UID: \"af1b5c91-e184-49fe-9ad9-83f047d5123d\") " pod="openstack/keystone-85c75486d4-pmkdv"
Nov 27 16:20:58 crc kubenswrapper[4707]: I1127 16:20:58.251173 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/af1b5c91-e184-49fe-9ad9-83f047d5123d-public-tls-certs\") pod \"keystone-85c75486d4-pmkdv\" (UID: \"af1b5c91-e184-49fe-9ad9-83f047d5123d\") " pod="openstack/keystone-85c75486d4-pmkdv"
Nov 27 16:20:58 crc kubenswrapper[4707]: I1127 16:20:58.251225 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/af1b5c91-e184-49fe-9ad9-83f047d5123d-credential-keys\") pod \"keystone-85c75486d4-pmkdv\" (UID: \"af1b5c91-e184-49fe-9ad9-83f047d5123d\") " pod="openstack/keystone-85c75486d4-pmkdv"
Nov 27 16:20:58 crc kubenswrapper[4707]: I1127 16:20:58.251355 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rfcqt\" (UniqueName: \"kubernetes.io/projected/e1955b85-6ed8-492c-8001-c4fc20da8270-kube-api-access-rfcqt\") pod \"barbican-worker-567875797-q64rz\" (UID: \"e1955b85-6ed8-492c-8001-c4fc20da8270\") " pod="openstack/barbican-worker-567875797-q64rz"
Nov 27 16:20:58 crc kubenswrapper[4707]: I1127 16:20:58.251439 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e1955b85-6ed8-492c-8001-c4fc20da8270-logs\") pod \"barbican-worker-567875797-q64rz\" (UID: \"e1955b85-6ed8-492c-8001-c4fc20da8270\") " pod="openstack/barbican-worker-567875797-q64rz"
Nov 27 16:20:58 crc kubenswrapper[4707]: I1127 16:20:58.251575 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/af1b5c91-e184-49fe-9ad9-83f047d5123d-fernet-keys\") pod \"keystone-85c75486d4-pmkdv\" (UID: \"af1b5c91-e184-49fe-9ad9-83f047d5123d\") " pod="openstack/keystone-85c75486d4-pmkdv"
Nov 27 16:20:58 crc kubenswrapper[4707]: I1127 16:20:58.251683 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e1955b85-6ed8-492c-8001-c4fc20da8270-config-data\") pod \"barbican-worker-567875797-q64rz\" (UID: \"e1955b85-6ed8-492c-8001-c4fc20da8270\") " pod="openstack/barbican-worker-567875797-q64rz"
Nov 27 16:20:58 crc kubenswrapper[4707]: I1127 16:20:58.251759 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e1955b85-6ed8-492c-8001-c4fc20da8270-config-data-custom\") pod \"barbican-worker-567875797-q64rz\" (UID: \"e1955b85-6ed8-492c-8001-c4fc20da8270\") " pod="openstack/barbican-worker-567875797-q64rz"
Nov 27 16:20:58 crc kubenswrapper[4707]: I1127 16:20:58.251873 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d1df02f7-1e71-4fae-a6df-cb3c83460a7e-combined-ca-bundle\") pod \"barbican-keystone-listener-6c948b485d-2wcq8\" (UID: \"d1df02f7-1e71-4fae-a6df-cb3c83460a7e\") " pod="openstack/barbican-keystone-listener-6c948b485d-2wcq8"
Nov 27 16:20:58 crc kubenswrapper[4707]: I1127 16:20:58.251922 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/af1b5c91-e184-49fe-9ad9-83f047d5123d-combined-ca-bundle\") pod \"keystone-85c75486d4-pmkdv\" (UID: \"af1b5c91-e184-49fe-9ad9-83f047d5123d\") " pod="openstack/keystone-85c75486d4-pmkdv"
Nov 27 16:20:58 crc kubenswrapper[4707]: I1127 16:20:58.251995 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4hx4j\" (UniqueName: \"kubernetes.io/projected/d1df02f7-1e71-4fae-a6df-cb3c83460a7e-kube-api-access-4hx4j\") pod \"barbican-keystone-listener-6c948b485d-2wcq8\" (UID: \"d1df02f7-1e71-4fae-a6df-cb3c83460a7e\") " pod="openstack/barbican-keystone-listener-6c948b485d-2wcq8"
Nov 27 16:20:58 crc kubenswrapper[4707]: I1127 16:20:58.252042 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v8x65\" (UniqueName: \"kubernetes.io/projected/af1b5c91-e184-49fe-9ad9-83f047d5123d-kube-api-access-v8x65\") pod \"keystone-85c75486d4-pmkdv\" (UID: \"af1b5c91-e184-49fe-9ad9-83f047d5123d\") " pod="openstack/keystone-85c75486d4-pmkdv"
Nov 27 16:20:58 crc kubenswrapper[4707]: I1127 16:20:58.252114 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/af1b5c91-e184-49fe-9ad9-83f047d5123d-internal-tls-certs\") pod \"keystone-85c75486d4-pmkdv\" (UID: \"af1b5c91-e184-49fe-9ad9-83f047d5123d\") " pod="openstack/keystone-85c75486d4-pmkdv"
Nov 27 16:20:58 crc kubenswrapper[4707]: I1127 16:20:58.252225 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e1955b85-6ed8-492c-8001-c4fc20da8270-combined-ca-bundle\") pod \"barbican-worker-567875797-q64rz\" (UID: \"e1955b85-6ed8-492c-8001-c4fc20da8270\") " pod="openstack/barbican-worker-567875797-q64rz"
Nov 27 16:20:58 crc kubenswrapper[4707]: I1127 16:20:58.252300 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d1df02f7-1e71-4fae-a6df-cb3c83460a7e-logs\") pod \"barbican-keystone-listener-6c948b485d-2wcq8\" (UID: \"d1df02f7-1e71-4fae-a6df-cb3c83460a7e\") " pod="openstack/barbican-keystone-listener-6c948b485d-2wcq8"
Nov 27 16:20:58 crc kubenswrapper[4707]: I1127 16:20:58.252336 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d1df02f7-1e71-4fae-a6df-cb3c83460a7e-config-data-custom\") pod \"barbican-keystone-listener-6c948b485d-2wcq8\" (UID: \"d1df02f7-1e71-4fae-a6df-cb3c83460a7e\") " pod="openstack/barbican-keystone-listener-6c948b485d-2wcq8"
Nov 27 16:20:58 crc kubenswrapper[4707]: I1127 16:20:58.252385 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/af1b5c91-e184-49fe-9ad9-83f047d5123d-scripts\") pod \"keystone-85c75486d4-pmkdv\" (UID: \"af1b5c91-e184-49fe-9ad9-83f047d5123d\") " pod="openstack/keystone-85c75486d4-pmkdv"
Nov 27 16:20:58 crc kubenswrapper[4707]: I1127 16:20:58.258939 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/af1b5c91-e184-49fe-9ad9-83f047d5123d-credential-keys\") pod \"keystone-85c75486d4-pmkdv\" (UID: \"af1b5c91-e184-49fe-9ad9-83f047d5123d\") " pod="openstack/keystone-85c75486d4-pmkdv"
Nov 27 16:20:58 crc kubenswrapper[4707]: I1127 16:20:58.262880 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/af1b5c91-e184-49fe-9ad9-83f047d5123d-public-tls-certs\") pod \"keystone-85c75486d4-pmkdv\" (UID: \"af1b5c91-e184-49fe-9ad9-83f047d5123d\") " pod="openstack/keystone-85c75486d4-pmkdv"
Nov 27 16:20:58 crc kubenswrapper[4707]: I1127 16:20:58.268477 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/af1b5c91-e184-49fe-9ad9-83f047d5123d-config-data\") pod \"keystone-85c75486d4-pmkdv\" (UID: \"af1b5c91-e184-49fe-9ad9-83f047d5123d\") " pod="openstack/keystone-85c75486d4-pmkdv"
Nov 27 16:20:58 crc kubenswrapper[4707]: I1127 16:20:58.268866 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/af1b5c91-e184-49fe-9ad9-83f047d5123d-scripts\") pod \"keystone-85c75486d4-pmkdv\" (UID: \"af1b5c91-e184-49fe-9ad9-83f047d5123d\") " pod="openstack/keystone-85c75486d4-pmkdv"
Nov 27 16:20:58 crc kubenswrapper[4707]: I1127 16:20:58.279089 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/af1b5c91-e184-49fe-9ad9-83f047d5123d-internal-tls-certs\") pod \"keystone-85c75486d4-pmkdv\" (UID: \"af1b5c91-e184-49fe-9ad9-83f047d5123d\") " pod="openstack/keystone-85c75486d4-pmkdv"
Nov 27 16:20:58 crc kubenswrapper[4707]: I1127 16:20:58.307294 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/af1b5c91-e184-49fe-9ad9-83f047d5123d-fernet-keys\") pod \"keystone-85c75486d4-pmkdv\" (UID: \"af1b5c91-e184-49fe-9ad9-83f047d5123d\") " pod="openstack/keystone-85c75486d4-pmkdv"
Nov 27 16:20:58 crc kubenswrapper[4707]: I1127 16:20:58.307459 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/af1b5c91-e184-49fe-9ad9-83f047d5123d-combined-ca-bundle\") pod \"keystone-85c75486d4-pmkdv\" (UID: \"af1b5c91-e184-49fe-9ad9-83f047d5123d\") " pod="openstack/keystone-85c75486d4-pmkdv"
Nov 27 16:20:58 crc kubenswrapper[4707]: I1127 16:20:58.311239 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-6c948b485d-2wcq8"]
Nov 27 16:20:58 crc kubenswrapper[4707]: I1127 16:20:58.316002 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v8x65\" (UniqueName: \"kubernetes.io/projected/af1b5c91-e184-49fe-9ad9-83f047d5123d-kube-api-access-v8x65\") pod \"keystone-85c75486d4-pmkdv\" (UID: \"af1b5c91-e184-49fe-9ad9-83f047d5123d\") " pod="openstack/keystone-85c75486d4-pmkdv"
Nov 27 16:20:58 crc kubenswrapper[4707]: I1127 16:20:58.351569 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-55f844cf75-tnnrf"]
Nov 27 16:20:58 crc kubenswrapper[4707]: I1127 16:20:58.351832 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-55f844cf75-tnnrf" podUID="0f3d64e0-6a2f-49d8-9f92-9ea645023db5" containerName="dnsmasq-dns" containerID="cri-o://e3ee6089f2545043c5b9019c8751c5b067980cc5243bc0f7e5c9b3dfcc5bab80" gracePeriod=10
Nov 27 16:20:58 crc kubenswrapper[4707]: I1127 16:20:58.352540 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-55f844cf75-tnnrf"
Nov 27 16:20:58 crc kubenswrapper[4707]: I1127 16:20:58.353795 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e1955b85-6ed8-492c-8001-c4fc20da8270-combined-ca-bundle\") pod \"barbican-worker-567875797-q64rz\" (UID: \"e1955b85-6ed8-492c-8001-c4fc20da8270\") " pod="openstack/barbican-worker-567875797-q64rz"
Nov 27 16:20:58 crc kubenswrapper[4707]: I1127 16:20:58.353833 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d1df02f7-1e71-4fae-a6df-cb3c83460a7e-logs\") pod \"barbican-keystone-listener-6c948b485d-2wcq8\" (UID: \"d1df02f7-1e71-4fae-a6df-cb3c83460a7e\") " pod="openstack/barbican-keystone-listener-6c948b485d-2wcq8"
Nov 27 16:20:58 crc kubenswrapper[4707]: I1127 16:20:58.353852 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d1df02f7-1e71-4fae-a6df-cb3c83460a7e-config-data-custom\") pod \"barbican-keystone-listener-6c948b485d-2wcq8\" (UID: \"d1df02f7-1e71-4fae-a6df-cb3c83460a7e\") " pod="openstack/barbican-keystone-listener-6c948b485d-2wcq8"
Nov 27 16:20:58 crc kubenswrapper[4707]: I1127 16:20:58.353887 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d1df02f7-1e71-4fae-a6df-cb3c83460a7e-config-data\") pod \"barbican-keystone-listener-6c948b485d-2wcq8\" (UID: \"d1df02f7-1e71-4fae-a6df-cb3c83460a7e\") " pod="openstack/barbican-keystone-listener-6c948b485d-2wcq8"
Nov 27 16:20:58 crc kubenswrapper[4707]: I1127 16:20:58.353916 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rfcqt\" (UniqueName: \"kubernetes.io/projected/e1955b85-6ed8-492c-8001-c4fc20da8270-kube-api-access-rfcqt\") pod \"barbican-worker-567875797-q64rz\" (UID: \"e1955b85-6ed8-492c-8001-c4fc20da8270\") " pod="openstack/barbican-worker-567875797-q64rz"
Nov 27 16:20:58 crc kubenswrapper[4707]: I1127 16:20:58.353934 4707 reconciler_common.go:218]
"operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e1955b85-6ed8-492c-8001-c4fc20da8270-logs\") pod \"barbican-worker-567875797-q64rz\" (UID: \"e1955b85-6ed8-492c-8001-c4fc20da8270\") " pod="openstack/barbican-worker-567875797-q64rz" Nov 27 16:20:58 crc kubenswrapper[4707]: I1127 16:20:58.353969 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e1955b85-6ed8-492c-8001-c4fc20da8270-config-data\") pod \"barbican-worker-567875797-q64rz\" (UID: \"e1955b85-6ed8-492c-8001-c4fc20da8270\") " pod="openstack/barbican-worker-567875797-q64rz" Nov 27 16:20:58 crc kubenswrapper[4707]: I1127 16:20:58.353989 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e1955b85-6ed8-492c-8001-c4fc20da8270-config-data-custom\") pod \"barbican-worker-567875797-q64rz\" (UID: \"e1955b85-6ed8-492c-8001-c4fc20da8270\") " pod="openstack/barbican-worker-567875797-q64rz" Nov 27 16:20:58 crc kubenswrapper[4707]: I1127 16:20:58.354014 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d1df02f7-1e71-4fae-a6df-cb3c83460a7e-combined-ca-bundle\") pod \"barbican-keystone-listener-6c948b485d-2wcq8\" (UID: \"d1df02f7-1e71-4fae-a6df-cb3c83460a7e\") " pod="openstack/barbican-keystone-listener-6c948b485d-2wcq8" Nov 27 16:20:58 crc kubenswrapper[4707]: I1127 16:20:58.354037 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4hx4j\" (UniqueName: \"kubernetes.io/projected/d1df02f7-1e71-4fae-a6df-cb3c83460a7e-kube-api-access-4hx4j\") pod \"barbican-keystone-listener-6c948b485d-2wcq8\" (UID: \"d1df02f7-1e71-4fae-a6df-cb3c83460a7e\") " pod="openstack/barbican-keystone-listener-6c948b485d-2wcq8" Nov 27 16:20:58 crc kubenswrapper[4707]: I1127 16:20:58.354675 4707 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e1955b85-6ed8-492c-8001-c4fc20da8270-logs\") pod \"barbican-worker-567875797-q64rz\" (UID: \"e1955b85-6ed8-492c-8001-c4fc20da8270\") " pod="openstack/barbican-worker-567875797-q64rz" Nov 27 16:20:58 crc kubenswrapper[4707]: I1127 16:20:58.354684 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d1df02f7-1e71-4fae-a6df-cb3c83460a7e-logs\") pod \"barbican-keystone-listener-6c948b485d-2wcq8\" (UID: \"d1df02f7-1e71-4fae-a6df-cb3c83460a7e\") " pod="openstack/barbican-keystone-listener-6c948b485d-2wcq8" Nov 27 16:20:58 crc kubenswrapper[4707]: I1127 16:20:58.360219 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e1955b85-6ed8-492c-8001-c4fc20da8270-combined-ca-bundle\") pod \"barbican-worker-567875797-q64rz\" (UID: \"e1955b85-6ed8-492c-8001-c4fc20da8270\") " pod="openstack/barbican-worker-567875797-q64rz" Nov 27 16:20:58 crc kubenswrapper[4707]: I1127 16:20:58.360680 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d1df02f7-1e71-4fae-a6df-cb3c83460a7e-config-data\") pod \"barbican-keystone-listener-6c948b485d-2wcq8\" (UID: \"d1df02f7-1e71-4fae-a6df-cb3c83460a7e\") " pod="openstack/barbican-keystone-listener-6c948b485d-2wcq8" Nov 27 16:20:58 crc kubenswrapper[4707]: I1127 16:20:58.360710 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d1df02f7-1e71-4fae-a6df-cb3c83460a7e-combined-ca-bundle\") pod \"barbican-keystone-listener-6c948b485d-2wcq8\" (UID: \"d1df02f7-1e71-4fae-a6df-cb3c83460a7e\") " pod="openstack/barbican-keystone-listener-6c948b485d-2wcq8" Nov 27 16:20:58 crc kubenswrapper[4707]: I1127 16:20:58.361008 4707 util.go:30] "No sandbox for pod 
can be found. Need to start a new one" pod="openstack/keystone-85c75486d4-pmkdv" Nov 27 16:20:58 crc kubenswrapper[4707]: I1127 16:20:58.361096 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e1955b85-6ed8-492c-8001-c4fc20da8270-config-data-custom\") pod \"barbican-worker-567875797-q64rz\" (UID: \"e1955b85-6ed8-492c-8001-c4fc20da8270\") " pod="openstack/barbican-worker-567875797-q64rz" Nov 27 16:20:58 crc kubenswrapper[4707]: I1127 16:20:58.363155 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d1df02f7-1e71-4fae-a6df-cb3c83460a7e-config-data-custom\") pod \"barbican-keystone-listener-6c948b485d-2wcq8\" (UID: \"d1df02f7-1e71-4fae-a6df-cb3c83460a7e\") " pod="openstack/barbican-keystone-listener-6c948b485d-2wcq8" Nov 27 16:20:58 crc kubenswrapper[4707]: I1127 16:20:58.369531 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e1955b85-6ed8-492c-8001-c4fc20da8270-config-data\") pod \"barbican-worker-567875797-q64rz\" (UID: \"e1955b85-6ed8-492c-8001-c4fc20da8270\") " pod="openstack/barbican-worker-567875797-q64rz" Nov 27 16:20:58 crc kubenswrapper[4707]: I1127 16:20:58.372812 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4hx4j\" (UniqueName: \"kubernetes.io/projected/d1df02f7-1e71-4fae-a6df-cb3c83460a7e-kube-api-access-4hx4j\") pod \"barbican-keystone-listener-6c948b485d-2wcq8\" (UID: \"d1df02f7-1e71-4fae-a6df-cb3c83460a7e\") " pod="openstack/barbican-keystone-listener-6c948b485d-2wcq8" Nov 27 16:20:58 crc kubenswrapper[4707]: I1127 16:20:58.383625 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-5c99d8d4cb-fc8zk"] Nov 27 16:20:58 crc kubenswrapper[4707]: I1127 16:20:58.385192 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-5c99d8d4cb-fc8zk" Nov 27 16:20:58 crc kubenswrapper[4707]: I1127 16:20:58.388044 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-api-config-data" Nov 27 16:20:58 crc kubenswrapper[4707]: I1127 16:20:58.389117 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rfcqt\" (UniqueName: \"kubernetes.io/projected/e1955b85-6ed8-492c-8001-c4fc20da8270-kube-api-access-rfcqt\") pod \"barbican-worker-567875797-q64rz\" (UID: \"e1955b85-6ed8-492c-8001-c4fc20da8270\") " pod="openstack/barbican-worker-567875797-q64rz" Nov 27 16:20:58 crc kubenswrapper[4707]: I1127 16:20:58.395985 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-85ff748b95-zhndc"] Nov 27 16:20:58 crc kubenswrapper[4707]: I1127 16:20:58.397548 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-85ff748b95-zhndc" Nov 27 16:20:58 crc kubenswrapper[4707]: I1127 16:20:58.406596 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-85ff748b95-zhndc"] Nov 27 16:20:58 crc kubenswrapper[4707]: I1127 16:20:58.413582 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-5c99d8d4cb-fc8zk"] Nov 27 16:20:58 crc kubenswrapper[4707]: I1127 16:20:58.455842 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6c94a5dc-169c-47ba-a007-307336246c92-ovsdbserver-nb\") pod \"dnsmasq-dns-85ff748b95-zhndc\" (UID: \"6c94a5dc-169c-47ba-a007-307336246c92\") " pod="openstack/dnsmasq-dns-85ff748b95-zhndc" Nov 27 16:20:58 crc kubenswrapper[4707]: I1127 16:20:58.455891 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pzw4x\" (UniqueName: 
\"kubernetes.io/projected/798010e9-c58d-45c7-a41c-d9ff9693d662-kube-api-access-pzw4x\") pod \"barbican-api-5c99d8d4cb-fc8zk\" (UID: \"798010e9-c58d-45c7-a41c-d9ff9693d662\") " pod="openstack/barbican-api-5c99d8d4cb-fc8zk" Nov 27 16:20:58 crc kubenswrapper[4707]: I1127 16:20:58.455918 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/798010e9-c58d-45c7-a41c-d9ff9693d662-logs\") pod \"barbican-api-5c99d8d4cb-fc8zk\" (UID: \"798010e9-c58d-45c7-a41c-d9ff9693d662\") " pod="openstack/barbican-api-5c99d8d4cb-fc8zk" Nov 27 16:20:58 crc kubenswrapper[4707]: I1127 16:20:58.455941 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6c94a5dc-169c-47ba-a007-307336246c92-config\") pod \"dnsmasq-dns-85ff748b95-zhndc\" (UID: \"6c94a5dc-169c-47ba-a007-307336246c92\") " pod="openstack/dnsmasq-dns-85ff748b95-zhndc" Nov 27 16:20:58 crc kubenswrapper[4707]: I1127 16:20:58.455982 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6c94a5dc-169c-47ba-a007-307336246c92-ovsdbserver-sb\") pod \"dnsmasq-dns-85ff748b95-zhndc\" (UID: \"6c94a5dc-169c-47ba-a007-307336246c92\") " pod="openstack/dnsmasq-dns-85ff748b95-zhndc" Nov 27 16:20:58 crc kubenswrapper[4707]: I1127 16:20:58.456003 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/798010e9-c58d-45c7-a41c-d9ff9693d662-config-data\") pod \"barbican-api-5c99d8d4cb-fc8zk\" (UID: \"798010e9-c58d-45c7-a41c-d9ff9693d662\") " pod="openstack/barbican-api-5c99d8d4cb-fc8zk" Nov 27 16:20:58 crc kubenswrapper[4707]: I1127 16:20:58.456034 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" 
(UniqueName: \"kubernetes.io/configmap/6c94a5dc-169c-47ba-a007-307336246c92-dns-swift-storage-0\") pod \"dnsmasq-dns-85ff748b95-zhndc\" (UID: \"6c94a5dc-169c-47ba-a007-307336246c92\") " pod="openstack/dnsmasq-dns-85ff748b95-zhndc" Nov 27 16:20:58 crc kubenswrapper[4707]: I1127 16:20:58.456089 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6c94a5dc-169c-47ba-a007-307336246c92-dns-svc\") pod \"dnsmasq-dns-85ff748b95-zhndc\" (UID: \"6c94a5dc-169c-47ba-a007-307336246c92\") " pod="openstack/dnsmasq-dns-85ff748b95-zhndc" Nov 27 16:20:58 crc kubenswrapper[4707]: I1127 16:20:58.456114 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-799q4\" (UniqueName: \"kubernetes.io/projected/6c94a5dc-169c-47ba-a007-307336246c92-kube-api-access-799q4\") pod \"dnsmasq-dns-85ff748b95-zhndc\" (UID: \"6c94a5dc-169c-47ba-a007-307336246c92\") " pod="openstack/dnsmasq-dns-85ff748b95-zhndc" Nov 27 16:20:58 crc kubenswrapper[4707]: I1127 16:20:58.456144 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/798010e9-c58d-45c7-a41c-d9ff9693d662-config-data-custom\") pod \"barbican-api-5c99d8d4cb-fc8zk\" (UID: \"798010e9-c58d-45c7-a41c-d9ff9693d662\") " pod="openstack/barbican-api-5c99d8d4cb-fc8zk" Nov 27 16:20:58 crc kubenswrapper[4707]: I1127 16:20:58.456184 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/798010e9-c58d-45c7-a41c-d9ff9693d662-combined-ca-bundle\") pod \"barbican-api-5c99d8d4cb-fc8zk\" (UID: \"798010e9-c58d-45c7-a41c-d9ff9693d662\") " pod="openstack/barbican-api-5c99d8d4cb-fc8zk" Nov 27 16:20:58 crc kubenswrapper[4707]: I1127 16:20:58.469912 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-worker-567875797-q64rz" Nov 27 16:20:58 crc kubenswrapper[4707]: I1127 16:20:58.555400 4707 generic.go:334] "Generic (PLEG): container finished" podID="0f3d64e0-6a2f-49d8-9f92-9ea645023db5" containerID="e3ee6089f2545043c5b9019c8751c5b067980cc5243bc0f7e5c9b3dfcc5bab80" exitCode=0 Nov 27 16:20:58 crc kubenswrapper[4707]: I1127 16:20:58.555440 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55f844cf75-tnnrf" event={"ID":"0f3d64e0-6a2f-49d8-9f92-9ea645023db5","Type":"ContainerDied","Data":"e3ee6089f2545043c5b9019c8751c5b067980cc5243bc0f7e5c9b3dfcc5bab80"} Nov 27 16:20:58 crc kubenswrapper[4707]: I1127 16:20:58.557683 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6c94a5dc-169c-47ba-a007-307336246c92-ovsdbserver-sb\") pod \"dnsmasq-dns-85ff748b95-zhndc\" (UID: \"6c94a5dc-169c-47ba-a007-307336246c92\") " pod="openstack/dnsmasq-dns-85ff748b95-zhndc" Nov 27 16:20:58 crc kubenswrapper[4707]: I1127 16:20:58.557728 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/798010e9-c58d-45c7-a41c-d9ff9693d662-config-data\") pod \"barbican-api-5c99d8d4cb-fc8zk\" (UID: \"798010e9-c58d-45c7-a41c-d9ff9693d662\") " pod="openstack/barbican-api-5c99d8d4cb-fc8zk" Nov 27 16:20:58 crc kubenswrapper[4707]: I1127 16:20:58.557772 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6c94a5dc-169c-47ba-a007-307336246c92-dns-swift-storage-0\") pod \"dnsmasq-dns-85ff748b95-zhndc\" (UID: \"6c94a5dc-169c-47ba-a007-307336246c92\") " pod="openstack/dnsmasq-dns-85ff748b95-zhndc" Nov 27 16:20:58 crc kubenswrapper[4707]: I1127 16:20:58.557793 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/6c94a5dc-169c-47ba-a007-307336246c92-dns-svc\") pod \"dnsmasq-dns-85ff748b95-zhndc\" (UID: \"6c94a5dc-169c-47ba-a007-307336246c92\") " pod="openstack/dnsmasq-dns-85ff748b95-zhndc" Nov 27 16:20:58 crc kubenswrapper[4707]: I1127 16:20:58.557816 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-799q4\" (UniqueName: \"kubernetes.io/projected/6c94a5dc-169c-47ba-a007-307336246c92-kube-api-access-799q4\") pod \"dnsmasq-dns-85ff748b95-zhndc\" (UID: \"6c94a5dc-169c-47ba-a007-307336246c92\") " pod="openstack/dnsmasq-dns-85ff748b95-zhndc" Nov 27 16:20:58 crc kubenswrapper[4707]: I1127 16:20:58.557847 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/798010e9-c58d-45c7-a41c-d9ff9693d662-config-data-custom\") pod \"barbican-api-5c99d8d4cb-fc8zk\" (UID: \"798010e9-c58d-45c7-a41c-d9ff9693d662\") " pod="openstack/barbican-api-5c99d8d4cb-fc8zk" Nov 27 16:20:58 crc kubenswrapper[4707]: I1127 16:20:58.557890 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/798010e9-c58d-45c7-a41c-d9ff9693d662-combined-ca-bundle\") pod \"barbican-api-5c99d8d4cb-fc8zk\" (UID: \"798010e9-c58d-45c7-a41c-d9ff9693d662\") " pod="openstack/barbican-api-5c99d8d4cb-fc8zk" Nov 27 16:20:58 crc kubenswrapper[4707]: I1127 16:20:58.557919 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6c94a5dc-169c-47ba-a007-307336246c92-ovsdbserver-nb\") pod \"dnsmasq-dns-85ff748b95-zhndc\" (UID: \"6c94a5dc-169c-47ba-a007-307336246c92\") " pod="openstack/dnsmasq-dns-85ff748b95-zhndc" Nov 27 16:20:58 crc kubenswrapper[4707]: I1127 16:20:58.557944 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pzw4x\" (UniqueName: 
\"kubernetes.io/projected/798010e9-c58d-45c7-a41c-d9ff9693d662-kube-api-access-pzw4x\") pod \"barbican-api-5c99d8d4cb-fc8zk\" (UID: \"798010e9-c58d-45c7-a41c-d9ff9693d662\") " pod="openstack/barbican-api-5c99d8d4cb-fc8zk" Nov 27 16:20:58 crc kubenswrapper[4707]: I1127 16:20:58.557967 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/798010e9-c58d-45c7-a41c-d9ff9693d662-logs\") pod \"barbican-api-5c99d8d4cb-fc8zk\" (UID: \"798010e9-c58d-45c7-a41c-d9ff9693d662\") " pod="openstack/barbican-api-5c99d8d4cb-fc8zk" Nov 27 16:20:58 crc kubenswrapper[4707]: I1127 16:20:58.557996 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6c94a5dc-169c-47ba-a007-307336246c92-config\") pod \"dnsmasq-dns-85ff748b95-zhndc\" (UID: \"6c94a5dc-169c-47ba-a007-307336246c92\") " pod="openstack/dnsmasq-dns-85ff748b95-zhndc" Nov 27 16:20:58 crc kubenswrapper[4707]: I1127 16:20:58.559065 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6c94a5dc-169c-47ba-a007-307336246c92-config\") pod \"dnsmasq-dns-85ff748b95-zhndc\" (UID: \"6c94a5dc-169c-47ba-a007-307336246c92\") " pod="openstack/dnsmasq-dns-85ff748b95-zhndc" Nov 27 16:20:58 crc kubenswrapper[4707]: I1127 16:20:58.559190 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6c94a5dc-169c-47ba-a007-307336246c92-ovsdbserver-nb\") pod \"dnsmasq-dns-85ff748b95-zhndc\" (UID: \"6c94a5dc-169c-47ba-a007-307336246c92\") " pod="openstack/dnsmasq-dns-85ff748b95-zhndc" Nov 27 16:20:58 crc kubenswrapper[4707]: I1127 16:20:58.559647 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6c94a5dc-169c-47ba-a007-307336246c92-dns-swift-storage-0\") pod 
\"dnsmasq-dns-85ff748b95-zhndc\" (UID: \"6c94a5dc-169c-47ba-a007-307336246c92\") " pod="openstack/dnsmasq-dns-85ff748b95-zhndc" Nov 27 16:20:58 crc kubenswrapper[4707]: I1127 16:20:58.560071 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6c94a5dc-169c-47ba-a007-307336246c92-dns-svc\") pod \"dnsmasq-dns-85ff748b95-zhndc\" (UID: \"6c94a5dc-169c-47ba-a007-307336246c92\") " pod="openstack/dnsmasq-dns-85ff748b95-zhndc" Nov 27 16:20:58 crc kubenswrapper[4707]: I1127 16:20:58.561786 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/798010e9-c58d-45c7-a41c-d9ff9693d662-logs\") pod \"barbican-api-5c99d8d4cb-fc8zk\" (UID: \"798010e9-c58d-45c7-a41c-d9ff9693d662\") " pod="openstack/barbican-api-5c99d8d4cb-fc8zk" Nov 27 16:20:58 crc kubenswrapper[4707]: I1127 16:20:58.561859 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6c94a5dc-169c-47ba-a007-307336246c92-ovsdbserver-sb\") pod \"dnsmasq-dns-85ff748b95-zhndc\" (UID: \"6c94a5dc-169c-47ba-a007-307336246c92\") " pod="openstack/dnsmasq-dns-85ff748b95-zhndc" Nov 27 16:20:58 crc kubenswrapper[4707]: I1127 16:20:58.562985 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-keystone-listener-6c948b485d-2wcq8" Nov 27 16:20:58 crc kubenswrapper[4707]: I1127 16:20:58.565955 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/798010e9-c58d-45c7-a41c-d9ff9693d662-config-data-custom\") pod \"barbican-api-5c99d8d4cb-fc8zk\" (UID: \"798010e9-c58d-45c7-a41c-d9ff9693d662\") " pod="openstack/barbican-api-5c99d8d4cb-fc8zk" Nov 27 16:20:58 crc kubenswrapper[4707]: I1127 16:20:58.570772 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/798010e9-c58d-45c7-a41c-d9ff9693d662-combined-ca-bundle\") pod \"barbican-api-5c99d8d4cb-fc8zk\" (UID: \"798010e9-c58d-45c7-a41c-d9ff9693d662\") " pod="openstack/barbican-api-5c99d8d4cb-fc8zk" Nov 27 16:20:58 crc kubenswrapper[4707]: I1127 16:20:58.572457 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/798010e9-c58d-45c7-a41c-d9ff9693d662-config-data\") pod \"barbican-api-5c99d8d4cb-fc8zk\" (UID: \"798010e9-c58d-45c7-a41c-d9ff9693d662\") " pod="openstack/barbican-api-5c99d8d4cb-fc8zk" Nov 27 16:20:58 crc kubenswrapper[4707]: I1127 16:20:58.574169 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pzw4x\" (UniqueName: \"kubernetes.io/projected/798010e9-c58d-45c7-a41c-d9ff9693d662-kube-api-access-pzw4x\") pod \"barbican-api-5c99d8d4cb-fc8zk\" (UID: \"798010e9-c58d-45c7-a41c-d9ff9693d662\") " pod="openstack/barbican-api-5c99d8d4cb-fc8zk" Nov 27 16:20:58 crc kubenswrapper[4707]: I1127 16:20:58.575581 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-799q4\" (UniqueName: \"kubernetes.io/projected/6c94a5dc-169c-47ba-a007-307336246c92-kube-api-access-799q4\") pod \"dnsmasq-dns-85ff748b95-zhndc\" (UID: \"6c94a5dc-169c-47ba-a007-307336246c92\") " 
pod="openstack/dnsmasq-dns-85ff748b95-zhndc" Nov 27 16:20:58 crc kubenswrapper[4707]: I1127 16:20:58.831538 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Nov 27 16:20:58 crc kubenswrapper[4707]: I1127 16:20:58.832050 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-5c99d8d4cb-fc8zk" Nov 27 16:20:58 crc kubenswrapper[4707]: I1127 16:20:58.832608 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Nov 27 16:20:58 crc kubenswrapper[4707]: I1127 16:20:58.839491 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-85ff748b95-zhndc" Nov 27 16:20:58 crc kubenswrapper[4707]: I1127 16:20:58.873640 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Nov 27 16:20:58 crc kubenswrapper[4707]: I1127 16:20:58.877881 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Nov 27 16:20:58 crc kubenswrapper[4707]: I1127 16:20:58.877911 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Nov 27 16:20:58 crc kubenswrapper[4707]: I1127 16:20:58.882290 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Nov 27 16:20:58 crc kubenswrapper[4707]: I1127 16:20:58.920076 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Nov 27 16:20:58 crc kubenswrapper[4707]: I1127 16:20:58.933478 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Nov 27 16:20:59 crc kubenswrapper[4707]: I1127 16:20:59.565061 4707 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Nov 27 16:20:59 crc kubenswrapper[4707]: I1127 16:20:59.565121 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Nov 27 16:20:59 crc kubenswrapper[4707]: I1127 16:20:59.565137 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Nov 27 16:20:59 crc kubenswrapper[4707]: I1127 16:20:59.565148 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Nov 27 16:21:00 crc kubenswrapper[4707]: I1127 16:21:00.744245 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-c695745c6-j5ntf"] Nov 27 16:21:00 crc kubenswrapper[4707]: I1127 16:21:00.748027 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-c695745c6-j5ntf" Nov 27 16:21:00 crc kubenswrapper[4707]: I1127 16:21:00.788039 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-internal-svc" Nov 27 16:21:00 crc kubenswrapper[4707]: I1127 16:21:00.788462 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-public-svc" Nov 27 16:21:00 crc kubenswrapper[4707]: I1127 16:21:00.797263 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-c695745c6-j5ntf"] Nov 27 16:21:00 crc kubenswrapper[4707]: I1127 16:21:00.907470 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1050304d-e51e-4b02-9cec-828bb7d406bf-config-data-custom\") pod \"barbican-api-c695745c6-j5ntf\" (UID: \"1050304d-e51e-4b02-9cec-828bb7d406bf\") " pod="openstack/barbican-api-c695745c6-j5ntf" Nov 27 16:21:00 crc kubenswrapper[4707]: I1127 16:21:00.907534 4707 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1050304d-e51e-4b02-9cec-828bb7d406bf-logs\") pod \"barbican-api-c695745c6-j5ntf\" (UID: \"1050304d-e51e-4b02-9cec-828bb7d406bf\") " pod="openstack/barbican-api-c695745c6-j5ntf" Nov 27 16:21:00 crc kubenswrapper[4707]: I1127 16:21:00.907558 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1050304d-e51e-4b02-9cec-828bb7d406bf-public-tls-certs\") pod \"barbican-api-c695745c6-j5ntf\" (UID: \"1050304d-e51e-4b02-9cec-828bb7d406bf\") " pod="openstack/barbican-api-c695745c6-j5ntf" Nov 27 16:21:00 crc kubenswrapper[4707]: I1127 16:21:00.907725 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1050304d-e51e-4b02-9cec-828bb7d406bf-config-data\") pod \"barbican-api-c695745c6-j5ntf\" (UID: \"1050304d-e51e-4b02-9cec-828bb7d406bf\") " pod="openstack/barbican-api-c695745c6-j5ntf" Nov 27 16:21:00 crc kubenswrapper[4707]: I1127 16:21:00.908005 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1050304d-e51e-4b02-9cec-828bb7d406bf-internal-tls-certs\") pod \"barbican-api-c695745c6-j5ntf\" (UID: \"1050304d-e51e-4b02-9cec-828bb7d406bf\") " pod="openstack/barbican-api-c695745c6-j5ntf" Nov 27 16:21:00 crc kubenswrapper[4707]: I1127 16:21:00.908181 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v58jm\" (UniqueName: \"kubernetes.io/projected/1050304d-e51e-4b02-9cec-828bb7d406bf-kube-api-access-v58jm\") pod \"barbican-api-c695745c6-j5ntf\" (UID: \"1050304d-e51e-4b02-9cec-828bb7d406bf\") " pod="openstack/barbican-api-c695745c6-j5ntf" Nov 27 16:21:00 crc kubenswrapper[4707]: I1127 16:21:00.908318 
4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1050304d-e51e-4b02-9cec-828bb7d406bf-combined-ca-bundle\") pod \"barbican-api-c695745c6-j5ntf\" (UID: \"1050304d-e51e-4b02-9cec-828bb7d406bf\") " pod="openstack/barbican-api-c695745c6-j5ntf" Nov 27 16:21:01 crc kubenswrapper[4707]: I1127 16:21:01.010103 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1050304d-e51e-4b02-9cec-828bb7d406bf-config-data\") pod \"barbican-api-c695745c6-j5ntf\" (UID: \"1050304d-e51e-4b02-9cec-828bb7d406bf\") " pod="openstack/barbican-api-c695745c6-j5ntf" Nov 27 16:21:01 crc kubenswrapper[4707]: I1127 16:21:01.010242 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1050304d-e51e-4b02-9cec-828bb7d406bf-internal-tls-certs\") pod \"barbican-api-c695745c6-j5ntf\" (UID: \"1050304d-e51e-4b02-9cec-828bb7d406bf\") " pod="openstack/barbican-api-c695745c6-j5ntf" Nov 27 16:21:01 crc kubenswrapper[4707]: I1127 16:21:01.010327 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v58jm\" (UniqueName: \"kubernetes.io/projected/1050304d-e51e-4b02-9cec-828bb7d406bf-kube-api-access-v58jm\") pod \"barbican-api-c695745c6-j5ntf\" (UID: \"1050304d-e51e-4b02-9cec-828bb7d406bf\") " pod="openstack/barbican-api-c695745c6-j5ntf" Nov 27 16:21:01 crc kubenswrapper[4707]: I1127 16:21:01.010418 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1050304d-e51e-4b02-9cec-828bb7d406bf-combined-ca-bundle\") pod \"barbican-api-c695745c6-j5ntf\" (UID: \"1050304d-e51e-4b02-9cec-828bb7d406bf\") " pod="openstack/barbican-api-c695745c6-j5ntf" Nov 27 16:21:01 crc kubenswrapper[4707]: I1127 16:21:01.010523 4707 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1050304d-e51e-4b02-9cec-828bb7d406bf-config-data-custom\") pod \"barbican-api-c695745c6-j5ntf\" (UID: \"1050304d-e51e-4b02-9cec-828bb7d406bf\") " pod="openstack/barbican-api-c695745c6-j5ntf" Nov 27 16:21:01 crc kubenswrapper[4707]: I1127 16:21:01.010585 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1050304d-e51e-4b02-9cec-828bb7d406bf-logs\") pod \"barbican-api-c695745c6-j5ntf\" (UID: \"1050304d-e51e-4b02-9cec-828bb7d406bf\") " pod="openstack/barbican-api-c695745c6-j5ntf" Nov 27 16:21:01 crc kubenswrapper[4707]: I1127 16:21:01.010623 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1050304d-e51e-4b02-9cec-828bb7d406bf-public-tls-certs\") pod \"barbican-api-c695745c6-j5ntf\" (UID: \"1050304d-e51e-4b02-9cec-828bb7d406bf\") " pod="openstack/barbican-api-c695745c6-j5ntf" Nov 27 16:21:01 crc kubenswrapper[4707]: I1127 16:21:01.011952 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1050304d-e51e-4b02-9cec-828bb7d406bf-logs\") pod \"barbican-api-c695745c6-j5ntf\" (UID: \"1050304d-e51e-4b02-9cec-828bb7d406bf\") " pod="openstack/barbican-api-c695745c6-j5ntf" Nov 27 16:21:01 crc kubenswrapper[4707]: I1127 16:21:01.016881 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1050304d-e51e-4b02-9cec-828bb7d406bf-config-data\") pod \"barbican-api-c695745c6-j5ntf\" (UID: \"1050304d-e51e-4b02-9cec-828bb7d406bf\") " pod="openstack/barbican-api-c695745c6-j5ntf" Nov 27 16:21:01 crc kubenswrapper[4707]: I1127 16:21:01.053134 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/1050304d-e51e-4b02-9cec-828bb7d406bf-internal-tls-certs\") pod \"barbican-api-c695745c6-j5ntf\" (UID: \"1050304d-e51e-4b02-9cec-828bb7d406bf\") " pod="openstack/barbican-api-c695745c6-j5ntf" Nov 27 16:21:01 crc kubenswrapper[4707]: I1127 16:21:01.054906 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1050304d-e51e-4b02-9cec-828bb7d406bf-combined-ca-bundle\") pod \"barbican-api-c695745c6-j5ntf\" (UID: \"1050304d-e51e-4b02-9cec-828bb7d406bf\") " pod="openstack/barbican-api-c695745c6-j5ntf" Nov 27 16:21:01 crc kubenswrapper[4707]: I1127 16:21:01.055470 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1050304d-e51e-4b02-9cec-828bb7d406bf-config-data-custom\") pod \"barbican-api-c695745c6-j5ntf\" (UID: \"1050304d-e51e-4b02-9cec-828bb7d406bf\") " pod="openstack/barbican-api-c695745c6-j5ntf" Nov 27 16:21:01 crc kubenswrapper[4707]: I1127 16:21:01.059728 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1050304d-e51e-4b02-9cec-828bb7d406bf-public-tls-certs\") pod \"barbican-api-c695745c6-j5ntf\" (UID: \"1050304d-e51e-4b02-9cec-828bb7d406bf\") " pod="openstack/barbican-api-c695745c6-j5ntf" Nov 27 16:21:01 crc kubenswrapper[4707]: I1127 16:21:01.062962 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v58jm\" (UniqueName: \"kubernetes.io/projected/1050304d-e51e-4b02-9cec-828bb7d406bf-kube-api-access-v58jm\") pod \"barbican-api-c695745c6-j5ntf\" (UID: \"1050304d-e51e-4b02-9cec-828bb7d406bf\") " pod="openstack/barbican-api-c695745c6-j5ntf" Nov 27 16:21:01 crc kubenswrapper[4707]: I1127 16:21:01.108824 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-c695745c6-j5ntf" Nov 27 16:21:01 crc kubenswrapper[4707]: I1127 16:21:01.492295 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Nov 27 16:21:01 crc kubenswrapper[4707]: I1127 16:21:01.580172 4707 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Nov 27 16:21:01 crc kubenswrapper[4707]: I1127 16:21:01.611108 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Nov 27 16:21:01 crc kubenswrapper[4707]: I1127 16:21:01.611229 4707 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Nov 27 16:21:01 crc kubenswrapper[4707]: I1127 16:21:01.676018 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Nov 27 16:21:01 crc kubenswrapper[4707]: I1127 16:21:01.676408 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Nov 27 16:21:02 crc kubenswrapper[4707]: I1127 16:21:02.298390 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-55f844cf75-tnnrf" Nov 27 16:21:02 crc kubenswrapper[4707]: I1127 16:21:02.440680 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0f3d64e0-6a2f-49d8-9f92-9ea645023db5-ovsdbserver-sb\") pod \"0f3d64e0-6a2f-49d8-9f92-9ea645023db5\" (UID: \"0f3d64e0-6a2f-49d8-9f92-9ea645023db5\") " Nov 27 16:21:02 crc kubenswrapper[4707]: I1127 16:21:02.440961 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0f3d64e0-6a2f-49d8-9f92-9ea645023db5-ovsdbserver-nb\") pod \"0f3d64e0-6a2f-49d8-9f92-9ea645023db5\" (UID: \"0f3d64e0-6a2f-49d8-9f92-9ea645023db5\") " Nov 27 16:21:02 crc kubenswrapper[4707]: I1127 16:21:02.441081 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0f3d64e0-6a2f-49d8-9f92-9ea645023db5-dns-svc\") pod \"0f3d64e0-6a2f-49d8-9f92-9ea645023db5\" (UID: \"0f3d64e0-6a2f-49d8-9f92-9ea645023db5\") " Nov 27 16:21:02 crc kubenswrapper[4707]: I1127 16:21:02.441128 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4ntt5\" (UniqueName: \"kubernetes.io/projected/0f3d64e0-6a2f-49d8-9f92-9ea645023db5-kube-api-access-4ntt5\") pod \"0f3d64e0-6a2f-49d8-9f92-9ea645023db5\" (UID: \"0f3d64e0-6a2f-49d8-9f92-9ea645023db5\") " Nov 27 16:21:02 crc kubenswrapper[4707]: I1127 16:21:02.441165 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0f3d64e0-6a2f-49d8-9f92-9ea645023db5-config\") pod \"0f3d64e0-6a2f-49d8-9f92-9ea645023db5\" (UID: \"0f3d64e0-6a2f-49d8-9f92-9ea645023db5\") " Nov 27 16:21:02 crc kubenswrapper[4707]: I1127 16:21:02.441186 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" 
(UniqueName: \"kubernetes.io/configmap/0f3d64e0-6a2f-49d8-9f92-9ea645023db5-dns-swift-storage-0\") pod \"0f3d64e0-6a2f-49d8-9f92-9ea645023db5\" (UID: \"0f3d64e0-6a2f-49d8-9f92-9ea645023db5\") " Nov 27 16:21:02 crc kubenswrapper[4707]: I1127 16:21:02.447737 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-6c948b485d-2wcq8"] Nov 27 16:21:02 crc kubenswrapper[4707]: I1127 16:21:02.458886 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0f3d64e0-6a2f-49d8-9f92-9ea645023db5-kube-api-access-4ntt5" (OuterVolumeSpecName: "kube-api-access-4ntt5") pod "0f3d64e0-6a2f-49d8-9f92-9ea645023db5" (UID: "0f3d64e0-6a2f-49d8-9f92-9ea645023db5"). InnerVolumeSpecName "kube-api-access-4ntt5". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 16:21:02 crc kubenswrapper[4707]: I1127 16:21:02.546481 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4ntt5\" (UniqueName: \"kubernetes.io/projected/0f3d64e0-6a2f-49d8-9f92-9ea645023db5-kube-api-access-4ntt5\") on node \"crc\" DevicePath \"\"" Nov 27 16:21:02 crc kubenswrapper[4707]: I1127 16:21:02.603735 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55f844cf75-tnnrf" event={"ID":"0f3d64e0-6a2f-49d8-9f92-9ea645023db5","Type":"ContainerDied","Data":"06d3cd6a7e9b415fd70045baf9c296878d01e5fdc901d5c3d6afa7c94b5716cf"} Nov 27 16:21:02 crc kubenswrapper[4707]: I1127 16:21:02.603786 4707 scope.go:117] "RemoveContainer" containerID="e3ee6089f2545043c5b9019c8751c5b067980cc5243bc0f7e5c9b3dfcc5bab80" Nov 27 16:21:02 crc kubenswrapper[4707]: I1127 16:21:02.603903 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-55f844cf75-tnnrf" Nov 27 16:21:02 crc kubenswrapper[4707]: I1127 16:21:02.612219 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-6c948b485d-2wcq8" event={"ID":"d1df02f7-1e71-4fae-a6df-cb3c83460a7e","Type":"ContainerStarted","Data":"04787b09ec5954622487ad0c896fb4eed9b8dc8ce21366516c3c0fa2e490f145"} Nov 27 16:21:02 crc kubenswrapper[4707]: I1127 16:21:02.623467 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-b989k" event={"ID":"d6b92c04-fcc0-4d96-8200-3edd228dd326","Type":"ContainerStarted","Data":"c7a510a0eee74afcb33a2507a95e9e2102fe22b9967d4c6d39cdba31c6dbff65"} Nov 27 16:21:02 crc kubenswrapper[4707]: I1127 16:21:02.660585 4707 scope.go:117] "RemoveContainer" containerID="f322e48250a951f6df738ce98dad974d7f8a61257d38fd29f1d379f305ab05bd" Nov 27 16:21:02 crc kubenswrapper[4707]: I1127 16:21:02.761729 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-567875797-q64rz"] Nov 27 16:21:02 crc kubenswrapper[4707]: I1127 16:21:02.775258 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-85ff748b95-zhndc"] Nov 27 16:21:02 crc kubenswrapper[4707]: I1127 16:21:02.842517 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-69588f8b9-6plc2"] Nov 27 16:21:02 crc kubenswrapper[4707]: W1127 16:21:02.868054 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5de06db5_fa17_40e5_a08d_9b7f139b08ed.slice/crio-f41444e1f5d0e8d9115e4269294798585289ba4057506f1e8d29fba8ca271d39 WatchSource:0}: Error finding container f41444e1f5d0e8d9115e4269294798585289ba4057506f1e8d29fba8ca271d39: Status 404 returned error can't find the container with id f41444e1f5d0e8d9115e4269294798585289ba4057506f1e8d29fba8ca271d39 Nov 27 16:21:02 crc kubenswrapper[4707]: I1127 16:21:02.872418 
4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0f3d64e0-6a2f-49d8-9f92-9ea645023db5-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "0f3d64e0-6a2f-49d8-9f92-9ea645023db5" (UID: "0f3d64e0-6a2f-49d8-9f92-9ea645023db5"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 16:21:02 crc kubenswrapper[4707]: I1127 16:21:02.890257 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0f3d64e0-6a2f-49d8-9f92-9ea645023db5-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "0f3d64e0-6a2f-49d8-9f92-9ea645023db5" (UID: "0f3d64e0-6a2f-49d8-9f92-9ea645023db5"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 16:21:02 crc kubenswrapper[4707]: I1127 16:21:02.890844 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0f3d64e0-6a2f-49d8-9f92-9ea645023db5-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "0f3d64e0-6a2f-49d8-9f92-9ea645023db5" (UID: "0f3d64e0-6a2f-49d8-9f92-9ea645023db5"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 16:21:02 crc kubenswrapper[4707]: I1127 16:21:02.891042 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0f3d64e0-6a2f-49d8-9f92-9ea645023db5-config" (OuterVolumeSpecName: "config") pod "0f3d64e0-6a2f-49d8-9f92-9ea645023db5" (UID: "0f3d64e0-6a2f-49d8-9f92-9ea645023db5"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 16:21:02 crc kubenswrapper[4707]: I1127 16:21:02.895737 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0f3d64e0-6a2f-49d8-9f92-9ea645023db5-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "0f3d64e0-6a2f-49d8-9f92-9ea645023db5" (UID: "0f3d64e0-6a2f-49d8-9f92-9ea645023db5"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 16:21:02 crc kubenswrapper[4707]: I1127 16:21:02.977225 4707 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0f3d64e0-6a2f-49d8-9f92-9ea645023db5-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Nov 27 16:21:02 crc kubenswrapper[4707]: I1127 16:21:02.977247 4707 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0f3d64e0-6a2f-49d8-9f92-9ea645023db5-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Nov 27 16:21:02 crc kubenswrapper[4707]: I1127 16:21:02.977257 4707 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0f3d64e0-6a2f-49d8-9f92-9ea645023db5-dns-svc\") on node \"crc\" DevicePath \"\"" Nov 27 16:21:02 crc kubenswrapper[4707]: I1127 16:21:02.977266 4707 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0f3d64e0-6a2f-49d8-9f92-9ea645023db5-config\") on node \"crc\" DevicePath \"\"" Nov 27 16:21:02 crc kubenswrapper[4707]: I1127 16:21:02.977274 4707 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/0f3d64e0-6a2f-49d8-9f92-9ea645023db5-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Nov 27 16:21:02 crc kubenswrapper[4707]: I1127 16:21:02.994442 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-c695745c6-j5ntf"] Nov 27 16:21:03 
crc kubenswrapper[4707]: I1127 16:21:03.005661 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-55f844cf75-tnnrf"] Nov 27 16:21:03 crc kubenswrapper[4707]: I1127 16:21:03.022469 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-55f844cf75-tnnrf"] Nov 27 16:21:03 crc kubenswrapper[4707]: I1127 16:21:03.045258 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-85c75486d4-pmkdv"] Nov 27 16:21:03 crc kubenswrapper[4707]: I1127 16:21:03.125542 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-5c99d8d4cb-fc8zk"] Nov 27 16:21:03 crc kubenswrapper[4707]: I1127 16:21:03.205073 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0f3d64e0-6a2f-49d8-9f92-9ea645023db5" path="/var/lib/kubelet/pods/0f3d64e0-6a2f-49d8-9f92-9ea645023db5/volumes" Nov 27 16:21:03 crc kubenswrapper[4707]: I1127 16:21:03.635103 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-69588f8b9-6plc2" event={"ID":"5de06db5-fa17-40e5-a08d-9b7f139b08ed","Type":"ContainerStarted","Data":"4cb23d0e9ad6de48a440d48d2d2c68c0ff75c4b883cca4e11f2c7ba2f30cc673"} Nov 27 16:21:03 crc kubenswrapper[4707]: I1127 16:21:03.635893 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-69588f8b9-6plc2" event={"ID":"5de06db5-fa17-40e5-a08d-9b7f139b08ed","Type":"ContainerStarted","Data":"f41444e1f5d0e8d9115e4269294798585289ba4057506f1e8d29fba8ca271d39"} Nov 27 16:21:03 crc kubenswrapper[4707]: I1127 16:21:03.637660 4707 generic.go:334] "Generic (PLEG): container finished" podID="6c94a5dc-169c-47ba-a007-307336246c92" containerID="bc578ee4f581c31397f30fc43ed9cb1782c03b33e59d0711749432c7aa3232e8" exitCode=0 Nov 27 16:21:03 crc kubenswrapper[4707]: I1127 16:21:03.638497 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-85ff748b95-zhndc" 
event={"ID":"6c94a5dc-169c-47ba-a007-307336246c92","Type":"ContainerDied","Data":"bc578ee4f581c31397f30fc43ed9cb1782c03b33e59d0711749432c7aa3232e8"} Nov 27 16:21:03 crc kubenswrapper[4707]: I1127 16:21:03.638523 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-85ff748b95-zhndc" event={"ID":"6c94a5dc-169c-47ba-a007-307336246c92","Type":"ContainerStarted","Data":"09b4e62750c919c8e8abfe7d2db43ed3ce8557fd6400f25a0bc2f99c9071316e"} Nov 27 16:21:03 crc kubenswrapper[4707]: I1127 16:21:03.647296 4707 generic.go:334] "Generic (PLEG): container finished" podID="d6b92c04-fcc0-4d96-8200-3edd228dd326" containerID="c7a510a0eee74afcb33a2507a95e9e2102fe22b9967d4c6d39cdba31c6dbff65" exitCode=0 Nov 27 16:21:03 crc kubenswrapper[4707]: I1127 16:21:03.647523 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-b989k" event={"ID":"d6b92c04-fcc0-4d96-8200-3edd228dd326","Type":"ContainerDied","Data":"c7a510a0eee74afcb33a2507a95e9e2102fe22b9967d4c6d39cdba31c6dbff65"} Nov 27 16:21:03 crc kubenswrapper[4707]: I1127 16:21:03.653331 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-hmbzj" event={"ID":"6661447e-fc28-4c9a-bcd1-66e15b0ca3fd","Type":"ContainerStarted","Data":"60a448fb7d203b6dd28abfaf4bdcaa8d82a224071ef2206cfa3fc5917f6c6ebf"} Nov 27 16:21:03 crc kubenswrapper[4707]: I1127 16:21:03.671612 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-85c75486d4-pmkdv" event={"ID":"af1b5c91-e184-49fe-9ad9-83f047d5123d","Type":"ContainerStarted","Data":"8079ccef2e64462ac485969d28656bd798274ff2ca37cc176d76e8c5fe973750"} Nov 27 16:21:03 crc kubenswrapper[4707]: I1127 16:21:03.671653 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-85c75486d4-pmkdv" event={"ID":"af1b5c91-e184-49fe-9ad9-83f047d5123d","Type":"ContainerStarted","Data":"7f98e2290e728fb47d8409916290a61ac5a45eb8c1a9ee2ca02066360b456f94"} Nov 27 16:21:03 crc 
kubenswrapper[4707]: I1127 16:21:03.672392 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/keystone-85c75486d4-pmkdv" Nov 27 16:21:03 crc kubenswrapper[4707]: I1127 16:21:03.694465 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a49cd9fc-1364-454e-af11-bbf64e43e56d","Type":"ContainerStarted","Data":"a9380b9e375b8057c2e3998e8fc15dec436d302837380c797778c768cdb42894"} Nov 27 16:21:03 crc kubenswrapper[4707]: I1127 16:21:03.696317 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-sync-hmbzj" podStartSLOduration=3.33356985 podStartE2EDuration="43.696302775s" podCreationTimestamp="2025-11-27 16:20:20 +0000 UTC" firstStartedPulling="2025-11-27 16:20:21.941995049 +0000 UTC m=+997.573443817" lastFinishedPulling="2025-11-27 16:21:02.304727974 +0000 UTC m=+1037.936176742" observedRunningTime="2025-11-27 16:21:03.694725308 +0000 UTC m=+1039.326174076" watchObservedRunningTime="2025-11-27 16:21:03.696302775 +0000 UTC m=+1039.327751533" Nov 27 16:21:03 crc kubenswrapper[4707]: I1127 16:21:03.699097 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-c695745c6-j5ntf" event={"ID":"1050304d-e51e-4b02-9cec-828bb7d406bf","Type":"ContainerStarted","Data":"b22e8ef904ac1b9871dc697fc900506b401d8b483598a62012ec69ae191e11bc"} Nov 27 16:21:03 crc kubenswrapper[4707]: I1127 16:21:03.699240 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-c695745c6-j5ntf" event={"ID":"1050304d-e51e-4b02-9cec-828bb7d406bf","Type":"ContainerStarted","Data":"5a12046aca24002cd8aeb338fea59685471153fe7acc51116532c05f9b2fc62d"} Nov 27 16:21:03 crc kubenswrapper[4707]: I1127 16:21:03.709825 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-567875797-q64rz" 
event={"ID":"e1955b85-6ed8-492c-8001-c4fc20da8270","Type":"ContainerStarted","Data":"2c8097ef45b9e1392460e5ace1f3877e344504149b8a487dcbb4986fdc8a23ab"} Nov 27 16:21:03 crc kubenswrapper[4707]: I1127 16:21:03.721105 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5c99d8d4cb-fc8zk" event={"ID":"798010e9-c58d-45c7-a41c-d9ff9693d662","Type":"ContainerStarted","Data":"dbb38cca291dad639ab23c03c60a193e2677486cb8d469ea4bc276fd629ebdc8"} Nov 27 16:21:03 crc kubenswrapper[4707]: I1127 16:21:03.721253 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5c99d8d4cb-fc8zk" event={"ID":"798010e9-c58d-45c7-a41c-d9ff9693d662","Type":"ContainerStarted","Data":"2411abe459a06828785c413e84f05046300c64488e1e857de505325041c1c31f"} Nov 27 16:21:03 crc kubenswrapper[4707]: I1127 16:21:03.722171 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-5c99d8d4cb-fc8zk" Nov 27 16:21:03 crc kubenswrapper[4707]: I1127 16:21:03.722318 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-5c99d8d4cb-fc8zk" Nov 27 16:21:03 crc kubenswrapper[4707]: I1127 16:21:03.723191 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-85c75486d4-pmkdv" podStartSLOduration=5.723172501 podStartE2EDuration="5.723172501s" podCreationTimestamp="2025-11-27 16:20:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-27 16:21:03.708901868 +0000 UTC m=+1039.340350636" watchObservedRunningTime="2025-11-27 16:21:03.723172501 +0000 UTC m=+1039.354621269" Nov 27 16:21:03 crc kubenswrapper[4707]: I1127 16:21:03.753612 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-5c99d8d4cb-fc8zk" podStartSLOduration=5.753594592 podStartE2EDuration="5.753594592s" podCreationTimestamp="2025-11-27 16:20:58 
+0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-27 16:21:03.743327306 +0000 UTC m=+1039.374776074" watchObservedRunningTime="2025-11-27 16:21:03.753594592 +0000 UTC m=+1039.385043350" Nov 27 16:21:04 crc kubenswrapper[4707]: I1127 16:21:04.733814 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-69588f8b9-6plc2" event={"ID":"5de06db5-fa17-40e5-a08d-9b7f139b08ed","Type":"ContainerStarted","Data":"8985cf374ac05a92646033918d64fdf604bcc1129425390d8bd9d2f5217ea29a"} Nov 27 16:21:04 crc kubenswrapper[4707]: I1127 16:21:04.733970 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-69588f8b9-6plc2" Nov 27 16:21:04 crc kubenswrapper[4707]: I1127 16:21:04.735862 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-85ff748b95-zhndc" event={"ID":"6c94a5dc-169c-47ba-a007-307336246c92","Type":"ContainerStarted","Data":"f0f1e00f80a396a3c026d03115b811121601ee4d62fad2106017d28deff77b8b"} Nov 27 16:21:04 crc kubenswrapper[4707]: I1127 16:21:04.735980 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-85ff748b95-zhndc" Nov 27 16:21:04 crc kubenswrapper[4707]: I1127 16:21:04.737574 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5c99d8d4cb-fc8zk" event={"ID":"798010e9-c58d-45c7-a41c-d9ff9693d662","Type":"ContainerStarted","Data":"c7ed28dc80289baca53a3a26c17251d17b53f53292ca56fb1090b413baeaf88c"} Nov 27 16:21:04 crc kubenswrapper[4707]: I1127 16:21:04.743765 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-c695745c6-j5ntf" event={"ID":"1050304d-e51e-4b02-9cec-828bb7d406bf","Type":"ContainerStarted","Data":"c0909c8b609309adab0dc82c67e6f3a8a5e1080cc2ddabf946aa6a38e83a1c76"} Nov 27 16:21:04 crc kubenswrapper[4707]: I1127 16:21:04.743796 4707 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openstack/barbican-api-c695745c6-j5ntf" Nov 27 16:21:04 crc kubenswrapper[4707]: I1127 16:21:04.743814 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-c695745c6-j5ntf" Nov 27 16:21:04 crc kubenswrapper[4707]: I1127 16:21:04.762451 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-69588f8b9-6plc2" podStartSLOduration=10.762430147 podStartE2EDuration="10.762430147s" podCreationTimestamp="2025-11-27 16:20:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-27 16:21:04.751173656 +0000 UTC m=+1040.382622424" watchObservedRunningTime="2025-11-27 16:21:04.762430147 +0000 UTC m=+1040.393878925" Nov 27 16:21:04 crc kubenswrapper[4707]: I1127 16:21:04.793883 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-c695745c6-j5ntf" podStartSLOduration=4.793854382 podStartE2EDuration="4.793854382s" podCreationTimestamp="2025-11-27 16:21:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-27 16:21:04.787204622 +0000 UTC m=+1040.418653400" watchObservedRunningTime="2025-11-27 16:21:04.793854382 +0000 UTC m=+1040.425303150" Nov 27 16:21:04 crc kubenswrapper[4707]: I1127 16:21:04.819761 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-85ff748b95-zhndc" podStartSLOduration=6.819744904 podStartE2EDuration="6.819744904s" podCreationTimestamp="2025-11-27 16:20:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-27 16:21:04.812129301 +0000 UTC m=+1040.443578089" watchObservedRunningTime="2025-11-27 16:21:04.819744904 +0000 UTC m=+1040.451193662" Nov 27 16:21:05 crc kubenswrapper[4707]: 
I1127 16:21:05.750358 4707 generic.go:334] "Generic (PLEG): container finished" podID="6661447e-fc28-4c9a-bcd1-66e15b0ca3fd" containerID="60a448fb7d203b6dd28abfaf4bdcaa8d82a224071ef2206cfa3fc5917f6c6ebf" exitCode=0 Nov 27 16:21:05 crc kubenswrapper[4707]: I1127 16:21:05.750388 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-hmbzj" event={"ID":"6661447e-fc28-4c9a-bcd1-66e15b0ca3fd","Type":"ContainerDied","Data":"60a448fb7d203b6dd28abfaf4bdcaa8d82a224071ef2206cfa3fc5917f6c6ebf"} Nov 27 16:21:05 crc kubenswrapper[4707]: I1127 16:21:05.752784 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-567875797-q64rz" event={"ID":"e1955b85-6ed8-492c-8001-c4fc20da8270","Type":"ContainerStarted","Data":"35f40dc0d759f8ff05c17c238cadcf69cfe29447ee505e072ce2e3815f97a96b"} Nov 27 16:21:05 crc kubenswrapper[4707]: I1127 16:21:05.752856 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-567875797-q64rz" event={"ID":"e1955b85-6ed8-492c-8001-c4fc20da8270","Type":"ContainerStarted","Data":"b21bf301b180a8451d36d199260301055fe7294415dce1ccce84e8bed78b1724"} Nov 27 16:21:05 crc kubenswrapper[4707]: I1127 16:21:05.755280 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-b989k" event={"ID":"d6b92c04-fcc0-4d96-8200-3edd228dd326","Type":"ContainerStarted","Data":"13690d6c8603b8b998280dcd695c37b24d6aaf45637d17913005170bb5a75c9f"} Nov 27 16:21:05 crc kubenswrapper[4707]: I1127 16:21:05.759681 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-6c948b485d-2wcq8" event={"ID":"d1df02f7-1e71-4fae-a6df-cb3c83460a7e","Type":"ContainerStarted","Data":"33b5d306385949b9e009925ccbb99eeda1a38d9014d3385853390e141ed28722"} Nov 27 16:21:05 crc kubenswrapper[4707]: I1127 16:21:05.759707 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-6c948b485d-2wcq8" 
event={"ID":"d1df02f7-1e71-4fae-a6df-cb3c83460a7e","Type":"ContainerStarted","Data":"93461d2514be87c2ad379eb12d380e26dac772652d7cac0fedcaa64b24e11771"} Nov 27 16:21:05 crc kubenswrapper[4707]: I1127 16:21:05.782842 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-worker-567875797-q64rz" podStartSLOduration=5.6809732969999995 podStartE2EDuration="7.782821469s" podCreationTimestamp="2025-11-27 16:20:58 +0000 UTC" firstStartedPulling="2025-11-27 16:21:02.794405371 +0000 UTC m=+1038.425854139" lastFinishedPulling="2025-11-27 16:21:04.896253533 +0000 UTC m=+1040.527702311" observedRunningTime="2025-11-27 16:21:05.775410971 +0000 UTC m=+1041.406859739" watchObservedRunningTime="2025-11-27 16:21:05.782821469 +0000 UTC m=+1041.414270227" Nov 27 16:21:05 crc kubenswrapper[4707]: I1127 16:21:05.820106 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-keystone-listener-6c948b485d-2wcq8" podStartSLOduration=5.436309698 podStartE2EDuration="7.820074334s" podCreationTimestamp="2025-11-27 16:20:58 +0000 UTC" firstStartedPulling="2025-11-27 16:21:02.476100592 +0000 UTC m=+1038.107549360" lastFinishedPulling="2025-11-27 16:21:04.859865228 +0000 UTC m=+1040.491313996" observedRunningTime="2025-11-27 16:21:05.790688588 +0000 UTC m=+1041.422137356" watchObservedRunningTime="2025-11-27 16:21:05.820074334 +0000 UTC m=+1041.451523102" Nov 27 16:21:05 crc kubenswrapper[4707]: I1127 16:21:05.825301 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-b989k" podStartSLOduration=22.328861824 podStartE2EDuration="36.82529221s" podCreationTimestamp="2025-11-27 16:20:29 +0000 UTC" firstStartedPulling="2025-11-27 16:20:50.399681773 +0000 UTC m=+1026.031130541" lastFinishedPulling="2025-11-27 16:21:04.896112159 +0000 UTC m=+1040.527560927" observedRunningTime="2025-11-27 16:21:05.804859989 +0000 UTC m=+1041.436308757" 
watchObservedRunningTime="2025-11-27 16:21:05.82529221 +0000 UTC m=+1041.456740978" Nov 27 16:21:06 crc kubenswrapper[4707]: I1127 16:21:06.772028 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-5k4zv" event={"ID":"1c445b1d-7e63-48cc-83ee-c4841074701c","Type":"ContainerStarted","Data":"c65c0b619b3ced482b63ee64a2e4708e62b90c0f87f26da5cd3143f9b28bdda2"} Nov 27 16:21:06 crc kubenswrapper[4707]: I1127 16:21:06.775251 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-gb8rd" event={"ID":"1b8ce5ef-c2af-4e15-9677-8e878b96c4de","Type":"ContainerStarted","Data":"7c9ca57692645ecd6590f8d90bf57e228ca418c5d7aa57dfcd502d32f6aa8032"} Nov 27 16:21:06 crc kubenswrapper[4707]: I1127 16:21:06.797406 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-db-sync-5k4zv" podStartSLOduration=2.169347783 podStartE2EDuration="46.797360509s" podCreationTimestamp="2025-11-27 16:20:20 +0000 UTC" firstStartedPulling="2025-11-27 16:20:21.443790918 +0000 UTC m=+997.075239686" lastFinishedPulling="2025-11-27 16:21:06.071803644 +0000 UTC m=+1041.703252412" observedRunningTime="2025-11-27 16:21:06.789771147 +0000 UTC m=+1042.421219905" watchObservedRunningTime="2025-11-27 16:21:06.797360509 +0000 UTC m=+1042.428809287" Nov 27 16:21:06 crc kubenswrapper[4707]: I1127 16:21:06.815095 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-sync-gb8rd" podStartSLOduration=3.739983705 podStartE2EDuration="46.815080595s" podCreationTimestamp="2025-11-27 16:20:20 +0000 UTC" firstStartedPulling="2025-11-27 16:20:21.824212706 +0000 UTC m=+997.455661474" lastFinishedPulling="2025-11-27 16:21:04.899309596 +0000 UTC m=+1040.530758364" observedRunningTime="2025-11-27 16:21:06.810765672 +0000 UTC m=+1042.442214440" watchObservedRunningTime="2025-11-27 16:21:06.815080595 +0000 UTC m=+1042.446529363" Nov 27 16:21:06 crc kubenswrapper[4707]: I1127 16:21:06.989618 4707 
prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-55f844cf75-tnnrf" podUID="0f3d64e0-6a2f-49d8-9f92-9ea645023db5" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.149:5353: i/o timeout" Nov 27 16:21:07 crc kubenswrapper[4707]: I1127 16:21:07.163203 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-hmbzj" Nov 27 16:21:07 crc kubenswrapper[4707]: I1127 16:21:07.264941 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6661447e-fc28-4c9a-bcd1-66e15b0ca3fd-logs\") pod \"6661447e-fc28-4c9a-bcd1-66e15b0ca3fd\" (UID: \"6661447e-fc28-4c9a-bcd1-66e15b0ca3fd\") " Nov 27 16:21:07 crc kubenswrapper[4707]: I1127 16:21:07.265088 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-82f4b\" (UniqueName: \"kubernetes.io/projected/6661447e-fc28-4c9a-bcd1-66e15b0ca3fd-kube-api-access-82f4b\") pod \"6661447e-fc28-4c9a-bcd1-66e15b0ca3fd\" (UID: \"6661447e-fc28-4c9a-bcd1-66e15b0ca3fd\") " Nov 27 16:21:07 crc kubenswrapper[4707]: I1127 16:21:07.265152 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6661447e-fc28-4c9a-bcd1-66e15b0ca3fd-config-data\") pod \"6661447e-fc28-4c9a-bcd1-66e15b0ca3fd\" (UID: \"6661447e-fc28-4c9a-bcd1-66e15b0ca3fd\") " Nov 27 16:21:07 crc kubenswrapper[4707]: I1127 16:21:07.265179 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6661447e-fc28-4c9a-bcd1-66e15b0ca3fd-scripts\") pod \"6661447e-fc28-4c9a-bcd1-66e15b0ca3fd\" (UID: \"6661447e-fc28-4c9a-bcd1-66e15b0ca3fd\") " Nov 27 16:21:07 crc kubenswrapper[4707]: I1127 16:21:07.265206 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/6661447e-fc28-4c9a-bcd1-66e15b0ca3fd-combined-ca-bundle\") pod \"6661447e-fc28-4c9a-bcd1-66e15b0ca3fd\" (UID: \"6661447e-fc28-4c9a-bcd1-66e15b0ca3fd\") " Nov 27 16:21:07 crc kubenswrapper[4707]: I1127 16:21:07.266381 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6661447e-fc28-4c9a-bcd1-66e15b0ca3fd-logs" (OuterVolumeSpecName: "logs") pod "6661447e-fc28-4c9a-bcd1-66e15b0ca3fd" (UID: "6661447e-fc28-4c9a-bcd1-66e15b0ca3fd"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 27 16:21:07 crc kubenswrapper[4707]: I1127 16:21:07.271935 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6661447e-fc28-4c9a-bcd1-66e15b0ca3fd-scripts" (OuterVolumeSpecName: "scripts") pod "6661447e-fc28-4c9a-bcd1-66e15b0ca3fd" (UID: "6661447e-fc28-4c9a-bcd1-66e15b0ca3fd"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 16:21:07 crc kubenswrapper[4707]: I1127 16:21:07.274543 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6661447e-fc28-4c9a-bcd1-66e15b0ca3fd-kube-api-access-82f4b" (OuterVolumeSpecName: "kube-api-access-82f4b") pod "6661447e-fc28-4c9a-bcd1-66e15b0ca3fd" (UID: "6661447e-fc28-4c9a-bcd1-66e15b0ca3fd"). InnerVolumeSpecName "kube-api-access-82f4b". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 16:21:07 crc kubenswrapper[4707]: I1127 16:21:07.294253 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6661447e-fc28-4c9a-bcd1-66e15b0ca3fd-config-data" (OuterVolumeSpecName: "config-data") pod "6661447e-fc28-4c9a-bcd1-66e15b0ca3fd" (UID: "6661447e-fc28-4c9a-bcd1-66e15b0ca3fd"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 16:21:07 crc kubenswrapper[4707]: I1127 16:21:07.301459 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6661447e-fc28-4c9a-bcd1-66e15b0ca3fd-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6661447e-fc28-4c9a-bcd1-66e15b0ca3fd" (UID: "6661447e-fc28-4c9a-bcd1-66e15b0ca3fd"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 16:21:07 crc kubenswrapper[4707]: I1127 16:21:07.366892 4707 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6661447e-fc28-4c9a-bcd1-66e15b0ca3fd-logs\") on node \"crc\" DevicePath \"\"" Nov 27 16:21:07 crc kubenswrapper[4707]: I1127 16:21:07.366924 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-82f4b\" (UniqueName: \"kubernetes.io/projected/6661447e-fc28-4c9a-bcd1-66e15b0ca3fd-kube-api-access-82f4b\") on node \"crc\" DevicePath \"\"" Nov 27 16:21:07 crc kubenswrapper[4707]: I1127 16:21:07.366934 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6661447e-fc28-4c9a-bcd1-66e15b0ca3fd-config-data\") on node \"crc\" DevicePath \"\"" Nov 27 16:21:07 crc kubenswrapper[4707]: I1127 16:21:07.366943 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6661447e-fc28-4c9a-bcd1-66e15b0ca3fd-scripts\") on node \"crc\" DevicePath \"\"" Nov 27 16:21:07 crc kubenswrapper[4707]: I1127 16:21:07.366951 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6661447e-fc28-4c9a-bcd1-66e15b0ca3fd-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 27 16:21:07 crc kubenswrapper[4707]: I1127 16:21:07.797417 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-hmbzj" 
event={"ID":"6661447e-fc28-4c9a-bcd1-66e15b0ca3fd","Type":"ContainerDied","Data":"2f0f8b8b2b5b36a15b7e8a4c8da3f4e812cca11f2f3d832159c3c34b339e4bb0"} Nov 27 16:21:07 crc kubenswrapper[4707]: I1127 16:21:07.797456 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2f0f8b8b2b5b36a15b7e8a4c8da3f4e812cca11f2f3d832159c3c34b339e4bb0" Nov 27 16:21:07 crc kubenswrapper[4707]: I1127 16:21:07.797479 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-hmbzj" Nov 27 16:21:07 crc kubenswrapper[4707]: I1127 16:21:07.903152 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-69b8fb6b88-w6pxv"] Nov 27 16:21:07 crc kubenswrapper[4707]: E1127 16:21:07.903806 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0f3d64e0-6a2f-49d8-9f92-9ea645023db5" containerName="init" Nov 27 16:21:07 crc kubenswrapper[4707]: I1127 16:21:07.903902 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="0f3d64e0-6a2f-49d8-9f92-9ea645023db5" containerName="init" Nov 27 16:21:07 crc kubenswrapper[4707]: E1127 16:21:07.903994 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0f3d64e0-6a2f-49d8-9f92-9ea645023db5" containerName="dnsmasq-dns" Nov 27 16:21:07 crc kubenswrapper[4707]: I1127 16:21:07.905535 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="0f3d64e0-6a2f-49d8-9f92-9ea645023db5" containerName="dnsmasq-dns" Nov 27 16:21:07 crc kubenswrapper[4707]: E1127 16:21:07.905615 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6661447e-fc28-4c9a-bcd1-66e15b0ca3fd" containerName="placement-db-sync" Nov 27 16:21:07 crc kubenswrapper[4707]: I1127 16:21:07.905689 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="6661447e-fc28-4c9a-bcd1-66e15b0ca3fd" containerName="placement-db-sync" Nov 27 16:21:07 crc kubenswrapper[4707]: I1127 16:21:07.905948 4707 memory_manager.go:354] "RemoveStaleState removing 
state" podUID="0f3d64e0-6a2f-49d8-9f92-9ea645023db5" containerName="dnsmasq-dns" Nov 27 16:21:07 crc kubenswrapper[4707]: I1127 16:21:07.906038 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="6661447e-fc28-4c9a-bcd1-66e15b0ca3fd" containerName="placement-db-sync" Nov 27 16:21:07 crc kubenswrapper[4707]: I1127 16:21:07.906996 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-69b8fb6b88-w6pxv" Nov 27 16:21:07 crc kubenswrapper[4707]: I1127 16:21:07.920562 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Nov 27 16:21:07 crc kubenswrapper[4707]: I1127 16:21:07.920957 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Nov 27 16:21:07 crc kubenswrapper[4707]: I1127 16:21:07.921252 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-vw7rm" Nov 27 16:21:07 crc kubenswrapper[4707]: I1127 16:21:07.922473 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-public-svc" Nov 27 16:21:07 crc kubenswrapper[4707]: I1127 16:21:07.922664 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-internal-svc" Nov 27 16:21:07 crc kubenswrapper[4707]: I1127 16:21:07.940993 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-69b8fb6b88-w6pxv"] Nov 27 16:21:07 crc kubenswrapper[4707]: I1127 16:21:07.977492 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9246eafb-e806-45a4-bc87-9a7724b7467c-config-data\") pod \"placement-69b8fb6b88-w6pxv\" (UID: \"9246eafb-e806-45a4-bc87-9a7724b7467c\") " pod="openstack/placement-69b8fb6b88-w6pxv" Nov 27 16:21:07 crc kubenswrapper[4707]: I1127 16:21:07.977561 4707 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hsd2p\" (UniqueName: \"kubernetes.io/projected/9246eafb-e806-45a4-bc87-9a7724b7467c-kube-api-access-hsd2p\") pod \"placement-69b8fb6b88-w6pxv\" (UID: \"9246eafb-e806-45a4-bc87-9a7724b7467c\") " pod="openstack/placement-69b8fb6b88-w6pxv" Nov 27 16:21:07 crc kubenswrapper[4707]: I1127 16:21:07.977590 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9246eafb-e806-45a4-bc87-9a7724b7467c-combined-ca-bundle\") pod \"placement-69b8fb6b88-w6pxv\" (UID: \"9246eafb-e806-45a4-bc87-9a7724b7467c\") " pod="openstack/placement-69b8fb6b88-w6pxv" Nov 27 16:21:07 crc kubenswrapper[4707]: I1127 16:21:07.977637 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9246eafb-e806-45a4-bc87-9a7724b7467c-scripts\") pod \"placement-69b8fb6b88-w6pxv\" (UID: \"9246eafb-e806-45a4-bc87-9a7724b7467c\") " pod="openstack/placement-69b8fb6b88-w6pxv" Nov 27 16:21:07 crc kubenswrapper[4707]: I1127 16:21:07.977660 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9246eafb-e806-45a4-bc87-9a7724b7467c-internal-tls-certs\") pod \"placement-69b8fb6b88-w6pxv\" (UID: \"9246eafb-e806-45a4-bc87-9a7724b7467c\") " pod="openstack/placement-69b8fb6b88-w6pxv" Nov 27 16:21:07 crc kubenswrapper[4707]: I1127 16:21:07.977683 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9246eafb-e806-45a4-bc87-9a7724b7467c-logs\") pod \"placement-69b8fb6b88-w6pxv\" (UID: \"9246eafb-e806-45a4-bc87-9a7724b7467c\") " pod="openstack/placement-69b8fb6b88-w6pxv" Nov 27 16:21:07 crc kubenswrapper[4707]: I1127 16:21:07.977708 4707 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9246eafb-e806-45a4-bc87-9a7724b7467c-public-tls-certs\") pod \"placement-69b8fb6b88-w6pxv\" (UID: \"9246eafb-e806-45a4-bc87-9a7724b7467c\") " pod="openstack/placement-69b8fb6b88-w6pxv" Nov 27 16:21:08 crc kubenswrapper[4707]: I1127 16:21:08.079061 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9246eafb-e806-45a4-bc87-9a7724b7467c-config-data\") pod \"placement-69b8fb6b88-w6pxv\" (UID: \"9246eafb-e806-45a4-bc87-9a7724b7467c\") " pod="openstack/placement-69b8fb6b88-w6pxv" Nov 27 16:21:08 crc kubenswrapper[4707]: I1127 16:21:08.079438 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hsd2p\" (UniqueName: \"kubernetes.io/projected/9246eafb-e806-45a4-bc87-9a7724b7467c-kube-api-access-hsd2p\") pod \"placement-69b8fb6b88-w6pxv\" (UID: \"9246eafb-e806-45a4-bc87-9a7724b7467c\") " pod="openstack/placement-69b8fb6b88-w6pxv" Nov 27 16:21:08 crc kubenswrapper[4707]: I1127 16:21:08.079464 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9246eafb-e806-45a4-bc87-9a7724b7467c-combined-ca-bundle\") pod \"placement-69b8fb6b88-w6pxv\" (UID: \"9246eafb-e806-45a4-bc87-9a7724b7467c\") " pod="openstack/placement-69b8fb6b88-w6pxv" Nov 27 16:21:08 crc kubenswrapper[4707]: I1127 16:21:08.079507 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9246eafb-e806-45a4-bc87-9a7724b7467c-scripts\") pod \"placement-69b8fb6b88-w6pxv\" (UID: \"9246eafb-e806-45a4-bc87-9a7724b7467c\") " pod="openstack/placement-69b8fb6b88-w6pxv" Nov 27 16:21:08 crc kubenswrapper[4707]: I1127 16:21:08.079525 4707 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9246eafb-e806-45a4-bc87-9a7724b7467c-internal-tls-certs\") pod \"placement-69b8fb6b88-w6pxv\" (UID: \"9246eafb-e806-45a4-bc87-9a7724b7467c\") " pod="openstack/placement-69b8fb6b88-w6pxv" Nov 27 16:21:08 crc kubenswrapper[4707]: I1127 16:21:08.079547 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9246eafb-e806-45a4-bc87-9a7724b7467c-logs\") pod \"placement-69b8fb6b88-w6pxv\" (UID: \"9246eafb-e806-45a4-bc87-9a7724b7467c\") " pod="openstack/placement-69b8fb6b88-w6pxv" Nov 27 16:21:08 crc kubenswrapper[4707]: I1127 16:21:08.079571 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9246eafb-e806-45a4-bc87-9a7724b7467c-public-tls-certs\") pod \"placement-69b8fb6b88-w6pxv\" (UID: \"9246eafb-e806-45a4-bc87-9a7724b7467c\") " pod="openstack/placement-69b8fb6b88-w6pxv" Nov 27 16:21:08 crc kubenswrapper[4707]: I1127 16:21:08.080644 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9246eafb-e806-45a4-bc87-9a7724b7467c-logs\") pod \"placement-69b8fb6b88-w6pxv\" (UID: \"9246eafb-e806-45a4-bc87-9a7724b7467c\") " pod="openstack/placement-69b8fb6b88-w6pxv" Nov 27 16:21:08 crc kubenswrapper[4707]: I1127 16:21:08.085323 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9246eafb-e806-45a4-bc87-9a7724b7467c-combined-ca-bundle\") pod \"placement-69b8fb6b88-w6pxv\" (UID: \"9246eafb-e806-45a4-bc87-9a7724b7467c\") " pod="openstack/placement-69b8fb6b88-w6pxv" Nov 27 16:21:08 crc kubenswrapper[4707]: I1127 16:21:08.085897 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/9246eafb-e806-45a4-bc87-9a7724b7467c-internal-tls-certs\") pod \"placement-69b8fb6b88-w6pxv\" (UID: \"9246eafb-e806-45a4-bc87-9a7724b7467c\") " pod="openstack/placement-69b8fb6b88-w6pxv" Nov 27 16:21:08 crc kubenswrapper[4707]: I1127 16:21:08.086166 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9246eafb-e806-45a4-bc87-9a7724b7467c-scripts\") pod \"placement-69b8fb6b88-w6pxv\" (UID: \"9246eafb-e806-45a4-bc87-9a7724b7467c\") " pod="openstack/placement-69b8fb6b88-w6pxv" Nov 27 16:21:08 crc kubenswrapper[4707]: I1127 16:21:08.087663 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9246eafb-e806-45a4-bc87-9a7724b7467c-public-tls-certs\") pod \"placement-69b8fb6b88-w6pxv\" (UID: \"9246eafb-e806-45a4-bc87-9a7724b7467c\") " pod="openstack/placement-69b8fb6b88-w6pxv" Nov 27 16:21:08 crc kubenswrapper[4707]: I1127 16:21:08.095112 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9246eafb-e806-45a4-bc87-9a7724b7467c-config-data\") pod \"placement-69b8fb6b88-w6pxv\" (UID: \"9246eafb-e806-45a4-bc87-9a7724b7467c\") " pod="openstack/placement-69b8fb6b88-w6pxv" Nov 27 16:21:08 crc kubenswrapper[4707]: I1127 16:21:08.109582 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hsd2p\" (UniqueName: \"kubernetes.io/projected/9246eafb-e806-45a4-bc87-9a7724b7467c-kube-api-access-hsd2p\") pod \"placement-69b8fb6b88-w6pxv\" (UID: \"9246eafb-e806-45a4-bc87-9a7724b7467c\") " pod="openstack/placement-69b8fb6b88-w6pxv" Nov 27 16:21:08 crc kubenswrapper[4707]: I1127 16:21:08.240094 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-69b8fb6b88-w6pxv" Nov 27 16:21:08 crc kubenswrapper[4707]: I1127 16:21:08.842623 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-85ff748b95-zhndc" Nov 27 16:21:08 crc kubenswrapper[4707]: I1127 16:21:08.930959 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-785d8bcb8c-cbftg"] Nov 27 16:21:08 crc kubenswrapper[4707]: I1127 16:21:08.931213 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-785d8bcb8c-cbftg" podUID="b4ad731f-8b0b-4bd5-bf46-2f9a0a3a0b82" containerName="dnsmasq-dns" containerID="cri-o://7d261fcd6ef13459b3a61a1ccfa03a9eb1612fb9e8ff130111490a131872ecca" gracePeriod=10 Nov 27 16:21:09 crc kubenswrapper[4707]: I1127 16:21:09.407212 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-b989k" Nov 27 16:21:09 crc kubenswrapper[4707]: I1127 16:21:09.407432 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-b989k" Nov 27 16:21:09 crc kubenswrapper[4707]: I1127 16:21:09.819542 4707 generic.go:334] "Generic (PLEG): container finished" podID="b4ad731f-8b0b-4bd5-bf46-2f9a0a3a0b82" containerID="7d261fcd6ef13459b3a61a1ccfa03a9eb1612fb9e8ff130111490a131872ecca" exitCode=0 Nov 27 16:21:09 crc kubenswrapper[4707]: I1127 16:21:09.819594 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-785d8bcb8c-cbftg" event={"ID":"b4ad731f-8b0b-4bd5-bf46-2f9a0a3a0b82","Type":"ContainerDied","Data":"7d261fcd6ef13459b3a61a1ccfa03a9eb1612fb9e8ff130111490a131872ecca"} Nov 27 16:21:10 crc kubenswrapper[4707]: I1127 16:21:10.361259 4707 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-5c99d8d4cb-fc8zk" podUID="798010e9-c58d-45c7-a41c-d9ff9693d662" containerName="barbican-api" probeResult="failure" output="HTTP 
probe failed with statuscode: 500" Nov 27 16:21:10 crc kubenswrapper[4707]: I1127 16:21:10.365729 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-5c99d8d4cb-fc8zk" Nov 27 16:21:10 crc kubenswrapper[4707]: I1127 16:21:10.455531 4707 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-b989k" podUID="d6b92c04-fcc0-4d96-8200-3edd228dd326" containerName="registry-server" probeResult="failure" output=< Nov 27 16:21:10 crc kubenswrapper[4707]: timeout: failed to connect service ":50051" within 1s Nov 27 16:21:10 crc kubenswrapper[4707]: > Nov 27 16:21:11 crc kubenswrapper[4707]: I1127 16:21:11.797799 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-5c99d8d4cb-fc8zk" Nov 27 16:21:11 crc kubenswrapper[4707]: I1127 16:21:11.865601 4707 generic.go:334] "Generic (PLEG): container finished" podID="1c445b1d-7e63-48cc-83ee-c4841074701c" containerID="c65c0b619b3ced482b63ee64a2e4708e62b90c0f87f26da5cd3143f9b28bdda2" exitCode=0 Nov 27 16:21:11 crc kubenswrapper[4707]: I1127 16:21:11.865648 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-5k4zv" event={"ID":"1c445b1d-7e63-48cc-83ee-c4841074701c","Type":"ContainerDied","Data":"c65c0b619b3ced482b63ee64a2e4708e62b90c0f87f26da5cd3143f9b28bdda2"} Nov 27 16:21:12 crc kubenswrapper[4707]: I1127 16:21:12.050249 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-69b8fb6b88-w6pxv"] Nov 27 16:21:12 crc kubenswrapper[4707]: I1127 16:21:12.497777 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-c695745c6-j5ntf" Nov 27 16:21:12 crc kubenswrapper[4707]: I1127 16:21:12.525082 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-c695745c6-j5ntf" Nov 27 16:21:12 crc kubenswrapper[4707]: I1127 16:21:12.599728 4707 kubelet.go:2437] "SyncLoop 
DELETE" source="api" pods=["openstack/barbican-api-5c99d8d4cb-fc8zk"] Nov 27 16:21:12 crc kubenswrapper[4707]: I1127 16:21:12.600090 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-5c99d8d4cb-fc8zk" podUID="798010e9-c58d-45c7-a41c-d9ff9693d662" containerName="barbican-api-log" containerID="cri-o://dbb38cca291dad639ab23c03c60a193e2677486cb8d469ea4bc276fd629ebdc8" gracePeriod=30 Nov 27 16:21:12 crc kubenswrapper[4707]: I1127 16:21:12.600310 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-5c99d8d4cb-fc8zk" podUID="798010e9-c58d-45c7-a41c-d9ff9693d662" containerName="barbican-api" containerID="cri-o://c7ed28dc80289baca53a3a26c17251d17b53f53292ca56fb1090b413baeaf88c" gracePeriod=30 Nov 27 16:21:12 crc kubenswrapper[4707]: W1127 16:21:12.802520 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9246eafb_e806_45a4_bc87_9a7724b7467c.slice/crio-155ea1e2c97e506dcd2211b9d7c0c4aa29151a04d24b52b103a2d78419f399c3 WatchSource:0}: Error finding container 155ea1e2c97e506dcd2211b9d7c0c4aa29151a04d24b52b103a2d78419f399c3: Status 404 returned error can't find the container with id 155ea1e2c97e506dcd2211b9d7c0c4aa29151a04d24b52b103a2d78419f399c3 Nov 27 16:21:12 crc kubenswrapper[4707]: I1127 16:21:12.878436 4707 generic.go:334] "Generic (PLEG): container finished" podID="1b8ce5ef-c2af-4e15-9677-8e878b96c4de" containerID="7c9ca57692645ecd6590f8d90bf57e228ca418c5d7aa57dfcd502d32f6aa8032" exitCode=0 Nov 27 16:21:12 crc kubenswrapper[4707]: I1127 16:21:12.878668 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-gb8rd" event={"ID":"1b8ce5ef-c2af-4e15-9677-8e878b96c4de","Type":"ContainerDied","Data":"7c9ca57692645ecd6590f8d90bf57e228ca418c5d7aa57dfcd502d32f6aa8032"} Nov 27 16:21:12 crc kubenswrapper[4707]: I1127 16:21:12.881947 4707 generic.go:334] "Generic (PLEG): 
container finished" podID="798010e9-c58d-45c7-a41c-d9ff9693d662" containerID="dbb38cca291dad639ab23c03c60a193e2677486cb8d469ea4bc276fd629ebdc8" exitCode=143 Nov 27 16:21:12 crc kubenswrapper[4707]: I1127 16:21:12.881990 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5c99d8d4cb-fc8zk" event={"ID":"798010e9-c58d-45c7-a41c-d9ff9693d662","Type":"ContainerDied","Data":"dbb38cca291dad639ab23c03c60a193e2677486cb8d469ea4bc276fd629ebdc8"} Nov 27 16:21:12 crc kubenswrapper[4707]: I1127 16:21:12.883224 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-69b8fb6b88-w6pxv" event={"ID":"9246eafb-e806-45a4-bc87-9a7724b7467c","Type":"ContainerStarted","Data":"155ea1e2c97e506dcd2211b9d7c0c4aa29151a04d24b52b103a2d78419f399c3"} Nov 27 16:21:12 crc kubenswrapper[4707]: I1127 16:21:12.885774 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-785d8bcb8c-cbftg" event={"ID":"b4ad731f-8b0b-4bd5-bf46-2f9a0a3a0b82","Type":"ContainerDied","Data":"fdcf821ca62a141bc378b0d6287b60f1ed9d2e669ce1e27371313ac31de366be"} Nov 27 16:21:12 crc kubenswrapper[4707]: I1127 16:21:12.885803 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fdcf821ca62a141bc378b0d6287b60f1ed9d2e669ce1e27371313ac31de366be" Nov 27 16:21:13 crc kubenswrapper[4707]: I1127 16:21:13.102089 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-785d8bcb8c-cbftg" Nov 27 16:21:13 crc kubenswrapper[4707]: I1127 16:21:13.177270 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-db-sync-5k4zv" Nov 27 16:21:13 crc kubenswrapper[4707]: I1127 16:21:13.201932 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b4ad731f-8b0b-4bd5-bf46-2f9a0a3a0b82-ovsdbserver-nb\") pod \"b4ad731f-8b0b-4bd5-bf46-2f9a0a3a0b82\" (UID: \"b4ad731f-8b0b-4bd5-bf46-2f9a0a3a0b82\") " Nov 27 16:21:13 crc kubenswrapper[4707]: I1127 16:21:13.201989 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b4ad731f-8b0b-4bd5-bf46-2f9a0a3a0b82-dns-svc\") pod \"b4ad731f-8b0b-4bd5-bf46-2f9a0a3a0b82\" (UID: \"b4ad731f-8b0b-4bd5-bf46-2f9a0a3a0b82\") " Nov 27 16:21:13 crc kubenswrapper[4707]: I1127 16:21:13.202024 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b4ad731f-8b0b-4bd5-bf46-2f9a0a3a0b82-dns-swift-storage-0\") pod \"b4ad731f-8b0b-4bd5-bf46-2f9a0a3a0b82\" (UID: \"b4ad731f-8b0b-4bd5-bf46-2f9a0a3a0b82\") " Nov 27 16:21:13 crc kubenswrapper[4707]: I1127 16:21:13.202165 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b4ad731f-8b0b-4bd5-bf46-2f9a0a3a0b82-config\") pod \"b4ad731f-8b0b-4bd5-bf46-2f9a0a3a0b82\" (UID: \"b4ad731f-8b0b-4bd5-bf46-2f9a0a3a0b82\") " Nov 27 16:21:13 crc kubenswrapper[4707]: I1127 16:21:13.202206 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b4ad731f-8b0b-4bd5-bf46-2f9a0a3a0b82-ovsdbserver-sb\") pod \"b4ad731f-8b0b-4bd5-bf46-2f9a0a3a0b82\" (UID: \"b4ad731f-8b0b-4bd5-bf46-2f9a0a3a0b82\") " Nov 27 16:21:13 crc kubenswrapper[4707]: I1127 16:21:13.202262 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t2hd7\" (UniqueName: 
\"kubernetes.io/projected/b4ad731f-8b0b-4bd5-bf46-2f9a0a3a0b82-kube-api-access-t2hd7\") pod \"b4ad731f-8b0b-4bd5-bf46-2f9a0a3a0b82\" (UID: \"b4ad731f-8b0b-4bd5-bf46-2f9a0a3a0b82\") " Nov 27 16:21:13 crc kubenswrapper[4707]: I1127 16:21:13.222804 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b4ad731f-8b0b-4bd5-bf46-2f9a0a3a0b82-kube-api-access-t2hd7" (OuterVolumeSpecName: "kube-api-access-t2hd7") pod "b4ad731f-8b0b-4bd5-bf46-2f9a0a3a0b82" (UID: "b4ad731f-8b0b-4bd5-bf46-2f9a0a3a0b82"). InnerVolumeSpecName "kube-api-access-t2hd7". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 16:21:13 crc kubenswrapper[4707]: I1127 16:21:13.262302 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b4ad731f-8b0b-4bd5-bf46-2f9a0a3a0b82-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "b4ad731f-8b0b-4bd5-bf46-2f9a0a3a0b82" (UID: "b4ad731f-8b0b-4bd5-bf46-2f9a0a3a0b82"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 16:21:13 crc kubenswrapper[4707]: I1127 16:21:13.290166 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b4ad731f-8b0b-4bd5-bf46-2f9a0a3a0b82-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "b4ad731f-8b0b-4bd5-bf46-2f9a0a3a0b82" (UID: "b4ad731f-8b0b-4bd5-bf46-2f9a0a3a0b82"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 16:21:13 crc kubenswrapper[4707]: I1127 16:21:13.291807 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b4ad731f-8b0b-4bd5-bf46-2f9a0a3a0b82-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "b4ad731f-8b0b-4bd5-bf46-2f9a0a3a0b82" (UID: "b4ad731f-8b0b-4bd5-bf46-2f9a0a3a0b82"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 16:21:13 crc kubenswrapper[4707]: I1127 16:21:13.303888 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p5j7f\" (UniqueName: \"kubernetes.io/projected/1c445b1d-7e63-48cc-83ee-c4841074701c-kube-api-access-p5j7f\") pod \"1c445b1d-7e63-48cc-83ee-c4841074701c\" (UID: \"1c445b1d-7e63-48cc-83ee-c4841074701c\") " Nov 27 16:21:13 crc kubenswrapper[4707]: I1127 16:21:13.303988 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1c445b1d-7e63-48cc-83ee-c4841074701c-config-data\") pod \"1c445b1d-7e63-48cc-83ee-c4841074701c\" (UID: \"1c445b1d-7e63-48cc-83ee-c4841074701c\") " Nov 27 16:21:13 crc kubenswrapper[4707]: I1127 16:21:13.304127 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c445b1d-7e63-48cc-83ee-c4841074701c-combined-ca-bundle\") pod \"1c445b1d-7e63-48cc-83ee-c4841074701c\" (UID: \"1c445b1d-7e63-48cc-83ee-c4841074701c\") " Nov 27 16:21:13 crc kubenswrapper[4707]: I1127 16:21:13.305001 4707 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b4ad731f-8b0b-4bd5-bf46-2f9a0a3a0b82-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Nov 27 16:21:13 crc kubenswrapper[4707]: I1127 16:21:13.305035 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t2hd7\" (UniqueName: \"kubernetes.io/projected/b4ad731f-8b0b-4bd5-bf46-2f9a0a3a0b82-kube-api-access-t2hd7\") on node \"crc\" DevicePath \"\"" Nov 27 16:21:13 crc kubenswrapper[4707]: I1127 16:21:13.305045 4707 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b4ad731f-8b0b-4bd5-bf46-2f9a0a3a0b82-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Nov 27 16:21:13 crc kubenswrapper[4707]: I1127 
16:21:13.305053 4707 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b4ad731f-8b0b-4bd5-bf46-2f9a0a3a0b82-dns-svc\") on node \"crc\" DevicePath \"\"" Nov 27 16:21:13 crc kubenswrapper[4707]: I1127 16:21:13.307430 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1c445b1d-7e63-48cc-83ee-c4841074701c-kube-api-access-p5j7f" (OuterVolumeSpecName: "kube-api-access-p5j7f") pod "1c445b1d-7e63-48cc-83ee-c4841074701c" (UID: "1c445b1d-7e63-48cc-83ee-c4841074701c"). InnerVolumeSpecName "kube-api-access-p5j7f". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 16:21:13 crc kubenswrapper[4707]: I1127 16:21:13.326934 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1c445b1d-7e63-48cc-83ee-c4841074701c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1c445b1d-7e63-48cc-83ee-c4841074701c" (UID: "1c445b1d-7e63-48cc-83ee-c4841074701c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 16:21:13 crc kubenswrapper[4707]: I1127 16:21:13.333571 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b4ad731f-8b0b-4bd5-bf46-2f9a0a3a0b82-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "b4ad731f-8b0b-4bd5-bf46-2f9a0a3a0b82" (UID: "b4ad731f-8b0b-4bd5-bf46-2f9a0a3a0b82"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 16:21:13 crc kubenswrapper[4707]: I1127 16:21:13.337750 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b4ad731f-8b0b-4bd5-bf46-2f9a0a3a0b82-config" (OuterVolumeSpecName: "config") pod "b4ad731f-8b0b-4bd5-bf46-2f9a0a3a0b82" (UID: "b4ad731f-8b0b-4bd5-bf46-2f9a0a3a0b82"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 16:21:13 crc kubenswrapper[4707]: I1127 16:21:13.392591 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1c445b1d-7e63-48cc-83ee-c4841074701c-config-data" (OuterVolumeSpecName: "config-data") pod "1c445b1d-7e63-48cc-83ee-c4841074701c" (UID: "1c445b1d-7e63-48cc-83ee-c4841074701c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 16:21:13 crc kubenswrapper[4707]: I1127 16:21:13.407735 4707 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b4ad731f-8b0b-4bd5-bf46-2f9a0a3a0b82-config\") on node \"crc\" DevicePath \"\"" Nov 27 16:21:13 crc kubenswrapper[4707]: I1127 16:21:13.407763 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p5j7f\" (UniqueName: \"kubernetes.io/projected/1c445b1d-7e63-48cc-83ee-c4841074701c-kube-api-access-p5j7f\") on node \"crc\" DevicePath \"\"" Nov 27 16:21:13 crc kubenswrapper[4707]: I1127 16:21:13.407774 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1c445b1d-7e63-48cc-83ee-c4841074701c-config-data\") on node \"crc\" DevicePath \"\"" Nov 27 16:21:13 crc kubenswrapper[4707]: I1127 16:21:13.407783 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c445b1d-7e63-48cc-83ee-c4841074701c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 27 16:21:13 crc kubenswrapper[4707]: I1127 16:21:13.407795 4707 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b4ad731f-8b0b-4bd5-bf46-2f9a0a3a0b82-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Nov 27 16:21:13 crc kubenswrapper[4707]: I1127 16:21:13.896876 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-5k4zv" 
event={"ID":"1c445b1d-7e63-48cc-83ee-c4841074701c","Type":"ContainerDied","Data":"093212c8f9087466fb96d6056d794b2a38801f2e8dabba563d000a26d1beec2b"} Nov 27 16:21:13 crc kubenswrapper[4707]: I1127 16:21:13.896956 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="093212c8f9087466fb96d6056d794b2a38801f2e8dabba563d000a26d1beec2b" Nov 27 16:21:13 crc kubenswrapper[4707]: I1127 16:21:13.896914 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-sync-5k4zv" Nov 27 16:21:13 crc kubenswrapper[4707]: I1127 16:21:13.900322 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-69b8fb6b88-w6pxv" event={"ID":"9246eafb-e806-45a4-bc87-9a7724b7467c","Type":"ContainerStarted","Data":"6777b57e984e5a3cd69f7de999fe216d71db7d0118ad975d31ff5c1b8c26a004"} Nov 27 16:21:13 crc kubenswrapper[4707]: I1127 16:21:13.900410 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-69b8fb6b88-w6pxv" event={"ID":"9246eafb-e806-45a4-bc87-9a7724b7467c","Type":"ContainerStarted","Data":"d33f3c87917c6f0ac77de827b24ea583dedc8f0341e53cac536e7ff60a0dad31"} Nov 27 16:21:13 crc kubenswrapper[4707]: I1127 16:21:13.900535 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-69b8fb6b88-w6pxv" Nov 27 16:21:13 crc kubenswrapper[4707]: I1127 16:21:13.902717 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a49cd9fc-1364-454e-af11-bbf64e43e56d","Type":"ContainerStarted","Data":"de2c7846bfbd498d32203314411058c7f5be25226690e571b9f2b6218b7166eb"} Nov 27 16:21:13 crc kubenswrapper[4707]: I1127 16:21:13.902874 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-785d8bcb8c-cbftg" Nov 27 16:21:13 crc kubenswrapper[4707]: I1127 16:21:13.903150 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="a49cd9fc-1364-454e-af11-bbf64e43e56d" containerName="ceilometer-central-agent" containerID="cri-o://b682eef24c68d41197dc9690bb0f26acd52e3ebac8dc6099ed14141946c0dfb2" gracePeriod=30 Nov 27 16:21:13 crc kubenswrapper[4707]: I1127 16:21:13.903278 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="a49cd9fc-1364-454e-af11-bbf64e43e56d" containerName="proxy-httpd" containerID="cri-o://de2c7846bfbd498d32203314411058c7f5be25226690e571b9f2b6218b7166eb" gracePeriod=30 Nov 27 16:21:13 crc kubenswrapper[4707]: I1127 16:21:13.903365 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="a49cd9fc-1364-454e-af11-bbf64e43e56d" containerName="ceilometer-notification-agent" containerID="cri-o://350e127fe2aab6ee8a8903f23ab9c4569f76d251ed26bad9e0a532fba42eed78" gracePeriod=30 Nov 27 16:21:13 crc kubenswrapper[4707]: I1127 16:21:13.903526 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="a49cd9fc-1364-454e-af11-bbf64e43e56d" containerName="sg-core" containerID="cri-o://a9380b9e375b8057c2e3998e8fc15dec436d302837380c797778c768cdb42894" gracePeriod=30 Nov 27 16:21:13 crc kubenswrapper[4707]: I1127 16:21:13.951656 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-69b8fb6b88-w6pxv" podStartSLOduration=6.95163635 podStartE2EDuration="6.95163635s" podCreationTimestamp="2025-11-27 16:21:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-27 16:21:13.9308508 +0000 UTC m=+1049.562299568" watchObservedRunningTime="2025-11-27 
16:21:13.95163635 +0000 UTC m=+1049.583085118" Nov 27 16:21:13 crc kubenswrapper[4707]: I1127 16:21:13.965682 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-785d8bcb8c-cbftg"] Nov 27 16:21:13 crc kubenswrapper[4707]: I1127 16:21:13.976458 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-785d8bcb8c-cbftg"] Nov 27 16:21:13 crc kubenswrapper[4707]: I1127 16:21:13.982309 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.910105032 podStartE2EDuration="53.982289977s" podCreationTimestamp="2025-11-27 16:20:20 +0000 UTC" firstStartedPulling="2025-11-27 16:20:21.859712529 +0000 UTC m=+997.491161287" lastFinishedPulling="2025-11-27 16:21:12.931897464 +0000 UTC m=+1048.563346232" observedRunningTime="2025-11-27 16:21:13.972758978 +0000 UTC m=+1049.604207746" watchObservedRunningTime="2025-11-27 16:21:13.982289977 +0000 UTC m=+1049.613738745" Nov 27 16:21:14 crc kubenswrapper[4707]: I1127 16:21:14.283789 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-gb8rd" Nov 27 16:21:14 crc kubenswrapper[4707]: I1127 16:21:14.324057 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/1b8ce5ef-c2af-4e15-9677-8e878b96c4de-etc-machine-id\") pod \"1b8ce5ef-c2af-4e15-9677-8e878b96c4de\" (UID: \"1b8ce5ef-c2af-4e15-9677-8e878b96c4de\") " Nov 27 16:21:14 crc kubenswrapper[4707]: I1127 16:21:14.324111 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1b8ce5ef-c2af-4e15-9677-8e878b96c4de-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "1b8ce5ef-c2af-4e15-9677-8e878b96c4de" (UID: "1b8ce5ef-c2af-4e15-9677-8e878b96c4de"). InnerVolumeSpecName "etc-machine-id". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 27 16:21:14 crc kubenswrapper[4707]: I1127 16:21:14.324204 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1b8ce5ef-c2af-4e15-9677-8e878b96c4de-combined-ca-bundle\") pod \"1b8ce5ef-c2af-4e15-9677-8e878b96c4de\" (UID: \"1b8ce5ef-c2af-4e15-9677-8e878b96c4de\") " Nov 27 16:21:14 crc kubenswrapper[4707]: I1127 16:21:14.324288 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1b8ce5ef-c2af-4e15-9677-8e878b96c4de-config-data\") pod \"1b8ce5ef-c2af-4e15-9677-8e878b96c4de\" (UID: \"1b8ce5ef-c2af-4e15-9677-8e878b96c4de\") " Nov 27 16:21:14 crc kubenswrapper[4707]: I1127 16:21:14.324332 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/1b8ce5ef-c2af-4e15-9677-8e878b96c4de-db-sync-config-data\") pod \"1b8ce5ef-c2af-4e15-9677-8e878b96c4de\" (UID: \"1b8ce5ef-c2af-4e15-9677-8e878b96c4de\") " Nov 27 16:21:14 crc kubenswrapper[4707]: I1127 16:21:14.324354 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1b8ce5ef-c2af-4e15-9677-8e878b96c4de-scripts\") pod \"1b8ce5ef-c2af-4e15-9677-8e878b96c4de\" (UID: \"1b8ce5ef-c2af-4e15-9677-8e878b96c4de\") " Nov 27 16:21:14 crc kubenswrapper[4707]: I1127 16:21:14.324401 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ts9jh\" (UniqueName: \"kubernetes.io/projected/1b8ce5ef-c2af-4e15-9677-8e878b96c4de-kube-api-access-ts9jh\") pod \"1b8ce5ef-c2af-4e15-9677-8e878b96c4de\" (UID: \"1b8ce5ef-c2af-4e15-9677-8e878b96c4de\") " Nov 27 16:21:14 crc kubenswrapper[4707]: I1127 16:21:14.324718 4707 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: 
\"kubernetes.io/host-path/1b8ce5ef-c2af-4e15-9677-8e878b96c4de-etc-machine-id\") on node \"crc\" DevicePath \"\"" Nov 27 16:21:14 crc kubenswrapper[4707]: I1127 16:21:14.329338 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1b8ce5ef-c2af-4e15-9677-8e878b96c4de-scripts" (OuterVolumeSpecName: "scripts") pod "1b8ce5ef-c2af-4e15-9677-8e878b96c4de" (UID: "1b8ce5ef-c2af-4e15-9677-8e878b96c4de"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 16:21:14 crc kubenswrapper[4707]: I1127 16:21:14.329586 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1b8ce5ef-c2af-4e15-9677-8e878b96c4de-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "1b8ce5ef-c2af-4e15-9677-8e878b96c4de" (UID: "1b8ce5ef-c2af-4e15-9677-8e878b96c4de"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 16:21:14 crc kubenswrapper[4707]: I1127 16:21:14.334959 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1b8ce5ef-c2af-4e15-9677-8e878b96c4de-kube-api-access-ts9jh" (OuterVolumeSpecName: "kube-api-access-ts9jh") pod "1b8ce5ef-c2af-4e15-9677-8e878b96c4de" (UID: "1b8ce5ef-c2af-4e15-9677-8e878b96c4de"). InnerVolumeSpecName "kube-api-access-ts9jh". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 16:21:14 crc kubenswrapper[4707]: I1127 16:21:14.347690 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1b8ce5ef-c2af-4e15-9677-8e878b96c4de-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1b8ce5ef-c2af-4e15-9677-8e878b96c4de" (UID: "1b8ce5ef-c2af-4e15-9677-8e878b96c4de"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 16:21:14 crc kubenswrapper[4707]: I1127 16:21:14.380123 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1b8ce5ef-c2af-4e15-9677-8e878b96c4de-config-data" (OuterVolumeSpecName: "config-data") pod "1b8ce5ef-c2af-4e15-9677-8e878b96c4de" (UID: "1b8ce5ef-c2af-4e15-9677-8e878b96c4de"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 16:21:14 crc kubenswrapper[4707]: I1127 16:21:14.426341 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1b8ce5ef-c2af-4e15-9677-8e878b96c4de-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 27 16:21:14 crc kubenswrapper[4707]: I1127 16:21:14.426564 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1b8ce5ef-c2af-4e15-9677-8e878b96c4de-config-data\") on node \"crc\" DevicePath \"\"" Nov 27 16:21:14 crc kubenswrapper[4707]: I1127 16:21:14.426661 4707 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/1b8ce5ef-c2af-4e15-9677-8e878b96c4de-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Nov 27 16:21:14 crc kubenswrapper[4707]: I1127 16:21:14.426735 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1b8ce5ef-c2af-4e15-9677-8e878b96c4de-scripts\") on node \"crc\" DevicePath \"\"" Nov 27 16:21:14 crc kubenswrapper[4707]: I1127 16:21:14.426801 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ts9jh\" (UniqueName: \"kubernetes.io/projected/1b8ce5ef-c2af-4e15-9677-8e878b96c4de-kube-api-access-ts9jh\") on node \"crc\" DevicePath \"\"" Nov 27 16:21:14 crc kubenswrapper[4707]: I1127 16:21:14.933121 4707 generic.go:334] "Generic (PLEG): container finished" podID="a49cd9fc-1364-454e-af11-bbf64e43e56d" 
containerID="de2c7846bfbd498d32203314411058c7f5be25226690e571b9f2b6218b7166eb" exitCode=0 Nov 27 16:21:14 crc kubenswrapper[4707]: I1127 16:21:14.933526 4707 generic.go:334] "Generic (PLEG): container finished" podID="a49cd9fc-1364-454e-af11-bbf64e43e56d" containerID="a9380b9e375b8057c2e3998e8fc15dec436d302837380c797778c768cdb42894" exitCode=2 Nov 27 16:21:14 crc kubenswrapper[4707]: I1127 16:21:14.933194 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a49cd9fc-1364-454e-af11-bbf64e43e56d","Type":"ContainerDied","Data":"de2c7846bfbd498d32203314411058c7f5be25226690e571b9f2b6218b7166eb"} Nov 27 16:21:14 crc kubenswrapper[4707]: I1127 16:21:14.933582 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a49cd9fc-1364-454e-af11-bbf64e43e56d","Type":"ContainerDied","Data":"a9380b9e375b8057c2e3998e8fc15dec436d302837380c797778c768cdb42894"} Nov 27 16:21:14 crc kubenswrapper[4707]: I1127 16:21:14.933604 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a49cd9fc-1364-454e-af11-bbf64e43e56d","Type":"ContainerDied","Data":"b682eef24c68d41197dc9690bb0f26acd52e3ebac8dc6099ed14141946c0dfb2"} Nov 27 16:21:14 crc kubenswrapper[4707]: I1127 16:21:14.933540 4707 generic.go:334] "Generic (PLEG): container finished" podID="a49cd9fc-1364-454e-af11-bbf64e43e56d" containerID="b682eef24c68d41197dc9690bb0f26acd52e3ebac8dc6099ed14141946c0dfb2" exitCode=0 Nov 27 16:21:14 crc kubenswrapper[4707]: I1127 16:21:14.941232 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-gb8rd" Nov 27 16:21:14 crc kubenswrapper[4707]: I1127 16:21:14.941466 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-gb8rd" event={"ID":"1b8ce5ef-c2af-4e15-9677-8e878b96c4de","Type":"ContainerDied","Data":"8b3b6fef5c666be483eb6a325fc48ed52bafcabaf875b0753de1763448512eb8"} Nov 27 16:21:14 crc kubenswrapper[4707]: I1127 16:21:14.942777 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8b3b6fef5c666be483eb6a325fc48ed52bafcabaf875b0753de1763448512eb8" Nov 27 16:21:14 crc kubenswrapper[4707]: I1127 16:21:14.942811 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-69b8fb6b88-w6pxv" Nov 27 16:21:15 crc kubenswrapper[4707]: I1127 16:21:15.140588 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Nov 27 16:21:15 crc kubenswrapper[4707]: E1127 16:21:15.141034 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b4ad731f-8b0b-4bd5-bf46-2f9a0a3a0b82" containerName="init" Nov 27 16:21:15 crc kubenswrapper[4707]: I1127 16:21:15.141095 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="b4ad731f-8b0b-4bd5-bf46-2f9a0a3a0b82" containerName="init" Nov 27 16:21:15 crc kubenswrapper[4707]: E1127 16:21:15.141159 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1b8ce5ef-c2af-4e15-9677-8e878b96c4de" containerName="cinder-db-sync" Nov 27 16:21:15 crc kubenswrapper[4707]: I1127 16:21:15.141206 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="1b8ce5ef-c2af-4e15-9677-8e878b96c4de" containerName="cinder-db-sync" Nov 27 16:21:15 crc kubenswrapper[4707]: E1127 16:21:15.141271 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b4ad731f-8b0b-4bd5-bf46-2f9a0a3a0b82" containerName="dnsmasq-dns" Nov 27 16:21:15 crc kubenswrapper[4707]: I1127 16:21:15.141317 4707 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="b4ad731f-8b0b-4bd5-bf46-2f9a0a3a0b82" containerName="dnsmasq-dns" Nov 27 16:21:15 crc kubenswrapper[4707]: E1127 16:21:15.141410 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1c445b1d-7e63-48cc-83ee-c4841074701c" containerName="heat-db-sync" Nov 27 16:21:15 crc kubenswrapper[4707]: I1127 16:21:15.141458 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="1c445b1d-7e63-48cc-83ee-c4841074701c" containerName="heat-db-sync" Nov 27 16:21:15 crc kubenswrapper[4707]: I1127 16:21:15.141665 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="1b8ce5ef-c2af-4e15-9677-8e878b96c4de" containerName="cinder-db-sync" Nov 27 16:21:15 crc kubenswrapper[4707]: I1127 16:21:15.141720 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="1c445b1d-7e63-48cc-83ee-c4841074701c" containerName="heat-db-sync" Nov 27 16:21:15 crc kubenswrapper[4707]: I1127 16:21:15.141778 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="b4ad731f-8b0b-4bd5-bf46-2f9a0a3a0b82" containerName="dnsmasq-dns" Nov 27 16:21:15 crc kubenswrapper[4707]: I1127 16:21:15.142708 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Nov 27 16:21:15 crc kubenswrapper[4707]: I1127 16:21:15.144563 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Nov 27 16:21:15 crc kubenswrapper[4707]: I1127 16:21:15.154780 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Nov 27 16:21:15 crc kubenswrapper[4707]: I1127 16:21:15.154866 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Nov 27 16:21:15 crc kubenswrapper[4707]: I1127 16:21:15.155271 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-89cbx" Nov 27 16:21:15 crc kubenswrapper[4707]: I1127 16:21:15.163552 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Nov 27 16:21:15 crc kubenswrapper[4707]: I1127 16:21:15.191769 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-t8jp6"] Nov 27 16:21:15 crc kubenswrapper[4707]: I1127 16:21:15.193328 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5c9776ccc5-t8jp6" Nov 27 16:21:15 crc kubenswrapper[4707]: I1127 16:21:15.209970 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b4ad731f-8b0b-4bd5-bf46-2f9a0a3a0b82" path="/var/lib/kubelet/pods/b4ad731f-8b0b-4bd5-bf46-2f9a0a3a0b82/volumes" Nov 27 16:21:15 crc kubenswrapper[4707]: I1127 16:21:15.213621 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-t8jp6"] Nov 27 16:21:15 crc kubenswrapper[4707]: I1127 16:21:15.239533 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3bc190a4-bb7f-4b8a-acc4-e201fd1d495d-ovsdbserver-nb\") pod \"dnsmasq-dns-5c9776ccc5-t8jp6\" (UID: \"3bc190a4-bb7f-4b8a-acc4-e201fd1d495d\") " pod="openstack/dnsmasq-dns-5c9776ccc5-t8jp6" Nov 27 16:21:15 crc kubenswrapper[4707]: I1127 16:21:15.239570 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3bc190a4-bb7f-4b8a-acc4-e201fd1d495d-ovsdbserver-sb\") pod \"dnsmasq-dns-5c9776ccc5-t8jp6\" (UID: \"3bc190a4-bb7f-4b8a-acc4-e201fd1d495d\") " pod="openstack/dnsmasq-dns-5c9776ccc5-t8jp6" Nov 27 16:21:15 crc kubenswrapper[4707]: I1127 16:21:15.239595 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/13a70ad2-b029-45d8-959e-db6823a68b6a-scripts\") pod \"cinder-scheduler-0\" (UID: \"13a70ad2-b029-45d8-959e-db6823a68b6a\") " pod="openstack/cinder-scheduler-0" Nov 27 16:21:15 crc kubenswrapper[4707]: I1127 16:21:15.239620 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fkcsq\" (UniqueName: \"kubernetes.io/projected/3bc190a4-bb7f-4b8a-acc4-e201fd1d495d-kube-api-access-fkcsq\") pod \"dnsmasq-dns-5c9776ccc5-t8jp6\" 
(UID: \"3bc190a4-bb7f-4b8a-acc4-e201fd1d495d\") " pod="openstack/dnsmasq-dns-5c9776ccc5-t8jp6" Nov 27 16:21:15 crc kubenswrapper[4707]: I1127 16:21:15.239660 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3bc190a4-bb7f-4b8a-acc4-e201fd1d495d-dns-svc\") pod \"dnsmasq-dns-5c9776ccc5-t8jp6\" (UID: \"3bc190a4-bb7f-4b8a-acc4-e201fd1d495d\") " pod="openstack/dnsmasq-dns-5c9776ccc5-t8jp6" Nov 27 16:21:15 crc kubenswrapper[4707]: I1127 16:21:15.239683 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/13a70ad2-b029-45d8-959e-db6823a68b6a-config-data\") pod \"cinder-scheduler-0\" (UID: \"13a70ad2-b029-45d8-959e-db6823a68b6a\") " pod="openstack/cinder-scheduler-0" Nov 27 16:21:15 crc kubenswrapper[4707]: I1127 16:21:15.239714 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/13a70ad2-b029-45d8-959e-db6823a68b6a-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"13a70ad2-b029-45d8-959e-db6823a68b6a\") " pod="openstack/cinder-scheduler-0" Nov 27 16:21:15 crc kubenswrapper[4707]: I1127 16:21:15.239730 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3bc190a4-bb7f-4b8a-acc4-e201fd1d495d-config\") pod \"dnsmasq-dns-5c9776ccc5-t8jp6\" (UID: \"3bc190a4-bb7f-4b8a-acc4-e201fd1d495d\") " pod="openstack/dnsmasq-dns-5c9776ccc5-t8jp6" Nov 27 16:21:15 crc kubenswrapper[4707]: I1127 16:21:15.239749 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2rdjd\" (UniqueName: \"kubernetes.io/projected/13a70ad2-b029-45d8-959e-db6823a68b6a-kube-api-access-2rdjd\") pod \"cinder-scheduler-0\" (UID: 
\"13a70ad2-b029-45d8-959e-db6823a68b6a\") " pod="openstack/cinder-scheduler-0" Nov 27 16:21:15 crc kubenswrapper[4707]: I1127 16:21:15.239775 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/13a70ad2-b029-45d8-959e-db6823a68b6a-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"13a70ad2-b029-45d8-959e-db6823a68b6a\") " pod="openstack/cinder-scheduler-0" Nov 27 16:21:15 crc kubenswrapper[4707]: I1127 16:21:15.239808 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/13a70ad2-b029-45d8-959e-db6823a68b6a-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"13a70ad2-b029-45d8-959e-db6823a68b6a\") " pod="openstack/cinder-scheduler-0" Nov 27 16:21:15 crc kubenswrapper[4707]: I1127 16:21:15.239825 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/3bc190a4-bb7f-4b8a-acc4-e201fd1d495d-dns-swift-storage-0\") pod \"dnsmasq-dns-5c9776ccc5-t8jp6\" (UID: \"3bc190a4-bb7f-4b8a-acc4-e201fd1d495d\") " pod="openstack/dnsmasq-dns-5c9776ccc5-t8jp6" Nov 27 16:21:15 crc kubenswrapper[4707]: I1127 16:21:15.320938 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Nov 27 16:21:15 crc kubenswrapper[4707]: I1127 16:21:15.322283 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Nov 27 16:21:15 crc kubenswrapper[4707]: I1127 16:21:15.323675 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Nov 27 16:21:15 crc kubenswrapper[4707]: I1127 16:21:15.333285 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Nov 27 16:21:15 crc kubenswrapper[4707]: I1127 16:21:15.341036 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3bc190a4-bb7f-4b8a-acc4-e201fd1d495d-ovsdbserver-nb\") pod \"dnsmasq-dns-5c9776ccc5-t8jp6\" (UID: \"3bc190a4-bb7f-4b8a-acc4-e201fd1d495d\") " pod="openstack/dnsmasq-dns-5c9776ccc5-t8jp6" Nov 27 16:21:15 crc kubenswrapper[4707]: I1127 16:21:15.341068 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3bc190a4-bb7f-4b8a-acc4-e201fd1d495d-ovsdbserver-sb\") pod \"dnsmasq-dns-5c9776ccc5-t8jp6\" (UID: \"3bc190a4-bb7f-4b8a-acc4-e201fd1d495d\") " pod="openstack/dnsmasq-dns-5c9776ccc5-t8jp6" Nov 27 16:21:15 crc kubenswrapper[4707]: I1127 16:21:15.341097 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/13a70ad2-b029-45d8-959e-db6823a68b6a-scripts\") pod \"cinder-scheduler-0\" (UID: \"13a70ad2-b029-45d8-959e-db6823a68b6a\") " pod="openstack/cinder-scheduler-0" Nov 27 16:21:15 crc kubenswrapper[4707]: I1127 16:21:15.341124 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fkcsq\" (UniqueName: \"kubernetes.io/projected/3bc190a4-bb7f-4b8a-acc4-e201fd1d495d-kube-api-access-fkcsq\") pod \"dnsmasq-dns-5c9776ccc5-t8jp6\" (UID: \"3bc190a4-bb7f-4b8a-acc4-e201fd1d495d\") " pod="openstack/dnsmasq-dns-5c9776ccc5-t8jp6" Nov 27 16:21:15 crc kubenswrapper[4707]: I1127 16:21:15.341163 4707 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3bc190a4-bb7f-4b8a-acc4-e201fd1d495d-dns-svc\") pod \"dnsmasq-dns-5c9776ccc5-t8jp6\" (UID: \"3bc190a4-bb7f-4b8a-acc4-e201fd1d495d\") " pod="openstack/dnsmasq-dns-5c9776ccc5-t8jp6" Nov 27 16:21:15 crc kubenswrapper[4707]: I1127 16:21:15.341186 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/13a70ad2-b029-45d8-959e-db6823a68b6a-config-data\") pod \"cinder-scheduler-0\" (UID: \"13a70ad2-b029-45d8-959e-db6823a68b6a\") " pod="openstack/cinder-scheduler-0" Nov 27 16:21:15 crc kubenswrapper[4707]: I1127 16:21:15.341202 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/13a70ad2-b029-45d8-959e-db6823a68b6a-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"13a70ad2-b029-45d8-959e-db6823a68b6a\") " pod="openstack/cinder-scheduler-0" Nov 27 16:21:15 crc kubenswrapper[4707]: I1127 16:21:15.341218 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3bc190a4-bb7f-4b8a-acc4-e201fd1d495d-config\") pod \"dnsmasq-dns-5c9776ccc5-t8jp6\" (UID: \"3bc190a4-bb7f-4b8a-acc4-e201fd1d495d\") " pod="openstack/dnsmasq-dns-5c9776ccc5-t8jp6" Nov 27 16:21:15 crc kubenswrapper[4707]: I1127 16:21:15.341235 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2rdjd\" (UniqueName: \"kubernetes.io/projected/13a70ad2-b029-45d8-959e-db6823a68b6a-kube-api-access-2rdjd\") pod \"cinder-scheduler-0\" (UID: \"13a70ad2-b029-45d8-959e-db6823a68b6a\") " pod="openstack/cinder-scheduler-0" Nov 27 16:21:15 crc kubenswrapper[4707]: I1127 16:21:15.341261 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/13a70ad2-b029-45d8-959e-db6823a68b6a-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"13a70ad2-b029-45d8-959e-db6823a68b6a\") " pod="openstack/cinder-scheduler-0" Nov 27 16:21:15 crc kubenswrapper[4707]: I1127 16:21:15.341282 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/13a70ad2-b029-45d8-959e-db6823a68b6a-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"13a70ad2-b029-45d8-959e-db6823a68b6a\") " pod="openstack/cinder-scheduler-0" Nov 27 16:21:15 crc kubenswrapper[4707]: I1127 16:21:15.341299 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/3bc190a4-bb7f-4b8a-acc4-e201fd1d495d-dns-swift-storage-0\") pod \"dnsmasq-dns-5c9776ccc5-t8jp6\" (UID: \"3bc190a4-bb7f-4b8a-acc4-e201fd1d495d\") " pod="openstack/dnsmasq-dns-5c9776ccc5-t8jp6" Nov 27 16:21:15 crc kubenswrapper[4707]: I1127 16:21:15.341885 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3bc190a4-bb7f-4b8a-acc4-e201fd1d495d-ovsdbserver-nb\") pod \"dnsmasq-dns-5c9776ccc5-t8jp6\" (UID: \"3bc190a4-bb7f-4b8a-acc4-e201fd1d495d\") " pod="openstack/dnsmasq-dns-5c9776ccc5-t8jp6" Nov 27 16:21:15 crc kubenswrapper[4707]: I1127 16:21:15.342060 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3bc190a4-bb7f-4b8a-acc4-e201fd1d495d-dns-svc\") pod \"dnsmasq-dns-5c9776ccc5-t8jp6\" (UID: \"3bc190a4-bb7f-4b8a-acc4-e201fd1d495d\") " pod="openstack/dnsmasq-dns-5c9776ccc5-t8jp6" Nov 27 16:21:15 crc kubenswrapper[4707]: I1127 16:21:15.342110 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/3bc190a4-bb7f-4b8a-acc4-e201fd1d495d-dns-swift-storage-0\") pod 
\"dnsmasq-dns-5c9776ccc5-t8jp6\" (UID: \"3bc190a4-bb7f-4b8a-acc4-e201fd1d495d\") " pod="openstack/dnsmasq-dns-5c9776ccc5-t8jp6" Nov 27 16:21:15 crc kubenswrapper[4707]: I1127 16:21:15.342428 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3bc190a4-bb7f-4b8a-acc4-e201fd1d495d-config\") pod \"dnsmasq-dns-5c9776ccc5-t8jp6\" (UID: \"3bc190a4-bb7f-4b8a-acc4-e201fd1d495d\") " pod="openstack/dnsmasq-dns-5c9776ccc5-t8jp6" Nov 27 16:21:15 crc kubenswrapper[4707]: I1127 16:21:15.342564 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/13a70ad2-b029-45d8-959e-db6823a68b6a-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"13a70ad2-b029-45d8-959e-db6823a68b6a\") " pod="openstack/cinder-scheduler-0" Nov 27 16:21:15 crc kubenswrapper[4707]: I1127 16:21:15.342584 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3bc190a4-bb7f-4b8a-acc4-e201fd1d495d-ovsdbserver-sb\") pod \"dnsmasq-dns-5c9776ccc5-t8jp6\" (UID: \"3bc190a4-bb7f-4b8a-acc4-e201fd1d495d\") " pod="openstack/dnsmasq-dns-5c9776ccc5-t8jp6" Nov 27 16:21:15 crc kubenswrapper[4707]: I1127 16:21:15.345464 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/13a70ad2-b029-45d8-959e-db6823a68b6a-scripts\") pod \"cinder-scheduler-0\" (UID: \"13a70ad2-b029-45d8-959e-db6823a68b6a\") " pod="openstack/cinder-scheduler-0" Nov 27 16:21:15 crc kubenswrapper[4707]: I1127 16:21:15.345839 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/13a70ad2-b029-45d8-959e-db6823a68b6a-config-data\") pod \"cinder-scheduler-0\" (UID: \"13a70ad2-b029-45d8-959e-db6823a68b6a\") " pod="openstack/cinder-scheduler-0" Nov 27 16:21:15 crc kubenswrapper[4707]: I1127 16:21:15.346275 4707 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/13a70ad2-b029-45d8-959e-db6823a68b6a-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"13a70ad2-b029-45d8-959e-db6823a68b6a\") " pod="openstack/cinder-scheduler-0" Nov 27 16:21:15 crc kubenswrapper[4707]: I1127 16:21:15.346964 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/13a70ad2-b029-45d8-959e-db6823a68b6a-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"13a70ad2-b029-45d8-959e-db6823a68b6a\") " pod="openstack/cinder-scheduler-0" Nov 27 16:21:15 crc kubenswrapper[4707]: I1127 16:21:15.361026 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fkcsq\" (UniqueName: \"kubernetes.io/projected/3bc190a4-bb7f-4b8a-acc4-e201fd1d495d-kube-api-access-fkcsq\") pod \"dnsmasq-dns-5c9776ccc5-t8jp6\" (UID: \"3bc190a4-bb7f-4b8a-acc4-e201fd1d495d\") " pod="openstack/dnsmasq-dns-5c9776ccc5-t8jp6" Nov 27 16:21:15 crc kubenswrapper[4707]: I1127 16:21:15.363768 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2rdjd\" (UniqueName: \"kubernetes.io/projected/13a70ad2-b029-45d8-959e-db6823a68b6a-kube-api-access-2rdjd\") pod \"cinder-scheduler-0\" (UID: \"13a70ad2-b029-45d8-959e-db6823a68b6a\") " pod="openstack/cinder-scheduler-0" Nov 27 16:21:15 crc kubenswrapper[4707]: I1127 16:21:15.442284 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b2887e9f-7ff8-485d-ac06-0e06095ace94-config-data\") pod \"cinder-api-0\" (UID: \"b2887e9f-7ff8-485d-ac06-0e06095ace94\") " pod="openstack/cinder-api-0" Nov 27 16:21:15 crc kubenswrapper[4707]: I1127 16:21:15.442350 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/b2887e9f-7ff8-485d-ac06-0e06095ace94-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"b2887e9f-7ff8-485d-ac06-0e06095ace94\") " pod="openstack/cinder-api-0" Nov 27 16:21:15 crc kubenswrapper[4707]: I1127 16:21:15.442386 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b2887e9f-7ff8-485d-ac06-0e06095ace94-scripts\") pod \"cinder-api-0\" (UID: \"b2887e9f-7ff8-485d-ac06-0e06095ace94\") " pod="openstack/cinder-api-0" Nov 27 16:21:15 crc kubenswrapper[4707]: I1127 16:21:15.442509 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b2887e9f-7ff8-485d-ac06-0e06095ace94-config-data-custom\") pod \"cinder-api-0\" (UID: \"b2887e9f-7ff8-485d-ac06-0e06095ace94\") " pod="openstack/cinder-api-0" Nov 27 16:21:15 crc kubenswrapper[4707]: I1127 16:21:15.442633 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b2887e9f-7ff8-485d-ac06-0e06095ace94-logs\") pod \"cinder-api-0\" (UID: \"b2887e9f-7ff8-485d-ac06-0e06095ace94\") " pod="openstack/cinder-api-0" Nov 27 16:21:15 crc kubenswrapper[4707]: I1127 16:21:15.442665 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dqcl5\" (UniqueName: \"kubernetes.io/projected/b2887e9f-7ff8-485d-ac06-0e06095ace94-kube-api-access-dqcl5\") pod \"cinder-api-0\" (UID: \"b2887e9f-7ff8-485d-ac06-0e06095ace94\") " pod="openstack/cinder-api-0" Nov 27 16:21:15 crc kubenswrapper[4707]: I1127 16:21:15.442694 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/b2887e9f-7ff8-485d-ac06-0e06095ace94-etc-machine-id\") pod \"cinder-api-0\" (UID: 
\"b2887e9f-7ff8-485d-ac06-0e06095ace94\") " pod="openstack/cinder-api-0" Nov 27 16:21:15 crc kubenswrapper[4707]: I1127 16:21:15.458624 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Nov 27 16:21:15 crc kubenswrapper[4707]: I1127 16:21:15.516186 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c9776ccc5-t8jp6" Nov 27 16:21:15 crc kubenswrapper[4707]: I1127 16:21:15.545129 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b2887e9f-7ff8-485d-ac06-0e06095ace94-config-data\") pod \"cinder-api-0\" (UID: \"b2887e9f-7ff8-485d-ac06-0e06095ace94\") " pod="openstack/cinder-api-0" Nov 27 16:21:15 crc kubenswrapper[4707]: I1127 16:21:15.545474 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b2887e9f-7ff8-485d-ac06-0e06095ace94-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"b2887e9f-7ff8-485d-ac06-0e06095ace94\") " pod="openstack/cinder-api-0" Nov 27 16:21:15 crc kubenswrapper[4707]: I1127 16:21:15.545512 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b2887e9f-7ff8-485d-ac06-0e06095ace94-scripts\") pod \"cinder-api-0\" (UID: \"b2887e9f-7ff8-485d-ac06-0e06095ace94\") " pod="openstack/cinder-api-0" Nov 27 16:21:15 crc kubenswrapper[4707]: I1127 16:21:15.545563 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b2887e9f-7ff8-485d-ac06-0e06095ace94-config-data-custom\") pod \"cinder-api-0\" (UID: \"b2887e9f-7ff8-485d-ac06-0e06095ace94\") " pod="openstack/cinder-api-0" Nov 27 16:21:15 crc kubenswrapper[4707]: I1127 16:21:15.545668 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" 
(UniqueName: \"kubernetes.io/empty-dir/b2887e9f-7ff8-485d-ac06-0e06095ace94-logs\") pod \"cinder-api-0\" (UID: \"b2887e9f-7ff8-485d-ac06-0e06095ace94\") " pod="openstack/cinder-api-0" Nov 27 16:21:15 crc kubenswrapper[4707]: I1127 16:21:15.545714 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dqcl5\" (UniqueName: \"kubernetes.io/projected/b2887e9f-7ff8-485d-ac06-0e06095ace94-kube-api-access-dqcl5\") pod \"cinder-api-0\" (UID: \"b2887e9f-7ff8-485d-ac06-0e06095ace94\") " pod="openstack/cinder-api-0" Nov 27 16:21:15 crc kubenswrapper[4707]: I1127 16:21:15.545775 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/b2887e9f-7ff8-485d-ac06-0e06095ace94-etc-machine-id\") pod \"cinder-api-0\" (UID: \"b2887e9f-7ff8-485d-ac06-0e06095ace94\") " pod="openstack/cinder-api-0" Nov 27 16:21:15 crc kubenswrapper[4707]: I1127 16:21:15.545938 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/b2887e9f-7ff8-485d-ac06-0e06095ace94-etc-machine-id\") pod \"cinder-api-0\" (UID: \"b2887e9f-7ff8-485d-ac06-0e06095ace94\") " pod="openstack/cinder-api-0" Nov 27 16:21:15 crc kubenswrapper[4707]: I1127 16:21:15.546555 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b2887e9f-7ff8-485d-ac06-0e06095ace94-logs\") pod \"cinder-api-0\" (UID: \"b2887e9f-7ff8-485d-ac06-0e06095ace94\") " pod="openstack/cinder-api-0" Nov 27 16:21:15 crc kubenswrapper[4707]: I1127 16:21:15.556863 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b2887e9f-7ff8-485d-ac06-0e06095ace94-config-data\") pod \"cinder-api-0\" (UID: \"b2887e9f-7ff8-485d-ac06-0e06095ace94\") " pod="openstack/cinder-api-0" Nov 27 16:21:15 crc kubenswrapper[4707]: I1127 16:21:15.557298 4707 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b2887e9f-7ff8-485d-ac06-0e06095ace94-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"b2887e9f-7ff8-485d-ac06-0e06095ace94\") " pod="openstack/cinder-api-0" Nov 27 16:21:15 crc kubenswrapper[4707]: I1127 16:21:15.559246 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b2887e9f-7ff8-485d-ac06-0e06095ace94-config-data-custom\") pod \"cinder-api-0\" (UID: \"b2887e9f-7ff8-485d-ac06-0e06095ace94\") " pod="openstack/cinder-api-0" Nov 27 16:21:15 crc kubenswrapper[4707]: I1127 16:21:15.561676 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b2887e9f-7ff8-485d-ac06-0e06095ace94-scripts\") pod \"cinder-api-0\" (UID: \"b2887e9f-7ff8-485d-ac06-0e06095ace94\") " pod="openstack/cinder-api-0" Nov 27 16:21:15 crc kubenswrapper[4707]: I1127 16:21:15.568198 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dqcl5\" (UniqueName: \"kubernetes.io/projected/b2887e9f-7ff8-485d-ac06-0e06095ace94-kube-api-access-dqcl5\") pod \"cinder-api-0\" (UID: \"b2887e9f-7ff8-485d-ac06-0e06095ace94\") " pod="openstack/cinder-api-0" Nov 27 16:21:15 crc kubenswrapper[4707]: I1127 16:21:15.638436 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Nov 27 16:21:15 crc kubenswrapper[4707]: I1127 16:21:15.753484 4707 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-5c99d8d4cb-fc8zk" podUID="798010e9-c58d-45c7-a41c-d9ff9693d662" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.155:9311/healthcheck\": read tcp 10.217.0.2:48862->10.217.0.155:9311: read: connection reset by peer" Nov 27 16:21:15 crc kubenswrapper[4707]: I1127 16:21:15.753541 4707 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-5c99d8d4cb-fc8zk" podUID="798010e9-c58d-45c7-a41c-d9ff9693d662" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.155:9311/healthcheck\": read tcp 10.217.0.2:48848->10.217.0.155:9311: read: connection reset by peer" Nov 27 16:21:15 crc kubenswrapper[4707]: I1127 16:21:15.877882 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Nov 27 16:21:15 crc kubenswrapper[4707]: I1127 16:21:15.951029 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"13a70ad2-b029-45d8-959e-db6823a68b6a","Type":"ContainerStarted","Data":"76bb8540a5909815f92b6e5dd198fe761f3b685776d2680a0fc823b0dea09bcb"} Nov 27 16:21:15 crc kubenswrapper[4707]: I1127 16:21:15.953888 4707 generic.go:334] "Generic (PLEG): container finished" podID="798010e9-c58d-45c7-a41c-d9ff9693d662" containerID="c7ed28dc80289baca53a3a26c17251d17b53f53292ca56fb1090b413baeaf88c" exitCode=0 Nov 27 16:21:15 crc kubenswrapper[4707]: I1127 16:21:15.954718 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5c99d8d4cb-fc8zk" event={"ID":"798010e9-c58d-45c7-a41c-d9ff9693d662","Type":"ContainerDied","Data":"c7ed28dc80289baca53a3a26c17251d17b53f53292ca56fb1090b413baeaf88c"} Nov 27 16:21:16 crc kubenswrapper[4707]: I1127 16:21:16.023309 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/dnsmasq-dns-5c9776ccc5-t8jp6"] Nov 27 16:21:16 crc kubenswrapper[4707]: W1127 16:21:16.033213 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3bc190a4_bb7f_4b8a_acc4_e201fd1d495d.slice/crio-8f3ab2408c96ad3596755d14026d95ab1c785ed09e8d748c2a1915def79901b7 WatchSource:0}: Error finding container 8f3ab2408c96ad3596755d14026d95ab1c785ed09e8d748c2a1915def79901b7: Status 404 returned error can't find the container with id 8f3ab2408c96ad3596755d14026d95ab1c785ed09e8d748c2a1915def79901b7 Nov 27 16:21:16 crc kubenswrapper[4707]: I1127 16:21:16.101364 4707 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-785d8bcb8c-cbftg" podUID="b4ad731f-8b0b-4bd5-bf46-2f9a0a3a0b82" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.142:5353: i/o timeout" Nov 27 16:21:16 crc kubenswrapper[4707]: I1127 16:21:16.178478 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Nov 27 16:21:16 crc kubenswrapper[4707]: I1127 16:21:16.200060 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-5c99d8d4cb-fc8zk" Nov 27 16:21:16 crc kubenswrapper[4707]: I1127 16:21:16.262305 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/798010e9-c58d-45c7-a41c-d9ff9693d662-combined-ca-bundle\") pod \"798010e9-c58d-45c7-a41c-d9ff9693d662\" (UID: \"798010e9-c58d-45c7-a41c-d9ff9693d662\") " Nov 27 16:21:16 crc kubenswrapper[4707]: I1127 16:21:16.262439 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/798010e9-c58d-45c7-a41c-d9ff9693d662-config-data-custom\") pod \"798010e9-c58d-45c7-a41c-d9ff9693d662\" (UID: \"798010e9-c58d-45c7-a41c-d9ff9693d662\") " Nov 27 16:21:16 crc kubenswrapper[4707]: I1127 16:21:16.262508 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pzw4x\" (UniqueName: \"kubernetes.io/projected/798010e9-c58d-45c7-a41c-d9ff9693d662-kube-api-access-pzw4x\") pod \"798010e9-c58d-45c7-a41c-d9ff9693d662\" (UID: \"798010e9-c58d-45c7-a41c-d9ff9693d662\") " Nov 27 16:21:16 crc kubenswrapper[4707]: I1127 16:21:16.263047 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/798010e9-c58d-45c7-a41c-d9ff9693d662-logs\") pod \"798010e9-c58d-45c7-a41c-d9ff9693d662\" (UID: \"798010e9-c58d-45c7-a41c-d9ff9693d662\") " Nov 27 16:21:16 crc kubenswrapper[4707]: I1127 16:21:16.263094 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/798010e9-c58d-45c7-a41c-d9ff9693d662-config-data\") pod \"798010e9-c58d-45c7-a41c-d9ff9693d662\" (UID: \"798010e9-c58d-45c7-a41c-d9ff9693d662\") " Nov 27 16:21:16 crc kubenswrapper[4707]: I1127 16:21:16.264153 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/798010e9-c58d-45c7-a41c-d9ff9693d662-logs" (OuterVolumeSpecName: "logs") pod "798010e9-c58d-45c7-a41c-d9ff9693d662" (UID: "798010e9-c58d-45c7-a41c-d9ff9693d662"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 27 16:21:16 crc kubenswrapper[4707]: I1127 16:21:16.267084 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/798010e9-c58d-45c7-a41c-d9ff9693d662-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "798010e9-c58d-45c7-a41c-d9ff9693d662" (UID: "798010e9-c58d-45c7-a41c-d9ff9693d662"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 16:21:16 crc kubenswrapper[4707]: I1127 16:21:16.270475 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/798010e9-c58d-45c7-a41c-d9ff9693d662-kube-api-access-pzw4x" (OuterVolumeSpecName: "kube-api-access-pzw4x") pod "798010e9-c58d-45c7-a41c-d9ff9693d662" (UID: "798010e9-c58d-45c7-a41c-d9ff9693d662"). InnerVolumeSpecName "kube-api-access-pzw4x". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 16:21:16 crc kubenswrapper[4707]: I1127 16:21:16.285923 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/798010e9-c58d-45c7-a41c-d9ff9693d662-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "798010e9-c58d-45c7-a41c-d9ff9693d662" (UID: "798010e9-c58d-45c7-a41c-d9ff9693d662"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 16:21:16 crc kubenswrapper[4707]: I1127 16:21:16.319052 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/798010e9-c58d-45c7-a41c-d9ff9693d662-config-data" (OuterVolumeSpecName: "config-data") pod "798010e9-c58d-45c7-a41c-d9ff9693d662" (UID: "798010e9-c58d-45c7-a41c-d9ff9693d662"). 
InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 16:21:16 crc kubenswrapper[4707]: I1127 16:21:16.366023 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/798010e9-c58d-45c7-a41c-d9ff9693d662-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 27 16:21:16 crc kubenswrapper[4707]: I1127 16:21:16.366341 4707 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/798010e9-c58d-45c7-a41c-d9ff9693d662-config-data-custom\") on node \"crc\" DevicePath \"\"" Nov 27 16:21:16 crc kubenswrapper[4707]: I1127 16:21:16.366351 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pzw4x\" (UniqueName: \"kubernetes.io/projected/798010e9-c58d-45c7-a41c-d9ff9693d662-kube-api-access-pzw4x\") on node \"crc\" DevicePath \"\"" Nov 27 16:21:16 crc kubenswrapper[4707]: I1127 16:21:16.366363 4707 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/798010e9-c58d-45c7-a41c-d9ff9693d662-logs\") on node \"crc\" DevicePath \"\"" Nov 27 16:21:16 crc kubenswrapper[4707]: I1127 16:21:16.366383 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/798010e9-c58d-45c7-a41c-d9ff9693d662-config-data\") on node \"crc\" DevicePath \"\"" Nov 27 16:21:16 crc kubenswrapper[4707]: I1127 16:21:16.593427 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Nov 27 16:21:16 crc kubenswrapper[4707]: I1127 16:21:16.672389 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-spsnj\" (UniqueName: \"kubernetes.io/projected/a49cd9fc-1364-454e-af11-bbf64e43e56d-kube-api-access-spsnj\") pod \"a49cd9fc-1364-454e-af11-bbf64e43e56d\" (UID: \"a49cd9fc-1364-454e-af11-bbf64e43e56d\") " Nov 27 16:21:16 crc kubenswrapper[4707]: I1127 16:21:16.672430 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a49cd9fc-1364-454e-af11-bbf64e43e56d-config-data\") pod \"a49cd9fc-1364-454e-af11-bbf64e43e56d\" (UID: \"a49cd9fc-1364-454e-af11-bbf64e43e56d\") " Nov 27 16:21:16 crc kubenswrapper[4707]: I1127 16:21:16.672507 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a49cd9fc-1364-454e-af11-bbf64e43e56d-run-httpd\") pod \"a49cd9fc-1364-454e-af11-bbf64e43e56d\" (UID: \"a49cd9fc-1364-454e-af11-bbf64e43e56d\") " Nov 27 16:21:16 crc kubenswrapper[4707]: I1127 16:21:16.672757 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a49cd9fc-1364-454e-af11-bbf64e43e56d-combined-ca-bundle\") pod \"a49cd9fc-1364-454e-af11-bbf64e43e56d\" (UID: \"a49cd9fc-1364-454e-af11-bbf64e43e56d\") " Nov 27 16:21:16 crc kubenswrapper[4707]: I1127 16:21:16.672791 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a49cd9fc-1364-454e-af11-bbf64e43e56d-scripts\") pod \"a49cd9fc-1364-454e-af11-bbf64e43e56d\" (UID: \"a49cd9fc-1364-454e-af11-bbf64e43e56d\") " Nov 27 16:21:16 crc kubenswrapper[4707]: I1127 16:21:16.672808 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: 
\"kubernetes.io/secret/a49cd9fc-1364-454e-af11-bbf64e43e56d-sg-core-conf-yaml\") pod \"a49cd9fc-1364-454e-af11-bbf64e43e56d\" (UID: \"a49cd9fc-1364-454e-af11-bbf64e43e56d\") " Nov 27 16:21:16 crc kubenswrapper[4707]: I1127 16:21:16.672860 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a49cd9fc-1364-454e-af11-bbf64e43e56d-log-httpd\") pod \"a49cd9fc-1364-454e-af11-bbf64e43e56d\" (UID: \"a49cd9fc-1364-454e-af11-bbf64e43e56d\") " Nov 27 16:21:16 crc kubenswrapper[4707]: I1127 16:21:16.672979 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a49cd9fc-1364-454e-af11-bbf64e43e56d-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "a49cd9fc-1364-454e-af11-bbf64e43e56d" (UID: "a49cd9fc-1364-454e-af11-bbf64e43e56d"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 27 16:21:16 crc kubenswrapper[4707]: I1127 16:21:16.673294 4707 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a49cd9fc-1364-454e-af11-bbf64e43e56d-run-httpd\") on node \"crc\" DevicePath \"\"" Nov 27 16:21:16 crc kubenswrapper[4707]: I1127 16:21:16.673668 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a49cd9fc-1364-454e-af11-bbf64e43e56d-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "a49cd9fc-1364-454e-af11-bbf64e43e56d" (UID: "a49cd9fc-1364-454e-af11-bbf64e43e56d"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 27 16:21:16 crc kubenswrapper[4707]: I1127 16:21:16.677192 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a49cd9fc-1364-454e-af11-bbf64e43e56d-scripts" (OuterVolumeSpecName: "scripts") pod "a49cd9fc-1364-454e-af11-bbf64e43e56d" (UID: "a49cd9fc-1364-454e-af11-bbf64e43e56d"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 16:21:16 crc kubenswrapper[4707]: I1127 16:21:16.677701 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a49cd9fc-1364-454e-af11-bbf64e43e56d-kube-api-access-spsnj" (OuterVolumeSpecName: "kube-api-access-spsnj") pod "a49cd9fc-1364-454e-af11-bbf64e43e56d" (UID: "a49cd9fc-1364-454e-af11-bbf64e43e56d"). InnerVolumeSpecName "kube-api-access-spsnj". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 16:21:16 crc kubenswrapper[4707]: I1127 16:21:16.703420 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a49cd9fc-1364-454e-af11-bbf64e43e56d-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "a49cd9fc-1364-454e-af11-bbf64e43e56d" (UID: "a49cd9fc-1364-454e-af11-bbf64e43e56d"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 16:21:16 crc kubenswrapper[4707]: I1127 16:21:16.775391 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-spsnj\" (UniqueName: \"kubernetes.io/projected/a49cd9fc-1364-454e-af11-bbf64e43e56d-kube-api-access-spsnj\") on node \"crc\" DevicePath \"\"" Nov 27 16:21:16 crc kubenswrapper[4707]: I1127 16:21:16.775694 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a49cd9fc-1364-454e-af11-bbf64e43e56d-scripts\") on node \"crc\" DevicePath \"\"" Nov 27 16:21:16 crc kubenswrapper[4707]: I1127 16:21:16.776586 4707 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a49cd9fc-1364-454e-af11-bbf64e43e56d-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Nov 27 16:21:16 crc kubenswrapper[4707]: I1127 16:21:16.776649 4707 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a49cd9fc-1364-454e-af11-bbf64e43e56d-log-httpd\") on node 
\"crc\" DevicePath \"\"" Nov 27 16:21:16 crc kubenswrapper[4707]: I1127 16:21:16.799969 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a49cd9fc-1364-454e-af11-bbf64e43e56d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a49cd9fc-1364-454e-af11-bbf64e43e56d" (UID: "a49cd9fc-1364-454e-af11-bbf64e43e56d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 16:21:16 crc kubenswrapper[4707]: I1127 16:21:16.852846 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a49cd9fc-1364-454e-af11-bbf64e43e56d-config-data" (OuterVolumeSpecName: "config-data") pod "a49cd9fc-1364-454e-af11-bbf64e43e56d" (UID: "a49cd9fc-1364-454e-af11-bbf64e43e56d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 16:21:16 crc kubenswrapper[4707]: I1127 16:21:16.877760 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a49cd9fc-1364-454e-af11-bbf64e43e56d-config-data\") on node \"crc\" DevicePath \"\"" Nov 27 16:21:16 crc kubenswrapper[4707]: I1127 16:21:16.877787 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a49cd9fc-1364-454e-af11-bbf64e43e56d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 27 16:21:16 crc kubenswrapper[4707]: I1127 16:21:16.982109 4707 generic.go:334] "Generic (PLEG): container finished" podID="3bc190a4-bb7f-4b8a-acc4-e201fd1d495d" containerID="055310bb79129fbd5f9e58df2876727ead3143c79299861fb6c74bacf0d6a460" exitCode=0 Nov 27 16:21:16 crc kubenswrapper[4707]: I1127 16:21:16.982171 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c9776ccc5-t8jp6" event={"ID":"3bc190a4-bb7f-4b8a-acc4-e201fd1d495d","Type":"ContainerDied","Data":"055310bb79129fbd5f9e58df2876727ead3143c79299861fb6c74bacf0d6a460"} 
Nov 27 16:21:16 crc kubenswrapper[4707]: I1127 16:21:16.982198 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c9776ccc5-t8jp6" event={"ID":"3bc190a4-bb7f-4b8a-acc4-e201fd1d495d","Type":"ContainerStarted","Data":"8f3ab2408c96ad3596755d14026d95ab1c785ed09e8d748c2a1915def79901b7"} Nov 27 16:21:16 crc kubenswrapper[4707]: I1127 16:21:16.986966 4707 generic.go:334] "Generic (PLEG): container finished" podID="a49cd9fc-1364-454e-af11-bbf64e43e56d" containerID="350e127fe2aab6ee8a8903f23ab9c4569f76d251ed26bad9e0a532fba42eed78" exitCode=0 Nov 27 16:21:16 crc kubenswrapper[4707]: I1127 16:21:16.987008 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a49cd9fc-1364-454e-af11-bbf64e43e56d","Type":"ContainerDied","Data":"350e127fe2aab6ee8a8903f23ab9c4569f76d251ed26bad9e0a532fba42eed78"} Nov 27 16:21:16 crc kubenswrapper[4707]: I1127 16:21:16.987027 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a49cd9fc-1364-454e-af11-bbf64e43e56d","Type":"ContainerDied","Data":"c05d454925458f5498885f54d66c0dac45a32892f7880cdb10728ab6946c6a89"} Nov 27 16:21:16 crc kubenswrapper[4707]: I1127 16:21:16.987043 4707 scope.go:117] "RemoveContainer" containerID="de2c7846bfbd498d32203314411058c7f5be25226690e571b9f2b6218b7166eb" Nov 27 16:21:16 crc kubenswrapper[4707]: I1127 16:21:16.987132 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Nov 27 16:21:16 crc kubenswrapper[4707]: I1127 16:21:16.989814 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5c99d8d4cb-fc8zk" event={"ID":"798010e9-c58d-45c7-a41c-d9ff9693d662","Type":"ContainerDied","Data":"2411abe459a06828785c413e84f05046300c64488e1e857de505325041c1c31f"} Nov 27 16:21:16 crc kubenswrapper[4707]: I1127 16:21:16.989927 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-5c99d8d4cb-fc8zk" Nov 27 16:21:17 crc kubenswrapper[4707]: I1127 16:21:17.006226 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"b2887e9f-7ff8-485d-ac06-0e06095ace94","Type":"ContainerStarted","Data":"def29cbbe09d73f0f37e132863e13b12364202633606c39668a689c98190912a"} Nov 27 16:21:17 crc kubenswrapper[4707]: I1127 16:21:17.006265 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"b2887e9f-7ff8-485d-ac06-0e06095ace94","Type":"ContainerStarted","Data":"7c21a937d463a3de9386057773d078ffeea55a5f843ec7854c2ef8315be8a692"} Nov 27 16:21:17 crc kubenswrapper[4707]: I1127 16:21:17.114966 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Nov 27 16:21:17 crc kubenswrapper[4707]: I1127 16:21:17.124986 4707 scope.go:117] "RemoveContainer" containerID="a9380b9e375b8057c2e3998e8fc15dec436d302837380c797778c768cdb42894" Nov 27 16:21:17 crc kubenswrapper[4707]: I1127 16:21:17.132772 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Nov 27 16:21:17 crc kubenswrapper[4707]: I1127 16:21:17.169722 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-5c99d8d4cb-fc8zk"] Nov 27 16:21:17 crc kubenswrapper[4707]: I1127 16:21:17.183826 4707 scope.go:117] "RemoveContainer" containerID="350e127fe2aab6ee8a8903f23ab9c4569f76d251ed26bad9e0a532fba42eed78" Nov 27 16:21:17 crc kubenswrapper[4707]: I1127 16:21:17.185567 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-5c99d8d4cb-fc8zk"] Nov 27 16:21:17 crc kubenswrapper[4707]: I1127 16:21:17.208420 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="798010e9-c58d-45c7-a41c-d9ff9693d662" path="/var/lib/kubelet/pods/798010e9-c58d-45c7-a41c-d9ff9693d662/volumes" Nov 27 16:21:17 crc kubenswrapper[4707]: I1127 16:21:17.209176 4707 kubelet_volumes.go:163] 
"Cleaned up orphaned pod volumes dir" podUID="a49cd9fc-1364-454e-af11-bbf64e43e56d" path="/var/lib/kubelet/pods/a49cd9fc-1364-454e-af11-bbf64e43e56d/volumes" Nov 27 16:21:17 crc kubenswrapper[4707]: I1127 16:21:17.209985 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Nov 27 16:21:17 crc kubenswrapper[4707]: E1127 16:21:17.210591 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a49cd9fc-1364-454e-af11-bbf64e43e56d" containerName="sg-core" Nov 27 16:21:17 crc kubenswrapper[4707]: I1127 16:21:17.210610 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="a49cd9fc-1364-454e-af11-bbf64e43e56d" containerName="sg-core" Nov 27 16:21:17 crc kubenswrapper[4707]: E1127 16:21:17.210630 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a49cd9fc-1364-454e-af11-bbf64e43e56d" containerName="ceilometer-notification-agent" Nov 27 16:21:17 crc kubenswrapper[4707]: I1127 16:21:17.210655 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="a49cd9fc-1364-454e-af11-bbf64e43e56d" containerName="ceilometer-notification-agent" Nov 27 16:21:17 crc kubenswrapper[4707]: E1127 16:21:17.210672 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="798010e9-c58d-45c7-a41c-d9ff9693d662" containerName="barbican-api" Nov 27 16:21:17 crc kubenswrapper[4707]: I1127 16:21:17.210677 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="798010e9-c58d-45c7-a41c-d9ff9693d662" containerName="barbican-api" Nov 27 16:21:17 crc kubenswrapper[4707]: E1127 16:21:17.210697 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a49cd9fc-1364-454e-af11-bbf64e43e56d" containerName="proxy-httpd" Nov 27 16:21:17 crc kubenswrapper[4707]: I1127 16:21:17.210704 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="a49cd9fc-1364-454e-af11-bbf64e43e56d" containerName="proxy-httpd" Nov 27 16:21:17 crc kubenswrapper[4707]: E1127 16:21:17.210714 4707 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="a49cd9fc-1364-454e-af11-bbf64e43e56d" containerName="ceilometer-central-agent" Nov 27 16:21:17 crc kubenswrapper[4707]: I1127 16:21:17.210720 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="a49cd9fc-1364-454e-af11-bbf64e43e56d" containerName="ceilometer-central-agent" Nov 27 16:21:17 crc kubenswrapper[4707]: E1127 16:21:17.210728 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="798010e9-c58d-45c7-a41c-d9ff9693d662" containerName="barbican-api-log" Nov 27 16:21:17 crc kubenswrapper[4707]: I1127 16:21:17.210733 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="798010e9-c58d-45c7-a41c-d9ff9693d662" containerName="barbican-api-log" Nov 27 16:21:17 crc kubenswrapper[4707]: I1127 16:21:17.211000 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="a49cd9fc-1364-454e-af11-bbf64e43e56d" containerName="ceilometer-central-agent" Nov 27 16:21:17 crc kubenswrapper[4707]: I1127 16:21:17.211021 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="798010e9-c58d-45c7-a41c-d9ff9693d662" containerName="barbican-api" Nov 27 16:21:17 crc kubenswrapper[4707]: I1127 16:21:17.211032 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="a49cd9fc-1364-454e-af11-bbf64e43e56d" containerName="proxy-httpd" Nov 27 16:21:17 crc kubenswrapper[4707]: I1127 16:21:17.211045 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="a49cd9fc-1364-454e-af11-bbf64e43e56d" containerName="ceilometer-notification-agent" Nov 27 16:21:17 crc kubenswrapper[4707]: I1127 16:21:17.211058 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="a49cd9fc-1364-454e-af11-bbf64e43e56d" containerName="sg-core" Nov 27 16:21:17 crc kubenswrapper[4707]: I1127 16:21:17.211069 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="798010e9-c58d-45c7-a41c-d9ff9693d662" containerName="barbican-api-log" Nov 27 16:21:17 crc kubenswrapper[4707]: I1127 16:21:17.213415 
4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Nov 27 16:21:17 crc kubenswrapper[4707]: I1127 16:21:17.213502 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Nov 27 16:21:17 crc kubenswrapper[4707]: I1127 16:21:17.216466 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Nov 27 16:21:17 crc kubenswrapper[4707]: I1127 16:21:17.216552 4707 scope.go:117] "RemoveContainer" containerID="b682eef24c68d41197dc9690bb0f26acd52e3ebac8dc6099ed14141946c0dfb2" Nov 27 16:21:17 crc kubenswrapper[4707]: I1127 16:21:17.217559 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Nov 27 16:21:17 crc kubenswrapper[4707]: I1127 16:21:17.241667 4707 scope.go:117] "RemoveContainer" containerID="de2c7846bfbd498d32203314411058c7f5be25226690e571b9f2b6218b7166eb" Nov 27 16:21:17 crc kubenswrapper[4707]: E1127 16:21:17.242014 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"de2c7846bfbd498d32203314411058c7f5be25226690e571b9f2b6218b7166eb\": container with ID starting with de2c7846bfbd498d32203314411058c7f5be25226690e571b9f2b6218b7166eb not found: ID does not exist" containerID="de2c7846bfbd498d32203314411058c7f5be25226690e571b9f2b6218b7166eb" Nov 27 16:21:17 crc kubenswrapper[4707]: I1127 16:21:17.242054 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"de2c7846bfbd498d32203314411058c7f5be25226690e571b9f2b6218b7166eb"} err="failed to get container status \"de2c7846bfbd498d32203314411058c7f5be25226690e571b9f2b6218b7166eb\": rpc error: code = NotFound desc = could not find container \"de2c7846bfbd498d32203314411058c7f5be25226690e571b9f2b6218b7166eb\": container with ID starting with de2c7846bfbd498d32203314411058c7f5be25226690e571b9f2b6218b7166eb not found: ID does not exist" 
Nov 27 16:21:17 crc kubenswrapper[4707]: I1127 16:21:17.242082 4707 scope.go:117] "RemoveContainer" containerID="a9380b9e375b8057c2e3998e8fc15dec436d302837380c797778c768cdb42894" Nov 27 16:21:17 crc kubenswrapper[4707]: E1127 16:21:17.242908 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a9380b9e375b8057c2e3998e8fc15dec436d302837380c797778c768cdb42894\": container with ID starting with a9380b9e375b8057c2e3998e8fc15dec436d302837380c797778c768cdb42894 not found: ID does not exist" containerID="a9380b9e375b8057c2e3998e8fc15dec436d302837380c797778c768cdb42894" Nov 27 16:21:17 crc kubenswrapper[4707]: I1127 16:21:17.242971 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a9380b9e375b8057c2e3998e8fc15dec436d302837380c797778c768cdb42894"} err="failed to get container status \"a9380b9e375b8057c2e3998e8fc15dec436d302837380c797778c768cdb42894\": rpc error: code = NotFound desc = could not find container \"a9380b9e375b8057c2e3998e8fc15dec436d302837380c797778c768cdb42894\": container with ID starting with a9380b9e375b8057c2e3998e8fc15dec436d302837380c797778c768cdb42894 not found: ID does not exist" Nov 27 16:21:17 crc kubenswrapper[4707]: I1127 16:21:17.243003 4707 scope.go:117] "RemoveContainer" containerID="350e127fe2aab6ee8a8903f23ab9c4569f76d251ed26bad9e0a532fba42eed78" Nov 27 16:21:17 crc kubenswrapper[4707]: E1127 16:21:17.243330 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"350e127fe2aab6ee8a8903f23ab9c4569f76d251ed26bad9e0a532fba42eed78\": container with ID starting with 350e127fe2aab6ee8a8903f23ab9c4569f76d251ed26bad9e0a532fba42eed78 not found: ID does not exist" containerID="350e127fe2aab6ee8a8903f23ab9c4569f76d251ed26bad9e0a532fba42eed78" Nov 27 16:21:17 crc kubenswrapper[4707]: I1127 16:21:17.243351 4707 pod_container_deletor.go:53] "DeleteContainer 
returned error" containerID={"Type":"cri-o","ID":"350e127fe2aab6ee8a8903f23ab9c4569f76d251ed26bad9e0a532fba42eed78"} err="failed to get container status \"350e127fe2aab6ee8a8903f23ab9c4569f76d251ed26bad9e0a532fba42eed78\": rpc error: code = NotFound desc = could not find container \"350e127fe2aab6ee8a8903f23ab9c4569f76d251ed26bad9e0a532fba42eed78\": container with ID starting with 350e127fe2aab6ee8a8903f23ab9c4569f76d251ed26bad9e0a532fba42eed78 not found: ID does not exist" Nov 27 16:21:17 crc kubenswrapper[4707]: I1127 16:21:17.243385 4707 scope.go:117] "RemoveContainer" containerID="b682eef24c68d41197dc9690bb0f26acd52e3ebac8dc6099ed14141946c0dfb2" Nov 27 16:21:17 crc kubenswrapper[4707]: E1127 16:21:17.243830 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b682eef24c68d41197dc9690bb0f26acd52e3ebac8dc6099ed14141946c0dfb2\": container with ID starting with b682eef24c68d41197dc9690bb0f26acd52e3ebac8dc6099ed14141946c0dfb2 not found: ID does not exist" containerID="b682eef24c68d41197dc9690bb0f26acd52e3ebac8dc6099ed14141946c0dfb2" Nov 27 16:21:17 crc kubenswrapper[4707]: I1127 16:21:17.243856 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b682eef24c68d41197dc9690bb0f26acd52e3ebac8dc6099ed14141946c0dfb2"} err="failed to get container status \"b682eef24c68d41197dc9690bb0f26acd52e3ebac8dc6099ed14141946c0dfb2\": rpc error: code = NotFound desc = could not find container \"b682eef24c68d41197dc9690bb0f26acd52e3ebac8dc6099ed14141946c0dfb2\": container with ID starting with b682eef24c68d41197dc9690bb0f26acd52e3ebac8dc6099ed14141946c0dfb2 not found: ID does not exist" Nov 27 16:21:17 crc kubenswrapper[4707]: I1127 16:21:17.243871 4707 scope.go:117] "RemoveContainer" containerID="c7ed28dc80289baca53a3a26c17251d17b53f53292ca56fb1090b413baeaf88c" Nov 27 16:21:17 crc kubenswrapper[4707]: I1127 16:21:17.278639 4707 scope.go:117] "RemoveContainer" 
containerID="dbb38cca291dad639ab23c03c60a193e2677486cb8d469ea4bc276fd629ebdc8" Nov 27 16:21:17 crc kubenswrapper[4707]: I1127 16:21:17.286280 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/20cd323d-9f2e-472f-8c75-9b413bbdb303-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"20cd323d-9f2e-472f-8c75-9b413bbdb303\") " pod="openstack/ceilometer-0" Nov 27 16:21:17 crc kubenswrapper[4707]: I1127 16:21:17.286311 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kcbx4\" (UniqueName: \"kubernetes.io/projected/20cd323d-9f2e-472f-8c75-9b413bbdb303-kube-api-access-kcbx4\") pod \"ceilometer-0\" (UID: \"20cd323d-9f2e-472f-8c75-9b413bbdb303\") " pod="openstack/ceilometer-0" Nov 27 16:21:17 crc kubenswrapper[4707]: I1127 16:21:17.286341 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/20cd323d-9f2e-472f-8c75-9b413bbdb303-run-httpd\") pod \"ceilometer-0\" (UID: \"20cd323d-9f2e-472f-8c75-9b413bbdb303\") " pod="openstack/ceilometer-0" Nov 27 16:21:17 crc kubenswrapper[4707]: I1127 16:21:17.286382 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/20cd323d-9f2e-472f-8c75-9b413bbdb303-log-httpd\") pod \"ceilometer-0\" (UID: \"20cd323d-9f2e-472f-8c75-9b413bbdb303\") " pod="openstack/ceilometer-0" Nov 27 16:21:17 crc kubenswrapper[4707]: I1127 16:21:17.286402 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/20cd323d-9f2e-472f-8c75-9b413bbdb303-scripts\") pod \"ceilometer-0\" (UID: \"20cd323d-9f2e-472f-8c75-9b413bbdb303\") " pod="openstack/ceilometer-0" Nov 27 16:21:17 crc kubenswrapper[4707]: I1127 
16:21:17.286444 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/20cd323d-9f2e-472f-8c75-9b413bbdb303-config-data\") pod \"ceilometer-0\" (UID: \"20cd323d-9f2e-472f-8c75-9b413bbdb303\") " pod="openstack/ceilometer-0" Nov 27 16:21:17 crc kubenswrapper[4707]: I1127 16:21:17.286640 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/20cd323d-9f2e-472f-8c75-9b413bbdb303-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"20cd323d-9f2e-472f-8c75-9b413bbdb303\") " pod="openstack/ceilometer-0" Nov 27 16:21:17 crc kubenswrapper[4707]: I1127 16:21:17.390408 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/20cd323d-9f2e-472f-8c75-9b413bbdb303-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"20cd323d-9f2e-472f-8c75-9b413bbdb303\") " pod="openstack/ceilometer-0" Nov 27 16:21:17 crc kubenswrapper[4707]: I1127 16:21:17.390460 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kcbx4\" (UniqueName: \"kubernetes.io/projected/20cd323d-9f2e-472f-8c75-9b413bbdb303-kube-api-access-kcbx4\") pod \"ceilometer-0\" (UID: \"20cd323d-9f2e-472f-8c75-9b413bbdb303\") " pod="openstack/ceilometer-0" Nov 27 16:21:17 crc kubenswrapper[4707]: I1127 16:21:17.390502 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/20cd323d-9f2e-472f-8c75-9b413bbdb303-run-httpd\") pod \"ceilometer-0\" (UID: \"20cd323d-9f2e-472f-8c75-9b413bbdb303\") " pod="openstack/ceilometer-0" Nov 27 16:21:17 crc kubenswrapper[4707]: I1127 16:21:17.390547 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/20cd323d-9f2e-472f-8c75-9b413bbdb303-log-httpd\") pod \"ceilometer-0\" (UID: \"20cd323d-9f2e-472f-8c75-9b413bbdb303\") " pod="openstack/ceilometer-0" Nov 27 16:21:17 crc kubenswrapper[4707]: I1127 16:21:17.390572 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/20cd323d-9f2e-472f-8c75-9b413bbdb303-scripts\") pod \"ceilometer-0\" (UID: \"20cd323d-9f2e-472f-8c75-9b413bbdb303\") " pod="openstack/ceilometer-0" Nov 27 16:21:17 crc kubenswrapper[4707]: I1127 16:21:17.390607 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/20cd323d-9f2e-472f-8c75-9b413bbdb303-config-data\") pod \"ceilometer-0\" (UID: \"20cd323d-9f2e-472f-8c75-9b413bbdb303\") " pod="openstack/ceilometer-0" Nov 27 16:21:17 crc kubenswrapper[4707]: I1127 16:21:17.390697 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/20cd323d-9f2e-472f-8c75-9b413bbdb303-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"20cd323d-9f2e-472f-8c75-9b413bbdb303\") " pod="openstack/ceilometer-0" Nov 27 16:21:17 crc kubenswrapper[4707]: I1127 16:21:17.391661 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/20cd323d-9f2e-472f-8c75-9b413bbdb303-log-httpd\") pod \"ceilometer-0\" (UID: \"20cd323d-9f2e-472f-8c75-9b413bbdb303\") " pod="openstack/ceilometer-0" Nov 27 16:21:17 crc kubenswrapper[4707]: I1127 16:21:17.392732 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/20cd323d-9f2e-472f-8c75-9b413bbdb303-run-httpd\") pod \"ceilometer-0\" (UID: \"20cd323d-9f2e-472f-8c75-9b413bbdb303\") " pod="openstack/ceilometer-0" Nov 27 16:21:17 crc kubenswrapper[4707]: I1127 16:21:17.394160 4707 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/20cd323d-9f2e-472f-8c75-9b413bbdb303-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"20cd323d-9f2e-472f-8c75-9b413bbdb303\") " pod="openstack/ceilometer-0" Nov 27 16:21:17 crc kubenswrapper[4707]: I1127 16:21:17.395197 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/20cd323d-9f2e-472f-8c75-9b413bbdb303-scripts\") pod \"ceilometer-0\" (UID: \"20cd323d-9f2e-472f-8c75-9b413bbdb303\") " pod="openstack/ceilometer-0" Nov 27 16:21:17 crc kubenswrapper[4707]: I1127 16:21:17.399036 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/20cd323d-9f2e-472f-8c75-9b413bbdb303-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"20cd323d-9f2e-472f-8c75-9b413bbdb303\") " pod="openstack/ceilometer-0" Nov 27 16:21:17 crc kubenswrapper[4707]: I1127 16:21:17.403259 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/20cd323d-9f2e-472f-8c75-9b413bbdb303-config-data\") pod \"ceilometer-0\" (UID: \"20cd323d-9f2e-472f-8c75-9b413bbdb303\") " pod="openstack/ceilometer-0" Nov 27 16:21:17 crc kubenswrapper[4707]: I1127 16:21:17.408835 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kcbx4\" (UniqueName: \"kubernetes.io/projected/20cd323d-9f2e-472f-8c75-9b413bbdb303-kube-api-access-kcbx4\") pod \"ceilometer-0\" (UID: \"20cd323d-9f2e-472f-8c75-9b413bbdb303\") " pod="openstack/ceilometer-0" Nov 27 16:21:17 crc kubenswrapper[4707]: I1127 16:21:17.533525 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Nov 27 16:21:17 crc kubenswrapper[4707]: I1127 16:21:17.967996 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Nov 27 16:21:18 crc kubenswrapper[4707]: I1127 16:21:18.014287 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c9776ccc5-t8jp6" event={"ID":"3bc190a4-bb7f-4b8a-acc4-e201fd1d495d","Type":"ContainerStarted","Data":"622e5ed3f0f124984e4d55cbfaa4e258b2bbb08221ab94c92cddb1927c240800"} Nov 27 16:21:18 crc kubenswrapper[4707]: I1127 16:21:18.014418 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5c9776ccc5-t8jp6" Nov 27 16:21:18 crc kubenswrapper[4707]: I1127 16:21:18.017747 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"20cd323d-9f2e-472f-8c75-9b413bbdb303","Type":"ContainerStarted","Data":"147d1755f0e7d809d92afe9883261d09ff79b624a6e4a02aacdcb813b16d7957"} Nov 27 16:21:18 crc kubenswrapper[4707]: I1127 16:21:18.021052 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"13a70ad2-b029-45d8-959e-db6823a68b6a","Type":"ContainerStarted","Data":"7063e359332cb179d8d91d30c0eee57acebfa09b276fe0c419ec11711cfc18fe"} Nov 27 16:21:18 crc kubenswrapper[4707]: I1127 16:21:18.021074 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"13a70ad2-b029-45d8-959e-db6823a68b6a","Type":"ContainerStarted","Data":"6cd0fefc6b2d8d18ccb7293279fbe2edae89a3cdbe2ac921d1f5e30dc34d7012"} Nov 27 16:21:18 crc kubenswrapper[4707]: I1127 16:21:18.023354 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"b2887e9f-7ff8-485d-ac06-0e06095ace94","Type":"ContainerStarted","Data":"b087df4b6533d0e8bf93d44f0384d9beaa99c8e9a495fd6caa939c44221e6449"} Nov 27 16:21:18 crc kubenswrapper[4707]: I1127 16:21:18.023511 4707 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Nov 27 16:21:18 crc kubenswrapper[4707]: I1127 16:21:18.037669 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5c9776ccc5-t8jp6" podStartSLOduration=3.037653025 podStartE2EDuration="3.037653025s" podCreationTimestamp="2025-11-27 16:21:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-27 16:21:18.03581179 +0000 UTC m=+1053.667260568" watchObservedRunningTime="2025-11-27 16:21:18.037653025 +0000 UTC m=+1053.669101793" Nov 27 16:21:18 crc kubenswrapper[4707]: I1127 16:21:18.055678 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=3.055661387 podStartE2EDuration="3.055661387s" podCreationTimestamp="2025-11-27 16:21:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-27 16:21:18.048306461 +0000 UTC m=+1053.679755219" watchObservedRunningTime="2025-11-27 16:21:18.055661387 +0000 UTC m=+1053.687110155" Nov 27 16:21:18 crc kubenswrapper[4707]: I1127 16:21:18.066362 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=2.292028586 podStartE2EDuration="3.066348154s" podCreationTimestamp="2025-11-27 16:21:15 +0000 UTC" firstStartedPulling="2025-11-27 16:21:15.896480119 +0000 UTC m=+1051.527928887" lastFinishedPulling="2025-11-27 16:21:16.670799687 +0000 UTC m=+1052.302248455" observedRunningTime="2025-11-27 16:21:18.065515024 +0000 UTC m=+1053.696963792" watchObservedRunningTime="2025-11-27 16:21:18.066348154 +0000 UTC m=+1053.697796922" Nov 27 16:21:18 crc kubenswrapper[4707]: I1127 16:21:18.314524 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Nov 27 16:21:19 crc kubenswrapper[4707]: I1127 
16:21:19.035694 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"20cd323d-9f2e-472f-8c75-9b413bbdb303","Type":"ContainerStarted","Data":"d81248bbf232c36a76ada9937a510da6bfacd6954797306990a4e5145d9c0e2c"} Nov 27 16:21:19 crc kubenswrapper[4707]: I1127 16:21:19.520991 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-b989k" Nov 27 16:21:19 crc kubenswrapper[4707]: I1127 16:21:19.571883 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-b989k" Nov 27 16:21:19 crc kubenswrapper[4707]: I1127 16:21:19.647052 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-b989k"] Nov 27 16:21:19 crc kubenswrapper[4707]: I1127 16:21:19.762132 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-sb7xq"] Nov 27 16:21:19 crc kubenswrapper[4707]: I1127 16:21:19.762427 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-sb7xq" podUID="a875a4be-fcc1-4b9e-b14e-5f0fc714639c" containerName="registry-server" containerID="cri-o://7aff2cfb12bf7c388208239d38004d97b94cfff4b0ec19c84f6cd38fab171961" gracePeriod=2 Nov 27 16:21:20 crc kubenswrapper[4707]: I1127 16:21:20.093088 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"20cd323d-9f2e-472f-8c75-9b413bbdb303","Type":"ContainerStarted","Data":"2cc08f712dd41ada2b39f74e2e2a7cde303e8246eb9d310e11ff71490a4cfdd5"} Nov 27 16:21:20 crc kubenswrapper[4707]: I1127 16:21:20.098659 4707 generic.go:334] "Generic (PLEG): container finished" podID="a875a4be-fcc1-4b9e-b14e-5f0fc714639c" containerID="7aff2cfb12bf7c388208239d38004d97b94cfff4b0ec19c84f6cd38fab171961" exitCode=0 Nov 27 16:21:20 crc kubenswrapper[4707]: I1127 16:21:20.099581 4707 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sb7xq" event={"ID":"a875a4be-fcc1-4b9e-b14e-5f0fc714639c","Type":"ContainerDied","Data":"7aff2cfb12bf7c388208239d38004d97b94cfff4b0ec19c84f6cd38fab171961"} Nov 27 16:21:20 crc kubenswrapper[4707]: I1127 16:21:20.099709 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="b2887e9f-7ff8-485d-ac06-0e06095ace94" containerName="cinder-api-log" containerID="cri-o://def29cbbe09d73f0f37e132863e13b12364202633606c39668a689c98190912a" gracePeriod=30 Nov 27 16:21:20 crc kubenswrapper[4707]: I1127 16:21:20.100899 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="b2887e9f-7ff8-485d-ac06-0e06095ace94" containerName="cinder-api" containerID="cri-o://b087df4b6533d0e8bf93d44f0384d9beaa99c8e9a495fd6caa939c44221e6449" gracePeriod=30 Nov 27 16:21:20 crc kubenswrapper[4707]: I1127 16:21:20.234666 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-sb7xq" Nov 27 16:21:20 crc kubenswrapper[4707]: I1127 16:21:20.354763 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a875a4be-fcc1-4b9e-b14e-5f0fc714639c-catalog-content\") pod \"a875a4be-fcc1-4b9e-b14e-5f0fc714639c\" (UID: \"a875a4be-fcc1-4b9e-b14e-5f0fc714639c\") " Nov 27 16:21:20 crc kubenswrapper[4707]: I1127 16:21:20.354859 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rg874\" (UniqueName: \"kubernetes.io/projected/a875a4be-fcc1-4b9e-b14e-5f0fc714639c-kube-api-access-rg874\") pod \"a875a4be-fcc1-4b9e-b14e-5f0fc714639c\" (UID: \"a875a4be-fcc1-4b9e-b14e-5f0fc714639c\") " Nov 27 16:21:20 crc kubenswrapper[4707]: I1127 16:21:20.354926 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a875a4be-fcc1-4b9e-b14e-5f0fc714639c-utilities\") pod \"a875a4be-fcc1-4b9e-b14e-5f0fc714639c\" (UID: \"a875a4be-fcc1-4b9e-b14e-5f0fc714639c\") " Nov 27 16:21:20 crc kubenswrapper[4707]: I1127 16:21:20.355679 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a875a4be-fcc1-4b9e-b14e-5f0fc714639c-utilities" (OuterVolumeSpecName: "utilities") pod "a875a4be-fcc1-4b9e-b14e-5f0fc714639c" (UID: "a875a4be-fcc1-4b9e-b14e-5f0fc714639c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 27 16:21:20 crc kubenswrapper[4707]: I1127 16:21:20.362824 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a875a4be-fcc1-4b9e-b14e-5f0fc714639c-kube-api-access-rg874" (OuterVolumeSpecName: "kube-api-access-rg874") pod "a875a4be-fcc1-4b9e-b14e-5f0fc714639c" (UID: "a875a4be-fcc1-4b9e-b14e-5f0fc714639c"). InnerVolumeSpecName "kube-api-access-rg874". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 16:21:20 crc kubenswrapper[4707]: I1127 16:21:20.400561 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a875a4be-fcc1-4b9e-b14e-5f0fc714639c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a875a4be-fcc1-4b9e-b14e-5f0fc714639c" (UID: "a875a4be-fcc1-4b9e-b14e-5f0fc714639c"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 27 16:21:20 crc kubenswrapper[4707]: I1127 16:21:20.456350 4707 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a875a4be-fcc1-4b9e-b14e-5f0fc714639c-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 27 16:21:20 crc kubenswrapper[4707]: I1127 16:21:20.456394 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rg874\" (UniqueName: \"kubernetes.io/projected/a875a4be-fcc1-4b9e-b14e-5f0fc714639c-kube-api-access-rg874\") on node \"crc\" DevicePath \"\"" Nov 27 16:21:20 crc kubenswrapper[4707]: I1127 16:21:20.456406 4707 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a875a4be-fcc1-4b9e-b14e-5f0fc714639c-utilities\") on node \"crc\" DevicePath \"\"" Nov 27 16:21:20 crc kubenswrapper[4707]: I1127 16:21:20.459454 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Nov 27 16:21:21 crc kubenswrapper[4707]: I1127 16:21:21.068635 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Nov 27 16:21:21 crc kubenswrapper[4707]: I1127 16:21:21.107918 4707 generic.go:334] "Generic (PLEG): container finished" podID="b2887e9f-7ff8-485d-ac06-0e06095ace94" containerID="b087df4b6533d0e8bf93d44f0384d9beaa99c8e9a495fd6caa939c44221e6449" exitCode=0 Nov 27 16:21:21 crc kubenswrapper[4707]: I1127 16:21:21.107947 4707 generic.go:334] "Generic (PLEG): container finished" podID="b2887e9f-7ff8-485d-ac06-0e06095ace94" containerID="def29cbbe09d73f0f37e132863e13b12364202633606c39668a689c98190912a" exitCode=143 Nov 27 16:21:21 crc kubenswrapper[4707]: I1127 16:21:21.107992 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Nov 27 16:21:21 crc kubenswrapper[4707]: I1127 16:21:21.108001 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"b2887e9f-7ff8-485d-ac06-0e06095ace94","Type":"ContainerDied","Data":"b087df4b6533d0e8bf93d44f0384d9beaa99c8e9a495fd6caa939c44221e6449"} Nov 27 16:21:21 crc kubenswrapper[4707]: I1127 16:21:21.108029 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"b2887e9f-7ff8-485d-ac06-0e06095ace94","Type":"ContainerDied","Data":"def29cbbe09d73f0f37e132863e13b12364202633606c39668a689c98190912a"} Nov 27 16:21:21 crc kubenswrapper[4707]: I1127 16:21:21.108043 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"b2887e9f-7ff8-485d-ac06-0e06095ace94","Type":"ContainerDied","Data":"7c21a937d463a3de9386057773d078ffeea55a5f843ec7854c2ef8315be8a692"} Nov 27 16:21:21 crc kubenswrapper[4707]: I1127 16:21:21.108058 4707 scope.go:117] "RemoveContainer" containerID="b087df4b6533d0e8bf93d44f0384d9beaa99c8e9a495fd6caa939c44221e6449" Nov 27 16:21:21 crc kubenswrapper[4707]: I1127 16:21:21.116028 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sb7xq" 
event={"ID":"a875a4be-fcc1-4b9e-b14e-5f0fc714639c","Type":"ContainerDied","Data":"a74f64345f2091826d3c177c70752e45f32902972e49a5da7e62fa7aa44db154"} Nov 27 16:21:21 crc kubenswrapper[4707]: I1127 16:21:21.116056 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-sb7xq" Nov 27 16:21:21 crc kubenswrapper[4707]: I1127 16:21:21.118458 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"20cd323d-9f2e-472f-8c75-9b413bbdb303","Type":"ContainerStarted","Data":"ed7e45af14424d36b8d34d430ff20fe0a559a93ab0e83b7a6e5c3d1973b78ae1"} Nov 27 16:21:21 crc kubenswrapper[4707]: I1127 16:21:21.139095 4707 scope.go:117] "RemoveContainer" containerID="def29cbbe09d73f0f37e132863e13b12364202633606c39668a689c98190912a" Nov 27 16:21:21 crc kubenswrapper[4707]: I1127 16:21:21.160255 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-sb7xq"] Nov 27 16:21:21 crc kubenswrapper[4707]: I1127 16:21:21.168752 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b2887e9f-7ff8-485d-ac06-0e06095ace94-config-data-custom\") pod \"b2887e9f-7ff8-485d-ac06-0e06095ace94\" (UID: \"b2887e9f-7ff8-485d-ac06-0e06095ace94\") " Nov 27 16:21:21 crc kubenswrapper[4707]: I1127 16:21:21.168795 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b2887e9f-7ff8-485d-ac06-0e06095ace94-combined-ca-bundle\") pod \"b2887e9f-7ff8-485d-ac06-0e06095ace94\" (UID: \"b2887e9f-7ff8-485d-ac06-0e06095ace94\") " Nov 27 16:21:21 crc kubenswrapper[4707]: I1127 16:21:21.168867 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dqcl5\" (UniqueName: \"kubernetes.io/projected/b2887e9f-7ff8-485d-ac06-0e06095ace94-kube-api-access-dqcl5\") pod 
\"b2887e9f-7ff8-485d-ac06-0e06095ace94\" (UID: \"b2887e9f-7ff8-485d-ac06-0e06095ace94\") " Nov 27 16:21:21 crc kubenswrapper[4707]: I1127 16:21:21.168911 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b2887e9f-7ff8-485d-ac06-0e06095ace94-config-data\") pod \"b2887e9f-7ff8-485d-ac06-0e06095ace94\" (UID: \"b2887e9f-7ff8-485d-ac06-0e06095ace94\") " Nov 27 16:21:21 crc kubenswrapper[4707]: I1127 16:21:21.169002 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b2887e9f-7ff8-485d-ac06-0e06095ace94-logs\") pod \"b2887e9f-7ff8-485d-ac06-0e06095ace94\" (UID: \"b2887e9f-7ff8-485d-ac06-0e06095ace94\") " Nov 27 16:21:21 crc kubenswrapper[4707]: I1127 16:21:21.169053 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b2887e9f-7ff8-485d-ac06-0e06095ace94-scripts\") pod \"b2887e9f-7ff8-485d-ac06-0e06095ace94\" (UID: \"b2887e9f-7ff8-485d-ac06-0e06095ace94\") " Nov 27 16:21:21 crc kubenswrapper[4707]: I1127 16:21:21.169090 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/b2887e9f-7ff8-485d-ac06-0e06095ace94-etc-machine-id\") pod \"b2887e9f-7ff8-485d-ac06-0e06095ace94\" (UID: \"b2887e9f-7ff8-485d-ac06-0e06095ace94\") " Nov 27 16:21:21 crc kubenswrapper[4707]: I1127 16:21:21.169651 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b2887e9f-7ff8-485d-ac06-0e06095ace94-logs" (OuterVolumeSpecName: "logs") pod "b2887e9f-7ff8-485d-ac06-0e06095ace94" (UID: "b2887e9f-7ff8-485d-ac06-0e06095ace94"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 27 16:21:21 crc kubenswrapper[4707]: I1127 16:21:21.169945 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b2887e9f-7ff8-485d-ac06-0e06095ace94-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "b2887e9f-7ff8-485d-ac06-0e06095ace94" (UID: "b2887e9f-7ff8-485d-ac06-0e06095ace94"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 27 16:21:21 crc kubenswrapper[4707]: I1127 16:21:21.170518 4707 scope.go:117] "RemoveContainer" containerID="b087df4b6533d0e8bf93d44f0384d9beaa99c8e9a495fd6caa939c44221e6449" Nov 27 16:21:21 crc kubenswrapper[4707]: E1127 16:21:21.176500 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b087df4b6533d0e8bf93d44f0384d9beaa99c8e9a495fd6caa939c44221e6449\": container with ID starting with b087df4b6533d0e8bf93d44f0384d9beaa99c8e9a495fd6caa939c44221e6449 not found: ID does not exist" containerID="b087df4b6533d0e8bf93d44f0384d9beaa99c8e9a495fd6caa939c44221e6449" Nov 27 16:21:21 crc kubenswrapper[4707]: I1127 16:21:21.176537 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b087df4b6533d0e8bf93d44f0384d9beaa99c8e9a495fd6caa939c44221e6449"} err="failed to get container status \"b087df4b6533d0e8bf93d44f0384d9beaa99c8e9a495fd6caa939c44221e6449\": rpc error: code = NotFound desc = could not find container \"b087df4b6533d0e8bf93d44f0384d9beaa99c8e9a495fd6caa939c44221e6449\": container with ID starting with b087df4b6533d0e8bf93d44f0384d9beaa99c8e9a495fd6caa939c44221e6449 not found: ID does not exist" Nov 27 16:21:21 crc kubenswrapper[4707]: I1127 16:21:21.176560 4707 scope.go:117] "RemoveContainer" containerID="def29cbbe09d73f0f37e132863e13b12364202633606c39668a689c98190912a" Nov 27 16:21:21 crc kubenswrapper[4707]: E1127 16:21:21.176902 4707 log.go:32] 
"ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"def29cbbe09d73f0f37e132863e13b12364202633606c39668a689c98190912a\": container with ID starting with def29cbbe09d73f0f37e132863e13b12364202633606c39668a689c98190912a not found: ID does not exist" containerID="def29cbbe09d73f0f37e132863e13b12364202633606c39668a689c98190912a" Nov 27 16:21:21 crc kubenswrapper[4707]: I1127 16:21:21.176942 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"def29cbbe09d73f0f37e132863e13b12364202633606c39668a689c98190912a"} err="failed to get container status \"def29cbbe09d73f0f37e132863e13b12364202633606c39668a689c98190912a\": rpc error: code = NotFound desc = could not find container \"def29cbbe09d73f0f37e132863e13b12364202633606c39668a689c98190912a\": container with ID starting with def29cbbe09d73f0f37e132863e13b12364202633606c39668a689c98190912a not found: ID does not exist" Nov 27 16:21:21 crc kubenswrapper[4707]: I1127 16:21:21.176970 4707 scope.go:117] "RemoveContainer" containerID="b087df4b6533d0e8bf93d44f0384d9beaa99c8e9a495fd6caa939c44221e6449" Nov 27 16:21:21 crc kubenswrapper[4707]: I1127 16:21:21.177191 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b087df4b6533d0e8bf93d44f0384d9beaa99c8e9a495fd6caa939c44221e6449"} err="failed to get container status \"b087df4b6533d0e8bf93d44f0384d9beaa99c8e9a495fd6caa939c44221e6449\": rpc error: code = NotFound desc = could not find container \"b087df4b6533d0e8bf93d44f0384d9beaa99c8e9a495fd6caa939c44221e6449\": container with ID starting with b087df4b6533d0e8bf93d44f0384d9beaa99c8e9a495fd6caa939c44221e6449 not found: ID does not exist" Nov 27 16:21:21 crc kubenswrapper[4707]: I1127 16:21:21.177216 4707 scope.go:117] "RemoveContainer" containerID="def29cbbe09d73f0f37e132863e13b12364202633606c39668a689c98190912a" Nov 27 16:21:21 crc kubenswrapper[4707]: I1127 16:21:21.177442 4707 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"def29cbbe09d73f0f37e132863e13b12364202633606c39668a689c98190912a"} err="failed to get container status \"def29cbbe09d73f0f37e132863e13b12364202633606c39668a689c98190912a\": rpc error: code = NotFound desc = could not find container \"def29cbbe09d73f0f37e132863e13b12364202633606c39668a689c98190912a\": container with ID starting with def29cbbe09d73f0f37e132863e13b12364202633606c39668a689c98190912a not found: ID does not exist" Nov 27 16:21:21 crc kubenswrapper[4707]: I1127 16:21:21.177550 4707 scope.go:117] "RemoveContainer" containerID="7aff2cfb12bf7c388208239d38004d97b94cfff4b0ec19c84f6cd38fab171961" Nov 27 16:21:21 crc kubenswrapper[4707]: I1127 16:21:21.180070 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-sb7xq"] Nov 27 16:21:21 crc kubenswrapper[4707]: I1127 16:21:21.181775 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b2887e9f-7ff8-485d-ac06-0e06095ace94-kube-api-access-dqcl5" (OuterVolumeSpecName: "kube-api-access-dqcl5") pod "b2887e9f-7ff8-485d-ac06-0e06095ace94" (UID: "b2887e9f-7ff8-485d-ac06-0e06095ace94"). InnerVolumeSpecName "kube-api-access-dqcl5". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 16:21:21 crc kubenswrapper[4707]: I1127 16:21:21.193578 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b2887e9f-7ff8-485d-ac06-0e06095ace94-scripts" (OuterVolumeSpecName: "scripts") pod "b2887e9f-7ff8-485d-ac06-0e06095ace94" (UID: "b2887e9f-7ff8-485d-ac06-0e06095ace94"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 16:21:21 crc kubenswrapper[4707]: I1127 16:21:21.203611 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b2887e9f-7ff8-485d-ac06-0e06095ace94-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "b2887e9f-7ff8-485d-ac06-0e06095ace94" (UID: "b2887e9f-7ff8-485d-ac06-0e06095ace94"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 16:21:21 crc kubenswrapper[4707]: I1127 16:21:21.214601 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a875a4be-fcc1-4b9e-b14e-5f0fc714639c" path="/var/lib/kubelet/pods/a875a4be-fcc1-4b9e-b14e-5f0fc714639c/volumes" Nov 27 16:21:21 crc kubenswrapper[4707]: I1127 16:21:21.224174 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b2887e9f-7ff8-485d-ac06-0e06095ace94-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b2887e9f-7ff8-485d-ac06-0e06095ace94" (UID: "b2887e9f-7ff8-485d-ac06-0e06095ace94"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 16:21:21 crc kubenswrapper[4707]: I1127 16:21:21.233805 4707 scope.go:117] "RemoveContainer" containerID="17c0cefed402ffcff218aa6d85851adbe9d3432f15de54f2bf0abfea42623810" Nov 27 16:21:21 crc kubenswrapper[4707]: I1127 16:21:21.248448 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b2887e9f-7ff8-485d-ac06-0e06095ace94-config-data" (OuterVolumeSpecName: "config-data") pod "b2887e9f-7ff8-485d-ac06-0e06095ace94" (UID: "b2887e9f-7ff8-485d-ac06-0e06095ace94"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 16:21:21 crc kubenswrapper[4707]: I1127 16:21:21.261044 4707 scope.go:117] "RemoveContainer" containerID="2436268214a92c7bab6dceee35e87e8bb53d7869e46baae1302454089f32d4a9" Nov 27 16:21:21 crc kubenswrapper[4707]: I1127 16:21:21.271009 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b2887e9f-7ff8-485d-ac06-0e06095ace94-scripts\") on node \"crc\" DevicePath \"\"" Nov 27 16:21:21 crc kubenswrapper[4707]: I1127 16:21:21.271029 4707 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/b2887e9f-7ff8-485d-ac06-0e06095ace94-etc-machine-id\") on node \"crc\" DevicePath \"\"" Nov 27 16:21:21 crc kubenswrapper[4707]: I1127 16:21:21.271040 4707 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b2887e9f-7ff8-485d-ac06-0e06095ace94-config-data-custom\") on node \"crc\" DevicePath \"\"" Nov 27 16:21:21 crc kubenswrapper[4707]: I1127 16:21:21.271049 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b2887e9f-7ff8-485d-ac06-0e06095ace94-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 27 16:21:21 crc kubenswrapper[4707]: I1127 16:21:21.271058 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dqcl5\" (UniqueName: \"kubernetes.io/projected/b2887e9f-7ff8-485d-ac06-0e06095ace94-kube-api-access-dqcl5\") on node \"crc\" DevicePath \"\"" Nov 27 16:21:21 crc kubenswrapper[4707]: I1127 16:21:21.271066 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b2887e9f-7ff8-485d-ac06-0e06095ace94-config-data\") on node \"crc\" DevicePath \"\"" Nov 27 16:21:21 crc kubenswrapper[4707]: I1127 16:21:21.271074 4707 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/b2887e9f-7ff8-485d-ac06-0e06095ace94-logs\") on node \"crc\" DevicePath \"\"" Nov 27 16:21:21 crc kubenswrapper[4707]: I1127 16:21:21.472643 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Nov 27 16:21:21 crc kubenswrapper[4707]: I1127 16:21:21.478944 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"] Nov 27 16:21:21 crc kubenswrapper[4707]: I1127 16:21:21.493825 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Nov 27 16:21:21 crc kubenswrapper[4707]: E1127 16:21:21.494395 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a875a4be-fcc1-4b9e-b14e-5f0fc714639c" containerName="extract-utilities" Nov 27 16:21:21 crc kubenswrapper[4707]: I1127 16:21:21.494459 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="a875a4be-fcc1-4b9e-b14e-5f0fc714639c" containerName="extract-utilities" Nov 27 16:21:21 crc kubenswrapper[4707]: E1127 16:21:21.494554 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b2887e9f-7ff8-485d-ac06-0e06095ace94" containerName="cinder-api" Nov 27 16:21:21 crc kubenswrapper[4707]: I1127 16:21:21.494602 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="b2887e9f-7ff8-485d-ac06-0e06095ace94" containerName="cinder-api" Nov 27 16:21:21 crc kubenswrapper[4707]: E1127 16:21:21.494647 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a875a4be-fcc1-4b9e-b14e-5f0fc714639c" containerName="registry-server" Nov 27 16:21:21 crc kubenswrapper[4707]: I1127 16:21:21.494690 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="a875a4be-fcc1-4b9e-b14e-5f0fc714639c" containerName="registry-server" Nov 27 16:21:21 crc kubenswrapper[4707]: E1127 16:21:21.494749 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a875a4be-fcc1-4b9e-b14e-5f0fc714639c" containerName="extract-content" Nov 27 16:21:21 crc kubenswrapper[4707]: I1127 16:21:21.494793 
4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="a875a4be-fcc1-4b9e-b14e-5f0fc714639c" containerName="extract-content" Nov 27 16:21:21 crc kubenswrapper[4707]: E1127 16:21:21.494842 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b2887e9f-7ff8-485d-ac06-0e06095ace94" containerName="cinder-api-log" Nov 27 16:21:21 crc kubenswrapper[4707]: I1127 16:21:21.494885 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="b2887e9f-7ff8-485d-ac06-0e06095ace94" containerName="cinder-api-log" Nov 27 16:21:21 crc kubenswrapper[4707]: I1127 16:21:21.495093 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="b2887e9f-7ff8-485d-ac06-0e06095ace94" containerName="cinder-api" Nov 27 16:21:21 crc kubenswrapper[4707]: I1127 16:21:21.495156 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="a875a4be-fcc1-4b9e-b14e-5f0fc714639c" containerName="registry-server" Nov 27 16:21:21 crc kubenswrapper[4707]: I1127 16:21:21.495224 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="b2887e9f-7ff8-485d-ac06-0e06095ace94" containerName="cinder-api-log" Nov 27 16:21:21 crc kubenswrapper[4707]: I1127 16:21:21.496191 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Nov 27 16:21:21 crc kubenswrapper[4707]: I1127 16:21:21.500911 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Nov 27 16:21:21 crc kubenswrapper[4707]: I1127 16:21:21.501170 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-internal-svc" Nov 27 16:21:21 crc kubenswrapper[4707]: I1127 16:21:21.501503 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-public-svc" Nov 27 16:21:21 crc kubenswrapper[4707]: I1127 16:21:21.512042 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Nov 27 16:21:21 crc kubenswrapper[4707]: I1127 16:21:21.575316 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/98b666a0-7de5-45af-b604-c6fa48371681-config-data\") pod \"cinder-api-0\" (UID: \"98b666a0-7de5-45af-b604-c6fa48371681\") " pod="openstack/cinder-api-0" Nov 27 16:21:21 crc kubenswrapper[4707]: I1127 16:21:21.575365 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/98b666a0-7de5-45af-b604-c6fa48371681-etc-machine-id\") pod \"cinder-api-0\" (UID: \"98b666a0-7de5-45af-b604-c6fa48371681\") " pod="openstack/cinder-api-0" Nov 27 16:21:21 crc kubenswrapper[4707]: I1127 16:21:21.575407 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/98b666a0-7de5-45af-b604-c6fa48371681-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"98b666a0-7de5-45af-b604-c6fa48371681\") " pod="openstack/cinder-api-0" Nov 27 16:21:21 crc kubenswrapper[4707]: I1127 16:21:21.575426 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-wwcll\" (UniqueName: \"kubernetes.io/projected/98b666a0-7de5-45af-b604-c6fa48371681-kube-api-access-wwcll\") pod \"cinder-api-0\" (UID: \"98b666a0-7de5-45af-b604-c6fa48371681\") " pod="openstack/cinder-api-0" Nov 27 16:21:21 crc kubenswrapper[4707]: I1127 16:21:21.575591 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/98b666a0-7de5-45af-b604-c6fa48371681-public-tls-certs\") pod \"cinder-api-0\" (UID: \"98b666a0-7de5-45af-b604-c6fa48371681\") " pod="openstack/cinder-api-0" Nov 27 16:21:21 crc kubenswrapper[4707]: I1127 16:21:21.575692 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/98b666a0-7de5-45af-b604-c6fa48371681-config-data-custom\") pod \"cinder-api-0\" (UID: \"98b666a0-7de5-45af-b604-c6fa48371681\") " pod="openstack/cinder-api-0" Nov 27 16:21:21 crc kubenswrapper[4707]: I1127 16:21:21.575727 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/98b666a0-7de5-45af-b604-c6fa48371681-scripts\") pod \"cinder-api-0\" (UID: \"98b666a0-7de5-45af-b604-c6fa48371681\") " pod="openstack/cinder-api-0" Nov 27 16:21:21 crc kubenswrapper[4707]: I1127 16:21:21.575762 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/98b666a0-7de5-45af-b604-c6fa48371681-logs\") pod \"cinder-api-0\" (UID: \"98b666a0-7de5-45af-b604-c6fa48371681\") " pod="openstack/cinder-api-0" Nov 27 16:21:21 crc kubenswrapper[4707]: I1127 16:21:21.575786 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/98b666a0-7de5-45af-b604-c6fa48371681-internal-tls-certs\") pod 
\"cinder-api-0\" (UID: \"98b666a0-7de5-45af-b604-c6fa48371681\") " pod="openstack/cinder-api-0" Nov 27 16:21:21 crc kubenswrapper[4707]: I1127 16:21:21.678135 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/98b666a0-7de5-45af-b604-c6fa48371681-config-data-custom\") pod \"cinder-api-0\" (UID: \"98b666a0-7de5-45af-b604-c6fa48371681\") " pod="openstack/cinder-api-0" Nov 27 16:21:21 crc kubenswrapper[4707]: I1127 16:21:21.678206 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/98b666a0-7de5-45af-b604-c6fa48371681-scripts\") pod \"cinder-api-0\" (UID: \"98b666a0-7de5-45af-b604-c6fa48371681\") " pod="openstack/cinder-api-0" Nov 27 16:21:21 crc kubenswrapper[4707]: I1127 16:21:21.678252 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/98b666a0-7de5-45af-b604-c6fa48371681-logs\") pod \"cinder-api-0\" (UID: \"98b666a0-7de5-45af-b604-c6fa48371681\") " pod="openstack/cinder-api-0" Nov 27 16:21:21 crc kubenswrapper[4707]: I1127 16:21:21.678285 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/98b666a0-7de5-45af-b604-c6fa48371681-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"98b666a0-7de5-45af-b604-c6fa48371681\") " pod="openstack/cinder-api-0" Nov 27 16:21:21 crc kubenswrapper[4707]: I1127 16:21:21.678525 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/98b666a0-7de5-45af-b604-c6fa48371681-config-data\") pod \"cinder-api-0\" (UID: \"98b666a0-7de5-45af-b604-c6fa48371681\") " pod="openstack/cinder-api-0" Nov 27 16:21:21 crc kubenswrapper[4707]: I1127 16:21:21.678589 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/98b666a0-7de5-45af-b604-c6fa48371681-etc-machine-id\") pod \"cinder-api-0\" (UID: \"98b666a0-7de5-45af-b604-c6fa48371681\") " pod="openstack/cinder-api-0" Nov 27 16:21:21 crc kubenswrapper[4707]: I1127 16:21:21.678647 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/98b666a0-7de5-45af-b604-c6fa48371681-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"98b666a0-7de5-45af-b604-c6fa48371681\") " pod="openstack/cinder-api-0" Nov 27 16:21:21 crc kubenswrapper[4707]: I1127 16:21:21.678690 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wwcll\" (UniqueName: \"kubernetes.io/projected/98b666a0-7de5-45af-b604-c6fa48371681-kube-api-access-wwcll\") pod \"cinder-api-0\" (UID: \"98b666a0-7de5-45af-b604-c6fa48371681\") " pod="openstack/cinder-api-0" Nov 27 16:21:21 crc kubenswrapper[4707]: I1127 16:21:21.678739 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/98b666a0-7de5-45af-b604-c6fa48371681-public-tls-certs\") pod \"cinder-api-0\" (UID: \"98b666a0-7de5-45af-b604-c6fa48371681\") " pod="openstack/cinder-api-0" Nov 27 16:21:21 crc kubenswrapper[4707]: I1127 16:21:21.679816 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/98b666a0-7de5-45af-b604-c6fa48371681-etc-machine-id\") pod \"cinder-api-0\" (UID: \"98b666a0-7de5-45af-b604-c6fa48371681\") " pod="openstack/cinder-api-0" Nov 27 16:21:21 crc kubenswrapper[4707]: I1127 16:21:21.680676 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/98b666a0-7de5-45af-b604-c6fa48371681-logs\") pod \"cinder-api-0\" (UID: \"98b666a0-7de5-45af-b604-c6fa48371681\") " pod="openstack/cinder-api-0" Nov 27 16:21:21 
crc kubenswrapper[4707]: I1127 16:21:21.684983 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/98b666a0-7de5-45af-b604-c6fa48371681-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"98b666a0-7de5-45af-b604-c6fa48371681\") " pod="openstack/cinder-api-0" Nov 27 16:21:21 crc kubenswrapper[4707]: I1127 16:21:21.688397 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/98b666a0-7de5-45af-b604-c6fa48371681-config-data\") pod \"cinder-api-0\" (UID: \"98b666a0-7de5-45af-b604-c6fa48371681\") " pod="openstack/cinder-api-0" Nov 27 16:21:21 crc kubenswrapper[4707]: I1127 16:21:21.689311 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/98b666a0-7de5-45af-b604-c6fa48371681-public-tls-certs\") pod \"cinder-api-0\" (UID: \"98b666a0-7de5-45af-b604-c6fa48371681\") " pod="openstack/cinder-api-0" Nov 27 16:21:21 crc kubenswrapper[4707]: I1127 16:21:21.690010 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/98b666a0-7de5-45af-b604-c6fa48371681-scripts\") pod \"cinder-api-0\" (UID: \"98b666a0-7de5-45af-b604-c6fa48371681\") " pod="openstack/cinder-api-0" Nov 27 16:21:21 crc kubenswrapper[4707]: I1127 16:21:21.692110 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/98b666a0-7de5-45af-b604-c6fa48371681-config-data-custom\") pod \"cinder-api-0\" (UID: \"98b666a0-7de5-45af-b604-c6fa48371681\") " pod="openstack/cinder-api-0" Nov 27 16:21:21 crc kubenswrapper[4707]: I1127 16:21:21.698480 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/98b666a0-7de5-45af-b604-c6fa48371681-combined-ca-bundle\") pod \"cinder-api-0\" (UID: 
\"98b666a0-7de5-45af-b604-c6fa48371681\") " pod="openstack/cinder-api-0" Nov 27 16:21:21 crc kubenswrapper[4707]: I1127 16:21:21.710556 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wwcll\" (UniqueName: \"kubernetes.io/projected/98b666a0-7de5-45af-b604-c6fa48371681-kube-api-access-wwcll\") pod \"cinder-api-0\" (UID: \"98b666a0-7de5-45af-b604-c6fa48371681\") " pod="openstack/cinder-api-0" Nov 27 16:21:21 crc kubenswrapper[4707]: I1127 16:21:21.809764 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Nov 27 16:21:22 crc kubenswrapper[4707]: I1127 16:21:22.099865 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-784ccffcb8-pjrzr" Nov 27 16:21:22 crc kubenswrapper[4707]: I1127 16:21:22.155243 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"20cd323d-9f2e-472f-8c75-9b413bbdb303","Type":"ContainerStarted","Data":"781daef22c40165b16cc4b848299bb59d2a15fb97bba4a3c6e9b8e707327dcba"} Nov 27 16:21:22 crc kubenswrapper[4707]: I1127 16:21:22.156649 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Nov 27 16:21:22 crc kubenswrapper[4707]: I1127 16:21:22.182931 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.4587736040000001 podStartE2EDuration="5.182914203s" podCreationTimestamp="2025-11-27 16:21:17 +0000 UTC" firstStartedPulling="2025-11-27 16:21:17.961075884 +0000 UTC m=+1053.592524652" lastFinishedPulling="2025-11-27 16:21:21.685216443 +0000 UTC m=+1057.316665251" observedRunningTime="2025-11-27 16:21:22.179276316 +0000 UTC m=+1057.810725104" watchObservedRunningTime="2025-11-27 16:21:22.182914203 +0000 UTC m=+1057.814362971" Nov 27 16:21:22 crc kubenswrapper[4707]: I1127 16:21:22.318306 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/cinder-api-0"] Nov 27 16:21:23 crc kubenswrapper[4707]: I1127 16:21:23.185321 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"98b666a0-7de5-45af-b604-c6fa48371681","Type":"ContainerStarted","Data":"218cdd6f91f4e233a63f11d56ab3494ea5f38e315e0a4024b9ab472cf0eeb184"} Nov 27 16:21:23 crc kubenswrapper[4707]: I1127 16:21:23.185710 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"98b666a0-7de5-45af-b604-c6fa48371681","Type":"ContainerStarted","Data":"b65c7c3faae7d6caf96ee3df906e9613f830c31fa9441ea3e45097ac8c2facae"} Nov 27 16:21:23 crc kubenswrapper[4707]: I1127 16:21:23.212496 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b2887e9f-7ff8-485d-ac06-0e06095ace94" path="/var/lib/kubelet/pods/b2887e9f-7ff8-485d-ac06-0e06095ace94/volumes" Nov 27 16:21:24 crc kubenswrapper[4707]: I1127 16:21:24.196501 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"98b666a0-7de5-45af-b604-c6fa48371681","Type":"ContainerStarted","Data":"da87d2aa815bccd80c48689dda8e791412dbf6b6ffaa25d6c7dda531a76b8d2a"} Nov 27 16:21:24 crc kubenswrapper[4707]: I1127 16:21:24.198604 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Nov 27 16:21:24 crc kubenswrapper[4707]: I1127 16:21:24.226992 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=3.226971145 podStartE2EDuration="3.226971145s" podCreationTimestamp="2025-11-27 16:21:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-27 16:21:24.220674484 +0000 UTC m=+1059.852123262" watchObservedRunningTime="2025-11-27 16:21:24.226971145 +0000 UTC m=+1059.858419923" Nov 27 16:21:24 crc kubenswrapper[4707]: I1127 16:21:24.410350 4707 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openstack/neutron-69588f8b9-6plc2" Nov 27 16:21:24 crc kubenswrapper[4707]: I1127 16:21:24.482056 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-784ccffcb8-pjrzr"] Nov 27 16:21:24 crc kubenswrapper[4707]: I1127 16:21:24.482559 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-784ccffcb8-pjrzr" podUID="0c553d55-f2dd-404c-bb41-379922a29a20" containerName="neutron-api" containerID="cri-o://8ffbf91e00c6dac33fbfd7b6f8ac9e2bcbaac4bc89db47e3a51003fcbca41aa2" gracePeriod=30 Nov 27 16:21:24 crc kubenswrapper[4707]: I1127 16:21:24.482705 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-784ccffcb8-pjrzr" podUID="0c553d55-f2dd-404c-bb41-379922a29a20" containerName="neutron-httpd" containerID="cri-o://a15ab2bc546039cc92636f8003591e2da2898c85e5d58bab75f5ae9b2adeb5cb" gracePeriod=30 Nov 27 16:21:25 crc kubenswrapper[4707]: I1127 16:21:25.221129 4707 generic.go:334] "Generic (PLEG): container finished" podID="0c553d55-f2dd-404c-bb41-379922a29a20" containerID="a15ab2bc546039cc92636f8003591e2da2898c85e5d58bab75f5ae9b2adeb5cb" exitCode=0 Nov 27 16:21:25 crc kubenswrapper[4707]: I1127 16:21:25.231897 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-784ccffcb8-pjrzr" event={"ID":"0c553d55-f2dd-404c-bb41-379922a29a20","Type":"ContainerDied","Data":"a15ab2bc546039cc92636f8003591e2da2898c85e5d58bab75f5ae9b2adeb5cb"} Nov 27 16:21:25 crc kubenswrapper[4707]: I1127 16:21:25.517548 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5c9776ccc5-t8jp6" Nov 27 16:21:25 crc kubenswrapper[4707]: I1127 16:21:25.577924 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-85ff748b95-zhndc"] Nov 27 16:21:25 crc kubenswrapper[4707]: I1127 16:21:25.578226 4707 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack/dnsmasq-dns-85ff748b95-zhndc" podUID="6c94a5dc-169c-47ba-a007-307336246c92" containerName="dnsmasq-dns" containerID="cri-o://f0f1e00f80a396a3c026d03115b811121601ee4d62fad2106017d28deff77b8b" gracePeriod=10 Nov 27 16:21:25 crc kubenswrapper[4707]: I1127 16:21:25.658028 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Nov 27 16:21:25 crc kubenswrapper[4707]: I1127 16:21:25.726612 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Nov 27 16:21:26 crc kubenswrapper[4707]: I1127 16:21:26.145284 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-85ff748b95-zhndc" Nov 27 16:21:26 crc kubenswrapper[4707]: I1127 16:21:26.167738 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6c94a5dc-169c-47ba-a007-307336246c92-ovsdbserver-sb\") pod \"6c94a5dc-169c-47ba-a007-307336246c92\" (UID: \"6c94a5dc-169c-47ba-a007-307336246c92\") " Nov 27 16:21:26 crc kubenswrapper[4707]: I1127 16:21:26.167808 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6c94a5dc-169c-47ba-a007-307336246c92-dns-svc\") pod \"6c94a5dc-169c-47ba-a007-307336246c92\" (UID: \"6c94a5dc-169c-47ba-a007-307336246c92\") " Nov 27 16:21:26 crc kubenswrapper[4707]: I1127 16:21:26.167847 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6c94a5dc-169c-47ba-a007-307336246c92-config\") pod \"6c94a5dc-169c-47ba-a007-307336246c92\" (UID: \"6c94a5dc-169c-47ba-a007-307336246c92\") " Nov 27 16:21:26 crc kubenswrapper[4707]: I1127 16:21:26.167952 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-799q4\" (UniqueName: 
\"kubernetes.io/projected/6c94a5dc-169c-47ba-a007-307336246c92-kube-api-access-799q4\") pod \"6c94a5dc-169c-47ba-a007-307336246c92\" (UID: \"6c94a5dc-169c-47ba-a007-307336246c92\") " Nov 27 16:21:26 crc kubenswrapper[4707]: I1127 16:21:26.168053 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6c94a5dc-169c-47ba-a007-307336246c92-dns-swift-storage-0\") pod \"6c94a5dc-169c-47ba-a007-307336246c92\" (UID: \"6c94a5dc-169c-47ba-a007-307336246c92\") " Nov 27 16:21:26 crc kubenswrapper[4707]: I1127 16:21:26.168079 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6c94a5dc-169c-47ba-a007-307336246c92-ovsdbserver-nb\") pod \"6c94a5dc-169c-47ba-a007-307336246c92\" (UID: \"6c94a5dc-169c-47ba-a007-307336246c92\") " Nov 27 16:21:26 crc kubenswrapper[4707]: I1127 16:21:26.176615 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6c94a5dc-169c-47ba-a007-307336246c92-kube-api-access-799q4" (OuterVolumeSpecName: "kube-api-access-799q4") pod "6c94a5dc-169c-47ba-a007-307336246c92" (UID: "6c94a5dc-169c-47ba-a007-307336246c92"). InnerVolumeSpecName "kube-api-access-799q4". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 16:21:26 crc kubenswrapper[4707]: I1127 16:21:26.224207 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6c94a5dc-169c-47ba-a007-307336246c92-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "6c94a5dc-169c-47ba-a007-307336246c92" (UID: "6c94a5dc-169c-47ba-a007-307336246c92"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 27 16:21:26 crc kubenswrapper[4707]: I1127 16:21:26.230423 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6c94a5dc-169c-47ba-a007-307336246c92-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "6c94a5dc-169c-47ba-a007-307336246c92" (UID: "6c94a5dc-169c-47ba-a007-307336246c92"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 27 16:21:26 crc kubenswrapper[4707]: I1127 16:21:26.236244 4707 generic.go:334] "Generic (PLEG): container finished" podID="6c94a5dc-169c-47ba-a007-307336246c92" containerID="f0f1e00f80a396a3c026d03115b811121601ee4d62fad2106017d28deff77b8b" exitCode=0
Nov 27 16:21:26 crc kubenswrapper[4707]: I1127 16:21:26.237581 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-85ff748b95-zhndc"
Nov 27 16:21:26 crc kubenswrapper[4707]: I1127 16:21:26.238267 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-85ff748b95-zhndc" event={"ID":"6c94a5dc-169c-47ba-a007-307336246c92","Type":"ContainerDied","Data":"f0f1e00f80a396a3c026d03115b811121601ee4d62fad2106017d28deff77b8b"}
Nov 27 16:21:26 crc kubenswrapper[4707]: I1127 16:21:26.238314 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-85ff748b95-zhndc" event={"ID":"6c94a5dc-169c-47ba-a007-307336246c92","Type":"ContainerDied","Data":"09b4e62750c919c8e8abfe7d2db43ed3ce8557fd6400f25a0bc2f99c9071316e"}
Nov 27 16:21:26 crc kubenswrapper[4707]: I1127 16:21:26.238341 4707 scope.go:117] "RemoveContainer" containerID="f0f1e00f80a396a3c026d03115b811121601ee4d62fad2106017d28deff77b8b"
Nov 27 16:21:26 crc kubenswrapper[4707]: I1127 16:21:26.238678 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="13a70ad2-b029-45d8-959e-db6823a68b6a" containerName="cinder-scheduler" containerID="cri-o://6cd0fefc6b2d8d18ccb7293279fbe2edae89a3cdbe2ac921d1f5e30dc34d7012" gracePeriod=30
Nov 27 16:21:26 crc kubenswrapper[4707]: I1127 16:21:26.239010 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="13a70ad2-b029-45d8-959e-db6823a68b6a" containerName="probe" containerID="cri-o://7063e359332cb179d8d91d30c0eee57acebfa09b276fe0c419ec11711cfc18fe" gracePeriod=30
Nov 27 16:21:26 crc kubenswrapper[4707]: I1127 16:21:26.240945 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6c94a5dc-169c-47ba-a007-307336246c92-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "6c94a5dc-169c-47ba-a007-307336246c92" (UID: "6c94a5dc-169c-47ba-a007-307336246c92"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 27 16:21:26 crc kubenswrapper[4707]: I1127 16:21:26.249970 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6c94a5dc-169c-47ba-a007-307336246c92-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "6c94a5dc-169c-47ba-a007-307336246c92" (UID: "6c94a5dc-169c-47ba-a007-307336246c92"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 27 16:21:26 crc kubenswrapper[4707]: I1127 16:21:26.252119 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6c94a5dc-169c-47ba-a007-307336246c92-config" (OuterVolumeSpecName: "config") pod "6c94a5dc-169c-47ba-a007-307336246c92" (UID: "6c94a5dc-169c-47ba-a007-307336246c92"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 27 16:21:26 crc kubenswrapper[4707]: I1127 16:21:26.270251 4707 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6c94a5dc-169c-47ba-a007-307336246c92-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Nov 27 16:21:26 crc kubenswrapper[4707]: I1127 16:21:26.270284 4707 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6c94a5dc-169c-47ba-a007-307336246c92-dns-svc\") on node \"crc\" DevicePath \"\""
Nov 27 16:21:26 crc kubenswrapper[4707]: I1127 16:21:26.270295 4707 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6c94a5dc-169c-47ba-a007-307336246c92-config\") on node \"crc\" DevicePath \"\""
Nov 27 16:21:26 crc kubenswrapper[4707]: I1127 16:21:26.270305 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-799q4\" (UniqueName: \"kubernetes.io/projected/6c94a5dc-169c-47ba-a007-307336246c92-kube-api-access-799q4\") on node \"crc\" DevicePath \"\""
Nov 27 16:21:26 crc kubenswrapper[4707]: I1127 16:21:26.270316 4707 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6c94a5dc-169c-47ba-a007-307336246c92-dns-swift-storage-0\") on node \"crc\" DevicePath \"\""
Nov 27 16:21:26 crc kubenswrapper[4707]: I1127 16:21:26.270324 4707 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6c94a5dc-169c-47ba-a007-307336246c92-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Nov 27 16:21:26 crc kubenswrapper[4707]: I1127 16:21:26.298643 4707 scope.go:117] "RemoveContainer" containerID="bc578ee4f581c31397f30fc43ed9cb1782c03b33e59d0711749432c7aa3232e8"
Nov 27 16:21:26 crc kubenswrapper[4707]: I1127 16:21:26.315549 4707 scope.go:117] "RemoveContainer" containerID="f0f1e00f80a396a3c026d03115b811121601ee4d62fad2106017d28deff77b8b"
Nov 27 16:21:26 crc kubenswrapper[4707]: E1127 16:21:26.316089 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f0f1e00f80a396a3c026d03115b811121601ee4d62fad2106017d28deff77b8b\": container with ID starting with f0f1e00f80a396a3c026d03115b811121601ee4d62fad2106017d28deff77b8b not found: ID does not exist" containerID="f0f1e00f80a396a3c026d03115b811121601ee4d62fad2106017d28deff77b8b"
Nov 27 16:21:26 crc kubenswrapper[4707]: I1127 16:21:26.316136 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f0f1e00f80a396a3c026d03115b811121601ee4d62fad2106017d28deff77b8b"} err="failed to get container status \"f0f1e00f80a396a3c026d03115b811121601ee4d62fad2106017d28deff77b8b\": rpc error: code = NotFound desc = could not find container \"f0f1e00f80a396a3c026d03115b811121601ee4d62fad2106017d28deff77b8b\": container with ID starting with f0f1e00f80a396a3c026d03115b811121601ee4d62fad2106017d28deff77b8b not found: ID does not exist"
Nov 27 16:21:26 crc kubenswrapper[4707]: I1127 16:21:26.316161 4707 scope.go:117] "RemoveContainer" containerID="bc578ee4f581c31397f30fc43ed9cb1782c03b33e59d0711749432c7aa3232e8"
Nov 27 16:21:26 crc kubenswrapper[4707]: E1127 16:21:26.316442 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bc578ee4f581c31397f30fc43ed9cb1782c03b33e59d0711749432c7aa3232e8\": container with ID starting with bc578ee4f581c31397f30fc43ed9cb1782c03b33e59d0711749432c7aa3232e8 not found: ID does not exist" containerID="bc578ee4f581c31397f30fc43ed9cb1782c03b33e59d0711749432c7aa3232e8"
Nov 27 16:21:26 crc kubenswrapper[4707]: I1127 16:21:26.316472 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bc578ee4f581c31397f30fc43ed9cb1782c03b33e59d0711749432c7aa3232e8"} err="failed to get container status \"bc578ee4f581c31397f30fc43ed9cb1782c03b33e59d0711749432c7aa3232e8\": rpc error: code = NotFound desc = could not find container \"bc578ee4f581c31397f30fc43ed9cb1782c03b33e59d0711749432c7aa3232e8\": container with ID starting with bc578ee4f581c31397f30fc43ed9cb1782c03b33e59d0711749432c7aa3232e8 not found: ID does not exist"
Nov 27 16:21:26 crc kubenswrapper[4707]: I1127 16:21:26.575362 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-85ff748b95-zhndc"]
Nov 27 16:21:26 crc kubenswrapper[4707]: I1127 16:21:26.583528 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-85ff748b95-zhndc"]
Nov 27 16:21:27 crc kubenswrapper[4707]: I1127 16:21:27.209602 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6c94a5dc-169c-47ba-a007-307336246c92" path="/var/lib/kubelet/pods/6c94a5dc-169c-47ba-a007-307336246c92/volumes"
Nov 27 16:21:27 crc kubenswrapper[4707]: I1127 16:21:27.252631 4707 generic.go:334] "Generic (PLEG): container finished" podID="13a70ad2-b029-45d8-959e-db6823a68b6a" containerID="7063e359332cb179d8d91d30c0eee57acebfa09b276fe0c419ec11711cfc18fe" exitCode=0
Nov 27 16:21:27 crc kubenswrapper[4707]: I1127 16:21:27.252696 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"13a70ad2-b029-45d8-959e-db6823a68b6a","Type":"ContainerDied","Data":"7063e359332cb179d8d91d30c0eee57acebfa09b276fe0c419ec11711cfc18fe"}
Nov 27 16:21:28 crc kubenswrapper[4707]: I1127 16:21:28.267743 4707 generic.go:334] "Generic (PLEG): container finished" podID="13a70ad2-b029-45d8-959e-db6823a68b6a" containerID="6cd0fefc6b2d8d18ccb7293279fbe2edae89a3cdbe2ac921d1f5e30dc34d7012" exitCode=0
Nov 27 16:21:28 crc kubenswrapper[4707]: I1127 16:21:28.267791 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"13a70ad2-b029-45d8-959e-db6823a68b6a","Type":"ContainerDied","Data":"6cd0fefc6b2d8d18ccb7293279fbe2edae89a3cdbe2ac921d1f5e30dc34d7012"}
Nov 27 16:21:28 crc kubenswrapper[4707]: I1127 16:21:28.446742 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0"
Nov 27 16:21:28 crc kubenswrapper[4707]: I1127 16:21:28.514505 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/13a70ad2-b029-45d8-959e-db6823a68b6a-config-data-custom\") pod \"13a70ad2-b029-45d8-959e-db6823a68b6a\" (UID: \"13a70ad2-b029-45d8-959e-db6823a68b6a\") "
Nov 27 16:21:28 crc kubenswrapper[4707]: I1127 16:21:28.514583 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/13a70ad2-b029-45d8-959e-db6823a68b6a-combined-ca-bundle\") pod \"13a70ad2-b029-45d8-959e-db6823a68b6a\" (UID: \"13a70ad2-b029-45d8-959e-db6823a68b6a\") "
Nov 27 16:21:28 crc kubenswrapper[4707]: I1127 16:21:28.514628 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2rdjd\" (UniqueName: \"kubernetes.io/projected/13a70ad2-b029-45d8-959e-db6823a68b6a-kube-api-access-2rdjd\") pod \"13a70ad2-b029-45d8-959e-db6823a68b6a\" (UID: \"13a70ad2-b029-45d8-959e-db6823a68b6a\") "
Nov 27 16:21:28 crc kubenswrapper[4707]: I1127 16:21:28.514681 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/13a70ad2-b029-45d8-959e-db6823a68b6a-scripts\") pod \"13a70ad2-b029-45d8-959e-db6823a68b6a\" (UID: \"13a70ad2-b029-45d8-959e-db6823a68b6a\") "
Nov 27 16:21:28 crc kubenswrapper[4707]: I1127 16:21:28.514721 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/13a70ad2-b029-45d8-959e-db6823a68b6a-config-data\") pod \"13a70ad2-b029-45d8-959e-db6823a68b6a\" (UID: \"13a70ad2-b029-45d8-959e-db6823a68b6a\") "
Nov 27 16:21:28 crc kubenswrapper[4707]: I1127 16:21:28.514750 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/13a70ad2-b029-45d8-959e-db6823a68b6a-etc-machine-id\") pod \"13a70ad2-b029-45d8-959e-db6823a68b6a\" (UID: \"13a70ad2-b029-45d8-959e-db6823a68b6a\") "
Nov 27 16:21:28 crc kubenswrapper[4707]: I1127 16:21:28.515299 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/13a70ad2-b029-45d8-959e-db6823a68b6a-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "13a70ad2-b029-45d8-959e-db6823a68b6a" (UID: "13a70ad2-b029-45d8-959e-db6823a68b6a"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Nov 27 16:21:28 crc kubenswrapper[4707]: I1127 16:21:28.522803 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/13a70ad2-b029-45d8-959e-db6823a68b6a-scripts" (OuterVolumeSpecName: "scripts") pod "13a70ad2-b029-45d8-959e-db6823a68b6a" (UID: "13a70ad2-b029-45d8-959e-db6823a68b6a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 27 16:21:28 crc kubenswrapper[4707]: I1127 16:21:28.527736 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/13a70ad2-b029-45d8-959e-db6823a68b6a-kube-api-access-2rdjd" (OuterVolumeSpecName: "kube-api-access-2rdjd") pod "13a70ad2-b029-45d8-959e-db6823a68b6a" (UID: "13a70ad2-b029-45d8-959e-db6823a68b6a"). InnerVolumeSpecName "kube-api-access-2rdjd". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 27 16:21:28 crc kubenswrapper[4707]: I1127 16:21:28.528365 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/13a70ad2-b029-45d8-959e-db6823a68b6a-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "13a70ad2-b029-45d8-959e-db6823a68b6a" (UID: "13a70ad2-b029-45d8-959e-db6823a68b6a"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 27 16:21:28 crc kubenswrapper[4707]: I1127 16:21:28.575267 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/13a70ad2-b029-45d8-959e-db6823a68b6a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "13a70ad2-b029-45d8-959e-db6823a68b6a" (UID: "13a70ad2-b029-45d8-959e-db6823a68b6a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 27 16:21:28 crc kubenswrapper[4707]: I1127 16:21:28.617516 4707 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/13a70ad2-b029-45d8-959e-db6823a68b6a-etc-machine-id\") on node \"crc\" DevicePath \"\""
Nov 27 16:21:28 crc kubenswrapper[4707]: I1127 16:21:28.617543 4707 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/13a70ad2-b029-45d8-959e-db6823a68b6a-config-data-custom\") on node \"crc\" DevicePath \"\""
Nov 27 16:21:28 crc kubenswrapper[4707]: I1127 16:21:28.617553 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/13a70ad2-b029-45d8-959e-db6823a68b6a-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Nov 27 16:21:28 crc kubenswrapper[4707]: I1127 16:21:28.617567 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2rdjd\" (UniqueName: \"kubernetes.io/projected/13a70ad2-b029-45d8-959e-db6823a68b6a-kube-api-access-2rdjd\") on node \"crc\" DevicePath \"\""
Nov 27 16:21:28 crc kubenswrapper[4707]: I1127 16:21:28.617578 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/13a70ad2-b029-45d8-959e-db6823a68b6a-scripts\") on node \"crc\" DevicePath \"\""
Nov 27 16:21:28 crc kubenswrapper[4707]: I1127 16:21:28.645564 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/13a70ad2-b029-45d8-959e-db6823a68b6a-config-data" (OuterVolumeSpecName: "config-data") pod "13a70ad2-b029-45d8-959e-db6823a68b6a" (UID: "13a70ad2-b029-45d8-959e-db6823a68b6a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 27 16:21:28 crc kubenswrapper[4707]: I1127 16:21:28.719609 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/13a70ad2-b029-45d8-959e-db6823a68b6a-config-data\") on node \"crc\" DevicePath \"\""
Nov 27 16:21:29 crc kubenswrapper[4707]: I1127 16:21:29.173874 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-784ccffcb8-pjrzr"
Nov 27 16:21:29 crc kubenswrapper[4707]: I1127 16:21:29.230151 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/0c553d55-f2dd-404c-bb41-379922a29a20-httpd-config\") pod \"0c553d55-f2dd-404c-bb41-379922a29a20\" (UID: \"0c553d55-f2dd-404c-bb41-379922a29a20\") "
Nov 27 16:21:29 crc kubenswrapper[4707]: I1127 16:21:29.230316 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0c553d55-f2dd-404c-bb41-379922a29a20-combined-ca-bundle\") pod \"0c553d55-f2dd-404c-bb41-379922a29a20\" (UID: \"0c553d55-f2dd-404c-bb41-379922a29a20\") "
Nov 27 16:21:29 crc kubenswrapper[4707]: I1127 16:21:29.230450 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l8d5h\" (UniqueName: \"kubernetes.io/projected/0c553d55-f2dd-404c-bb41-379922a29a20-kube-api-access-l8d5h\") pod \"0c553d55-f2dd-404c-bb41-379922a29a20\" (UID: \"0c553d55-f2dd-404c-bb41-379922a29a20\") "
Nov 27 16:21:29 crc kubenswrapper[4707]: I1127 16:21:29.230537 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/0c553d55-f2dd-404c-bb41-379922a29a20-ovndb-tls-certs\") pod \"0c553d55-f2dd-404c-bb41-379922a29a20\" (UID: \"0c553d55-f2dd-404c-bb41-379922a29a20\") "
Nov 27 16:21:29 crc kubenswrapper[4707]: I1127 16:21:29.230594 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/0c553d55-f2dd-404c-bb41-379922a29a20-config\") pod \"0c553d55-f2dd-404c-bb41-379922a29a20\" (UID: \"0c553d55-f2dd-404c-bb41-379922a29a20\") "
Nov 27 16:21:29 crc kubenswrapper[4707]: I1127 16:21:29.236560 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0c553d55-f2dd-404c-bb41-379922a29a20-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "0c553d55-f2dd-404c-bb41-379922a29a20" (UID: "0c553d55-f2dd-404c-bb41-379922a29a20"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 27 16:21:29 crc kubenswrapper[4707]: I1127 16:21:29.238213 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0c553d55-f2dd-404c-bb41-379922a29a20-kube-api-access-l8d5h" (OuterVolumeSpecName: "kube-api-access-l8d5h") pod "0c553d55-f2dd-404c-bb41-379922a29a20" (UID: "0c553d55-f2dd-404c-bb41-379922a29a20"). InnerVolumeSpecName "kube-api-access-l8d5h". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 27 16:21:29 crc kubenswrapper[4707]: I1127 16:21:29.285358 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"13a70ad2-b029-45d8-959e-db6823a68b6a","Type":"ContainerDied","Data":"76bb8540a5909815f92b6e5dd198fe761f3b685776d2680a0fc823b0dea09bcb"}
Nov 27 16:21:29 crc kubenswrapper[4707]: I1127 16:21:29.285448 4707 scope.go:117] "RemoveContainer" containerID="7063e359332cb179d8d91d30c0eee57acebfa09b276fe0c419ec11711cfc18fe"
Nov 27 16:21:29 crc kubenswrapper[4707]: I1127 16:21:29.285599 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0"
Nov 27 16:21:29 crc kubenswrapper[4707]: I1127 16:21:29.290720 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0c553d55-f2dd-404c-bb41-379922a29a20-config" (OuterVolumeSpecName: "config") pod "0c553d55-f2dd-404c-bb41-379922a29a20" (UID: "0c553d55-f2dd-404c-bb41-379922a29a20"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 27 16:21:29 crc kubenswrapper[4707]: I1127 16:21:29.296474 4707 generic.go:334] "Generic (PLEG): container finished" podID="0c553d55-f2dd-404c-bb41-379922a29a20" containerID="8ffbf91e00c6dac33fbfd7b6f8ac9e2bcbaac4bc89db47e3a51003fcbca41aa2" exitCode=0
Nov 27 16:21:29 crc kubenswrapper[4707]: I1127 16:21:29.296521 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-784ccffcb8-pjrzr" event={"ID":"0c553d55-f2dd-404c-bb41-379922a29a20","Type":"ContainerDied","Data":"8ffbf91e00c6dac33fbfd7b6f8ac9e2bcbaac4bc89db47e3a51003fcbca41aa2"}
Nov 27 16:21:29 crc kubenswrapper[4707]: I1127 16:21:29.296558 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-784ccffcb8-pjrzr" event={"ID":"0c553d55-f2dd-404c-bb41-379922a29a20","Type":"ContainerDied","Data":"3f2651b3b95d241d6cd589498ba66ab0a091f48da4bf3df45eb293072e249c10"}
Nov 27 16:21:29 crc kubenswrapper[4707]: I1127 16:21:29.296636 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-784ccffcb8-pjrzr"
Nov 27 16:21:29 crc kubenswrapper[4707]: I1127 16:21:29.302729 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0c553d55-f2dd-404c-bb41-379922a29a20-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0c553d55-f2dd-404c-bb41-379922a29a20" (UID: "0c553d55-f2dd-404c-bb41-379922a29a20"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 27 16:21:29 crc kubenswrapper[4707]: I1127 16:21:29.326836 4707 scope.go:117] "RemoveContainer" containerID="6cd0fefc6b2d8d18ccb7293279fbe2edae89a3cdbe2ac921d1f5e30dc34d7012"
Nov 27 16:21:29 crc kubenswrapper[4707]: I1127 16:21:29.338937 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l8d5h\" (UniqueName: \"kubernetes.io/projected/0c553d55-f2dd-404c-bb41-379922a29a20-kube-api-access-l8d5h\") on node \"crc\" DevicePath \"\""
Nov 27 16:21:29 crc kubenswrapper[4707]: I1127 16:21:29.338963 4707 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/0c553d55-f2dd-404c-bb41-379922a29a20-config\") on node \"crc\" DevicePath \"\""
Nov 27 16:21:29 crc kubenswrapper[4707]: I1127 16:21:29.338976 4707 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/0c553d55-f2dd-404c-bb41-379922a29a20-httpd-config\") on node \"crc\" DevicePath \"\""
Nov 27 16:21:29 crc kubenswrapper[4707]: I1127 16:21:29.338988 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0c553d55-f2dd-404c-bb41-379922a29a20-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Nov 27 16:21:29 crc kubenswrapper[4707]: I1127 16:21:29.341949 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"]
Nov 27 16:21:29 crc kubenswrapper[4707]: I1127 16:21:29.367741 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0c553d55-f2dd-404c-bb41-379922a29a20-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "0c553d55-f2dd-404c-bb41-379922a29a20" (UID: "0c553d55-f2dd-404c-bb41-379922a29a20"). InnerVolumeSpecName "ovndb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 27 16:21:29 crc kubenswrapper[4707]: I1127 16:21:29.377585 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"]
Nov 27 16:21:29 crc kubenswrapper[4707]: I1127 16:21:29.392633 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"]
Nov 27 16:21:29 crc kubenswrapper[4707]: E1127 16:21:29.393093 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="13a70ad2-b029-45d8-959e-db6823a68b6a" containerName="probe"
Nov 27 16:21:29 crc kubenswrapper[4707]: I1127 16:21:29.393114 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="13a70ad2-b029-45d8-959e-db6823a68b6a" containerName="probe"
Nov 27 16:21:29 crc kubenswrapper[4707]: E1127 16:21:29.393130 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0c553d55-f2dd-404c-bb41-379922a29a20" containerName="neutron-api"
Nov 27 16:21:29 crc kubenswrapper[4707]: I1127 16:21:29.393139 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="0c553d55-f2dd-404c-bb41-379922a29a20" containerName="neutron-api"
Nov 27 16:21:29 crc kubenswrapper[4707]: E1127 16:21:29.393155 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0c553d55-f2dd-404c-bb41-379922a29a20" containerName="neutron-httpd"
Nov 27 16:21:29 crc kubenswrapper[4707]: I1127 16:21:29.393164 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="0c553d55-f2dd-404c-bb41-379922a29a20" containerName="neutron-httpd"
Nov 27 16:21:29 crc kubenswrapper[4707]: E1127 16:21:29.393180 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="13a70ad2-b029-45d8-959e-db6823a68b6a" containerName="cinder-scheduler"
Nov 27 16:21:29 crc kubenswrapper[4707]: I1127 16:21:29.393189 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="13a70ad2-b029-45d8-959e-db6823a68b6a" containerName="cinder-scheduler"
Nov 27 16:21:29 crc kubenswrapper[4707]: E1127 16:21:29.393221 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6c94a5dc-169c-47ba-a007-307336246c92" containerName="init"
Nov 27 16:21:29 crc kubenswrapper[4707]: I1127 16:21:29.393229 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="6c94a5dc-169c-47ba-a007-307336246c92" containerName="init"
Nov 27 16:21:29 crc kubenswrapper[4707]: E1127 16:21:29.393244 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6c94a5dc-169c-47ba-a007-307336246c92" containerName="dnsmasq-dns"
Nov 27 16:21:29 crc kubenswrapper[4707]: I1127 16:21:29.393252 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="6c94a5dc-169c-47ba-a007-307336246c92" containerName="dnsmasq-dns"
Nov 27 16:21:29 crc kubenswrapper[4707]: I1127 16:21:29.393563 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="13a70ad2-b029-45d8-959e-db6823a68b6a" containerName="cinder-scheduler"
Nov 27 16:21:29 crc kubenswrapper[4707]: I1127 16:21:29.393590 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="13a70ad2-b029-45d8-959e-db6823a68b6a" containerName="probe"
Nov 27 16:21:29 crc kubenswrapper[4707]: I1127 16:21:29.393607 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="0c553d55-f2dd-404c-bb41-379922a29a20" containerName="neutron-api"
Nov 27 16:21:29 crc kubenswrapper[4707]: I1127 16:21:29.393629 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="0c553d55-f2dd-404c-bb41-379922a29a20" containerName="neutron-httpd"
Nov 27 16:21:29 crc kubenswrapper[4707]: I1127 16:21:29.393654 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="6c94a5dc-169c-47ba-a007-307336246c92" containerName="dnsmasq-dns"
Nov 27 16:21:29 crc kubenswrapper[4707]: I1127 16:21:29.395035 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0"
Nov 27 16:21:29 crc kubenswrapper[4707]: I1127 16:21:29.397749 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data"
Nov 27 16:21:29 crc kubenswrapper[4707]: I1127 16:21:29.398535 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"]
Nov 27 16:21:29 crc kubenswrapper[4707]: I1127 16:21:29.439825 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7198c845-6481-4a99-b508-b3da40447ba6-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"7198c845-6481-4a99-b508-b3da40447ba6\") " pod="openstack/cinder-scheduler-0"
Nov 27 16:21:29 crc kubenswrapper[4707]: I1127 16:21:29.439862 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7198c845-6481-4a99-b508-b3da40447ba6-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"7198c845-6481-4a99-b508-b3da40447ba6\") " pod="openstack/cinder-scheduler-0"
Nov 27 16:21:29 crc kubenswrapper[4707]: I1127 16:21:29.439890 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/7198c845-6481-4a99-b508-b3da40447ba6-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"7198c845-6481-4a99-b508-b3da40447ba6\") " pod="openstack/cinder-scheduler-0"
Nov 27 16:21:29 crc kubenswrapper[4707]: I1127 16:21:29.439939 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kcwjc\" (UniqueName: \"kubernetes.io/projected/7198c845-6481-4a99-b508-b3da40447ba6-kube-api-access-kcwjc\") pod \"cinder-scheduler-0\" (UID: \"7198c845-6481-4a99-b508-b3da40447ba6\") " pod="openstack/cinder-scheduler-0"
Nov 27 16:21:29 crc kubenswrapper[4707]: I1127 16:21:29.439985 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7198c845-6481-4a99-b508-b3da40447ba6-config-data\") pod \"cinder-scheduler-0\" (UID: \"7198c845-6481-4a99-b508-b3da40447ba6\") " pod="openstack/cinder-scheduler-0"
Nov 27 16:21:29 crc kubenswrapper[4707]: I1127 16:21:29.440022 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7198c845-6481-4a99-b508-b3da40447ba6-scripts\") pod \"cinder-scheduler-0\" (UID: \"7198c845-6481-4a99-b508-b3da40447ba6\") " pod="openstack/cinder-scheduler-0"
Nov 27 16:21:29 crc kubenswrapper[4707]: I1127 16:21:29.440084 4707 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/0c553d55-f2dd-404c-bb41-379922a29a20-ovndb-tls-certs\") on node \"crc\" DevicePath \"\""
Nov 27 16:21:29 crc kubenswrapper[4707]: I1127 16:21:29.440425 4707 scope.go:117] "RemoveContainer" containerID="a15ab2bc546039cc92636f8003591e2da2898c85e5d58bab75f5ae9b2adeb5cb"
Nov 27 16:21:29 crc kubenswrapper[4707]: I1127 16:21:29.467792 4707 scope.go:117] "RemoveContainer" containerID="8ffbf91e00c6dac33fbfd7b6f8ac9e2bcbaac4bc89db47e3a51003fcbca41aa2"
Nov 27 16:21:29 crc kubenswrapper[4707]: I1127 16:21:29.485354 4707 scope.go:117] "RemoveContainer" containerID="a15ab2bc546039cc92636f8003591e2da2898c85e5d58bab75f5ae9b2adeb5cb"
Nov 27 16:21:29 crc kubenswrapper[4707]: E1127 16:21:29.486630 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a15ab2bc546039cc92636f8003591e2da2898c85e5d58bab75f5ae9b2adeb5cb\": container with ID starting with a15ab2bc546039cc92636f8003591e2da2898c85e5d58bab75f5ae9b2adeb5cb not found: ID does not exist" containerID="a15ab2bc546039cc92636f8003591e2da2898c85e5d58bab75f5ae9b2adeb5cb"
Nov 27 16:21:29 crc kubenswrapper[4707]: I1127 16:21:29.486661 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a15ab2bc546039cc92636f8003591e2da2898c85e5d58bab75f5ae9b2adeb5cb"} err="failed to get container status \"a15ab2bc546039cc92636f8003591e2da2898c85e5d58bab75f5ae9b2adeb5cb\": rpc error: code = NotFound desc = could not find container \"a15ab2bc546039cc92636f8003591e2da2898c85e5d58bab75f5ae9b2adeb5cb\": container with ID starting with a15ab2bc546039cc92636f8003591e2da2898c85e5d58bab75f5ae9b2adeb5cb not found: ID does not exist"
Nov 27 16:21:29 crc kubenswrapper[4707]: I1127 16:21:29.486682 4707 scope.go:117] "RemoveContainer" containerID="8ffbf91e00c6dac33fbfd7b6f8ac9e2bcbaac4bc89db47e3a51003fcbca41aa2"
Nov 27 16:21:29 crc kubenswrapper[4707]: E1127 16:21:29.486918 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8ffbf91e00c6dac33fbfd7b6f8ac9e2bcbaac4bc89db47e3a51003fcbca41aa2\": container with ID starting with 8ffbf91e00c6dac33fbfd7b6f8ac9e2bcbaac4bc89db47e3a51003fcbca41aa2 not found: ID does not exist" containerID="8ffbf91e00c6dac33fbfd7b6f8ac9e2bcbaac4bc89db47e3a51003fcbca41aa2"
Nov 27 16:21:29 crc kubenswrapper[4707]: I1127 16:21:29.486940 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8ffbf91e00c6dac33fbfd7b6f8ac9e2bcbaac4bc89db47e3a51003fcbca41aa2"} err="failed to get container status \"8ffbf91e00c6dac33fbfd7b6f8ac9e2bcbaac4bc89db47e3a51003fcbca41aa2\": rpc error: code = NotFound desc = could not find container \"8ffbf91e00c6dac33fbfd7b6f8ac9e2bcbaac4bc89db47e3a51003fcbca41aa2\": container with ID starting with 8ffbf91e00c6dac33fbfd7b6f8ac9e2bcbaac4bc89db47e3a51003fcbca41aa2 not found: ID does not exist"
Nov 27 16:21:29 crc kubenswrapper[4707]: I1127 16:21:29.541655 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7198c845-6481-4a99-b508-b3da40447ba6-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"7198c845-6481-4a99-b508-b3da40447ba6\") " pod="openstack/cinder-scheduler-0"
Nov 27 16:21:29 crc kubenswrapper[4707]: I1127 16:21:29.541697 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7198c845-6481-4a99-b508-b3da40447ba6-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"7198c845-6481-4a99-b508-b3da40447ba6\") " pod="openstack/cinder-scheduler-0"
Nov 27 16:21:29 crc kubenswrapper[4707]: I1127 16:21:29.541721 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/7198c845-6481-4a99-b508-b3da40447ba6-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"7198c845-6481-4a99-b508-b3da40447ba6\") " pod="openstack/cinder-scheduler-0"
Nov 27 16:21:29 crc kubenswrapper[4707]: I1127 16:21:29.541767 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kcwjc\" (UniqueName: \"kubernetes.io/projected/7198c845-6481-4a99-b508-b3da40447ba6-kube-api-access-kcwjc\") pod \"cinder-scheduler-0\" (UID: \"7198c845-6481-4a99-b508-b3da40447ba6\") " pod="openstack/cinder-scheduler-0"
Nov 27 16:21:29 crc kubenswrapper[4707]: I1127 16:21:29.541805 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7198c845-6481-4a99-b508-b3da40447ba6-config-data\") pod \"cinder-scheduler-0\" (UID: \"7198c845-6481-4a99-b508-b3da40447ba6\") " pod="openstack/cinder-scheduler-0"
Nov 27 16:21:29 crc kubenswrapper[4707]: I1127 16:21:29.541845 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7198c845-6481-4a99-b508-b3da40447ba6-scripts\") pod \"cinder-scheduler-0\" (UID: \"7198c845-6481-4a99-b508-b3da40447ba6\") " pod="openstack/cinder-scheduler-0"
Nov 27 16:21:29 crc kubenswrapper[4707]: I1127 16:21:29.542044 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/7198c845-6481-4a99-b508-b3da40447ba6-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"7198c845-6481-4a99-b508-b3da40447ba6\") " pod="openstack/cinder-scheduler-0"
Nov 27 16:21:29 crc kubenswrapper[4707]: I1127 16:21:29.545847 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7198c845-6481-4a99-b508-b3da40447ba6-config-data\") pod \"cinder-scheduler-0\" (UID: \"7198c845-6481-4a99-b508-b3da40447ba6\") " pod="openstack/cinder-scheduler-0"
Nov 27 16:21:29 crc kubenswrapper[4707]: I1127 16:21:29.545953 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7198c845-6481-4a99-b508-b3da40447ba6-scripts\") pod \"cinder-scheduler-0\" (UID: \"7198c845-6481-4a99-b508-b3da40447ba6\") " pod="openstack/cinder-scheduler-0"
Nov 27 16:21:29 crc kubenswrapper[4707]: I1127 16:21:29.546968 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7198c845-6481-4a99-b508-b3da40447ba6-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"7198c845-6481-4a99-b508-b3da40447ba6\") " pod="openstack/cinder-scheduler-0"
Nov 27 16:21:29 crc kubenswrapper[4707]: I1127 16:21:29.555156 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7198c845-6481-4a99-b508-b3da40447ba6-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"7198c845-6481-4a99-b508-b3da40447ba6\") " pod="openstack/cinder-scheduler-0"
Nov 27 16:21:29 crc kubenswrapper[4707]: I1127 16:21:29.558475 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kcwjc\" (UniqueName: \"kubernetes.io/projected/7198c845-6481-4a99-b508-b3da40447ba6-kube-api-access-kcwjc\") pod \"cinder-scheduler-0\" (UID: \"7198c845-6481-4a99-b508-b3da40447ba6\") " pod="openstack/cinder-scheduler-0"
Nov 27 16:21:29 crc kubenswrapper[4707]: I1127 16:21:29.670682 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-784ccffcb8-pjrzr"]
Nov 27 16:21:29 crc kubenswrapper[4707]: I1127 16:21:29.684472 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-784ccffcb8-pjrzr"]
Nov 27 16:21:29 crc kubenswrapper[4707]: I1127 16:21:29.738736 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0"
Nov 27 16:21:29 crc kubenswrapper[4707]: I1127 16:21:29.803650 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/keystone-85c75486d4-pmkdv"
Nov 27 16:21:30 crc kubenswrapper[4707]: I1127 16:21:30.195573 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"]
Nov 27 16:21:30 crc kubenswrapper[4707]: W1127 16:21:30.201457 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7198c845_6481_4a99_b508_b3da40447ba6.slice/crio-1268e548120d76a888f459414d4c2a3e5a5af9100cfe9463819b43c36b5d8547 WatchSource:0}: Error finding container 1268e548120d76a888f459414d4c2a3e5a5af9100cfe9463819b43c36b5d8547: Status 404 returned error can't find the container with id 1268e548120d76a888f459414d4c2a3e5a5af9100cfe9463819b43c36b5d8547
Nov 27 16:21:30 crc kubenswrapper[4707]: I1127 16:21:30.313446 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"7198c845-6481-4a99-b508-b3da40447ba6","Type":"ContainerStarted","Data":"1268e548120d76a888f459414d4c2a3e5a5af9100cfe9463819b43c36b5d8547"}
Nov 27 16:21:31 crc kubenswrapper[4707]: I1127 16:21:31.205961 4707
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0c553d55-f2dd-404c-bb41-379922a29a20" path="/var/lib/kubelet/pods/0c553d55-f2dd-404c-bb41-379922a29a20/volumes" Nov 27 16:21:31 crc kubenswrapper[4707]: I1127 16:21:31.207055 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="13a70ad2-b029-45d8-959e-db6823a68b6a" path="/var/lib/kubelet/pods/13a70ad2-b029-45d8-959e-db6823a68b6a/volumes" Nov 27 16:21:31 crc kubenswrapper[4707]: I1127 16:21:31.330418 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"7198c845-6481-4a99-b508-b3da40447ba6","Type":"ContainerStarted","Data":"0f7766d7f4a1d16f2fbf8933d21ff0469535f2d19a04dfbbc89e46a4385f81f6"} Nov 27 16:21:31 crc kubenswrapper[4707]: I1127 16:21:31.462601 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Nov 27 16:21:31 crc kubenswrapper[4707]: I1127 16:21:31.463770 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Nov 27 16:21:31 crc kubenswrapper[4707]: I1127 16:21:31.465059 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config" Nov 27 16:21:31 crc kubenswrapper[4707]: I1127 16:21:31.466788 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-config-secret" Nov 27 16:21:31 crc kubenswrapper[4707]: I1127 16:21:31.466916 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstackclient-openstackclient-dockercfg-gv4ff" Nov 27 16:21:31 crc kubenswrapper[4707]: I1127 16:21:31.478631 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Nov 27 16:21:31 crc kubenswrapper[4707]: I1127 16:21:31.573816 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: 
\"kubernetes.io/secret/b281d412-34bb-4169-8c35-65318084ab97-openstack-config-secret\") pod \"openstackclient\" (UID: \"b281d412-34bb-4169-8c35-65318084ab97\") " pod="openstack/openstackclient" Nov 27 16:21:31 crc kubenswrapper[4707]: I1127 16:21:31.573860 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gvr5w\" (UniqueName: \"kubernetes.io/projected/b281d412-34bb-4169-8c35-65318084ab97-kube-api-access-gvr5w\") pod \"openstackclient\" (UID: \"b281d412-34bb-4169-8c35-65318084ab97\") " pod="openstack/openstackclient" Nov 27 16:21:31 crc kubenswrapper[4707]: I1127 16:21:31.573892 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b281d412-34bb-4169-8c35-65318084ab97-combined-ca-bundle\") pod \"openstackclient\" (UID: \"b281d412-34bb-4169-8c35-65318084ab97\") " pod="openstack/openstackclient" Nov 27 16:21:31 crc kubenswrapper[4707]: I1127 16:21:31.573954 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/b281d412-34bb-4169-8c35-65318084ab97-openstack-config\") pod \"openstackclient\" (UID: \"b281d412-34bb-4169-8c35-65318084ab97\") " pod="openstack/openstackclient" Nov 27 16:21:31 crc kubenswrapper[4707]: I1127 16:21:31.676234 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/b281d412-34bb-4169-8c35-65318084ab97-openstack-config-secret\") pod \"openstackclient\" (UID: \"b281d412-34bb-4169-8c35-65318084ab97\") " pod="openstack/openstackclient" Nov 27 16:21:31 crc kubenswrapper[4707]: I1127 16:21:31.676301 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gvr5w\" (UniqueName: 
\"kubernetes.io/projected/b281d412-34bb-4169-8c35-65318084ab97-kube-api-access-gvr5w\") pod \"openstackclient\" (UID: \"b281d412-34bb-4169-8c35-65318084ab97\") " pod="openstack/openstackclient" Nov 27 16:21:31 crc kubenswrapper[4707]: I1127 16:21:31.676947 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b281d412-34bb-4169-8c35-65318084ab97-combined-ca-bundle\") pod \"openstackclient\" (UID: \"b281d412-34bb-4169-8c35-65318084ab97\") " pod="openstack/openstackclient" Nov 27 16:21:31 crc kubenswrapper[4707]: I1127 16:21:31.677726 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/b281d412-34bb-4169-8c35-65318084ab97-openstack-config\") pod \"openstackclient\" (UID: \"b281d412-34bb-4169-8c35-65318084ab97\") " pod="openstack/openstackclient" Nov 27 16:21:31 crc kubenswrapper[4707]: I1127 16:21:31.679327 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/b281d412-34bb-4169-8c35-65318084ab97-openstack-config\") pod \"openstackclient\" (UID: \"b281d412-34bb-4169-8c35-65318084ab97\") " pod="openstack/openstackclient" Nov 27 16:21:31 crc kubenswrapper[4707]: I1127 16:21:31.684119 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b281d412-34bb-4169-8c35-65318084ab97-combined-ca-bundle\") pod \"openstackclient\" (UID: \"b281d412-34bb-4169-8c35-65318084ab97\") " pod="openstack/openstackclient" Nov 27 16:21:31 crc kubenswrapper[4707]: I1127 16:21:31.684210 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/b281d412-34bb-4169-8c35-65318084ab97-openstack-config-secret\") pod \"openstackclient\" (UID: \"b281d412-34bb-4169-8c35-65318084ab97\") " 
pod="openstack/openstackclient" Nov 27 16:21:31 crc kubenswrapper[4707]: I1127 16:21:31.693065 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gvr5w\" (UniqueName: \"kubernetes.io/projected/b281d412-34bb-4169-8c35-65318084ab97-kube-api-access-gvr5w\") pod \"openstackclient\" (UID: \"b281d412-34bb-4169-8c35-65318084ab97\") " pod="openstack/openstackclient" Nov 27 16:21:31 crc kubenswrapper[4707]: I1127 16:21:31.799619 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Nov 27 16:21:32 crc kubenswrapper[4707]: I1127 16:21:32.239988 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Nov 27 16:21:32 crc kubenswrapper[4707]: I1127 16:21:32.341745 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"7198c845-6481-4a99-b508-b3da40447ba6","Type":"ContainerStarted","Data":"4b27092a6dfab040e07166ab8457b00e3f75c457dca4c625abcf36aac8678433"} Nov 27 16:21:32 crc kubenswrapper[4707]: I1127 16:21:32.343899 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"b281d412-34bb-4169-8c35-65318084ab97","Type":"ContainerStarted","Data":"365dba7cbd3fcf5acff3ac9eef8ab46e8e5e28225ca562ba517fded9e3ddc82e"} Nov 27 16:21:32 crc kubenswrapper[4707]: I1127 16:21:32.367339 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=3.367321324 podStartE2EDuration="3.367321324s" podCreationTimestamp="2025-11-27 16:21:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-27 16:21:32.359051925 +0000 UTC m=+1067.990500693" watchObservedRunningTime="2025-11-27 16:21:32.367321324 +0000 UTC m=+1067.998770102" Nov 27 16:21:33 crc kubenswrapper[4707]: I1127 16:21:33.593619 4707 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openstack/cinder-api-0" Nov 27 16:21:34 crc kubenswrapper[4707]: I1127 16:21:34.738980 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Nov 27 16:21:37 crc kubenswrapper[4707]: I1127 16:21:37.126975 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Nov 27 16:21:37 crc kubenswrapper[4707]: I1127 16:21:37.127786 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="20cd323d-9f2e-472f-8c75-9b413bbdb303" containerName="ceilometer-central-agent" containerID="cri-o://d81248bbf232c36a76ada9937a510da6bfacd6954797306990a4e5145d9c0e2c" gracePeriod=30 Nov 27 16:21:37 crc kubenswrapper[4707]: I1127 16:21:37.127909 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="20cd323d-9f2e-472f-8c75-9b413bbdb303" containerName="proxy-httpd" containerID="cri-o://781daef22c40165b16cc4b848299bb59d2a15fb97bba4a3c6e9b8e707327dcba" gracePeriod=30 Nov 27 16:21:37 crc kubenswrapper[4707]: I1127 16:21:37.127954 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="20cd323d-9f2e-472f-8c75-9b413bbdb303" containerName="sg-core" containerID="cri-o://ed7e45af14424d36b8d34d430ff20fe0a559a93ab0e83b7a6e5c3d1973b78ae1" gracePeriod=30 Nov 27 16:21:37 crc kubenswrapper[4707]: I1127 16:21:37.127996 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="20cd323d-9f2e-472f-8c75-9b413bbdb303" containerName="ceilometer-notification-agent" containerID="cri-o://2cc08f712dd41ada2b39f74e2e2a7cde303e8246eb9d310e11ff71490a4cfdd5" gracePeriod=30 Nov 27 16:21:37 crc kubenswrapper[4707]: I1127 16:21:37.142249 4707 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="20cd323d-9f2e-472f-8c75-9b413bbdb303" 
containerName="proxy-httpd" probeResult="failure" output="Get \"http://10.217.0.162:3000/\": EOF" Nov 27 16:21:37 crc kubenswrapper[4707]: I1127 16:21:37.273130 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-proxy-5475fd4f89-8stjv"] Nov 27 16:21:37 crc kubenswrapper[4707]: I1127 16:21:37.274612 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-5475fd4f89-8stjv" Nov 27 16:21:37 crc kubenswrapper[4707]: I1127 16:21:37.276638 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/7362e7e3-1145-4e89-84db-343739624472-etc-swift\") pod \"swift-proxy-5475fd4f89-8stjv\" (UID: \"7362e7e3-1145-4e89-84db-343739624472\") " pod="openstack/swift-proxy-5475fd4f89-8stjv" Nov 27 16:21:37 crc kubenswrapper[4707]: I1127 16:21:37.276741 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7362e7e3-1145-4e89-84db-343739624472-log-httpd\") pod \"swift-proxy-5475fd4f89-8stjv\" (UID: \"7362e7e3-1145-4e89-84db-343739624472\") " pod="openstack/swift-proxy-5475fd4f89-8stjv" Nov 27 16:21:37 crc kubenswrapper[4707]: I1127 16:21:37.276775 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7362e7e3-1145-4e89-84db-343739624472-public-tls-certs\") pod \"swift-proxy-5475fd4f89-8stjv\" (UID: \"7362e7e3-1145-4e89-84db-343739624472\") " pod="openstack/swift-proxy-5475fd4f89-8stjv" Nov 27 16:21:37 crc kubenswrapper[4707]: I1127 16:21:37.276816 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7362e7e3-1145-4e89-84db-343739624472-config-data\") pod \"swift-proxy-5475fd4f89-8stjv\" (UID: \"7362e7e3-1145-4e89-84db-343739624472\") 
" pod="openstack/swift-proxy-5475fd4f89-8stjv" Nov 27 16:21:37 crc kubenswrapper[4707]: I1127 16:21:37.276835 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7362e7e3-1145-4e89-84db-343739624472-internal-tls-certs\") pod \"swift-proxy-5475fd4f89-8stjv\" (UID: \"7362e7e3-1145-4e89-84db-343739624472\") " pod="openstack/swift-proxy-5475fd4f89-8stjv" Nov 27 16:21:37 crc kubenswrapper[4707]: I1127 16:21:37.276862 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nxbml\" (UniqueName: \"kubernetes.io/projected/7362e7e3-1145-4e89-84db-343739624472-kube-api-access-nxbml\") pod \"swift-proxy-5475fd4f89-8stjv\" (UID: \"7362e7e3-1145-4e89-84db-343739624472\") " pod="openstack/swift-proxy-5475fd4f89-8stjv" Nov 27 16:21:37 crc kubenswrapper[4707]: I1127 16:21:37.276916 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7362e7e3-1145-4e89-84db-343739624472-run-httpd\") pod \"swift-proxy-5475fd4f89-8stjv\" (UID: \"7362e7e3-1145-4e89-84db-343739624472\") " pod="openstack/swift-proxy-5475fd4f89-8stjv" Nov 27 16:21:37 crc kubenswrapper[4707]: I1127 16:21:37.276938 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7362e7e3-1145-4e89-84db-343739624472-combined-ca-bundle\") pod \"swift-proxy-5475fd4f89-8stjv\" (UID: \"7362e7e3-1145-4e89-84db-343739624472\") " pod="openstack/swift-proxy-5475fd4f89-8stjv" Nov 27 16:21:37 crc kubenswrapper[4707]: I1127 16:21:37.279527 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-internal-svc" Nov 27 16:21:37 crc kubenswrapper[4707]: I1127 16:21:37.279591 4707 reflector.go:368] Caches populated for *v1.Secret from 
object-"openstack"/"cert-swift-public-svc" Nov 27 16:21:37 crc kubenswrapper[4707]: I1127 16:21:37.279681 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Nov 27 16:21:37 crc kubenswrapper[4707]: I1127 16:21:37.282666 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-5475fd4f89-8stjv"] Nov 27 16:21:37 crc kubenswrapper[4707]: I1127 16:21:37.377980 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7362e7e3-1145-4e89-84db-343739624472-run-httpd\") pod \"swift-proxy-5475fd4f89-8stjv\" (UID: \"7362e7e3-1145-4e89-84db-343739624472\") " pod="openstack/swift-proxy-5475fd4f89-8stjv" Nov 27 16:21:37 crc kubenswrapper[4707]: I1127 16:21:37.378026 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7362e7e3-1145-4e89-84db-343739624472-combined-ca-bundle\") pod \"swift-proxy-5475fd4f89-8stjv\" (UID: \"7362e7e3-1145-4e89-84db-343739624472\") " pod="openstack/swift-proxy-5475fd4f89-8stjv" Nov 27 16:21:37 crc kubenswrapper[4707]: I1127 16:21:37.378055 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/7362e7e3-1145-4e89-84db-343739624472-etc-swift\") pod \"swift-proxy-5475fd4f89-8stjv\" (UID: \"7362e7e3-1145-4e89-84db-343739624472\") " pod="openstack/swift-proxy-5475fd4f89-8stjv" Nov 27 16:21:37 crc kubenswrapper[4707]: I1127 16:21:37.378540 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7362e7e3-1145-4e89-84db-343739624472-log-httpd\") pod \"swift-proxy-5475fd4f89-8stjv\" (UID: \"7362e7e3-1145-4e89-84db-343739624472\") " pod="openstack/swift-proxy-5475fd4f89-8stjv" Nov 27 16:21:37 crc kubenswrapper[4707]: I1127 16:21:37.378589 4707 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7362e7e3-1145-4e89-84db-343739624472-public-tls-certs\") pod \"swift-proxy-5475fd4f89-8stjv\" (UID: \"7362e7e3-1145-4e89-84db-343739624472\") " pod="openstack/swift-proxy-5475fd4f89-8stjv" Nov 27 16:21:37 crc kubenswrapper[4707]: I1127 16:21:37.378641 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7362e7e3-1145-4e89-84db-343739624472-config-data\") pod \"swift-proxy-5475fd4f89-8stjv\" (UID: \"7362e7e3-1145-4e89-84db-343739624472\") " pod="openstack/swift-proxy-5475fd4f89-8stjv" Nov 27 16:21:37 crc kubenswrapper[4707]: I1127 16:21:37.378661 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7362e7e3-1145-4e89-84db-343739624472-internal-tls-certs\") pod \"swift-proxy-5475fd4f89-8stjv\" (UID: \"7362e7e3-1145-4e89-84db-343739624472\") " pod="openstack/swift-proxy-5475fd4f89-8stjv" Nov 27 16:21:37 crc kubenswrapper[4707]: I1127 16:21:37.378682 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nxbml\" (UniqueName: \"kubernetes.io/projected/7362e7e3-1145-4e89-84db-343739624472-kube-api-access-nxbml\") pod \"swift-proxy-5475fd4f89-8stjv\" (UID: \"7362e7e3-1145-4e89-84db-343739624472\") " pod="openstack/swift-proxy-5475fd4f89-8stjv" Nov 27 16:21:37 crc kubenswrapper[4707]: I1127 16:21:37.379228 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7362e7e3-1145-4e89-84db-343739624472-log-httpd\") pod \"swift-proxy-5475fd4f89-8stjv\" (UID: \"7362e7e3-1145-4e89-84db-343739624472\") " pod="openstack/swift-proxy-5475fd4f89-8stjv" Nov 27 16:21:37 crc kubenswrapper[4707]: I1127 16:21:37.379595 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" 
(UniqueName: \"kubernetes.io/empty-dir/7362e7e3-1145-4e89-84db-343739624472-run-httpd\") pod \"swift-proxy-5475fd4f89-8stjv\" (UID: \"7362e7e3-1145-4e89-84db-343739624472\") " pod="openstack/swift-proxy-5475fd4f89-8stjv" Nov 27 16:21:37 crc kubenswrapper[4707]: I1127 16:21:37.388281 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7362e7e3-1145-4e89-84db-343739624472-combined-ca-bundle\") pod \"swift-proxy-5475fd4f89-8stjv\" (UID: \"7362e7e3-1145-4e89-84db-343739624472\") " pod="openstack/swift-proxy-5475fd4f89-8stjv" Nov 27 16:21:37 crc kubenswrapper[4707]: I1127 16:21:37.390811 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7362e7e3-1145-4e89-84db-343739624472-public-tls-certs\") pod \"swift-proxy-5475fd4f89-8stjv\" (UID: \"7362e7e3-1145-4e89-84db-343739624472\") " pod="openstack/swift-proxy-5475fd4f89-8stjv" Nov 27 16:21:37 crc kubenswrapper[4707]: I1127 16:21:37.392227 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7362e7e3-1145-4e89-84db-343739624472-internal-tls-certs\") pod \"swift-proxy-5475fd4f89-8stjv\" (UID: \"7362e7e3-1145-4e89-84db-343739624472\") " pod="openstack/swift-proxy-5475fd4f89-8stjv" Nov 27 16:21:37 crc kubenswrapper[4707]: I1127 16:21:37.392487 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/7362e7e3-1145-4e89-84db-343739624472-etc-swift\") pod \"swift-proxy-5475fd4f89-8stjv\" (UID: \"7362e7e3-1145-4e89-84db-343739624472\") " pod="openstack/swift-proxy-5475fd4f89-8stjv" Nov 27 16:21:37 crc kubenswrapper[4707]: I1127 16:21:37.394873 4707 generic.go:334] "Generic (PLEG): container finished" podID="20cd323d-9f2e-472f-8c75-9b413bbdb303" containerID="781daef22c40165b16cc4b848299bb59d2a15fb97bba4a3c6e9b8e707327dcba" 
exitCode=0 Nov 27 16:21:37 crc kubenswrapper[4707]: I1127 16:21:37.394899 4707 generic.go:334] "Generic (PLEG): container finished" podID="20cd323d-9f2e-472f-8c75-9b413bbdb303" containerID="ed7e45af14424d36b8d34d430ff20fe0a559a93ab0e83b7a6e5c3d1973b78ae1" exitCode=2 Nov 27 16:21:37 crc kubenswrapper[4707]: I1127 16:21:37.394920 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"20cd323d-9f2e-472f-8c75-9b413bbdb303","Type":"ContainerDied","Data":"781daef22c40165b16cc4b848299bb59d2a15fb97bba4a3c6e9b8e707327dcba"} Nov 27 16:21:37 crc kubenswrapper[4707]: I1127 16:21:37.394951 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"20cd323d-9f2e-472f-8c75-9b413bbdb303","Type":"ContainerDied","Data":"ed7e45af14424d36b8d34d430ff20fe0a559a93ab0e83b7a6e5c3d1973b78ae1"} Nov 27 16:21:37 crc kubenswrapper[4707]: I1127 16:21:37.399758 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7362e7e3-1145-4e89-84db-343739624472-config-data\") pod \"swift-proxy-5475fd4f89-8stjv\" (UID: \"7362e7e3-1145-4e89-84db-343739624472\") " pod="openstack/swift-proxy-5475fd4f89-8stjv" Nov 27 16:21:37 crc kubenswrapper[4707]: I1127 16:21:37.400765 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nxbml\" (UniqueName: \"kubernetes.io/projected/7362e7e3-1145-4e89-84db-343739624472-kube-api-access-nxbml\") pod \"swift-proxy-5475fd4f89-8stjv\" (UID: \"7362e7e3-1145-4e89-84db-343739624472\") " pod="openstack/swift-proxy-5475fd4f89-8stjv" Nov 27 16:21:37 crc kubenswrapper[4707]: I1127 16:21:37.612393 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-engine-7678b8b68b-mp4qf"] Nov 27 16:21:37 crc kubenswrapper[4707]: I1127 16:21:37.613824 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-engine-7678b8b68b-mp4qf" Nov 27 16:21:37 crc kubenswrapper[4707]: I1127 16:21:37.615857 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-engine-config-data" Nov 27 16:21:37 crc kubenswrapper[4707]: I1127 16:21:37.616354 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-config-data" Nov 27 16:21:37 crc kubenswrapper[4707]: I1127 16:21:37.616540 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-heat-dockercfg-s8xhq" Nov 27 16:21:37 crc kubenswrapper[4707]: I1127 16:21:37.636227 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-engine-7678b8b68b-mp4qf"] Nov 27 16:21:37 crc kubenswrapper[4707]: I1127 16:21:37.636707 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-5475fd4f89-8stjv" Nov 27 16:21:37 crc kubenswrapper[4707]: I1127 16:21:37.718511 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7756b9d78c-qkfjh"] Nov 27 16:21:37 crc kubenswrapper[4707]: I1127 16:21:37.720001 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7756b9d78c-qkfjh" Nov 27 16:21:37 crc kubenswrapper[4707]: I1127 16:21:37.739612 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-cfnapi-5584d95bf4-c6hwk"] Nov 27 16:21:37 crc kubenswrapper[4707]: I1127 16:21:37.740764 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-cfnapi-5584d95bf4-c6hwk" Nov 27 16:21:37 crc kubenswrapper[4707]: I1127 16:21:37.744877 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-cfnapi-config-data" Nov 27 16:21:37 crc kubenswrapper[4707]: I1127 16:21:37.783492 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-5584d95bf4-c6hwk"] Nov 27 16:21:37 crc kubenswrapper[4707]: I1127 16:21:37.789284 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5e02884e-9f0d-45a5-a916-aa4018402ee4-config-data\") pod \"heat-engine-7678b8b68b-mp4qf\" (UID: \"5e02884e-9f0d-45a5-a916-aa4018402ee4\") " pod="openstack/heat-engine-7678b8b68b-mp4qf" Nov 27 16:21:37 crc kubenswrapper[4707]: I1127 16:21:37.789501 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5e02884e-9f0d-45a5-a916-aa4018402ee4-combined-ca-bundle\") pod \"heat-engine-7678b8b68b-mp4qf\" (UID: \"5e02884e-9f0d-45a5-a916-aa4018402ee4\") " pod="openstack/heat-engine-7678b8b68b-mp4qf" Nov 27 16:21:37 crc kubenswrapper[4707]: I1127 16:21:37.789639 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5e02884e-9f0d-45a5-a916-aa4018402ee4-config-data-custom\") pod \"heat-engine-7678b8b68b-mp4qf\" (UID: \"5e02884e-9f0d-45a5-a916-aa4018402ee4\") " pod="openstack/heat-engine-7678b8b68b-mp4qf" Nov 27 16:21:37 crc kubenswrapper[4707]: I1127 16:21:37.789731 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b4nkc\" (UniqueName: \"kubernetes.io/projected/5e02884e-9f0d-45a5-a916-aa4018402ee4-kube-api-access-b4nkc\") pod \"heat-engine-7678b8b68b-mp4qf\" (UID: 
\"5e02884e-9f0d-45a5-a916-aa4018402ee4\") " pod="openstack/heat-engine-7678b8b68b-mp4qf" Nov 27 16:21:37 crc kubenswrapper[4707]: I1127 16:21:37.806921 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7756b9d78c-qkfjh"] Nov 27 16:21:37 crc kubenswrapper[4707]: I1127 16:21:37.858257 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-api-7db44cddd-l4rvd"] Nov 27 16:21:37 crc kubenswrapper[4707]: I1127 16:21:37.863807 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-7db44cddd-l4rvd" Nov 27 16:21:37 crc kubenswrapper[4707]: I1127 16:21:37.867568 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-api-config-data" Nov 27 16:21:37 crc kubenswrapper[4707]: I1127 16:21:37.882424 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-7db44cddd-l4rvd"] Nov 27 16:21:37 crc kubenswrapper[4707]: I1127 16:21:37.890946 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/254d0a1b-ea0e-4d0c-8a20-fb85542900fb-ovsdbserver-nb\") pod \"dnsmasq-dns-7756b9d78c-qkfjh\" (UID: \"254d0a1b-ea0e-4d0c-8a20-fb85542900fb\") " pod="openstack/dnsmasq-dns-7756b9d78c-qkfjh" Nov 27 16:21:37 crc kubenswrapper[4707]: I1127 16:21:37.890985 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xm2kn\" (UniqueName: \"kubernetes.io/projected/254d0a1b-ea0e-4d0c-8a20-fb85542900fb-kube-api-access-xm2kn\") pod \"dnsmasq-dns-7756b9d78c-qkfjh\" (UID: \"254d0a1b-ea0e-4d0c-8a20-fb85542900fb\") " pod="openstack/dnsmasq-dns-7756b9d78c-qkfjh" Nov 27 16:21:37 crc kubenswrapper[4707]: I1127 16:21:37.891012 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/254d0a1b-ea0e-4d0c-8a20-fb85542900fb-dns-svc\") pod \"dnsmasq-dns-7756b9d78c-qkfjh\" (UID: \"254d0a1b-ea0e-4d0c-8a20-fb85542900fb\") " pod="openstack/dnsmasq-dns-7756b9d78c-qkfjh" Nov 27 16:21:37 crc kubenswrapper[4707]: I1127 16:21:37.891031 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/254d0a1b-ea0e-4d0c-8a20-fb85542900fb-config\") pod \"dnsmasq-dns-7756b9d78c-qkfjh\" (UID: \"254d0a1b-ea0e-4d0c-8a20-fb85542900fb\") " pod="openstack/dnsmasq-dns-7756b9d78c-qkfjh" Nov 27 16:21:37 crc kubenswrapper[4707]: I1127 16:21:37.891061 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5e02884e-9f0d-45a5-a916-aa4018402ee4-config-data\") pod \"heat-engine-7678b8b68b-mp4qf\" (UID: \"5e02884e-9f0d-45a5-a916-aa4018402ee4\") " pod="openstack/heat-engine-7678b8b68b-mp4qf" Nov 27 16:21:37 crc kubenswrapper[4707]: I1127 16:21:37.891077 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2e50d632-6bfc-48aa-ab32-f0a05105b482-config-data-custom\") pod \"heat-cfnapi-5584d95bf4-c6hwk\" (UID: \"2e50d632-6bfc-48aa-ab32-f0a05105b482\") " pod="openstack/heat-cfnapi-5584d95bf4-c6hwk" Nov 27 16:21:37 crc kubenswrapper[4707]: I1127 16:21:37.891131 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/254d0a1b-ea0e-4d0c-8a20-fb85542900fb-ovsdbserver-sb\") pod \"dnsmasq-dns-7756b9d78c-qkfjh\" (UID: \"254d0a1b-ea0e-4d0c-8a20-fb85542900fb\") " pod="openstack/dnsmasq-dns-7756b9d78c-qkfjh" Nov 27 16:21:37 crc kubenswrapper[4707]: I1127 16:21:37.891163 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/5e02884e-9f0d-45a5-a916-aa4018402ee4-combined-ca-bundle\") pod \"heat-engine-7678b8b68b-mp4qf\" (UID: \"5e02884e-9f0d-45a5-a916-aa4018402ee4\") " pod="openstack/heat-engine-7678b8b68b-mp4qf" Nov 27 16:21:37 crc kubenswrapper[4707]: I1127 16:21:37.891201 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2e50d632-6bfc-48aa-ab32-f0a05105b482-config-data\") pod \"heat-cfnapi-5584d95bf4-c6hwk\" (UID: \"2e50d632-6bfc-48aa-ab32-f0a05105b482\") " pod="openstack/heat-cfnapi-5584d95bf4-c6hwk" Nov 27 16:21:37 crc kubenswrapper[4707]: I1127 16:21:37.891220 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7rvlp\" (UniqueName: \"kubernetes.io/projected/2e50d632-6bfc-48aa-ab32-f0a05105b482-kube-api-access-7rvlp\") pod \"heat-cfnapi-5584d95bf4-c6hwk\" (UID: \"2e50d632-6bfc-48aa-ab32-f0a05105b482\") " pod="openstack/heat-cfnapi-5584d95bf4-c6hwk" Nov 27 16:21:37 crc kubenswrapper[4707]: I1127 16:21:37.891238 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5e02884e-9f0d-45a5-a916-aa4018402ee4-config-data-custom\") pod \"heat-engine-7678b8b68b-mp4qf\" (UID: \"5e02884e-9f0d-45a5-a916-aa4018402ee4\") " pod="openstack/heat-engine-7678b8b68b-mp4qf" Nov 27 16:21:37 crc kubenswrapper[4707]: I1127 16:21:37.891257 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2e50d632-6bfc-48aa-ab32-f0a05105b482-combined-ca-bundle\") pod \"heat-cfnapi-5584d95bf4-c6hwk\" (UID: \"2e50d632-6bfc-48aa-ab32-f0a05105b482\") " pod="openstack/heat-cfnapi-5584d95bf4-c6hwk" Nov 27 16:21:37 crc kubenswrapper[4707]: I1127 16:21:37.891306 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/254d0a1b-ea0e-4d0c-8a20-fb85542900fb-dns-swift-storage-0\") pod \"dnsmasq-dns-7756b9d78c-qkfjh\" (UID: \"254d0a1b-ea0e-4d0c-8a20-fb85542900fb\") " pod="openstack/dnsmasq-dns-7756b9d78c-qkfjh" Nov 27 16:21:37 crc kubenswrapper[4707]: I1127 16:21:37.891325 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b4nkc\" (UniqueName: \"kubernetes.io/projected/5e02884e-9f0d-45a5-a916-aa4018402ee4-kube-api-access-b4nkc\") pod \"heat-engine-7678b8b68b-mp4qf\" (UID: \"5e02884e-9f0d-45a5-a916-aa4018402ee4\") " pod="openstack/heat-engine-7678b8b68b-mp4qf" Nov 27 16:21:37 crc kubenswrapper[4707]: I1127 16:21:37.898258 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5e02884e-9f0d-45a5-a916-aa4018402ee4-config-data\") pod \"heat-engine-7678b8b68b-mp4qf\" (UID: \"5e02884e-9f0d-45a5-a916-aa4018402ee4\") " pod="openstack/heat-engine-7678b8b68b-mp4qf" Nov 27 16:21:37 crc kubenswrapper[4707]: I1127 16:21:37.900982 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5e02884e-9f0d-45a5-a916-aa4018402ee4-config-data-custom\") pod \"heat-engine-7678b8b68b-mp4qf\" (UID: \"5e02884e-9f0d-45a5-a916-aa4018402ee4\") " pod="openstack/heat-engine-7678b8b68b-mp4qf" Nov 27 16:21:37 crc kubenswrapper[4707]: I1127 16:21:37.920598 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5e02884e-9f0d-45a5-a916-aa4018402ee4-combined-ca-bundle\") pod \"heat-engine-7678b8b68b-mp4qf\" (UID: \"5e02884e-9f0d-45a5-a916-aa4018402ee4\") " pod="openstack/heat-engine-7678b8b68b-mp4qf" Nov 27 16:21:37 crc kubenswrapper[4707]: I1127 16:21:37.925507 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b4nkc\" (UniqueName: 
\"kubernetes.io/projected/5e02884e-9f0d-45a5-a916-aa4018402ee4-kube-api-access-b4nkc\") pod \"heat-engine-7678b8b68b-mp4qf\" (UID: \"5e02884e-9f0d-45a5-a916-aa4018402ee4\") " pod="openstack/heat-engine-7678b8b68b-mp4qf" Nov 27 16:21:37 crc kubenswrapper[4707]: I1127 16:21:37.940227 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-engine-7678b8b68b-mp4qf" Nov 27 16:21:37 crc kubenswrapper[4707]: I1127 16:21:37.992833 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/254d0a1b-ea0e-4d0c-8a20-fb85542900fb-ovsdbserver-sb\") pod \"dnsmasq-dns-7756b9d78c-qkfjh\" (UID: \"254d0a1b-ea0e-4d0c-8a20-fb85542900fb\") " pod="openstack/dnsmasq-dns-7756b9d78c-qkfjh" Nov 27 16:21:37 crc kubenswrapper[4707]: I1127 16:21:37.992892 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/75326248-4957-4086-ad33-0a8c76ec7ff5-config-data-custom\") pod \"heat-api-7db44cddd-l4rvd\" (UID: \"75326248-4957-4086-ad33-0a8c76ec7ff5\") " pod="openstack/heat-api-7db44cddd-l4rvd" Nov 27 16:21:37 crc kubenswrapper[4707]: I1127 16:21:37.992941 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/75326248-4957-4086-ad33-0a8c76ec7ff5-config-data\") pod \"heat-api-7db44cddd-l4rvd\" (UID: \"75326248-4957-4086-ad33-0a8c76ec7ff5\") " pod="openstack/heat-api-7db44cddd-l4rvd" Nov 27 16:21:37 crc kubenswrapper[4707]: I1127 16:21:37.992969 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2e50d632-6bfc-48aa-ab32-f0a05105b482-config-data\") pod \"heat-cfnapi-5584d95bf4-c6hwk\" (UID: \"2e50d632-6bfc-48aa-ab32-f0a05105b482\") " pod="openstack/heat-cfnapi-5584d95bf4-c6hwk" Nov 27 16:21:37 crc 
kubenswrapper[4707]: I1127 16:21:37.992990 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7rvlp\" (UniqueName: \"kubernetes.io/projected/2e50d632-6bfc-48aa-ab32-f0a05105b482-kube-api-access-7rvlp\") pod \"heat-cfnapi-5584d95bf4-c6hwk\" (UID: \"2e50d632-6bfc-48aa-ab32-f0a05105b482\") " pod="openstack/heat-cfnapi-5584d95bf4-c6hwk" Nov 27 16:21:37 crc kubenswrapper[4707]: I1127 16:21:37.993010 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2e50d632-6bfc-48aa-ab32-f0a05105b482-combined-ca-bundle\") pod \"heat-cfnapi-5584d95bf4-c6hwk\" (UID: \"2e50d632-6bfc-48aa-ab32-f0a05105b482\") " pod="openstack/heat-cfnapi-5584d95bf4-c6hwk" Nov 27 16:21:37 crc kubenswrapper[4707]: I1127 16:21:37.993047 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/254d0a1b-ea0e-4d0c-8a20-fb85542900fb-dns-swift-storage-0\") pod \"dnsmasq-dns-7756b9d78c-qkfjh\" (UID: \"254d0a1b-ea0e-4d0c-8a20-fb85542900fb\") " pod="openstack/dnsmasq-dns-7756b9d78c-qkfjh" Nov 27 16:21:37 crc kubenswrapper[4707]: I1127 16:21:37.993073 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/254d0a1b-ea0e-4d0c-8a20-fb85542900fb-ovsdbserver-nb\") pod \"dnsmasq-dns-7756b9d78c-qkfjh\" (UID: \"254d0a1b-ea0e-4d0c-8a20-fb85542900fb\") " pod="openstack/dnsmasq-dns-7756b9d78c-qkfjh" Nov 27 16:21:37 crc kubenswrapper[4707]: I1127 16:21:37.993092 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-42q7j\" (UniqueName: \"kubernetes.io/projected/75326248-4957-4086-ad33-0a8c76ec7ff5-kube-api-access-42q7j\") pod \"heat-api-7db44cddd-l4rvd\" (UID: \"75326248-4957-4086-ad33-0a8c76ec7ff5\") " pod="openstack/heat-api-7db44cddd-l4rvd" Nov 27 16:21:37 crc 
kubenswrapper[4707]: I1127 16:21:37.993108 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xm2kn\" (UniqueName: \"kubernetes.io/projected/254d0a1b-ea0e-4d0c-8a20-fb85542900fb-kube-api-access-xm2kn\") pod \"dnsmasq-dns-7756b9d78c-qkfjh\" (UID: \"254d0a1b-ea0e-4d0c-8a20-fb85542900fb\") " pod="openstack/dnsmasq-dns-7756b9d78c-qkfjh" Nov 27 16:21:37 crc kubenswrapper[4707]: I1127 16:21:37.993128 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/254d0a1b-ea0e-4d0c-8a20-fb85542900fb-dns-svc\") pod \"dnsmasq-dns-7756b9d78c-qkfjh\" (UID: \"254d0a1b-ea0e-4d0c-8a20-fb85542900fb\") " pod="openstack/dnsmasq-dns-7756b9d78c-qkfjh" Nov 27 16:21:37 crc kubenswrapper[4707]: I1127 16:21:37.993149 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/254d0a1b-ea0e-4d0c-8a20-fb85542900fb-config\") pod \"dnsmasq-dns-7756b9d78c-qkfjh\" (UID: \"254d0a1b-ea0e-4d0c-8a20-fb85542900fb\") " pod="openstack/dnsmasq-dns-7756b9d78c-qkfjh" Nov 27 16:21:37 crc kubenswrapper[4707]: I1127 16:21:37.993163 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/75326248-4957-4086-ad33-0a8c76ec7ff5-combined-ca-bundle\") pod \"heat-api-7db44cddd-l4rvd\" (UID: \"75326248-4957-4086-ad33-0a8c76ec7ff5\") " pod="openstack/heat-api-7db44cddd-l4rvd" Nov 27 16:21:37 crc kubenswrapper[4707]: I1127 16:21:37.993187 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2e50d632-6bfc-48aa-ab32-f0a05105b482-config-data-custom\") pod \"heat-cfnapi-5584d95bf4-c6hwk\" (UID: \"2e50d632-6bfc-48aa-ab32-f0a05105b482\") " pod="openstack/heat-cfnapi-5584d95bf4-c6hwk" Nov 27 16:21:37 crc kubenswrapper[4707]: I1127 16:21:37.993689 4707 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/254d0a1b-ea0e-4d0c-8a20-fb85542900fb-ovsdbserver-sb\") pod \"dnsmasq-dns-7756b9d78c-qkfjh\" (UID: \"254d0a1b-ea0e-4d0c-8a20-fb85542900fb\") " pod="openstack/dnsmasq-dns-7756b9d78c-qkfjh" Nov 27 16:21:37 crc kubenswrapper[4707]: I1127 16:21:37.994309 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/254d0a1b-ea0e-4d0c-8a20-fb85542900fb-dns-svc\") pod \"dnsmasq-dns-7756b9d78c-qkfjh\" (UID: \"254d0a1b-ea0e-4d0c-8a20-fb85542900fb\") " pod="openstack/dnsmasq-dns-7756b9d78c-qkfjh" Nov 27 16:21:37 crc kubenswrapper[4707]: I1127 16:21:37.994439 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/254d0a1b-ea0e-4d0c-8a20-fb85542900fb-ovsdbserver-nb\") pod \"dnsmasq-dns-7756b9d78c-qkfjh\" (UID: \"254d0a1b-ea0e-4d0c-8a20-fb85542900fb\") " pod="openstack/dnsmasq-dns-7756b9d78c-qkfjh" Nov 27 16:21:37 crc kubenswrapper[4707]: I1127 16:21:37.994615 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/254d0a1b-ea0e-4d0c-8a20-fb85542900fb-dns-swift-storage-0\") pod \"dnsmasq-dns-7756b9d78c-qkfjh\" (UID: \"254d0a1b-ea0e-4d0c-8a20-fb85542900fb\") " pod="openstack/dnsmasq-dns-7756b9d78c-qkfjh" Nov 27 16:21:37 crc kubenswrapper[4707]: I1127 16:21:37.995127 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/254d0a1b-ea0e-4d0c-8a20-fb85542900fb-config\") pod \"dnsmasq-dns-7756b9d78c-qkfjh\" (UID: \"254d0a1b-ea0e-4d0c-8a20-fb85542900fb\") " pod="openstack/dnsmasq-dns-7756b9d78c-qkfjh" Nov 27 16:21:38 crc kubenswrapper[4707]: I1127 16:21:38.006127 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/2e50d632-6bfc-48aa-ab32-f0a05105b482-config-data-custom\") pod \"heat-cfnapi-5584d95bf4-c6hwk\" (UID: \"2e50d632-6bfc-48aa-ab32-f0a05105b482\") " pod="openstack/heat-cfnapi-5584d95bf4-c6hwk" Nov 27 16:21:38 crc kubenswrapper[4707]: I1127 16:21:38.006194 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2e50d632-6bfc-48aa-ab32-f0a05105b482-config-data\") pod \"heat-cfnapi-5584d95bf4-c6hwk\" (UID: \"2e50d632-6bfc-48aa-ab32-f0a05105b482\") " pod="openstack/heat-cfnapi-5584d95bf4-c6hwk" Nov 27 16:21:38 crc kubenswrapper[4707]: I1127 16:21:38.009625 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xm2kn\" (UniqueName: \"kubernetes.io/projected/254d0a1b-ea0e-4d0c-8a20-fb85542900fb-kube-api-access-xm2kn\") pod \"dnsmasq-dns-7756b9d78c-qkfjh\" (UID: \"254d0a1b-ea0e-4d0c-8a20-fb85542900fb\") " pod="openstack/dnsmasq-dns-7756b9d78c-qkfjh" Nov 27 16:21:38 crc kubenswrapper[4707]: I1127 16:21:38.012779 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2e50d632-6bfc-48aa-ab32-f0a05105b482-combined-ca-bundle\") pod \"heat-cfnapi-5584d95bf4-c6hwk\" (UID: \"2e50d632-6bfc-48aa-ab32-f0a05105b482\") " pod="openstack/heat-cfnapi-5584d95bf4-c6hwk" Nov 27 16:21:38 crc kubenswrapper[4707]: I1127 16:21:38.016872 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7rvlp\" (UniqueName: \"kubernetes.io/projected/2e50d632-6bfc-48aa-ab32-f0a05105b482-kube-api-access-7rvlp\") pod \"heat-cfnapi-5584d95bf4-c6hwk\" (UID: \"2e50d632-6bfc-48aa-ab32-f0a05105b482\") " pod="openstack/heat-cfnapi-5584d95bf4-c6hwk" Nov 27 16:21:38 crc kubenswrapper[4707]: I1127 16:21:38.050132 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7756b9d78c-qkfjh" Nov 27 16:21:38 crc kubenswrapper[4707]: I1127 16:21:38.060271 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-5584d95bf4-c6hwk" Nov 27 16:21:38 crc kubenswrapper[4707]: I1127 16:21:38.094641 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-42q7j\" (UniqueName: \"kubernetes.io/projected/75326248-4957-4086-ad33-0a8c76ec7ff5-kube-api-access-42q7j\") pod \"heat-api-7db44cddd-l4rvd\" (UID: \"75326248-4957-4086-ad33-0a8c76ec7ff5\") " pod="openstack/heat-api-7db44cddd-l4rvd" Nov 27 16:21:38 crc kubenswrapper[4707]: I1127 16:21:38.094701 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/75326248-4957-4086-ad33-0a8c76ec7ff5-combined-ca-bundle\") pod \"heat-api-7db44cddd-l4rvd\" (UID: \"75326248-4957-4086-ad33-0a8c76ec7ff5\") " pod="openstack/heat-api-7db44cddd-l4rvd" Nov 27 16:21:38 crc kubenswrapper[4707]: I1127 16:21:38.094772 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/75326248-4957-4086-ad33-0a8c76ec7ff5-config-data-custom\") pod \"heat-api-7db44cddd-l4rvd\" (UID: \"75326248-4957-4086-ad33-0a8c76ec7ff5\") " pod="openstack/heat-api-7db44cddd-l4rvd" Nov 27 16:21:38 crc kubenswrapper[4707]: I1127 16:21:38.094826 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/75326248-4957-4086-ad33-0a8c76ec7ff5-config-data\") pod \"heat-api-7db44cddd-l4rvd\" (UID: \"75326248-4957-4086-ad33-0a8c76ec7ff5\") " pod="openstack/heat-api-7db44cddd-l4rvd" Nov 27 16:21:38 crc kubenswrapper[4707]: I1127 16:21:38.098419 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/75326248-4957-4086-ad33-0a8c76ec7ff5-config-data-custom\") pod \"heat-api-7db44cddd-l4rvd\" (UID: \"75326248-4957-4086-ad33-0a8c76ec7ff5\") " pod="openstack/heat-api-7db44cddd-l4rvd" Nov 27 16:21:38 crc kubenswrapper[4707]: I1127 16:21:38.099469 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/75326248-4957-4086-ad33-0a8c76ec7ff5-config-data\") pod \"heat-api-7db44cddd-l4rvd\" (UID: \"75326248-4957-4086-ad33-0a8c76ec7ff5\") " pod="openstack/heat-api-7db44cddd-l4rvd" Nov 27 16:21:38 crc kubenswrapper[4707]: I1127 16:21:38.107879 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/75326248-4957-4086-ad33-0a8c76ec7ff5-combined-ca-bundle\") pod \"heat-api-7db44cddd-l4rvd\" (UID: \"75326248-4957-4086-ad33-0a8c76ec7ff5\") " pod="openstack/heat-api-7db44cddd-l4rvd" Nov 27 16:21:38 crc kubenswrapper[4707]: I1127 16:21:38.116427 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-42q7j\" (UniqueName: \"kubernetes.io/projected/75326248-4957-4086-ad33-0a8c76ec7ff5-kube-api-access-42q7j\") pod \"heat-api-7db44cddd-l4rvd\" (UID: \"75326248-4957-4086-ad33-0a8c76ec7ff5\") " pod="openstack/heat-api-7db44cddd-l4rvd" Nov 27 16:21:38 crc kubenswrapper[4707]: I1127 16:21:38.275573 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-api-7db44cddd-l4rvd" Nov 27 16:21:38 crc kubenswrapper[4707]: I1127 16:21:38.410661 4707 generic.go:334] "Generic (PLEG): container finished" podID="20cd323d-9f2e-472f-8c75-9b413bbdb303" containerID="d81248bbf232c36a76ada9937a510da6bfacd6954797306990a4e5145d9c0e2c" exitCode=0 Nov 27 16:21:38 crc kubenswrapper[4707]: I1127 16:21:38.410703 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"20cd323d-9f2e-472f-8c75-9b413bbdb303","Type":"ContainerDied","Data":"d81248bbf232c36a76ada9937a510da6bfacd6954797306990a4e5145d9c0e2c"} Nov 27 16:21:39 crc kubenswrapper[4707]: I1127 16:21:39.460563 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-69b8fb6b88-w6pxv" Nov 27 16:21:39 crc kubenswrapper[4707]: I1127 16:21:39.557726 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-69b8fb6b88-w6pxv" Nov 27 16:21:39 crc kubenswrapper[4707]: I1127 16:21:39.937743 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Nov 27 16:21:40 crc kubenswrapper[4707]: I1127 16:21:40.438198 4707 generic.go:334] "Generic (PLEG): container finished" podID="20cd323d-9f2e-472f-8c75-9b413bbdb303" containerID="2cc08f712dd41ada2b39f74e2e2a7cde303e8246eb9d310e11ff71490a4cfdd5" exitCode=0 Nov 27 16:21:40 crc kubenswrapper[4707]: I1127 16:21:40.438298 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"20cd323d-9f2e-472f-8c75-9b413bbdb303","Type":"ContainerDied","Data":"2cc08f712dd41ada2b39f74e2e2a7cde303e8246eb9d310e11ff71490a4cfdd5"} Nov 27 16:21:42 crc kubenswrapper[4707]: I1127 16:21:42.595258 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Nov 27 16:21:42 crc kubenswrapper[4707]: I1127 16:21:42.673723 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/20cd323d-9f2e-472f-8c75-9b413bbdb303-log-httpd\") pod \"20cd323d-9f2e-472f-8c75-9b413bbdb303\" (UID: \"20cd323d-9f2e-472f-8c75-9b413bbdb303\") " Nov 27 16:21:42 crc kubenswrapper[4707]: I1127 16:21:42.673788 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/20cd323d-9f2e-472f-8c75-9b413bbdb303-scripts\") pod \"20cd323d-9f2e-472f-8c75-9b413bbdb303\" (UID: \"20cd323d-9f2e-472f-8c75-9b413bbdb303\") " Nov 27 16:21:42 crc kubenswrapper[4707]: I1127 16:21:42.673853 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/20cd323d-9f2e-472f-8c75-9b413bbdb303-sg-core-conf-yaml\") pod \"20cd323d-9f2e-472f-8c75-9b413bbdb303\" (UID: \"20cd323d-9f2e-472f-8c75-9b413bbdb303\") " Nov 27 16:21:42 crc kubenswrapper[4707]: I1127 16:21:42.673869 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/20cd323d-9f2e-472f-8c75-9b413bbdb303-combined-ca-bundle\") pod \"20cd323d-9f2e-472f-8c75-9b413bbdb303\" (UID: \"20cd323d-9f2e-472f-8c75-9b413bbdb303\") " Nov 27 16:21:42 crc kubenswrapper[4707]: I1127 16:21:42.673915 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kcbx4\" (UniqueName: \"kubernetes.io/projected/20cd323d-9f2e-472f-8c75-9b413bbdb303-kube-api-access-kcbx4\") pod \"20cd323d-9f2e-472f-8c75-9b413bbdb303\" (UID: \"20cd323d-9f2e-472f-8c75-9b413bbdb303\") " Nov 27 16:21:42 crc kubenswrapper[4707]: I1127 16:21:42.673954 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/20cd323d-9f2e-472f-8c75-9b413bbdb303-config-data\") pod \"20cd323d-9f2e-472f-8c75-9b413bbdb303\" (UID: \"20cd323d-9f2e-472f-8c75-9b413bbdb303\") " Nov 27 16:21:42 crc kubenswrapper[4707]: I1127 16:21:42.674043 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/20cd323d-9f2e-472f-8c75-9b413bbdb303-run-httpd\") pod \"20cd323d-9f2e-472f-8c75-9b413bbdb303\" (UID: \"20cd323d-9f2e-472f-8c75-9b413bbdb303\") " Nov 27 16:21:42 crc kubenswrapper[4707]: I1127 16:21:42.674820 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/20cd323d-9f2e-472f-8c75-9b413bbdb303-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "20cd323d-9f2e-472f-8c75-9b413bbdb303" (UID: "20cd323d-9f2e-472f-8c75-9b413bbdb303"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 27 16:21:42 crc kubenswrapper[4707]: I1127 16:21:42.677454 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/20cd323d-9f2e-472f-8c75-9b413bbdb303-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "20cd323d-9f2e-472f-8c75-9b413bbdb303" (UID: "20cd323d-9f2e-472f-8c75-9b413bbdb303"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 27 16:21:42 crc kubenswrapper[4707]: I1127 16:21:42.683988 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20cd323d-9f2e-472f-8c75-9b413bbdb303-kube-api-access-kcbx4" (OuterVolumeSpecName: "kube-api-access-kcbx4") pod "20cd323d-9f2e-472f-8c75-9b413bbdb303" (UID: "20cd323d-9f2e-472f-8c75-9b413bbdb303"). InnerVolumeSpecName "kube-api-access-kcbx4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 16:21:42 crc kubenswrapper[4707]: I1127 16:21:42.687382 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20cd323d-9f2e-472f-8c75-9b413bbdb303-scripts" (OuterVolumeSpecName: "scripts") pod "20cd323d-9f2e-472f-8c75-9b413bbdb303" (UID: "20cd323d-9f2e-472f-8c75-9b413bbdb303"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 16:21:42 crc kubenswrapper[4707]: I1127 16:21:42.746721 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-engine-7678b8b68b-mp4qf"] Nov 27 16:21:42 crc kubenswrapper[4707]: I1127 16:21:42.757597 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-7db44cddd-l4rvd"] Nov 27 16:21:42 crc kubenswrapper[4707]: I1127 16:21:42.776220 4707 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/20cd323d-9f2e-472f-8c75-9b413bbdb303-log-httpd\") on node \"crc\" DevicePath \"\"" Nov 27 16:21:42 crc kubenswrapper[4707]: I1127 16:21:42.776240 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/20cd323d-9f2e-472f-8c75-9b413bbdb303-scripts\") on node \"crc\" DevicePath \"\"" Nov 27 16:21:42 crc kubenswrapper[4707]: I1127 16:21:42.776249 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kcbx4\" (UniqueName: \"kubernetes.io/projected/20cd323d-9f2e-472f-8c75-9b413bbdb303-kube-api-access-kcbx4\") on node \"crc\" DevicePath \"\"" Nov 27 16:21:42 crc kubenswrapper[4707]: I1127 16:21:42.776259 4707 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/20cd323d-9f2e-472f-8c75-9b413bbdb303-run-httpd\") on node \"crc\" DevicePath \"\"" Nov 27 16:21:42 crc kubenswrapper[4707]: I1127 16:21:42.777591 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/dnsmasq-dns-7756b9d78c-qkfjh"] Nov 27 16:21:42 crc kubenswrapper[4707]: I1127 16:21:42.777996 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20cd323d-9f2e-472f-8c75-9b413bbdb303-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "20cd323d-9f2e-472f-8c75-9b413bbdb303" (UID: "20cd323d-9f2e-472f-8c75-9b413bbdb303"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 16:21:42 crc kubenswrapper[4707]: I1127 16:21:42.836571 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20cd323d-9f2e-472f-8c75-9b413bbdb303-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "20cd323d-9f2e-472f-8c75-9b413bbdb303" (UID: "20cd323d-9f2e-472f-8c75-9b413bbdb303"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 16:21:42 crc kubenswrapper[4707]: I1127 16:21:42.840160 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20cd323d-9f2e-472f-8c75-9b413bbdb303-config-data" (OuterVolumeSpecName: "config-data") pod "20cd323d-9f2e-472f-8c75-9b413bbdb303" (UID: "20cd323d-9f2e-472f-8c75-9b413bbdb303"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 16:21:42 crc kubenswrapper[4707]: I1127 16:21:42.845360 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-5475fd4f89-8stjv"] Nov 27 16:21:42 crc kubenswrapper[4707]: I1127 16:21:42.877581 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/20cd323d-9f2e-472f-8c75-9b413bbdb303-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 27 16:21:42 crc kubenswrapper[4707]: I1127 16:21:42.877619 4707 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/20cd323d-9f2e-472f-8c75-9b413bbdb303-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Nov 27 16:21:42 crc kubenswrapper[4707]: I1127 16:21:42.877635 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/20cd323d-9f2e-472f-8c75-9b413bbdb303-config-data\") on node \"crc\" DevicePath \"\"" Nov 27 16:21:42 crc kubenswrapper[4707]: I1127 16:21:42.937031 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-5584d95bf4-c6hwk"] Nov 27 16:21:42 crc kubenswrapper[4707]: W1127 16:21:42.952001 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2e50d632_6bfc_48aa_ab32_f0a05105b482.slice/crio-d8ecab08fcc00ebc6c938a2e7eaf8c560af64bdb8d4e8acc69e5cb6e416337f3 WatchSource:0}: Error finding container d8ecab08fcc00ebc6c938a2e7eaf8c560af64bdb8d4e8acc69e5cb6e416337f3: Status 404 returned error can't find the container with id d8ecab08fcc00ebc6c938a2e7eaf8c560af64bdb8d4e8acc69e5cb6e416337f3 Nov 27 16:21:43 crc kubenswrapper[4707]: I1127 16:21:43.510270 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"20cd323d-9f2e-472f-8c75-9b413bbdb303","Type":"ContainerDied","Data":"147d1755f0e7d809d92afe9883261d09ff79b624a6e4a02aacdcb813b16d7957"} Nov 27 16:21:43 crc kubenswrapper[4707]: I1127 16:21:43.510553 4707 scope.go:117] "RemoveContainer" containerID="781daef22c40165b16cc4b848299bb59d2a15fb97bba4a3c6e9b8e707327dcba" Nov 27 16:21:43 crc kubenswrapper[4707]: I1127 16:21:43.510685 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Nov 27 16:21:43 crc kubenswrapper[4707]: I1127 16:21:43.533649 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"b281d412-34bb-4169-8c35-65318084ab97","Type":"ContainerStarted","Data":"0c2c4dbbd082a4f51e11901ae546de62c1a591d4c9a34a65db9dd0fef2ec8082"} Nov 27 16:21:43 crc kubenswrapper[4707]: I1127 16:21:43.565128 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Nov 27 16:21:43 crc kubenswrapper[4707]: I1127 16:21:43.573741 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-5584d95bf4-c6hwk" event={"ID":"2e50d632-6bfc-48aa-ab32-f0a05105b482","Type":"ContainerStarted","Data":"d8ecab08fcc00ebc6c938a2e7eaf8c560af64bdb8d4e8acc69e5cb6e416337f3"} Nov 27 16:21:43 crc kubenswrapper[4707]: I1127 16:21:43.598605 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Nov 27 16:21:43 crc kubenswrapper[4707]: I1127 16:21:43.615616 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=2.095474553 podStartE2EDuration="12.615595241s" podCreationTimestamp="2025-11-27 16:21:31 +0000 UTC" firstStartedPulling="2025-11-27 16:21:32.240824034 +0000 UTC m=+1067.872272802" lastFinishedPulling="2025-11-27 16:21:42.760944722 +0000 UTC m=+1078.392393490" observedRunningTime="2025-11-27 16:21:43.580900817 +0000 UTC m=+1079.212349585" watchObservedRunningTime="2025-11-27 16:21:43.615595241 
+0000 UTC m=+1079.247043999" Nov 27 16:21:43 crc kubenswrapper[4707]: I1127 16:21:43.632131 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-5475fd4f89-8stjv" event={"ID":"7362e7e3-1145-4e89-84db-343739624472","Type":"ContainerStarted","Data":"a4d7489fd999a902c74c2038c600305a09d990fa525c6ff756d521412c403338"} Nov 27 16:21:43 crc kubenswrapper[4707]: I1127 16:21:43.632180 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-5475fd4f89-8stjv" event={"ID":"7362e7e3-1145-4e89-84db-343739624472","Type":"ContainerStarted","Data":"29dc620eedb6087cbfebc3936be43670655de7fa19aa53496e12f6bc201e46cc"} Nov 27 16:21:43 crc kubenswrapper[4707]: I1127 16:21:43.632191 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-5475fd4f89-8stjv" event={"ID":"7362e7e3-1145-4e89-84db-343739624472","Type":"ContainerStarted","Data":"cf1573a5ade58f67af52fb811152f8e6c720cd6fc64f23a345e279ad94fb501e"} Nov 27 16:21:43 crc kubenswrapper[4707]: I1127 16:21:43.633715 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-5475fd4f89-8stjv" Nov 27 16:21:43 crc kubenswrapper[4707]: I1127 16:21:43.633748 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-5475fd4f89-8stjv" Nov 27 16:21:43 crc kubenswrapper[4707]: I1127 16:21:43.639036 4707 scope.go:117] "RemoveContainer" containerID="ed7e45af14424d36b8d34d430ff20fe0a559a93ab0e83b7a6e5c3d1973b78ae1" Nov 27 16:21:43 crc kubenswrapper[4707]: I1127 16:21:43.672498 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-7678b8b68b-mp4qf" event={"ID":"5e02884e-9f0d-45a5-a916-aa4018402ee4","Type":"ContainerStarted","Data":"f80dc97f5b8eb184d6aa30234230389910bc058931f380d766ce5f41dd831058"} Nov 27 16:21:43 crc kubenswrapper[4707]: I1127 16:21:43.672536 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-7678b8b68b-mp4qf" 
event={"ID":"5e02884e-9f0d-45a5-a916-aa4018402ee4","Type":"ContainerStarted","Data":"a1993661c976ee955d8bf66b2db44fa30e6ae26d13ba828b35f644849cea1f58"} Nov 27 16:21:43 crc kubenswrapper[4707]: I1127 16:21:43.672726 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-engine-7678b8b68b-mp4qf" Nov 27 16:21:43 crc kubenswrapper[4707]: I1127 16:21:43.680232 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Nov 27 16:21:43 crc kubenswrapper[4707]: E1127 16:21:43.681226 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="20cd323d-9f2e-472f-8c75-9b413bbdb303" containerName="ceilometer-central-agent" Nov 27 16:21:43 crc kubenswrapper[4707]: I1127 16:21:43.681241 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="20cd323d-9f2e-472f-8c75-9b413bbdb303" containerName="ceilometer-central-agent" Nov 27 16:21:43 crc kubenswrapper[4707]: E1127 16:21:43.681279 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="20cd323d-9f2e-472f-8c75-9b413bbdb303" containerName="sg-core" Nov 27 16:21:43 crc kubenswrapper[4707]: I1127 16:21:43.681285 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="20cd323d-9f2e-472f-8c75-9b413bbdb303" containerName="sg-core" Nov 27 16:21:43 crc kubenswrapper[4707]: E1127 16:21:43.681305 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="20cd323d-9f2e-472f-8c75-9b413bbdb303" containerName="ceilometer-notification-agent" Nov 27 16:21:43 crc kubenswrapper[4707]: I1127 16:21:43.681312 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="20cd323d-9f2e-472f-8c75-9b413bbdb303" containerName="ceilometer-notification-agent" Nov 27 16:21:43 crc kubenswrapper[4707]: E1127 16:21:43.681325 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="20cd323d-9f2e-472f-8c75-9b413bbdb303" containerName="proxy-httpd" Nov 27 16:21:43 crc kubenswrapper[4707]: I1127 16:21:43.681331 4707 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="20cd323d-9f2e-472f-8c75-9b413bbdb303" containerName="proxy-httpd" Nov 27 16:21:43 crc kubenswrapper[4707]: I1127 16:21:43.681649 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="20cd323d-9f2e-472f-8c75-9b413bbdb303" containerName="ceilometer-notification-agent" Nov 27 16:21:43 crc kubenswrapper[4707]: I1127 16:21:43.681676 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="20cd323d-9f2e-472f-8c75-9b413bbdb303" containerName="proxy-httpd" Nov 27 16:21:43 crc kubenswrapper[4707]: I1127 16:21:43.681704 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="20cd323d-9f2e-472f-8c75-9b413bbdb303" containerName="sg-core" Nov 27 16:21:43 crc kubenswrapper[4707]: I1127 16:21:43.681715 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="20cd323d-9f2e-472f-8c75-9b413bbdb303" containerName="ceilometer-central-agent" Nov 27 16:21:43 crc kubenswrapper[4707]: I1127 16:21:43.687158 4707 scope.go:117] "RemoveContainer" containerID="2cc08f712dd41ada2b39f74e2e2a7cde303e8246eb9d310e11ff71490a4cfdd5" Nov 27 16:21:43 crc kubenswrapper[4707]: I1127 16:21:43.688735 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-7db44cddd-l4rvd" event={"ID":"75326248-4957-4086-ad33-0a8c76ec7ff5","Type":"ContainerStarted","Data":"e0f3a4859e733cdff613c48694b31901fc474c899362269725bb6692014f31d2"} Nov 27 16:21:43 crc kubenswrapper[4707]: I1127 16:21:43.688831 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Nov 27 16:21:43 crc kubenswrapper[4707]: I1127 16:21:43.700348 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-proxy-5475fd4f89-8stjv" podStartSLOduration=6.700325437 podStartE2EDuration="6.700325437s" podCreationTimestamp="2025-11-27 16:21:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-27 16:21:43.668237116 +0000 UTC m=+1079.299685884" watchObservedRunningTime="2025-11-27 16:21:43.700325437 +0000 UTC m=+1079.331774205" Nov 27 16:21:43 crc kubenswrapper[4707]: I1127 16:21:43.700762 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Nov 27 16:21:43 crc kubenswrapper[4707]: I1127 16:21:43.701237 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Nov 27 16:21:43 crc kubenswrapper[4707]: I1127 16:21:43.701828 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Nov 27 16:21:43 crc kubenswrapper[4707]: I1127 16:21:43.706906 4707 generic.go:334] "Generic (PLEG): container finished" podID="254d0a1b-ea0e-4d0c-8a20-fb85542900fb" containerID="e5e380e29c2d6cdefe875b298854cca2d1db3fba3d77ef6ea3722a376005fec5" exitCode=0 Nov 27 16:21:43 crc kubenswrapper[4707]: I1127 16:21:43.706941 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7756b9d78c-qkfjh" event={"ID":"254d0a1b-ea0e-4d0c-8a20-fb85542900fb","Type":"ContainerDied","Data":"e5e380e29c2d6cdefe875b298854cca2d1db3fba3d77ef6ea3722a376005fec5"} Nov 27 16:21:43 crc kubenswrapper[4707]: I1127 16:21:43.706960 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7756b9d78c-qkfjh" event={"ID":"254d0a1b-ea0e-4d0c-8a20-fb85542900fb","Type":"ContainerStarted","Data":"8af14a426f70539b89c9d9e5f6084516ee18104c0c503d7fd938607fe613ac3f"} Nov 27 
16:21:43 crc kubenswrapper[4707]: I1127 16:21:43.710008 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-engine-7678b8b68b-mp4qf" podStartSLOduration=6.709989179 podStartE2EDuration="6.709989179s" podCreationTimestamp="2025-11-27 16:21:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-27 16:21:43.701498575 +0000 UTC m=+1079.332947333" watchObservedRunningTime="2025-11-27 16:21:43.709989179 +0000 UTC m=+1079.341437947" Nov 27 16:21:43 crc kubenswrapper[4707]: I1127 16:21:43.728984 4707 scope.go:117] "RemoveContainer" containerID="d81248bbf232c36a76ada9937a510da6bfacd6954797306990a4e5145d9c0e2c" Nov 27 16:21:43 crc kubenswrapper[4707]: I1127 16:21:43.805828 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ca422061-61e0-4b99-91ed-adb502839a46-config-data\") pod \"ceilometer-0\" (UID: \"ca422061-61e0-4b99-91ed-adb502839a46\") " pod="openstack/ceilometer-0" Nov 27 16:21:43 crc kubenswrapper[4707]: I1127 16:21:43.806130 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ca422061-61e0-4b99-91ed-adb502839a46-scripts\") pod \"ceilometer-0\" (UID: \"ca422061-61e0-4b99-91ed-adb502839a46\") " pod="openstack/ceilometer-0" Nov 27 16:21:43 crc kubenswrapper[4707]: I1127 16:21:43.806241 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ca422061-61e0-4b99-91ed-adb502839a46-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"ca422061-61e0-4b99-91ed-adb502839a46\") " pod="openstack/ceilometer-0" Nov 27 16:21:43 crc kubenswrapper[4707]: I1127 16:21:43.806270 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kube-api-access-dq89q\" (UniqueName: \"kubernetes.io/projected/ca422061-61e0-4b99-91ed-adb502839a46-kube-api-access-dq89q\") pod \"ceilometer-0\" (UID: \"ca422061-61e0-4b99-91ed-adb502839a46\") " pod="openstack/ceilometer-0" Nov 27 16:21:43 crc kubenswrapper[4707]: I1127 16:21:43.806342 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ca422061-61e0-4b99-91ed-adb502839a46-log-httpd\") pod \"ceilometer-0\" (UID: \"ca422061-61e0-4b99-91ed-adb502839a46\") " pod="openstack/ceilometer-0" Nov 27 16:21:43 crc kubenswrapper[4707]: I1127 16:21:43.806438 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ca422061-61e0-4b99-91ed-adb502839a46-run-httpd\") pod \"ceilometer-0\" (UID: \"ca422061-61e0-4b99-91ed-adb502839a46\") " pod="openstack/ceilometer-0" Nov 27 16:21:43 crc kubenswrapper[4707]: I1127 16:21:43.806517 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ca422061-61e0-4b99-91ed-adb502839a46-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"ca422061-61e0-4b99-91ed-adb502839a46\") " pod="openstack/ceilometer-0" Nov 27 16:21:43 crc kubenswrapper[4707]: I1127 16:21:43.909418 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ca422061-61e0-4b99-91ed-adb502839a46-log-httpd\") pod \"ceilometer-0\" (UID: \"ca422061-61e0-4b99-91ed-adb502839a46\") " pod="openstack/ceilometer-0" Nov 27 16:21:43 crc kubenswrapper[4707]: I1127 16:21:43.909478 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ca422061-61e0-4b99-91ed-adb502839a46-run-httpd\") pod \"ceilometer-0\" (UID: 
\"ca422061-61e0-4b99-91ed-adb502839a46\") " pod="openstack/ceilometer-0" Nov 27 16:21:43 crc kubenswrapper[4707]: I1127 16:21:43.909509 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ca422061-61e0-4b99-91ed-adb502839a46-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"ca422061-61e0-4b99-91ed-adb502839a46\") " pod="openstack/ceilometer-0" Nov 27 16:21:43 crc kubenswrapper[4707]: I1127 16:21:43.909549 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ca422061-61e0-4b99-91ed-adb502839a46-config-data\") pod \"ceilometer-0\" (UID: \"ca422061-61e0-4b99-91ed-adb502839a46\") " pod="openstack/ceilometer-0" Nov 27 16:21:43 crc kubenswrapper[4707]: I1127 16:21:43.909568 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ca422061-61e0-4b99-91ed-adb502839a46-scripts\") pod \"ceilometer-0\" (UID: \"ca422061-61e0-4b99-91ed-adb502839a46\") " pod="openstack/ceilometer-0" Nov 27 16:21:43 crc kubenswrapper[4707]: I1127 16:21:43.910886 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ca422061-61e0-4b99-91ed-adb502839a46-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"ca422061-61e0-4b99-91ed-adb502839a46\") " pod="openstack/ceilometer-0" Nov 27 16:21:43 crc kubenswrapper[4707]: I1127 16:21:43.910919 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dq89q\" (UniqueName: \"kubernetes.io/projected/ca422061-61e0-4b99-91ed-adb502839a46-kube-api-access-dq89q\") pod \"ceilometer-0\" (UID: \"ca422061-61e0-4b99-91ed-adb502839a46\") " pod="openstack/ceilometer-0" Nov 27 16:21:43 crc kubenswrapper[4707]: I1127 16:21:43.910743 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ca422061-61e0-4b99-91ed-adb502839a46-log-httpd\") pod \"ceilometer-0\" (UID: \"ca422061-61e0-4b99-91ed-adb502839a46\") " pod="openstack/ceilometer-0" Nov 27 16:21:43 crc kubenswrapper[4707]: I1127 16:21:43.912095 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ca422061-61e0-4b99-91ed-adb502839a46-run-httpd\") pod \"ceilometer-0\" (UID: \"ca422061-61e0-4b99-91ed-adb502839a46\") " pod="openstack/ceilometer-0" Nov 27 16:21:43 crc kubenswrapper[4707]: I1127 16:21:43.916566 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ca422061-61e0-4b99-91ed-adb502839a46-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"ca422061-61e0-4b99-91ed-adb502839a46\") " pod="openstack/ceilometer-0" Nov 27 16:21:43 crc kubenswrapper[4707]: I1127 16:21:43.917042 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ca422061-61e0-4b99-91ed-adb502839a46-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"ca422061-61e0-4b99-91ed-adb502839a46\") " pod="openstack/ceilometer-0" Nov 27 16:21:43 crc kubenswrapper[4707]: I1127 16:21:43.918163 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ca422061-61e0-4b99-91ed-adb502839a46-scripts\") pod \"ceilometer-0\" (UID: \"ca422061-61e0-4b99-91ed-adb502839a46\") " pod="openstack/ceilometer-0" Nov 27 16:21:43 crc kubenswrapper[4707]: I1127 16:21:43.919985 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ca422061-61e0-4b99-91ed-adb502839a46-config-data\") pod \"ceilometer-0\" (UID: \"ca422061-61e0-4b99-91ed-adb502839a46\") " pod="openstack/ceilometer-0" Nov 27 16:21:43 crc kubenswrapper[4707]: I1127 16:21:43.926228 4707 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dq89q\" (UniqueName: \"kubernetes.io/projected/ca422061-61e0-4b99-91ed-adb502839a46-kube-api-access-dq89q\") pod \"ceilometer-0\" (UID: \"ca422061-61e0-4b99-91ed-adb502839a46\") " pod="openstack/ceilometer-0" Nov 27 16:21:44 crc kubenswrapper[4707]: I1127 16:21:44.071868 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-engine-7597fbc9fb-5l66n"] Nov 27 16:21:44 crc kubenswrapper[4707]: I1127 16:21:44.076059 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-engine-7597fbc9fb-5l66n" Nov 27 16:21:44 crc kubenswrapper[4707]: I1127 16:21:44.071875 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Nov 27 16:21:44 crc kubenswrapper[4707]: I1127 16:21:44.093756 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-cfnapi-d77877bbc-l8p9h"] Nov 27 16:21:44 crc kubenswrapper[4707]: I1127 16:21:44.094931 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-d77877bbc-l8p9h" Nov 27 16:21:44 crc kubenswrapper[4707]: I1127 16:21:44.103845 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-engine-7597fbc9fb-5l66n"] Nov 27 16:21:44 crc kubenswrapper[4707]: I1127 16:21:44.116260 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-api-5f6bc5c6bb-tzktc"] Nov 27 16:21:44 crc kubenswrapper[4707]: I1127 16:21:44.124212 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-api-5f6bc5c6bb-tzktc" Nov 27 16:21:44 crc kubenswrapper[4707]: I1127 16:21:44.127346 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-d77877bbc-l8p9h"] Nov 27 16:21:44 crc kubenswrapper[4707]: I1127 16:21:44.140597 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-5f6bc5c6bb-tzktc"] Nov 27 16:21:44 crc kubenswrapper[4707]: I1127 16:21:44.216483 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b92f69c5-3b78-463b-bb1a-7728d2cdb6ff-config-data\") pod \"heat-engine-7597fbc9fb-5l66n\" (UID: \"b92f69c5-3b78-463b-bb1a-7728d2cdb6ff\") " pod="openstack/heat-engine-7597fbc9fb-5l66n" Nov 27 16:21:44 crc kubenswrapper[4707]: I1127 16:21:44.216691 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8314a6db-7b5b-4ad9-82d3-1e67ebbb9c1a-config-data-custom\") pod \"heat-api-5f6bc5c6bb-tzktc\" (UID: \"8314a6db-7b5b-4ad9-82d3-1e67ebbb9c1a\") " pod="openstack/heat-api-5f6bc5c6bb-tzktc" Nov 27 16:21:44 crc kubenswrapper[4707]: I1127 16:21:44.216717 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b92f69c5-3b78-463b-bb1a-7728d2cdb6ff-combined-ca-bundle\") pod \"heat-engine-7597fbc9fb-5l66n\" (UID: \"b92f69c5-3b78-463b-bb1a-7728d2cdb6ff\") " pod="openstack/heat-engine-7597fbc9fb-5l66n" Nov 27 16:21:44 crc kubenswrapper[4707]: I1127 16:21:44.216745 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8314a6db-7b5b-4ad9-82d3-1e67ebbb9c1a-combined-ca-bundle\") pod \"heat-api-5f6bc5c6bb-tzktc\" (UID: \"8314a6db-7b5b-4ad9-82d3-1e67ebbb9c1a\") " 
pod="openstack/heat-api-5f6bc5c6bb-tzktc" Nov 27 16:21:44 crc kubenswrapper[4707]: I1127 16:21:44.216762 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nq2lw\" (UniqueName: \"kubernetes.io/projected/b92f69c5-3b78-463b-bb1a-7728d2cdb6ff-kube-api-access-nq2lw\") pod \"heat-engine-7597fbc9fb-5l66n\" (UID: \"b92f69c5-3b78-463b-bb1a-7728d2cdb6ff\") " pod="openstack/heat-engine-7597fbc9fb-5l66n" Nov 27 16:21:44 crc kubenswrapper[4707]: I1127 16:21:44.216797 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b92f69c5-3b78-463b-bb1a-7728d2cdb6ff-config-data-custom\") pod \"heat-engine-7597fbc9fb-5l66n\" (UID: \"b92f69c5-3b78-463b-bb1a-7728d2cdb6ff\") " pod="openstack/heat-engine-7597fbc9fb-5l66n" Nov 27 16:21:44 crc kubenswrapper[4707]: I1127 16:21:44.216821 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nn8dh\" (UniqueName: \"kubernetes.io/projected/917b521a-96db-4475-bf1c-af43a99c67f1-kube-api-access-nn8dh\") pod \"heat-cfnapi-d77877bbc-l8p9h\" (UID: \"917b521a-96db-4475-bf1c-af43a99c67f1\") " pod="openstack/heat-cfnapi-d77877bbc-l8p9h" Nov 27 16:21:44 crc kubenswrapper[4707]: I1127 16:21:44.216849 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/917b521a-96db-4475-bf1c-af43a99c67f1-combined-ca-bundle\") pod \"heat-cfnapi-d77877bbc-l8p9h\" (UID: \"917b521a-96db-4475-bf1c-af43a99c67f1\") " pod="openstack/heat-cfnapi-d77877bbc-l8p9h" Nov 27 16:21:44 crc kubenswrapper[4707]: I1127 16:21:44.216863 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8314a6db-7b5b-4ad9-82d3-1e67ebbb9c1a-config-data\") pod 
\"heat-api-5f6bc5c6bb-tzktc\" (UID: \"8314a6db-7b5b-4ad9-82d3-1e67ebbb9c1a\") " pod="openstack/heat-api-5f6bc5c6bb-tzktc" Nov 27 16:21:44 crc kubenswrapper[4707]: I1127 16:21:44.216900 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/917b521a-96db-4475-bf1c-af43a99c67f1-config-data\") pod \"heat-cfnapi-d77877bbc-l8p9h\" (UID: \"917b521a-96db-4475-bf1c-af43a99c67f1\") " pod="openstack/heat-cfnapi-d77877bbc-l8p9h" Nov 27 16:21:44 crc kubenswrapper[4707]: I1127 16:21:44.216953 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-txqd9\" (UniqueName: \"kubernetes.io/projected/8314a6db-7b5b-4ad9-82d3-1e67ebbb9c1a-kube-api-access-txqd9\") pod \"heat-api-5f6bc5c6bb-tzktc\" (UID: \"8314a6db-7b5b-4ad9-82d3-1e67ebbb9c1a\") " pod="openstack/heat-api-5f6bc5c6bb-tzktc" Nov 27 16:21:44 crc kubenswrapper[4707]: I1127 16:21:44.216979 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/917b521a-96db-4475-bf1c-af43a99c67f1-config-data-custom\") pod \"heat-cfnapi-d77877bbc-l8p9h\" (UID: \"917b521a-96db-4475-bf1c-af43a99c67f1\") " pod="openstack/heat-cfnapi-d77877bbc-l8p9h" Nov 27 16:21:44 crc kubenswrapper[4707]: I1127 16:21:44.320421 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b92f69c5-3b78-463b-bb1a-7728d2cdb6ff-config-data-custom\") pod \"heat-engine-7597fbc9fb-5l66n\" (UID: \"b92f69c5-3b78-463b-bb1a-7728d2cdb6ff\") " pod="openstack/heat-engine-7597fbc9fb-5l66n" Nov 27 16:21:44 crc kubenswrapper[4707]: I1127 16:21:44.320744 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nn8dh\" (UniqueName: 
\"kubernetes.io/projected/917b521a-96db-4475-bf1c-af43a99c67f1-kube-api-access-nn8dh\") pod \"heat-cfnapi-d77877bbc-l8p9h\" (UID: \"917b521a-96db-4475-bf1c-af43a99c67f1\") " pod="openstack/heat-cfnapi-d77877bbc-l8p9h" Nov 27 16:21:44 crc kubenswrapper[4707]: I1127 16:21:44.320896 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/917b521a-96db-4475-bf1c-af43a99c67f1-combined-ca-bundle\") pod \"heat-cfnapi-d77877bbc-l8p9h\" (UID: \"917b521a-96db-4475-bf1c-af43a99c67f1\") " pod="openstack/heat-cfnapi-d77877bbc-l8p9h" Nov 27 16:21:44 crc kubenswrapper[4707]: I1127 16:21:44.320917 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8314a6db-7b5b-4ad9-82d3-1e67ebbb9c1a-config-data\") pod \"heat-api-5f6bc5c6bb-tzktc\" (UID: \"8314a6db-7b5b-4ad9-82d3-1e67ebbb9c1a\") " pod="openstack/heat-api-5f6bc5c6bb-tzktc" Nov 27 16:21:44 crc kubenswrapper[4707]: I1127 16:21:44.320953 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/917b521a-96db-4475-bf1c-af43a99c67f1-config-data\") pod \"heat-cfnapi-d77877bbc-l8p9h\" (UID: \"917b521a-96db-4475-bf1c-af43a99c67f1\") " pod="openstack/heat-cfnapi-d77877bbc-l8p9h" Nov 27 16:21:44 crc kubenswrapper[4707]: I1127 16:21:44.321009 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-txqd9\" (UniqueName: \"kubernetes.io/projected/8314a6db-7b5b-4ad9-82d3-1e67ebbb9c1a-kube-api-access-txqd9\") pod \"heat-api-5f6bc5c6bb-tzktc\" (UID: \"8314a6db-7b5b-4ad9-82d3-1e67ebbb9c1a\") " pod="openstack/heat-api-5f6bc5c6bb-tzktc" Nov 27 16:21:44 crc kubenswrapper[4707]: I1127 16:21:44.321033 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/917b521a-96db-4475-bf1c-af43a99c67f1-config-data-custom\") pod \"heat-cfnapi-d77877bbc-l8p9h\" (UID: \"917b521a-96db-4475-bf1c-af43a99c67f1\") " pod="openstack/heat-cfnapi-d77877bbc-l8p9h" Nov 27 16:21:44 crc kubenswrapper[4707]: I1127 16:21:44.321061 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b92f69c5-3b78-463b-bb1a-7728d2cdb6ff-config-data\") pod \"heat-engine-7597fbc9fb-5l66n\" (UID: \"b92f69c5-3b78-463b-bb1a-7728d2cdb6ff\") " pod="openstack/heat-engine-7597fbc9fb-5l66n" Nov 27 16:21:44 crc kubenswrapper[4707]: I1127 16:21:44.321078 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8314a6db-7b5b-4ad9-82d3-1e67ebbb9c1a-config-data-custom\") pod \"heat-api-5f6bc5c6bb-tzktc\" (UID: \"8314a6db-7b5b-4ad9-82d3-1e67ebbb9c1a\") " pod="openstack/heat-api-5f6bc5c6bb-tzktc" Nov 27 16:21:44 crc kubenswrapper[4707]: I1127 16:21:44.321101 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b92f69c5-3b78-463b-bb1a-7728d2cdb6ff-combined-ca-bundle\") pod \"heat-engine-7597fbc9fb-5l66n\" (UID: \"b92f69c5-3b78-463b-bb1a-7728d2cdb6ff\") " pod="openstack/heat-engine-7597fbc9fb-5l66n" Nov 27 16:21:44 crc kubenswrapper[4707]: I1127 16:21:44.321127 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8314a6db-7b5b-4ad9-82d3-1e67ebbb9c1a-combined-ca-bundle\") pod \"heat-api-5f6bc5c6bb-tzktc\" (UID: \"8314a6db-7b5b-4ad9-82d3-1e67ebbb9c1a\") " pod="openstack/heat-api-5f6bc5c6bb-tzktc" Nov 27 16:21:44 crc kubenswrapper[4707]: I1127 16:21:44.321144 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nq2lw\" (UniqueName: 
\"kubernetes.io/projected/b92f69c5-3b78-463b-bb1a-7728d2cdb6ff-kube-api-access-nq2lw\") pod \"heat-engine-7597fbc9fb-5l66n\" (UID: \"b92f69c5-3b78-463b-bb1a-7728d2cdb6ff\") " pod="openstack/heat-engine-7597fbc9fb-5l66n" Nov 27 16:21:44 crc kubenswrapper[4707]: I1127 16:21:44.332195 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/917b521a-96db-4475-bf1c-af43a99c67f1-config-data-custom\") pod \"heat-cfnapi-d77877bbc-l8p9h\" (UID: \"917b521a-96db-4475-bf1c-af43a99c67f1\") " pod="openstack/heat-cfnapi-d77877bbc-l8p9h" Nov 27 16:21:44 crc kubenswrapper[4707]: I1127 16:21:44.334020 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b92f69c5-3b78-463b-bb1a-7728d2cdb6ff-config-data-custom\") pod \"heat-engine-7597fbc9fb-5l66n\" (UID: \"b92f69c5-3b78-463b-bb1a-7728d2cdb6ff\") " pod="openstack/heat-engine-7597fbc9fb-5l66n" Nov 27 16:21:44 crc kubenswrapper[4707]: I1127 16:21:44.334351 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/917b521a-96db-4475-bf1c-af43a99c67f1-config-data\") pod \"heat-cfnapi-d77877bbc-l8p9h\" (UID: \"917b521a-96db-4475-bf1c-af43a99c67f1\") " pod="openstack/heat-cfnapi-d77877bbc-l8p9h" Nov 27 16:21:44 crc kubenswrapper[4707]: I1127 16:21:44.336676 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b92f69c5-3b78-463b-bb1a-7728d2cdb6ff-combined-ca-bundle\") pod \"heat-engine-7597fbc9fb-5l66n\" (UID: \"b92f69c5-3b78-463b-bb1a-7728d2cdb6ff\") " pod="openstack/heat-engine-7597fbc9fb-5l66n" Nov 27 16:21:44 crc kubenswrapper[4707]: I1127 16:21:44.338226 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Nov 27 16:21:44 crc kubenswrapper[4707]: I1127 16:21:44.339666 4707 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8314a6db-7b5b-4ad9-82d3-1e67ebbb9c1a-config-data-custom\") pod \"heat-api-5f6bc5c6bb-tzktc\" (UID: \"8314a6db-7b5b-4ad9-82d3-1e67ebbb9c1a\") " pod="openstack/heat-api-5f6bc5c6bb-tzktc" Nov 27 16:21:44 crc kubenswrapper[4707]: I1127 16:21:44.341808 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/917b521a-96db-4475-bf1c-af43a99c67f1-combined-ca-bundle\") pod \"heat-cfnapi-d77877bbc-l8p9h\" (UID: \"917b521a-96db-4475-bf1c-af43a99c67f1\") " pod="openstack/heat-cfnapi-d77877bbc-l8p9h" Nov 27 16:21:44 crc kubenswrapper[4707]: I1127 16:21:44.344088 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b92f69c5-3b78-463b-bb1a-7728d2cdb6ff-config-data\") pod \"heat-engine-7597fbc9fb-5l66n\" (UID: \"b92f69c5-3b78-463b-bb1a-7728d2cdb6ff\") " pod="openstack/heat-engine-7597fbc9fb-5l66n" Nov 27 16:21:44 crc kubenswrapper[4707]: I1127 16:21:44.346612 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-txqd9\" (UniqueName: \"kubernetes.io/projected/8314a6db-7b5b-4ad9-82d3-1e67ebbb9c1a-kube-api-access-txqd9\") pod \"heat-api-5f6bc5c6bb-tzktc\" (UID: \"8314a6db-7b5b-4ad9-82d3-1e67ebbb9c1a\") " pod="openstack/heat-api-5f6bc5c6bb-tzktc" Nov 27 16:21:44 crc kubenswrapper[4707]: I1127 16:21:44.347171 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nq2lw\" (UniqueName: \"kubernetes.io/projected/b92f69c5-3b78-463b-bb1a-7728d2cdb6ff-kube-api-access-nq2lw\") pod \"heat-engine-7597fbc9fb-5l66n\" (UID: \"b92f69c5-3b78-463b-bb1a-7728d2cdb6ff\") " pod="openstack/heat-engine-7597fbc9fb-5l66n" Nov 27 16:21:44 crc kubenswrapper[4707]: I1127 16:21:44.347336 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nn8dh\" (UniqueName: 
\"kubernetes.io/projected/917b521a-96db-4475-bf1c-af43a99c67f1-kube-api-access-nn8dh\") pod \"heat-cfnapi-d77877bbc-l8p9h\" (UID: \"917b521a-96db-4475-bf1c-af43a99c67f1\") " pod="openstack/heat-cfnapi-d77877bbc-l8p9h" Nov 27 16:21:44 crc kubenswrapper[4707]: I1127 16:21:44.347790 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8314a6db-7b5b-4ad9-82d3-1e67ebbb9c1a-combined-ca-bundle\") pod \"heat-api-5f6bc5c6bb-tzktc\" (UID: \"8314a6db-7b5b-4ad9-82d3-1e67ebbb9c1a\") " pod="openstack/heat-api-5f6bc5c6bb-tzktc" Nov 27 16:21:44 crc kubenswrapper[4707]: I1127 16:21:44.349431 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8314a6db-7b5b-4ad9-82d3-1e67ebbb9c1a-config-data\") pod \"heat-api-5f6bc5c6bb-tzktc\" (UID: \"8314a6db-7b5b-4ad9-82d3-1e67ebbb9c1a\") " pod="openstack/heat-api-5f6bc5c6bb-tzktc" Nov 27 16:21:44 crc kubenswrapper[4707]: I1127 16:21:44.401002 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-engine-7597fbc9fb-5l66n" Nov 27 16:21:44 crc kubenswrapper[4707]: I1127 16:21:44.549704 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-d77877bbc-l8p9h" Nov 27 16:21:44 crc kubenswrapper[4707]: I1127 16:21:44.563531 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-api-5f6bc5c6bb-tzktc" Nov 27 16:21:44 crc kubenswrapper[4707]: I1127 16:21:44.578447 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Nov 27 16:21:44 crc kubenswrapper[4707]: W1127 16:21:44.592788 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podca422061_61e0_4b99_91ed_adb502839a46.slice/crio-2a3eee1986799b8409ffbeac2724585f05a7c6cef9156be8dc5a73b9afb7f8cf WatchSource:0}: Error finding container 2a3eee1986799b8409ffbeac2724585f05a7c6cef9156be8dc5a73b9afb7f8cf: Status 404 returned error can't find the container with id 2a3eee1986799b8409ffbeac2724585f05a7c6cef9156be8dc5a73b9afb7f8cf Nov 27 16:21:44 crc kubenswrapper[4707]: I1127 16:21:44.730629 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7756b9d78c-qkfjh" event={"ID":"254d0a1b-ea0e-4d0c-8a20-fb85542900fb","Type":"ContainerStarted","Data":"eb9134ff6ad850e3518772ea6559c227cd25c608be080fb41935b943c1cd446f"} Nov 27 16:21:44 crc kubenswrapper[4707]: I1127 16:21:44.731902 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7756b9d78c-qkfjh" Nov 27 16:21:44 crc kubenswrapper[4707]: I1127 16:21:44.733839 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ca422061-61e0-4b99-91ed-adb502839a46","Type":"ContainerStarted","Data":"2a3eee1986799b8409ffbeac2724585f05a7c6cef9156be8dc5a73b9afb7f8cf"} Nov 27 16:21:44 crc kubenswrapper[4707]: I1127 16:21:44.758557 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7756b9d78c-qkfjh" podStartSLOduration=7.758539688 podStartE2EDuration="7.758539688s" podCreationTimestamp="2025-11-27 16:21:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-27 
16:21:44.748555068 +0000 UTC m=+1080.380003856" watchObservedRunningTime="2025-11-27 16:21:44.758539688 +0000 UTC m=+1080.389988456" Nov 27 16:21:44 crc kubenswrapper[4707]: I1127 16:21:44.897221 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-engine-7597fbc9fb-5l66n"] Nov 27 16:21:44 crc kubenswrapper[4707]: W1127 16:21:44.929773 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb92f69c5_3b78_463b_bb1a_7728d2cdb6ff.slice/crio-e0e02b3997da7b41fbedd652f08721d5e50477ee219dcd79e3a16dc5ecbbe84f WatchSource:0}: Error finding container e0e02b3997da7b41fbedd652f08721d5e50477ee219dcd79e3a16dc5ecbbe84f: Status 404 returned error can't find the container with id e0e02b3997da7b41fbedd652f08721d5e50477ee219dcd79e3a16dc5ecbbe84f Nov 27 16:21:45 crc kubenswrapper[4707]: I1127 16:21:45.107701 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-d77877bbc-l8p9h"] Nov 27 16:21:45 crc kubenswrapper[4707]: W1127 16:21:45.118315 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod917b521a_96db_4475_bf1c_af43a99c67f1.slice/crio-c0b9ede4336efb04aff27cc22b8fa62d74b676e2ae713b2b722cabd503846f6c WatchSource:0}: Error finding container c0b9ede4336efb04aff27cc22b8fa62d74b676e2ae713b2b722cabd503846f6c: Status 404 returned error can't find the container with id c0b9ede4336efb04aff27cc22b8fa62d74b676e2ae713b2b722cabd503846f6c Nov 27 16:21:45 crc kubenswrapper[4707]: I1127 16:21:45.240599 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20cd323d-9f2e-472f-8c75-9b413bbdb303" path="/var/lib/kubelet/pods/20cd323d-9f2e-472f-8c75-9b413bbdb303/volumes" Nov 27 16:21:45 crc kubenswrapper[4707]: I1127 16:21:45.264682 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-5f6bc5c6bb-tzktc"] Nov 27 16:21:45 crc kubenswrapper[4707]: I1127 
16:21:45.728438 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-api-7db44cddd-l4rvd"] Nov 27 16:21:45 crc kubenswrapper[4707]: I1127 16:21:45.743933 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-7597fbc9fb-5l66n" event={"ID":"b92f69c5-3b78-463b-bb1a-7728d2cdb6ff","Type":"ContainerStarted","Data":"e0e02b3997da7b41fbedd652f08721d5e50477ee219dcd79e3a16dc5ecbbe84f"} Nov 27 16:21:45 crc kubenswrapper[4707]: I1127 16:21:45.744280 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-cfnapi-5584d95bf4-c6hwk"] Nov 27 16:21:45 crc kubenswrapper[4707]: I1127 16:21:45.745869 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-5f6bc5c6bb-tzktc" event={"ID":"8314a6db-7b5b-4ad9-82d3-1e67ebbb9c1a","Type":"ContainerStarted","Data":"3db1097466c184742fd8ec1ed22d8738be6460b0d568304e30ea83ba0933849e"} Nov 27 16:21:45 crc kubenswrapper[4707]: I1127 16:21:45.747624 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-d77877bbc-l8p9h" event={"ID":"917b521a-96db-4475-bf1c-af43a99c67f1","Type":"ContainerStarted","Data":"c0b9ede4336efb04aff27cc22b8fa62d74b676e2ae713b2b722cabd503846f6c"} Nov 27 16:21:45 crc kubenswrapper[4707]: I1127 16:21:45.766671 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-api-57bc8fcfc9-trdbf"] Nov 27 16:21:45 crc kubenswrapper[4707]: I1127 16:21:45.768108 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-api-57bc8fcfc9-trdbf" Nov 27 16:21:45 crc kubenswrapper[4707]: I1127 16:21:45.770253 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-heat-api-internal-svc" Nov 27 16:21:45 crc kubenswrapper[4707]: I1127 16:21:45.770891 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-heat-api-public-svc" Nov 27 16:21:45 crc kubenswrapper[4707]: I1127 16:21:45.784928 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-cfnapi-5c4f76f9fb-ghh99"] Nov 27 16:21:45 crc kubenswrapper[4707]: I1127 16:21:45.786332 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-5c4f76f9fb-ghh99" Nov 27 16:21:45 crc kubenswrapper[4707]: I1127 16:21:45.797098 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-57bc8fcfc9-trdbf"] Nov 27 16:21:45 crc kubenswrapper[4707]: I1127 16:21:45.797602 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-heat-cfnapi-internal-svc" Nov 27 16:21:45 crc kubenswrapper[4707]: I1127 16:21:45.797751 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-heat-cfnapi-public-svc" Nov 27 16:21:45 crc kubenswrapper[4707]: I1127 16:21:45.805554 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-5c4f76f9fb-ghh99"] Nov 27 16:21:45 crc kubenswrapper[4707]: I1127 16:21:45.849855 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c001365b-7c18-4d58-b516-a038ef2d6c8c-internal-tls-certs\") pod \"heat-api-57bc8fcfc9-trdbf\" (UID: \"c001365b-7c18-4d58-b516-a038ef2d6c8c\") " pod="openstack/heat-api-57bc8fcfc9-trdbf" Nov 27 16:21:45 crc kubenswrapper[4707]: I1127 16:21:45.850041 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c001365b-7c18-4d58-b516-a038ef2d6c8c-config-data-custom\") pod \"heat-api-57bc8fcfc9-trdbf\" (UID: \"c001365b-7c18-4d58-b516-a038ef2d6c8c\") " pod="openstack/heat-api-57bc8fcfc9-trdbf" Nov 27 16:21:45 crc kubenswrapper[4707]: I1127 16:21:45.850088 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c001365b-7c18-4d58-b516-a038ef2d6c8c-combined-ca-bundle\") pod \"heat-api-57bc8fcfc9-trdbf\" (UID: \"c001365b-7c18-4d58-b516-a038ef2d6c8c\") " pod="openstack/heat-api-57bc8fcfc9-trdbf" Nov 27 16:21:45 crc kubenswrapper[4707]: I1127 16:21:45.850111 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c001365b-7c18-4d58-b516-a038ef2d6c8c-config-data\") pod \"heat-api-57bc8fcfc9-trdbf\" (UID: \"c001365b-7c18-4d58-b516-a038ef2d6c8c\") " pod="openstack/heat-api-57bc8fcfc9-trdbf" Nov 27 16:21:45 crc kubenswrapper[4707]: I1127 16:21:45.850157 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-97tpj\" (UniqueName: \"kubernetes.io/projected/c001365b-7c18-4d58-b516-a038ef2d6c8c-kube-api-access-97tpj\") pod \"heat-api-57bc8fcfc9-trdbf\" (UID: \"c001365b-7c18-4d58-b516-a038ef2d6c8c\") " pod="openstack/heat-api-57bc8fcfc9-trdbf" Nov 27 16:21:45 crc kubenswrapper[4707]: I1127 16:21:45.850254 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c001365b-7c18-4d58-b516-a038ef2d6c8c-public-tls-certs\") pod \"heat-api-57bc8fcfc9-trdbf\" (UID: \"c001365b-7c18-4d58-b516-a038ef2d6c8c\") " pod="openstack/heat-api-57bc8fcfc9-trdbf" Nov 27 16:21:45 crc kubenswrapper[4707]: I1127 16:21:45.956274 4707 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7646872e-82fb-4df8-b7ce-176b3ba7fe8a-config-data-custom\") pod \"heat-cfnapi-5c4f76f9fb-ghh99\" (UID: \"7646872e-82fb-4df8-b7ce-176b3ba7fe8a\") " pod="openstack/heat-cfnapi-5c4f76f9fb-ghh99" Nov 27 16:21:45 crc kubenswrapper[4707]: I1127 16:21:45.956329 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c001365b-7c18-4d58-b516-a038ef2d6c8c-config-data-custom\") pod \"heat-api-57bc8fcfc9-trdbf\" (UID: \"c001365b-7c18-4d58-b516-a038ef2d6c8c\") " pod="openstack/heat-api-57bc8fcfc9-trdbf" Nov 27 16:21:45 crc kubenswrapper[4707]: I1127 16:21:45.956379 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c001365b-7c18-4d58-b516-a038ef2d6c8c-combined-ca-bundle\") pod \"heat-api-57bc8fcfc9-trdbf\" (UID: \"c001365b-7c18-4d58-b516-a038ef2d6c8c\") " pod="openstack/heat-api-57bc8fcfc9-trdbf" Nov 27 16:21:45 crc kubenswrapper[4707]: I1127 16:21:45.956406 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c001365b-7c18-4d58-b516-a038ef2d6c8c-config-data\") pod \"heat-api-57bc8fcfc9-trdbf\" (UID: \"c001365b-7c18-4d58-b516-a038ef2d6c8c\") " pod="openstack/heat-api-57bc8fcfc9-trdbf" Nov 27 16:21:45 crc kubenswrapper[4707]: I1127 16:21:45.956430 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-97tpj\" (UniqueName: \"kubernetes.io/projected/c001365b-7c18-4d58-b516-a038ef2d6c8c-kube-api-access-97tpj\") pod \"heat-api-57bc8fcfc9-trdbf\" (UID: \"c001365b-7c18-4d58-b516-a038ef2d6c8c\") " pod="openstack/heat-api-57bc8fcfc9-trdbf" Nov 27 16:21:45 crc kubenswrapper[4707]: I1127 16:21:45.956449 4707 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7646872e-82fb-4df8-b7ce-176b3ba7fe8a-config-data\") pod \"heat-cfnapi-5c4f76f9fb-ghh99\" (UID: \"7646872e-82fb-4df8-b7ce-176b3ba7fe8a\") " pod="openstack/heat-cfnapi-5c4f76f9fb-ghh99" Nov 27 16:21:45 crc kubenswrapper[4707]: I1127 16:21:45.956480 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7646872e-82fb-4df8-b7ce-176b3ba7fe8a-public-tls-certs\") pod \"heat-cfnapi-5c4f76f9fb-ghh99\" (UID: \"7646872e-82fb-4df8-b7ce-176b3ba7fe8a\") " pod="openstack/heat-cfnapi-5c4f76f9fb-ghh99" Nov 27 16:21:45 crc kubenswrapper[4707]: I1127 16:21:45.956520 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c001365b-7c18-4d58-b516-a038ef2d6c8c-public-tls-certs\") pod \"heat-api-57bc8fcfc9-trdbf\" (UID: \"c001365b-7c18-4d58-b516-a038ef2d6c8c\") " pod="openstack/heat-api-57bc8fcfc9-trdbf" Nov 27 16:21:45 crc kubenswrapper[4707]: I1127 16:21:45.957257 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7xhl2\" (UniqueName: \"kubernetes.io/projected/7646872e-82fb-4df8-b7ce-176b3ba7fe8a-kube-api-access-7xhl2\") pod \"heat-cfnapi-5c4f76f9fb-ghh99\" (UID: \"7646872e-82fb-4df8-b7ce-176b3ba7fe8a\") " pod="openstack/heat-cfnapi-5c4f76f9fb-ghh99" Nov 27 16:21:45 crc kubenswrapper[4707]: I1127 16:21:45.957281 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c001365b-7c18-4d58-b516-a038ef2d6c8c-internal-tls-certs\") pod \"heat-api-57bc8fcfc9-trdbf\" (UID: \"c001365b-7c18-4d58-b516-a038ef2d6c8c\") " pod="openstack/heat-api-57bc8fcfc9-trdbf" Nov 27 16:21:45 crc kubenswrapper[4707]: I1127 16:21:45.957298 4707 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7646872e-82fb-4df8-b7ce-176b3ba7fe8a-internal-tls-certs\") pod \"heat-cfnapi-5c4f76f9fb-ghh99\" (UID: \"7646872e-82fb-4df8-b7ce-176b3ba7fe8a\") " pod="openstack/heat-cfnapi-5c4f76f9fb-ghh99" Nov 27 16:21:45 crc kubenswrapper[4707]: I1127 16:21:45.957338 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7646872e-82fb-4df8-b7ce-176b3ba7fe8a-combined-ca-bundle\") pod \"heat-cfnapi-5c4f76f9fb-ghh99\" (UID: \"7646872e-82fb-4df8-b7ce-176b3ba7fe8a\") " pod="openstack/heat-cfnapi-5c4f76f9fb-ghh99" Nov 27 16:21:45 crc kubenswrapper[4707]: I1127 16:21:45.962541 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c001365b-7c18-4d58-b516-a038ef2d6c8c-public-tls-certs\") pod \"heat-api-57bc8fcfc9-trdbf\" (UID: \"c001365b-7c18-4d58-b516-a038ef2d6c8c\") " pod="openstack/heat-api-57bc8fcfc9-trdbf" Nov 27 16:21:45 crc kubenswrapper[4707]: I1127 16:21:45.962591 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c001365b-7c18-4d58-b516-a038ef2d6c8c-internal-tls-certs\") pod \"heat-api-57bc8fcfc9-trdbf\" (UID: \"c001365b-7c18-4d58-b516-a038ef2d6c8c\") " pod="openstack/heat-api-57bc8fcfc9-trdbf" Nov 27 16:21:45 crc kubenswrapper[4707]: I1127 16:21:45.964180 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c001365b-7c18-4d58-b516-a038ef2d6c8c-config-data-custom\") pod \"heat-api-57bc8fcfc9-trdbf\" (UID: \"c001365b-7c18-4d58-b516-a038ef2d6c8c\") " pod="openstack/heat-api-57bc8fcfc9-trdbf" Nov 27 16:21:45 crc kubenswrapper[4707]: I1127 16:21:45.976442 4707 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c001365b-7c18-4d58-b516-a038ef2d6c8c-combined-ca-bundle\") pod \"heat-api-57bc8fcfc9-trdbf\" (UID: \"c001365b-7c18-4d58-b516-a038ef2d6c8c\") " pod="openstack/heat-api-57bc8fcfc9-trdbf" Nov 27 16:21:45 crc kubenswrapper[4707]: I1127 16:21:45.976528 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c001365b-7c18-4d58-b516-a038ef2d6c8c-config-data\") pod \"heat-api-57bc8fcfc9-trdbf\" (UID: \"c001365b-7c18-4d58-b516-a038ef2d6c8c\") " pod="openstack/heat-api-57bc8fcfc9-trdbf" Nov 27 16:21:45 crc kubenswrapper[4707]: I1127 16:21:45.978089 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-97tpj\" (UniqueName: \"kubernetes.io/projected/c001365b-7c18-4d58-b516-a038ef2d6c8c-kube-api-access-97tpj\") pod \"heat-api-57bc8fcfc9-trdbf\" (UID: \"c001365b-7c18-4d58-b516-a038ef2d6c8c\") " pod="openstack/heat-api-57bc8fcfc9-trdbf" Nov 27 16:21:46 crc kubenswrapper[4707]: I1127 16:21:46.058763 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7646872e-82fb-4df8-b7ce-176b3ba7fe8a-combined-ca-bundle\") pod \"heat-cfnapi-5c4f76f9fb-ghh99\" (UID: \"7646872e-82fb-4df8-b7ce-176b3ba7fe8a\") " pod="openstack/heat-cfnapi-5c4f76f9fb-ghh99" Nov 27 16:21:46 crc kubenswrapper[4707]: I1127 16:21:46.058835 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7646872e-82fb-4df8-b7ce-176b3ba7fe8a-config-data-custom\") pod \"heat-cfnapi-5c4f76f9fb-ghh99\" (UID: \"7646872e-82fb-4df8-b7ce-176b3ba7fe8a\") " pod="openstack/heat-cfnapi-5c4f76f9fb-ghh99" Nov 27 16:21:46 crc kubenswrapper[4707]: I1127 16:21:46.058886 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/7646872e-82fb-4df8-b7ce-176b3ba7fe8a-config-data\") pod \"heat-cfnapi-5c4f76f9fb-ghh99\" (UID: \"7646872e-82fb-4df8-b7ce-176b3ba7fe8a\") " pod="openstack/heat-cfnapi-5c4f76f9fb-ghh99" Nov 27 16:21:46 crc kubenswrapper[4707]: I1127 16:21:46.058912 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7646872e-82fb-4df8-b7ce-176b3ba7fe8a-public-tls-certs\") pod \"heat-cfnapi-5c4f76f9fb-ghh99\" (UID: \"7646872e-82fb-4df8-b7ce-176b3ba7fe8a\") " pod="openstack/heat-cfnapi-5c4f76f9fb-ghh99" Nov 27 16:21:46 crc kubenswrapper[4707]: I1127 16:21:46.058975 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7xhl2\" (UniqueName: \"kubernetes.io/projected/7646872e-82fb-4df8-b7ce-176b3ba7fe8a-kube-api-access-7xhl2\") pod \"heat-cfnapi-5c4f76f9fb-ghh99\" (UID: \"7646872e-82fb-4df8-b7ce-176b3ba7fe8a\") " pod="openstack/heat-cfnapi-5c4f76f9fb-ghh99" Nov 27 16:21:46 crc kubenswrapper[4707]: I1127 16:21:46.058999 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7646872e-82fb-4df8-b7ce-176b3ba7fe8a-internal-tls-certs\") pod \"heat-cfnapi-5c4f76f9fb-ghh99\" (UID: \"7646872e-82fb-4df8-b7ce-176b3ba7fe8a\") " pod="openstack/heat-cfnapi-5c4f76f9fb-ghh99" Nov 27 16:21:46 crc kubenswrapper[4707]: I1127 16:21:46.062726 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7646872e-82fb-4df8-b7ce-176b3ba7fe8a-internal-tls-certs\") pod \"heat-cfnapi-5c4f76f9fb-ghh99\" (UID: \"7646872e-82fb-4df8-b7ce-176b3ba7fe8a\") " pod="openstack/heat-cfnapi-5c4f76f9fb-ghh99" Nov 27 16:21:46 crc kubenswrapper[4707]: I1127 16:21:46.065022 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/7646872e-82fb-4df8-b7ce-176b3ba7fe8a-combined-ca-bundle\") pod \"heat-cfnapi-5c4f76f9fb-ghh99\" (UID: \"7646872e-82fb-4df8-b7ce-176b3ba7fe8a\") " pod="openstack/heat-cfnapi-5c4f76f9fb-ghh99" Nov 27 16:21:46 crc kubenswrapper[4707]: I1127 16:21:46.065112 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7646872e-82fb-4df8-b7ce-176b3ba7fe8a-config-data-custom\") pod \"heat-cfnapi-5c4f76f9fb-ghh99\" (UID: \"7646872e-82fb-4df8-b7ce-176b3ba7fe8a\") " pod="openstack/heat-cfnapi-5c4f76f9fb-ghh99" Nov 27 16:21:46 crc kubenswrapper[4707]: I1127 16:21:46.066409 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7646872e-82fb-4df8-b7ce-176b3ba7fe8a-config-data\") pod \"heat-cfnapi-5c4f76f9fb-ghh99\" (UID: \"7646872e-82fb-4df8-b7ce-176b3ba7fe8a\") " pod="openstack/heat-cfnapi-5c4f76f9fb-ghh99" Nov 27 16:21:46 crc kubenswrapper[4707]: I1127 16:21:46.075966 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7xhl2\" (UniqueName: \"kubernetes.io/projected/7646872e-82fb-4df8-b7ce-176b3ba7fe8a-kube-api-access-7xhl2\") pod \"heat-cfnapi-5c4f76f9fb-ghh99\" (UID: \"7646872e-82fb-4df8-b7ce-176b3ba7fe8a\") " pod="openstack/heat-cfnapi-5c4f76f9fb-ghh99" Nov 27 16:21:46 crc kubenswrapper[4707]: I1127 16:21:46.076157 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7646872e-82fb-4df8-b7ce-176b3ba7fe8a-public-tls-certs\") pod \"heat-cfnapi-5c4f76f9fb-ghh99\" (UID: \"7646872e-82fb-4df8-b7ce-176b3ba7fe8a\") " pod="openstack/heat-cfnapi-5c4f76f9fb-ghh99" Nov 27 16:21:46 crc kubenswrapper[4707]: I1127 16:21:46.086483 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-api-57bc8fcfc9-trdbf" Nov 27 16:21:46 crc kubenswrapper[4707]: I1127 16:21:46.104033 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-5c4f76f9fb-ghh99" Nov 27 16:21:46 crc kubenswrapper[4707]: I1127 16:21:46.622136 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-5c4f76f9fb-ghh99"] Nov 27 16:21:46 crc kubenswrapper[4707]: I1127 16:21:46.630922 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-57bc8fcfc9-trdbf"] Nov 27 16:21:46 crc kubenswrapper[4707]: I1127 16:21:46.757033 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-7597fbc9fb-5l66n" event={"ID":"b92f69c5-3b78-463b-bb1a-7728d2cdb6ff","Type":"ContainerStarted","Data":"8d2e718b54128f92524019df5de294babfef848fe08690e0e40216c3e4741417"} Nov 27 16:21:46 crc kubenswrapper[4707]: I1127 16:21:46.777765 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-engine-7597fbc9fb-5l66n" podStartSLOduration=2.777744973 podStartE2EDuration="2.777744973s" podCreationTimestamp="2025-11-27 16:21:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-27 16:21:46.772996759 +0000 UTC m=+1082.404445527" watchObservedRunningTime="2025-11-27 16:21:46.777744973 +0000 UTC m=+1082.409193751" Nov 27 16:21:47 crc kubenswrapper[4707]: W1127 16:21:47.070286 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7646872e_82fb_4df8_b7ce_176b3ba7fe8a.slice/crio-3dd4318897ed258a0608a200b2ee92e51549b653b6b7a64cd04df6edce71c5d4 WatchSource:0}: Error finding container 3dd4318897ed258a0608a200b2ee92e51549b653b6b7a64cd04df6edce71c5d4: Status 404 returned error can't find the container with id 
3dd4318897ed258a0608a200b2ee92e51549b653b6b7a64cd04df6edce71c5d4 Nov 27 16:21:47 crc kubenswrapper[4707]: W1127 16:21:47.073718 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc001365b_7c18_4d58_b516_a038ef2d6c8c.slice/crio-abedda55062c91968c9f5cb6f26795d65a67d7e11dbbc0110d86cec2cf6485c4 WatchSource:0}: Error finding container abedda55062c91968c9f5cb6f26795d65a67d7e11dbbc0110d86cec2cf6485c4: Status 404 returned error can't find the container with id abedda55062c91968c9f5cb6f26795d65a67d7e11dbbc0110d86cec2cf6485c4 Nov 27 16:21:47 crc kubenswrapper[4707]: I1127 16:21:47.768542 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-5c4f76f9fb-ghh99" event={"ID":"7646872e-82fb-4df8-b7ce-176b3ba7fe8a","Type":"ContainerStarted","Data":"3dd4318897ed258a0608a200b2ee92e51549b653b6b7a64cd04df6edce71c5d4"} Nov 27 16:21:47 crc kubenswrapper[4707]: I1127 16:21:47.769953 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-57bc8fcfc9-trdbf" event={"ID":"c001365b-7c18-4d58-b516-a038ef2d6c8c","Type":"ContainerStarted","Data":"abedda55062c91968c9f5cb6f26795d65a67d7e11dbbc0110d86cec2cf6485c4"} Nov 27 16:21:47 crc kubenswrapper[4707]: I1127 16:21:47.770115 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-engine-7597fbc9fb-5l66n" Nov 27 16:21:48 crc kubenswrapper[4707]: I1127 16:21:48.053531 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-7756b9d78c-qkfjh" Nov 27 16:21:48 crc kubenswrapper[4707]: I1127 16:21:48.132479 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-t8jp6"] Nov 27 16:21:48 crc kubenswrapper[4707]: I1127 16:21:48.132761 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5c9776ccc5-t8jp6" podUID="3bc190a4-bb7f-4b8a-acc4-e201fd1d495d" 
containerName="dnsmasq-dns" containerID="cri-o://622e5ed3f0f124984e4d55cbfaa4e258b2bbb08221ab94c92cddb1927c240800" gracePeriod=10 Nov 27 16:21:48 crc kubenswrapper[4707]: I1127 16:21:48.791319 4707 generic.go:334] "Generic (PLEG): container finished" podID="3bc190a4-bb7f-4b8a-acc4-e201fd1d495d" containerID="622e5ed3f0f124984e4d55cbfaa4e258b2bbb08221ab94c92cddb1927c240800" exitCode=0 Nov 27 16:21:48 crc kubenswrapper[4707]: I1127 16:21:48.791358 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c9776ccc5-t8jp6" event={"ID":"3bc190a4-bb7f-4b8a-acc4-e201fd1d495d","Type":"ContainerDied","Data":"622e5ed3f0f124984e4d55cbfaa4e258b2bbb08221ab94c92cddb1927c240800"} Nov 27 16:21:48 crc kubenswrapper[4707]: I1127 16:21:48.791663 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c9776ccc5-t8jp6" event={"ID":"3bc190a4-bb7f-4b8a-acc4-e201fd1d495d","Type":"ContainerDied","Data":"8f3ab2408c96ad3596755d14026d95ab1c785ed09e8d748c2a1915def79901b7"} Nov 27 16:21:48 crc kubenswrapper[4707]: I1127 16:21:48.791677 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8f3ab2408c96ad3596755d14026d95ab1c785ed09e8d748c2a1915def79901b7" Nov 27 16:21:48 crc kubenswrapper[4707]: I1127 16:21:48.825072 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5c9776ccc5-t8jp6" Nov 27 16:21:48 crc kubenswrapper[4707]: I1127 16:21:48.917069 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3bc190a4-bb7f-4b8a-acc4-e201fd1d495d-ovsdbserver-sb\") pod \"3bc190a4-bb7f-4b8a-acc4-e201fd1d495d\" (UID: \"3bc190a4-bb7f-4b8a-acc4-e201fd1d495d\") " Nov 27 16:21:48 crc kubenswrapper[4707]: I1127 16:21:48.917135 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3bc190a4-bb7f-4b8a-acc4-e201fd1d495d-config\") pod \"3bc190a4-bb7f-4b8a-acc4-e201fd1d495d\" (UID: \"3bc190a4-bb7f-4b8a-acc4-e201fd1d495d\") " Nov 27 16:21:48 crc kubenswrapper[4707]: I1127 16:21:48.917221 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3bc190a4-bb7f-4b8a-acc4-e201fd1d495d-dns-svc\") pod \"3bc190a4-bb7f-4b8a-acc4-e201fd1d495d\" (UID: \"3bc190a4-bb7f-4b8a-acc4-e201fd1d495d\") " Nov 27 16:21:48 crc kubenswrapper[4707]: I1127 16:21:48.917311 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3bc190a4-bb7f-4b8a-acc4-e201fd1d495d-ovsdbserver-nb\") pod \"3bc190a4-bb7f-4b8a-acc4-e201fd1d495d\" (UID: \"3bc190a4-bb7f-4b8a-acc4-e201fd1d495d\") " Nov 27 16:21:48 crc kubenswrapper[4707]: I1127 16:21:48.917350 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fkcsq\" (UniqueName: \"kubernetes.io/projected/3bc190a4-bb7f-4b8a-acc4-e201fd1d495d-kube-api-access-fkcsq\") pod \"3bc190a4-bb7f-4b8a-acc4-e201fd1d495d\" (UID: \"3bc190a4-bb7f-4b8a-acc4-e201fd1d495d\") " Nov 27 16:21:48 crc kubenswrapper[4707]: I1127 16:21:48.917447 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" 
(UniqueName: \"kubernetes.io/configmap/3bc190a4-bb7f-4b8a-acc4-e201fd1d495d-dns-swift-storage-0\") pod \"3bc190a4-bb7f-4b8a-acc4-e201fd1d495d\" (UID: \"3bc190a4-bb7f-4b8a-acc4-e201fd1d495d\") " Nov 27 16:21:48 crc kubenswrapper[4707]: I1127 16:21:48.921420 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3bc190a4-bb7f-4b8a-acc4-e201fd1d495d-kube-api-access-fkcsq" (OuterVolumeSpecName: "kube-api-access-fkcsq") pod "3bc190a4-bb7f-4b8a-acc4-e201fd1d495d" (UID: "3bc190a4-bb7f-4b8a-acc4-e201fd1d495d"). InnerVolumeSpecName "kube-api-access-fkcsq". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 16:21:49 crc kubenswrapper[4707]: I1127 16:21:49.002557 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3bc190a4-bb7f-4b8a-acc4-e201fd1d495d-config" (OuterVolumeSpecName: "config") pod "3bc190a4-bb7f-4b8a-acc4-e201fd1d495d" (UID: "3bc190a4-bb7f-4b8a-acc4-e201fd1d495d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 16:21:49 crc kubenswrapper[4707]: I1127 16:21:49.019000 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3bc190a4-bb7f-4b8a-acc4-e201fd1d495d-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "3bc190a4-bb7f-4b8a-acc4-e201fd1d495d" (UID: "3bc190a4-bb7f-4b8a-acc4-e201fd1d495d"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 16:21:49 crc kubenswrapper[4707]: I1127 16:21:49.019223 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/3bc190a4-bb7f-4b8a-acc4-e201fd1d495d-dns-swift-storage-0\") pod \"3bc190a4-bb7f-4b8a-acc4-e201fd1d495d\" (UID: \"3bc190a4-bb7f-4b8a-acc4-e201fd1d495d\") " Nov 27 16:21:49 crc kubenswrapper[4707]: I1127 16:21:49.019709 4707 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3bc190a4-bb7f-4b8a-acc4-e201fd1d495d-config\") on node \"crc\" DevicePath \"\"" Nov 27 16:21:49 crc kubenswrapper[4707]: I1127 16:21:49.019729 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fkcsq\" (UniqueName: \"kubernetes.io/projected/3bc190a4-bb7f-4b8a-acc4-e201fd1d495d-kube-api-access-fkcsq\") on node \"crc\" DevicePath \"\"" Nov 27 16:21:49 crc kubenswrapper[4707]: W1127 16:21:49.019810 4707 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/3bc190a4-bb7f-4b8a-acc4-e201fd1d495d/volumes/kubernetes.io~configmap/dns-swift-storage-0 Nov 27 16:21:49 crc kubenswrapper[4707]: I1127 16:21:49.019830 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3bc190a4-bb7f-4b8a-acc4-e201fd1d495d-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "3bc190a4-bb7f-4b8a-acc4-e201fd1d495d" (UID: "3bc190a4-bb7f-4b8a-acc4-e201fd1d495d"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 16:21:49 crc kubenswrapper[4707]: I1127 16:21:49.040359 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3bc190a4-bb7f-4b8a-acc4-e201fd1d495d-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "3bc190a4-bb7f-4b8a-acc4-e201fd1d495d" (UID: "3bc190a4-bb7f-4b8a-acc4-e201fd1d495d"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 16:21:49 crc kubenswrapper[4707]: I1127 16:21:49.082082 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3bc190a4-bb7f-4b8a-acc4-e201fd1d495d-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "3bc190a4-bb7f-4b8a-acc4-e201fd1d495d" (UID: "3bc190a4-bb7f-4b8a-acc4-e201fd1d495d"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 16:21:49 crc kubenswrapper[4707]: I1127 16:21:49.120934 4707 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3bc190a4-bb7f-4b8a-acc4-e201fd1d495d-dns-svc\") on node \"crc\" DevicePath \"\"" Nov 27 16:21:49 crc kubenswrapper[4707]: I1127 16:21:49.120966 4707 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3bc190a4-bb7f-4b8a-acc4-e201fd1d495d-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Nov 27 16:21:49 crc kubenswrapper[4707]: I1127 16:21:49.120979 4707 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/3bc190a4-bb7f-4b8a-acc4-e201fd1d495d-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Nov 27 16:21:49 crc kubenswrapper[4707]: I1127 16:21:49.138094 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3bc190a4-bb7f-4b8a-acc4-e201fd1d495d-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod 
"3bc190a4-bb7f-4b8a-acc4-e201fd1d495d" (UID: "3bc190a4-bb7f-4b8a-acc4-e201fd1d495d"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 16:21:49 crc kubenswrapper[4707]: I1127 16:21:49.178939 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-db-create-qd5lp"] Nov 27 16:21:49 crc kubenswrapper[4707]: E1127 16:21:49.179468 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3bc190a4-bb7f-4b8a-acc4-e201fd1d495d" containerName="dnsmasq-dns" Nov 27 16:21:49 crc kubenswrapper[4707]: I1127 16:21:49.179489 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="3bc190a4-bb7f-4b8a-acc4-e201fd1d495d" containerName="dnsmasq-dns" Nov 27 16:21:49 crc kubenswrapper[4707]: E1127 16:21:49.179514 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3bc190a4-bb7f-4b8a-acc4-e201fd1d495d" containerName="init" Nov 27 16:21:49 crc kubenswrapper[4707]: I1127 16:21:49.179521 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="3bc190a4-bb7f-4b8a-acc4-e201fd1d495d" containerName="init" Nov 27 16:21:49 crc kubenswrapper[4707]: I1127 16:21:49.179712 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="3bc190a4-bb7f-4b8a-acc4-e201fd1d495d" containerName="dnsmasq-dns" Nov 27 16:21:49 crc kubenswrapper[4707]: I1127 16:21:49.180356 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-qd5lp" Nov 27 16:21:49 crc kubenswrapper[4707]: I1127 16:21:49.191974 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-qd5lp"] Nov 27 16:21:49 crc kubenswrapper[4707]: I1127 16:21:49.222264 4707 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3bc190a4-bb7f-4b8a-acc4-e201fd1d495d-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Nov 27 16:21:49 crc kubenswrapper[4707]: I1127 16:21:49.285083 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-db-create-rz6lr"] Nov 27 16:21:49 crc kubenswrapper[4707]: I1127 16:21:49.286385 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-rz6lr" Nov 27 16:21:49 crc kubenswrapper[4707]: I1127 16:21:49.312424 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-rz6lr"] Nov 27 16:21:49 crc kubenswrapper[4707]: I1127 16:21:49.324211 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p4xcz\" (UniqueName: \"kubernetes.io/projected/63ce055f-5c4d-43ae-895c-4632afdacd87-kube-api-access-p4xcz\") pod \"nova-api-db-create-qd5lp\" (UID: \"63ce055f-5c4d-43ae-895c-4632afdacd87\") " pod="openstack/nova-api-db-create-qd5lp" Nov 27 16:21:49 crc kubenswrapper[4707]: I1127 16:21:49.324289 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/63ce055f-5c4d-43ae-895c-4632afdacd87-operator-scripts\") pod \"nova-api-db-create-qd5lp\" (UID: \"63ce055f-5c4d-43ae-895c-4632afdacd87\") " pod="openstack/nova-api-db-create-qd5lp" Nov 27 16:21:49 crc kubenswrapper[4707]: I1127 16:21:49.347952 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-54ea-account-create-update-lm8hw"] Nov 27 
16:21:49 crc kubenswrapper[4707]: I1127 16:21:49.366683 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-54ea-account-create-update-lm8hw"] Nov 27 16:21:49 crc kubenswrapper[4707]: I1127 16:21:49.366777 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-54ea-account-create-update-lm8hw" Nov 27 16:21:49 crc kubenswrapper[4707]: I1127 16:21:49.368726 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-db-secret" Nov 27 16:21:49 crc kubenswrapper[4707]: I1127 16:21:49.429105 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-db-create-tmmj9"] Nov 27 16:21:49 crc kubenswrapper[4707]: I1127 16:21:49.429409 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/63ce055f-5c4d-43ae-895c-4632afdacd87-operator-scripts\") pod \"nova-api-db-create-qd5lp\" (UID: \"63ce055f-5c4d-43ae-895c-4632afdacd87\") " pod="openstack/nova-api-db-create-qd5lp" Nov 27 16:21:49 crc kubenswrapper[4707]: I1127 16:21:49.429518 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3ca24dad-8ffd-41b5-9379-b05c90193e9e-operator-scripts\") pod \"nova-cell0-db-create-rz6lr\" (UID: \"3ca24dad-8ffd-41b5-9379-b05c90193e9e\") " pod="openstack/nova-cell0-db-create-rz6lr" Nov 27 16:21:49 crc kubenswrapper[4707]: I1127 16:21:49.429663 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fsq7s\" (UniqueName: \"kubernetes.io/projected/c33b6312-639a-429c-88ae-5c60ec56280c-kube-api-access-fsq7s\") pod \"nova-api-54ea-account-create-update-lm8hw\" (UID: \"c33b6312-639a-429c-88ae-5c60ec56280c\") " pod="openstack/nova-api-54ea-account-create-update-lm8hw" Nov 27 16:21:49 crc kubenswrapper[4707]: I1127 16:21:49.429746 4707 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5tvwt\" (UniqueName: \"kubernetes.io/projected/3ca24dad-8ffd-41b5-9379-b05c90193e9e-kube-api-access-5tvwt\") pod \"nova-cell0-db-create-rz6lr\" (UID: \"3ca24dad-8ffd-41b5-9379-b05c90193e9e\") " pod="openstack/nova-cell0-db-create-rz6lr" Nov 27 16:21:49 crc kubenswrapper[4707]: I1127 16:21:49.429902 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c33b6312-639a-429c-88ae-5c60ec56280c-operator-scripts\") pod \"nova-api-54ea-account-create-update-lm8hw\" (UID: \"c33b6312-639a-429c-88ae-5c60ec56280c\") " pod="openstack/nova-api-54ea-account-create-update-lm8hw" Nov 27 16:21:49 crc kubenswrapper[4707]: I1127 16:21:49.429945 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p4xcz\" (UniqueName: \"kubernetes.io/projected/63ce055f-5c4d-43ae-895c-4632afdacd87-kube-api-access-p4xcz\") pod \"nova-api-db-create-qd5lp\" (UID: \"63ce055f-5c4d-43ae-895c-4632afdacd87\") " pod="openstack/nova-api-db-create-qd5lp" Nov 27 16:21:49 crc kubenswrapper[4707]: I1127 16:21:49.431074 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-tmmj9" Nov 27 16:21:49 crc kubenswrapper[4707]: I1127 16:21:49.432837 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/63ce055f-5c4d-43ae-895c-4632afdacd87-operator-scripts\") pod \"nova-api-db-create-qd5lp\" (UID: \"63ce055f-5c4d-43ae-895c-4632afdacd87\") " pod="openstack/nova-api-db-create-qd5lp" Nov 27 16:21:49 crc kubenswrapper[4707]: I1127 16:21:49.461665 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p4xcz\" (UniqueName: \"kubernetes.io/projected/63ce055f-5c4d-43ae-895c-4632afdacd87-kube-api-access-p4xcz\") pod \"nova-api-db-create-qd5lp\" (UID: \"63ce055f-5c4d-43ae-895c-4632afdacd87\") " pod="openstack/nova-api-db-create-qd5lp" Nov 27 16:21:49 crc kubenswrapper[4707]: I1127 16:21:49.461763 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-tmmj9"] Nov 27 16:21:49 crc kubenswrapper[4707]: I1127 16:21:49.491376 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-b684-account-create-update-7n56j"] Nov 27 16:21:49 crc kubenswrapper[4707]: I1127 16:21:49.492657 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-b684-account-create-update-7n56j" Nov 27 16:21:49 crc kubenswrapper[4707]: I1127 16:21:49.496924 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-b684-account-create-update-7n56j"] Nov 27 16:21:49 crc kubenswrapper[4707]: I1127 16:21:49.503249 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-db-secret" Nov 27 16:21:49 crc kubenswrapper[4707]: I1127 16:21:49.530176 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-qd5lp" Nov 27 16:21:49 crc kubenswrapper[4707]: I1127 16:21:49.531189 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/54b3a42f-b2e8-46ae-b500-3b2de0b501c7-operator-scripts\") pod \"nova-cell1-db-create-tmmj9\" (UID: \"54b3a42f-b2e8-46ae-b500-3b2de0b501c7\") " pod="openstack/nova-cell1-db-create-tmmj9" Nov 27 16:21:49 crc kubenswrapper[4707]: I1127 16:21:49.531236 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/581592db-9e10-4a98-a03d-598ce54b0c74-operator-scripts\") pod \"nova-cell0-b684-account-create-update-7n56j\" (UID: \"581592db-9e10-4a98-a03d-598ce54b0c74\") " pod="openstack/nova-cell0-b684-account-create-update-7n56j" Nov 27 16:21:49 crc kubenswrapper[4707]: I1127 16:21:49.531275 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mdwf5\" (UniqueName: \"kubernetes.io/projected/581592db-9e10-4a98-a03d-598ce54b0c74-kube-api-access-mdwf5\") pod \"nova-cell0-b684-account-create-update-7n56j\" (UID: \"581592db-9e10-4a98-a03d-598ce54b0c74\") " pod="openstack/nova-cell0-b684-account-create-update-7n56j" Nov 27 16:21:49 crc kubenswrapper[4707]: I1127 16:21:49.531302 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c33b6312-639a-429c-88ae-5c60ec56280c-operator-scripts\") pod \"nova-api-54ea-account-create-update-lm8hw\" (UID: \"c33b6312-639a-429c-88ae-5c60ec56280c\") " pod="openstack/nova-api-54ea-account-create-update-lm8hw" Nov 27 16:21:49 crc kubenswrapper[4707]: I1127 16:21:49.531401 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/3ca24dad-8ffd-41b5-9379-b05c90193e9e-operator-scripts\") pod \"nova-cell0-db-create-rz6lr\" (UID: \"3ca24dad-8ffd-41b5-9379-b05c90193e9e\") " pod="openstack/nova-cell0-db-create-rz6lr" Nov 27 16:21:49 crc kubenswrapper[4707]: I1127 16:21:49.531439 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fsq7s\" (UniqueName: \"kubernetes.io/projected/c33b6312-639a-429c-88ae-5c60ec56280c-kube-api-access-fsq7s\") pod \"nova-api-54ea-account-create-update-lm8hw\" (UID: \"c33b6312-639a-429c-88ae-5c60ec56280c\") " pod="openstack/nova-api-54ea-account-create-update-lm8hw" Nov 27 16:21:49 crc kubenswrapper[4707]: I1127 16:21:49.531466 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ktss8\" (UniqueName: \"kubernetes.io/projected/54b3a42f-b2e8-46ae-b500-3b2de0b501c7-kube-api-access-ktss8\") pod \"nova-cell1-db-create-tmmj9\" (UID: \"54b3a42f-b2e8-46ae-b500-3b2de0b501c7\") " pod="openstack/nova-cell1-db-create-tmmj9" Nov 27 16:21:49 crc kubenswrapper[4707]: I1127 16:21:49.531492 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5tvwt\" (UniqueName: \"kubernetes.io/projected/3ca24dad-8ffd-41b5-9379-b05c90193e9e-kube-api-access-5tvwt\") pod \"nova-cell0-db-create-rz6lr\" (UID: \"3ca24dad-8ffd-41b5-9379-b05c90193e9e\") " pod="openstack/nova-cell0-db-create-rz6lr" Nov 27 16:21:49 crc kubenswrapper[4707]: I1127 16:21:49.533098 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c33b6312-639a-429c-88ae-5c60ec56280c-operator-scripts\") pod \"nova-api-54ea-account-create-update-lm8hw\" (UID: \"c33b6312-639a-429c-88ae-5c60ec56280c\") " pod="openstack/nova-api-54ea-account-create-update-lm8hw" Nov 27 16:21:49 crc kubenswrapper[4707]: I1127 16:21:49.533502 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3ca24dad-8ffd-41b5-9379-b05c90193e9e-operator-scripts\") pod \"nova-cell0-db-create-rz6lr\" (UID: \"3ca24dad-8ffd-41b5-9379-b05c90193e9e\") " pod="openstack/nova-cell0-db-create-rz6lr" Nov 27 16:21:49 crc kubenswrapper[4707]: I1127 16:21:49.549993 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5tvwt\" (UniqueName: \"kubernetes.io/projected/3ca24dad-8ffd-41b5-9379-b05c90193e9e-kube-api-access-5tvwt\") pod \"nova-cell0-db-create-rz6lr\" (UID: \"3ca24dad-8ffd-41b5-9379-b05c90193e9e\") " pod="openstack/nova-cell0-db-create-rz6lr" Nov 27 16:21:49 crc kubenswrapper[4707]: I1127 16:21:49.550797 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fsq7s\" (UniqueName: \"kubernetes.io/projected/c33b6312-639a-429c-88ae-5c60ec56280c-kube-api-access-fsq7s\") pod \"nova-api-54ea-account-create-update-lm8hw\" (UID: \"c33b6312-639a-429c-88ae-5c60ec56280c\") " pod="openstack/nova-api-54ea-account-create-update-lm8hw" Nov 27 16:21:49 crc kubenswrapper[4707]: I1127 16:21:49.613669 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-rz6lr" Nov 27 16:21:49 crc kubenswrapper[4707]: I1127 16:21:49.632695 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ktss8\" (UniqueName: \"kubernetes.io/projected/54b3a42f-b2e8-46ae-b500-3b2de0b501c7-kube-api-access-ktss8\") pod \"nova-cell1-db-create-tmmj9\" (UID: \"54b3a42f-b2e8-46ae-b500-3b2de0b501c7\") " pod="openstack/nova-cell1-db-create-tmmj9" Nov 27 16:21:49 crc kubenswrapper[4707]: I1127 16:21:49.632746 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/54b3a42f-b2e8-46ae-b500-3b2de0b501c7-operator-scripts\") pod \"nova-cell1-db-create-tmmj9\" (UID: \"54b3a42f-b2e8-46ae-b500-3b2de0b501c7\") " pod="openstack/nova-cell1-db-create-tmmj9" Nov 27 16:21:49 crc kubenswrapper[4707]: I1127 16:21:49.632795 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/581592db-9e10-4a98-a03d-598ce54b0c74-operator-scripts\") pod \"nova-cell0-b684-account-create-update-7n56j\" (UID: \"581592db-9e10-4a98-a03d-598ce54b0c74\") " pod="openstack/nova-cell0-b684-account-create-update-7n56j" Nov 27 16:21:49 crc kubenswrapper[4707]: I1127 16:21:49.632839 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mdwf5\" (UniqueName: \"kubernetes.io/projected/581592db-9e10-4a98-a03d-598ce54b0c74-kube-api-access-mdwf5\") pod \"nova-cell0-b684-account-create-update-7n56j\" (UID: \"581592db-9e10-4a98-a03d-598ce54b0c74\") " pod="openstack/nova-cell0-b684-account-create-update-7n56j" Nov 27 16:21:49 crc kubenswrapper[4707]: I1127 16:21:49.634095 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/54b3a42f-b2e8-46ae-b500-3b2de0b501c7-operator-scripts\") pod \"nova-cell1-db-create-tmmj9\" 
(UID: \"54b3a42f-b2e8-46ae-b500-3b2de0b501c7\") " pod="openstack/nova-cell1-db-create-tmmj9" Nov 27 16:21:49 crc kubenswrapper[4707]: I1127 16:21:49.635021 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/581592db-9e10-4a98-a03d-598ce54b0c74-operator-scripts\") pod \"nova-cell0-b684-account-create-update-7n56j\" (UID: \"581592db-9e10-4a98-a03d-598ce54b0c74\") " pod="openstack/nova-cell0-b684-account-create-update-7n56j" Nov 27 16:21:49 crc kubenswrapper[4707]: I1127 16:21:49.650081 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ktss8\" (UniqueName: \"kubernetes.io/projected/54b3a42f-b2e8-46ae-b500-3b2de0b501c7-kube-api-access-ktss8\") pod \"nova-cell1-db-create-tmmj9\" (UID: \"54b3a42f-b2e8-46ae-b500-3b2de0b501c7\") " pod="openstack/nova-cell1-db-create-tmmj9" Nov 27 16:21:49 crc kubenswrapper[4707]: I1127 16:21:49.672438 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mdwf5\" (UniqueName: \"kubernetes.io/projected/581592db-9e10-4a98-a03d-598ce54b0c74-kube-api-access-mdwf5\") pod \"nova-cell0-b684-account-create-update-7n56j\" (UID: \"581592db-9e10-4a98-a03d-598ce54b0c74\") " pod="openstack/nova-cell0-b684-account-create-update-7n56j" Nov 27 16:21:49 crc kubenswrapper[4707]: I1127 16:21:49.691460 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-9fba-account-create-update-bb8t5"] Nov 27 16:21:49 crc kubenswrapper[4707]: I1127 16:21:49.694115 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-9fba-account-create-update-bb8t5" Nov 27 16:21:49 crc kubenswrapper[4707]: I1127 16:21:49.696130 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-db-secret" Nov 27 16:21:49 crc kubenswrapper[4707]: I1127 16:21:49.710748 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-9fba-account-create-update-bb8t5"] Nov 27 16:21:49 crc kubenswrapper[4707]: I1127 16:21:49.736453 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jlnmk\" (UniqueName: \"kubernetes.io/projected/b6a68c0c-cdcb-4e30-bb21-051e9bdbf395-kube-api-access-jlnmk\") pod \"nova-cell1-9fba-account-create-update-bb8t5\" (UID: \"b6a68c0c-cdcb-4e30-bb21-051e9bdbf395\") " pod="openstack/nova-cell1-9fba-account-create-update-bb8t5" Nov 27 16:21:49 crc kubenswrapper[4707]: I1127 16:21:49.736527 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b6a68c0c-cdcb-4e30-bb21-051e9bdbf395-operator-scripts\") pod \"nova-cell1-9fba-account-create-update-bb8t5\" (UID: \"b6a68c0c-cdcb-4e30-bb21-051e9bdbf395\") " pod="openstack/nova-cell1-9fba-account-create-update-bb8t5" Nov 27 16:21:49 crc kubenswrapper[4707]: I1127 16:21:49.828031 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-tmmj9" Nov 27 16:21:49 crc kubenswrapper[4707]: I1127 16:21:49.832503 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-54ea-account-create-update-lm8hw" Nov 27 16:21:49 crc kubenswrapper[4707]: I1127 16:21:49.840127 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jlnmk\" (UniqueName: \"kubernetes.io/projected/b6a68c0c-cdcb-4e30-bb21-051e9bdbf395-kube-api-access-jlnmk\") pod \"nova-cell1-9fba-account-create-update-bb8t5\" (UID: \"b6a68c0c-cdcb-4e30-bb21-051e9bdbf395\") " pod="openstack/nova-cell1-9fba-account-create-update-bb8t5" Nov 27 16:21:49 crc kubenswrapper[4707]: I1127 16:21:49.840184 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b6a68c0c-cdcb-4e30-bb21-051e9bdbf395-operator-scripts\") pod \"nova-cell1-9fba-account-create-update-bb8t5\" (UID: \"b6a68c0c-cdcb-4e30-bb21-051e9bdbf395\") " pod="openstack/nova-cell1-9fba-account-create-update-bb8t5" Nov 27 16:21:49 crc kubenswrapper[4707]: I1127 16:21:49.840995 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b6a68c0c-cdcb-4e30-bb21-051e9bdbf395-operator-scripts\") pod \"nova-cell1-9fba-account-create-update-bb8t5\" (UID: \"b6a68c0c-cdcb-4e30-bb21-051e9bdbf395\") " pod="openstack/nova-cell1-9fba-account-create-update-bb8t5" Nov 27 16:21:49 crc kubenswrapper[4707]: I1127 16:21:49.845300 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-d77877bbc-l8p9h" event={"ID":"917b521a-96db-4475-bf1c-af43a99c67f1","Type":"ContainerStarted","Data":"2a13cf3e3db33c80b734354a2c4c3dd9ee4ee070ce62d3d7da06d9224afd406b"} Nov 27 16:21:49 crc kubenswrapper[4707]: I1127 16:21:49.846662 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-cfnapi-d77877bbc-l8p9h" Nov 27 16:21:49 crc kubenswrapper[4707]: I1127 16:21:49.852678 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/heat-api-7db44cddd-l4rvd" event={"ID":"75326248-4957-4086-ad33-0a8c76ec7ff5","Type":"ContainerStarted","Data":"f6ed0b65fd1bb3f7cfed1acf04eae96d1456431c717f0f39cdfe7f83243b2e4d"} Nov 27 16:21:49 crc kubenswrapper[4707]: I1127 16:21:49.852776 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/heat-api-7db44cddd-l4rvd" podUID="75326248-4957-4086-ad33-0a8c76ec7ff5" containerName="heat-api" containerID="cri-o://f6ed0b65fd1bb3f7cfed1acf04eae96d1456431c717f0f39cdfe7f83243b2e4d" gracePeriod=60 Nov 27 16:21:49 crc kubenswrapper[4707]: I1127 16:21:49.852957 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-api-7db44cddd-l4rvd" Nov 27 16:21:49 crc kubenswrapper[4707]: I1127 16:21:49.861438 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c9776ccc5-t8jp6" Nov 27 16:21:49 crc kubenswrapper[4707]: I1127 16:21:49.861689 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ca422061-61e0-4b99-91ed-adb502839a46","Type":"ContainerStarted","Data":"5a943b848246e7a0e26df29e9e5872c0e4b1e43f0a4c8ae9352314222b2203bb"} Nov 27 16:21:49 crc kubenswrapper[4707]: I1127 16:21:49.865435 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-b684-account-create-update-7n56j" Nov 27 16:21:49 crc kubenswrapper[4707]: I1127 16:21:49.874660 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jlnmk\" (UniqueName: \"kubernetes.io/projected/b6a68c0c-cdcb-4e30-bb21-051e9bdbf395-kube-api-access-jlnmk\") pod \"nova-cell1-9fba-account-create-update-bb8t5\" (UID: \"b6a68c0c-cdcb-4e30-bb21-051e9bdbf395\") " pod="openstack/nova-cell1-9fba-account-create-update-bb8t5" Nov 27 16:21:49 crc kubenswrapper[4707]: I1127 16:21:49.880590 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-9fba-account-create-update-bb8t5" Nov 27 16:21:49 crc kubenswrapper[4707]: I1127 16:21:49.882770 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-cfnapi-d77877bbc-l8p9h" podStartSLOduration=1.964655163 podStartE2EDuration="5.882760302s" podCreationTimestamp="2025-11-27 16:21:44 +0000 UTC" firstStartedPulling="2025-11-27 16:21:45.120292822 +0000 UTC m=+1080.751741590" lastFinishedPulling="2025-11-27 16:21:49.038397961 +0000 UTC m=+1084.669846729" observedRunningTime="2025-11-27 16:21:49.866968903 +0000 UTC m=+1085.498417671" watchObservedRunningTime="2025-11-27 16:21:49.882760302 +0000 UTC m=+1085.514209070" Nov 27 16:21:49 crc kubenswrapper[4707]: I1127 16:21:49.899432 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-api-7db44cddd-l4rvd" podStartSLOduration=6.728582865 podStartE2EDuration="12.899409092s" podCreationTimestamp="2025-11-27 16:21:37 +0000 UTC" firstStartedPulling="2025-11-27 16:21:42.752609781 +0000 UTC m=+1078.384058549" lastFinishedPulling="2025-11-27 16:21:48.923436008 +0000 UTC m=+1084.554884776" observedRunningTime="2025-11-27 16:21:49.891619105 +0000 UTC m=+1085.523067873" watchObservedRunningTime="2025-11-27 16:21:49.899409092 +0000 UTC m=+1085.530857850" Nov 27 16:21:49 crc kubenswrapper[4707]: I1127 16:21:49.921243 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-t8jp6"] Nov 27 16:21:49 crc kubenswrapper[4707]: I1127 16:21:49.970522 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-t8jp6"] Nov 27 16:21:50 crc kubenswrapper[4707]: I1127 16:21:50.073736 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-qd5lp"] Nov 27 16:21:50 crc kubenswrapper[4707]: I1127 16:21:50.271949 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-rz6lr"] Nov 27 16:21:50 
crc kubenswrapper[4707]: I1127 16:21:50.562258 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-b684-account-create-update-7n56j"] Nov 27 16:21:50 crc kubenswrapper[4707]: I1127 16:21:50.630825 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-tmmj9"] Nov 27 16:21:50 crc kubenswrapper[4707]: I1127 16:21:50.811470 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-54ea-account-create-update-lm8hw"] Nov 27 16:21:50 crc kubenswrapper[4707]: I1127 16:21:50.821390 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-9fba-account-create-update-bb8t5"] Nov 27 16:21:50 crc kubenswrapper[4707]: I1127 16:21:50.879386 4707 generic.go:334] "Generic (PLEG): container finished" podID="917b521a-96db-4475-bf1c-af43a99c67f1" containerID="2a13cf3e3db33c80b734354a2c4c3dd9ee4ee070ce62d3d7da06d9224afd406b" exitCode=1 Nov 27 16:21:50 crc kubenswrapper[4707]: I1127 16:21:50.879442 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-d77877bbc-l8p9h" event={"ID":"917b521a-96db-4475-bf1c-af43a99c67f1","Type":"ContainerDied","Data":"2a13cf3e3db33c80b734354a2c4c3dd9ee4ee070ce62d3d7da06d9224afd406b"} Nov 27 16:21:50 crc kubenswrapper[4707]: I1127 16:21:50.880941 4707 scope.go:117] "RemoveContainer" containerID="2a13cf3e3db33c80b734354a2c4c3dd9ee4ee070ce62d3d7da06d9224afd406b" Nov 27 16:21:50 crc kubenswrapper[4707]: I1127 16:21:50.890586 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-qd5lp" event={"ID":"63ce055f-5c4d-43ae-895c-4632afdacd87","Type":"ContainerStarted","Data":"70748516c6edb297226908178a5f41325b0eaeb83dcfddeedf8cd3e0fe7c91d6"} Nov 27 16:21:50 crc kubenswrapper[4707]: I1127 16:21:50.890642 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-qd5lp" 
event={"ID":"63ce055f-5c4d-43ae-895c-4632afdacd87","Type":"ContainerStarted","Data":"cc3e71a8e207726cdddf36a71d99de7599f67c8ea362188ff8cb1c181aa510c1"} Nov 27 16:21:50 crc kubenswrapper[4707]: I1127 16:21:50.896696 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-57bc8fcfc9-trdbf" event={"ID":"c001365b-7c18-4d58-b516-a038ef2d6c8c","Type":"ContainerStarted","Data":"76201bf01e7cff16c4c7c6677c3cdc295210894db72da91cd8f68347065876d8"} Nov 27 16:21:50 crc kubenswrapper[4707]: I1127 16:21:50.896796 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-api-57bc8fcfc9-trdbf" Nov 27 16:21:50 crc kubenswrapper[4707]: I1127 16:21:50.898504 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-tmmj9" event={"ID":"54b3a42f-b2e8-46ae-b500-3b2de0b501c7","Type":"ContainerStarted","Data":"e6b3d28f786ae9b54f7b4880429c44aed08634574c221855455fc392a23e1285"} Nov 27 16:21:50 crc kubenswrapper[4707]: I1127 16:21:50.902452 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-rz6lr" event={"ID":"3ca24dad-8ffd-41b5-9379-b05c90193e9e","Type":"ContainerStarted","Data":"18aced7ffc8c3afdb0f5d4df0d9b04e688a1696a570ff8576f5c3b4fe2b0968f"} Nov 27 16:21:50 crc kubenswrapper[4707]: I1127 16:21:50.902515 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-rz6lr" event={"ID":"3ca24dad-8ffd-41b5-9379-b05c90193e9e","Type":"ContainerStarted","Data":"f70b864392411ee8cff428c2526e1fcac85cf5fae9bcd2cabddb164e32a44e21"} Nov 27 16:21:50 crc kubenswrapper[4707]: I1127 16:21:50.907522 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-9fba-account-create-update-bb8t5" event={"ID":"b6a68c0c-cdcb-4e30-bb21-051e9bdbf395","Type":"ContainerStarted","Data":"918a566a2bc035ddf7b0f93e591bb93ed154bb1b3cb182b0a59d1f650b513233"} Nov 27 16:21:50 crc kubenswrapper[4707]: I1127 16:21:50.909145 4707 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-b684-account-create-update-7n56j" event={"ID":"581592db-9e10-4a98-a03d-598ce54b0c74","Type":"ContainerStarted","Data":"0c78140bf870abb98eb2d97dbdff01a25d54b67d31dc5517a9b58857afa407a0"} Nov 27 16:21:50 crc kubenswrapper[4707]: I1127 16:21:50.911429 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-db-create-qd5lp" podStartSLOduration=1.9114032619999999 podStartE2EDuration="1.911403262s" podCreationTimestamp="2025-11-27 16:21:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-27 16:21:50.904884666 +0000 UTC m=+1086.536333434" watchObservedRunningTime="2025-11-27 16:21:50.911403262 +0000 UTC m=+1086.542852050" Nov 27 16:21:50 crc kubenswrapper[4707]: I1127 16:21:50.915839 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ca422061-61e0-4b99-91ed-adb502839a46","Type":"ContainerStarted","Data":"e557ec3cc44b41de2c50306eb79513fee36c41d79181960ecae36b25ee171e6d"} Nov 27 16:21:50 crc kubenswrapper[4707]: I1127 16:21:50.918347 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-5c4f76f9fb-ghh99" event={"ID":"7646872e-82fb-4df8-b7ce-176b3ba7fe8a","Type":"ContainerStarted","Data":"ed452cfb9156da81aee11ba3ae7ccb27ef93675628cdb5ffa72365e36c6324d3"} Nov 27 16:21:50 crc kubenswrapper[4707]: I1127 16:21:50.919284 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-cfnapi-5c4f76f9fb-ghh99" Nov 27 16:21:50 crc kubenswrapper[4707]: I1127 16:21:50.920480 4707 generic.go:334] "Generic (PLEG): container finished" podID="8314a6db-7b5b-4ad9-82d3-1e67ebbb9c1a" containerID="e4dc9daccd6f1ebc55ae9dc15f480da51d657fda9cb0f48f1273dd5d626815b5" exitCode=1 Nov 27 16:21:50 crc kubenswrapper[4707]: I1127 16:21:50.920558 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/heat-api-5f6bc5c6bb-tzktc" event={"ID":"8314a6db-7b5b-4ad9-82d3-1e67ebbb9c1a","Type":"ContainerDied","Data":"e4dc9daccd6f1ebc55ae9dc15f480da51d657fda9cb0f48f1273dd5d626815b5"} Nov 27 16:21:50 crc kubenswrapper[4707]: I1127 16:21:50.921184 4707 scope.go:117] "RemoveContainer" containerID="e4dc9daccd6f1ebc55ae9dc15f480da51d657fda9cb0f48f1273dd5d626815b5" Nov 27 16:21:50 crc kubenswrapper[4707]: I1127 16:21:50.922473 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-5584d95bf4-c6hwk" event={"ID":"2e50d632-6bfc-48aa-ab32-f0a05105b482","Type":"ContainerStarted","Data":"415e1c7b4663ae6066b11a40ba296ccb5380484981bdfeebee8d346711c41d3c"} Nov 27 16:21:50 crc kubenswrapper[4707]: I1127 16:21:50.926496 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/heat-cfnapi-5584d95bf4-c6hwk" podUID="2e50d632-6bfc-48aa-ab32-f0a05105b482" containerName="heat-cfnapi" containerID="cri-o://415e1c7b4663ae6066b11a40ba296ccb5380484981bdfeebee8d346711c41d3c" gracePeriod=60 Nov 27 16:21:50 crc kubenswrapper[4707]: I1127 16:21:50.926656 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-cfnapi-5584d95bf4-c6hwk" Nov 27 16:21:50 crc kubenswrapper[4707]: I1127 16:21:50.931631 4707 generic.go:334] "Generic (PLEG): container finished" podID="75326248-4957-4086-ad33-0a8c76ec7ff5" containerID="f6ed0b65fd1bb3f7cfed1acf04eae96d1456431c717f0f39cdfe7f83243b2e4d" exitCode=0 Nov 27 16:21:50 crc kubenswrapper[4707]: I1127 16:21:50.931687 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-7db44cddd-l4rvd" event={"ID":"75326248-4957-4086-ad33-0a8c76ec7ff5","Type":"ContainerDied","Data":"f6ed0b65fd1bb3f7cfed1acf04eae96d1456431c717f0f39cdfe7f83243b2e4d"} Nov 27 16:21:50 crc kubenswrapper[4707]: I1127 16:21:50.931707 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-7db44cddd-l4rvd" 
event={"ID":"75326248-4957-4086-ad33-0a8c76ec7ff5","Type":"ContainerDied","Data":"e0f3a4859e733cdff613c48694b31901fc474c899362269725bb6692014f31d2"} Nov 27 16:21:50 crc kubenswrapper[4707]: I1127 16:21:50.931718 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e0f3a4859e733cdff613c48694b31901fc474c899362269725bb6692014f31d2" Nov 27 16:21:50 crc kubenswrapper[4707]: I1127 16:21:50.934235 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-db-create-rz6lr" podStartSLOduration=1.934213471 podStartE2EDuration="1.934213471s" podCreationTimestamp="2025-11-27 16:21:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-27 16:21:50.916557466 +0000 UTC m=+1086.548006234" watchObservedRunningTime="2025-11-27 16:21:50.934213471 +0000 UTC m=+1086.565662229" Nov 27 16:21:50 crc kubenswrapper[4707]: I1127 16:21:50.955346 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-54ea-account-create-update-lm8hw" event={"ID":"c33b6312-639a-429c-88ae-5c60ec56280c","Type":"ContainerStarted","Data":"825110384d22f48a32cf0d330dcec275cf833daaeb52a9172dd90f760d3bc56c"} Nov 27 16:21:50 crc kubenswrapper[4707]: I1127 16:21:50.959770 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-api-7db44cddd-l4rvd" Nov 27 16:21:50 crc kubenswrapper[4707]: I1127 16:21:50.973114 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-api-57bc8fcfc9-trdbf" podStartSLOduration=3.988239065 podStartE2EDuration="5.973091935s" podCreationTimestamp="2025-11-27 16:21:45 +0000 UTC" firstStartedPulling="2025-11-27 16:21:47.191708111 +0000 UTC m=+1082.823156909" lastFinishedPulling="2025-11-27 16:21:49.176561011 +0000 UTC m=+1084.808009779" observedRunningTime="2025-11-27 16:21:50.946066426 +0000 UTC m=+1086.577515194" watchObservedRunningTime="2025-11-27 16:21:50.973091935 +0000 UTC m=+1086.604540703" Nov 27 16:21:50 crc kubenswrapper[4707]: I1127 16:21:50.995396 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-cfnapi-5c4f76f9fb-ghh99" podStartSLOduration=4.12078608 podStartE2EDuration="5.99536212s" podCreationTimestamp="2025-11-27 16:21:45 +0000 UTC" firstStartedPulling="2025-11-27 16:21:47.191697081 +0000 UTC m=+1082.823145889" lastFinishedPulling="2025-11-27 16:21:49.066273171 +0000 UTC m=+1084.697721929" observedRunningTime="2025-11-27 16:21:50.993345442 +0000 UTC m=+1086.624794220" watchObservedRunningTime="2025-11-27 16:21:50.99536212 +0000 UTC m=+1086.626810888" Nov 27 16:21:51 crc kubenswrapper[4707]: I1127 16:21:51.020403 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-cfnapi-5584d95bf4-c6hwk" podStartSLOduration=7.951698199 podStartE2EDuration="14.020362221s" podCreationTimestamp="2025-11-27 16:21:37 +0000 UTC" firstStartedPulling="2025-11-27 16:21:42.957395873 +0000 UTC m=+1078.588844641" lastFinishedPulling="2025-11-27 16:21:49.026059895 +0000 UTC m=+1084.657508663" observedRunningTime="2025-11-27 16:21:51.010507874 +0000 UTC m=+1086.641956642" watchObservedRunningTime="2025-11-27 16:21:51.020362221 +0000 UTC m=+1086.651810989" Nov 27 16:21:51 crc kubenswrapper[4707]: I1127 
16:21:51.104298 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/75326248-4957-4086-ad33-0a8c76ec7ff5-config-data-custom\") pod \"75326248-4957-4086-ad33-0a8c76ec7ff5\" (UID: \"75326248-4957-4086-ad33-0a8c76ec7ff5\") " Nov 27 16:21:51 crc kubenswrapper[4707]: I1127 16:21:51.104390 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/75326248-4957-4086-ad33-0a8c76ec7ff5-combined-ca-bundle\") pod \"75326248-4957-4086-ad33-0a8c76ec7ff5\" (UID: \"75326248-4957-4086-ad33-0a8c76ec7ff5\") " Nov 27 16:21:51 crc kubenswrapper[4707]: I1127 16:21:51.104494 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-42q7j\" (UniqueName: \"kubernetes.io/projected/75326248-4957-4086-ad33-0a8c76ec7ff5-kube-api-access-42q7j\") pod \"75326248-4957-4086-ad33-0a8c76ec7ff5\" (UID: \"75326248-4957-4086-ad33-0a8c76ec7ff5\") " Nov 27 16:21:51 crc kubenswrapper[4707]: I1127 16:21:51.104560 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/75326248-4957-4086-ad33-0a8c76ec7ff5-config-data\") pod \"75326248-4957-4086-ad33-0a8c76ec7ff5\" (UID: \"75326248-4957-4086-ad33-0a8c76ec7ff5\") " Nov 27 16:21:51 crc kubenswrapper[4707]: I1127 16:21:51.130712 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/75326248-4957-4086-ad33-0a8c76ec7ff5-kube-api-access-42q7j" (OuterVolumeSpecName: "kube-api-access-42q7j") pod "75326248-4957-4086-ad33-0a8c76ec7ff5" (UID: "75326248-4957-4086-ad33-0a8c76ec7ff5"). InnerVolumeSpecName "kube-api-access-42q7j". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 16:21:51 crc kubenswrapper[4707]: I1127 16:21:51.131009 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/75326248-4957-4086-ad33-0a8c76ec7ff5-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "75326248-4957-4086-ad33-0a8c76ec7ff5" (UID: "75326248-4957-4086-ad33-0a8c76ec7ff5"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 16:21:51 crc kubenswrapper[4707]: I1127 16:21:51.206572 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3bc190a4-bb7f-4b8a-acc4-e201fd1d495d" path="/var/lib/kubelet/pods/3bc190a4-bb7f-4b8a-acc4-e201fd1d495d/volumes" Nov 27 16:21:51 crc kubenswrapper[4707]: I1127 16:21:51.208558 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-42q7j\" (UniqueName: \"kubernetes.io/projected/75326248-4957-4086-ad33-0a8c76ec7ff5-kube-api-access-42q7j\") on node \"crc\" DevicePath \"\"" Nov 27 16:21:51 crc kubenswrapper[4707]: I1127 16:21:51.208592 4707 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/75326248-4957-4086-ad33-0a8c76ec7ff5-config-data-custom\") on node \"crc\" DevicePath \"\"" Nov 27 16:21:51 crc kubenswrapper[4707]: I1127 16:21:51.362752 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/75326248-4957-4086-ad33-0a8c76ec7ff5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "75326248-4957-4086-ad33-0a8c76ec7ff5" (UID: "75326248-4957-4086-ad33-0a8c76ec7ff5"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 16:21:51 crc kubenswrapper[4707]: I1127 16:21:51.412510 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/75326248-4957-4086-ad33-0a8c76ec7ff5-config-data" (OuterVolumeSpecName: "config-data") pod "75326248-4957-4086-ad33-0a8c76ec7ff5" (UID: "75326248-4957-4086-ad33-0a8c76ec7ff5"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 16:21:51 crc kubenswrapper[4707]: I1127 16:21:51.412812 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/75326248-4957-4086-ad33-0a8c76ec7ff5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 27 16:21:51 crc kubenswrapper[4707]: I1127 16:21:51.412896 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/75326248-4957-4086-ad33-0a8c76ec7ff5-config-data\") on node \"crc\" DevicePath \"\"" Nov 27 16:21:51 crc kubenswrapper[4707]: I1127 16:21:51.634860 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-cfnapi-5584d95bf4-c6hwk" Nov 27 16:21:51 crc kubenswrapper[4707]: I1127 16:21:51.740122 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2e50d632-6bfc-48aa-ab32-f0a05105b482-combined-ca-bundle\") pod \"2e50d632-6bfc-48aa-ab32-f0a05105b482\" (UID: \"2e50d632-6bfc-48aa-ab32-f0a05105b482\") " Nov 27 16:21:51 crc kubenswrapper[4707]: I1127 16:21:51.740563 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2e50d632-6bfc-48aa-ab32-f0a05105b482-config-data-custom\") pod \"2e50d632-6bfc-48aa-ab32-f0a05105b482\" (UID: \"2e50d632-6bfc-48aa-ab32-f0a05105b482\") " Nov 27 16:21:51 crc kubenswrapper[4707]: I1127 16:21:51.740605 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7rvlp\" (UniqueName: \"kubernetes.io/projected/2e50d632-6bfc-48aa-ab32-f0a05105b482-kube-api-access-7rvlp\") pod \"2e50d632-6bfc-48aa-ab32-f0a05105b482\" (UID: \"2e50d632-6bfc-48aa-ab32-f0a05105b482\") " Nov 27 16:21:51 crc kubenswrapper[4707]: I1127 16:21:51.740832 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2e50d632-6bfc-48aa-ab32-f0a05105b482-config-data\") pod \"2e50d632-6bfc-48aa-ab32-f0a05105b482\" (UID: \"2e50d632-6bfc-48aa-ab32-f0a05105b482\") " Nov 27 16:21:51 crc kubenswrapper[4707]: I1127 16:21:51.778599 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2e50d632-6bfc-48aa-ab32-f0a05105b482-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "2e50d632-6bfc-48aa-ab32-f0a05105b482" (UID: "2e50d632-6bfc-48aa-ab32-f0a05105b482"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 16:21:51 crc kubenswrapper[4707]: I1127 16:21:51.778801 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2e50d632-6bfc-48aa-ab32-f0a05105b482-kube-api-access-7rvlp" (OuterVolumeSpecName: "kube-api-access-7rvlp") pod "2e50d632-6bfc-48aa-ab32-f0a05105b482" (UID: "2e50d632-6bfc-48aa-ab32-f0a05105b482"). InnerVolumeSpecName "kube-api-access-7rvlp". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 16:21:51 crc kubenswrapper[4707]: I1127 16:21:51.781579 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2e50d632-6bfc-48aa-ab32-f0a05105b482-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2e50d632-6bfc-48aa-ab32-f0a05105b482" (UID: "2e50d632-6bfc-48aa-ab32-f0a05105b482"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 16:21:51 crc kubenswrapper[4707]: I1127 16:21:51.843092 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2e50d632-6bfc-48aa-ab32-f0a05105b482-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 27 16:21:51 crc kubenswrapper[4707]: I1127 16:21:51.843117 4707 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2e50d632-6bfc-48aa-ab32-f0a05105b482-config-data-custom\") on node \"crc\" DevicePath \"\"" Nov 27 16:21:51 crc kubenswrapper[4707]: I1127 16:21:51.843126 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7rvlp\" (UniqueName: \"kubernetes.io/projected/2e50d632-6bfc-48aa-ab32-f0a05105b482-kube-api-access-7rvlp\") on node \"crc\" DevicePath \"\"" Nov 27 16:21:51 crc kubenswrapper[4707]: I1127 16:21:51.871656 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/2e50d632-6bfc-48aa-ab32-f0a05105b482-config-data" (OuterVolumeSpecName: "config-data") pod "2e50d632-6bfc-48aa-ab32-f0a05105b482" (UID: "2e50d632-6bfc-48aa-ab32-f0a05105b482"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 16:21:51 crc kubenswrapper[4707]: I1127 16:21:51.945446 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2e50d632-6bfc-48aa-ab32-f0a05105b482-config-data\") on node \"crc\" DevicePath \"\"" Nov 27 16:21:51 crc kubenswrapper[4707]: I1127 16:21:51.966749 4707 generic.go:334] "Generic (PLEG): container finished" podID="581592db-9e10-4a98-a03d-598ce54b0c74" containerID="cf31e44bf7ed767adb9470ab61b062790ba67711c1575d013ef009d2a36f11b1" exitCode=0 Nov 27 16:21:51 crc kubenswrapper[4707]: I1127 16:21:51.966822 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-b684-account-create-update-7n56j" event={"ID":"581592db-9e10-4a98-a03d-598ce54b0c74","Type":"ContainerDied","Data":"cf31e44bf7ed767adb9470ab61b062790ba67711c1575d013ef009d2a36f11b1"} Nov 27 16:21:51 crc kubenswrapper[4707]: I1127 16:21:51.968552 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ca422061-61e0-4b99-91ed-adb502839a46","Type":"ContainerStarted","Data":"285c89a886d69335891e6811cacb8bbd0ea29959a376355390ef8acfbeb10586"} Nov 27 16:21:51 crc kubenswrapper[4707]: I1127 16:21:51.970945 4707 generic.go:334] "Generic (PLEG): container finished" podID="8314a6db-7b5b-4ad9-82d3-1e67ebbb9c1a" containerID="f59b00ce961883b147cbc749460a6f9f27775445cd8d192484e6da565de13ce3" exitCode=1 Nov 27 16:21:51 crc kubenswrapper[4707]: I1127 16:21:51.971014 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-5f6bc5c6bb-tzktc" event={"ID":"8314a6db-7b5b-4ad9-82d3-1e67ebbb9c1a","Type":"ContainerDied","Data":"f59b00ce961883b147cbc749460a6f9f27775445cd8d192484e6da565de13ce3"} Nov 27 
16:21:51 crc kubenswrapper[4707]: I1127 16:21:51.971049 4707 scope.go:117] "RemoveContainer" containerID="e4dc9daccd6f1ebc55ae9dc15f480da51d657fda9cb0f48f1273dd5d626815b5" Nov 27 16:21:51 crc kubenswrapper[4707]: I1127 16:21:51.971815 4707 scope.go:117] "RemoveContainer" containerID="f59b00ce961883b147cbc749460a6f9f27775445cd8d192484e6da565de13ce3" Nov 27 16:21:51 crc kubenswrapper[4707]: E1127 16:21:51.972133 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-api\" with CrashLoopBackOff: \"back-off 10s restarting failed container=heat-api pod=heat-api-5f6bc5c6bb-tzktc_openstack(8314a6db-7b5b-4ad9-82d3-1e67ebbb9c1a)\"" pod="openstack/heat-api-5f6bc5c6bb-tzktc" podUID="8314a6db-7b5b-4ad9-82d3-1e67ebbb9c1a" Nov 27 16:21:51 crc kubenswrapper[4707]: I1127 16:21:51.979002 4707 generic.go:334] "Generic (PLEG): container finished" podID="63ce055f-5c4d-43ae-895c-4632afdacd87" containerID="70748516c6edb297226908178a5f41325b0eaeb83dcfddeedf8cd3e0fe7c91d6" exitCode=0 Nov 27 16:21:51 crc kubenswrapper[4707]: I1127 16:21:51.979050 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-qd5lp" event={"ID":"63ce055f-5c4d-43ae-895c-4632afdacd87","Type":"ContainerDied","Data":"70748516c6edb297226908178a5f41325b0eaeb83dcfddeedf8cd3e0fe7c91d6"} Nov 27 16:21:51 crc kubenswrapper[4707]: I1127 16:21:51.980915 4707 generic.go:334] "Generic (PLEG): container finished" podID="3ca24dad-8ffd-41b5-9379-b05c90193e9e" containerID="18aced7ffc8c3afdb0f5d4df0d9b04e688a1696a570ff8576f5c3b4fe2b0968f" exitCode=0 Nov 27 16:21:51 crc kubenswrapper[4707]: I1127 16:21:51.980978 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-rz6lr" event={"ID":"3ca24dad-8ffd-41b5-9379-b05c90193e9e","Type":"ContainerDied","Data":"18aced7ffc8c3afdb0f5d4df0d9b04e688a1696a570ff8576f5c3b4fe2b0968f"} Nov 27 16:21:51 crc kubenswrapper[4707]: I1127 16:21:51.982797 4707 generic.go:334] "Generic (PLEG): 
container finished" podID="c33b6312-639a-429c-88ae-5c60ec56280c" containerID="51971a8cac2fe3fe988afa9603a96e0609318f38a80b8c9d39888e823d1f3ea3" exitCode=0 Nov 27 16:21:51 crc kubenswrapper[4707]: I1127 16:21:51.984460 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-54ea-account-create-update-lm8hw" event={"ID":"c33b6312-639a-429c-88ae-5c60ec56280c","Type":"ContainerDied","Data":"51971a8cac2fe3fe988afa9603a96e0609318f38a80b8c9d39888e823d1f3ea3"} Nov 27 16:21:51 crc kubenswrapper[4707]: I1127 16:21:51.991945 4707 generic.go:334] "Generic (PLEG): container finished" podID="b6a68c0c-cdcb-4e30-bb21-051e9bdbf395" containerID="e9df8dd031cc4dac2d9ab317241860be6d935a1ea342b1f2a92a31676e2cf3b1" exitCode=0 Nov 27 16:21:51 crc kubenswrapper[4707]: I1127 16:21:51.992009 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-9fba-account-create-update-bb8t5" event={"ID":"b6a68c0c-cdcb-4e30-bb21-051e9bdbf395","Type":"ContainerDied","Data":"e9df8dd031cc4dac2d9ab317241860be6d935a1ea342b1f2a92a31676e2cf3b1"} Nov 27 16:21:51 crc kubenswrapper[4707]: I1127 16:21:51.994517 4707 generic.go:334] "Generic (PLEG): container finished" podID="54b3a42f-b2e8-46ae-b500-3b2de0b501c7" containerID="ebc874000223527b4eeb598365e5b96ae24efcb1aed3fa9e2f28aca453f514c6" exitCode=0 Nov 27 16:21:51 crc kubenswrapper[4707]: I1127 16:21:51.994602 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-tmmj9" event={"ID":"54b3a42f-b2e8-46ae-b500-3b2de0b501c7","Type":"ContainerDied","Data":"ebc874000223527b4eeb598365e5b96ae24efcb1aed3fa9e2f28aca453f514c6"} Nov 27 16:21:52 crc kubenswrapper[4707]: I1127 16:21:51.999312 4707 generic.go:334] "Generic (PLEG): container finished" podID="917b521a-96db-4475-bf1c-af43a99c67f1" containerID="3779ec49760e720269224277194d444cdf1a0d2b42ee86a1b8c894c5dc8205dd" exitCode=1 Nov 27 16:21:52 crc kubenswrapper[4707]: I1127 16:21:52.000744 4707 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openstack/heat-cfnapi-d77877bbc-l8p9h" event={"ID":"917b521a-96db-4475-bf1c-af43a99c67f1","Type":"ContainerDied","Data":"3779ec49760e720269224277194d444cdf1a0d2b42ee86a1b8c894c5dc8205dd"} Nov 27 16:21:52 crc kubenswrapper[4707]: I1127 16:21:52.001439 4707 scope.go:117] "RemoveContainer" containerID="3779ec49760e720269224277194d444cdf1a0d2b42ee86a1b8c894c5dc8205dd" Nov 27 16:21:52 crc kubenswrapper[4707]: E1127 16:21:52.001726 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-cfnapi\" with CrashLoopBackOff: \"back-off 10s restarting failed container=heat-cfnapi pod=heat-cfnapi-d77877bbc-l8p9h_openstack(917b521a-96db-4475-bf1c-af43a99c67f1)\"" pod="openstack/heat-cfnapi-d77877bbc-l8p9h" podUID="917b521a-96db-4475-bf1c-af43a99c67f1" Nov 27 16:21:52 crc kubenswrapper[4707]: I1127 16:21:52.004682 4707 generic.go:334] "Generic (PLEG): container finished" podID="2e50d632-6bfc-48aa-ab32-f0a05105b482" containerID="415e1c7b4663ae6066b11a40ba296ccb5380484981bdfeebee8d346711c41d3c" exitCode=0 Nov 27 16:21:52 crc kubenswrapper[4707]: I1127 16:21:52.004747 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-5584d95bf4-c6hwk" event={"ID":"2e50d632-6bfc-48aa-ab32-f0a05105b482","Type":"ContainerDied","Data":"415e1c7b4663ae6066b11a40ba296ccb5380484981bdfeebee8d346711c41d3c"} Nov 27 16:21:52 crc kubenswrapper[4707]: I1127 16:21:52.004822 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-5584d95bf4-c6hwk" event={"ID":"2e50d632-6bfc-48aa-ab32-f0a05105b482","Type":"ContainerDied","Data":"d8ecab08fcc00ebc6c938a2e7eaf8c560af64bdb8d4e8acc69e5cb6e416337f3"} Nov 27 16:21:52 crc kubenswrapper[4707]: I1127 16:21:52.004774 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-5584d95bf4-c6hwk" Nov 27 16:21:52 crc kubenswrapper[4707]: I1127 16:21:52.004881 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-api-7db44cddd-l4rvd" Nov 27 16:21:52 crc kubenswrapper[4707]: I1127 16:21:52.168568 4707 scope.go:117] "RemoveContainer" containerID="2a13cf3e3db33c80b734354a2c4c3dd9ee4ee070ce62d3d7da06d9224afd406b" Nov 27 16:21:52 crc kubenswrapper[4707]: I1127 16:21:52.187059 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-api-7db44cddd-l4rvd"] Nov 27 16:21:52 crc kubenswrapper[4707]: I1127 16:21:52.195763 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-api-7db44cddd-l4rvd"] Nov 27 16:21:52 crc kubenswrapper[4707]: I1127 16:21:52.205706 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-cfnapi-5584d95bf4-c6hwk"] Nov 27 16:21:52 crc kubenswrapper[4707]: I1127 16:21:52.216381 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-cfnapi-5584d95bf4-c6hwk"] Nov 27 16:21:52 crc kubenswrapper[4707]: I1127 16:21:52.217625 4707 scope.go:117] "RemoveContainer" containerID="415e1c7b4663ae6066b11a40ba296ccb5380484981bdfeebee8d346711c41d3c" Nov 27 16:21:52 crc kubenswrapper[4707]: I1127 16:21:52.251990 4707 scope.go:117] "RemoveContainer" containerID="415e1c7b4663ae6066b11a40ba296ccb5380484981bdfeebee8d346711c41d3c" Nov 27 16:21:52 crc kubenswrapper[4707]: E1127 16:21:52.252546 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"415e1c7b4663ae6066b11a40ba296ccb5380484981bdfeebee8d346711c41d3c\": container with ID starting with 415e1c7b4663ae6066b11a40ba296ccb5380484981bdfeebee8d346711c41d3c not found: ID does not exist" containerID="415e1c7b4663ae6066b11a40ba296ccb5380484981bdfeebee8d346711c41d3c" Nov 27 16:21:52 crc kubenswrapper[4707]: I1127 16:21:52.252667 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"415e1c7b4663ae6066b11a40ba296ccb5380484981bdfeebee8d346711c41d3c"} err="failed to get container status 
\"415e1c7b4663ae6066b11a40ba296ccb5380484981bdfeebee8d346711c41d3c\": rpc error: code = NotFound desc = could not find container \"415e1c7b4663ae6066b11a40ba296ccb5380484981bdfeebee8d346711c41d3c\": container with ID starting with 415e1c7b4663ae6066b11a40ba296ccb5380484981bdfeebee8d346711c41d3c not found: ID does not exist" Nov 27 16:21:52 crc kubenswrapper[4707]: I1127 16:21:52.646679 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-5475fd4f89-8stjv" Nov 27 16:21:52 crc kubenswrapper[4707]: I1127 16:21:52.648619 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-5475fd4f89-8stjv" Nov 27 16:21:53 crc kubenswrapper[4707]: I1127 16:21:53.040385 4707 scope.go:117] "RemoveContainer" containerID="f59b00ce961883b147cbc749460a6f9f27775445cd8d192484e6da565de13ce3" Nov 27 16:21:53 crc kubenswrapper[4707]: E1127 16:21:53.040598 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-api\" with CrashLoopBackOff: \"back-off 10s restarting failed container=heat-api pod=heat-api-5f6bc5c6bb-tzktc_openstack(8314a6db-7b5b-4ad9-82d3-1e67ebbb9c1a)\"" pod="openstack/heat-api-5f6bc5c6bb-tzktc" podUID="8314a6db-7b5b-4ad9-82d3-1e67ebbb9c1a" Nov 27 16:21:53 crc kubenswrapper[4707]: I1127 16:21:53.043876 4707 scope.go:117] "RemoveContainer" containerID="3779ec49760e720269224277194d444cdf1a0d2b42ee86a1b8c894c5dc8205dd" Nov 27 16:21:53 crc kubenswrapper[4707]: E1127 16:21:53.044191 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-cfnapi\" with CrashLoopBackOff: \"back-off 10s restarting failed container=heat-cfnapi pod=heat-cfnapi-d77877bbc-l8p9h_openstack(917b521a-96db-4475-bf1c-af43a99c67f1)\"" pod="openstack/heat-cfnapi-d77877bbc-l8p9h" podUID="917b521a-96db-4475-bf1c-af43a99c67f1" Nov 27 16:21:53 crc kubenswrapper[4707]: I1127 16:21:53.220672 4707 kubelet_volumes.go:163] "Cleaned up orphaned 
pod volumes dir" podUID="2e50d632-6bfc-48aa-ab32-f0a05105b482" path="/var/lib/kubelet/pods/2e50d632-6bfc-48aa-ab32-f0a05105b482/volumes" Nov 27 16:21:53 crc kubenswrapper[4707]: I1127 16:21:53.221996 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="75326248-4957-4086-ad33-0a8c76ec7ff5" path="/var/lib/kubelet/pods/75326248-4957-4086-ad33-0a8c76ec7ff5/volumes" Nov 27 16:21:53 crc kubenswrapper[4707]: I1127 16:21:53.486822 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-b684-account-create-update-7n56j" Nov 27 16:21:53 crc kubenswrapper[4707]: I1127 16:21:53.595190 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mdwf5\" (UniqueName: \"kubernetes.io/projected/581592db-9e10-4a98-a03d-598ce54b0c74-kube-api-access-mdwf5\") pod \"581592db-9e10-4a98-a03d-598ce54b0c74\" (UID: \"581592db-9e10-4a98-a03d-598ce54b0c74\") " Nov 27 16:21:53 crc kubenswrapper[4707]: I1127 16:21:53.595516 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/581592db-9e10-4a98-a03d-598ce54b0c74-operator-scripts\") pod \"581592db-9e10-4a98-a03d-598ce54b0c74\" (UID: \"581592db-9e10-4a98-a03d-598ce54b0c74\") " Nov 27 16:21:53 crc kubenswrapper[4707]: I1127 16:21:53.596252 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/581592db-9e10-4a98-a03d-598ce54b0c74-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "581592db-9e10-4a98-a03d-598ce54b0c74" (UID: "581592db-9e10-4a98-a03d-598ce54b0c74"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 16:21:53 crc kubenswrapper[4707]: I1127 16:21:53.613400 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/581592db-9e10-4a98-a03d-598ce54b0c74-kube-api-access-mdwf5" (OuterVolumeSpecName: "kube-api-access-mdwf5") pod "581592db-9e10-4a98-a03d-598ce54b0c74" (UID: "581592db-9e10-4a98-a03d-598ce54b0c74"). InnerVolumeSpecName "kube-api-access-mdwf5". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 16:21:53 crc kubenswrapper[4707]: I1127 16:21:53.697605 4707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/581592db-9e10-4a98-a03d-598ce54b0c74-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 27 16:21:53 crc kubenswrapper[4707]: I1127 16:21:53.697633 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mdwf5\" (UniqueName: \"kubernetes.io/projected/581592db-9e10-4a98-a03d-598ce54b0c74-kube-api-access-mdwf5\") on node \"crc\" DevicePath \"\"" Nov 27 16:21:53 crc kubenswrapper[4707]: I1127 16:21:53.799971 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-54ea-account-create-update-lm8hw" Nov 27 16:21:53 crc kubenswrapper[4707]: I1127 16:21:53.807520 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-tmmj9" Nov 27 16:21:53 crc kubenswrapper[4707]: I1127 16:21:53.825062 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-9fba-account-create-update-bb8t5" Nov 27 16:21:53 crc kubenswrapper[4707]: I1127 16:21:53.828437 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-rz6lr" Nov 27 16:21:53 crc kubenswrapper[4707]: I1127 16:21:53.845322 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-qd5lp" Nov 27 16:21:53 crc kubenswrapper[4707]: I1127 16:21:53.900630 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ktss8\" (UniqueName: \"kubernetes.io/projected/54b3a42f-b2e8-46ae-b500-3b2de0b501c7-kube-api-access-ktss8\") pod \"54b3a42f-b2e8-46ae-b500-3b2de0b501c7\" (UID: \"54b3a42f-b2e8-46ae-b500-3b2de0b501c7\") " Nov 27 16:21:53 crc kubenswrapper[4707]: I1127 16:21:53.900698 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/54b3a42f-b2e8-46ae-b500-3b2de0b501c7-operator-scripts\") pod \"54b3a42f-b2e8-46ae-b500-3b2de0b501c7\" (UID: \"54b3a42f-b2e8-46ae-b500-3b2de0b501c7\") " Nov 27 16:21:53 crc kubenswrapper[4707]: I1127 16:21:53.900755 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fsq7s\" (UniqueName: \"kubernetes.io/projected/c33b6312-639a-429c-88ae-5c60ec56280c-kube-api-access-fsq7s\") pod \"c33b6312-639a-429c-88ae-5c60ec56280c\" (UID: \"c33b6312-639a-429c-88ae-5c60ec56280c\") " Nov 27 16:21:53 crc kubenswrapper[4707]: I1127 16:21:53.900806 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c33b6312-639a-429c-88ae-5c60ec56280c-operator-scripts\") pod \"c33b6312-639a-429c-88ae-5c60ec56280c\" (UID: \"c33b6312-639a-429c-88ae-5c60ec56280c\") " Nov 27 16:21:53 crc kubenswrapper[4707]: I1127 16:21:53.901161 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/54b3a42f-b2e8-46ae-b500-3b2de0b501c7-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "54b3a42f-b2e8-46ae-b500-3b2de0b501c7" (UID: "54b3a42f-b2e8-46ae-b500-3b2de0b501c7"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 16:21:53 crc kubenswrapper[4707]: I1127 16:21:53.901605 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c33b6312-639a-429c-88ae-5c60ec56280c-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "c33b6312-639a-429c-88ae-5c60ec56280c" (UID: "c33b6312-639a-429c-88ae-5c60ec56280c"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 16:21:53 crc kubenswrapper[4707]: I1127 16:21:53.901677 4707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/54b3a42f-b2e8-46ae-b500-3b2de0b501c7-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 27 16:21:53 crc kubenswrapper[4707]: I1127 16:21:53.905540 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c33b6312-639a-429c-88ae-5c60ec56280c-kube-api-access-fsq7s" (OuterVolumeSpecName: "kube-api-access-fsq7s") pod "c33b6312-639a-429c-88ae-5c60ec56280c" (UID: "c33b6312-639a-429c-88ae-5c60ec56280c"). InnerVolumeSpecName "kube-api-access-fsq7s". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 16:21:53 crc kubenswrapper[4707]: I1127 16:21:53.905599 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/54b3a42f-b2e8-46ae-b500-3b2de0b501c7-kube-api-access-ktss8" (OuterVolumeSpecName: "kube-api-access-ktss8") pod "54b3a42f-b2e8-46ae-b500-3b2de0b501c7" (UID: "54b3a42f-b2e8-46ae-b500-3b2de0b501c7"). InnerVolumeSpecName "kube-api-access-ktss8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 16:21:54 crc kubenswrapper[4707]: I1127 16:21:54.002902 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p4xcz\" (UniqueName: \"kubernetes.io/projected/63ce055f-5c4d-43ae-895c-4632afdacd87-kube-api-access-p4xcz\") pod \"63ce055f-5c4d-43ae-895c-4632afdacd87\" (UID: \"63ce055f-5c4d-43ae-895c-4632afdacd87\") " Nov 27 16:21:54 crc kubenswrapper[4707]: I1127 16:21:54.003003 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5tvwt\" (UniqueName: \"kubernetes.io/projected/3ca24dad-8ffd-41b5-9379-b05c90193e9e-kube-api-access-5tvwt\") pod \"3ca24dad-8ffd-41b5-9379-b05c90193e9e\" (UID: \"3ca24dad-8ffd-41b5-9379-b05c90193e9e\") " Nov 27 16:21:54 crc kubenswrapper[4707]: I1127 16:21:54.003040 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/63ce055f-5c4d-43ae-895c-4632afdacd87-operator-scripts\") pod \"63ce055f-5c4d-43ae-895c-4632afdacd87\" (UID: \"63ce055f-5c4d-43ae-895c-4632afdacd87\") " Nov 27 16:21:54 crc kubenswrapper[4707]: I1127 16:21:54.003099 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3ca24dad-8ffd-41b5-9379-b05c90193e9e-operator-scripts\") pod \"3ca24dad-8ffd-41b5-9379-b05c90193e9e\" (UID: \"3ca24dad-8ffd-41b5-9379-b05c90193e9e\") " Nov 27 16:21:54 crc kubenswrapper[4707]: I1127 16:21:54.003150 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jlnmk\" (UniqueName: \"kubernetes.io/projected/b6a68c0c-cdcb-4e30-bb21-051e9bdbf395-kube-api-access-jlnmk\") pod \"b6a68c0c-cdcb-4e30-bb21-051e9bdbf395\" (UID: \"b6a68c0c-cdcb-4e30-bb21-051e9bdbf395\") " Nov 27 16:21:54 crc kubenswrapper[4707]: I1127 16:21:54.003181 4707 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b6a68c0c-cdcb-4e30-bb21-051e9bdbf395-operator-scripts\") pod \"b6a68c0c-cdcb-4e30-bb21-051e9bdbf395\" (UID: \"b6a68c0c-cdcb-4e30-bb21-051e9bdbf395\") " Nov 27 16:21:54 crc kubenswrapper[4707]: I1127 16:21:54.003545 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ktss8\" (UniqueName: \"kubernetes.io/projected/54b3a42f-b2e8-46ae-b500-3b2de0b501c7-kube-api-access-ktss8\") on node \"crc\" DevicePath \"\"" Nov 27 16:21:54 crc kubenswrapper[4707]: I1127 16:21:54.003573 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fsq7s\" (UniqueName: \"kubernetes.io/projected/c33b6312-639a-429c-88ae-5c60ec56280c-kube-api-access-fsq7s\") on node \"crc\" DevicePath \"\"" Nov 27 16:21:54 crc kubenswrapper[4707]: I1127 16:21:54.003583 4707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c33b6312-639a-429c-88ae-5c60ec56280c-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 27 16:21:54 crc kubenswrapper[4707]: I1127 16:21:54.003803 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/63ce055f-5c4d-43ae-895c-4632afdacd87-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "63ce055f-5c4d-43ae-895c-4632afdacd87" (UID: "63ce055f-5c4d-43ae-895c-4632afdacd87"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 16:21:54 crc kubenswrapper[4707]: I1127 16:21:54.004309 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3ca24dad-8ffd-41b5-9379-b05c90193e9e-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "3ca24dad-8ffd-41b5-9379-b05c90193e9e" (UID: "3ca24dad-8ffd-41b5-9379-b05c90193e9e"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 16:21:54 crc kubenswrapper[4707]: I1127 16:21:54.004647 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6a68c0c-cdcb-4e30-bb21-051e9bdbf395-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "b6a68c0c-cdcb-4e30-bb21-051e9bdbf395" (UID: "b6a68c0c-cdcb-4e30-bb21-051e9bdbf395"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 16:21:54 crc kubenswrapper[4707]: I1127 16:21:54.007250 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ca24dad-8ffd-41b5-9379-b05c90193e9e-kube-api-access-5tvwt" (OuterVolumeSpecName: "kube-api-access-5tvwt") pod "3ca24dad-8ffd-41b5-9379-b05c90193e9e" (UID: "3ca24dad-8ffd-41b5-9379-b05c90193e9e"). InnerVolumeSpecName "kube-api-access-5tvwt". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 16:21:54 crc kubenswrapper[4707]: I1127 16:21:54.007312 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/63ce055f-5c4d-43ae-895c-4632afdacd87-kube-api-access-p4xcz" (OuterVolumeSpecName: "kube-api-access-p4xcz") pod "63ce055f-5c4d-43ae-895c-4632afdacd87" (UID: "63ce055f-5c4d-43ae-895c-4632afdacd87"). InnerVolumeSpecName "kube-api-access-p4xcz". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 16:21:54 crc kubenswrapper[4707]: I1127 16:21:54.009346 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6a68c0c-cdcb-4e30-bb21-051e9bdbf395-kube-api-access-jlnmk" (OuterVolumeSpecName: "kube-api-access-jlnmk") pod "b6a68c0c-cdcb-4e30-bb21-051e9bdbf395" (UID: "b6a68c0c-cdcb-4e30-bb21-051e9bdbf395"). InnerVolumeSpecName "kube-api-access-jlnmk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 16:21:54 crc kubenswrapper[4707]: I1127 16:21:54.053435 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-tmmj9" event={"ID":"54b3a42f-b2e8-46ae-b500-3b2de0b501c7","Type":"ContainerDied","Data":"e6b3d28f786ae9b54f7b4880429c44aed08634574c221855455fc392a23e1285"} Nov 27 16:21:54 crc kubenswrapper[4707]: I1127 16:21:54.053462 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-tmmj9" Nov 27 16:21:54 crc kubenswrapper[4707]: I1127 16:21:54.053473 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e6b3d28f786ae9b54f7b4880429c44aed08634574c221855455fc392a23e1285" Nov 27 16:21:54 crc kubenswrapper[4707]: I1127 16:21:54.054530 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-rz6lr" event={"ID":"3ca24dad-8ffd-41b5-9379-b05c90193e9e","Type":"ContainerDied","Data":"f70b864392411ee8cff428c2526e1fcac85cf5fae9bcd2cabddb164e32a44e21"} Nov 27 16:21:54 crc kubenswrapper[4707]: I1127 16:21:54.054546 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f70b864392411ee8cff428c2526e1fcac85cf5fae9bcd2cabddb164e32a44e21" Nov 27 16:21:54 crc kubenswrapper[4707]: I1127 16:21:54.054604 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-rz6lr" Nov 27 16:21:54 crc kubenswrapper[4707]: I1127 16:21:54.056450 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-54ea-account-create-update-lm8hw" event={"ID":"c33b6312-639a-429c-88ae-5c60ec56280c","Type":"ContainerDied","Data":"825110384d22f48a32cf0d330dcec275cf833daaeb52a9172dd90f760d3bc56c"} Nov 27 16:21:54 crc kubenswrapper[4707]: I1127 16:21:54.056480 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="825110384d22f48a32cf0d330dcec275cf833daaeb52a9172dd90f760d3bc56c" Nov 27 16:21:54 crc kubenswrapper[4707]: I1127 16:21:54.056480 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-54ea-account-create-update-lm8hw" Nov 27 16:21:54 crc kubenswrapper[4707]: I1127 16:21:54.058981 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-9fba-account-create-update-bb8t5" event={"ID":"b6a68c0c-cdcb-4e30-bb21-051e9bdbf395","Type":"ContainerDied","Data":"918a566a2bc035ddf7b0f93e591bb93ed154bb1b3cb182b0a59d1f650b513233"} Nov 27 16:21:54 crc kubenswrapper[4707]: I1127 16:21:54.059017 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="918a566a2bc035ddf7b0f93e591bb93ed154bb1b3cb182b0a59d1f650b513233" Nov 27 16:21:54 crc kubenswrapper[4707]: I1127 16:21:54.059001 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-9fba-account-create-update-bb8t5" Nov 27 16:21:54 crc kubenswrapper[4707]: I1127 16:21:54.060159 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-b684-account-create-update-7n56j" event={"ID":"581592db-9e10-4a98-a03d-598ce54b0c74","Type":"ContainerDied","Data":"0c78140bf870abb98eb2d97dbdff01a25d54b67d31dc5517a9b58857afa407a0"} Nov 27 16:21:54 crc kubenswrapper[4707]: I1127 16:21:54.060181 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0c78140bf870abb98eb2d97dbdff01a25d54b67d31dc5517a9b58857afa407a0" Nov 27 16:21:54 crc kubenswrapper[4707]: I1127 16:21:54.060218 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-b684-account-create-update-7n56j" Nov 27 16:21:54 crc kubenswrapper[4707]: I1127 16:21:54.070505 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ca422061-61e0-4b99-91ed-adb502839a46","Type":"ContainerStarted","Data":"e23728f27cef2753b5c7982ef74fe10b59be7f16d96d5d83e712c1842efaeceb"} Nov 27 16:21:54 crc kubenswrapper[4707]: I1127 16:21:54.070644 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="ca422061-61e0-4b99-91ed-adb502839a46" containerName="ceilometer-central-agent" containerID="cri-o://5a943b848246e7a0e26df29e9e5872c0e4b1e43f0a4c8ae9352314222b2203bb" gracePeriod=30 Nov 27 16:21:54 crc kubenswrapper[4707]: I1127 16:21:54.070858 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Nov 27 16:21:54 crc kubenswrapper[4707]: I1127 16:21:54.070970 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="ca422061-61e0-4b99-91ed-adb502839a46" containerName="proxy-httpd" containerID="cri-o://e23728f27cef2753b5c7982ef74fe10b59be7f16d96d5d83e712c1842efaeceb" gracePeriod=30 Nov 
27 16:21:54 crc kubenswrapper[4707]: I1127 16:21:54.071064 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="ca422061-61e0-4b99-91ed-adb502839a46" containerName="sg-core" containerID="cri-o://285c89a886d69335891e6811cacb8bbd0ea29959a376355390ef8acfbeb10586" gracePeriod=30 Nov 27 16:21:54 crc kubenswrapper[4707]: I1127 16:21:54.071070 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="ca422061-61e0-4b99-91ed-adb502839a46" containerName="ceilometer-notification-agent" containerID="cri-o://e557ec3cc44b41de2c50306eb79513fee36c41d79181960ecae36b25ee171e6d" gracePeriod=30 Nov 27 16:21:54 crc kubenswrapper[4707]: I1127 16:21:54.080590 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-qd5lp" event={"ID":"63ce055f-5c4d-43ae-895c-4632afdacd87","Type":"ContainerDied","Data":"cc3e71a8e207726cdddf36a71d99de7599f67c8ea362188ff8cb1c181aa510c1"} Nov 27 16:21:54 crc kubenswrapper[4707]: I1127 16:21:54.080615 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cc3e71a8e207726cdddf36a71d99de7599f67c8ea362188ff8cb1c181aa510c1" Nov 27 16:21:54 crc kubenswrapper[4707]: I1127 16:21:54.080659 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-qd5lp" Nov 27 16:21:54 crc kubenswrapper[4707]: I1127 16:21:54.092707 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.6480466959999998 podStartE2EDuration="11.092689275s" podCreationTimestamp="2025-11-27 16:21:43 +0000 UTC" firstStartedPulling="2025-11-27 16:21:44.638977355 +0000 UTC m=+1080.270426123" lastFinishedPulling="2025-11-27 16:21:53.083619934 +0000 UTC m=+1088.715068702" observedRunningTime="2025-11-27 16:21:54.09083541 +0000 UTC m=+1089.722284178" watchObservedRunningTime="2025-11-27 16:21:54.092689275 +0000 UTC m=+1089.724138043" Nov 27 16:21:54 crc kubenswrapper[4707]: I1127 16:21:54.108090 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p4xcz\" (UniqueName: \"kubernetes.io/projected/63ce055f-5c4d-43ae-895c-4632afdacd87-kube-api-access-p4xcz\") on node \"crc\" DevicePath \"\"" Nov 27 16:21:54 crc kubenswrapper[4707]: I1127 16:21:54.108116 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5tvwt\" (UniqueName: \"kubernetes.io/projected/3ca24dad-8ffd-41b5-9379-b05c90193e9e-kube-api-access-5tvwt\") on node \"crc\" DevicePath \"\"" Nov 27 16:21:54 crc kubenswrapper[4707]: I1127 16:21:54.108126 4707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/63ce055f-5c4d-43ae-895c-4632afdacd87-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 27 16:21:54 crc kubenswrapper[4707]: I1127 16:21:54.108135 4707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3ca24dad-8ffd-41b5-9379-b05c90193e9e-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 27 16:21:54 crc kubenswrapper[4707]: I1127 16:21:54.108143 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jlnmk\" (UniqueName: 
\"kubernetes.io/projected/b6a68c0c-cdcb-4e30-bb21-051e9bdbf395-kube-api-access-jlnmk\") on node \"crc\" DevicePath \"\"" Nov 27 16:21:54 crc kubenswrapper[4707]: I1127 16:21:54.108151 4707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b6a68c0c-cdcb-4e30-bb21-051e9bdbf395-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 27 16:21:54 crc kubenswrapper[4707]: I1127 16:21:54.549951 4707 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack/heat-cfnapi-d77877bbc-l8p9h" Nov 27 16:21:54 crc kubenswrapper[4707]: I1127 16:21:54.550265 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-cfnapi-d77877bbc-l8p9h" Nov 27 16:21:54 crc kubenswrapper[4707]: I1127 16:21:54.550924 4707 scope.go:117] "RemoveContainer" containerID="3779ec49760e720269224277194d444cdf1a0d2b42ee86a1b8c894c5dc8205dd" Nov 27 16:21:54 crc kubenswrapper[4707]: E1127 16:21:54.551216 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-cfnapi\" with CrashLoopBackOff: \"back-off 10s restarting failed container=heat-cfnapi pod=heat-cfnapi-d77877bbc-l8p9h_openstack(917b521a-96db-4475-bf1c-af43a99c67f1)\"" pod="openstack/heat-cfnapi-d77877bbc-l8p9h" podUID="917b521a-96db-4475-bf1c-af43a99c67f1" Nov 27 16:21:54 crc kubenswrapper[4707]: I1127 16:21:54.564179 4707 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack/heat-api-5f6bc5c6bb-tzktc" Nov 27 16:21:54 crc kubenswrapper[4707]: I1127 16:21:54.564720 4707 scope.go:117] "RemoveContainer" containerID="f59b00ce961883b147cbc749460a6f9f27775445cd8d192484e6da565de13ce3" Nov 27 16:21:54 crc kubenswrapper[4707]: E1127 16:21:54.564896 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-api\" with CrashLoopBackOff: \"back-off 10s restarting failed container=heat-api 
pod=heat-api-5f6bc5c6bb-tzktc_openstack(8314a6db-7b5b-4ad9-82d3-1e67ebbb9c1a)\"" pod="openstack/heat-api-5f6bc5c6bb-tzktc" podUID="8314a6db-7b5b-4ad9-82d3-1e67ebbb9c1a" Nov 27 16:21:54 crc kubenswrapper[4707]: I1127 16:21:54.565138 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-api-5f6bc5c6bb-tzktc" Nov 27 16:21:55 crc kubenswrapper[4707]: I1127 16:21:55.093445 4707 generic.go:334] "Generic (PLEG): container finished" podID="ca422061-61e0-4b99-91ed-adb502839a46" containerID="e23728f27cef2753b5c7982ef74fe10b59be7f16d96d5d83e712c1842efaeceb" exitCode=0 Nov 27 16:21:55 crc kubenswrapper[4707]: I1127 16:21:55.093501 4707 generic.go:334] "Generic (PLEG): container finished" podID="ca422061-61e0-4b99-91ed-adb502839a46" containerID="285c89a886d69335891e6811cacb8bbd0ea29959a376355390ef8acfbeb10586" exitCode=2 Nov 27 16:21:55 crc kubenswrapper[4707]: I1127 16:21:55.093518 4707 generic.go:334] "Generic (PLEG): container finished" podID="ca422061-61e0-4b99-91ed-adb502839a46" containerID="e557ec3cc44b41de2c50306eb79513fee36c41d79181960ecae36b25ee171e6d" exitCode=0 Nov 27 16:21:55 crc kubenswrapper[4707]: I1127 16:21:55.093508 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ca422061-61e0-4b99-91ed-adb502839a46","Type":"ContainerDied","Data":"e23728f27cef2753b5c7982ef74fe10b59be7f16d96d5d83e712c1842efaeceb"} Nov 27 16:21:55 crc kubenswrapper[4707]: I1127 16:21:55.093577 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ca422061-61e0-4b99-91ed-adb502839a46","Type":"ContainerDied","Data":"285c89a886d69335891e6811cacb8bbd0ea29959a376355390ef8acfbeb10586"} Nov 27 16:21:55 crc kubenswrapper[4707]: I1127 16:21:55.093590 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ca422061-61e0-4b99-91ed-adb502839a46","Type":"ContainerDied","Data":"e557ec3cc44b41de2c50306eb79513fee36c41d79181960ecae36b25ee171e6d"} 
Nov 27 16:21:55 crc kubenswrapper[4707]: I1127 16:21:55.094330 4707 scope.go:117] "RemoveContainer" containerID="f59b00ce961883b147cbc749460a6f9f27775445cd8d192484e6da565de13ce3" Nov 27 16:21:55 crc kubenswrapper[4707]: E1127 16:21:55.094788 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-api\" with CrashLoopBackOff: \"back-off 10s restarting failed container=heat-api pod=heat-api-5f6bc5c6bb-tzktc_openstack(8314a6db-7b5b-4ad9-82d3-1e67ebbb9c1a)\"" pod="openstack/heat-api-5f6bc5c6bb-tzktc" podUID="8314a6db-7b5b-4ad9-82d3-1e67ebbb9c1a" Nov 27 16:21:55 crc kubenswrapper[4707]: I1127 16:21:55.706193 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Nov 27 16:21:55 crc kubenswrapper[4707]: I1127 16:21:55.706435 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="5798d94c-5634-4d92-b36d-cab9231adfc4" containerName="glance-log" containerID="cri-o://81d2c1c2557632dd94e53405c847f196729c846144d2ec8a8ef1580865599241" gracePeriod=30 Nov 27 16:21:55 crc kubenswrapper[4707]: I1127 16:21:55.706515 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="5798d94c-5634-4d92-b36d-cab9231adfc4" containerName="glance-httpd" containerID="cri-o://62bdd03ab195ad3612cdfb77283c1c12d9f5af412bb3d68830b00c6e0c23d85c" gracePeriod=30 Nov 27 16:21:56 crc kubenswrapper[4707]: I1127 16:21:56.101708 4707 generic.go:334] "Generic (PLEG): container finished" podID="5798d94c-5634-4d92-b36d-cab9231adfc4" containerID="81d2c1c2557632dd94e53405c847f196729c846144d2ec8a8ef1580865599241" exitCode=143 Nov 27 16:21:56 crc kubenswrapper[4707]: I1127 16:21:56.101789 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" 
event={"ID":"5798d94c-5634-4d92-b36d-cab9231adfc4","Type":"ContainerDied","Data":"81d2c1c2557632dd94e53405c847f196729c846144d2ec8a8ef1580865599241"} Nov 27 16:21:56 crc kubenswrapper[4707]: I1127 16:21:56.753850 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Nov 27 16:21:56 crc kubenswrapper[4707]: I1127 16:21:56.754288 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="96ce5955-aed6-45e9-8a8a-4cb59d3a511d" containerName="glance-log" containerID="cri-o://a2457030a9961af5854c83e5a20f52d3a0a20b2683fc3cd24f97b345bd69a877" gracePeriod=30 Nov 27 16:21:56 crc kubenswrapper[4707]: I1127 16:21:56.754538 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="96ce5955-aed6-45e9-8a8a-4cb59d3a511d" containerName="glance-httpd" containerID="cri-o://ddb0e96230fc02b7f5dd232f2685f6135a1035250566123f6bef903bf2011d33" gracePeriod=30 Nov 27 16:21:57 crc kubenswrapper[4707]: I1127 16:21:57.111302 4707 generic.go:334] "Generic (PLEG): container finished" podID="96ce5955-aed6-45e9-8a8a-4cb59d3a511d" containerID="a2457030a9961af5854c83e5a20f52d3a0a20b2683fc3cd24f97b345bd69a877" exitCode=143 Nov 27 16:21:57 crc kubenswrapper[4707]: I1127 16:21:57.111404 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"96ce5955-aed6-45e9-8a8a-4cb59d3a511d","Type":"ContainerDied","Data":"a2457030a9961af5854c83e5a20f52d3a0a20b2683fc3cd24f97b345bd69a877"} Nov 27 16:21:57 crc kubenswrapper[4707]: I1127 16:21:57.446837 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/heat-cfnapi-5c4f76f9fb-ghh99" Nov 27 16:21:57 crc kubenswrapper[4707]: I1127 16:21:57.462766 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/heat-api-57bc8fcfc9-trdbf" Nov 27 16:21:57 crc 
kubenswrapper[4707]: I1127 16:21:57.517618 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-cfnapi-d77877bbc-l8p9h"] Nov 27 16:21:57 crc kubenswrapper[4707]: I1127 16:21:57.530432 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-api-5f6bc5c6bb-tzktc"] Nov 27 16:21:57 crc kubenswrapper[4707]: I1127 16:21:57.969834 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-d77877bbc-l8p9h" Nov 27 16:21:57 crc kubenswrapper[4707]: I1127 16:21:57.975757 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-5f6bc5c6bb-tzktc" Nov 27 16:21:57 crc kubenswrapper[4707]: I1127 16:21:57.982539 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/heat-engine-7678b8b68b-mp4qf" Nov 27 16:21:58 crc kubenswrapper[4707]: I1127 16:21:58.093139 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/917b521a-96db-4475-bf1c-af43a99c67f1-combined-ca-bundle\") pod \"917b521a-96db-4475-bf1c-af43a99c67f1\" (UID: \"917b521a-96db-4475-bf1c-af43a99c67f1\") " Nov 27 16:21:58 crc kubenswrapper[4707]: I1127 16:21:58.093201 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/917b521a-96db-4475-bf1c-af43a99c67f1-config-data-custom\") pod \"917b521a-96db-4475-bf1c-af43a99c67f1\" (UID: \"917b521a-96db-4475-bf1c-af43a99c67f1\") " Nov 27 16:21:58 crc kubenswrapper[4707]: I1127 16:21:58.093245 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nn8dh\" (UniqueName: \"kubernetes.io/projected/917b521a-96db-4475-bf1c-af43a99c67f1-kube-api-access-nn8dh\") pod \"917b521a-96db-4475-bf1c-af43a99c67f1\" (UID: \"917b521a-96db-4475-bf1c-af43a99c67f1\") " Nov 27 16:21:58 crc kubenswrapper[4707]: I1127 
16:21:58.093301 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-txqd9\" (UniqueName: \"kubernetes.io/projected/8314a6db-7b5b-4ad9-82d3-1e67ebbb9c1a-kube-api-access-txqd9\") pod \"8314a6db-7b5b-4ad9-82d3-1e67ebbb9c1a\" (UID: \"8314a6db-7b5b-4ad9-82d3-1e67ebbb9c1a\") " Nov 27 16:21:58 crc kubenswrapper[4707]: I1127 16:21:58.093359 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/917b521a-96db-4475-bf1c-af43a99c67f1-config-data\") pod \"917b521a-96db-4475-bf1c-af43a99c67f1\" (UID: \"917b521a-96db-4475-bf1c-af43a99c67f1\") " Nov 27 16:21:58 crc kubenswrapper[4707]: I1127 16:21:58.093452 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8314a6db-7b5b-4ad9-82d3-1e67ebbb9c1a-combined-ca-bundle\") pod \"8314a6db-7b5b-4ad9-82d3-1e67ebbb9c1a\" (UID: \"8314a6db-7b5b-4ad9-82d3-1e67ebbb9c1a\") " Nov 27 16:21:58 crc kubenswrapper[4707]: I1127 16:21:58.093469 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8314a6db-7b5b-4ad9-82d3-1e67ebbb9c1a-config-data\") pod \"8314a6db-7b5b-4ad9-82d3-1e67ebbb9c1a\" (UID: \"8314a6db-7b5b-4ad9-82d3-1e67ebbb9c1a\") " Nov 27 16:21:58 crc kubenswrapper[4707]: I1127 16:21:58.093559 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8314a6db-7b5b-4ad9-82d3-1e67ebbb9c1a-config-data-custom\") pod \"8314a6db-7b5b-4ad9-82d3-1e67ebbb9c1a\" (UID: \"8314a6db-7b5b-4ad9-82d3-1e67ebbb9c1a\") " Nov 27 16:21:58 crc kubenswrapper[4707]: I1127 16:21:58.098582 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/917b521a-96db-4475-bf1c-af43a99c67f1-kube-api-access-nn8dh" (OuterVolumeSpecName: 
"kube-api-access-nn8dh") pod "917b521a-96db-4475-bf1c-af43a99c67f1" (UID: "917b521a-96db-4475-bf1c-af43a99c67f1"). InnerVolumeSpecName "kube-api-access-nn8dh". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 16:21:58 crc kubenswrapper[4707]: I1127 16:21:58.098941 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8314a6db-7b5b-4ad9-82d3-1e67ebbb9c1a-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "8314a6db-7b5b-4ad9-82d3-1e67ebbb9c1a" (UID: "8314a6db-7b5b-4ad9-82d3-1e67ebbb9c1a"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 16:21:58 crc kubenswrapper[4707]: I1127 16:21:58.100228 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/917b521a-96db-4475-bf1c-af43a99c67f1-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "917b521a-96db-4475-bf1c-af43a99c67f1" (UID: "917b521a-96db-4475-bf1c-af43a99c67f1"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 16:21:58 crc kubenswrapper[4707]: I1127 16:21:58.119070 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8314a6db-7b5b-4ad9-82d3-1e67ebbb9c1a-kube-api-access-txqd9" (OuterVolumeSpecName: "kube-api-access-txqd9") pod "8314a6db-7b5b-4ad9-82d3-1e67ebbb9c1a" (UID: "8314a6db-7b5b-4ad9-82d3-1e67ebbb9c1a"). InnerVolumeSpecName "kube-api-access-txqd9". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 16:21:58 crc kubenswrapper[4707]: I1127 16:21:58.138890 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/917b521a-96db-4475-bf1c-af43a99c67f1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "917b521a-96db-4475-bf1c-af43a99c67f1" (UID: "917b521a-96db-4475-bf1c-af43a99c67f1"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 16:21:58 crc kubenswrapper[4707]: I1127 16:21:58.140017 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8314a6db-7b5b-4ad9-82d3-1e67ebbb9c1a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8314a6db-7b5b-4ad9-82d3-1e67ebbb9c1a" (UID: "8314a6db-7b5b-4ad9-82d3-1e67ebbb9c1a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 16:21:58 crc kubenswrapper[4707]: I1127 16:21:58.141260 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-d77877bbc-l8p9h" event={"ID":"917b521a-96db-4475-bf1c-af43a99c67f1","Type":"ContainerDied","Data":"c0b9ede4336efb04aff27cc22b8fa62d74b676e2ae713b2b722cabd503846f6c"} Nov 27 16:21:58 crc kubenswrapper[4707]: I1127 16:21:58.141319 4707 scope.go:117] "RemoveContainer" containerID="3779ec49760e720269224277194d444cdf1a0d2b42ee86a1b8c894c5dc8205dd" Nov 27 16:21:58 crc kubenswrapper[4707]: I1127 16:21:58.141511 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-d77877bbc-l8p9h" Nov 27 16:21:58 crc kubenswrapper[4707]: I1127 16:21:58.153270 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-5f6bc5c6bb-tzktc" event={"ID":"8314a6db-7b5b-4ad9-82d3-1e67ebbb9c1a","Type":"ContainerDied","Data":"3db1097466c184742fd8ec1ed22d8738be6460b0d568304e30ea83ba0933849e"} Nov 27 16:21:58 crc kubenswrapper[4707]: I1127 16:21:58.153351 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-api-5f6bc5c6bb-tzktc" Nov 27 16:21:58 crc kubenswrapper[4707]: I1127 16:21:58.157727 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/917b521a-96db-4475-bf1c-af43a99c67f1-config-data" (OuterVolumeSpecName: "config-data") pod "917b521a-96db-4475-bf1c-af43a99c67f1" (UID: "917b521a-96db-4475-bf1c-af43a99c67f1"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 16:21:58 crc kubenswrapper[4707]: I1127 16:21:58.166565 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8314a6db-7b5b-4ad9-82d3-1e67ebbb9c1a-config-data" (OuterVolumeSpecName: "config-data") pod "8314a6db-7b5b-4ad9-82d3-1e67ebbb9c1a" (UID: "8314a6db-7b5b-4ad9-82d3-1e67ebbb9c1a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 16:21:58 crc kubenswrapper[4707]: I1127 16:21:58.195436 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/917b521a-96db-4475-bf1c-af43a99c67f1-config-data\") on node \"crc\" DevicePath \"\"" Nov 27 16:21:58 crc kubenswrapper[4707]: I1127 16:21:58.195473 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8314a6db-7b5b-4ad9-82d3-1e67ebbb9c1a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 27 16:21:58 crc kubenswrapper[4707]: I1127 16:21:58.195489 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8314a6db-7b5b-4ad9-82d3-1e67ebbb9c1a-config-data\") on node \"crc\" DevicePath \"\"" Nov 27 16:21:58 crc kubenswrapper[4707]: I1127 16:21:58.195503 4707 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8314a6db-7b5b-4ad9-82d3-1e67ebbb9c1a-config-data-custom\") on node \"crc\" DevicePath \"\"" Nov 27 
16:21:58 crc kubenswrapper[4707]: I1127 16:21:58.195515 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/917b521a-96db-4475-bf1c-af43a99c67f1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 27 16:21:58 crc kubenswrapper[4707]: I1127 16:21:58.195525 4707 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/917b521a-96db-4475-bf1c-af43a99c67f1-config-data-custom\") on node \"crc\" DevicePath \"\"" Nov 27 16:21:58 crc kubenswrapper[4707]: I1127 16:21:58.195537 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nn8dh\" (UniqueName: \"kubernetes.io/projected/917b521a-96db-4475-bf1c-af43a99c67f1-kube-api-access-nn8dh\") on node \"crc\" DevicePath \"\"" Nov 27 16:21:58 crc kubenswrapper[4707]: I1127 16:21:58.195549 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-txqd9\" (UniqueName: \"kubernetes.io/projected/8314a6db-7b5b-4ad9-82d3-1e67ebbb9c1a-kube-api-access-txqd9\") on node \"crc\" DevicePath \"\"" Nov 27 16:21:58 crc kubenswrapper[4707]: I1127 16:21:58.249413 4707 scope.go:117] "RemoveContainer" containerID="f59b00ce961883b147cbc749460a6f9f27775445cd8d192484e6da565de13ce3" Nov 27 16:21:58 crc kubenswrapper[4707]: I1127 16:21:58.505886 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-cfnapi-d77877bbc-l8p9h"] Nov 27 16:21:58 crc kubenswrapper[4707]: I1127 16:21:58.522201 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-cfnapi-d77877bbc-l8p9h"] Nov 27 16:21:58 crc kubenswrapper[4707]: I1127 16:21:58.531057 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-api-5f6bc5c6bb-tzktc"] Nov 27 16:21:58 crc kubenswrapper[4707]: I1127 16:21:58.539920 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-api-5f6bc5c6bb-tzktc"] Nov 27 16:21:59 crc kubenswrapper[4707]: I1127 16:21:59.172399 4707 
generic.go:334] "Generic (PLEG): container finished" podID="5798d94c-5634-4d92-b36d-cab9231adfc4" containerID="62bdd03ab195ad3612cdfb77283c1c12d9f5af412bb3d68830b00c6e0c23d85c" exitCode=0 Nov 27 16:21:59 crc kubenswrapper[4707]: I1127 16:21:59.172484 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"5798d94c-5634-4d92-b36d-cab9231adfc4","Type":"ContainerDied","Data":"62bdd03ab195ad3612cdfb77283c1c12d9f5af412bb3d68830b00c6e0c23d85c"} Nov 27 16:21:59 crc kubenswrapper[4707]: I1127 16:21:59.224075 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8314a6db-7b5b-4ad9-82d3-1e67ebbb9c1a" path="/var/lib/kubelet/pods/8314a6db-7b5b-4ad9-82d3-1e67ebbb9c1a/volumes" Nov 27 16:21:59 crc kubenswrapper[4707]: I1127 16:21:59.225495 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="917b521a-96db-4475-bf1c-af43a99c67f1" path="/var/lib/kubelet/pods/917b521a-96db-4475-bf1c-af43a99c67f1/volumes" Nov 27 16:21:59 crc kubenswrapper[4707]: I1127 16:21:59.435194 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Nov 27 16:21:59 crc kubenswrapper[4707]: I1127 16:21:59.531266 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5798d94c-5634-4d92-b36d-cab9231adfc4-combined-ca-bundle\") pod \"5798d94c-5634-4d92-b36d-cab9231adfc4\" (UID: \"5798d94c-5634-4d92-b36d-cab9231adfc4\") " Nov 27 16:21:59 crc kubenswrapper[4707]: I1127 16:21:59.531614 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"5798d94c-5634-4d92-b36d-cab9231adfc4\" (UID: \"5798d94c-5634-4d92-b36d-cab9231adfc4\") " Nov 27 16:21:59 crc kubenswrapper[4707]: I1127 16:21:59.531665 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5798d94c-5634-4d92-b36d-cab9231adfc4-logs\") pod \"5798d94c-5634-4d92-b36d-cab9231adfc4\" (UID: \"5798d94c-5634-4d92-b36d-cab9231adfc4\") " Nov 27 16:21:59 crc kubenswrapper[4707]: I1127 16:21:59.531684 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5798d94c-5634-4d92-b36d-cab9231adfc4-scripts\") pod \"5798d94c-5634-4d92-b36d-cab9231adfc4\" (UID: \"5798d94c-5634-4d92-b36d-cab9231adfc4\") " Nov 27 16:21:59 crc kubenswrapper[4707]: I1127 16:21:59.531775 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5798d94c-5634-4d92-b36d-cab9231adfc4-config-data\") pod \"5798d94c-5634-4d92-b36d-cab9231adfc4\" (UID: \"5798d94c-5634-4d92-b36d-cab9231adfc4\") " Nov 27 16:21:59 crc kubenswrapper[4707]: I1127 16:21:59.531828 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rhddt\" (UniqueName: 
\"kubernetes.io/projected/5798d94c-5634-4d92-b36d-cab9231adfc4-kube-api-access-rhddt\") pod \"5798d94c-5634-4d92-b36d-cab9231adfc4\" (UID: \"5798d94c-5634-4d92-b36d-cab9231adfc4\") " Nov 27 16:21:59 crc kubenswrapper[4707]: I1127 16:21:59.531849 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5798d94c-5634-4d92-b36d-cab9231adfc4-public-tls-certs\") pod \"5798d94c-5634-4d92-b36d-cab9231adfc4\" (UID: \"5798d94c-5634-4d92-b36d-cab9231adfc4\") " Nov 27 16:21:59 crc kubenswrapper[4707]: I1127 16:21:59.532194 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/5798d94c-5634-4d92-b36d-cab9231adfc4-httpd-run\") pod \"5798d94c-5634-4d92-b36d-cab9231adfc4\" (UID: \"5798d94c-5634-4d92-b36d-cab9231adfc4\") " Nov 27 16:21:59 crc kubenswrapper[4707]: I1127 16:21:59.535423 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5798d94c-5634-4d92-b36d-cab9231adfc4-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "5798d94c-5634-4d92-b36d-cab9231adfc4" (UID: "5798d94c-5634-4d92-b36d-cab9231adfc4"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 27 16:21:59 crc kubenswrapper[4707]: I1127 16:21:59.535523 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5798d94c-5634-4d92-b36d-cab9231adfc4-logs" (OuterVolumeSpecName: "logs") pod "5798d94c-5634-4d92-b36d-cab9231adfc4" (UID: "5798d94c-5634-4d92-b36d-cab9231adfc4"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 27 16:21:59 crc kubenswrapper[4707]: I1127 16:21:59.538537 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5798d94c-5634-4d92-b36d-cab9231adfc4-kube-api-access-rhddt" (OuterVolumeSpecName: "kube-api-access-rhddt") pod "5798d94c-5634-4d92-b36d-cab9231adfc4" (UID: "5798d94c-5634-4d92-b36d-cab9231adfc4"). InnerVolumeSpecName "kube-api-access-rhddt". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 16:21:59 crc kubenswrapper[4707]: I1127 16:21:59.539361 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5798d94c-5634-4d92-b36d-cab9231adfc4-scripts" (OuterVolumeSpecName: "scripts") pod "5798d94c-5634-4d92-b36d-cab9231adfc4" (UID: "5798d94c-5634-4d92-b36d-cab9231adfc4"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 16:21:59 crc kubenswrapper[4707]: I1127 16:21:59.544789 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage11-crc" (OuterVolumeSpecName: "glance") pod "5798d94c-5634-4d92-b36d-cab9231adfc4" (UID: "5798d94c-5634-4d92-b36d-cab9231adfc4"). InnerVolumeSpecName "local-storage11-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Nov 27 16:21:59 crc kubenswrapper[4707]: I1127 16:21:59.573508 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5798d94c-5634-4d92-b36d-cab9231adfc4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5798d94c-5634-4d92-b36d-cab9231adfc4" (UID: "5798d94c-5634-4d92-b36d-cab9231adfc4"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 16:21:59 crc kubenswrapper[4707]: I1127 16:21:59.599539 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5798d94c-5634-4d92-b36d-cab9231adfc4-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "5798d94c-5634-4d92-b36d-cab9231adfc4" (UID: "5798d94c-5634-4d92-b36d-cab9231adfc4"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 16:21:59 crc kubenswrapper[4707]: I1127 16:21:59.622155 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5798d94c-5634-4d92-b36d-cab9231adfc4-config-data" (OuterVolumeSpecName: "config-data") pod "5798d94c-5634-4d92-b36d-cab9231adfc4" (UID: "5798d94c-5634-4d92-b36d-cab9231adfc4"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 16:21:59 crc kubenswrapper[4707]: I1127 16:21:59.634171 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5798d94c-5634-4d92-b36d-cab9231adfc4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 27 16:21:59 crc kubenswrapper[4707]: I1127 16:21:59.634218 4707 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" " Nov 27 16:21:59 crc kubenswrapper[4707]: I1127 16:21:59.634229 4707 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5798d94c-5634-4d92-b36d-cab9231adfc4-logs\") on node \"crc\" DevicePath \"\"" Nov 27 16:21:59 crc kubenswrapper[4707]: I1127 16:21:59.634239 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5798d94c-5634-4d92-b36d-cab9231adfc4-scripts\") on node \"crc\" DevicePath \"\"" Nov 27 16:21:59 crc kubenswrapper[4707]: I1127 
16:21:59.634248 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5798d94c-5634-4d92-b36d-cab9231adfc4-config-data\") on node \"crc\" DevicePath \"\"" Nov 27 16:21:59 crc kubenswrapper[4707]: I1127 16:21:59.634256 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rhddt\" (UniqueName: \"kubernetes.io/projected/5798d94c-5634-4d92-b36d-cab9231adfc4-kube-api-access-rhddt\") on node \"crc\" DevicePath \"\"" Nov 27 16:21:59 crc kubenswrapper[4707]: I1127 16:21:59.634266 4707 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5798d94c-5634-4d92-b36d-cab9231adfc4-public-tls-certs\") on node \"crc\" DevicePath \"\"" Nov 27 16:21:59 crc kubenswrapper[4707]: I1127 16:21:59.634273 4707 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/5798d94c-5634-4d92-b36d-cab9231adfc4-httpd-run\") on node \"crc\" DevicePath \"\"" Nov 27 16:21:59 crc kubenswrapper[4707]: I1127 16:21:59.655098 4707 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage11-crc" (UniqueName: "kubernetes.io/local-volume/local-storage11-crc") on node "crc" Nov 27 16:21:59 crc kubenswrapper[4707]: I1127 16:21:59.695043 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-db-sync-hq68q"] Nov 27 16:21:59 crc kubenswrapper[4707]: E1127 16:21:59.695668 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="54b3a42f-b2e8-46ae-b500-3b2de0b501c7" containerName="mariadb-database-create" Nov 27 16:21:59 crc kubenswrapper[4707]: I1127 16:21:59.695745 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="54b3a42f-b2e8-46ae-b500-3b2de0b501c7" containerName="mariadb-database-create" Nov 27 16:21:59 crc kubenswrapper[4707]: E1127 16:21:59.695809 4707 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="3ca24dad-8ffd-41b5-9379-b05c90193e9e" containerName="mariadb-database-create" Nov 27 16:21:59 crc kubenswrapper[4707]: I1127 16:21:59.695858 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="3ca24dad-8ffd-41b5-9379-b05c90193e9e" containerName="mariadb-database-create" Nov 27 16:21:59 crc kubenswrapper[4707]: E1127 16:21:59.695915 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5798d94c-5634-4d92-b36d-cab9231adfc4" containerName="glance-log" Nov 27 16:21:59 crc kubenswrapper[4707]: I1127 16:21:59.695962 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="5798d94c-5634-4d92-b36d-cab9231adfc4" containerName="glance-log" Nov 27 16:21:59 crc kubenswrapper[4707]: E1127 16:21:59.696034 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="917b521a-96db-4475-bf1c-af43a99c67f1" containerName="heat-cfnapi" Nov 27 16:21:59 crc kubenswrapper[4707]: I1127 16:21:59.696089 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="917b521a-96db-4475-bf1c-af43a99c67f1" containerName="heat-cfnapi" Nov 27 16:21:59 crc kubenswrapper[4707]: E1127 16:21:59.696148 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="75326248-4957-4086-ad33-0a8c76ec7ff5" containerName="heat-api" Nov 27 16:21:59 crc kubenswrapper[4707]: I1127 16:21:59.696196 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="75326248-4957-4086-ad33-0a8c76ec7ff5" containerName="heat-api" Nov 27 16:21:59 crc kubenswrapper[4707]: E1127 16:21:59.696247 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="581592db-9e10-4a98-a03d-598ce54b0c74" containerName="mariadb-account-create-update" Nov 27 16:21:59 crc kubenswrapper[4707]: I1127 16:21:59.696293 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="581592db-9e10-4a98-a03d-598ce54b0c74" containerName="mariadb-account-create-update" Nov 27 16:21:59 crc kubenswrapper[4707]: E1127 16:21:59.696343 4707 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="c33b6312-639a-429c-88ae-5c60ec56280c" containerName="mariadb-account-create-update" Nov 27 16:21:59 crc kubenswrapper[4707]: I1127 16:21:59.696415 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="c33b6312-639a-429c-88ae-5c60ec56280c" containerName="mariadb-account-create-update" Nov 27 16:21:59 crc kubenswrapper[4707]: E1127 16:21:59.696472 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="63ce055f-5c4d-43ae-895c-4632afdacd87" containerName="mariadb-database-create" Nov 27 16:21:59 crc kubenswrapper[4707]: I1127 16:21:59.696520 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="63ce055f-5c4d-43ae-895c-4632afdacd87" containerName="mariadb-database-create" Nov 27 16:21:59 crc kubenswrapper[4707]: E1127 16:21:59.696571 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5798d94c-5634-4d92-b36d-cab9231adfc4" containerName="glance-httpd" Nov 27 16:21:59 crc kubenswrapper[4707]: I1127 16:21:59.697197 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="5798d94c-5634-4d92-b36d-cab9231adfc4" containerName="glance-httpd" Nov 27 16:21:59 crc kubenswrapper[4707]: E1127 16:21:59.697267 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2e50d632-6bfc-48aa-ab32-f0a05105b482" containerName="heat-cfnapi" Nov 27 16:21:59 crc kubenswrapper[4707]: I1127 16:21:59.697323 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="2e50d632-6bfc-48aa-ab32-f0a05105b482" containerName="heat-cfnapi" Nov 27 16:21:59 crc kubenswrapper[4707]: E1127 16:21:59.697405 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8314a6db-7b5b-4ad9-82d3-1e67ebbb9c1a" containerName="heat-api" Nov 27 16:21:59 crc kubenswrapper[4707]: I1127 16:21:59.697473 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="8314a6db-7b5b-4ad9-82d3-1e67ebbb9c1a" containerName="heat-api" Nov 27 16:21:59 crc kubenswrapper[4707]: E1127 16:21:59.697528 4707 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="b6a68c0c-cdcb-4e30-bb21-051e9bdbf395" containerName="mariadb-account-create-update" Nov 27 16:21:59 crc kubenswrapper[4707]: I1127 16:21:59.697576 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="b6a68c0c-cdcb-4e30-bb21-051e9bdbf395" containerName="mariadb-account-create-update" Nov 27 16:21:59 crc kubenswrapper[4707]: E1127 16:21:59.697633 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8314a6db-7b5b-4ad9-82d3-1e67ebbb9c1a" containerName="heat-api" Nov 27 16:21:59 crc kubenswrapper[4707]: I1127 16:21:59.697686 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="8314a6db-7b5b-4ad9-82d3-1e67ebbb9c1a" containerName="heat-api" Nov 27 16:21:59 crc kubenswrapper[4707]: I1127 16:21:59.697909 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="75326248-4957-4086-ad33-0a8c76ec7ff5" containerName="heat-api" Nov 27 16:21:59 crc kubenswrapper[4707]: I1127 16:21:59.697969 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="8314a6db-7b5b-4ad9-82d3-1e67ebbb9c1a" containerName="heat-api" Nov 27 16:21:59 crc kubenswrapper[4707]: I1127 16:21:59.698021 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="b6a68c0c-cdcb-4e30-bb21-051e9bdbf395" containerName="mariadb-account-create-update" Nov 27 16:21:59 crc kubenswrapper[4707]: I1127 16:21:59.698072 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="917b521a-96db-4475-bf1c-af43a99c67f1" containerName="heat-cfnapi" Nov 27 16:21:59 crc kubenswrapper[4707]: I1127 16:21:59.698123 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="8314a6db-7b5b-4ad9-82d3-1e67ebbb9c1a" containerName="heat-api" Nov 27 16:21:59 crc kubenswrapper[4707]: I1127 16:21:59.698179 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="c33b6312-639a-429c-88ae-5c60ec56280c" containerName="mariadb-account-create-update" Nov 27 16:21:59 crc kubenswrapper[4707]: I1127 16:21:59.698313 4707 
memory_manager.go:354] "RemoveStaleState removing state" podUID="5798d94c-5634-4d92-b36d-cab9231adfc4" containerName="glance-log" Nov 27 16:21:59 crc kubenswrapper[4707]: I1127 16:21:59.698387 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="5798d94c-5634-4d92-b36d-cab9231adfc4" containerName="glance-httpd" Nov 27 16:21:59 crc kubenswrapper[4707]: I1127 16:21:59.699070 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="63ce055f-5c4d-43ae-895c-4632afdacd87" containerName="mariadb-database-create" Nov 27 16:21:59 crc kubenswrapper[4707]: I1127 16:21:59.699160 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="2e50d632-6bfc-48aa-ab32-f0a05105b482" containerName="heat-cfnapi" Nov 27 16:21:59 crc kubenswrapper[4707]: I1127 16:21:59.699223 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="581592db-9e10-4a98-a03d-598ce54b0c74" containerName="mariadb-account-create-update" Nov 27 16:21:59 crc kubenswrapper[4707]: I1127 16:21:59.699279 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="3ca24dad-8ffd-41b5-9379-b05c90193e9e" containerName="mariadb-database-create" Nov 27 16:21:59 crc kubenswrapper[4707]: I1127 16:21:59.699333 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="54b3a42f-b2e8-46ae-b500-3b2de0b501c7" containerName="mariadb-database-create" Nov 27 16:21:59 crc kubenswrapper[4707]: I1127 16:21:59.700261 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-hq68q" Nov 27 16:21:59 crc kubenswrapper[4707]: I1127 16:21:59.704862 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-hq68q"] Nov 27 16:21:59 crc kubenswrapper[4707]: I1127 16:21:59.705312 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-jx489" Nov 27 16:21:59 crc kubenswrapper[4707]: I1127 16:21:59.705492 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-scripts" Nov 27 16:21:59 crc kubenswrapper[4707]: I1127 16:21:59.705599 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Nov 27 16:21:59 crc kubenswrapper[4707]: I1127 16:21:59.735619 4707 reconciler_common.go:293] "Volume detached for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" DevicePath \"\"" Nov 27 16:21:59 crc kubenswrapper[4707]: I1127 16:21:59.837262 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/11d822ec-0bcb-4a76-a135-2a140d0618c8-scripts\") pod \"nova-cell0-conductor-db-sync-hq68q\" (UID: \"11d822ec-0bcb-4a76-a135-2a140d0618c8\") " pod="openstack/nova-cell0-conductor-db-sync-hq68q" Nov 27 16:21:59 crc kubenswrapper[4707]: I1127 16:21:59.837367 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kkq4h\" (UniqueName: \"kubernetes.io/projected/11d822ec-0bcb-4a76-a135-2a140d0618c8-kube-api-access-kkq4h\") pod \"nova-cell0-conductor-db-sync-hq68q\" (UID: \"11d822ec-0bcb-4a76-a135-2a140d0618c8\") " pod="openstack/nova-cell0-conductor-db-sync-hq68q" Nov 27 16:21:59 crc kubenswrapper[4707]: I1127 16:21:59.837616 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/11d822ec-0bcb-4a76-a135-2a140d0618c8-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-hq68q\" (UID: \"11d822ec-0bcb-4a76-a135-2a140d0618c8\") " pod="openstack/nova-cell0-conductor-db-sync-hq68q" Nov 27 16:21:59 crc kubenswrapper[4707]: I1127 16:21:59.837665 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/11d822ec-0bcb-4a76-a135-2a140d0618c8-config-data\") pod \"nova-cell0-conductor-db-sync-hq68q\" (UID: \"11d822ec-0bcb-4a76-a135-2a140d0618c8\") " pod="openstack/nova-cell0-conductor-db-sync-hq68q" Nov 27 16:21:59 crc kubenswrapper[4707]: I1127 16:21:59.939593 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/11d822ec-0bcb-4a76-a135-2a140d0618c8-scripts\") pod \"nova-cell0-conductor-db-sync-hq68q\" (UID: \"11d822ec-0bcb-4a76-a135-2a140d0618c8\") " pod="openstack/nova-cell0-conductor-db-sync-hq68q" Nov 27 16:21:59 crc kubenswrapper[4707]: I1127 16:21:59.939714 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kkq4h\" (UniqueName: \"kubernetes.io/projected/11d822ec-0bcb-4a76-a135-2a140d0618c8-kube-api-access-kkq4h\") pod \"nova-cell0-conductor-db-sync-hq68q\" (UID: \"11d822ec-0bcb-4a76-a135-2a140d0618c8\") " pod="openstack/nova-cell0-conductor-db-sync-hq68q" Nov 27 16:21:59 crc kubenswrapper[4707]: I1127 16:21:59.939841 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/11d822ec-0bcb-4a76-a135-2a140d0618c8-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-hq68q\" (UID: \"11d822ec-0bcb-4a76-a135-2a140d0618c8\") " pod="openstack/nova-cell0-conductor-db-sync-hq68q" Nov 27 16:21:59 crc kubenswrapper[4707]: I1127 16:21:59.939864 4707 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/11d822ec-0bcb-4a76-a135-2a140d0618c8-config-data\") pod \"nova-cell0-conductor-db-sync-hq68q\" (UID: \"11d822ec-0bcb-4a76-a135-2a140d0618c8\") " pod="openstack/nova-cell0-conductor-db-sync-hq68q" Nov 27 16:21:59 crc kubenswrapper[4707]: I1127 16:21:59.942993 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/11d822ec-0bcb-4a76-a135-2a140d0618c8-scripts\") pod \"nova-cell0-conductor-db-sync-hq68q\" (UID: \"11d822ec-0bcb-4a76-a135-2a140d0618c8\") " pod="openstack/nova-cell0-conductor-db-sync-hq68q" Nov 27 16:21:59 crc kubenswrapper[4707]: I1127 16:21:59.944001 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/11d822ec-0bcb-4a76-a135-2a140d0618c8-config-data\") pod \"nova-cell0-conductor-db-sync-hq68q\" (UID: \"11d822ec-0bcb-4a76-a135-2a140d0618c8\") " pod="openstack/nova-cell0-conductor-db-sync-hq68q" Nov 27 16:21:59 crc kubenswrapper[4707]: I1127 16:21:59.944923 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/11d822ec-0bcb-4a76-a135-2a140d0618c8-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-hq68q\" (UID: \"11d822ec-0bcb-4a76-a135-2a140d0618c8\") " pod="openstack/nova-cell0-conductor-db-sync-hq68q" Nov 27 16:21:59 crc kubenswrapper[4707]: I1127 16:21:59.962081 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kkq4h\" (UniqueName: \"kubernetes.io/projected/11d822ec-0bcb-4a76-a135-2a140d0618c8-kube-api-access-kkq4h\") pod \"nova-cell0-conductor-db-sync-hq68q\" (UID: \"11d822ec-0bcb-4a76-a135-2a140d0618c8\") " pod="openstack/nova-cell0-conductor-db-sync-hq68q" Nov 27 16:22:00 crc kubenswrapper[4707]: I1127 16:22:00.045941 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-hq68q" Nov 27 16:22:00 crc kubenswrapper[4707]: I1127 16:22:00.204248 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"5798d94c-5634-4d92-b36d-cab9231adfc4","Type":"ContainerDied","Data":"8df54de512afb59c274207d4ddd2f1c9bfcd66a123c7c4fed094458644fc75c2"} Nov 27 16:22:00 crc kubenswrapper[4707]: I1127 16:22:00.204530 4707 scope.go:117] "RemoveContainer" containerID="62bdd03ab195ad3612cdfb77283c1c12d9f5af412bb3d68830b00c6e0c23d85c" Nov 27 16:22:00 crc kubenswrapper[4707]: I1127 16:22:00.204281 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Nov 27 16:22:00 crc kubenswrapper[4707]: I1127 16:22:00.217401 4707 generic.go:334] "Generic (PLEG): container finished" podID="96ce5955-aed6-45e9-8a8a-4cb59d3a511d" containerID="ddb0e96230fc02b7f5dd232f2685f6135a1035250566123f6bef903bf2011d33" exitCode=0 Nov 27 16:22:00 crc kubenswrapper[4707]: I1127 16:22:00.217450 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"96ce5955-aed6-45e9-8a8a-4cb59d3a511d","Type":"ContainerDied","Data":"ddb0e96230fc02b7f5dd232f2685f6135a1035250566123f6bef903bf2011d33"} Nov 27 16:22:00 crc kubenswrapper[4707]: I1127 16:22:00.234628 4707 scope.go:117] "RemoveContainer" containerID="81d2c1c2557632dd94e53405c847f196729c846144d2ec8a8ef1580865599241" Nov 27 16:22:00 crc kubenswrapper[4707]: I1127 16:22:00.250580 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Nov 27 16:22:00 crc kubenswrapper[4707]: I1127 16:22:00.264359 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Nov 27 16:22:00 crc kubenswrapper[4707]: I1127 16:22:00.277200 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] 
Nov 27 16:22:00 crc kubenswrapper[4707]: E1127 16:22:00.279863 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="917b521a-96db-4475-bf1c-af43a99c67f1" containerName="heat-cfnapi" Nov 27 16:22:00 crc kubenswrapper[4707]: I1127 16:22:00.279890 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="917b521a-96db-4475-bf1c-af43a99c67f1" containerName="heat-cfnapi" Nov 27 16:22:00 crc kubenswrapper[4707]: I1127 16:22:00.280071 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="917b521a-96db-4475-bf1c-af43a99c67f1" containerName="heat-cfnapi" Nov 27 16:22:00 crc kubenswrapper[4707]: I1127 16:22:00.280997 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Nov 27 16:22:00 crc kubenswrapper[4707]: I1127 16:22:00.283110 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Nov 27 16:22:00 crc kubenswrapper[4707]: I1127 16:22:00.283429 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Nov 27 16:22:00 crc kubenswrapper[4707]: I1127 16:22:00.295474 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Nov 27 16:22:00 crc kubenswrapper[4707]: I1127 16:22:00.449346 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f4ece211-479a-4f06-bc88-b9e50c0671f4-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"f4ece211-479a-4f06-bc88-b9e50c0671f4\") " pod="openstack/glance-default-external-api-0" Nov 27 16:22:00 crc kubenswrapper[4707]: I1127 16:22:00.449415 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cg5fn\" (UniqueName: \"kubernetes.io/projected/f4ece211-479a-4f06-bc88-b9e50c0671f4-kube-api-access-cg5fn\") pod 
\"glance-default-external-api-0\" (UID: \"f4ece211-479a-4f06-bc88-b9e50c0671f4\") " pod="openstack/glance-default-external-api-0" Nov 27 16:22:00 crc kubenswrapper[4707]: I1127 16:22:00.449457 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f4ece211-479a-4f06-bc88-b9e50c0671f4-config-data\") pod \"glance-default-external-api-0\" (UID: \"f4ece211-479a-4f06-bc88-b9e50c0671f4\") " pod="openstack/glance-default-external-api-0" Nov 27 16:22:00 crc kubenswrapper[4707]: I1127 16:22:00.449498 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f4ece211-479a-4f06-bc88-b9e50c0671f4-logs\") pod \"glance-default-external-api-0\" (UID: \"f4ece211-479a-4f06-bc88-b9e50c0671f4\") " pod="openstack/glance-default-external-api-0" Nov 27 16:22:00 crc kubenswrapper[4707]: I1127 16:22:00.449514 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f4ece211-479a-4f06-bc88-b9e50c0671f4-scripts\") pod \"glance-default-external-api-0\" (UID: \"f4ece211-479a-4f06-bc88-b9e50c0671f4\") " pod="openstack/glance-default-external-api-0" Nov 27 16:22:00 crc kubenswrapper[4707]: I1127 16:22:00.449562 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-external-api-0\" (UID: \"f4ece211-479a-4f06-bc88-b9e50c0671f4\") " pod="openstack/glance-default-external-api-0" Nov 27 16:22:00 crc kubenswrapper[4707]: I1127 16:22:00.449594 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f4ece211-479a-4f06-bc88-b9e50c0671f4-public-tls-certs\") pod 
\"glance-default-external-api-0\" (UID: \"f4ece211-479a-4f06-bc88-b9e50c0671f4\") " pod="openstack/glance-default-external-api-0" Nov 27 16:22:00 crc kubenswrapper[4707]: I1127 16:22:00.449646 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f4ece211-479a-4f06-bc88-b9e50c0671f4-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"f4ece211-479a-4f06-bc88-b9e50c0671f4\") " pod="openstack/glance-default-external-api-0" Nov 27 16:22:00 crc kubenswrapper[4707]: I1127 16:22:00.551106 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f4ece211-479a-4f06-bc88-b9e50c0671f4-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"f4ece211-479a-4f06-bc88-b9e50c0671f4\") " pod="openstack/glance-default-external-api-0" Nov 27 16:22:00 crc kubenswrapper[4707]: I1127 16:22:00.551152 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f4ece211-479a-4f06-bc88-b9e50c0671f4-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"f4ece211-479a-4f06-bc88-b9e50c0671f4\") " pod="openstack/glance-default-external-api-0" Nov 27 16:22:00 crc kubenswrapper[4707]: I1127 16:22:00.551176 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cg5fn\" (UniqueName: \"kubernetes.io/projected/f4ece211-479a-4f06-bc88-b9e50c0671f4-kube-api-access-cg5fn\") pod \"glance-default-external-api-0\" (UID: \"f4ece211-479a-4f06-bc88-b9e50c0671f4\") " pod="openstack/glance-default-external-api-0" Nov 27 16:22:00 crc kubenswrapper[4707]: I1127 16:22:00.551208 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f4ece211-479a-4f06-bc88-b9e50c0671f4-config-data\") pod 
\"glance-default-external-api-0\" (UID: \"f4ece211-479a-4f06-bc88-b9e50c0671f4\") " pod="openstack/glance-default-external-api-0" Nov 27 16:22:00 crc kubenswrapper[4707]: I1127 16:22:00.551241 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f4ece211-479a-4f06-bc88-b9e50c0671f4-logs\") pod \"glance-default-external-api-0\" (UID: \"f4ece211-479a-4f06-bc88-b9e50c0671f4\") " pod="openstack/glance-default-external-api-0" Nov 27 16:22:00 crc kubenswrapper[4707]: I1127 16:22:00.551256 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f4ece211-479a-4f06-bc88-b9e50c0671f4-scripts\") pod \"glance-default-external-api-0\" (UID: \"f4ece211-479a-4f06-bc88-b9e50c0671f4\") " pod="openstack/glance-default-external-api-0" Nov 27 16:22:00 crc kubenswrapper[4707]: I1127 16:22:00.551301 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-external-api-0\" (UID: \"f4ece211-479a-4f06-bc88-b9e50c0671f4\") " pod="openstack/glance-default-external-api-0" Nov 27 16:22:00 crc kubenswrapper[4707]: I1127 16:22:00.551337 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f4ece211-479a-4f06-bc88-b9e50c0671f4-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"f4ece211-479a-4f06-bc88-b9e50c0671f4\") " pod="openstack/glance-default-external-api-0" Nov 27 16:22:00 crc kubenswrapper[4707]: I1127 16:22:00.551711 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f4ece211-479a-4f06-bc88-b9e50c0671f4-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"f4ece211-479a-4f06-bc88-b9e50c0671f4\") " 
pod="openstack/glance-default-external-api-0" Nov 27 16:22:00 crc kubenswrapper[4707]: I1127 16:22:00.551936 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f4ece211-479a-4f06-bc88-b9e50c0671f4-logs\") pod \"glance-default-external-api-0\" (UID: \"f4ece211-479a-4f06-bc88-b9e50c0671f4\") " pod="openstack/glance-default-external-api-0" Nov 27 16:22:00 crc kubenswrapper[4707]: I1127 16:22:00.554770 4707 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-external-api-0\" (UID: \"f4ece211-479a-4f06-bc88-b9e50c0671f4\") device mount path \"/mnt/openstack/pv11\"" pod="openstack/glance-default-external-api-0" Nov 27 16:22:00 crc kubenswrapper[4707]: I1127 16:22:00.557615 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f4ece211-479a-4f06-bc88-b9e50c0671f4-scripts\") pod \"glance-default-external-api-0\" (UID: \"f4ece211-479a-4f06-bc88-b9e50c0671f4\") " pod="openstack/glance-default-external-api-0" Nov 27 16:22:00 crc kubenswrapper[4707]: I1127 16:22:00.560246 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f4ece211-479a-4f06-bc88-b9e50c0671f4-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"f4ece211-479a-4f06-bc88-b9e50c0671f4\") " pod="openstack/glance-default-external-api-0" Nov 27 16:22:00 crc kubenswrapper[4707]: I1127 16:22:00.562184 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f4ece211-479a-4f06-bc88-b9e50c0671f4-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"f4ece211-479a-4f06-bc88-b9e50c0671f4\") " pod="openstack/glance-default-external-api-0" Nov 27 16:22:00 crc kubenswrapper[4707]: I1127 
16:22:00.569409 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f4ece211-479a-4f06-bc88-b9e50c0671f4-config-data\") pod \"glance-default-external-api-0\" (UID: \"f4ece211-479a-4f06-bc88-b9e50c0671f4\") " pod="openstack/glance-default-external-api-0" Nov 27 16:22:00 crc kubenswrapper[4707]: I1127 16:22:00.575987 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cg5fn\" (UniqueName: \"kubernetes.io/projected/f4ece211-479a-4f06-bc88-b9e50c0671f4-kube-api-access-cg5fn\") pod \"glance-default-external-api-0\" (UID: \"f4ece211-479a-4f06-bc88-b9e50c0671f4\") " pod="openstack/glance-default-external-api-0" Nov 27 16:22:00 crc kubenswrapper[4707]: I1127 16:22:00.584777 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Nov 27 16:22:00 crc kubenswrapper[4707]: I1127 16:22:00.590841 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-external-api-0\" (UID: \"f4ece211-479a-4f06-bc88-b9e50c0671f4\") " pod="openstack/glance-default-external-api-0" Nov 27 16:22:00 crc kubenswrapper[4707]: I1127 16:22:00.598910 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Nov 27 16:22:00 crc kubenswrapper[4707]: I1127 16:22:00.756983 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/96ce5955-aed6-45e9-8a8a-4cb59d3a511d-scripts\") pod \"96ce5955-aed6-45e9-8a8a-4cb59d3a511d\" (UID: \"96ce5955-aed6-45e9-8a8a-4cb59d3a511d\") " Nov 27 16:22:00 crc kubenswrapper[4707]: I1127 16:22:00.757020 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"96ce5955-aed6-45e9-8a8a-4cb59d3a511d\" (UID: \"96ce5955-aed6-45e9-8a8a-4cb59d3a511d\") " Nov 27 16:22:00 crc kubenswrapper[4707]: I1127 16:22:00.757090 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/96ce5955-aed6-45e9-8a8a-4cb59d3a511d-logs\") pod \"96ce5955-aed6-45e9-8a8a-4cb59d3a511d\" (UID: \"96ce5955-aed6-45e9-8a8a-4cb59d3a511d\") " Nov 27 16:22:00 crc kubenswrapper[4707]: I1127 16:22:00.757129 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/96ce5955-aed6-45e9-8a8a-4cb59d3a511d-httpd-run\") pod \"96ce5955-aed6-45e9-8a8a-4cb59d3a511d\" (UID: \"96ce5955-aed6-45e9-8a8a-4cb59d3a511d\") " Nov 27 16:22:00 crc kubenswrapper[4707]: I1127 16:22:00.757168 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/96ce5955-aed6-45e9-8a8a-4cb59d3a511d-config-data\") pod \"96ce5955-aed6-45e9-8a8a-4cb59d3a511d\" (UID: \"96ce5955-aed6-45e9-8a8a-4cb59d3a511d\") " Nov 27 16:22:00 crc kubenswrapper[4707]: I1127 16:22:00.757225 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-844vh\" (UniqueName: 
\"kubernetes.io/projected/96ce5955-aed6-45e9-8a8a-4cb59d3a511d-kube-api-access-844vh\") pod \"96ce5955-aed6-45e9-8a8a-4cb59d3a511d\" (UID: \"96ce5955-aed6-45e9-8a8a-4cb59d3a511d\") " Nov 27 16:22:00 crc kubenswrapper[4707]: I1127 16:22:00.757256 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/96ce5955-aed6-45e9-8a8a-4cb59d3a511d-combined-ca-bundle\") pod \"96ce5955-aed6-45e9-8a8a-4cb59d3a511d\" (UID: \"96ce5955-aed6-45e9-8a8a-4cb59d3a511d\") " Nov 27 16:22:00 crc kubenswrapper[4707]: I1127 16:22:00.757276 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/96ce5955-aed6-45e9-8a8a-4cb59d3a511d-internal-tls-certs\") pod \"96ce5955-aed6-45e9-8a8a-4cb59d3a511d\" (UID: \"96ce5955-aed6-45e9-8a8a-4cb59d3a511d\") " Nov 27 16:22:00 crc kubenswrapper[4707]: I1127 16:22:00.758235 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/96ce5955-aed6-45e9-8a8a-4cb59d3a511d-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "96ce5955-aed6-45e9-8a8a-4cb59d3a511d" (UID: "96ce5955-aed6-45e9-8a8a-4cb59d3a511d"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 27 16:22:00 crc kubenswrapper[4707]: I1127 16:22:00.761832 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96ce5955-aed6-45e9-8a8a-4cb59d3a511d-scripts" (OuterVolumeSpecName: "scripts") pod "96ce5955-aed6-45e9-8a8a-4cb59d3a511d" (UID: "96ce5955-aed6-45e9-8a8a-4cb59d3a511d"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 16:22:00 crc kubenswrapper[4707]: I1127 16:22:00.762710 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/96ce5955-aed6-45e9-8a8a-4cb59d3a511d-logs" (OuterVolumeSpecName: "logs") pod "96ce5955-aed6-45e9-8a8a-4cb59d3a511d" (UID: "96ce5955-aed6-45e9-8a8a-4cb59d3a511d"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 27 16:22:00 crc kubenswrapper[4707]: I1127 16:22:00.771085 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-hq68q"] Nov 27 16:22:00 crc kubenswrapper[4707]: I1127 16:22:00.774550 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage10-crc" (OuterVolumeSpecName: "glance") pod "96ce5955-aed6-45e9-8a8a-4cb59d3a511d" (UID: "96ce5955-aed6-45e9-8a8a-4cb59d3a511d"). InnerVolumeSpecName "local-storage10-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Nov 27 16:22:00 crc kubenswrapper[4707]: I1127 16:22:00.774601 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96ce5955-aed6-45e9-8a8a-4cb59d3a511d-kube-api-access-844vh" (OuterVolumeSpecName: "kube-api-access-844vh") pod "96ce5955-aed6-45e9-8a8a-4cb59d3a511d" (UID: "96ce5955-aed6-45e9-8a8a-4cb59d3a511d"). InnerVolumeSpecName "kube-api-access-844vh". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 16:22:00 crc kubenswrapper[4707]: I1127 16:22:00.815580 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Nov 27 16:22:00 crc kubenswrapper[4707]: I1127 16:22:00.820461 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96ce5955-aed6-45e9-8a8a-4cb59d3a511d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "96ce5955-aed6-45e9-8a8a-4cb59d3a511d" (UID: "96ce5955-aed6-45e9-8a8a-4cb59d3a511d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 16:22:00 crc kubenswrapper[4707]: I1127 16:22:00.840521 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96ce5955-aed6-45e9-8a8a-4cb59d3a511d-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "96ce5955-aed6-45e9-8a8a-4cb59d3a511d" (UID: "96ce5955-aed6-45e9-8a8a-4cb59d3a511d"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 16:22:00 crc kubenswrapper[4707]: I1127 16:22:00.859759 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/96ce5955-aed6-45e9-8a8a-4cb59d3a511d-scripts\") on node \"crc\" DevicePath \"\"" Nov 27 16:22:00 crc kubenswrapper[4707]: I1127 16:22:00.859810 4707 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" " Nov 27 16:22:00 crc kubenswrapper[4707]: I1127 16:22:00.859821 4707 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/96ce5955-aed6-45e9-8a8a-4cb59d3a511d-logs\") on node \"crc\" DevicePath \"\"" Nov 27 16:22:00 crc kubenswrapper[4707]: I1127 16:22:00.859830 4707 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/96ce5955-aed6-45e9-8a8a-4cb59d3a511d-httpd-run\") on node \"crc\" DevicePath \"\"" Nov 27 16:22:00 crc 
kubenswrapper[4707]: I1127 16:22:00.859840 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-844vh\" (UniqueName: \"kubernetes.io/projected/96ce5955-aed6-45e9-8a8a-4cb59d3a511d-kube-api-access-844vh\") on node \"crc\" DevicePath \"\"" Nov 27 16:22:00 crc kubenswrapper[4707]: I1127 16:22:00.859850 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/96ce5955-aed6-45e9-8a8a-4cb59d3a511d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 27 16:22:00 crc kubenswrapper[4707]: I1127 16:22:00.859858 4707 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/96ce5955-aed6-45e9-8a8a-4cb59d3a511d-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Nov 27 16:22:00 crc kubenswrapper[4707]: I1127 16:22:00.864956 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96ce5955-aed6-45e9-8a8a-4cb59d3a511d-config-data" (OuterVolumeSpecName: "config-data") pod "96ce5955-aed6-45e9-8a8a-4cb59d3a511d" (UID: "96ce5955-aed6-45e9-8a8a-4cb59d3a511d"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 16:22:00 crc kubenswrapper[4707]: I1127 16:22:00.889328 4707 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage10-crc" (UniqueName: "kubernetes.io/local-volume/local-storage10-crc") on node "crc" Nov 27 16:22:00 crc kubenswrapper[4707]: I1127 16:22:00.960599 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ca422061-61e0-4b99-91ed-adb502839a46-sg-core-conf-yaml\") pod \"ca422061-61e0-4b99-91ed-adb502839a46\" (UID: \"ca422061-61e0-4b99-91ed-adb502839a46\") " Nov 27 16:22:00 crc kubenswrapper[4707]: I1127 16:22:00.960677 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dq89q\" (UniqueName: \"kubernetes.io/projected/ca422061-61e0-4b99-91ed-adb502839a46-kube-api-access-dq89q\") pod \"ca422061-61e0-4b99-91ed-adb502839a46\" (UID: \"ca422061-61e0-4b99-91ed-adb502839a46\") " Nov 27 16:22:00 crc kubenswrapper[4707]: I1127 16:22:00.960719 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ca422061-61e0-4b99-91ed-adb502839a46-run-httpd\") pod \"ca422061-61e0-4b99-91ed-adb502839a46\" (UID: \"ca422061-61e0-4b99-91ed-adb502839a46\") " Nov 27 16:22:00 crc kubenswrapper[4707]: I1127 16:22:00.960741 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ca422061-61e0-4b99-91ed-adb502839a46-scripts\") pod \"ca422061-61e0-4b99-91ed-adb502839a46\" (UID: \"ca422061-61e0-4b99-91ed-adb502839a46\") " Nov 27 16:22:00 crc kubenswrapper[4707]: I1127 16:22:00.960811 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ca422061-61e0-4b99-91ed-adb502839a46-log-httpd\") pod \"ca422061-61e0-4b99-91ed-adb502839a46\" (UID: 
\"ca422061-61e0-4b99-91ed-adb502839a46\") " Nov 27 16:22:00 crc kubenswrapper[4707]: I1127 16:22:00.960848 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ca422061-61e0-4b99-91ed-adb502839a46-config-data\") pod \"ca422061-61e0-4b99-91ed-adb502839a46\" (UID: \"ca422061-61e0-4b99-91ed-adb502839a46\") " Nov 27 16:22:00 crc kubenswrapper[4707]: I1127 16:22:00.960904 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ca422061-61e0-4b99-91ed-adb502839a46-combined-ca-bundle\") pod \"ca422061-61e0-4b99-91ed-adb502839a46\" (UID: \"ca422061-61e0-4b99-91ed-adb502839a46\") " Nov 27 16:22:00 crc kubenswrapper[4707]: I1127 16:22:00.961536 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ca422061-61e0-4b99-91ed-adb502839a46-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "ca422061-61e0-4b99-91ed-adb502839a46" (UID: "ca422061-61e0-4b99-91ed-adb502839a46"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 27 16:22:00 crc kubenswrapper[4707]: I1127 16:22:00.961761 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ca422061-61e0-4b99-91ed-adb502839a46-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "ca422061-61e0-4b99-91ed-adb502839a46" (UID: "ca422061-61e0-4b99-91ed-adb502839a46"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 27 16:22:00 crc kubenswrapper[4707]: I1127 16:22:00.962264 4707 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ca422061-61e0-4b99-91ed-adb502839a46-log-httpd\") on node \"crc\" DevicePath \"\"" Nov 27 16:22:00 crc kubenswrapper[4707]: I1127 16:22:00.962281 4707 reconciler_common.go:293] "Volume detached for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" DevicePath \"\"" Nov 27 16:22:00 crc kubenswrapper[4707]: I1127 16:22:00.962290 4707 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ca422061-61e0-4b99-91ed-adb502839a46-run-httpd\") on node \"crc\" DevicePath \"\"" Nov 27 16:22:00 crc kubenswrapper[4707]: I1127 16:22:00.962298 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/96ce5955-aed6-45e9-8a8a-4cb59d3a511d-config-data\") on node \"crc\" DevicePath \"\"" Nov 27 16:22:00 crc kubenswrapper[4707]: I1127 16:22:00.967510 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ca422061-61e0-4b99-91ed-adb502839a46-kube-api-access-dq89q" (OuterVolumeSpecName: "kube-api-access-dq89q") pod "ca422061-61e0-4b99-91ed-adb502839a46" (UID: "ca422061-61e0-4b99-91ed-adb502839a46"). InnerVolumeSpecName "kube-api-access-dq89q". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 16:22:00 crc kubenswrapper[4707]: I1127 16:22:00.967525 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ca422061-61e0-4b99-91ed-adb502839a46-scripts" (OuterVolumeSpecName: "scripts") pod "ca422061-61e0-4b99-91ed-adb502839a46" (UID: "ca422061-61e0-4b99-91ed-adb502839a46"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 16:22:00 crc kubenswrapper[4707]: I1127 16:22:00.996980 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ca422061-61e0-4b99-91ed-adb502839a46-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "ca422061-61e0-4b99-91ed-adb502839a46" (UID: "ca422061-61e0-4b99-91ed-adb502839a46"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 16:22:01 crc kubenswrapper[4707]: I1127 16:22:01.064681 4707 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ca422061-61e0-4b99-91ed-adb502839a46-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Nov 27 16:22:01 crc kubenswrapper[4707]: I1127 16:22:01.064706 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dq89q\" (UniqueName: \"kubernetes.io/projected/ca422061-61e0-4b99-91ed-adb502839a46-kube-api-access-dq89q\") on node \"crc\" DevicePath \"\"" Nov 27 16:22:01 crc kubenswrapper[4707]: I1127 16:22:01.064718 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ca422061-61e0-4b99-91ed-adb502839a46-scripts\") on node \"crc\" DevicePath \"\"" Nov 27 16:22:01 crc kubenswrapper[4707]: I1127 16:22:01.068721 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ca422061-61e0-4b99-91ed-adb502839a46-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ca422061-61e0-4b99-91ed-adb502839a46" (UID: "ca422061-61e0-4b99-91ed-adb502839a46"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 16:22:01 crc kubenswrapper[4707]: I1127 16:22:01.090206 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ca422061-61e0-4b99-91ed-adb502839a46-config-data" (OuterVolumeSpecName: "config-data") pod "ca422061-61e0-4b99-91ed-adb502839a46" (UID: "ca422061-61e0-4b99-91ed-adb502839a46"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 16:22:01 crc kubenswrapper[4707]: I1127 16:22:01.168728 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Nov 27 16:22:01 crc kubenswrapper[4707]: I1127 16:22:01.178887 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ca422061-61e0-4b99-91ed-adb502839a46-config-data\") on node \"crc\" DevicePath \"\"" Nov 27 16:22:01 crc kubenswrapper[4707]: I1127 16:22:01.179136 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ca422061-61e0-4b99-91ed-adb502839a46-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 27 16:22:01 crc kubenswrapper[4707]: I1127 16:22:01.223849 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5798d94c-5634-4d92-b36d-cab9231adfc4" path="/var/lib/kubelet/pods/5798d94c-5634-4d92-b36d-cab9231adfc4/volumes" Nov 27 16:22:01 crc kubenswrapper[4707]: I1127 16:22:01.235559 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-hq68q" event={"ID":"11d822ec-0bcb-4a76-a135-2a140d0618c8","Type":"ContainerStarted","Data":"9460bcf743221eb0e283e9efee047d6996dfab14e0899de5e4467cb2f7642f57"} Nov 27 16:22:01 crc kubenswrapper[4707]: I1127 16:22:01.258355 4707 generic.go:334] "Generic (PLEG): container finished" podID="ca422061-61e0-4b99-91ed-adb502839a46" containerID="5a943b848246e7a0e26df29e9e5872c0e4b1e43f0a4c8ae9352314222b2203bb" 
exitCode=0 Nov 27 16:22:01 crc kubenswrapper[4707]: I1127 16:22:01.258638 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ca422061-61e0-4b99-91ed-adb502839a46","Type":"ContainerDied","Data":"5a943b848246e7a0e26df29e9e5872c0e4b1e43f0a4c8ae9352314222b2203bb"} Nov 27 16:22:01 crc kubenswrapper[4707]: I1127 16:22:01.258647 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Nov 27 16:22:01 crc kubenswrapper[4707]: I1127 16:22:01.258667 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ca422061-61e0-4b99-91ed-adb502839a46","Type":"ContainerDied","Data":"2a3eee1986799b8409ffbeac2724585f05a7c6cef9156be8dc5a73b9afb7f8cf"} Nov 27 16:22:01 crc kubenswrapper[4707]: I1127 16:22:01.258758 4707 scope.go:117] "RemoveContainer" containerID="e23728f27cef2753b5c7982ef74fe10b59be7f16d96d5d83e712c1842efaeceb" Nov 27 16:22:01 crc kubenswrapper[4707]: I1127 16:22:01.270068 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"96ce5955-aed6-45e9-8a8a-4cb59d3a511d","Type":"ContainerDied","Data":"a9c5c5b6863ea3e3e0c3efd9b28f1d4bda63f7c1ea4fae2d6fa6be9fa25684e5"} Nov 27 16:22:01 crc kubenswrapper[4707]: I1127 16:22:01.270152 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Nov 27 16:22:01 crc kubenswrapper[4707]: I1127 16:22:01.271813 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"f4ece211-479a-4f06-bc88-b9e50c0671f4","Type":"ContainerStarted","Data":"4b5e6f6daea5a5b4da06f292f8ad18c8b9ce1ee5a9887b3daa1ddb49fe643bd2"} Nov 27 16:22:01 crc kubenswrapper[4707]: I1127 16:22:01.288460 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Nov 27 16:22:01 crc kubenswrapper[4707]: I1127 16:22:01.301340 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Nov 27 16:22:01 crc kubenswrapper[4707]: I1127 16:22:01.315681 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Nov 27 16:22:01 crc kubenswrapper[4707]: I1127 16:22:01.323755 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Nov 27 16:22:01 crc kubenswrapper[4707]: E1127 16:22:01.324299 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="96ce5955-aed6-45e9-8a8a-4cb59d3a511d" containerName="glance-httpd" Nov 27 16:22:01 crc kubenswrapper[4707]: I1127 16:22:01.324321 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="96ce5955-aed6-45e9-8a8a-4cb59d3a511d" containerName="glance-httpd" Nov 27 16:22:01 crc kubenswrapper[4707]: E1127 16:22:01.324337 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="96ce5955-aed6-45e9-8a8a-4cb59d3a511d" containerName="glance-log" Nov 27 16:22:01 crc kubenswrapper[4707]: I1127 16:22:01.324347 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="96ce5955-aed6-45e9-8a8a-4cb59d3a511d" containerName="glance-log" Nov 27 16:22:01 crc kubenswrapper[4707]: E1127 16:22:01.324367 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ca422061-61e0-4b99-91ed-adb502839a46" containerName="proxy-httpd" Nov 27 16:22:01 crc 
kubenswrapper[4707]: I1127 16:22:01.324389 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="ca422061-61e0-4b99-91ed-adb502839a46" containerName="proxy-httpd" Nov 27 16:22:01 crc kubenswrapper[4707]: E1127 16:22:01.324418 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ca422061-61e0-4b99-91ed-adb502839a46" containerName="ceilometer-notification-agent" Nov 27 16:22:01 crc kubenswrapper[4707]: I1127 16:22:01.324428 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="ca422061-61e0-4b99-91ed-adb502839a46" containerName="ceilometer-notification-agent" Nov 27 16:22:01 crc kubenswrapper[4707]: E1127 16:22:01.324448 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ca422061-61e0-4b99-91ed-adb502839a46" containerName="ceilometer-central-agent" Nov 27 16:22:01 crc kubenswrapper[4707]: I1127 16:22:01.324455 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="ca422061-61e0-4b99-91ed-adb502839a46" containerName="ceilometer-central-agent" Nov 27 16:22:01 crc kubenswrapper[4707]: E1127 16:22:01.324466 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ca422061-61e0-4b99-91ed-adb502839a46" containerName="sg-core" Nov 27 16:22:01 crc kubenswrapper[4707]: I1127 16:22:01.324473 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="ca422061-61e0-4b99-91ed-adb502839a46" containerName="sg-core" Nov 27 16:22:01 crc kubenswrapper[4707]: I1127 16:22:01.324671 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="ca422061-61e0-4b99-91ed-adb502839a46" containerName="ceilometer-central-agent" Nov 27 16:22:01 crc kubenswrapper[4707]: I1127 16:22:01.324688 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="ca422061-61e0-4b99-91ed-adb502839a46" containerName="sg-core" Nov 27 16:22:01 crc kubenswrapper[4707]: I1127 16:22:01.324700 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="96ce5955-aed6-45e9-8a8a-4cb59d3a511d" containerName="glance-httpd" Nov 
27 16:22:01 crc kubenswrapper[4707]: I1127 16:22:01.324723 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="ca422061-61e0-4b99-91ed-adb502839a46" containerName="ceilometer-notification-agent" Nov 27 16:22:01 crc kubenswrapper[4707]: I1127 16:22:01.324739 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="96ce5955-aed6-45e9-8a8a-4cb59d3a511d" containerName="glance-log" Nov 27 16:22:01 crc kubenswrapper[4707]: I1127 16:22:01.324749 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="ca422061-61e0-4b99-91ed-adb502839a46" containerName="proxy-httpd" Nov 27 16:22:01 crc kubenswrapper[4707]: I1127 16:22:01.326717 4707 scope.go:117] "RemoveContainer" containerID="285c89a886d69335891e6811cacb8bbd0ea29959a376355390ef8acfbeb10586" Nov 27 16:22:01 crc kubenswrapper[4707]: I1127 16:22:01.326832 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Nov 27 16:22:01 crc kubenswrapper[4707]: I1127 16:22:01.329335 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Nov 27 16:22:01 crc kubenswrapper[4707]: I1127 16:22:01.329596 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Nov 27 16:22:01 crc kubenswrapper[4707]: I1127 16:22:01.338725 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Nov 27 16:22:01 crc kubenswrapper[4707]: I1127 16:22:01.356384 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Nov 27 16:22:01 crc kubenswrapper[4707]: I1127 16:22:01.371378 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Nov 27 16:22:01 crc kubenswrapper[4707]: I1127 16:22:01.373052 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Nov 27 16:22:01 crc kubenswrapper[4707]: I1127 16:22:01.377986 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Nov 27 16:22:01 crc kubenswrapper[4707]: I1127 16:22:01.381894 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Nov 27 16:22:01 crc kubenswrapper[4707]: I1127 16:22:01.388838 4707 scope.go:117] "RemoveContainer" containerID="e557ec3cc44b41de2c50306eb79513fee36c41d79181960ecae36b25ee171e6d" Nov 27 16:22:01 crc kubenswrapper[4707]: I1127 16:22:01.411082 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Nov 27 16:22:01 crc kubenswrapper[4707]: I1127 16:22:01.419052 4707 scope.go:117] "RemoveContainer" containerID="5a943b848246e7a0e26df29e9e5872c0e4b1e43f0a4c8ae9352314222b2203bb" Nov 27 16:22:01 crc kubenswrapper[4707]: I1127 16:22:01.451603 4707 scope.go:117] "RemoveContainer" containerID="e23728f27cef2753b5c7982ef74fe10b59be7f16d96d5d83e712c1842efaeceb" Nov 27 16:22:01 crc kubenswrapper[4707]: E1127 16:22:01.452206 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e23728f27cef2753b5c7982ef74fe10b59be7f16d96d5d83e712c1842efaeceb\": container with ID starting with e23728f27cef2753b5c7982ef74fe10b59be7f16d96d5d83e712c1842efaeceb not found: ID does not exist" containerID="e23728f27cef2753b5c7982ef74fe10b59be7f16d96d5d83e712c1842efaeceb" Nov 27 16:22:01 crc kubenswrapper[4707]: I1127 16:22:01.452236 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e23728f27cef2753b5c7982ef74fe10b59be7f16d96d5d83e712c1842efaeceb"} err="failed to get container status \"e23728f27cef2753b5c7982ef74fe10b59be7f16d96d5d83e712c1842efaeceb\": rpc error: code = NotFound desc = could not find container 
\"e23728f27cef2753b5c7982ef74fe10b59be7f16d96d5d83e712c1842efaeceb\": container with ID starting with e23728f27cef2753b5c7982ef74fe10b59be7f16d96d5d83e712c1842efaeceb not found: ID does not exist" Nov 27 16:22:01 crc kubenswrapper[4707]: I1127 16:22:01.452254 4707 scope.go:117] "RemoveContainer" containerID="285c89a886d69335891e6811cacb8bbd0ea29959a376355390ef8acfbeb10586" Nov 27 16:22:01 crc kubenswrapper[4707]: E1127 16:22:01.456990 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"285c89a886d69335891e6811cacb8bbd0ea29959a376355390ef8acfbeb10586\": container with ID starting with 285c89a886d69335891e6811cacb8bbd0ea29959a376355390ef8acfbeb10586 not found: ID does not exist" containerID="285c89a886d69335891e6811cacb8bbd0ea29959a376355390ef8acfbeb10586" Nov 27 16:22:01 crc kubenswrapper[4707]: I1127 16:22:01.457014 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"285c89a886d69335891e6811cacb8bbd0ea29959a376355390ef8acfbeb10586"} err="failed to get container status \"285c89a886d69335891e6811cacb8bbd0ea29959a376355390ef8acfbeb10586\": rpc error: code = NotFound desc = could not find container \"285c89a886d69335891e6811cacb8bbd0ea29959a376355390ef8acfbeb10586\": container with ID starting with 285c89a886d69335891e6811cacb8bbd0ea29959a376355390ef8acfbeb10586 not found: ID does not exist" Nov 27 16:22:01 crc kubenswrapper[4707]: I1127 16:22:01.457028 4707 scope.go:117] "RemoveContainer" containerID="e557ec3cc44b41de2c50306eb79513fee36c41d79181960ecae36b25ee171e6d" Nov 27 16:22:01 crc kubenswrapper[4707]: E1127 16:22:01.457278 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e557ec3cc44b41de2c50306eb79513fee36c41d79181960ecae36b25ee171e6d\": container with ID starting with e557ec3cc44b41de2c50306eb79513fee36c41d79181960ecae36b25ee171e6d not found: ID does not exist" 
containerID="e557ec3cc44b41de2c50306eb79513fee36c41d79181960ecae36b25ee171e6d" Nov 27 16:22:01 crc kubenswrapper[4707]: I1127 16:22:01.457302 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e557ec3cc44b41de2c50306eb79513fee36c41d79181960ecae36b25ee171e6d"} err="failed to get container status \"e557ec3cc44b41de2c50306eb79513fee36c41d79181960ecae36b25ee171e6d\": rpc error: code = NotFound desc = could not find container \"e557ec3cc44b41de2c50306eb79513fee36c41d79181960ecae36b25ee171e6d\": container with ID starting with e557ec3cc44b41de2c50306eb79513fee36c41d79181960ecae36b25ee171e6d not found: ID does not exist" Nov 27 16:22:01 crc kubenswrapper[4707]: I1127 16:22:01.457315 4707 scope.go:117] "RemoveContainer" containerID="5a943b848246e7a0e26df29e9e5872c0e4b1e43f0a4c8ae9352314222b2203bb" Nov 27 16:22:01 crc kubenswrapper[4707]: E1127 16:22:01.457540 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5a943b848246e7a0e26df29e9e5872c0e4b1e43f0a4c8ae9352314222b2203bb\": container with ID starting with 5a943b848246e7a0e26df29e9e5872c0e4b1e43f0a4c8ae9352314222b2203bb not found: ID does not exist" containerID="5a943b848246e7a0e26df29e9e5872c0e4b1e43f0a4c8ae9352314222b2203bb" Nov 27 16:22:01 crc kubenswrapper[4707]: I1127 16:22:01.457560 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5a943b848246e7a0e26df29e9e5872c0e4b1e43f0a4c8ae9352314222b2203bb"} err="failed to get container status \"5a943b848246e7a0e26df29e9e5872c0e4b1e43f0a4c8ae9352314222b2203bb\": rpc error: code = NotFound desc = could not find container \"5a943b848246e7a0e26df29e9e5872c0e4b1e43f0a4c8ae9352314222b2203bb\": container with ID starting with 5a943b848246e7a0e26df29e9e5872c0e4b1e43f0a4c8ae9352314222b2203bb not found: ID does not exist" Nov 27 16:22:01 crc kubenswrapper[4707]: I1127 16:22:01.457573 4707 scope.go:117] 
"RemoveContainer" containerID="ddb0e96230fc02b7f5dd232f2685f6135a1035250566123f6bef903bf2011d33" Nov 27 16:22:01 crc kubenswrapper[4707]: I1127 16:22:01.486454 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jgd5n\" (UniqueName: \"kubernetes.io/projected/1b04d427-3430-4875-9ba8-859ea0b820ad-kube-api-access-jgd5n\") pod \"ceilometer-0\" (UID: \"1b04d427-3430-4875-9ba8-859ea0b820ad\") " pod="openstack/ceilometer-0" Nov 27 16:22:01 crc kubenswrapper[4707]: I1127 16:22:01.486492 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/24849992-202f-4439-ac0d-241724235be4-config-data\") pod \"glance-default-internal-api-0\" (UID: \"24849992-202f-4439-ac0d-241724235be4\") " pod="openstack/glance-default-internal-api-0" Nov 27 16:22:01 crc kubenswrapper[4707]: I1127 16:22:01.486521 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-internal-api-0\" (UID: \"24849992-202f-4439-ac0d-241724235be4\") " pod="openstack/glance-default-internal-api-0" Nov 27 16:22:01 crc kubenswrapper[4707]: I1127 16:22:01.486591 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1b04d427-3430-4875-9ba8-859ea0b820ad-run-httpd\") pod \"ceilometer-0\" (UID: \"1b04d427-3430-4875-9ba8-859ea0b820ad\") " pod="openstack/ceilometer-0" Nov 27 16:22:01 crc kubenswrapper[4707]: I1127 16:22:01.486607 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/24849992-202f-4439-ac0d-241724235be4-scripts\") pod \"glance-default-internal-api-0\" (UID: \"24849992-202f-4439-ac0d-241724235be4\") " 
pod="openstack/glance-default-internal-api-0" Nov 27 16:22:01 crc kubenswrapper[4707]: I1127 16:22:01.486625 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/24849992-202f-4439-ac0d-241724235be4-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"24849992-202f-4439-ac0d-241724235be4\") " pod="openstack/glance-default-internal-api-0" Nov 27 16:22:01 crc kubenswrapper[4707]: I1127 16:22:01.486652 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/24849992-202f-4439-ac0d-241724235be4-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"24849992-202f-4439-ac0d-241724235be4\") " pod="openstack/glance-default-internal-api-0" Nov 27 16:22:01 crc kubenswrapper[4707]: I1127 16:22:01.486674 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tm8jq\" (UniqueName: \"kubernetes.io/projected/24849992-202f-4439-ac0d-241724235be4-kube-api-access-tm8jq\") pod \"glance-default-internal-api-0\" (UID: \"24849992-202f-4439-ac0d-241724235be4\") " pod="openstack/glance-default-internal-api-0" Nov 27 16:22:01 crc kubenswrapper[4707]: I1127 16:22:01.486712 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1b04d427-3430-4875-9ba8-859ea0b820ad-log-httpd\") pod \"ceilometer-0\" (UID: \"1b04d427-3430-4875-9ba8-859ea0b820ad\") " pod="openstack/ceilometer-0" Nov 27 16:22:01 crc kubenswrapper[4707]: I1127 16:22:01.486738 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1b04d427-3430-4875-9ba8-859ea0b820ad-config-data\") pod \"ceilometer-0\" (UID: \"1b04d427-3430-4875-9ba8-859ea0b820ad\") " 
pod="openstack/ceilometer-0" Nov 27 16:22:01 crc kubenswrapper[4707]: I1127 16:22:01.486759 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1b04d427-3430-4875-9ba8-859ea0b820ad-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"1b04d427-3430-4875-9ba8-859ea0b820ad\") " pod="openstack/ceilometer-0" Nov 27 16:22:01 crc kubenswrapper[4707]: I1127 16:22:01.486795 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1b04d427-3430-4875-9ba8-859ea0b820ad-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"1b04d427-3430-4875-9ba8-859ea0b820ad\") " pod="openstack/ceilometer-0" Nov 27 16:22:01 crc kubenswrapper[4707]: I1127 16:22:01.486966 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1b04d427-3430-4875-9ba8-859ea0b820ad-scripts\") pod \"ceilometer-0\" (UID: \"1b04d427-3430-4875-9ba8-859ea0b820ad\") " pod="openstack/ceilometer-0" Nov 27 16:22:01 crc kubenswrapper[4707]: I1127 16:22:01.487015 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/24849992-202f-4439-ac0d-241724235be4-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"24849992-202f-4439-ac0d-241724235be4\") " pod="openstack/glance-default-internal-api-0" Nov 27 16:22:01 crc kubenswrapper[4707]: I1127 16:22:01.487085 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/24849992-202f-4439-ac0d-241724235be4-logs\") pod \"glance-default-internal-api-0\" (UID: \"24849992-202f-4439-ac0d-241724235be4\") " pod="openstack/glance-default-internal-api-0" Nov 27 16:22:01 crc kubenswrapper[4707]: I1127 
16:22:01.523596 4707 scope.go:117] "RemoveContainer" containerID="a2457030a9961af5854c83e5a20f52d3a0a20b2683fc3cd24f97b345bd69a877" Nov 27 16:22:01 crc kubenswrapper[4707]: I1127 16:22:01.589697 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1b04d427-3430-4875-9ba8-859ea0b820ad-log-httpd\") pod \"ceilometer-0\" (UID: \"1b04d427-3430-4875-9ba8-859ea0b820ad\") " pod="openstack/ceilometer-0" Nov 27 16:22:01 crc kubenswrapper[4707]: I1127 16:22:01.589753 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1b04d427-3430-4875-9ba8-859ea0b820ad-config-data\") pod \"ceilometer-0\" (UID: \"1b04d427-3430-4875-9ba8-859ea0b820ad\") " pod="openstack/ceilometer-0" Nov 27 16:22:01 crc kubenswrapper[4707]: I1127 16:22:01.589780 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1b04d427-3430-4875-9ba8-859ea0b820ad-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"1b04d427-3430-4875-9ba8-859ea0b820ad\") " pod="openstack/ceilometer-0" Nov 27 16:22:01 crc kubenswrapper[4707]: I1127 16:22:01.589815 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1b04d427-3430-4875-9ba8-859ea0b820ad-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"1b04d427-3430-4875-9ba8-859ea0b820ad\") " pod="openstack/ceilometer-0" Nov 27 16:22:01 crc kubenswrapper[4707]: I1127 16:22:01.589848 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1b04d427-3430-4875-9ba8-859ea0b820ad-scripts\") pod \"ceilometer-0\" (UID: \"1b04d427-3430-4875-9ba8-859ea0b820ad\") " pod="openstack/ceilometer-0" Nov 27 16:22:01 crc kubenswrapper[4707]: I1127 16:22:01.589870 4707 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/24849992-202f-4439-ac0d-241724235be4-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"24849992-202f-4439-ac0d-241724235be4\") " pod="openstack/glance-default-internal-api-0" Nov 27 16:22:01 crc kubenswrapper[4707]: I1127 16:22:01.589896 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/24849992-202f-4439-ac0d-241724235be4-logs\") pod \"glance-default-internal-api-0\" (UID: \"24849992-202f-4439-ac0d-241724235be4\") " pod="openstack/glance-default-internal-api-0" Nov 27 16:22:01 crc kubenswrapper[4707]: I1127 16:22:01.589940 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/24849992-202f-4439-ac0d-241724235be4-config-data\") pod \"glance-default-internal-api-0\" (UID: \"24849992-202f-4439-ac0d-241724235be4\") " pod="openstack/glance-default-internal-api-0" Nov 27 16:22:01 crc kubenswrapper[4707]: I1127 16:22:01.589955 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jgd5n\" (UniqueName: \"kubernetes.io/projected/1b04d427-3430-4875-9ba8-859ea0b820ad-kube-api-access-jgd5n\") pod \"ceilometer-0\" (UID: \"1b04d427-3430-4875-9ba8-859ea0b820ad\") " pod="openstack/ceilometer-0" Nov 27 16:22:01 crc kubenswrapper[4707]: I1127 16:22:01.589974 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-internal-api-0\" (UID: \"24849992-202f-4439-ac0d-241724235be4\") " pod="openstack/glance-default-internal-api-0" Nov 27 16:22:01 crc kubenswrapper[4707]: I1127 16:22:01.589997 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/1b04d427-3430-4875-9ba8-859ea0b820ad-run-httpd\") pod \"ceilometer-0\" (UID: \"1b04d427-3430-4875-9ba8-859ea0b820ad\") " pod="openstack/ceilometer-0" Nov 27 16:22:01 crc kubenswrapper[4707]: I1127 16:22:01.590014 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/24849992-202f-4439-ac0d-241724235be4-scripts\") pod \"glance-default-internal-api-0\" (UID: \"24849992-202f-4439-ac0d-241724235be4\") " pod="openstack/glance-default-internal-api-0" Nov 27 16:22:01 crc kubenswrapper[4707]: I1127 16:22:01.590031 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/24849992-202f-4439-ac0d-241724235be4-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"24849992-202f-4439-ac0d-241724235be4\") " pod="openstack/glance-default-internal-api-0" Nov 27 16:22:01 crc kubenswrapper[4707]: I1127 16:22:01.590057 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/24849992-202f-4439-ac0d-241724235be4-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"24849992-202f-4439-ac0d-241724235be4\") " pod="openstack/glance-default-internal-api-0" Nov 27 16:22:01 crc kubenswrapper[4707]: I1127 16:22:01.590077 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tm8jq\" (UniqueName: \"kubernetes.io/projected/24849992-202f-4439-ac0d-241724235be4-kube-api-access-tm8jq\") pod \"glance-default-internal-api-0\" (UID: \"24849992-202f-4439-ac0d-241724235be4\") " pod="openstack/glance-default-internal-api-0" Nov 27 16:22:01 crc kubenswrapper[4707]: I1127 16:22:01.590269 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1b04d427-3430-4875-9ba8-859ea0b820ad-log-httpd\") pod \"ceilometer-0\" (UID: 
\"1b04d427-3430-4875-9ba8-859ea0b820ad\") " pod="openstack/ceilometer-0" Nov 27 16:22:01 crc kubenswrapper[4707]: I1127 16:22:01.590790 4707 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-internal-api-0\" (UID: \"24849992-202f-4439-ac0d-241724235be4\") device mount path \"/mnt/openstack/pv10\"" pod="openstack/glance-default-internal-api-0" Nov 27 16:22:01 crc kubenswrapper[4707]: I1127 16:22:01.594738 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1b04d427-3430-4875-9ba8-859ea0b820ad-scripts\") pod \"ceilometer-0\" (UID: \"1b04d427-3430-4875-9ba8-859ea0b820ad\") " pod="openstack/ceilometer-0" Nov 27 16:22:01 crc kubenswrapper[4707]: I1127 16:22:01.595659 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/24849992-202f-4439-ac0d-241724235be4-config-data\") pod \"glance-default-internal-api-0\" (UID: \"24849992-202f-4439-ac0d-241724235be4\") " pod="openstack/glance-default-internal-api-0" Nov 27 16:22:01 crc kubenswrapper[4707]: I1127 16:22:01.596047 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/24849992-202f-4439-ac0d-241724235be4-logs\") pod \"glance-default-internal-api-0\" (UID: \"24849992-202f-4439-ac0d-241724235be4\") " pod="openstack/glance-default-internal-api-0" Nov 27 16:22:01 crc kubenswrapper[4707]: I1127 16:22:01.596262 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1b04d427-3430-4875-9ba8-859ea0b820ad-run-httpd\") pod \"ceilometer-0\" (UID: \"1b04d427-3430-4875-9ba8-859ea0b820ad\") " pod="openstack/ceilometer-0" Nov 27 16:22:01 crc kubenswrapper[4707]: I1127 16:22:01.597929 4707 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/24849992-202f-4439-ac0d-241724235be4-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"24849992-202f-4439-ac0d-241724235be4\") " pod="openstack/glance-default-internal-api-0" Nov 27 16:22:01 crc kubenswrapper[4707]: I1127 16:22:01.598045 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/24849992-202f-4439-ac0d-241724235be4-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"24849992-202f-4439-ac0d-241724235be4\") " pod="openstack/glance-default-internal-api-0" Nov 27 16:22:01 crc kubenswrapper[4707]: I1127 16:22:01.598284 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/24849992-202f-4439-ac0d-241724235be4-scripts\") pod \"glance-default-internal-api-0\" (UID: \"24849992-202f-4439-ac0d-241724235be4\") " pod="openstack/glance-default-internal-api-0" Nov 27 16:22:01 crc kubenswrapper[4707]: I1127 16:22:01.598321 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1b04d427-3430-4875-9ba8-859ea0b820ad-config-data\") pod \"ceilometer-0\" (UID: \"1b04d427-3430-4875-9ba8-859ea0b820ad\") " pod="openstack/ceilometer-0" Nov 27 16:22:01 crc kubenswrapper[4707]: I1127 16:22:01.601925 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1b04d427-3430-4875-9ba8-859ea0b820ad-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"1b04d427-3430-4875-9ba8-859ea0b820ad\") " pod="openstack/ceilometer-0" Nov 27 16:22:01 crc kubenswrapper[4707]: I1127 16:22:01.601948 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1b04d427-3430-4875-9ba8-859ea0b820ad-combined-ca-bundle\") pod \"ceilometer-0\" 
(UID: \"1b04d427-3430-4875-9ba8-859ea0b820ad\") " pod="openstack/ceilometer-0" Nov 27 16:22:01 crc kubenswrapper[4707]: I1127 16:22:01.602758 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/24849992-202f-4439-ac0d-241724235be4-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"24849992-202f-4439-ac0d-241724235be4\") " pod="openstack/glance-default-internal-api-0" Nov 27 16:22:01 crc kubenswrapper[4707]: I1127 16:22:01.605874 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jgd5n\" (UniqueName: \"kubernetes.io/projected/1b04d427-3430-4875-9ba8-859ea0b820ad-kube-api-access-jgd5n\") pod \"ceilometer-0\" (UID: \"1b04d427-3430-4875-9ba8-859ea0b820ad\") " pod="openstack/ceilometer-0" Nov 27 16:22:01 crc kubenswrapper[4707]: I1127 16:22:01.614138 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tm8jq\" (UniqueName: \"kubernetes.io/projected/24849992-202f-4439-ac0d-241724235be4-kube-api-access-tm8jq\") pod \"glance-default-internal-api-0\" (UID: \"24849992-202f-4439-ac0d-241724235be4\") " pod="openstack/glance-default-internal-api-0" Nov 27 16:22:01 crc kubenswrapper[4707]: I1127 16:22:01.622712 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-internal-api-0\" (UID: \"24849992-202f-4439-ac0d-241724235be4\") " pod="openstack/glance-default-internal-api-0" Nov 27 16:22:01 crc kubenswrapper[4707]: I1127 16:22:01.656438 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Nov 27 16:22:01 crc kubenswrapper[4707]: I1127 16:22:01.695051 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Nov 27 16:22:02 crc kubenswrapper[4707]: I1127 16:22:02.148662 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Nov 27 16:22:02 crc kubenswrapper[4707]: I1127 16:22:02.292656 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1b04d427-3430-4875-9ba8-859ea0b820ad","Type":"ContainerStarted","Data":"8c19a5cebf3dc38c44dbf6d00cc75a211c0690e90f68e45fefb60bc28bd04860"} Nov 27 16:22:02 crc kubenswrapper[4707]: I1127 16:22:02.294919 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"f4ece211-479a-4f06-bc88-b9e50c0671f4","Type":"ContainerStarted","Data":"091163ca0415bb393d4e6ac4ea9548b1c9640bff5574a390aed93d698f536164"} Nov 27 16:22:02 crc kubenswrapper[4707]: I1127 16:22:02.333173 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Nov 27 16:22:03 crc kubenswrapper[4707]: I1127 16:22:03.207658 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96ce5955-aed6-45e9-8a8a-4cb59d3a511d" path="/var/lib/kubelet/pods/96ce5955-aed6-45e9-8a8a-4cb59d3a511d/volumes" Nov 27 16:22:03 crc kubenswrapper[4707]: I1127 16:22:03.209021 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ca422061-61e0-4b99-91ed-adb502839a46" path="/var/lib/kubelet/pods/ca422061-61e0-4b99-91ed-adb502839a46/volumes" Nov 27 16:22:03 crc kubenswrapper[4707]: I1127 16:22:03.305449 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1b04d427-3430-4875-9ba8-859ea0b820ad","Type":"ContainerStarted","Data":"4ab47acaf3bf564824c619ceb7e5ef231b9cb38b5c5c0d098f26ed9e27ffacc3"} Nov 27 16:22:03 crc kubenswrapper[4707]: I1127 16:22:03.310170 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" 
event={"ID":"f4ece211-479a-4f06-bc88-b9e50c0671f4","Type":"ContainerStarted","Data":"849cb2165b0862987678b18a7c048adfc2938b00d6554767a2ab6c35495f4bed"} Nov 27 16:22:03 crc kubenswrapper[4707]: I1127 16:22:03.315958 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"24849992-202f-4439-ac0d-241724235be4","Type":"ContainerStarted","Data":"b43a41f3d540ee460f0aec7a34a10ab8cfeae22a234ef9b145793b6872eaae00"} Nov 27 16:22:03 crc kubenswrapper[4707]: I1127 16:22:03.315990 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"24849992-202f-4439-ac0d-241724235be4","Type":"ContainerStarted","Data":"d64fe87f4b14a7cfa6b3739a13ae072addb138f542044fd60db3db46baada3b3"} Nov 27 16:22:04 crc kubenswrapper[4707]: I1127 16:22:04.327513 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1b04d427-3430-4875-9ba8-859ea0b820ad","Type":"ContainerStarted","Data":"4ee81a19f1bfdf194cc1b3f636355dd46362f92fc02a5d961d4327980003b8ab"} Nov 27 16:22:04 crc kubenswrapper[4707]: I1127 16:22:04.330197 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"24849992-202f-4439-ac0d-241724235be4","Type":"ContainerStarted","Data":"bf380ffbcd29ed4a6c64483ca624f5bd2e666a373974ed27af678668c0f4d513"} Nov 27 16:22:04 crc kubenswrapper[4707]: I1127 16:22:04.349888 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=3.3498741340000002 podStartE2EDuration="3.349874134s" podCreationTimestamp="2025-11-27 16:22:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-27 16:22:04.346437981 +0000 UTC m=+1099.977886749" watchObservedRunningTime="2025-11-27 16:22:04.349874134 +0000 UTC m=+1099.981322902" Nov 27 16:22:04 crc 
kubenswrapper[4707]: I1127 16:22:04.358345 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=4.358333227 podStartE2EDuration="4.358333227s" podCreationTimestamp="2025-11-27 16:22:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-27 16:22:03.335159269 +0000 UTC m=+1098.966608057" watchObservedRunningTime="2025-11-27 16:22:04.358333227 +0000 UTC m=+1099.989781995" Nov 27 16:22:04 crc kubenswrapper[4707]: I1127 16:22:04.437438 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/heat-engine-7597fbc9fb-5l66n" Nov 27 16:22:04 crc kubenswrapper[4707]: I1127 16:22:04.544030 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-engine-7678b8b68b-mp4qf"] Nov 27 16:22:04 crc kubenswrapper[4707]: I1127 16:22:04.544234 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/heat-engine-7678b8b68b-mp4qf" podUID="5e02884e-9f0d-45a5-a916-aa4018402ee4" containerName="heat-engine" containerID="cri-o://f80dc97f5b8eb184d6aa30234230389910bc058931f380d766ce5f41dd831058" gracePeriod=60 Nov 27 16:22:05 crc kubenswrapper[4707]: I1127 16:22:05.342440 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1b04d427-3430-4875-9ba8-859ea0b820ad","Type":"ContainerStarted","Data":"21d8e8ec5b44e44701f27c4c99a16782f4f6ec9bf3b2151993581016bdaa83ba"} Nov 27 16:22:07 crc kubenswrapper[4707]: E1127 16:22:07.944255 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="f80dc97f5b8eb184d6aa30234230389910bc058931f380d766ce5f41dd831058" cmd=["/usr/bin/pgrep","-r","DRST","heat-engine"] Nov 27 16:22:07 crc kubenswrapper[4707]: E1127 
16:22:07.946174 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="f80dc97f5b8eb184d6aa30234230389910bc058931f380d766ce5f41dd831058" cmd=["/usr/bin/pgrep","-r","DRST","heat-engine"] Nov 27 16:22:07 crc kubenswrapper[4707]: E1127 16:22:07.948519 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="f80dc97f5b8eb184d6aa30234230389910bc058931f380d766ce5f41dd831058" cmd=["/usr/bin/pgrep","-r","DRST","heat-engine"] Nov 27 16:22:07 crc kubenswrapper[4707]: E1127 16:22:07.948582 4707 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/heat-engine-7678b8b68b-mp4qf" podUID="5e02884e-9f0d-45a5-a916-aa4018402ee4" containerName="heat-engine" Nov 27 16:22:10 crc kubenswrapper[4707]: I1127 16:22:10.390150 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-hq68q" event={"ID":"11d822ec-0bcb-4a76-a135-2a140d0618c8","Type":"ContainerStarted","Data":"7ee3574ba594912d92a16ed965dc1dc2ac99166f448892527b86ac7d840a5dab"} Nov 27 16:22:10 crc kubenswrapper[4707]: I1127 16:22:10.413985 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1b04d427-3430-4875-9ba8-859ea0b820ad","Type":"ContainerStarted","Data":"afea46a79746181e6cf61422509dd29eb29e49f31d8630ee5db82a5af9487ddd"} Nov 27 16:22:10 crc kubenswrapper[4707]: I1127 16:22:10.414446 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Nov 27 16:22:10 crc kubenswrapper[4707]: I1127 16:22:10.443420 4707 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.632534743 podStartE2EDuration="9.443347977s" podCreationTimestamp="2025-11-27 16:22:01 +0000 UTC" firstStartedPulling="2025-11-27 16:22:02.154384482 +0000 UTC m=+1097.785833250" lastFinishedPulling="2025-11-27 16:22:09.965197676 +0000 UTC m=+1105.596646484" observedRunningTime="2025-11-27 16:22:10.434648116 +0000 UTC m=+1106.066096884" watchObservedRunningTime="2025-11-27 16:22:10.443347977 +0000 UTC m=+1106.074796745" Nov 27 16:22:10 crc kubenswrapper[4707]: I1127 16:22:10.445145 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-db-sync-hq68q" podStartSLOduration=2.264258026 podStartE2EDuration="11.445136662s" podCreationTimestamp="2025-11-27 16:21:59 +0000 UTC" firstStartedPulling="2025-11-27 16:22:00.797271128 +0000 UTC m=+1096.428719896" lastFinishedPulling="2025-11-27 16:22:09.978149724 +0000 UTC m=+1105.609598532" observedRunningTime="2025-11-27 16:22:10.408702539 +0000 UTC m=+1106.040151307" watchObservedRunningTime="2025-11-27 16:22:10.445136662 +0000 UTC m=+1106.076585430" Nov 27 16:22:10 crc kubenswrapper[4707]: I1127 16:22:10.599958 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Nov 27 16:22:10 crc kubenswrapper[4707]: I1127 16:22:10.599998 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Nov 27 16:22:10 crc kubenswrapper[4707]: I1127 16:22:10.631066 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Nov 27 16:22:10 crc kubenswrapper[4707]: I1127 16:22:10.644915 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Nov 27 16:22:11 crc kubenswrapper[4707]: I1127 16:22:11.433078 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="" pod="openstack/glance-default-external-api-0" Nov 27 16:22:11 crc kubenswrapper[4707]: I1127 16:22:11.433457 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Nov 27 16:22:11 crc kubenswrapper[4707]: I1127 16:22:11.696330 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Nov 27 16:22:11 crc kubenswrapper[4707]: I1127 16:22:11.696385 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Nov 27 16:22:11 crc kubenswrapper[4707]: I1127 16:22:11.733184 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Nov 27 16:22:11 crc kubenswrapper[4707]: I1127 16:22:11.744792 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Nov 27 16:22:12 crc kubenswrapper[4707]: I1127 16:22:12.441158 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Nov 27 16:22:12 crc kubenswrapper[4707]: I1127 16:22:12.441188 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Nov 27 16:22:13 crc kubenswrapper[4707]: I1127 16:22:13.051302 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-engine-7678b8b68b-mp4qf" Nov 27 16:22:13 crc kubenswrapper[4707]: I1127 16:22:13.107141 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5e02884e-9f0d-45a5-a916-aa4018402ee4-config-data-custom\") pod \"5e02884e-9f0d-45a5-a916-aa4018402ee4\" (UID: \"5e02884e-9f0d-45a5-a916-aa4018402ee4\") " Nov 27 16:22:13 crc kubenswrapper[4707]: I1127 16:22:13.107191 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b4nkc\" (UniqueName: \"kubernetes.io/projected/5e02884e-9f0d-45a5-a916-aa4018402ee4-kube-api-access-b4nkc\") pod \"5e02884e-9f0d-45a5-a916-aa4018402ee4\" (UID: \"5e02884e-9f0d-45a5-a916-aa4018402ee4\") " Nov 27 16:22:13 crc kubenswrapper[4707]: I1127 16:22:13.107255 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5e02884e-9f0d-45a5-a916-aa4018402ee4-combined-ca-bundle\") pod \"5e02884e-9f0d-45a5-a916-aa4018402ee4\" (UID: \"5e02884e-9f0d-45a5-a916-aa4018402ee4\") " Nov 27 16:22:13 crc kubenswrapper[4707]: I1127 16:22:13.107379 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5e02884e-9f0d-45a5-a916-aa4018402ee4-config-data\") pod \"5e02884e-9f0d-45a5-a916-aa4018402ee4\" (UID: \"5e02884e-9f0d-45a5-a916-aa4018402ee4\") " Nov 27 16:22:13 crc kubenswrapper[4707]: I1127 16:22:13.114501 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5e02884e-9f0d-45a5-a916-aa4018402ee4-kube-api-access-b4nkc" (OuterVolumeSpecName: "kube-api-access-b4nkc") pod "5e02884e-9f0d-45a5-a916-aa4018402ee4" (UID: "5e02884e-9f0d-45a5-a916-aa4018402ee4"). InnerVolumeSpecName "kube-api-access-b4nkc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 16:22:13 crc kubenswrapper[4707]: I1127 16:22:13.119377 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5e02884e-9f0d-45a5-a916-aa4018402ee4-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "5e02884e-9f0d-45a5-a916-aa4018402ee4" (UID: "5e02884e-9f0d-45a5-a916-aa4018402ee4"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 16:22:13 crc kubenswrapper[4707]: I1127 16:22:13.153379 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5e02884e-9f0d-45a5-a916-aa4018402ee4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5e02884e-9f0d-45a5-a916-aa4018402ee4" (UID: "5e02884e-9f0d-45a5-a916-aa4018402ee4"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 16:22:13 crc kubenswrapper[4707]: I1127 16:22:13.192685 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5e02884e-9f0d-45a5-a916-aa4018402ee4-config-data" (OuterVolumeSpecName: "config-data") pod "5e02884e-9f0d-45a5-a916-aa4018402ee4" (UID: "5e02884e-9f0d-45a5-a916-aa4018402ee4"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 16:22:13 crc kubenswrapper[4707]: I1127 16:22:13.219452 4707 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5e02884e-9f0d-45a5-a916-aa4018402ee4-config-data-custom\") on node \"crc\" DevicePath \"\"" Nov 27 16:22:13 crc kubenswrapper[4707]: I1127 16:22:13.219498 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b4nkc\" (UniqueName: \"kubernetes.io/projected/5e02884e-9f0d-45a5-a916-aa4018402ee4-kube-api-access-b4nkc\") on node \"crc\" DevicePath \"\"" Nov 27 16:22:13 crc kubenswrapper[4707]: I1127 16:22:13.219509 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5e02884e-9f0d-45a5-a916-aa4018402ee4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 27 16:22:13 crc kubenswrapper[4707]: I1127 16:22:13.219519 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5e02884e-9f0d-45a5-a916-aa4018402ee4-config-data\") on node \"crc\" DevicePath \"\"" Nov 27 16:22:13 crc kubenswrapper[4707]: I1127 16:22:13.448734 4707 generic.go:334] "Generic (PLEG): container finished" podID="5e02884e-9f0d-45a5-a916-aa4018402ee4" containerID="f80dc97f5b8eb184d6aa30234230389910bc058931f380d766ce5f41dd831058" exitCode=0 Nov 27 16:22:13 crc kubenswrapper[4707]: I1127 16:22:13.448777 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-7678b8b68b-mp4qf" event={"ID":"5e02884e-9f0d-45a5-a916-aa4018402ee4","Type":"ContainerDied","Data":"f80dc97f5b8eb184d6aa30234230389910bc058931f380d766ce5f41dd831058"} Nov 27 16:22:13 crc kubenswrapper[4707]: I1127 16:22:13.448819 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-7678b8b68b-mp4qf" 
event={"ID":"5e02884e-9f0d-45a5-a916-aa4018402ee4","Type":"ContainerDied","Data":"a1993661c976ee955d8bf66b2db44fa30e6ae26d13ba828b35f644849cea1f58"} Nov 27 16:22:13 crc kubenswrapper[4707]: I1127 16:22:13.448853 4707 scope.go:117] "RemoveContainer" containerID="f80dc97f5b8eb184d6aa30234230389910bc058931f380d766ce5f41dd831058" Nov 27 16:22:13 crc kubenswrapper[4707]: I1127 16:22:13.448873 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-engine-7678b8b68b-mp4qf" Nov 27 16:22:13 crc kubenswrapper[4707]: I1127 16:22:13.476073 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-engine-7678b8b68b-mp4qf"] Nov 27 16:22:13 crc kubenswrapper[4707]: I1127 16:22:13.485038 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-engine-7678b8b68b-mp4qf"] Nov 27 16:22:13 crc kubenswrapper[4707]: I1127 16:22:13.485491 4707 scope.go:117] "RemoveContainer" containerID="f80dc97f5b8eb184d6aa30234230389910bc058931f380d766ce5f41dd831058" Nov 27 16:22:13 crc kubenswrapper[4707]: E1127 16:22:13.485882 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f80dc97f5b8eb184d6aa30234230389910bc058931f380d766ce5f41dd831058\": container with ID starting with f80dc97f5b8eb184d6aa30234230389910bc058931f380d766ce5f41dd831058 not found: ID does not exist" containerID="f80dc97f5b8eb184d6aa30234230389910bc058931f380d766ce5f41dd831058" Nov 27 16:22:13 crc kubenswrapper[4707]: I1127 16:22:13.485912 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f80dc97f5b8eb184d6aa30234230389910bc058931f380d766ce5f41dd831058"} err="failed to get container status \"f80dc97f5b8eb184d6aa30234230389910bc058931f380d766ce5f41dd831058\": rpc error: code = NotFound desc = could not find container \"f80dc97f5b8eb184d6aa30234230389910bc058931f380d766ce5f41dd831058\": container with ID starting with 
f80dc97f5b8eb184d6aa30234230389910bc058931f380d766ce5f41dd831058 not found: ID does not exist" Nov 27 16:22:13 crc kubenswrapper[4707]: I1127 16:22:13.622456 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Nov 27 16:22:13 crc kubenswrapper[4707]: I1127 16:22:13.622565 4707 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Nov 27 16:22:13 crc kubenswrapper[4707]: I1127 16:22:13.651886 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Nov 27 16:22:14 crc kubenswrapper[4707]: I1127 16:22:14.394801 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Nov 27 16:22:14 crc kubenswrapper[4707]: I1127 16:22:14.460799 4707 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Nov 27 16:22:14 crc kubenswrapper[4707]: I1127 16:22:14.476333 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Nov 27 16:22:15 crc kubenswrapper[4707]: I1127 16:22:15.209241 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5e02884e-9f0d-45a5-a916-aa4018402ee4" path="/var/lib/kubelet/pods/5e02884e-9f0d-45a5-a916-aa4018402ee4/volumes" Nov 27 16:22:16 crc kubenswrapper[4707]: I1127 16:22:16.500288 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Nov 27 16:22:16 crc kubenswrapper[4707]: I1127 16:22:16.500860 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="1b04d427-3430-4875-9ba8-859ea0b820ad" containerName="ceilometer-central-agent" containerID="cri-o://4ab47acaf3bf564824c619ceb7e5ef231b9cb38b5c5c0d098f26ed9e27ffacc3" gracePeriod=30 Nov 27 16:22:16 crc kubenswrapper[4707]: I1127 16:22:16.500927 4707 kuberuntime_container.go:808] "Killing container with a grace 
period" pod="openstack/ceilometer-0" podUID="1b04d427-3430-4875-9ba8-859ea0b820ad" containerName="sg-core" containerID="cri-o://21d8e8ec5b44e44701f27c4c99a16782f4f6ec9bf3b2151993581016bdaa83ba" gracePeriod=30 Nov 27 16:22:16 crc kubenswrapper[4707]: I1127 16:22:16.500958 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="1b04d427-3430-4875-9ba8-859ea0b820ad" containerName="ceilometer-notification-agent" containerID="cri-o://4ee81a19f1bfdf194cc1b3f636355dd46362f92fc02a5d961d4327980003b8ab" gracePeriod=30 Nov 27 16:22:16 crc kubenswrapper[4707]: I1127 16:22:16.500948 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="1b04d427-3430-4875-9ba8-859ea0b820ad" containerName="proxy-httpd" containerID="cri-o://afea46a79746181e6cf61422509dd29eb29e49f31d8630ee5db82a5af9487ddd" gracePeriod=30 Nov 27 16:22:17 crc kubenswrapper[4707]: I1127 16:22:17.506297 4707 generic.go:334] "Generic (PLEG): container finished" podID="1b04d427-3430-4875-9ba8-859ea0b820ad" containerID="afea46a79746181e6cf61422509dd29eb29e49f31d8630ee5db82a5af9487ddd" exitCode=0 Nov 27 16:22:17 crc kubenswrapper[4707]: I1127 16:22:17.506637 4707 generic.go:334] "Generic (PLEG): container finished" podID="1b04d427-3430-4875-9ba8-859ea0b820ad" containerID="21d8e8ec5b44e44701f27c4c99a16782f4f6ec9bf3b2151993581016bdaa83ba" exitCode=2 Nov 27 16:22:17 crc kubenswrapper[4707]: I1127 16:22:17.506650 4707 generic.go:334] "Generic (PLEG): container finished" podID="1b04d427-3430-4875-9ba8-859ea0b820ad" containerID="4ab47acaf3bf564824c619ceb7e5ef231b9cb38b5c5c0d098f26ed9e27ffacc3" exitCode=0 Nov 27 16:22:17 crc kubenswrapper[4707]: I1127 16:22:17.506472 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1b04d427-3430-4875-9ba8-859ea0b820ad","Type":"ContainerDied","Data":"afea46a79746181e6cf61422509dd29eb29e49f31d8630ee5db82a5af9487ddd"} Nov 27 16:22:17 crc 
kubenswrapper[4707]: I1127 16:22:17.506689 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1b04d427-3430-4875-9ba8-859ea0b820ad","Type":"ContainerDied","Data":"21d8e8ec5b44e44701f27c4c99a16782f4f6ec9bf3b2151993581016bdaa83ba"} Nov 27 16:22:17 crc kubenswrapper[4707]: I1127 16:22:17.506708 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1b04d427-3430-4875-9ba8-859ea0b820ad","Type":"ContainerDied","Data":"4ab47acaf3bf564824c619ceb7e5ef231b9cb38b5c5c0d098f26ed9e27ffacc3"} Nov 27 16:22:19 crc kubenswrapper[4707]: I1127 16:22:19.528894 4707 generic.go:334] "Generic (PLEG): container finished" podID="1b04d427-3430-4875-9ba8-859ea0b820ad" containerID="4ee81a19f1bfdf194cc1b3f636355dd46362f92fc02a5d961d4327980003b8ab" exitCode=0 Nov 27 16:22:19 crc kubenswrapper[4707]: I1127 16:22:19.528998 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1b04d427-3430-4875-9ba8-859ea0b820ad","Type":"ContainerDied","Data":"4ee81a19f1bfdf194cc1b3f636355dd46362f92fc02a5d961d4327980003b8ab"} Nov 27 16:22:19 crc kubenswrapper[4707]: I1127 16:22:19.529307 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1b04d427-3430-4875-9ba8-859ea0b820ad","Type":"ContainerDied","Data":"8c19a5cebf3dc38c44dbf6d00cc75a211c0690e90f68e45fefb60bc28bd04860"} Nov 27 16:22:19 crc kubenswrapper[4707]: I1127 16:22:19.529329 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8c19a5cebf3dc38c44dbf6d00cc75a211c0690e90f68e45fefb60bc28bd04860" Nov 27 16:22:19 crc kubenswrapper[4707]: I1127 16:22:19.531559 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Nov 27 16:22:19 crc kubenswrapper[4707]: I1127 16:22:19.632864 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1b04d427-3430-4875-9ba8-859ea0b820ad-run-httpd\") pod \"1b04d427-3430-4875-9ba8-859ea0b820ad\" (UID: \"1b04d427-3430-4875-9ba8-859ea0b820ad\") " Nov 27 16:22:19 crc kubenswrapper[4707]: I1127 16:22:19.632934 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1b04d427-3430-4875-9ba8-859ea0b820ad-scripts\") pod \"1b04d427-3430-4875-9ba8-859ea0b820ad\" (UID: \"1b04d427-3430-4875-9ba8-859ea0b820ad\") " Nov 27 16:22:19 crc kubenswrapper[4707]: I1127 16:22:19.632963 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jgd5n\" (UniqueName: \"kubernetes.io/projected/1b04d427-3430-4875-9ba8-859ea0b820ad-kube-api-access-jgd5n\") pod \"1b04d427-3430-4875-9ba8-859ea0b820ad\" (UID: \"1b04d427-3430-4875-9ba8-859ea0b820ad\") " Nov 27 16:22:19 crc kubenswrapper[4707]: I1127 16:22:19.633001 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1b04d427-3430-4875-9ba8-859ea0b820ad-config-data\") pod \"1b04d427-3430-4875-9ba8-859ea0b820ad\" (UID: \"1b04d427-3430-4875-9ba8-859ea0b820ad\") " Nov 27 16:22:19 crc kubenswrapper[4707]: I1127 16:22:19.633048 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1b04d427-3430-4875-9ba8-859ea0b820ad-combined-ca-bundle\") pod \"1b04d427-3430-4875-9ba8-859ea0b820ad\" (UID: \"1b04d427-3430-4875-9ba8-859ea0b820ad\") " Nov 27 16:22:19 crc kubenswrapper[4707]: I1127 16:22:19.633106 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/1b04d427-3430-4875-9ba8-859ea0b820ad-log-httpd\") pod \"1b04d427-3430-4875-9ba8-859ea0b820ad\" (UID: \"1b04d427-3430-4875-9ba8-859ea0b820ad\") " Nov 27 16:22:19 crc kubenswrapper[4707]: I1127 16:22:19.633172 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1b04d427-3430-4875-9ba8-859ea0b820ad-sg-core-conf-yaml\") pod \"1b04d427-3430-4875-9ba8-859ea0b820ad\" (UID: \"1b04d427-3430-4875-9ba8-859ea0b820ad\") " Nov 27 16:22:19 crc kubenswrapper[4707]: I1127 16:22:19.634600 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1b04d427-3430-4875-9ba8-859ea0b820ad-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "1b04d427-3430-4875-9ba8-859ea0b820ad" (UID: "1b04d427-3430-4875-9ba8-859ea0b820ad"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 27 16:22:19 crc kubenswrapper[4707]: I1127 16:22:19.637789 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1b04d427-3430-4875-9ba8-859ea0b820ad-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "1b04d427-3430-4875-9ba8-859ea0b820ad" (UID: "1b04d427-3430-4875-9ba8-859ea0b820ad"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 27 16:22:19 crc kubenswrapper[4707]: I1127 16:22:19.664964 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1b04d427-3430-4875-9ba8-859ea0b820ad-scripts" (OuterVolumeSpecName: "scripts") pod "1b04d427-3430-4875-9ba8-859ea0b820ad" (UID: "1b04d427-3430-4875-9ba8-859ea0b820ad"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 16:22:19 crc kubenswrapper[4707]: I1127 16:22:19.672531 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1b04d427-3430-4875-9ba8-859ea0b820ad-kube-api-access-jgd5n" (OuterVolumeSpecName: "kube-api-access-jgd5n") pod "1b04d427-3430-4875-9ba8-859ea0b820ad" (UID: "1b04d427-3430-4875-9ba8-859ea0b820ad"). InnerVolumeSpecName "kube-api-access-jgd5n". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 16:22:19 crc kubenswrapper[4707]: I1127 16:22:19.681740 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1b04d427-3430-4875-9ba8-859ea0b820ad-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "1b04d427-3430-4875-9ba8-859ea0b820ad" (UID: "1b04d427-3430-4875-9ba8-859ea0b820ad"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 16:22:19 crc kubenswrapper[4707]: I1127 16:22:19.728509 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1b04d427-3430-4875-9ba8-859ea0b820ad-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1b04d427-3430-4875-9ba8-859ea0b820ad" (UID: "1b04d427-3430-4875-9ba8-859ea0b820ad"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 16:22:19 crc kubenswrapper[4707]: I1127 16:22:19.742568 4707 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1b04d427-3430-4875-9ba8-859ea0b820ad-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Nov 27 16:22:19 crc kubenswrapper[4707]: I1127 16:22:19.742600 4707 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1b04d427-3430-4875-9ba8-859ea0b820ad-run-httpd\") on node \"crc\" DevicePath \"\"" Nov 27 16:22:19 crc kubenswrapper[4707]: I1127 16:22:19.742609 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1b04d427-3430-4875-9ba8-859ea0b820ad-scripts\") on node \"crc\" DevicePath \"\"" Nov 27 16:22:19 crc kubenswrapper[4707]: I1127 16:22:19.742618 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jgd5n\" (UniqueName: \"kubernetes.io/projected/1b04d427-3430-4875-9ba8-859ea0b820ad-kube-api-access-jgd5n\") on node \"crc\" DevicePath \"\"" Nov 27 16:22:19 crc kubenswrapper[4707]: I1127 16:22:19.742629 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1b04d427-3430-4875-9ba8-859ea0b820ad-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 27 16:22:19 crc kubenswrapper[4707]: I1127 16:22:19.742637 4707 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1b04d427-3430-4875-9ba8-859ea0b820ad-log-httpd\") on node \"crc\" DevicePath \"\"" Nov 27 16:22:19 crc kubenswrapper[4707]: I1127 16:22:19.774567 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1b04d427-3430-4875-9ba8-859ea0b820ad-config-data" (OuterVolumeSpecName: "config-data") pod "1b04d427-3430-4875-9ba8-859ea0b820ad" (UID: "1b04d427-3430-4875-9ba8-859ea0b820ad"). 
InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 16:22:19 crc kubenswrapper[4707]: I1127 16:22:19.843659 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1b04d427-3430-4875-9ba8-859ea0b820ad-config-data\") on node \"crc\" DevicePath \"\"" Nov 27 16:22:20 crc kubenswrapper[4707]: I1127 16:22:20.542357 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Nov 27 16:22:20 crc kubenswrapper[4707]: I1127 16:22:20.594589 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Nov 27 16:22:20 crc kubenswrapper[4707]: I1127 16:22:20.605807 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Nov 27 16:22:20 crc kubenswrapper[4707]: I1127 16:22:20.639524 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Nov 27 16:22:20 crc kubenswrapper[4707]: E1127 16:22:20.639994 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1b04d427-3430-4875-9ba8-859ea0b820ad" containerName="ceilometer-notification-agent" Nov 27 16:22:20 crc kubenswrapper[4707]: I1127 16:22:20.640016 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="1b04d427-3430-4875-9ba8-859ea0b820ad" containerName="ceilometer-notification-agent" Nov 27 16:22:20 crc kubenswrapper[4707]: E1127 16:22:20.640027 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1b04d427-3430-4875-9ba8-859ea0b820ad" containerName="proxy-httpd" Nov 27 16:22:20 crc kubenswrapper[4707]: I1127 16:22:20.640037 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="1b04d427-3430-4875-9ba8-859ea0b820ad" containerName="proxy-httpd" Nov 27 16:22:20 crc kubenswrapper[4707]: E1127 16:22:20.640059 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5e02884e-9f0d-45a5-a916-aa4018402ee4" containerName="heat-engine" Nov 27 16:22:20 crc 
kubenswrapper[4707]: I1127 16:22:20.640068 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="5e02884e-9f0d-45a5-a916-aa4018402ee4" containerName="heat-engine" Nov 27 16:22:20 crc kubenswrapper[4707]: E1127 16:22:20.640139 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1b04d427-3430-4875-9ba8-859ea0b820ad" containerName="ceilometer-central-agent" Nov 27 16:22:20 crc kubenswrapper[4707]: I1127 16:22:20.640148 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="1b04d427-3430-4875-9ba8-859ea0b820ad" containerName="ceilometer-central-agent" Nov 27 16:22:20 crc kubenswrapper[4707]: E1127 16:22:20.640170 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1b04d427-3430-4875-9ba8-859ea0b820ad" containerName="sg-core" Nov 27 16:22:20 crc kubenswrapper[4707]: I1127 16:22:20.640178 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="1b04d427-3430-4875-9ba8-859ea0b820ad" containerName="sg-core" Nov 27 16:22:20 crc kubenswrapper[4707]: I1127 16:22:20.640415 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="5e02884e-9f0d-45a5-a916-aa4018402ee4" containerName="heat-engine" Nov 27 16:22:20 crc kubenswrapper[4707]: I1127 16:22:20.640445 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="1b04d427-3430-4875-9ba8-859ea0b820ad" containerName="ceilometer-central-agent" Nov 27 16:22:20 crc kubenswrapper[4707]: I1127 16:22:20.640464 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="1b04d427-3430-4875-9ba8-859ea0b820ad" containerName="sg-core" Nov 27 16:22:20 crc kubenswrapper[4707]: I1127 16:22:20.640480 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="1b04d427-3430-4875-9ba8-859ea0b820ad" containerName="proxy-httpd" Nov 27 16:22:20 crc kubenswrapper[4707]: I1127 16:22:20.640499 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="1b04d427-3430-4875-9ba8-859ea0b820ad" containerName="ceilometer-notification-agent" Nov 27 
16:22:20 crc kubenswrapper[4707]: I1127 16:22:20.643199 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Nov 27 16:22:20 crc kubenswrapper[4707]: I1127 16:22:20.645312 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Nov 27 16:22:20 crc kubenswrapper[4707]: I1127 16:22:20.645747 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Nov 27 16:22:20 crc kubenswrapper[4707]: I1127 16:22:20.661230 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Nov 27 16:22:20 crc kubenswrapper[4707]: I1127 16:22:20.761619 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f3677d27-3ac0-4b22-a3c0-ffe82b37b437-scripts\") pod \"ceilometer-0\" (UID: \"f3677d27-3ac0-4b22-a3c0-ffe82b37b437\") " pod="openstack/ceilometer-0" Nov 27 16:22:20 crc kubenswrapper[4707]: I1127 16:22:20.761700 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f3677d27-3ac0-4b22-a3c0-ffe82b37b437-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"f3677d27-3ac0-4b22-a3c0-ffe82b37b437\") " pod="openstack/ceilometer-0" Nov 27 16:22:20 crc kubenswrapper[4707]: I1127 16:22:20.761756 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qsfcn\" (UniqueName: \"kubernetes.io/projected/f3677d27-3ac0-4b22-a3c0-ffe82b37b437-kube-api-access-qsfcn\") pod \"ceilometer-0\" (UID: \"f3677d27-3ac0-4b22-a3c0-ffe82b37b437\") " pod="openstack/ceilometer-0" Nov 27 16:22:20 crc kubenswrapper[4707]: I1127 16:22:20.761819 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/f3677d27-3ac0-4b22-a3c0-ffe82b37b437-run-httpd\") pod \"ceilometer-0\" (UID: \"f3677d27-3ac0-4b22-a3c0-ffe82b37b437\") " pod="openstack/ceilometer-0" Nov 27 16:22:20 crc kubenswrapper[4707]: I1127 16:22:20.761860 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f3677d27-3ac0-4b22-a3c0-ffe82b37b437-log-httpd\") pod \"ceilometer-0\" (UID: \"f3677d27-3ac0-4b22-a3c0-ffe82b37b437\") " pod="openstack/ceilometer-0" Nov 27 16:22:20 crc kubenswrapper[4707]: I1127 16:22:20.761897 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f3677d27-3ac0-4b22-a3c0-ffe82b37b437-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"f3677d27-3ac0-4b22-a3c0-ffe82b37b437\") " pod="openstack/ceilometer-0" Nov 27 16:22:20 crc kubenswrapper[4707]: I1127 16:22:20.762181 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f3677d27-3ac0-4b22-a3c0-ffe82b37b437-config-data\") pod \"ceilometer-0\" (UID: \"f3677d27-3ac0-4b22-a3c0-ffe82b37b437\") " pod="openstack/ceilometer-0" Nov 27 16:22:20 crc kubenswrapper[4707]: I1127 16:22:20.864750 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f3677d27-3ac0-4b22-a3c0-ffe82b37b437-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"f3677d27-3ac0-4b22-a3c0-ffe82b37b437\") " pod="openstack/ceilometer-0" Nov 27 16:22:20 crc kubenswrapper[4707]: I1127 16:22:20.864956 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qsfcn\" (UniqueName: \"kubernetes.io/projected/f3677d27-3ac0-4b22-a3c0-ffe82b37b437-kube-api-access-qsfcn\") pod \"ceilometer-0\" (UID: \"f3677d27-3ac0-4b22-a3c0-ffe82b37b437\") " 
pod="openstack/ceilometer-0" Nov 27 16:22:20 crc kubenswrapper[4707]: I1127 16:22:20.865030 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f3677d27-3ac0-4b22-a3c0-ffe82b37b437-run-httpd\") pod \"ceilometer-0\" (UID: \"f3677d27-3ac0-4b22-a3c0-ffe82b37b437\") " pod="openstack/ceilometer-0" Nov 27 16:22:20 crc kubenswrapper[4707]: I1127 16:22:20.865080 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f3677d27-3ac0-4b22-a3c0-ffe82b37b437-log-httpd\") pod \"ceilometer-0\" (UID: \"f3677d27-3ac0-4b22-a3c0-ffe82b37b437\") " pod="openstack/ceilometer-0" Nov 27 16:22:20 crc kubenswrapper[4707]: I1127 16:22:20.865131 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f3677d27-3ac0-4b22-a3c0-ffe82b37b437-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"f3677d27-3ac0-4b22-a3c0-ffe82b37b437\") " pod="openstack/ceilometer-0" Nov 27 16:22:20 crc kubenswrapper[4707]: I1127 16:22:20.865238 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f3677d27-3ac0-4b22-a3c0-ffe82b37b437-config-data\") pod \"ceilometer-0\" (UID: \"f3677d27-3ac0-4b22-a3c0-ffe82b37b437\") " pod="openstack/ceilometer-0" Nov 27 16:22:20 crc kubenswrapper[4707]: I1127 16:22:20.865305 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f3677d27-3ac0-4b22-a3c0-ffe82b37b437-scripts\") pod \"ceilometer-0\" (UID: \"f3677d27-3ac0-4b22-a3c0-ffe82b37b437\") " pod="openstack/ceilometer-0" Nov 27 16:22:20 crc kubenswrapper[4707]: I1127 16:22:20.866193 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/f3677d27-3ac0-4b22-a3c0-ffe82b37b437-log-httpd\") pod \"ceilometer-0\" (UID: \"f3677d27-3ac0-4b22-a3c0-ffe82b37b437\") " pod="openstack/ceilometer-0" Nov 27 16:22:20 crc kubenswrapper[4707]: I1127 16:22:20.866282 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f3677d27-3ac0-4b22-a3c0-ffe82b37b437-run-httpd\") pod \"ceilometer-0\" (UID: \"f3677d27-3ac0-4b22-a3c0-ffe82b37b437\") " pod="openstack/ceilometer-0" Nov 27 16:22:20 crc kubenswrapper[4707]: I1127 16:22:20.870128 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f3677d27-3ac0-4b22-a3c0-ffe82b37b437-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"f3677d27-3ac0-4b22-a3c0-ffe82b37b437\") " pod="openstack/ceilometer-0" Nov 27 16:22:20 crc kubenswrapper[4707]: I1127 16:22:20.870771 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f3677d27-3ac0-4b22-a3c0-ffe82b37b437-config-data\") pod \"ceilometer-0\" (UID: \"f3677d27-3ac0-4b22-a3c0-ffe82b37b437\") " pod="openstack/ceilometer-0" Nov 27 16:22:20 crc kubenswrapper[4707]: I1127 16:22:20.871614 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f3677d27-3ac0-4b22-a3c0-ffe82b37b437-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"f3677d27-3ac0-4b22-a3c0-ffe82b37b437\") " pod="openstack/ceilometer-0" Nov 27 16:22:20 crc kubenswrapper[4707]: I1127 16:22:20.872001 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f3677d27-3ac0-4b22-a3c0-ffe82b37b437-scripts\") pod \"ceilometer-0\" (UID: \"f3677d27-3ac0-4b22-a3c0-ffe82b37b437\") " pod="openstack/ceilometer-0" Nov 27 16:22:20 crc kubenswrapper[4707]: I1127 16:22:20.896453 4707 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-qsfcn\" (UniqueName: \"kubernetes.io/projected/f3677d27-3ac0-4b22-a3c0-ffe82b37b437-kube-api-access-qsfcn\") pod \"ceilometer-0\" (UID: \"f3677d27-3ac0-4b22-a3c0-ffe82b37b437\") " pod="openstack/ceilometer-0" Nov 27 16:22:20 crc kubenswrapper[4707]: I1127 16:22:20.978313 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Nov 27 16:22:21 crc kubenswrapper[4707]: I1127 16:22:21.231589 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1b04d427-3430-4875-9ba8-859ea0b820ad" path="/var/lib/kubelet/pods/1b04d427-3430-4875-9ba8-859ea0b820ad/volumes" Nov 27 16:22:21 crc kubenswrapper[4707]: I1127 16:22:21.606694 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Nov 27 16:22:22 crc kubenswrapper[4707]: I1127 16:22:22.560219 4707 generic.go:334] "Generic (PLEG): container finished" podID="11d822ec-0bcb-4a76-a135-2a140d0618c8" containerID="7ee3574ba594912d92a16ed965dc1dc2ac99166f448892527b86ac7d840a5dab" exitCode=0 Nov 27 16:22:22 crc kubenswrapper[4707]: I1127 16:22:22.560338 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-hq68q" event={"ID":"11d822ec-0bcb-4a76-a135-2a140d0618c8","Type":"ContainerDied","Data":"7ee3574ba594912d92a16ed965dc1dc2ac99166f448892527b86ac7d840a5dab"} Nov 27 16:22:22 crc kubenswrapper[4707]: I1127 16:22:22.562764 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f3677d27-3ac0-4b22-a3c0-ffe82b37b437","Type":"ContainerStarted","Data":"b5aa0adfaeb0ef8de3a51e09e850ef3abd88bbc443f4ec114f5250886a209ef6"} Nov 27 16:22:23 crc kubenswrapper[4707]: I1127 16:22:23.587102 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"f3677d27-3ac0-4b22-a3c0-ffe82b37b437","Type":"ContainerStarted","Data":"084cc809119de06cc4756181a129e012ab9a699af74b7760e9de52913ee9d79a"} Nov 27 16:22:23 crc kubenswrapper[4707]: I1127 16:22:23.587514 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f3677d27-3ac0-4b22-a3c0-ffe82b37b437","Type":"ContainerStarted","Data":"5fc9393f0398128968478a010f734a44a7d80a0a41530387e666885746655785"} Nov 27 16:22:23 crc kubenswrapper[4707]: I1127 16:22:23.950238 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-hq68q" Nov 27 16:22:24 crc kubenswrapper[4707]: I1127 16:22:24.029443 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kkq4h\" (UniqueName: \"kubernetes.io/projected/11d822ec-0bcb-4a76-a135-2a140d0618c8-kube-api-access-kkq4h\") pod \"11d822ec-0bcb-4a76-a135-2a140d0618c8\" (UID: \"11d822ec-0bcb-4a76-a135-2a140d0618c8\") " Nov 27 16:22:24 crc kubenswrapper[4707]: I1127 16:22:24.029704 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/11d822ec-0bcb-4a76-a135-2a140d0618c8-scripts\") pod \"11d822ec-0bcb-4a76-a135-2a140d0618c8\" (UID: \"11d822ec-0bcb-4a76-a135-2a140d0618c8\") " Nov 27 16:22:24 crc kubenswrapper[4707]: I1127 16:22:24.029724 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/11d822ec-0bcb-4a76-a135-2a140d0618c8-combined-ca-bundle\") pod \"11d822ec-0bcb-4a76-a135-2a140d0618c8\" (UID: \"11d822ec-0bcb-4a76-a135-2a140d0618c8\") " Nov 27 16:22:24 crc kubenswrapper[4707]: I1127 16:22:24.029854 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/11d822ec-0bcb-4a76-a135-2a140d0618c8-config-data\") pod 
\"11d822ec-0bcb-4a76-a135-2a140d0618c8\" (UID: \"11d822ec-0bcb-4a76-a135-2a140d0618c8\") " Nov 27 16:22:24 crc kubenswrapper[4707]: I1127 16:22:24.037680 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/11d822ec-0bcb-4a76-a135-2a140d0618c8-kube-api-access-kkq4h" (OuterVolumeSpecName: "kube-api-access-kkq4h") pod "11d822ec-0bcb-4a76-a135-2a140d0618c8" (UID: "11d822ec-0bcb-4a76-a135-2a140d0618c8"). InnerVolumeSpecName "kube-api-access-kkq4h". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 16:22:24 crc kubenswrapper[4707]: I1127 16:22:24.039200 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/11d822ec-0bcb-4a76-a135-2a140d0618c8-scripts" (OuterVolumeSpecName: "scripts") pod "11d822ec-0bcb-4a76-a135-2a140d0618c8" (UID: "11d822ec-0bcb-4a76-a135-2a140d0618c8"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 16:22:24 crc kubenswrapper[4707]: I1127 16:22:24.056536 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/11d822ec-0bcb-4a76-a135-2a140d0618c8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "11d822ec-0bcb-4a76-a135-2a140d0618c8" (UID: "11d822ec-0bcb-4a76-a135-2a140d0618c8"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 16:22:24 crc kubenswrapper[4707]: I1127 16:22:24.066796 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/11d822ec-0bcb-4a76-a135-2a140d0618c8-config-data" (OuterVolumeSpecName: "config-data") pod "11d822ec-0bcb-4a76-a135-2a140d0618c8" (UID: "11d822ec-0bcb-4a76-a135-2a140d0618c8"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 16:22:24 crc kubenswrapper[4707]: I1127 16:22:24.131473 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/11d822ec-0bcb-4a76-a135-2a140d0618c8-config-data\") on node \"crc\" DevicePath \"\"" Nov 27 16:22:24 crc kubenswrapper[4707]: I1127 16:22:24.131510 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kkq4h\" (UniqueName: \"kubernetes.io/projected/11d822ec-0bcb-4a76-a135-2a140d0618c8-kube-api-access-kkq4h\") on node \"crc\" DevicePath \"\"" Nov 27 16:22:24 crc kubenswrapper[4707]: I1127 16:22:24.131525 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/11d822ec-0bcb-4a76-a135-2a140d0618c8-scripts\") on node \"crc\" DevicePath \"\"" Nov 27 16:22:24 crc kubenswrapper[4707]: I1127 16:22:24.131538 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/11d822ec-0bcb-4a76-a135-2a140d0618c8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 27 16:22:24 crc kubenswrapper[4707]: I1127 16:22:24.608998 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f3677d27-3ac0-4b22-a3c0-ffe82b37b437","Type":"ContainerStarted","Data":"c5a304ab8003e81706e7b2bc635b4f2adaa9b26594b89f6545d1197761735b7a"} Nov 27 16:22:24 crc kubenswrapper[4707]: I1127 16:22:24.613861 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-hq68q" event={"ID":"11d822ec-0bcb-4a76-a135-2a140d0618c8","Type":"ContainerDied","Data":"9460bcf743221eb0e283e9efee047d6996dfab14e0899de5e4467cb2f7642f57"} Nov 27 16:22:24 crc kubenswrapper[4707]: I1127 16:22:24.613909 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9460bcf743221eb0e283e9efee047d6996dfab14e0899de5e4467cb2f7642f57" Nov 27 16:22:24 crc kubenswrapper[4707]: I1127 
16:22:24.613985 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-hq68q" Nov 27 16:22:24 crc kubenswrapper[4707]: I1127 16:22:24.690343 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"] Nov 27 16:22:24 crc kubenswrapper[4707]: E1127 16:22:24.690811 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="11d822ec-0bcb-4a76-a135-2a140d0618c8" containerName="nova-cell0-conductor-db-sync" Nov 27 16:22:24 crc kubenswrapper[4707]: I1127 16:22:24.690831 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="11d822ec-0bcb-4a76-a135-2a140d0618c8" containerName="nova-cell0-conductor-db-sync" Nov 27 16:22:24 crc kubenswrapper[4707]: I1127 16:22:24.691117 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="11d822ec-0bcb-4a76-a135-2a140d0618c8" containerName="nova-cell0-conductor-db-sync" Nov 27 16:22:24 crc kubenswrapper[4707]: I1127 16:22:24.691902 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Nov 27 16:22:24 crc kubenswrapper[4707]: I1127 16:22:24.694691 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Nov 27 16:22:24 crc kubenswrapper[4707]: I1127 16:22:24.695057 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-jx489" Nov 27 16:22:24 crc kubenswrapper[4707]: I1127 16:22:24.709641 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Nov 27 16:22:24 crc kubenswrapper[4707]: I1127 16:22:24.739942 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2d07ee1e-4a93-4fe6-909d-2cc2a11993c7-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"2d07ee1e-4a93-4fe6-909d-2cc2a11993c7\") " pod="openstack/nova-cell0-conductor-0" Nov 27 16:22:24 crc kubenswrapper[4707]: I1127 16:22:24.740031 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2d07ee1e-4a93-4fe6-909d-2cc2a11993c7-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"2d07ee1e-4a93-4fe6-909d-2cc2a11993c7\") " pod="openstack/nova-cell0-conductor-0" Nov 27 16:22:24 crc kubenswrapper[4707]: I1127 16:22:24.740065 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z7jpb\" (UniqueName: \"kubernetes.io/projected/2d07ee1e-4a93-4fe6-909d-2cc2a11993c7-kube-api-access-z7jpb\") pod \"nova-cell0-conductor-0\" (UID: \"2d07ee1e-4a93-4fe6-909d-2cc2a11993c7\") " pod="openstack/nova-cell0-conductor-0" Nov 27 16:22:24 crc kubenswrapper[4707]: I1127 16:22:24.842696 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/2d07ee1e-4a93-4fe6-909d-2cc2a11993c7-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"2d07ee1e-4a93-4fe6-909d-2cc2a11993c7\") " pod="openstack/nova-cell0-conductor-0" Nov 27 16:22:24 crc kubenswrapper[4707]: I1127 16:22:24.842890 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2d07ee1e-4a93-4fe6-909d-2cc2a11993c7-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"2d07ee1e-4a93-4fe6-909d-2cc2a11993c7\") " pod="openstack/nova-cell0-conductor-0" Nov 27 16:22:24 crc kubenswrapper[4707]: I1127 16:22:24.843790 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z7jpb\" (UniqueName: \"kubernetes.io/projected/2d07ee1e-4a93-4fe6-909d-2cc2a11993c7-kube-api-access-z7jpb\") pod \"nova-cell0-conductor-0\" (UID: \"2d07ee1e-4a93-4fe6-909d-2cc2a11993c7\") " pod="openstack/nova-cell0-conductor-0" Nov 27 16:22:24 crc kubenswrapper[4707]: I1127 16:22:24.849281 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2d07ee1e-4a93-4fe6-909d-2cc2a11993c7-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"2d07ee1e-4a93-4fe6-909d-2cc2a11993c7\") " pod="openstack/nova-cell0-conductor-0" Nov 27 16:22:24 crc kubenswrapper[4707]: I1127 16:22:24.853160 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2d07ee1e-4a93-4fe6-909d-2cc2a11993c7-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"2d07ee1e-4a93-4fe6-909d-2cc2a11993c7\") " pod="openstack/nova-cell0-conductor-0" Nov 27 16:22:24 crc kubenswrapper[4707]: I1127 16:22:24.859797 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z7jpb\" (UniqueName: \"kubernetes.io/projected/2d07ee1e-4a93-4fe6-909d-2cc2a11993c7-kube-api-access-z7jpb\") pod \"nova-cell0-conductor-0\" (UID: 
\"2d07ee1e-4a93-4fe6-909d-2cc2a11993c7\") " pod="openstack/nova-cell0-conductor-0" Nov 27 16:22:25 crc kubenswrapper[4707]: I1127 16:22:25.011092 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Nov 27 16:22:25 crc kubenswrapper[4707]: W1127 16:22:25.588630 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2d07ee1e_4a93_4fe6_909d_2cc2a11993c7.slice/crio-d4a059f73430975c8662aef821c7cb7c57bd87c3644a7a52663ab4e608b20e56 WatchSource:0}: Error finding container d4a059f73430975c8662aef821c7cb7c57bd87c3644a7a52663ab4e608b20e56: Status 404 returned error can't find the container with id d4a059f73430975c8662aef821c7cb7c57bd87c3644a7a52663ab4e608b20e56 Nov 27 16:22:25 crc kubenswrapper[4707]: I1127 16:22:25.594667 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Nov 27 16:22:25 crc kubenswrapper[4707]: I1127 16:22:25.626063 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f3677d27-3ac0-4b22-a3c0-ffe82b37b437","Type":"ContainerStarted","Data":"54b0d26da6d3938b0707501bba757eec138a57173cbf49ddd52608443a2b6d6b"} Nov 27 16:22:25 crc kubenswrapper[4707]: I1127 16:22:25.626927 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Nov 27 16:22:25 crc kubenswrapper[4707]: I1127 16:22:25.628298 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"2d07ee1e-4a93-4fe6-909d-2cc2a11993c7","Type":"ContainerStarted","Data":"d4a059f73430975c8662aef821c7cb7c57bd87c3644a7a52663ab4e608b20e56"} Nov 27 16:22:26 crc kubenswrapper[4707]: I1127 16:22:26.642964 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" 
event={"ID":"2d07ee1e-4a93-4fe6-909d-2cc2a11993c7","Type":"ContainerStarted","Data":"e700f7c6f5c202c227d757e27344ddb167eeee4a0778d1fb07e15a5d32d8e0c6"} Nov 27 16:22:26 crc kubenswrapper[4707]: I1127 16:22:26.670843 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.942627025 podStartE2EDuration="6.670820794s" podCreationTimestamp="2025-11-27 16:22:20 +0000 UTC" firstStartedPulling="2025-11-27 16:22:21.620002594 +0000 UTC m=+1117.251451362" lastFinishedPulling="2025-11-27 16:22:25.348196353 +0000 UTC m=+1120.979645131" observedRunningTime="2025-11-27 16:22:25.649408973 +0000 UTC m=+1121.280857771" watchObservedRunningTime="2025-11-27 16:22:26.670820794 +0000 UTC m=+1122.302269602" Nov 27 16:22:26 crc kubenswrapper[4707]: I1127 16:22:26.672560 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=2.672549637 podStartE2EDuration="2.672549637s" podCreationTimestamp="2025-11-27 16:22:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-27 16:22:26.666692649 +0000 UTC m=+1122.298141457" watchObservedRunningTime="2025-11-27 16:22:26.672549637 +0000 UTC m=+1122.303998445" Nov 27 16:22:27 crc kubenswrapper[4707]: I1127 16:22:27.654679 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0" Nov 27 16:22:30 crc kubenswrapper[4707]: I1127 16:22:30.060904 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0" Nov 27 16:22:30 crc kubenswrapper[4707]: I1127 16:22:30.624814 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-cell-mapping-krw8h"] Nov 27 16:22:30 crc kubenswrapper[4707]: I1127 16:22:30.630461 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-krw8h" Nov 27 16:22:30 crc kubenswrapper[4707]: I1127 16:22:30.638879 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-config-data" Nov 27 16:22:30 crc kubenswrapper[4707]: I1127 16:22:30.639870 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-scripts" Nov 27 16:22:30 crc kubenswrapper[4707]: I1127 16:22:30.672452 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-krw8h"] Nov 27 16:22:30 crc kubenswrapper[4707]: I1127 16:22:30.712402 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/831d6788-637b-4a22-8ed9-1f39e8e277a0-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-krw8h\" (UID: \"831d6788-637b-4a22-8ed9-1f39e8e277a0\") " pod="openstack/nova-cell0-cell-mapping-krw8h" Nov 27 16:22:30 crc kubenswrapper[4707]: I1127 16:22:30.712459 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-582z9\" (UniqueName: \"kubernetes.io/projected/831d6788-637b-4a22-8ed9-1f39e8e277a0-kube-api-access-582z9\") pod \"nova-cell0-cell-mapping-krw8h\" (UID: \"831d6788-637b-4a22-8ed9-1f39e8e277a0\") " pod="openstack/nova-cell0-cell-mapping-krw8h" Nov 27 16:22:30 crc kubenswrapper[4707]: I1127 16:22:30.712529 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/831d6788-637b-4a22-8ed9-1f39e8e277a0-config-data\") pod \"nova-cell0-cell-mapping-krw8h\" (UID: \"831d6788-637b-4a22-8ed9-1f39e8e277a0\") " pod="openstack/nova-cell0-cell-mapping-krw8h" Nov 27 16:22:30 crc kubenswrapper[4707]: I1127 16:22:30.712612 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" 
(UniqueName: \"kubernetes.io/secret/831d6788-637b-4a22-8ed9-1f39e8e277a0-scripts\") pod \"nova-cell0-cell-mapping-krw8h\" (UID: \"831d6788-637b-4a22-8ed9-1f39e8e277a0\") " pod="openstack/nova-cell0-cell-mapping-krw8h" Nov 27 16:22:30 crc kubenswrapper[4707]: I1127 16:22:30.800627 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Nov 27 16:22:30 crc kubenswrapper[4707]: I1127 16:22:30.804420 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Nov 27 16:22:30 crc kubenswrapper[4707]: I1127 16:22:30.810756 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Nov 27 16:22:30 crc kubenswrapper[4707]: I1127 16:22:30.814606 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Nov 27 16:22:30 crc kubenswrapper[4707]: I1127 16:22:30.815570 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4l4jk\" (UniqueName: \"kubernetes.io/projected/5ee13539-219f-4772-9569-f8526eff8cda-kube-api-access-4l4jk\") pod \"nova-api-0\" (UID: \"5ee13539-219f-4772-9569-f8526eff8cda\") " pod="openstack/nova-api-0" Nov 27 16:22:30 crc kubenswrapper[4707]: I1127 16:22:30.815630 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/831d6788-637b-4a22-8ed9-1f39e8e277a0-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-krw8h\" (UID: \"831d6788-637b-4a22-8ed9-1f39e8e277a0\") " pod="openstack/nova-cell0-cell-mapping-krw8h" Nov 27 16:22:30 crc kubenswrapper[4707]: I1127 16:22:30.815654 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-582z9\" (UniqueName: \"kubernetes.io/projected/831d6788-637b-4a22-8ed9-1f39e8e277a0-kube-api-access-582z9\") pod \"nova-cell0-cell-mapping-krw8h\" (UID: \"831d6788-637b-4a22-8ed9-1f39e8e277a0\") " 
pod="openstack/nova-cell0-cell-mapping-krw8h" Nov 27 16:22:30 crc kubenswrapper[4707]: I1127 16:22:30.815716 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/831d6788-637b-4a22-8ed9-1f39e8e277a0-config-data\") pod \"nova-cell0-cell-mapping-krw8h\" (UID: \"831d6788-637b-4a22-8ed9-1f39e8e277a0\") " pod="openstack/nova-cell0-cell-mapping-krw8h" Nov 27 16:22:30 crc kubenswrapper[4707]: I1127 16:22:30.815774 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/831d6788-637b-4a22-8ed9-1f39e8e277a0-scripts\") pod \"nova-cell0-cell-mapping-krw8h\" (UID: \"831d6788-637b-4a22-8ed9-1f39e8e277a0\") " pod="openstack/nova-cell0-cell-mapping-krw8h" Nov 27 16:22:30 crc kubenswrapper[4707]: I1127 16:22:30.815798 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5ee13539-219f-4772-9569-f8526eff8cda-config-data\") pod \"nova-api-0\" (UID: \"5ee13539-219f-4772-9569-f8526eff8cda\") " pod="openstack/nova-api-0" Nov 27 16:22:30 crc kubenswrapper[4707]: I1127 16:22:30.815848 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5ee13539-219f-4772-9569-f8526eff8cda-logs\") pod \"nova-api-0\" (UID: \"5ee13539-219f-4772-9569-f8526eff8cda\") " pod="openstack/nova-api-0" Nov 27 16:22:30 crc kubenswrapper[4707]: I1127 16:22:30.815870 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5ee13539-219f-4772-9569-f8526eff8cda-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"5ee13539-219f-4772-9569-f8526eff8cda\") " pod="openstack/nova-api-0" Nov 27 16:22:30 crc kubenswrapper[4707]: I1127 16:22:30.836595 4707 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/831d6788-637b-4a22-8ed9-1f39e8e277a0-scripts\") pod \"nova-cell0-cell-mapping-krw8h\" (UID: \"831d6788-637b-4a22-8ed9-1f39e8e277a0\") " pod="openstack/nova-cell0-cell-mapping-krw8h" Nov 27 16:22:30 crc kubenswrapper[4707]: I1127 16:22:30.836845 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Nov 27 16:22:30 crc kubenswrapper[4707]: I1127 16:22:30.838499 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Nov 27 16:22:30 crc kubenswrapper[4707]: I1127 16:22:30.840019 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/831d6788-637b-4a22-8ed9-1f39e8e277a0-config-data\") pod \"nova-cell0-cell-mapping-krw8h\" (UID: \"831d6788-637b-4a22-8ed9-1f39e8e277a0\") " pod="openstack/nova-cell0-cell-mapping-krw8h" Nov 27 16:22:30 crc kubenswrapper[4707]: I1127 16:22:30.845040 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Nov 27 16:22:30 crc kubenswrapper[4707]: I1127 16:22:30.866414 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Nov 27 16:22:30 crc kubenswrapper[4707]: I1127 16:22:30.866667 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-582z9\" (UniqueName: \"kubernetes.io/projected/831d6788-637b-4a22-8ed9-1f39e8e277a0-kube-api-access-582z9\") pod \"nova-cell0-cell-mapping-krw8h\" (UID: \"831d6788-637b-4a22-8ed9-1f39e8e277a0\") " pod="openstack/nova-cell0-cell-mapping-krw8h" Nov 27 16:22:30 crc kubenswrapper[4707]: I1127 16:22:30.867000 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/831d6788-637b-4a22-8ed9-1f39e8e277a0-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-krw8h\" (UID: 
\"831d6788-637b-4a22-8ed9-1f39e8e277a0\") " pod="openstack/nova-cell0-cell-mapping-krw8h" Nov 27 16:22:30 crc kubenswrapper[4707]: I1127 16:22:30.908910 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Nov 27 16:22:30 crc kubenswrapper[4707]: I1127 16:22:30.910138 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Nov 27 16:22:30 crc kubenswrapper[4707]: I1127 16:22:30.917666 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/eaf72ae8-b918-4663-9cab-dafb3949d87b-logs\") pod \"nova-metadata-0\" (UID: \"eaf72ae8-b918-4663-9cab-dafb3949d87b\") " pod="openstack/nova-metadata-0" Nov 27 16:22:30 crc kubenswrapper[4707]: I1127 16:22:30.917706 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-99nrp\" (UniqueName: \"kubernetes.io/projected/c280ae8f-cb0b-4035-a7a2-fedfb044a6f0-kube-api-access-99nrp\") pod \"nova-scheduler-0\" (UID: \"c280ae8f-cb0b-4035-a7a2-fedfb044a6f0\") " pod="openstack/nova-scheduler-0" Nov 27 16:22:30 crc kubenswrapper[4707]: I1127 16:22:30.917730 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9nlnk\" (UniqueName: \"kubernetes.io/projected/eaf72ae8-b918-4663-9cab-dafb3949d87b-kube-api-access-9nlnk\") pod \"nova-metadata-0\" (UID: \"eaf72ae8-b918-4663-9cab-dafb3949d87b\") " pod="openstack/nova-metadata-0" Nov 27 16:22:30 crc kubenswrapper[4707]: I1127 16:22:30.917753 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eaf72ae8-b918-4663-9cab-dafb3949d87b-config-data\") pod \"nova-metadata-0\" (UID: \"eaf72ae8-b918-4663-9cab-dafb3949d87b\") " pod="openstack/nova-metadata-0" Nov 27 16:22:30 crc kubenswrapper[4707]: I1127 
16:22:30.917820 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c280ae8f-cb0b-4035-a7a2-fedfb044a6f0-config-data\") pod \"nova-scheduler-0\" (UID: \"c280ae8f-cb0b-4035-a7a2-fedfb044a6f0\") " pod="openstack/nova-scheduler-0" Nov 27 16:22:30 crc kubenswrapper[4707]: I1127 16:22:30.917852 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5ee13539-219f-4772-9569-f8526eff8cda-config-data\") pod \"nova-api-0\" (UID: \"5ee13539-219f-4772-9569-f8526eff8cda\") " pod="openstack/nova-api-0" Nov 27 16:22:30 crc kubenswrapper[4707]: I1127 16:22:30.917906 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eaf72ae8-b918-4663-9cab-dafb3949d87b-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"eaf72ae8-b918-4663-9cab-dafb3949d87b\") " pod="openstack/nova-metadata-0" Nov 27 16:22:30 crc kubenswrapper[4707]: I1127 16:22:30.917929 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5ee13539-219f-4772-9569-f8526eff8cda-logs\") pod \"nova-api-0\" (UID: \"5ee13539-219f-4772-9569-f8526eff8cda\") " pod="openstack/nova-api-0" Nov 27 16:22:30 crc kubenswrapper[4707]: I1127 16:22:30.917952 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5ee13539-219f-4772-9569-f8526eff8cda-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"5ee13539-219f-4772-9569-f8526eff8cda\") " pod="openstack/nova-api-0" Nov 27 16:22:30 crc kubenswrapper[4707]: I1127 16:22:30.917969 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4l4jk\" (UniqueName: 
\"kubernetes.io/projected/5ee13539-219f-4772-9569-f8526eff8cda-kube-api-access-4l4jk\") pod \"nova-api-0\" (UID: \"5ee13539-219f-4772-9569-f8526eff8cda\") " pod="openstack/nova-api-0" Nov 27 16:22:30 crc kubenswrapper[4707]: I1127 16:22:30.917994 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c280ae8f-cb0b-4035-a7a2-fedfb044a6f0-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"c280ae8f-cb0b-4035-a7a2-fedfb044a6f0\") " pod="openstack/nova-scheduler-0" Nov 27 16:22:30 crc kubenswrapper[4707]: I1127 16:22:30.920390 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5ee13539-219f-4772-9569-f8526eff8cda-logs\") pod \"nova-api-0\" (UID: \"5ee13539-219f-4772-9569-f8526eff8cda\") " pod="openstack/nova-api-0" Nov 27 16:22:30 crc kubenswrapper[4707]: I1127 16:22:30.924964 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5ee13539-219f-4772-9569-f8526eff8cda-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"5ee13539-219f-4772-9569-f8526eff8cda\") " pod="openstack/nova-api-0" Nov 27 16:22:30 crc kubenswrapper[4707]: I1127 16:22:30.933277 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Nov 27 16:22:30 crc kubenswrapper[4707]: I1127 16:22:30.950091 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Nov 27 16:22:30 crc kubenswrapper[4707]: I1127 16:22:30.950772 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5ee13539-219f-4772-9569-f8526eff8cda-config-data\") pod \"nova-api-0\" (UID: \"5ee13539-219f-4772-9569-f8526eff8cda\") " pod="openstack/nova-api-0" Nov 27 16:22:30 crc kubenswrapper[4707]: I1127 16:22:30.970856 4707 util.go:30] "No sandbox 
for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-krw8h" Nov 27 16:22:30 crc kubenswrapper[4707]: I1127 16:22:30.979911 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4l4jk\" (UniqueName: \"kubernetes.io/projected/5ee13539-219f-4772-9569-f8526eff8cda-kube-api-access-4l4jk\") pod \"nova-api-0\" (UID: \"5ee13539-219f-4772-9569-f8526eff8cda\") " pod="openstack/nova-api-0" Nov 27 16:22:31 crc kubenswrapper[4707]: I1127 16:22:31.021307 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Nov 27 16:22:31 crc kubenswrapper[4707]: I1127 16:22:31.022363 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c280ae8f-cb0b-4035-a7a2-fedfb044a6f0-config-data\") pod \"nova-scheduler-0\" (UID: \"c280ae8f-cb0b-4035-a7a2-fedfb044a6f0\") " pod="openstack/nova-scheduler-0" Nov 27 16:22:31 crc kubenswrapper[4707]: I1127 16:22:31.036708 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eaf72ae8-b918-4663-9cab-dafb3949d87b-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"eaf72ae8-b918-4663-9cab-dafb3949d87b\") " pod="openstack/nova-metadata-0" Nov 27 16:22:31 crc kubenswrapper[4707]: I1127 16:22:31.036818 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c280ae8f-cb0b-4035-a7a2-fedfb044a6f0-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"c280ae8f-cb0b-4035-a7a2-fedfb044a6f0\") " pod="openstack/nova-scheduler-0" Nov 27 16:22:31 crc kubenswrapper[4707]: I1127 16:22:31.036883 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/eaf72ae8-b918-4663-9cab-dafb3949d87b-logs\") pod \"nova-metadata-0\" (UID: 
\"eaf72ae8-b918-4663-9cab-dafb3949d87b\") " pod="openstack/nova-metadata-0" Nov 27 16:22:31 crc kubenswrapper[4707]: I1127 16:22:31.036931 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-99nrp\" (UniqueName: \"kubernetes.io/projected/c280ae8f-cb0b-4035-a7a2-fedfb044a6f0-kube-api-access-99nrp\") pod \"nova-scheduler-0\" (UID: \"c280ae8f-cb0b-4035-a7a2-fedfb044a6f0\") " pod="openstack/nova-scheduler-0" Nov 27 16:22:31 crc kubenswrapper[4707]: I1127 16:22:31.036963 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9nlnk\" (UniqueName: \"kubernetes.io/projected/eaf72ae8-b918-4663-9cab-dafb3949d87b-kube-api-access-9nlnk\") pod \"nova-metadata-0\" (UID: \"eaf72ae8-b918-4663-9cab-dafb3949d87b\") " pod="openstack/nova-metadata-0" Nov 27 16:22:31 crc kubenswrapper[4707]: I1127 16:22:31.036996 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eaf72ae8-b918-4663-9cab-dafb3949d87b-config-data\") pod \"nova-metadata-0\" (UID: \"eaf72ae8-b918-4663-9cab-dafb3949d87b\") " pod="openstack/nova-metadata-0" Nov 27 16:22:31 crc kubenswrapper[4707]: I1127 16:22:31.047909 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/eaf72ae8-b918-4663-9cab-dafb3949d87b-logs\") pod \"nova-metadata-0\" (UID: \"eaf72ae8-b918-4663-9cab-dafb3949d87b\") " pod="openstack/nova-metadata-0" Nov 27 16:22:31 crc kubenswrapper[4707]: I1127 16:22:31.048175 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c280ae8f-cb0b-4035-a7a2-fedfb044a6f0-config-data\") pod \"nova-scheduler-0\" (UID: \"c280ae8f-cb0b-4035-a7a2-fedfb044a6f0\") " pod="openstack/nova-scheduler-0" Nov 27 16:22:31 crc kubenswrapper[4707]: I1127 16:22:31.061144 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eaf72ae8-b918-4663-9cab-dafb3949d87b-config-data\") pod \"nova-metadata-0\" (UID: \"eaf72ae8-b918-4663-9cab-dafb3949d87b\") " pod="openstack/nova-metadata-0" Nov 27 16:22:31 crc kubenswrapper[4707]: I1127 16:22:31.079871 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Nov 27 16:22:31 crc kubenswrapper[4707]: I1127 16:22:31.080894 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Nov 27 16:22:31 crc kubenswrapper[4707]: I1127 16:22:31.097119 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Nov 27 16:22:31 crc kubenswrapper[4707]: I1127 16:22:31.100008 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eaf72ae8-b918-4663-9cab-dafb3949d87b-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"eaf72ae8-b918-4663-9cab-dafb3949d87b\") " pod="openstack/nova-metadata-0" Nov 27 16:22:31 crc kubenswrapper[4707]: I1127 16:22:31.110205 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-99nrp\" (UniqueName: \"kubernetes.io/projected/c280ae8f-cb0b-4035-a7a2-fedfb044a6f0-kube-api-access-99nrp\") pod \"nova-scheduler-0\" (UID: \"c280ae8f-cb0b-4035-a7a2-fedfb044a6f0\") " pod="openstack/nova-scheduler-0" Nov 27 16:22:31 crc kubenswrapper[4707]: I1127 16:22:31.110621 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9nlnk\" (UniqueName: \"kubernetes.io/projected/eaf72ae8-b918-4663-9cab-dafb3949d87b-kube-api-access-9nlnk\") pod \"nova-metadata-0\" (UID: \"eaf72ae8-b918-4663-9cab-dafb3949d87b\") " pod="openstack/nova-metadata-0" Nov 27 16:22:31 crc kubenswrapper[4707]: I1127 16:22:31.111227 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/c280ae8f-cb0b-4035-a7a2-fedfb044a6f0-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"c280ae8f-cb0b-4035-a7a2-fedfb044a6f0\") " pod="openstack/nova-scheduler-0" Nov 27 16:22:31 crc kubenswrapper[4707]: I1127 16:22:31.116933 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-9b86998b5-rs6pv"] Nov 27 16:22:31 crc kubenswrapper[4707]: I1127 16:22:31.118333 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-9b86998b5-rs6pv" Nov 27 16:22:31 crc kubenswrapper[4707]: I1127 16:22:31.137721 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/34748232-abe0-4c63-9fe6-68118b7fed04-ovsdbserver-nb\") pod \"dnsmasq-dns-9b86998b5-rs6pv\" (UID: \"34748232-abe0-4c63-9fe6-68118b7fed04\") " pod="openstack/dnsmasq-dns-9b86998b5-rs6pv" Nov 27 16:22:31 crc kubenswrapper[4707]: I1127 16:22:31.137757 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7hbvd\" (UniqueName: \"kubernetes.io/projected/34748232-abe0-4c63-9fe6-68118b7fed04-kube-api-access-7hbvd\") pod \"dnsmasq-dns-9b86998b5-rs6pv\" (UID: \"34748232-abe0-4c63-9fe6-68118b7fed04\") " pod="openstack/dnsmasq-dns-9b86998b5-rs6pv" Nov 27 16:22:31 crc kubenswrapper[4707]: I1127 16:22:31.137778 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/34748232-abe0-4c63-9fe6-68118b7fed04-config\") pod \"dnsmasq-dns-9b86998b5-rs6pv\" (UID: \"34748232-abe0-4c63-9fe6-68118b7fed04\") " pod="openstack/dnsmasq-dns-9b86998b5-rs6pv" Nov 27 16:22:31 crc kubenswrapper[4707]: I1127 16:22:31.137852 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/eb68c751-abf5-49ac-985b-e0e9c3a7493f-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"eb68c751-abf5-49ac-985b-e0e9c3a7493f\") " pod="openstack/nova-cell1-novncproxy-0" Nov 27 16:22:31 crc kubenswrapper[4707]: I1127 16:22:31.137876 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xmwmj\" (UniqueName: \"kubernetes.io/projected/eb68c751-abf5-49ac-985b-e0e9c3a7493f-kube-api-access-xmwmj\") pod \"nova-cell1-novncproxy-0\" (UID: \"eb68c751-abf5-49ac-985b-e0e9c3a7493f\") " pod="openstack/nova-cell1-novncproxy-0" Nov 27 16:22:31 crc kubenswrapper[4707]: I1127 16:22:31.137922 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/34748232-abe0-4c63-9fe6-68118b7fed04-dns-svc\") pod \"dnsmasq-dns-9b86998b5-rs6pv\" (UID: \"34748232-abe0-4c63-9fe6-68118b7fed04\") " pod="openstack/dnsmasq-dns-9b86998b5-rs6pv" Nov 27 16:22:31 crc kubenswrapper[4707]: I1127 16:22:31.137960 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eb68c751-abf5-49ac-985b-e0e9c3a7493f-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"eb68c751-abf5-49ac-985b-e0e9c3a7493f\") " pod="openstack/nova-cell1-novncproxy-0" Nov 27 16:22:31 crc kubenswrapper[4707]: I1127 16:22:31.137981 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/34748232-abe0-4c63-9fe6-68118b7fed04-ovsdbserver-sb\") pod \"dnsmasq-dns-9b86998b5-rs6pv\" (UID: \"34748232-abe0-4c63-9fe6-68118b7fed04\") " pod="openstack/dnsmasq-dns-9b86998b5-rs6pv" Nov 27 16:22:31 crc kubenswrapper[4707]: I1127 16:22:31.138024 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" 
(UniqueName: \"kubernetes.io/configmap/34748232-abe0-4c63-9fe6-68118b7fed04-dns-swift-storage-0\") pod \"dnsmasq-dns-9b86998b5-rs6pv\" (UID: \"34748232-abe0-4c63-9fe6-68118b7fed04\") " pod="openstack/dnsmasq-dns-9b86998b5-rs6pv" Nov 27 16:22:31 crc kubenswrapper[4707]: I1127 16:22:31.163270 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Nov 27 16:22:31 crc kubenswrapper[4707]: I1127 16:22:31.237950 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-9b86998b5-rs6pv"] Nov 27 16:22:31 crc kubenswrapper[4707]: I1127 16:22:31.238812 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eb68c751-abf5-49ac-985b-e0e9c3a7493f-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"eb68c751-abf5-49ac-985b-e0e9c3a7493f\") " pod="openstack/nova-cell1-novncproxy-0" Nov 27 16:22:31 crc kubenswrapper[4707]: I1127 16:22:31.238849 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/34748232-abe0-4c63-9fe6-68118b7fed04-ovsdbserver-sb\") pod \"dnsmasq-dns-9b86998b5-rs6pv\" (UID: \"34748232-abe0-4c63-9fe6-68118b7fed04\") " pod="openstack/dnsmasq-dns-9b86998b5-rs6pv" Nov 27 16:22:31 crc kubenswrapper[4707]: I1127 16:22:31.238893 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/34748232-abe0-4c63-9fe6-68118b7fed04-dns-swift-storage-0\") pod \"dnsmasq-dns-9b86998b5-rs6pv\" (UID: \"34748232-abe0-4c63-9fe6-68118b7fed04\") " pod="openstack/dnsmasq-dns-9b86998b5-rs6pv" Nov 27 16:22:31 crc kubenswrapper[4707]: I1127 16:22:31.238920 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/34748232-abe0-4c63-9fe6-68118b7fed04-ovsdbserver-nb\") pod 
\"dnsmasq-dns-9b86998b5-rs6pv\" (UID: \"34748232-abe0-4c63-9fe6-68118b7fed04\") " pod="openstack/dnsmasq-dns-9b86998b5-rs6pv" Nov 27 16:22:31 crc kubenswrapper[4707]: I1127 16:22:31.238942 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7hbvd\" (UniqueName: \"kubernetes.io/projected/34748232-abe0-4c63-9fe6-68118b7fed04-kube-api-access-7hbvd\") pod \"dnsmasq-dns-9b86998b5-rs6pv\" (UID: \"34748232-abe0-4c63-9fe6-68118b7fed04\") " pod="openstack/dnsmasq-dns-9b86998b5-rs6pv" Nov 27 16:22:31 crc kubenswrapper[4707]: I1127 16:22:31.238960 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/34748232-abe0-4c63-9fe6-68118b7fed04-config\") pod \"dnsmasq-dns-9b86998b5-rs6pv\" (UID: \"34748232-abe0-4c63-9fe6-68118b7fed04\") " pod="openstack/dnsmasq-dns-9b86998b5-rs6pv" Nov 27 16:22:31 crc kubenswrapper[4707]: I1127 16:22:31.239026 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eb68c751-abf5-49ac-985b-e0e9c3a7493f-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"eb68c751-abf5-49ac-985b-e0e9c3a7493f\") " pod="openstack/nova-cell1-novncproxy-0" Nov 27 16:22:31 crc kubenswrapper[4707]: I1127 16:22:31.239047 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xmwmj\" (UniqueName: \"kubernetes.io/projected/eb68c751-abf5-49ac-985b-e0e9c3a7493f-kube-api-access-xmwmj\") pod \"nova-cell1-novncproxy-0\" (UID: \"eb68c751-abf5-49ac-985b-e0e9c3a7493f\") " pod="openstack/nova-cell1-novncproxy-0" Nov 27 16:22:31 crc kubenswrapper[4707]: I1127 16:22:31.239080 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/34748232-abe0-4c63-9fe6-68118b7fed04-dns-svc\") pod \"dnsmasq-dns-9b86998b5-rs6pv\" (UID: \"34748232-abe0-4c63-9fe6-68118b7fed04\") " 
pod="openstack/dnsmasq-dns-9b86998b5-rs6pv" Nov 27 16:22:31 crc kubenswrapper[4707]: I1127 16:22:31.239844 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/34748232-abe0-4c63-9fe6-68118b7fed04-dns-svc\") pod \"dnsmasq-dns-9b86998b5-rs6pv\" (UID: \"34748232-abe0-4c63-9fe6-68118b7fed04\") " pod="openstack/dnsmasq-dns-9b86998b5-rs6pv" Nov 27 16:22:31 crc kubenswrapper[4707]: I1127 16:22:31.240355 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/34748232-abe0-4c63-9fe6-68118b7fed04-config\") pod \"dnsmasq-dns-9b86998b5-rs6pv\" (UID: \"34748232-abe0-4c63-9fe6-68118b7fed04\") " pod="openstack/dnsmasq-dns-9b86998b5-rs6pv" Nov 27 16:22:31 crc kubenswrapper[4707]: I1127 16:22:31.242233 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/34748232-abe0-4c63-9fe6-68118b7fed04-dns-swift-storage-0\") pod \"dnsmasq-dns-9b86998b5-rs6pv\" (UID: \"34748232-abe0-4c63-9fe6-68118b7fed04\") " pod="openstack/dnsmasq-dns-9b86998b5-rs6pv" Nov 27 16:22:31 crc kubenswrapper[4707]: I1127 16:22:31.242731 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/34748232-abe0-4c63-9fe6-68118b7fed04-ovsdbserver-sb\") pod \"dnsmasq-dns-9b86998b5-rs6pv\" (UID: \"34748232-abe0-4c63-9fe6-68118b7fed04\") " pod="openstack/dnsmasq-dns-9b86998b5-rs6pv" Nov 27 16:22:31 crc kubenswrapper[4707]: I1127 16:22:31.244555 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/34748232-abe0-4c63-9fe6-68118b7fed04-ovsdbserver-nb\") pod \"dnsmasq-dns-9b86998b5-rs6pv\" (UID: \"34748232-abe0-4c63-9fe6-68118b7fed04\") " pod="openstack/dnsmasq-dns-9b86998b5-rs6pv" Nov 27 16:22:31 crc kubenswrapper[4707]: I1127 16:22:31.254926 4707 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eb68c751-abf5-49ac-985b-e0e9c3a7493f-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"eb68c751-abf5-49ac-985b-e0e9c3a7493f\") " pod="openstack/nova-cell1-novncproxy-0" Nov 27 16:22:31 crc kubenswrapper[4707]: I1127 16:22:31.256831 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eb68c751-abf5-49ac-985b-e0e9c3a7493f-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"eb68c751-abf5-49ac-985b-e0e9c3a7493f\") " pod="openstack/nova-cell1-novncproxy-0" Nov 27 16:22:31 crc kubenswrapper[4707]: I1127 16:22:31.270303 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7hbvd\" (UniqueName: \"kubernetes.io/projected/34748232-abe0-4c63-9fe6-68118b7fed04-kube-api-access-7hbvd\") pod \"dnsmasq-dns-9b86998b5-rs6pv\" (UID: \"34748232-abe0-4c63-9fe6-68118b7fed04\") " pod="openstack/dnsmasq-dns-9b86998b5-rs6pv" Nov 27 16:22:31 crc kubenswrapper[4707]: I1127 16:22:31.284008 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xmwmj\" (UniqueName: \"kubernetes.io/projected/eb68c751-abf5-49ac-985b-e0e9c3a7493f-kube-api-access-xmwmj\") pod \"nova-cell1-novncproxy-0\" (UID: \"eb68c751-abf5-49ac-985b-e0e9c3a7493f\") " pod="openstack/nova-cell1-novncproxy-0" Nov 27 16:22:31 crc kubenswrapper[4707]: I1127 16:22:31.372833 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Nov 27 16:22:31 crc kubenswrapper[4707]: I1127 16:22:31.398484 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Nov 27 16:22:31 crc kubenswrapper[4707]: I1127 16:22:31.458057 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Nov 27 16:22:31 crc kubenswrapper[4707]: I1127 16:22:31.487922 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-9b86998b5-rs6pv" Nov 27 16:22:31 crc kubenswrapper[4707]: W1127 16:22:31.761863 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod831d6788_637b_4a22_8ed9_1f39e8e277a0.slice/crio-7fc7ac91f8d2acb8b2b6a1b58dd7c154bfc33805a5b1ff4e900d3c24b84fa7b3 WatchSource:0}: Error finding container 7fc7ac91f8d2acb8b2b6a1b58dd7c154bfc33805a5b1ff4e900d3c24b84fa7b3: Status 404 returned error can't find the container with id 7fc7ac91f8d2acb8b2b6a1b58dd7c154bfc33805a5b1ff4e900d3c24b84fa7b3 Nov 27 16:22:31 crc kubenswrapper[4707]: I1127 16:22:31.768041 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Nov 27 16:22:31 crc kubenswrapper[4707]: I1127 16:22:31.782556 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-krw8h"] Nov 27 16:22:31 crc kubenswrapper[4707]: I1127 16:22:31.926197 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-db-sync-xb5wv"] Nov 27 16:22:31 crc kubenswrapper[4707]: I1127 16:22:31.927746 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-xb5wv" Nov 27 16:22:31 crc kubenswrapper[4707]: I1127 16:22:31.932878 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Nov 27 16:22:31 crc kubenswrapper[4707]: I1127 16:22:31.933077 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-scripts" Nov 27 16:22:31 crc kubenswrapper[4707]: I1127 16:22:31.943615 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-xb5wv"] Nov 27 16:22:31 crc kubenswrapper[4707]: I1127 16:22:31.961516 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Nov 27 16:22:31 crc kubenswrapper[4707]: W1127 16:22:31.965319 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc280ae8f_cb0b_4035_a7a2_fedfb044a6f0.slice/crio-54db668a5b818364925e4a6f0b61e5c19e947ee3510c3469d13f649fc2608566 WatchSource:0}: Error finding container 54db668a5b818364925e4a6f0b61e5c19e947ee3510c3469d13f649fc2608566: Status 404 returned error can't find the container with id 54db668a5b818364925e4a6f0b61e5c19e947ee3510c3469d13f649fc2608566 Nov 27 16:22:31 crc kubenswrapper[4707]: I1127 16:22:31.970239 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Nov 27 16:22:32 crc kubenswrapper[4707]: W1127 16:22:32.065524 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podeb68c751_abf5_49ac_985b_e0e9c3a7493f.slice/crio-8a749f1aee0961603663152a93d69884067112ada39ab9fc4d50c8c071cf3cfe WatchSource:0}: Error finding container 8a749f1aee0961603663152a93d69884067112ada39ab9fc4d50c8c071cf3cfe: Status 404 returned error can't find the container with id 8a749f1aee0961603663152a93d69884067112ada39ab9fc4d50c8c071cf3cfe Nov 27 16:22:32 
crc kubenswrapper[4707]: I1127 16:22:32.065986 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Nov 27 16:22:32 crc kubenswrapper[4707]: I1127 16:22:32.085230 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e14300b4-67fd-4da5-8741-879007e38268-scripts\") pod \"nova-cell1-conductor-db-sync-xb5wv\" (UID: \"e14300b4-67fd-4da5-8741-879007e38268\") " pod="openstack/nova-cell1-conductor-db-sync-xb5wv" Nov 27 16:22:32 crc kubenswrapper[4707]: I1127 16:22:32.086786 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j672n\" (UniqueName: \"kubernetes.io/projected/e14300b4-67fd-4da5-8741-879007e38268-kube-api-access-j672n\") pod \"nova-cell1-conductor-db-sync-xb5wv\" (UID: \"e14300b4-67fd-4da5-8741-879007e38268\") " pod="openstack/nova-cell1-conductor-db-sync-xb5wv" Nov 27 16:22:32 crc kubenswrapper[4707]: I1127 16:22:32.087118 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e14300b4-67fd-4da5-8741-879007e38268-config-data\") pod \"nova-cell1-conductor-db-sync-xb5wv\" (UID: \"e14300b4-67fd-4da5-8741-879007e38268\") " pod="openstack/nova-cell1-conductor-db-sync-xb5wv" Nov 27 16:22:32 crc kubenswrapper[4707]: I1127 16:22:32.089104 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e14300b4-67fd-4da5-8741-879007e38268-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-xb5wv\" (UID: \"e14300b4-67fd-4da5-8741-879007e38268\") " pod="openstack/nova-cell1-conductor-db-sync-xb5wv" Nov 27 16:22:32 crc kubenswrapper[4707]: W1127 16:22:32.186840 4707 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod34748232_abe0_4c63_9fe6_68118b7fed04.slice/crio-ad9c371673e96a18190933bc27f03950abea6847d7066cc5ab802b04581e8256 WatchSource:0}: Error finding container ad9c371673e96a18190933bc27f03950abea6847d7066cc5ab802b04581e8256: Status 404 returned error can't find the container with id ad9c371673e96a18190933bc27f03950abea6847d7066cc5ab802b04581e8256 Nov 27 16:22:32 crc kubenswrapper[4707]: I1127 16:22:32.193287 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e14300b4-67fd-4da5-8741-879007e38268-scripts\") pod \"nova-cell1-conductor-db-sync-xb5wv\" (UID: \"e14300b4-67fd-4da5-8741-879007e38268\") " pod="openstack/nova-cell1-conductor-db-sync-xb5wv" Nov 27 16:22:32 crc kubenswrapper[4707]: I1127 16:22:32.193354 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j672n\" (UniqueName: \"kubernetes.io/projected/e14300b4-67fd-4da5-8741-879007e38268-kube-api-access-j672n\") pod \"nova-cell1-conductor-db-sync-xb5wv\" (UID: \"e14300b4-67fd-4da5-8741-879007e38268\") " pod="openstack/nova-cell1-conductor-db-sync-xb5wv" Nov 27 16:22:32 crc kubenswrapper[4707]: I1127 16:22:32.193401 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e14300b4-67fd-4da5-8741-879007e38268-config-data\") pod \"nova-cell1-conductor-db-sync-xb5wv\" (UID: \"e14300b4-67fd-4da5-8741-879007e38268\") " pod="openstack/nova-cell1-conductor-db-sync-xb5wv" Nov 27 16:22:32 crc kubenswrapper[4707]: I1127 16:22:32.193435 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e14300b4-67fd-4da5-8741-879007e38268-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-xb5wv\" (UID: \"e14300b4-67fd-4da5-8741-879007e38268\") " 
pod="openstack/nova-cell1-conductor-db-sync-xb5wv" Nov 27 16:22:32 crc kubenswrapper[4707]: I1127 16:22:32.196396 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-9b86998b5-rs6pv"] Nov 27 16:22:32 crc kubenswrapper[4707]: I1127 16:22:32.197422 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e14300b4-67fd-4da5-8741-879007e38268-scripts\") pod \"nova-cell1-conductor-db-sync-xb5wv\" (UID: \"e14300b4-67fd-4da5-8741-879007e38268\") " pod="openstack/nova-cell1-conductor-db-sync-xb5wv" Nov 27 16:22:32 crc kubenswrapper[4707]: I1127 16:22:32.197575 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e14300b4-67fd-4da5-8741-879007e38268-config-data\") pod \"nova-cell1-conductor-db-sync-xb5wv\" (UID: \"e14300b4-67fd-4da5-8741-879007e38268\") " pod="openstack/nova-cell1-conductor-db-sync-xb5wv" Nov 27 16:22:32 crc kubenswrapper[4707]: I1127 16:22:32.198626 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e14300b4-67fd-4da5-8741-879007e38268-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-xb5wv\" (UID: \"e14300b4-67fd-4da5-8741-879007e38268\") " pod="openstack/nova-cell1-conductor-db-sync-xb5wv" Nov 27 16:22:32 crc kubenswrapper[4707]: I1127 16:22:32.208052 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j672n\" (UniqueName: \"kubernetes.io/projected/e14300b4-67fd-4da5-8741-879007e38268-kube-api-access-j672n\") pod \"nova-cell1-conductor-db-sync-xb5wv\" (UID: \"e14300b4-67fd-4da5-8741-879007e38268\") " pod="openstack/nova-cell1-conductor-db-sync-xb5wv" Nov 27 16:22:32 crc kubenswrapper[4707]: I1127 16:22:32.352668 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-xb5wv" Nov 27 16:22:32 crc kubenswrapper[4707]: I1127 16:22:32.754741 4707 generic.go:334] "Generic (PLEG): container finished" podID="34748232-abe0-4c63-9fe6-68118b7fed04" containerID="2d506a98aef89c7aa4420965fcb0a6314d9b8710484a1f60f42596ee04be0983" exitCode=0 Nov 27 16:22:32 crc kubenswrapper[4707]: I1127 16:22:32.755060 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-9b86998b5-rs6pv" event={"ID":"34748232-abe0-4c63-9fe6-68118b7fed04","Type":"ContainerDied","Data":"2d506a98aef89c7aa4420965fcb0a6314d9b8710484a1f60f42596ee04be0983"} Nov 27 16:22:32 crc kubenswrapper[4707]: I1127 16:22:32.755086 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-9b86998b5-rs6pv" event={"ID":"34748232-abe0-4c63-9fe6-68118b7fed04","Type":"ContainerStarted","Data":"ad9c371673e96a18190933bc27f03950abea6847d7066cc5ab802b04581e8256"} Nov 27 16:22:32 crc kubenswrapper[4707]: I1127 16:22:32.760745 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"5ee13539-219f-4772-9569-f8526eff8cda","Type":"ContainerStarted","Data":"b316993f7deb2fd7dcfb3d4589cc9124291bdc0eaefd0191e40272ab2268c87d"} Nov 27 16:22:32 crc kubenswrapper[4707]: I1127 16:22:32.762715 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"c280ae8f-cb0b-4035-a7a2-fedfb044a6f0","Type":"ContainerStarted","Data":"54db668a5b818364925e4a6f0b61e5c19e947ee3510c3469d13f649fc2608566"} Nov 27 16:22:32 crc kubenswrapper[4707]: I1127 16:22:32.764271 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-krw8h" event={"ID":"831d6788-637b-4a22-8ed9-1f39e8e277a0","Type":"ContainerStarted","Data":"ae09cce4caea83674a979e764de79df6336a8740b86cf3c0607b4da1bd0031ff"} Nov 27 16:22:32 crc kubenswrapper[4707]: I1127 16:22:32.764297 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/nova-cell0-cell-mapping-krw8h" event={"ID":"831d6788-637b-4a22-8ed9-1f39e8e277a0","Type":"ContainerStarted","Data":"7fc7ac91f8d2acb8b2b6a1b58dd7c154bfc33805a5b1ff4e900d3c24b84fa7b3"} Nov 27 16:22:32 crc kubenswrapper[4707]: I1127 16:22:32.767580 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"eb68c751-abf5-49ac-985b-e0e9c3a7493f","Type":"ContainerStarted","Data":"8a749f1aee0961603663152a93d69884067112ada39ab9fc4d50c8c071cf3cfe"} Nov 27 16:22:32 crc kubenswrapper[4707]: I1127 16:22:32.768429 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"eaf72ae8-b918-4663-9cab-dafb3949d87b","Type":"ContainerStarted","Data":"807e2f387b77d2114c48d83292a35e060c0b5df73050cbccad8fcfd93e9c13c1"} Nov 27 16:22:32 crc kubenswrapper[4707]: I1127 16:22:32.796149 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-cell-mapping-krw8h" podStartSLOduration=2.796126627 podStartE2EDuration="2.796126627s" podCreationTimestamp="2025-11-27 16:22:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-27 16:22:32.791924221 +0000 UTC m=+1128.423373009" watchObservedRunningTime="2025-11-27 16:22:32.796126627 +0000 UTC m=+1128.427575395" Nov 27 16:22:32 crc kubenswrapper[4707]: I1127 16:22:32.843037 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-xb5wv"] Nov 27 16:22:33 crc kubenswrapper[4707]: I1127 16:22:33.624221 4707 patch_prober.go:28] interesting pod/machine-config-daemon-c995m container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 27 16:22:33 crc kubenswrapper[4707]: I1127 16:22:33.624511 4707 prober.go:107] "Probe 
failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-c995m" podUID="a83beb0d-8dd1-434a-ace2-933f98e3956f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 27 16:22:33 crc kubenswrapper[4707]: I1127 16:22:33.781583 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-xb5wv" event={"ID":"e14300b4-67fd-4da5-8741-879007e38268","Type":"ContainerStarted","Data":"25ab784e1dfdc9d441fbd5bd3464d8122fb0d1c63df76173637c82adef4308e9"} Nov 27 16:22:33 crc kubenswrapper[4707]: I1127 16:22:33.781629 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-xb5wv" event={"ID":"e14300b4-67fd-4da5-8741-879007e38268","Type":"ContainerStarted","Data":"3cdb551e879a36942e67bf05577b637f8a1218e5bfde5fec58a1e94e40765f99"} Nov 27 16:22:33 crc kubenswrapper[4707]: I1127 16:22:33.785004 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-9b86998b5-rs6pv" event={"ID":"34748232-abe0-4c63-9fe6-68118b7fed04","Type":"ContainerStarted","Data":"850ffd2ac4f6fc750a1234dc25593a820b170122351d923e194369828bba3f32"} Nov 27 16:22:33 crc kubenswrapper[4707]: I1127 16:22:33.785340 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-9b86998b5-rs6pv" Nov 27 16:22:33 crc kubenswrapper[4707]: I1127 16:22:33.803825 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-db-sync-xb5wv" podStartSLOduration=2.80380837 podStartE2EDuration="2.80380837s" podCreationTimestamp="2025-11-27 16:22:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-27 16:22:33.803241586 +0000 UTC m=+1129.434690344" watchObservedRunningTime="2025-11-27 16:22:33.80380837 +0000 UTC 
m=+1129.435257138" Nov 27 16:22:33 crc kubenswrapper[4707]: I1127 16:22:33.829759 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-9b86998b5-rs6pv" podStartSLOduration=2.829743547 podStartE2EDuration="2.829743547s" podCreationTimestamp="2025-11-27 16:22:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-27 16:22:33.823396096 +0000 UTC m=+1129.454844864" watchObservedRunningTime="2025-11-27 16:22:33.829743547 +0000 UTC m=+1129.461192315" Nov 27 16:22:34 crc kubenswrapper[4707]: I1127 16:22:34.537808 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Nov 27 16:22:34 crc kubenswrapper[4707]: I1127 16:22:34.606499 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Nov 27 16:22:35 crc kubenswrapper[4707]: I1127 16:22:35.812619 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"5ee13539-219f-4772-9569-f8526eff8cda","Type":"ContainerStarted","Data":"b203f67cd1cbf3949f13ad1cbceb6aaa0cd33cca5ba912d48e685e6d831542d0"} Nov 27 16:22:35 crc kubenswrapper[4707]: I1127 16:22:35.812934 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"5ee13539-219f-4772-9569-f8526eff8cda","Type":"ContainerStarted","Data":"eeaf1d3be99ba07fc12759afdd6dc44013ba560cc093c6ca1634b8dd4174ed1e"} Nov 27 16:22:35 crc kubenswrapper[4707]: I1127 16:22:35.817594 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"c280ae8f-cb0b-4035-a7a2-fedfb044a6f0","Type":"ContainerStarted","Data":"2f3227f50dfdfefb9dcf820379e8861e5fce7b3c3edd07722b229270863d7c64"} Nov 27 16:22:35 crc kubenswrapper[4707]: I1127 16:22:35.824882 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" 
event={"ID":"eb68c751-abf5-49ac-985b-e0e9c3a7493f","Type":"ContainerStarted","Data":"39c7cd8971622424854d0f1fb1a723fac75d832275310708817d4bc2f00c3445"} Nov 27 16:22:35 crc kubenswrapper[4707]: I1127 16:22:35.825068 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="eb68c751-abf5-49ac-985b-e0e9c3a7493f" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://39c7cd8971622424854d0f1fb1a723fac75d832275310708817d4bc2f00c3445" gracePeriod=30 Nov 27 16:22:35 crc kubenswrapper[4707]: I1127 16:22:35.844665 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"eaf72ae8-b918-4663-9cab-dafb3949d87b","Type":"ContainerStarted","Data":"2ec4f4bc0c87ce5c6c2f7dece01a4d9168b853e6497955be3e635be4eb1b8aab"} Nov 27 16:22:35 crc kubenswrapper[4707]: I1127 16:22:35.844710 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"eaf72ae8-b918-4663-9cab-dafb3949d87b","Type":"ContainerStarted","Data":"e2f8efd1abba2339020b9df5e225f468fc8f7f0dfa87ba2c563852999c70865d"} Nov 27 16:22:35 crc kubenswrapper[4707]: I1127 16:22:35.844843 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="eaf72ae8-b918-4663-9cab-dafb3949d87b" containerName="nova-metadata-log" containerID="cri-o://e2f8efd1abba2339020b9df5e225f468fc8f7f0dfa87ba2c563852999c70865d" gracePeriod=30 Nov 27 16:22:35 crc kubenswrapper[4707]: I1127 16:22:35.844893 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="eaf72ae8-b918-4663-9cab-dafb3949d87b" containerName="nova-metadata-metadata" containerID="cri-o://2ec4f4bc0c87ce5c6c2f7dece01a4d9168b853e6497955be3e635be4eb1b8aab" gracePeriod=30 Nov 27 16:22:35 crc kubenswrapper[4707]: I1127 16:22:35.857418 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/nova-api-0" podStartSLOduration=2.802433067 podStartE2EDuration="5.857397664s" podCreationTimestamp="2025-11-27 16:22:30 +0000 UTC" firstStartedPulling="2025-11-27 16:22:31.759504612 +0000 UTC m=+1127.390953370" lastFinishedPulling="2025-11-27 16:22:34.814469199 +0000 UTC m=+1130.445917967" observedRunningTime="2025-11-27 16:22:35.835331125 +0000 UTC m=+1131.466779913" watchObservedRunningTime="2025-11-27 16:22:35.857397664 +0000 UTC m=+1131.488846472" Nov 27 16:22:35 crc kubenswrapper[4707]: I1127 16:22:35.859951 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.108880138 podStartE2EDuration="4.859939928s" podCreationTimestamp="2025-11-27 16:22:31 +0000 UTC" firstStartedPulling="2025-11-27 16:22:32.072848328 +0000 UTC m=+1127.704297096" lastFinishedPulling="2025-11-27 16:22:34.823908118 +0000 UTC m=+1130.455356886" observedRunningTime="2025-11-27 16:22:35.858478971 +0000 UTC m=+1131.489927759" watchObservedRunningTime="2025-11-27 16:22:35.859939928 +0000 UTC m=+1131.491388726" Nov 27 16:22:35 crc kubenswrapper[4707]: I1127 16:22:35.889885 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=3.045666917 podStartE2EDuration="5.889866446s" podCreationTimestamp="2025-11-27 16:22:30 +0000 UTC" firstStartedPulling="2025-11-27 16:22:31.970222309 +0000 UTC m=+1127.601671077" lastFinishedPulling="2025-11-27 16:22:34.814421848 +0000 UTC m=+1130.445870606" observedRunningTime="2025-11-27 16:22:35.875455811 +0000 UTC m=+1131.506904579" watchObservedRunningTime="2025-11-27 16:22:35.889866446 +0000 UTC m=+1131.521315214" Nov 27 16:22:35 crc kubenswrapper[4707]: I1127 16:22:35.904223 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=3.048064928 podStartE2EDuration="5.904205629s" podCreationTimestamp="2025-11-27 16:22:30 +0000 UTC" 
firstStartedPulling="2025-11-27 16:22:31.958285247 +0000 UTC m=+1127.589734015" lastFinishedPulling="2025-11-27 16:22:34.814425948 +0000 UTC m=+1130.445874716" observedRunningTime="2025-11-27 16:22:35.895124459 +0000 UTC m=+1131.526573227" watchObservedRunningTime="2025-11-27 16:22:35.904205629 +0000 UTC m=+1131.535654397" Nov 27 16:22:36 crc kubenswrapper[4707]: I1127 16:22:36.373489 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Nov 27 16:22:36 crc kubenswrapper[4707]: I1127 16:22:36.373529 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Nov 27 16:22:36 crc kubenswrapper[4707]: I1127 16:22:36.398797 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Nov 27 16:22:36 crc kubenswrapper[4707]: I1127 16:22:36.405593 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Nov 27 16:22:36 crc kubenswrapper[4707]: I1127 16:22:36.459079 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Nov 27 16:22:36 crc kubenswrapper[4707]: I1127 16:22:36.495576 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eaf72ae8-b918-4663-9cab-dafb3949d87b-config-data\") pod \"eaf72ae8-b918-4663-9cab-dafb3949d87b\" (UID: \"eaf72ae8-b918-4663-9cab-dafb3949d87b\") " Nov 27 16:22:36 crc kubenswrapper[4707]: I1127 16:22:36.496090 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eaf72ae8-b918-4663-9cab-dafb3949d87b-combined-ca-bundle\") pod \"eaf72ae8-b918-4663-9cab-dafb3949d87b\" (UID: \"eaf72ae8-b918-4663-9cab-dafb3949d87b\") " Nov 27 16:22:36 crc kubenswrapper[4707]: I1127 16:22:36.496587 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"kube-api-access-9nlnk\" (UniqueName: \"kubernetes.io/projected/eaf72ae8-b918-4663-9cab-dafb3949d87b-kube-api-access-9nlnk\") pod \"eaf72ae8-b918-4663-9cab-dafb3949d87b\" (UID: \"eaf72ae8-b918-4663-9cab-dafb3949d87b\") " Nov 27 16:22:36 crc kubenswrapper[4707]: I1127 16:22:36.498303 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/eaf72ae8-b918-4663-9cab-dafb3949d87b-logs\") pod \"eaf72ae8-b918-4663-9cab-dafb3949d87b\" (UID: \"eaf72ae8-b918-4663-9cab-dafb3949d87b\") " Nov 27 16:22:36 crc kubenswrapper[4707]: I1127 16:22:36.498622 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/eaf72ae8-b918-4663-9cab-dafb3949d87b-logs" (OuterVolumeSpecName: "logs") pod "eaf72ae8-b918-4663-9cab-dafb3949d87b" (UID: "eaf72ae8-b918-4663-9cab-dafb3949d87b"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 27 16:22:36 crc kubenswrapper[4707]: I1127 16:22:36.500132 4707 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/eaf72ae8-b918-4663-9cab-dafb3949d87b-logs\") on node \"crc\" DevicePath \"\"" Nov 27 16:22:36 crc kubenswrapper[4707]: I1127 16:22:36.501179 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eaf72ae8-b918-4663-9cab-dafb3949d87b-kube-api-access-9nlnk" (OuterVolumeSpecName: "kube-api-access-9nlnk") pod "eaf72ae8-b918-4663-9cab-dafb3949d87b" (UID: "eaf72ae8-b918-4663-9cab-dafb3949d87b"). InnerVolumeSpecName "kube-api-access-9nlnk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 16:22:36 crc kubenswrapper[4707]: I1127 16:22:36.525937 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eaf72ae8-b918-4663-9cab-dafb3949d87b-config-data" (OuterVolumeSpecName: "config-data") pod "eaf72ae8-b918-4663-9cab-dafb3949d87b" (UID: "eaf72ae8-b918-4663-9cab-dafb3949d87b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 16:22:36 crc kubenswrapper[4707]: I1127 16:22:36.533379 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eaf72ae8-b918-4663-9cab-dafb3949d87b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "eaf72ae8-b918-4663-9cab-dafb3949d87b" (UID: "eaf72ae8-b918-4663-9cab-dafb3949d87b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 16:22:36 crc kubenswrapper[4707]: I1127 16:22:36.602110 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eaf72ae8-b918-4663-9cab-dafb3949d87b-config-data\") on node \"crc\" DevicePath \"\"" Nov 27 16:22:36 crc kubenswrapper[4707]: I1127 16:22:36.602148 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eaf72ae8-b918-4663-9cab-dafb3949d87b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 27 16:22:36 crc kubenswrapper[4707]: I1127 16:22:36.602161 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9nlnk\" (UniqueName: \"kubernetes.io/projected/eaf72ae8-b918-4663-9cab-dafb3949d87b-kube-api-access-9nlnk\") on node \"crc\" DevicePath \"\"" Nov 27 16:22:36 crc kubenswrapper[4707]: I1127 16:22:36.859608 4707 generic.go:334] "Generic (PLEG): container finished" podID="eaf72ae8-b918-4663-9cab-dafb3949d87b" containerID="2ec4f4bc0c87ce5c6c2f7dece01a4d9168b853e6497955be3e635be4eb1b8aab" 
exitCode=0 Nov 27 16:22:36 crc kubenswrapper[4707]: I1127 16:22:36.862212 4707 generic.go:334] "Generic (PLEG): container finished" podID="eaf72ae8-b918-4663-9cab-dafb3949d87b" containerID="e2f8efd1abba2339020b9df5e225f468fc8f7f0dfa87ba2c563852999c70865d" exitCode=143 Nov 27 16:22:36 crc kubenswrapper[4707]: I1127 16:22:36.859719 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"eaf72ae8-b918-4663-9cab-dafb3949d87b","Type":"ContainerDied","Data":"2ec4f4bc0c87ce5c6c2f7dece01a4d9168b853e6497955be3e635be4eb1b8aab"} Nov 27 16:22:36 crc kubenswrapper[4707]: I1127 16:22:36.862501 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"eaf72ae8-b918-4663-9cab-dafb3949d87b","Type":"ContainerDied","Data":"e2f8efd1abba2339020b9df5e225f468fc8f7f0dfa87ba2c563852999c70865d"} Nov 27 16:22:36 crc kubenswrapper[4707]: I1127 16:22:36.862526 4707 scope.go:117] "RemoveContainer" containerID="2ec4f4bc0c87ce5c6c2f7dece01a4d9168b853e6497955be3e635be4eb1b8aab" Nov 27 16:22:36 crc kubenswrapper[4707]: I1127 16:22:36.859697 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Nov 27 16:22:36 crc kubenswrapper[4707]: I1127 16:22:36.863020 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"eaf72ae8-b918-4663-9cab-dafb3949d87b","Type":"ContainerDied","Data":"807e2f387b77d2114c48d83292a35e060c0b5df73050cbccad8fcfd93e9c13c1"} Nov 27 16:22:36 crc kubenswrapper[4707]: I1127 16:22:36.905333 4707 scope.go:117] "RemoveContainer" containerID="e2f8efd1abba2339020b9df5e225f468fc8f7f0dfa87ba2c563852999c70865d" Nov 27 16:22:36 crc kubenswrapper[4707]: I1127 16:22:36.911977 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Nov 27 16:22:36 crc kubenswrapper[4707]: I1127 16:22:36.941208 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Nov 27 16:22:36 crc kubenswrapper[4707]: I1127 16:22:36.947940 4707 scope.go:117] "RemoveContainer" containerID="2ec4f4bc0c87ce5c6c2f7dece01a4d9168b853e6497955be3e635be4eb1b8aab" Nov 27 16:22:36 crc kubenswrapper[4707]: E1127 16:22:36.954945 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2ec4f4bc0c87ce5c6c2f7dece01a4d9168b853e6497955be3e635be4eb1b8aab\": container with ID starting with 2ec4f4bc0c87ce5c6c2f7dece01a4d9168b853e6497955be3e635be4eb1b8aab not found: ID does not exist" containerID="2ec4f4bc0c87ce5c6c2f7dece01a4d9168b853e6497955be3e635be4eb1b8aab" Nov 27 16:22:36 crc kubenswrapper[4707]: I1127 16:22:36.955018 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2ec4f4bc0c87ce5c6c2f7dece01a4d9168b853e6497955be3e635be4eb1b8aab"} err="failed to get container status \"2ec4f4bc0c87ce5c6c2f7dece01a4d9168b853e6497955be3e635be4eb1b8aab\": rpc error: code = NotFound desc = could not find container \"2ec4f4bc0c87ce5c6c2f7dece01a4d9168b853e6497955be3e635be4eb1b8aab\": container with ID starting with 
2ec4f4bc0c87ce5c6c2f7dece01a4d9168b853e6497955be3e635be4eb1b8aab not found: ID does not exist" Nov 27 16:22:36 crc kubenswrapper[4707]: I1127 16:22:36.955091 4707 scope.go:117] "RemoveContainer" containerID="e2f8efd1abba2339020b9df5e225f468fc8f7f0dfa87ba2c563852999c70865d" Nov 27 16:22:36 crc kubenswrapper[4707]: E1127 16:22:36.955977 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e2f8efd1abba2339020b9df5e225f468fc8f7f0dfa87ba2c563852999c70865d\": container with ID starting with e2f8efd1abba2339020b9df5e225f468fc8f7f0dfa87ba2c563852999c70865d not found: ID does not exist" containerID="e2f8efd1abba2339020b9df5e225f468fc8f7f0dfa87ba2c563852999c70865d" Nov 27 16:22:36 crc kubenswrapper[4707]: I1127 16:22:36.956035 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e2f8efd1abba2339020b9df5e225f468fc8f7f0dfa87ba2c563852999c70865d"} err="failed to get container status \"e2f8efd1abba2339020b9df5e225f468fc8f7f0dfa87ba2c563852999c70865d\": rpc error: code = NotFound desc = could not find container \"e2f8efd1abba2339020b9df5e225f468fc8f7f0dfa87ba2c563852999c70865d\": container with ID starting with e2f8efd1abba2339020b9df5e225f468fc8f7f0dfa87ba2c563852999c70865d not found: ID does not exist" Nov 27 16:22:36 crc kubenswrapper[4707]: I1127 16:22:36.956110 4707 scope.go:117] "RemoveContainer" containerID="2ec4f4bc0c87ce5c6c2f7dece01a4d9168b853e6497955be3e635be4eb1b8aab" Nov 27 16:22:36 crc kubenswrapper[4707]: I1127 16:22:36.956716 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2ec4f4bc0c87ce5c6c2f7dece01a4d9168b853e6497955be3e635be4eb1b8aab"} err="failed to get container status \"2ec4f4bc0c87ce5c6c2f7dece01a4d9168b853e6497955be3e635be4eb1b8aab\": rpc error: code = NotFound desc = could not find container \"2ec4f4bc0c87ce5c6c2f7dece01a4d9168b853e6497955be3e635be4eb1b8aab\": container with ID 
starting with 2ec4f4bc0c87ce5c6c2f7dece01a4d9168b853e6497955be3e635be4eb1b8aab not found: ID does not exist" Nov 27 16:22:36 crc kubenswrapper[4707]: I1127 16:22:36.956758 4707 scope.go:117] "RemoveContainer" containerID="e2f8efd1abba2339020b9df5e225f468fc8f7f0dfa87ba2c563852999c70865d" Nov 27 16:22:36 crc kubenswrapper[4707]: I1127 16:22:36.957001 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e2f8efd1abba2339020b9df5e225f468fc8f7f0dfa87ba2c563852999c70865d"} err="failed to get container status \"e2f8efd1abba2339020b9df5e225f468fc8f7f0dfa87ba2c563852999c70865d\": rpc error: code = NotFound desc = could not find container \"e2f8efd1abba2339020b9df5e225f468fc8f7f0dfa87ba2c563852999c70865d\": container with ID starting with e2f8efd1abba2339020b9df5e225f468fc8f7f0dfa87ba2c563852999c70865d not found: ID does not exist" Nov 27 16:22:36 crc kubenswrapper[4707]: I1127 16:22:36.980030 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Nov 27 16:22:36 crc kubenswrapper[4707]: E1127 16:22:36.980671 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eaf72ae8-b918-4663-9cab-dafb3949d87b" containerName="nova-metadata-log" Nov 27 16:22:36 crc kubenswrapper[4707]: I1127 16:22:36.980706 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="eaf72ae8-b918-4663-9cab-dafb3949d87b" containerName="nova-metadata-log" Nov 27 16:22:36 crc kubenswrapper[4707]: E1127 16:22:36.980724 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eaf72ae8-b918-4663-9cab-dafb3949d87b" containerName="nova-metadata-metadata" Nov 27 16:22:36 crc kubenswrapper[4707]: I1127 16:22:36.980733 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="eaf72ae8-b918-4663-9cab-dafb3949d87b" containerName="nova-metadata-metadata" Nov 27 16:22:36 crc kubenswrapper[4707]: I1127 16:22:36.981095 4707 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="eaf72ae8-b918-4663-9cab-dafb3949d87b" containerName="nova-metadata-log" Nov 27 16:22:36 crc kubenswrapper[4707]: I1127 16:22:36.981118 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="eaf72ae8-b918-4663-9cab-dafb3949d87b" containerName="nova-metadata-metadata" Nov 27 16:22:36 crc kubenswrapper[4707]: I1127 16:22:36.982494 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Nov 27 16:22:36 crc kubenswrapper[4707]: I1127 16:22:36.984497 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Nov 27 16:22:36 crc kubenswrapper[4707]: I1127 16:22:36.984876 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Nov 27 16:22:36 crc kubenswrapper[4707]: I1127 16:22:36.991799 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Nov 27 16:22:37 crc kubenswrapper[4707]: I1127 16:22:37.009468 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f0d5989d-b5fa-4474-8ce6-494b1a08b2a9-logs\") pod \"nova-metadata-0\" (UID: \"f0d5989d-b5fa-4474-8ce6-494b1a08b2a9\") " pod="openstack/nova-metadata-0" Nov 27 16:22:37 crc kubenswrapper[4707]: I1127 16:22:37.009587 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/f0d5989d-b5fa-4474-8ce6-494b1a08b2a9-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"f0d5989d-b5fa-4474-8ce6-494b1a08b2a9\") " pod="openstack/nova-metadata-0" Nov 27 16:22:37 crc kubenswrapper[4707]: I1127 16:22:37.009683 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m6v7h\" (UniqueName: 
\"kubernetes.io/projected/f0d5989d-b5fa-4474-8ce6-494b1a08b2a9-kube-api-access-m6v7h\") pod \"nova-metadata-0\" (UID: \"f0d5989d-b5fa-4474-8ce6-494b1a08b2a9\") " pod="openstack/nova-metadata-0" Nov 27 16:22:37 crc kubenswrapper[4707]: I1127 16:22:37.009720 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f0d5989d-b5fa-4474-8ce6-494b1a08b2a9-config-data\") pod \"nova-metadata-0\" (UID: \"f0d5989d-b5fa-4474-8ce6-494b1a08b2a9\") " pod="openstack/nova-metadata-0" Nov 27 16:22:37 crc kubenswrapper[4707]: I1127 16:22:37.009836 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f0d5989d-b5fa-4474-8ce6-494b1a08b2a9-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"f0d5989d-b5fa-4474-8ce6-494b1a08b2a9\") " pod="openstack/nova-metadata-0" Nov 27 16:22:37 crc kubenswrapper[4707]: I1127 16:22:37.110867 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m6v7h\" (UniqueName: \"kubernetes.io/projected/f0d5989d-b5fa-4474-8ce6-494b1a08b2a9-kube-api-access-m6v7h\") pod \"nova-metadata-0\" (UID: \"f0d5989d-b5fa-4474-8ce6-494b1a08b2a9\") " pod="openstack/nova-metadata-0" Nov 27 16:22:37 crc kubenswrapper[4707]: I1127 16:22:37.110907 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f0d5989d-b5fa-4474-8ce6-494b1a08b2a9-config-data\") pod \"nova-metadata-0\" (UID: \"f0d5989d-b5fa-4474-8ce6-494b1a08b2a9\") " pod="openstack/nova-metadata-0" Nov 27 16:22:37 crc kubenswrapper[4707]: I1127 16:22:37.110958 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f0d5989d-b5fa-4474-8ce6-494b1a08b2a9-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: 
\"f0d5989d-b5fa-4474-8ce6-494b1a08b2a9\") " pod="openstack/nova-metadata-0" Nov 27 16:22:37 crc kubenswrapper[4707]: I1127 16:22:37.110996 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f0d5989d-b5fa-4474-8ce6-494b1a08b2a9-logs\") pod \"nova-metadata-0\" (UID: \"f0d5989d-b5fa-4474-8ce6-494b1a08b2a9\") " pod="openstack/nova-metadata-0" Nov 27 16:22:37 crc kubenswrapper[4707]: I1127 16:22:37.111069 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/f0d5989d-b5fa-4474-8ce6-494b1a08b2a9-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"f0d5989d-b5fa-4474-8ce6-494b1a08b2a9\") " pod="openstack/nova-metadata-0" Nov 27 16:22:37 crc kubenswrapper[4707]: I1127 16:22:37.111596 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f0d5989d-b5fa-4474-8ce6-494b1a08b2a9-logs\") pod \"nova-metadata-0\" (UID: \"f0d5989d-b5fa-4474-8ce6-494b1a08b2a9\") " pod="openstack/nova-metadata-0" Nov 27 16:22:37 crc kubenswrapper[4707]: I1127 16:22:37.115511 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/f0d5989d-b5fa-4474-8ce6-494b1a08b2a9-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"f0d5989d-b5fa-4474-8ce6-494b1a08b2a9\") " pod="openstack/nova-metadata-0" Nov 27 16:22:37 crc kubenswrapper[4707]: I1127 16:22:37.118481 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f0d5989d-b5fa-4474-8ce6-494b1a08b2a9-config-data\") pod \"nova-metadata-0\" (UID: \"f0d5989d-b5fa-4474-8ce6-494b1a08b2a9\") " pod="openstack/nova-metadata-0" Nov 27 16:22:37 crc kubenswrapper[4707]: I1127 16:22:37.124766 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f0d5989d-b5fa-4474-8ce6-494b1a08b2a9-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"f0d5989d-b5fa-4474-8ce6-494b1a08b2a9\") " pod="openstack/nova-metadata-0" Nov 27 16:22:37 crc kubenswrapper[4707]: I1127 16:22:37.127336 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m6v7h\" (UniqueName: \"kubernetes.io/projected/f0d5989d-b5fa-4474-8ce6-494b1a08b2a9-kube-api-access-m6v7h\") pod \"nova-metadata-0\" (UID: \"f0d5989d-b5fa-4474-8ce6-494b1a08b2a9\") " pod="openstack/nova-metadata-0" Nov 27 16:22:37 crc kubenswrapper[4707]: I1127 16:22:37.216272 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eaf72ae8-b918-4663-9cab-dafb3949d87b" path="/var/lib/kubelet/pods/eaf72ae8-b918-4663-9cab-dafb3949d87b/volumes" Nov 27 16:22:37 crc kubenswrapper[4707]: I1127 16:22:37.297358 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Nov 27 16:22:38 crc kubenswrapper[4707]: W1127 16:22:38.153598 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf0d5989d_b5fa_4474_8ce6_494b1a08b2a9.slice/crio-ac3f39e2bbfde1c263c5d95e7404f03081808cb56a73708b8ce99e07a9f08c7a WatchSource:0}: Error finding container ac3f39e2bbfde1c263c5d95e7404f03081808cb56a73708b8ce99e07a9f08c7a: Status 404 returned error can't find the container with id ac3f39e2bbfde1c263c5d95e7404f03081808cb56a73708b8ce99e07a9f08c7a Nov 27 16:22:38 crc kubenswrapper[4707]: I1127 16:22:38.161188 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Nov 27 16:22:38 crc kubenswrapper[4707]: I1127 16:22:38.892365 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" 
event={"ID":"f0d5989d-b5fa-4474-8ce6-494b1a08b2a9","Type":"ContainerStarted","Data":"a2c64e2ddbe701eb1e3ec24bd94ac702aa5f78212d01f159f23d581117b3dffd"} Nov 27 16:22:38 crc kubenswrapper[4707]: I1127 16:22:38.892680 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"f0d5989d-b5fa-4474-8ce6-494b1a08b2a9","Type":"ContainerStarted","Data":"db4d02790d1d4c1e6132f4b03631a5f2be144addbe9656e3c1865cab2c4f2c26"} Nov 27 16:22:38 crc kubenswrapper[4707]: I1127 16:22:38.893092 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"f0d5989d-b5fa-4474-8ce6-494b1a08b2a9","Type":"ContainerStarted","Data":"ac3f39e2bbfde1c263c5d95e7404f03081808cb56a73708b8ce99e07a9f08c7a"} Nov 27 16:22:38 crc kubenswrapper[4707]: I1127 16:22:38.938166 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.938136845 podStartE2EDuration="2.938136845s" podCreationTimestamp="2025-11-27 16:22:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-27 16:22:38.920010095 +0000 UTC m=+1134.551458903" watchObservedRunningTime="2025-11-27 16:22:38.938136845 +0000 UTC m=+1134.569585653" Nov 27 16:22:39 crc kubenswrapper[4707]: I1127 16:22:39.906639 4707 generic.go:334] "Generic (PLEG): container finished" podID="831d6788-637b-4a22-8ed9-1f39e8e277a0" containerID="ae09cce4caea83674a979e764de79df6336a8740b86cf3c0607b4da1bd0031ff" exitCode=0 Nov 27 16:22:39 crc kubenswrapper[4707]: I1127 16:22:39.906773 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-krw8h" event={"ID":"831d6788-637b-4a22-8ed9-1f39e8e277a0","Type":"ContainerDied","Data":"ae09cce4caea83674a979e764de79df6336a8740b86cf3c0607b4da1bd0031ff"} Nov 27 16:22:40 crc kubenswrapper[4707]: I1127 16:22:40.922948 4707 generic.go:334] "Generic (PLEG): container 
finished" podID="e14300b4-67fd-4da5-8741-879007e38268" containerID="25ab784e1dfdc9d441fbd5bd3464d8122fb0d1c63df76173637c82adef4308e9" exitCode=0 Nov 27 16:22:40 crc kubenswrapper[4707]: I1127 16:22:40.923052 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-xb5wv" event={"ID":"e14300b4-67fd-4da5-8741-879007e38268","Type":"ContainerDied","Data":"25ab784e1dfdc9d441fbd5bd3464d8122fb0d1c63df76173637c82adef4308e9"} Nov 27 16:22:41 crc kubenswrapper[4707]: I1127 16:22:41.022996 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Nov 27 16:22:41 crc kubenswrapper[4707]: I1127 16:22:41.023037 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Nov 27 16:22:41 crc kubenswrapper[4707]: I1127 16:22:41.339307 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-krw8h" Nov 27 16:22:41 crc kubenswrapper[4707]: I1127 16:22:41.398908 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Nov 27 16:22:41 crc kubenswrapper[4707]: I1127 16:22:41.399681 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-582z9\" (UniqueName: \"kubernetes.io/projected/831d6788-637b-4a22-8ed9-1f39e8e277a0-kube-api-access-582z9\") pod \"831d6788-637b-4a22-8ed9-1f39e8e277a0\" (UID: \"831d6788-637b-4a22-8ed9-1f39e8e277a0\") " Nov 27 16:22:41 crc kubenswrapper[4707]: I1127 16:22:41.399735 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/831d6788-637b-4a22-8ed9-1f39e8e277a0-scripts\") pod \"831d6788-637b-4a22-8ed9-1f39e8e277a0\" (UID: \"831d6788-637b-4a22-8ed9-1f39e8e277a0\") " Nov 27 16:22:41 crc kubenswrapper[4707]: I1127 16:22:41.399949 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started 
for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/831d6788-637b-4a22-8ed9-1f39e8e277a0-combined-ca-bundle\") pod \"831d6788-637b-4a22-8ed9-1f39e8e277a0\" (UID: \"831d6788-637b-4a22-8ed9-1f39e8e277a0\") " Nov 27 16:22:41 crc kubenswrapper[4707]: I1127 16:22:41.399993 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/831d6788-637b-4a22-8ed9-1f39e8e277a0-config-data\") pod \"831d6788-637b-4a22-8ed9-1f39e8e277a0\" (UID: \"831d6788-637b-4a22-8ed9-1f39e8e277a0\") " Nov 27 16:22:41 crc kubenswrapper[4707]: I1127 16:22:41.407439 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/831d6788-637b-4a22-8ed9-1f39e8e277a0-kube-api-access-582z9" (OuterVolumeSpecName: "kube-api-access-582z9") pod "831d6788-637b-4a22-8ed9-1f39e8e277a0" (UID: "831d6788-637b-4a22-8ed9-1f39e8e277a0"). InnerVolumeSpecName "kube-api-access-582z9". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 16:22:41 crc kubenswrapper[4707]: I1127 16:22:41.422793 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/831d6788-637b-4a22-8ed9-1f39e8e277a0-scripts" (OuterVolumeSpecName: "scripts") pod "831d6788-637b-4a22-8ed9-1f39e8e277a0" (UID: "831d6788-637b-4a22-8ed9-1f39e8e277a0"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 16:22:41 crc kubenswrapper[4707]: I1127 16:22:41.438090 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Nov 27 16:22:41 crc kubenswrapper[4707]: I1127 16:22:41.440816 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/831d6788-637b-4a22-8ed9-1f39e8e277a0-config-data" (OuterVolumeSpecName: "config-data") pod "831d6788-637b-4a22-8ed9-1f39e8e277a0" (UID: "831d6788-637b-4a22-8ed9-1f39e8e277a0"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 16:22:41 crc kubenswrapper[4707]: I1127 16:22:41.470092 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/831d6788-637b-4a22-8ed9-1f39e8e277a0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "831d6788-637b-4a22-8ed9-1f39e8e277a0" (UID: "831d6788-637b-4a22-8ed9-1f39e8e277a0"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 16:22:41 crc kubenswrapper[4707]: I1127 16:22:41.490522 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-9b86998b5-rs6pv" Nov 27 16:22:41 crc kubenswrapper[4707]: I1127 16:22:41.502484 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/831d6788-637b-4a22-8ed9-1f39e8e277a0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 27 16:22:41 crc kubenswrapper[4707]: I1127 16:22:41.502523 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/831d6788-637b-4a22-8ed9-1f39e8e277a0-config-data\") on node \"crc\" DevicePath \"\"" Nov 27 16:22:41 crc kubenswrapper[4707]: I1127 16:22:41.502537 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-582z9\" (UniqueName: \"kubernetes.io/projected/831d6788-637b-4a22-8ed9-1f39e8e277a0-kube-api-access-582z9\") on node \"crc\" DevicePath \"\"" Nov 27 16:22:41 crc kubenswrapper[4707]: I1127 16:22:41.502551 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/831d6788-637b-4a22-8ed9-1f39e8e277a0-scripts\") on node \"crc\" DevicePath \"\"" Nov 27 16:22:41 crc kubenswrapper[4707]: I1127 16:22:41.556343 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7756b9d78c-qkfjh"] Nov 27 16:22:41 crc kubenswrapper[4707]: I1127 16:22:41.556572 4707 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7756b9d78c-qkfjh" podUID="254d0a1b-ea0e-4d0c-8a20-fb85542900fb" containerName="dnsmasq-dns" containerID="cri-o://eb9134ff6ad850e3518772ea6559c227cd25c608be080fb41935b943c1cd446f" gracePeriod=10 Nov 27 16:22:41 crc kubenswrapper[4707]: I1127 16:22:41.932924 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-krw8h" event={"ID":"831d6788-637b-4a22-8ed9-1f39e8e277a0","Type":"ContainerDied","Data":"7fc7ac91f8d2acb8b2b6a1b58dd7c154bfc33805a5b1ff4e900d3c24b84fa7b3"} Nov 27 16:22:41 crc kubenswrapper[4707]: I1127 16:22:41.932995 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7fc7ac91f8d2acb8b2b6a1b58dd7c154bfc33805a5b1ff4e900d3c24b84fa7b3" Nov 27 16:22:41 crc kubenswrapper[4707]: I1127 16:22:41.933047 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-krw8h" Nov 27 16:22:41 crc kubenswrapper[4707]: I1127 16:22:41.935612 4707 generic.go:334] "Generic (PLEG): container finished" podID="254d0a1b-ea0e-4d0c-8a20-fb85542900fb" containerID="eb9134ff6ad850e3518772ea6559c227cd25c608be080fb41935b943c1cd446f" exitCode=0 Nov 27 16:22:41 crc kubenswrapper[4707]: I1127 16:22:41.935677 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7756b9d78c-qkfjh" event={"ID":"254d0a1b-ea0e-4d0c-8a20-fb85542900fb","Type":"ContainerDied","Data":"eb9134ff6ad850e3518772ea6559c227cd25c608be080fb41935b943c1cd446f"} Nov 27 16:22:41 crc kubenswrapper[4707]: I1127 16:22:41.971570 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Nov 27 16:22:41 crc kubenswrapper[4707]: I1127 16:22:41.990077 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7756b9d78c-qkfjh" Nov 27 16:22:42 crc kubenswrapper[4707]: I1127 16:22:42.116955 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/254d0a1b-ea0e-4d0c-8a20-fb85542900fb-ovsdbserver-nb\") pod \"254d0a1b-ea0e-4d0c-8a20-fb85542900fb\" (UID: \"254d0a1b-ea0e-4d0c-8a20-fb85542900fb\") " Nov 27 16:22:42 crc kubenswrapper[4707]: I1127 16:22:42.117108 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/254d0a1b-ea0e-4d0c-8a20-fb85542900fb-ovsdbserver-sb\") pod \"254d0a1b-ea0e-4d0c-8a20-fb85542900fb\" (UID: \"254d0a1b-ea0e-4d0c-8a20-fb85542900fb\") " Nov 27 16:22:42 crc kubenswrapper[4707]: I1127 16:22:42.117152 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/254d0a1b-ea0e-4d0c-8a20-fb85542900fb-dns-swift-storage-0\") pod \"254d0a1b-ea0e-4d0c-8a20-fb85542900fb\" (UID: \"254d0a1b-ea0e-4d0c-8a20-fb85542900fb\") " Nov 27 16:22:42 crc kubenswrapper[4707]: I1127 16:22:42.117209 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xm2kn\" (UniqueName: \"kubernetes.io/projected/254d0a1b-ea0e-4d0c-8a20-fb85542900fb-kube-api-access-xm2kn\") pod \"254d0a1b-ea0e-4d0c-8a20-fb85542900fb\" (UID: \"254d0a1b-ea0e-4d0c-8a20-fb85542900fb\") " Nov 27 16:22:42 crc kubenswrapper[4707]: I1127 16:22:42.117243 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/254d0a1b-ea0e-4d0c-8a20-fb85542900fb-config\") pod \"254d0a1b-ea0e-4d0c-8a20-fb85542900fb\" (UID: \"254d0a1b-ea0e-4d0c-8a20-fb85542900fb\") " Nov 27 16:22:42 crc kubenswrapper[4707]: I1127 16:22:42.117306 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"dns-svc\" (UniqueName: \"kubernetes.io/configmap/254d0a1b-ea0e-4d0c-8a20-fb85542900fb-dns-svc\") pod \"254d0a1b-ea0e-4d0c-8a20-fb85542900fb\" (UID: \"254d0a1b-ea0e-4d0c-8a20-fb85542900fb\") " Nov 27 16:22:42 crc kubenswrapper[4707]: I1127 16:22:42.117949 4707 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="5ee13539-219f-4772-9569-f8526eff8cda" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.190:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Nov 27 16:22:42 crc kubenswrapper[4707]: I1127 16:22:42.118282 4707 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="5ee13539-219f-4772-9569-f8526eff8cda" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.190:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Nov 27 16:22:42 crc kubenswrapper[4707]: I1127 16:22:42.131076 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/254d0a1b-ea0e-4d0c-8a20-fb85542900fb-kube-api-access-xm2kn" (OuterVolumeSpecName: "kube-api-access-xm2kn") pod "254d0a1b-ea0e-4d0c-8a20-fb85542900fb" (UID: "254d0a1b-ea0e-4d0c-8a20-fb85542900fb"). InnerVolumeSpecName "kube-api-access-xm2kn". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 16:22:42 crc kubenswrapper[4707]: I1127 16:22:42.213925 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/254d0a1b-ea0e-4d0c-8a20-fb85542900fb-config" (OuterVolumeSpecName: "config") pod "254d0a1b-ea0e-4d0c-8a20-fb85542900fb" (UID: "254d0a1b-ea0e-4d0c-8a20-fb85542900fb"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 16:22:42 crc kubenswrapper[4707]: I1127 16:22:42.217830 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/254d0a1b-ea0e-4d0c-8a20-fb85542900fb-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "254d0a1b-ea0e-4d0c-8a20-fb85542900fb" (UID: "254d0a1b-ea0e-4d0c-8a20-fb85542900fb"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 16:22:42 crc kubenswrapper[4707]: I1127 16:22:42.222435 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xm2kn\" (UniqueName: \"kubernetes.io/projected/254d0a1b-ea0e-4d0c-8a20-fb85542900fb-kube-api-access-xm2kn\") on node \"crc\" DevicePath \"\"" Nov 27 16:22:42 crc kubenswrapper[4707]: I1127 16:22:42.223067 4707 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/254d0a1b-ea0e-4d0c-8a20-fb85542900fb-config\") on node \"crc\" DevicePath \"\"" Nov 27 16:22:42 crc kubenswrapper[4707]: I1127 16:22:42.223094 4707 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/254d0a1b-ea0e-4d0c-8a20-fb85542900fb-dns-svc\") on node \"crc\" DevicePath \"\"" Nov 27 16:22:42 crc kubenswrapper[4707]: I1127 16:22:42.226739 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/254d0a1b-ea0e-4d0c-8a20-fb85542900fb-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "254d0a1b-ea0e-4d0c-8a20-fb85542900fb" (UID: "254d0a1b-ea0e-4d0c-8a20-fb85542900fb"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 16:22:42 crc kubenswrapper[4707]: I1127 16:22:42.240000 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Nov 27 16:22:42 crc kubenswrapper[4707]: I1127 16:22:42.240244 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="5ee13539-219f-4772-9569-f8526eff8cda" containerName="nova-api-log" containerID="cri-o://eeaf1d3be99ba07fc12759afdd6dc44013ba560cc093c6ca1634b8dd4174ed1e" gracePeriod=30 Nov 27 16:22:42 crc kubenswrapper[4707]: I1127 16:22:42.242031 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="5ee13539-219f-4772-9569-f8526eff8cda" containerName="nova-api-api" containerID="cri-o://b203f67cd1cbf3949f13ad1cbceb6aaa0cd33cca5ba912d48e685e6d831542d0" gracePeriod=30 Nov 27 16:22:42 crc kubenswrapper[4707]: I1127 16:22:42.243763 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/254d0a1b-ea0e-4d0c-8a20-fb85542900fb-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "254d0a1b-ea0e-4d0c-8a20-fb85542900fb" (UID: "254d0a1b-ea0e-4d0c-8a20-fb85542900fb"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 16:22:42 crc kubenswrapper[4707]: I1127 16:22:42.252909 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Nov 27 16:22:42 crc kubenswrapper[4707]: I1127 16:22:42.253109 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="f0d5989d-b5fa-4474-8ce6-494b1a08b2a9" containerName="nova-metadata-log" containerID="cri-o://db4d02790d1d4c1e6132f4b03631a5f2be144addbe9656e3c1865cab2c4f2c26" gracePeriod=30 Nov 27 16:22:42 crc kubenswrapper[4707]: I1127 16:22:42.253486 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="f0d5989d-b5fa-4474-8ce6-494b1a08b2a9" containerName="nova-metadata-metadata" containerID="cri-o://a2c64e2ddbe701eb1e3ec24bd94ac702aa5f78212d01f159f23d581117b3dffd" gracePeriod=30 Nov 27 16:22:42 crc kubenswrapper[4707]: I1127 16:22:42.256957 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/254d0a1b-ea0e-4d0c-8a20-fb85542900fb-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "254d0a1b-ea0e-4d0c-8a20-fb85542900fb" (UID: "254d0a1b-ea0e-4d0c-8a20-fb85542900fb"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 16:22:42 crc kubenswrapper[4707]: I1127 16:22:42.298467 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Nov 27 16:22:42 crc kubenswrapper[4707]: I1127 16:22:42.298694 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Nov 27 16:22:42 crc kubenswrapper[4707]: I1127 16:22:42.325075 4707 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/254d0a1b-ea0e-4d0c-8a20-fb85542900fb-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Nov 27 16:22:42 crc kubenswrapper[4707]: I1127 16:22:42.325104 4707 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/254d0a1b-ea0e-4d0c-8a20-fb85542900fb-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Nov 27 16:22:42 crc kubenswrapper[4707]: I1127 16:22:42.325114 4707 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/254d0a1b-ea0e-4d0c-8a20-fb85542900fb-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Nov 27 16:22:42 crc kubenswrapper[4707]: I1127 16:22:42.333680 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-xb5wv" Nov 27 16:22:42 crc kubenswrapper[4707]: I1127 16:22:42.530057 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e14300b4-67fd-4da5-8741-879007e38268-combined-ca-bundle\") pod \"e14300b4-67fd-4da5-8741-879007e38268\" (UID: \"e14300b4-67fd-4da5-8741-879007e38268\") " Nov 27 16:22:42 crc kubenswrapper[4707]: I1127 16:22:42.530147 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j672n\" (UniqueName: \"kubernetes.io/projected/e14300b4-67fd-4da5-8741-879007e38268-kube-api-access-j672n\") pod \"e14300b4-67fd-4da5-8741-879007e38268\" (UID: \"e14300b4-67fd-4da5-8741-879007e38268\") " Nov 27 16:22:42 crc kubenswrapper[4707]: I1127 16:22:42.530214 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e14300b4-67fd-4da5-8741-879007e38268-scripts\") pod \"e14300b4-67fd-4da5-8741-879007e38268\" (UID: \"e14300b4-67fd-4da5-8741-879007e38268\") " Nov 27 16:22:42 crc kubenswrapper[4707]: I1127 16:22:42.530236 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e14300b4-67fd-4da5-8741-879007e38268-config-data\") pod \"e14300b4-67fd-4da5-8741-879007e38268\" (UID: \"e14300b4-67fd-4da5-8741-879007e38268\") " Nov 27 16:22:42 crc kubenswrapper[4707]: I1127 16:22:42.535726 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e14300b4-67fd-4da5-8741-879007e38268-kube-api-access-j672n" (OuterVolumeSpecName: "kube-api-access-j672n") pod "e14300b4-67fd-4da5-8741-879007e38268" (UID: "e14300b4-67fd-4da5-8741-879007e38268"). InnerVolumeSpecName "kube-api-access-j672n". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 16:22:42 crc kubenswrapper[4707]: I1127 16:22:42.537545 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e14300b4-67fd-4da5-8741-879007e38268-scripts" (OuterVolumeSpecName: "scripts") pod "e14300b4-67fd-4da5-8741-879007e38268" (UID: "e14300b4-67fd-4da5-8741-879007e38268"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 16:22:42 crc kubenswrapper[4707]: I1127 16:22:42.562924 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e14300b4-67fd-4da5-8741-879007e38268-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e14300b4-67fd-4da5-8741-879007e38268" (UID: "e14300b4-67fd-4da5-8741-879007e38268"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 16:22:42 crc kubenswrapper[4707]: I1127 16:22:42.567345 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e14300b4-67fd-4da5-8741-879007e38268-config-data" (OuterVolumeSpecName: "config-data") pod "e14300b4-67fd-4da5-8741-879007e38268" (UID: "e14300b4-67fd-4da5-8741-879007e38268"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 16:22:42 crc kubenswrapper[4707]: I1127 16:22:42.576653 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Nov 27 16:22:42 crc kubenswrapper[4707]: I1127 16:22:42.632451 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e14300b4-67fd-4da5-8741-879007e38268-scripts\") on node \"crc\" DevicePath \"\"" Nov 27 16:22:42 crc kubenswrapper[4707]: I1127 16:22:42.632480 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e14300b4-67fd-4da5-8741-879007e38268-config-data\") on node \"crc\" DevicePath \"\"" Nov 27 16:22:42 crc kubenswrapper[4707]: I1127 16:22:42.632492 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e14300b4-67fd-4da5-8741-879007e38268-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 27 16:22:42 crc kubenswrapper[4707]: I1127 16:22:42.632502 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j672n\" (UniqueName: \"kubernetes.io/projected/e14300b4-67fd-4da5-8741-879007e38268-kube-api-access-j672n\") on node \"crc\" DevicePath \"\"" Nov 27 16:22:42 crc kubenswrapper[4707]: I1127 16:22:42.759448 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Nov 27 16:22:42 crc kubenswrapper[4707]: I1127 16:22:42.835598 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f0d5989d-b5fa-4474-8ce6-494b1a08b2a9-config-data\") pod \"f0d5989d-b5fa-4474-8ce6-494b1a08b2a9\" (UID: \"f0d5989d-b5fa-4474-8ce6-494b1a08b2a9\") " Nov 27 16:22:42 crc kubenswrapper[4707]: I1127 16:22:42.835919 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/f0d5989d-b5fa-4474-8ce6-494b1a08b2a9-nova-metadata-tls-certs\") pod \"f0d5989d-b5fa-4474-8ce6-494b1a08b2a9\" (UID: \"f0d5989d-b5fa-4474-8ce6-494b1a08b2a9\") " Nov 27 16:22:42 crc kubenswrapper[4707]: I1127 16:22:42.835953 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m6v7h\" (UniqueName: \"kubernetes.io/projected/f0d5989d-b5fa-4474-8ce6-494b1a08b2a9-kube-api-access-m6v7h\") pod \"f0d5989d-b5fa-4474-8ce6-494b1a08b2a9\" (UID: \"f0d5989d-b5fa-4474-8ce6-494b1a08b2a9\") " Nov 27 16:22:42 crc kubenswrapper[4707]: I1127 16:22:42.841015 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f0d5989d-b5fa-4474-8ce6-494b1a08b2a9-kube-api-access-m6v7h" (OuterVolumeSpecName: "kube-api-access-m6v7h") pod "f0d5989d-b5fa-4474-8ce6-494b1a08b2a9" (UID: "f0d5989d-b5fa-4474-8ce6-494b1a08b2a9"). InnerVolumeSpecName "kube-api-access-m6v7h". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 16:22:42 crc kubenswrapper[4707]: I1127 16:22:42.870682 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f0d5989d-b5fa-4474-8ce6-494b1a08b2a9-config-data" (OuterVolumeSpecName: "config-data") pod "f0d5989d-b5fa-4474-8ce6-494b1a08b2a9" (UID: "f0d5989d-b5fa-4474-8ce6-494b1a08b2a9"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 16:22:42 crc kubenswrapper[4707]: I1127 16:22:42.890182 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f0d5989d-b5fa-4474-8ce6-494b1a08b2a9-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "f0d5989d-b5fa-4474-8ce6-494b1a08b2a9" (UID: "f0d5989d-b5fa-4474-8ce6-494b1a08b2a9"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 16:22:42 crc kubenswrapper[4707]: I1127 16:22:42.937815 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f0d5989d-b5fa-4474-8ce6-494b1a08b2a9-combined-ca-bundle\") pod \"f0d5989d-b5fa-4474-8ce6-494b1a08b2a9\" (UID: \"f0d5989d-b5fa-4474-8ce6-494b1a08b2a9\") " Nov 27 16:22:42 crc kubenswrapper[4707]: I1127 16:22:42.937987 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f0d5989d-b5fa-4474-8ce6-494b1a08b2a9-logs\") pod \"f0d5989d-b5fa-4474-8ce6-494b1a08b2a9\" (UID: \"f0d5989d-b5fa-4474-8ce6-494b1a08b2a9\") " Nov 27 16:22:42 crc kubenswrapper[4707]: I1127 16:22:42.938312 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f0d5989d-b5fa-4474-8ce6-494b1a08b2a9-logs" (OuterVolumeSpecName: "logs") pod "f0d5989d-b5fa-4474-8ce6-494b1a08b2a9" (UID: "f0d5989d-b5fa-4474-8ce6-494b1a08b2a9"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 27 16:22:42 crc kubenswrapper[4707]: I1127 16:22:42.938885 4707 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f0d5989d-b5fa-4474-8ce6-494b1a08b2a9-logs\") on node \"crc\" DevicePath \"\"" Nov 27 16:22:42 crc kubenswrapper[4707]: I1127 16:22:42.938915 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f0d5989d-b5fa-4474-8ce6-494b1a08b2a9-config-data\") on node \"crc\" DevicePath \"\"" Nov 27 16:22:42 crc kubenswrapper[4707]: I1127 16:22:42.938938 4707 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/f0d5989d-b5fa-4474-8ce6-494b1a08b2a9-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Nov 27 16:22:42 crc kubenswrapper[4707]: I1127 16:22:42.938957 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m6v7h\" (UniqueName: \"kubernetes.io/projected/f0d5989d-b5fa-4474-8ce6-494b1a08b2a9-kube-api-access-m6v7h\") on node \"crc\" DevicePath \"\"" Nov 27 16:22:42 crc kubenswrapper[4707]: I1127 16:22:42.948215 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7756b9d78c-qkfjh" event={"ID":"254d0a1b-ea0e-4d0c-8a20-fb85542900fb","Type":"ContainerDied","Data":"8af14a426f70539b89c9d9e5f6084516ee18104c0c503d7fd938607fe613ac3f"} Nov 27 16:22:42 crc kubenswrapper[4707]: I1127 16:22:42.948273 4707 scope.go:117] "RemoveContainer" containerID="eb9134ff6ad850e3518772ea6559c227cd25c608be080fb41935b943c1cd446f" Nov 27 16:22:42 crc kubenswrapper[4707]: I1127 16:22:42.948391 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7756b9d78c-qkfjh" Nov 27 16:22:42 crc kubenswrapper[4707]: I1127 16:22:42.951337 4707 generic.go:334] "Generic (PLEG): container finished" podID="f0d5989d-b5fa-4474-8ce6-494b1a08b2a9" containerID="a2c64e2ddbe701eb1e3ec24bd94ac702aa5f78212d01f159f23d581117b3dffd" exitCode=0 Nov 27 16:22:42 crc kubenswrapper[4707]: I1127 16:22:42.951544 4707 generic.go:334] "Generic (PLEG): container finished" podID="f0d5989d-b5fa-4474-8ce6-494b1a08b2a9" containerID="db4d02790d1d4c1e6132f4b03631a5f2be144addbe9656e3c1865cab2c4f2c26" exitCode=143 Nov 27 16:22:42 crc kubenswrapper[4707]: I1127 16:22:42.951622 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"f0d5989d-b5fa-4474-8ce6-494b1a08b2a9","Type":"ContainerDied","Data":"a2c64e2ddbe701eb1e3ec24bd94ac702aa5f78212d01f159f23d581117b3dffd"} Nov 27 16:22:42 crc kubenswrapper[4707]: I1127 16:22:42.951670 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"f0d5989d-b5fa-4474-8ce6-494b1a08b2a9","Type":"ContainerDied","Data":"db4d02790d1d4c1e6132f4b03631a5f2be144addbe9656e3c1865cab2c4f2c26"} Nov 27 16:22:42 crc kubenswrapper[4707]: I1127 16:22:42.951691 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"f0d5989d-b5fa-4474-8ce6-494b1a08b2a9","Type":"ContainerDied","Data":"ac3f39e2bbfde1c263c5d95e7404f03081808cb56a73708b8ce99e07a9f08c7a"} Nov 27 16:22:42 crc kubenswrapper[4707]: I1127 16:22:42.951772 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Nov 27 16:22:42 crc kubenswrapper[4707]: I1127 16:22:42.954792 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-xb5wv" Nov 27 16:22:42 crc kubenswrapper[4707]: I1127 16:22:42.954993 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-xb5wv" event={"ID":"e14300b4-67fd-4da5-8741-879007e38268","Type":"ContainerDied","Data":"3cdb551e879a36942e67bf05577b637f8a1218e5bfde5fec58a1e94e40765f99"} Nov 27 16:22:42 crc kubenswrapper[4707]: I1127 16:22:42.955804 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3cdb551e879a36942e67bf05577b637f8a1218e5bfde5fec58a1e94e40765f99" Nov 27 16:22:42 crc kubenswrapper[4707]: I1127 16:22:42.960366 4707 generic.go:334] "Generic (PLEG): container finished" podID="5ee13539-219f-4772-9569-f8526eff8cda" containerID="eeaf1d3be99ba07fc12759afdd6dc44013ba560cc093c6ca1634b8dd4174ed1e" exitCode=143 Nov 27 16:22:42 crc kubenswrapper[4707]: I1127 16:22:42.960478 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"5ee13539-219f-4772-9569-f8526eff8cda","Type":"ContainerDied","Data":"eeaf1d3be99ba07fc12759afdd6dc44013ba560cc093c6ca1634b8dd4174ed1e"} Nov 27 16:22:42 crc kubenswrapper[4707]: I1127 16:22:42.964789 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f0d5989d-b5fa-4474-8ce6-494b1a08b2a9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f0d5989d-b5fa-4474-8ce6-494b1a08b2a9" (UID: "f0d5989d-b5fa-4474-8ce6-494b1a08b2a9"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 16:22:42 crc kubenswrapper[4707]: I1127 16:22:42.985419 4707 scope.go:117] "RemoveContainer" containerID="e5e380e29c2d6cdefe875b298854cca2d1db3fba3d77ef6ea3722a376005fec5" Nov 27 16:22:43 crc kubenswrapper[4707]: I1127 16:22:43.006574 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7756b9d78c-qkfjh"] Nov 27 16:22:43 crc kubenswrapper[4707]: I1127 16:22:43.011531 4707 scope.go:117] "RemoveContainer" containerID="a2c64e2ddbe701eb1e3ec24bd94ac702aa5f78212d01f159f23d581117b3dffd" Nov 27 16:22:43 crc kubenswrapper[4707]: I1127 16:22:43.045597 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7756b9d78c-qkfjh"] Nov 27 16:22:43 crc kubenswrapper[4707]: I1127 16:22:43.045915 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f0d5989d-b5fa-4474-8ce6-494b1a08b2a9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 27 16:22:43 crc kubenswrapper[4707]: I1127 16:22:43.067066 4707 scope.go:117] "RemoveContainer" containerID="db4d02790d1d4c1e6132f4b03631a5f2be144addbe9656e3c1865cab2c4f2c26" Nov 27 16:22:43 crc kubenswrapper[4707]: I1127 16:22:43.068578 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"] Nov 27 16:22:43 crc kubenswrapper[4707]: E1127 16:22:43.068969 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="254d0a1b-ea0e-4d0c-8a20-fb85542900fb" containerName="dnsmasq-dns" Nov 27 16:22:43 crc kubenswrapper[4707]: I1127 16:22:43.068985 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="254d0a1b-ea0e-4d0c-8a20-fb85542900fb" containerName="dnsmasq-dns" Nov 27 16:22:43 crc kubenswrapper[4707]: E1127 16:22:43.068996 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="831d6788-637b-4a22-8ed9-1f39e8e277a0" containerName="nova-manage" Nov 27 16:22:43 crc kubenswrapper[4707]: I1127 16:22:43.069003 
4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="831d6788-637b-4a22-8ed9-1f39e8e277a0" containerName="nova-manage" Nov 27 16:22:43 crc kubenswrapper[4707]: E1127 16:22:43.069021 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f0d5989d-b5fa-4474-8ce6-494b1a08b2a9" containerName="nova-metadata-log" Nov 27 16:22:43 crc kubenswrapper[4707]: I1127 16:22:43.069027 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="f0d5989d-b5fa-4474-8ce6-494b1a08b2a9" containerName="nova-metadata-log" Nov 27 16:22:43 crc kubenswrapper[4707]: E1127 16:22:43.069043 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="254d0a1b-ea0e-4d0c-8a20-fb85542900fb" containerName="init" Nov 27 16:22:43 crc kubenswrapper[4707]: I1127 16:22:43.069049 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="254d0a1b-ea0e-4d0c-8a20-fb85542900fb" containerName="init" Nov 27 16:22:43 crc kubenswrapper[4707]: E1127 16:22:43.069068 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f0d5989d-b5fa-4474-8ce6-494b1a08b2a9" containerName="nova-metadata-metadata" Nov 27 16:22:43 crc kubenswrapper[4707]: I1127 16:22:43.069074 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="f0d5989d-b5fa-4474-8ce6-494b1a08b2a9" containerName="nova-metadata-metadata" Nov 27 16:22:43 crc kubenswrapper[4707]: E1127 16:22:43.069088 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e14300b4-67fd-4da5-8741-879007e38268" containerName="nova-cell1-conductor-db-sync" Nov 27 16:22:43 crc kubenswrapper[4707]: I1127 16:22:43.069094 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="e14300b4-67fd-4da5-8741-879007e38268" containerName="nova-cell1-conductor-db-sync" Nov 27 16:22:43 crc kubenswrapper[4707]: I1127 16:22:43.069251 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="f0d5989d-b5fa-4474-8ce6-494b1a08b2a9" containerName="nova-metadata-log" Nov 27 16:22:43 crc kubenswrapper[4707]: I1127 
16:22:43.069269 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="e14300b4-67fd-4da5-8741-879007e38268" containerName="nova-cell1-conductor-db-sync" Nov 27 16:22:43 crc kubenswrapper[4707]: I1127 16:22:43.069275 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="f0d5989d-b5fa-4474-8ce6-494b1a08b2a9" containerName="nova-metadata-metadata" Nov 27 16:22:43 crc kubenswrapper[4707]: I1127 16:22:43.069287 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="831d6788-637b-4a22-8ed9-1f39e8e277a0" containerName="nova-manage" Nov 27 16:22:43 crc kubenswrapper[4707]: I1127 16:22:43.069297 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="254d0a1b-ea0e-4d0c-8a20-fb85542900fb" containerName="dnsmasq-dns" Nov 27 16:22:43 crc kubenswrapper[4707]: I1127 16:22:43.071945 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Nov 27 16:22:43 crc kubenswrapper[4707]: I1127 16:22:43.074527 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Nov 27 16:22:43 crc kubenswrapper[4707]: I1127 16:22:43.075870 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Nov 27 16:22:43 crc kubenswrapper[4707]: I1127 16:22:43.088073 4707 scope.go:117] "RemoveContainer" containerID="a2c64e2ddbe701eb1e3ec24bd94ac702aa5f78212d01f159f23d581117b3dffd" Nov 27 16:22:43 crc kubenswrapper[4707]: E1127 16:22:43.088634 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a2c64e2ddbe701eb1e3ec24bd94ac702aa5f78212d01f159f23d581117b3dffd\": container with ID starting with a2c64e2ddbe701eb1e3ec24bd94ac702aa5f78212d01f159f23d581117b3dffd not found: ID does not exist" containerID="a2c64e2ddbe701eb1e3ec24bd94ac702aa5f78212d01f159f23d581117b3dffd" Nov 27 16:22:43 crc kubenswrapper[4707]: I1127 16:22:43.088675 
4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a2c64e2ddbe701eb1e3ec24bd94ac702aa5f78212d01f159f23d581117b3dffd"} err="failed to get container status \"a2c64e2ddbe701eb1e3ec24bd94ac702aa5f78212d01f159f23d581117b3dffd\": rpc error: code = NotFound desc = could not find container \"a2c64e2ddbe701eb1e3ec24bd94ac702aa5f78212d01f159f23d581117b3dffd\": container with ID starting with a2c64e2ddbe701eb1e3ec24bd94ac702aa5f78212d01f159f23d581117b3dffd not found: ID does not exist" Nov 27 16:22:43 crc kubenswrapper[4707]: I1127 16:22:43.088720 4707 scope.go:117] "RemoveContainer" containerID="db4d02790d1d4c1e6132f4b03631a5f2be144addbe9656e3c1865cab2c4f2c26" Nov 27 16:22:43 crc kubenswrapper[4707]: E1127 16:22:43.089008 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"db4d02790d1d4c1e6132f4b03631a5f2be144addbe9656e3c1865cab2c4f2c26\": container with ID starting with db4d02790d1d4c1e6132f4b03631a5f2be144addbe9656e3c1865cab2c4f2c26 not found: ID does not exist" containerID="db4d02790d1d4c1e6132f4b03631a5f2be144addbe9656e3c1865cab2c4f2c26" Nov 27 16:22:43 crc kubenswrapper[4707]: I1127 16:22:43.089133 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"db4d02790d1d4c1e6132f4b03631a5f2be144addbe9656e3c1865cab2c4f2c26"} err="failed to get container status \"db4d02790d1d4c1e6132f4b03631a5f2be144addbe9656e3c1865cab2c4f2c26\": rpc error: code = NotFound desc = could not find container \"db4d02790d1d4c1e6132f4b03631a5f2be144addbe9656e3c1865cab2c4f2c26\": container with ID starting with db4d02790d1d4c1e6132f4b03631a5f2be144addbe9656e3c1865cab2c4f2c26 not found: ID does not exist" Nov 27 16:22:43 crc kubenswrapper[4707]: I1127 16:22:43.089152 4707 scope.go:117] "RemoveContainer" containerID="a2c64e2ddbe701eb1e3ec24bd94ac702aa5f78212d01f159f23d581117b3dffd" Nov 27 16:22:43 crc kubenswrapper[4707]: I1127 
16:22:43.092947 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a2c64e2ddbe701eb1e3ec24bd94ac702aa5f78212d01f159f23d581117b3dffd"} err="failed to get container status \"a2c64e2ddbe701eb1e3ec24bd94ac702aa5f78212d01f159f23d581117b3dffd\": rpc error: code = NotFound desc = could not find container \"a2c64e2ddbe701eb1e3ec24bd94ac702aa5f78212d01f159f23d581117b3dffd\": container with ID starting with a2c64e2ddbe701eb1e3ec24bd94ac702aa5f78212d01f159f23d581117b3dffd not found: ID does not exist" Nov 27 16:22:43 crc kubenswrapper[4707]: I1127 16:22:43.092986 4707 scope.go:117] "RemoveContainer" containerID="db4d02790d1d4c1e6132f4b03631a5f2be144addbe9656e3c1865cab2c4f2c26" Nov 27 16:22:43 crc kubenswrapper[4707]: I1127 16:22:43.093534 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"db4d02790d1d4c1e6132f4b03631a5f2be144addbe9656e3c1865cab2c4f2c26"} err="failed to get container status \"db4d02790d1d4c1e6132f4b03631a5f2be144addbe9656e3c1865cab2c4f2c26\": rpc error: code = NotFound desc = could not find container \"db4d02790d1d4c1e6132f4b03631a5f2be144addbe9656e3c1865cab2c4f2c26\": container with ID starting with db4d02790d1d4c1e6132f4b03631a5f2be144addbe9656e3c1865cab2c4f2c26 not found: ID does not exist" Nov 27 16:22:43 crc kubenswrapper[4707]: I1127 16:22:43.222204 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="254d0a1b-ea0e-4d0c-8a20-fb85542900fb" path="/var/lib/kubelet/pods/254d0a1b-ea0e-4d0c-8a20-fb85542900fb/volumes" Nov 27 16:22:43 crc kubenswrapper[4707]: I1127 16:22:43.249520 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-br42x\" (UniqueName: \"kubernetes.io/projected/af3bfed8-f098-4557-a577-0a10317ee805-kube-api-access-br42x\") pod \"nova-cell1-conductor-0\" (UID: \"af3bfed8-f098-4557-a577-0a10317ee805\") " pod="openstack/nova-cell1-conductor-0" Nov 27 16:22:43 crc 
kubenswrapper[4707]: I1127 16:22:43.249654 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/af3bfed8-f098-4557-a577-0a10317ee805-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"af3bfed8-f098-4557-a577-0a10317ee805\") " pod="openstack/nova-cell1-conductor-0" Nov 27 16:22:43 crc kubenswrapper[4707]: I1127 16:22:43.249678 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/af3bfed8-f098-4557-a577-0a10317ee805-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"af3bfed8-f098-4557-a577-0a10317ee805\") " pod="openstack/nova-cell1-conductor-0" Nov 27 16:22:43 crc kubenswrapper[4707]: I1127 16:22:43.331752 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Nov 27 16:22:43 crc kubenswrapper[4707]: I1127 16:22:43.359570 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/af3bfed8-f098-4557-a577-0a10317ee805-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"af3bfed8-f098-4557-a577-0a10317ee805\") " pod="openstack/nova-cell1-conductor-0" Nov 27 16:22:43 crc kubenswrapper[4707]: I1127 16:22:43.359617 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/af3bfed8-f098-4557-a577-0a10317ee805-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"af3bfed8-f098-4557-a577-0a10317ee805\") " pod="openstack/nova-cell1-conductor-0" Nov 27 16:22:43 crc kubenswrapper[4707]: I1127 16:22:43.359734 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-br42x\" (UniqueName: \"kubernetes.io/projected/af3bfed8-f098-4557-a577-0a10317ee805-kube-api-access-br42x\") pod \"nova-cell1-conductor-0\" (UID: 
\"af3bfed8-f098-4557-a577-0a10317ee805\") " pod="openstack/nova-cell1-conductor-0" Nov 27 16:22:43 crc kubenswrapper[4707]: I1127 16:22:43.365042 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/af3bfed8-f098-4557-a577-0a10317ee805-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"af3bfed8-f098-4557-a577-0a10317ee805\") " pod="openstack/nova-cell1-conductor-0" Nov 27 16:22:43 crc kubenswrapper[4707]: I1127 16:22:43.371783 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Nov 27 16:22:43 crc kubenswrapper[4707]: I1127 16:22:43.387922 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Nov 27 16:22:43 crc kubenswrapper[4707]: I1127 16:22:43.389558 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Nov 27 16:22:43 crc kubenswrapper[4707]: I1127 16:22:43.391491 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/af3bfed8-f098-4557-a577-0a10317ee805-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"af3bfed8-f098-4557-a577-0a10317ee805\") " pod="openstack/nova-cell1-conductor-0" Nov 27 16:22:43 crc kubenswrapper[4707]: I1127 16:22:43.392594 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Nov 27 16:22:43 crc kubenswrapper[4707]: I1127 16:22:43.394415 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Nov 27 16:22:43 crc kubenswrapper[4707]: I1127 16:22:43.398778 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Nov 27 16:22:43 crc kubenswrapper[4707]: I1127 16:22:43.408451 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-br42x\" (UniqueName: 
\"kubernetes.io/projected/af3bfed8-f098-4557-a577-0a10317ee805-kube-api-access-br42x\") pod \"nova-cell1-conductor-0\" (UID: \"af3bfed8-f098-4557-a577-0a10317ee805\") " pod="openstack/nova-cell1-conductor-0" Nov 27 16:22:43 crc kubenswrapper[4707]: I1127 16:22:43.461571 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cbc32b0f-9bae-4aed-a868-6b06465b570f-config-data\") pod \"nova-metadata-0\" (UID: \"cbc32b0f-9bae-4aed-a868-6b06465b570f\") " pod="openstack/nova-metadata-0" Nov 27 16:22:43 crc kubenswrapper[4707]: I1127 16:22:43.461733 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cbc32b0f-9bae-4aed-a868-6b06465b570f-logs\") pod \"nova-metadata-0\" (UID: \"cbc32b0f-9bae-4aed-a868-6b06465b570f\") " pod="openstack/nova-metadata-0" Nov 27 16:22:43 crc kubenswrapper[4707]: I1127 16:22:43.461782 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/cbc32b0f-9bae-4aed-a868-6b06465b570f-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"cbc32b0f-9bae-4aed-a868-6b06465b570f\") " pod="openstack/nova-metadata-0" Nov 27 16:22:43 crc kubenswrapper[4707]: I1127 16:22:43.461808 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cbc32b0f-9bae-4aed-a868-6b06465b570f-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"cbc32b0f-9bae-4aed-a868-6b06465b570f\") " pod="openstack/nova-metadata-0" Nov 27 16:22:43 crc kubenswrapper[4707]: I1127 16:22:43.461837 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-82dwt\" (UniqueName: 
\"kubernetes.io/projected/cbc32b0f-9bae-4aed-a868-6b06465b570f-kube-api-access-82dwt\") pod \"nova-metadata-0\" (UID: \"cbc32b0f-9bae-4aed-a868-6b06465b570f\") " pod="openstack/nova-metadata-0" Nov 27 16:22:43 crc kubenswrapper[4707]: I1127 16:22:43.563965 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cbc32b0f-9bae-4aed-a868-6b06465b570f-logs\") pod \"nova-metadata-0\" (UID: \"cbc32b0f-9bae-4aed-a868-6b06465b570f\") " pod="openstack/nova-metadata-0" Nov 27 16:22:43 crc kubenswrapper[4707]: I1127 16:22:43.564015 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/cbc32b0f-9bae-4aed-a868-6b06465b570f-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"cbc32b0f-9bae-4aed-a868-6b06465b570f\") " pod="openstack/nova-metadata-0" Nov 27 16:22:43 crc kubenswrapper[4707]: I1127 16:22:43.564040 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cbc32b0f-9bae-4aed-a868-6b06465b570f-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"cbc32b0f-9bae-4aed-a868-6b06465b570f\") " pod="openstack/nova-metadata-0" Nov 27 16:22:43 crc kubenswrapper[4707]: I1127 16:22:43.564064 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-82dwt\" (UniqueName: \"kubernetes.io/projected/cbc32b0f-9bae-4aed-a868-6b06465b570f-kube-api-access-82dwt\") pod \"nova-metadata-0\" (UID: \"cbc32b0f-9bae-4aed-a868-6b06465b570f\") " pod="openstack/nova-metadata-0" Nov 27 16:22:43 crc kubenswrapper[4707]: I1127 16:22:43.564125 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cbc32b0f-9bae-4aed-a868-6b06465b570f-config-data\") pod \"nova-metadata-0\" (UID: \"cbc32b0f-9bae-4aed-a868-6b06465b570f\") " 
pod="openstack/nova-metadata-0" Nov 27 16:22:43 crc kubenswrapper[4707]: I1127 16:22:43.564679 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cbc32b0f-9bae-4aed-a868-6b06465b570f-logs\") pod \"nova-metadata-0\" (UID: \"cbc32b0f-9bae-4aed-a868-6b06465b570f\") " pod="openstack/nova-metadata-0" Nov 27 16:22:43 crc kubenswrapper[4707]: I1127 16:22:43.568052 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cbc32b0f-9bae-4aed-a868-6b06465b570f-config-data\") pod \"nova-metadata-0\" (UID: \"cbc32b0f-9bae-4aed-a868-6b06465b570f\") " pod="openstack/nova-metadata-0" Nov 27 16:22:43 crc kubenswrapper[4707]: I1127 16:22:43.568111 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cbc32b0f-9bae-4aed-a868-6b06465b570f-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"cbc32b0f-9bae-4aed-a868-6b06465b570f\") " pod="openstack/nova-metadata-0" Nov 27 16:22:43 crc kubenswrapper[4707]: I1127 16:22:43.574857 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/cbc32b0f-9bae-4aed-a868-6b06465b570f-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"cbc32b0f-9bae-4aed-a868-6b06465b570f\") " pod="openstack/nova-metadata-0" Nov 27 16:22:43 crc kubenswrapper[4707]: I1127 16:22:43.584923 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-82dwt\" (UniqueName: \"kubernetes.io/projected/cbc32b0f-9bae-4aed-a868-6b06465b570f-kube-api-access-82dwt\") pod \"nova-metadata-0\" (UID: \"cbc32b0f-9bae-4aed-a868-6b06465b570f\") " pod="openstack/nova-metadata-0" Nov 27 16:22:43 crc kubenswrapper[4707]: I1127 16:22:43.693417 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Nov 27 16:22:43 crc kubenswrapper[4707]: I1127 16:22:43.794835 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Nov 27 16:22:43 crc kubenswrapper[4707]: I1127 16:22:43.971289 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="c280ae8f-cb0b-4035-a7a2-fedfb044a6f0" containerName="nova-scheduler-scheduler" containerID="cri-o://2f3227f50dfdfefb9dcf820379e8861e5fce7b3c3edd07722b229270863d7c64" gracePeriod=30 Nov 27 16:22:44 crc kubenswrapper[4707]: W1127 16:22:44.231130 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podaf3bfed8_f098_4557_a577_0a10317ee805.slice/crio-8273abf8676342475205b76b2c5688a797dbfa0a7115d09bc24e78d4e162f48d WatchSource:0}: Error finding container 8273abf8676342475205b76b2c5688a797dbfa0a7115d09bc24e78d4e162f48d: Status 404 returned error can't find the container with id 8273abf8676342475205b76b2c5688a797dbfa0a7115d09bc24e78d4e162f48d Nov 27 16:22:44 crc kubenswrapper[4707]: I1127 16:22:44.231234 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Nov 27 16:22:44 crc kubenswrapper[4707]: W1127 16:22:44.328056 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcbc32b0f_9bae_4aed_a868_6b06465b570f.slice/crio-44a022643a2f8e0df222056e29419364aaf4426f5259df405a9e259e2b841c62 WatchSource:0}: Error finding container 44a022643a2f8e0df222056e29419364aaf4426f5259df405a9e259e2b841c62: Status 404 returned error can't find the container with id 44a022643a2f8e0df222056e29419364aaf4426f5259df405a9e259e2b841c62 Nov 27 16:22:44 crc kubenswrapper[4707]: I1127 16:22:44.329769 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Nov 27 
16:22:44 crc kubenswrapper[4707]: I1127 16:22:44.985240 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"af3bfed8-f098-4557-a577-0a10317ee805","Type":"ContainerStarted","Data":"4312a7c75dc801427d2da39ed9c082e19391af032a7654bc9f940a8cdd10e3a5"} Nov 27 16:22:44 crc kubenswrapper[4707]: I1127 16:22:44.985648 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0" Nov 27 16:22:44 crc kubenswrapper[4707]: I1127 16:22:44.985672 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"af3bfed8-f098-4557-a577-0a10317ee805","Type":"ContainerStarted","Data":"8273abf8676342475205b76b2c5688a797dbfa0a7115d09bc24e78d4e162f48d"} Nov 27 16:22:44 crc kubenswrapper[4707]: I1127 16:22:44.987843 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"cbc32b0f-9bae-4aed-a868-6b06465b570f","Type":"ContainerStarted","Data":"238d8c5feee8c378354af7a1316f4d24c030ec1710cac648345decf354baf627"} Nov 27 16:22:44 crc kubenswrapper[4707]: I1127 16:22:44.987889 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"cbc32b0f-9bae-4aed-a868-6b06465b570f","Type":"ContainerStarted","Data":"ee5f9d9897f7baa3856bc1b61fe57c15b00c56dd798cf39127b8baa5a6cfcf16"} Nov 27 16:22:44 crc kubenswrapper[4707]: I1127 16:22:44.987908 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"cbc32b0f-9bae-4aed-a868-6b06465b570f","Type":"ContainerStarted","Data":"44a022643a2f8e0df222056e29419364aaf4426f5259df405a9e259e2b841c62"} Nov 27 16:22:45 crc kubenswrapper[4707]: I1127 16:22:45.017696 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=2.017679349 podStartE2EDuration="2.017679349s" podCreationTimestamp="2025-11-27 16:22:43 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-27 16:22:45.008345132 +0000 UTC m=+1140.639793910" watchObservedRunningTime="2025-11-27 16:22:45.017679349 +0000 UTC m=+1140.649128127" Nov 27 16:22:45 crc kubenswrapper[4707]: I1127 16:22:45.041296 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.041280296 podStartE2EDuration="2.041280296s" podCreationTimestamp="2025-11-27 16:22:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-27 16:22:45.039168043 +0000 UTC m=+1140.670616821" watchObservedRunningTime="2025-11-27 16:22:45.041280296 +0000 UTC m=+1140.672729074" Nov 27 16:22:45 crc kubenswrapper[4707]: I1127 16:22:45.239700 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f0d5989d-b5fa-4474-8ce6-494b1a08b2a9" path="/var/lib/kubelet/pods/f0d5989d-b5fa-4474-8ce6-494b1a08b2a9/volumes" Nov 27 16:22:46 crc kubenswrapper[4707]: E1127 16:22:46.401399 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="2f3227f50dfdfefb9dcf820379e8861e5fce7b3c3edd07722b229270863d7c64" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Nov 27 16:22:46 crc kubenswrapper[4707]: E1127 16:22:46.403586 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="2f3227f50dfdfefb9dcf820379e8861e5fce7b3c3edd07722b229270863d7c64" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Nov 27 16:22:46 crc kubenswrapper[4707]: E1127 16:22:46.405991 4707 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: 
code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="2f3227f50dfdfefb9dcf820379e8861e5fce7b3c3edd07722b229270863d7c64" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Nov 27 16:22:46 crc kubenswrapper[4707]: E1127 16:22:46.406048 4707 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="c280ae8f-cb0b-4035-a7a2-fedfb044a6f0" containerName="nova-scheduler-scheduler" Nov 27 16:22:46 crc kubenswrapper[4707]: I1127 16:22:46.946409 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Nov 27 16:22:47 crc kubenswrapper[4707]: I1127 16:22:47.009084 4707 generic.go:334] "Generic (PLEG): container finished" podID="c280ae8f-cb0b-4035-a7a2-fedfb044a6f0" containerID="2f3227f50dfdfefb9dcf820379e8861e5fce7b3c3edd07722b229270863d7c64" exitCode=0 Nov 27 16:22:47 crc kubenswrapper[4707]: I1127 16:22:47.009184 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"c280ae8f-cb0b-4035-a7a2-fedfb044a6f0","Type":"ContainerDied","Data":"2f3227f50dfdfefb9dcf820379e8861e5fce7b3c3edd07722b229270863d7c64"} Nov 27 16:22:47 crc kubenswrapper[4707]: I1127 16:22:47.009261 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"c280ae8f-cb0b-4035-a7a2-fedfb044a6f0","Type":"ContainerDied","Data":"54db668a5b818364925e4a6f0b61e5c19e947ee3510c3469d13f649fc2608566"} Nov 27 16:22:47 crc kubenswrapper[4707]: I1127 16:22:47.009330 4707 scope.go:117] "RemoveContainer" containerID="2f3227f50dfdfefb9dcf820379e8861e5fce7b3c3edd07722b229270863d7c64" Nov 27 16:22:47 crc kubenswrapper[4707]: I1127 16:22:47.009906 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Nov 27 16:22:47 crc kubenswrapper[4707]: I1127 16:22:47.031418 4707 scope.go:117] "RemoveContainer" containerID="2f3227f50dfdfefb9dcf820379e8861e5fce7b3c3edd07722b229270863d7c64" Nov 27 16:22:47 crc kubenswrapper[4707]: E1127 16:22:47.031852 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2f3227f50dfdfefb9dcf820379e8861e5fce7b3c3edd07722b229270863d7c64\": container with ID starting with 2f3227f50dfdfefb9dcf820379e8861e5fce7b3c3edd07722b229270863d7c64 not found: ID does not exist" containerID="2f3227f50dfdfefb9dcf820379e8861e5fce7b3c3edd07722b229270863d7c64" Nov 27 16:22:47 crc kubenswrapper[4707]: I1127 16:22:47.031881 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2f3227f50dfdfefb9dcf820379e8861e5fce7b3c3edd07722b229270863d7c64"} err="failed to get container status \"2f3227f50dfdfefb9dcf820379e8861e5fce7b3c3edd07722b229270863d7c64\": rpc error: code = NotFound desc = could not find container \"2f3227f50dfdfefb9dcf820379e8861e5fce7b3c3edd07722b229270863d7c64\": container with ID starting with 2f3227f50dfdfefb9dcf820379e8861e5fce7b3c3edd07722b229270863d7c64 not found: ID does not exist" Nov 27 16:22:47 crc kubenswrapper[4707]: I1127 16:22:47.036810 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c280ae8f-cb0b-4035-a7a2-fedfb044a6f0-combined-ca-bundle\") pod \"c280ae8f-cb0b-4035-a7a2-fedfb044a6f0\" (UID: \"c280ae8f-cb0b-4035-a7a2-fedfb044a6f0\") " Nov 27 16:22:47 crc kubenswrapper[4707]: I1127 16:22:47.036852 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c280ae8f-cb0b-4035-a7a2-fedfb044a6f0-config-data\") pod \"c280ae8f-cb0b-4035-a7a2-fedfb044a6f0\" (UID: \"c280ae8f-cb0b-4035-a7a2-fedfb044a6f0\") " 
Nov 27 16:22:47 crc kubenswrapper[4707]: I1127 16:22:47.036886 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-99nrp\" (UniqueName: \"kubernetes.io/projected/c280ae8f-cb0b-4035-a7a2-fedfb044a6f0-kube-api-access-99nrp\") pod \"c280ae8f-cb0b-4035-a7a2-fedfb044a6f0\" (UID: \"c280ae8f-cb0b-4035-a7a2-fedfb044a6f0\") " Nov 27 16:22:47 crc kubenswrapper[4707]: I1127 16:22:47.050631 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c280ae8f-cb0b-4035-a7a2-fedfb044a6f0-kube-api-access-99nrp" (OuterVolumeSpecName: "kube-api-access-99nrp") pod "c280ae8f-cb0b-4035-a7a2-fedfb044a6f0" (UID: "c280ae8f-cb0b-4035-a7a2-fedfb044a6f0"). InnerVolumeSpecName "kube-api-access-99nrp". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 16:22:47 crc kubenswrapper[4707]: I1127 16:22:47.069042 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c280ae8f-cb0b-4035-a7a2-fedfb044a6f0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c280ae8f-cb0b-4035-a7a2-fedfb044a6f0" (UID: "c280ae8f-cb0b-4035-a7a2-fedfb044a6f0"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 16:22:47 crc kubenswrapper[4707]: I1127 16:22:47.082347 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c280ae8f-cb0b-4035-a7a2-fedfb044a6f0-config-data" (OuterVolumeSpecName: "config-data") pod "c280ae8f-cb0b-4035-a7a2-fedfb044a6f0" (UID: "c280ae8f-cb0b-4035-a7a2-fedfb044a6f0"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 16:22:47 crc kubenswrapper[4707]: I1127 16:22:47.140922 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c280ae8f-cb0b-4035-a7a2-fedfb044a6f0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 27 16:22:47 crc kubenswrapper[4707]: I1127 16:22:47.140957 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c280ae8f-cb0b-4035-a7a2-fedfb044a6f0-config-data\") on node \"crc\" DevicePath \"\"" Nov 27 16:22:47 crc kubenswrapper[4707]: I1127 16:22:47.140967 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-99nrp\" (UniqueName: \"kubernetes.io/projected/c280ae8f-cb0b-4035-a7a2-fedfb044a6f0-kube-api-access-99nrp\") on node \"crc\" DevicePath \"\"" Nov 27 16:22:47 crc kubenswrapper[4707]: I1127 16:22:47.344717 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Nov 27 16:22:47 crc kubenswrapper[4707]: I1127 16:22:47.361938 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Nov 27 16:22:47 crc kubenswrapper[4707]: I1127 16:22:47.378171 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Nov 27 16:22:47 crc kubenswrapper[4707]: E1127 16:22:47.378737 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c280ae8f-cb0b-4035-a7a2-fedfb044a6f0" containerName="nova-scheduler-scheduler" Nov 27 16:22:47 crc kubenswrapper[4707]: I1127 16:22:47.378766 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="c280ae8f-cb0b-4035-a7a2-fedfb044a6f0" containerName="nova-scheduler-scheduler" Nov 27 16:22:47 crc kubenswrapper[4707]: I1127 16:22:47.379165 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="c280ae8f-cb0b-4035-a7a2-fedfb044a6f0" containerName="nova-scheduler-scheduler" Nov 27 16:22:47 crc kubenswrapper[4707]: I1127 
16:22:47.380187 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Nov 27 16:22:47 crc kubenswrapper[4707]: I1127 16:22:47.384648 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Nov 27 16:22:47 crc kubenswrapper[4707]: I1127 16:22:47.411634 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Nov 27 16:22:47 crc kubenswrapper[4707]: I1127 16:22:47.446225 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a431ee7a-2703-4dea-bba3-2bbdf489e218-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"a431ee7a-2703-4dea-bba3-2bbdf489e218\") " pod="openstack/nova-scheduler-0" Nov 27 16:22:47 crc kubenswrapper[4707]: I1127 16:22:47.446291 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a431ee7a-2703-4dea-bba3-2bbdf489e218-config-data\") pod \"nova-scheduler-0\" (UID: \"a431ee7a-2703-4dea-bba3-2bbdf489e218\") " pod="openstack/nova-scheduler-0" Nov 27 16:22:47 crc kubenswrapper[4707]: I1127 16:22:47.446568 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ckbnd\" (UniqueName: \"kubernetes.io/projected/a431ee7a-2703-4dea-bba3-2bbdf489e218-kube-api-access-ckbnd\") pod \"nova-scheduler-0\" (UID: \"a431ee7a-2703-4dea-bba3-2bbdf489e218\") " pod="openstack/nova-scheduler-0" Nov 27 16:22:47 crc kubenswrapper[4707]: I1127 16:22:47.548620 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ckbnd\" (UniqueName: \"kubernetes.io/projected/a431ee7a-2703-4dea-bba3-2bbdf489e218-kube-api-access-ckbnd\") pod \"nova-scheduler-0\" (UID: \"a431ee7a-2703-4dea-bba3-2bbdf489e218\") " pod="openstack/nova-scheduler-0" Nov 27 
16:22:47 crc kubenswrapper[4707]: I1127 16:22:47.548753 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a431ee7a-2703-4dea-bba3-2bbdf489e218-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"a431ee7a-2703-4dea-bba3-2bbdf489e218\") " pod="openstack/nova-scheduler-0" Nov 27 16:22:47 crc kubenswrapper[4707]: I1127 16:22:47.548788 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a431ee7a-2703-4dea-bba3-2bbdf489e218-config-data\") pod \"nova-scheduler-0\" (UID: \"a431ee7a-2703-4dea-bba3-2bbdf489e218\") " pod="openstack/nova-scheduler-0" Nov 27 16:22:47 crc kubenswrapper[4707]: I1127 16:22:47.553725 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a431ee7a-2703-4dea-bba3-2bbdf489e218-config-data\") pod \"nova-scheduler-0\" (UID: \"a431ee7a-2703-4dea-bba3-2bbdf489e218\") " pod="openstack/nova-scheduler-0" Nov 27 16:22:47 crc kubenswrapper[4707]: I1127 16:22:47.554038 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a431ee7a-2703-4dea-bba3-2bbdf489e218-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"a431ee7a-2703-4dea-bba3-2bbdf489e218\") " pod="openstack/nova-scheduler-0" Nov 27 16:22:47 crc kubenswrapper[4707]: I1127 16:22:47.566212 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ckbnd\" (UniqueName: \"kubernetes.io/projected/a431ee7a-2703-4dea-bba3-2bbdf489e218-kube-api-access-ckbnd\") pod \"nova-scheduler-0\" (UID: \"a431ee7a-2703-4dea-bba3-2bbdf489e218\") " pod="openstack/nova-scheduler-0" Nov 27 16:22:47 crc kubenswrapper[4707]: I1127 16:22:47.710404 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Nov 27 16:22:48 crc kubenswrapper[4707]: I1127 16:22:48.019999 4707 generic.go:334] "Generic (PLEG): container finished" podID="5ee13539-219f-4772-9569-f8526eff8cda" containerID="b203f67cd1cbf3949f13ad1cbceb6aaa0cd33cca5ba912d48e685e6d831542d0" exitCode=0 Nov 27 16:22:48 crc kubenswrapper[4707]: I1127 16:22:48.020061 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"5ee13539-219f-4772-9569-f8526eff8cda","Type":"ContainerDied","Data":"b203f67cd1cbf3949f13ad1cbceb6aaa0cd33cca5ba912d48e685e6d831542d0"} Nov 27 16:22:48 crc kubenswrapper[4707]: I1127 16:22:48.020089 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"5ee13539-219f-4772-9569-f8526eff8cda","Type":"ContainerDied","Data":"b316993f7deb2fd7dcfb3d4589cc9124291bdc0eaefd0191e40272ab2268c87d"} Nov 27 16:22:48 crc kubenswrapper[4707]: I1127 16:22:48.020101 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b316993f7deb2fd7dcfb3d4589cc9124291bdc0eaefd0191e40272ab2268c87d" Nov 27 16:22:48 crc kubenswrapper[4707]: I1127 16:22:48.053891 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Nov 27 16:22:48 crc kubenswrapper[4707]: I1127 16:22:48.165502 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5ee13539-219f-4772-9569-f8526eff8cda-config-data\") pod \"5ee13539-219f-4772-9569-f8526eff8cda\" (UID: \"5ee13539-219f-4772-9569-f8526eff8cda\") " Nov 27 16:22:48 crc kubenswrapper[4707]: I1127 16:22:48.165544 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5ee13539-219f-4772-9569-f8526eff8cda-logs\") pod \"5ee13539-219f-4772-9569-f8526eff8cda\" (UID: \"5ee13539-219f-4772-9569-f8526eff8cda\") " Nov 27 16:22:48 crc kubenswrapper[4707]: I1127 16:22:48.165772 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5ee13539-219f-4772-9569-f8526eff8cda-combined-ca-bundle\") pod \"5ee13539-219f-4772-9569-f8526eff8cda\" (UID: \"5ee13539-219f-4772-9569-f8526eff8cda\") " Nov 27 16:22:48 crc kubenswrapper[4707]: I1127 16:22:48.165793 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4l4jk\" (UniqueName: \"kubernetes.io/projected/5ee13539-219f-4772-9569-f8526eff8cda-kube-api-access-4l4jk\") pod \"5ee13539-219f-4772-9569-f8526eff8cda\" (UID: \"5ee13539-219f-4772-9569-f8526eff8cda\") " Nov 27 16:22:48 crc kubenswrapper[4707]: I1127 16:22:48.166270 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5ee13539-219f-4772-9569-f8526eff8cda-logs" (OuterVolumeSpecName: "logs") pod "5ee13539-219f-4772-9569-f8526eff8cda" (UID: "5ee13539-219f-4772-9569-f8526eff8cda"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 27 16:22:48 crc kubenswrapper[4707]: I1127 16:22:48.170738 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5ee13539-219f-4772-9569-f8526eff8cda-kube-api-access-4l4jk" (OuterVolumeSpecName: "kube-api-access-4l4jk") pod "5ee13539-219f-4772-9569-f8526eff8cda" (UID: "5ee13539-219f-4772-9569-f8526eff8cda"). InnerVolumeSpecName "kube-api-access-4l4jk". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 16:22:48 crc kubenswrapper[4707]: I1127 16:22:48.194609 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5ee13539-219f-4772-9569-f8526eff8cda-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5ee13539-219f-4772-9569-f8526eff8cda" (UID: "5ee13539-219f-4772-9569-f8526eff8cda"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 16:22:48 crc kubenswrapper[4707]: I1127 16:22:48.194710 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5ee13539-219f-4772-9569-f8526eff8cda-config-data" (OuterVolumeSpecName: "config-data") pod "5ee13539-219f-4772-9569-f8526eff8cda" (UID: "5ee13539-219f-4772-9569-f8526eff8cda"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 16:22:48 crc kubenswrapper[4707]: I1127 16:22:48.247727 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Nov 27 16:22:48 crc kubenswrapper[4707]: I1127 16:22:48.268694 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5ee13539-219f-4772-9569-f8526eff8cda-config-data\") on node \"crc\" DevicePath \"\"" Nov 27 16:22:48 crc kubenswrapper[4707]: I1127 16:22:48.268749 4707 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5ee13539-219f-4772-9569-f8526eff8cda-logs\") on node \"crc\" DevicePath \"\"" Nov 27 16:22:48 crc kubenswrapper[4707]: I1127 16:22:48.268758 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5ee13539-219f-4772-9569-f8526eff8cda-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 27 16:22:48 crc kubenswrapper[4707]: I1127 16:22:48.268770 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4l4jk\" (UniqueName: \"kubernetes.io/projected/5ee13539-219f-4772-9569-f8526eff8cda-kube-api-access-4l4jk\") on node \"crc\" DevicePath \"\"" Nov 27 16:22:48 crc kubenswrapper[4707]: I1127 16:22:48.795120 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Nov 27 16:22:48 crc kubenswrapper[4707]: I1127 16:22:48.795186 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Nov 27 16:22:49 crc kubenswrapper[4707]: I1127 16:22:49.041130 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Nov 27 16:22:49 crc kubenswrapper[4707]: I1127 16:22:49.041134 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"a431ee7a-2703-4dea-bba3-2bbdf489e218","Type":"ContainerStarted","Data":"196432ce75d91f6078033019944cd5cc7c22ba2600ee9338c8e07f65f6237ca8"} Nov 27 16:22:49 crc kubenswrapper[4707]: I1127 16:22:49.041734 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"a431ee7a-2703-4dea-bba3-2bbdf489e218","Type":"ContainerStarted","Data":"3daa6645f127bac9aec43e53c32b26e69749a63b7eae898b08ad144eeadf16a8"} Nov 27 16:22:49 crc kubenswrapper[4707]: I1127 16:22:49.085463 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.085438588 podStartE2EDuration="2.085438588s" podCreationTimestamp="2025-11-27 16:22:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-27 16:22:49.062737513 +0000 UTC m=+1144.694186331" watchObservedRunningTime="2025-11-27 16:22:49.085438588 +0000 UTC m=+1144.716887396" Nov 27 16:22:49 crc kubenswrapper[4707]: I1127 16:22:49.108081 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Nov 27 16:22:49 crc kubenswrapper[4707]: I1127 16:22:49.124996 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Nov 27 16:22:49 crc kubenswrapper[4707]: I1127 16:22:49.140450 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Nov 27 16:22:49 crc kubenswrapper[4707]: E1127 16:22:49.141056 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5ee13539-219f-4772-9569-f8526eff8cda" containerName="nova-api-log" Nov 27 16:22:49 crc kubenswrapper[4707]: I1127 16:22:49.141085 4707 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="5ee13539-219f-4772-9569-f8526eff8cda" containerName="nova-api-log" Nov 27 16:22:49 crc kubenswrapper[4707]: E1127 16:22:49.141144 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5ee13539-219f-4772-9569-f8526eff8cda" containerName="nova-api-api" Nov 27 16:22:49 crc kubenswrapper[4707]: I1127 16:22:49.141158 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="5ee13539-219f-4772-9569-f8526eff8cda" containerName="nova-api-api" Nov 27 16:22:49 crc kubenswrapper[4707]: I1127 16:22:49.141516 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="5ee13539-219f-4772-9569-f8526eff8cda" containerName="nova-api-log" Nov 27 16:22:49 crc kubenswrapper[4707]: I1127 16:22:49.141558 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="5ee13539-219f-4772-9569-f8526eff8cda" containerName="nova-api-api" Nov 27 16:22:49 crc kubenswrapper[4707]: I1127 16:22:49.143528 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Nov 27 16:22:49 crc kubenswrapper[4707]: I1127 16:22:49.146511 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Nov 27 16:22:49 crc kubenswrapper[4707]: I1127 16:22:49.184762 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Nov 27 16:22:49 crc kubenswrapper[4707]: I1127 16:22:49.187107 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7f1272cc-2e78-4c7e-abf5-56c30b17a717-config-data\") pod \"nova-api-0\" (UID: \"7f1272cc-2e78-4c7e-abf5-56c30b17a717\") " pod="openstack/nova-api-0" Nov 27 16:22:49 crc kubenswrapper[4707]: I1127 16:22:49.187179 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7f1272cc-2e78-4c7e-abf5-56c30b17a717-logs\") pod \"nova-api-0\" (UID: 
\"7f1272cc-2e78-4c7e-abf5-56c30b17a717\") " pod="openstack/nova-api-0" Nov 27 16:22:49 crc kubenswrapper[4707]: I1127 16:22:49.187204 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d4m62\" (UniqueName: \"kubernetes.io/projected/7f1272cc-2e78-4c7e-abf5-56c30b17a717-kube-api-access-d4m62\") pod \"nova-api-0\" (UID: \"7f1272cc-2e78-4c7e-abf5-56c30b17a717\") " pod="openstack/nova-api-0" Nov 27 16:22:49 crc kubenswrapper[4707]: I1127 16:22:49.187291 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7f1272cc-2e78-4c7e-abf5-56c30b17a717-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"7f1272cc-2e78-4c7e-abf5-56c30b17a717\") " pod="openstack/nova-api-0" Nov 27 16:22:49 crc kubenswrapper[4707]: I1127 16:22:49.207418 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5ee13539-219f-4772-9569-f8526eff8cda" path="/var/lib/kubelet/pods/5ee13539-219f-4772-9569-f8526eff8cda/volumes" Nov 27 16:22:49 crc kubenswrapper[4707]: I1127 16:22:49.208836 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c280ae8f-cb0b-4035-a7a2-fedfb044a6f0" path="/var/lib/kubelet/pods/c280ae8f-cb0b-4035-a7a2-fedfb044a6f0/volumes" Nov 27 16:22:49 crc kubenswrapper[4707]: I1127 16:22:49.287980 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7f1272cc-2e78-4c7e-abf5-56c30b17a717-config-data\") pod \"nova-api-0\" (UID: \"7f1272cc-2e78-4c7e-abf5-56c30b17a717\") " pod="openstack/nova-api-0" Nov 27 16:22:49 crc kubenswrapper[4707]: I1127 16:22:49.288046 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7f1272cc-2e78-4c7e-abf5-56c30b17a717-logs\") pod \"nova-api-0\" (UID: \"7f1272cc-2e78-4c7e-abf5-56c30b17a717\") " 
pod="openstack/nova-api-0" Nov 27 16:22:49 crc kubenswrapper[4707]: I1127 16:22:49.288070 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d4m62\" (UniqueName: \"kubernetes.io/projected/7f1272cc-2e78-4c7e-abf5-56c30b17a717-kube-api-access-d4m62\") pod \"nova-api-0\" (UID: \"7f1272cc-2e78-4c7e-abf5-56c30b17a717\") " pod="openstack/nova-api-0" Nov 27 16:22:49 crc kubenswrapper[4707]: I1127 16:22:49.288157 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7f1272cc-2e78-4c7e-abf5-56c30b17a717-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"7f1272cc-2e78-4c7e-abf5-56c30b17a717\") " pod="openstack/nova-api-0" Nov 27 16:22:49 crc kubenswrapper[4707]: I1127 16:22:49.288834 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7f1272cc-2e78-4c7e-abf5-56c30b17a717-logs\") pod \"nova-api-0\" (UID: \"7f1272cc-2e78-4c7e-abf5-56c30b17a717\") " pod="openstack/nova-api-0" Nov 27 16:22:49 crc kubenswrapper[4707]: I1127 16:22:49.291823 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7f1272cc-2e78-4c7e-abf5-56c30b17a717-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"7f1272cc-2e78-4c7e-abf5-56c30b17a717\") " pod="openstack/nova-api-0" Nov 27 16:22:49 crc kubenswrapper[4707]: I1127 16:22:49.304754 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d4m62\" (UniqueName: \"kubernetes.io/projected/7f1272cc-2e78-4c7e-abf5-56c30b17a717-kube-api-access-d4m62\") pod \"nova-api-0\" (UID: \"7f1272cc-2e78-4c7e-abf5-56c30b17a717\") " pod="openstack/nova-api-0" Nov 27 16:22:49 crc kubenswrapper[4707]: I1127 16:22:49.305384 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/7f1272cc-2e78-4c7e-abf5-56c30b17a717-config-data\") pod \"nova-api-0\" (UID: \"7f1272cc-2e78-4c7e-abf5-56c30b17a717\") " pod="openstack/nova-api-0" Nov 27 16:22:49 crc kubenswrapper[4707]: I1127 16:22:49.480363 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Nov 27 16:22:49 crc kubenswrapper[4707]: I1127 16:22:49.944151 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Nov 27 16:22:50 crc kubenswrapper[4707]: I1127 16:22:50.055362 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"7f1272cc-2e78-4c7e-abf5-56c30b17a717","Type":"ContainerStarted","Data":"09b28ea7ce0f884e6de418cfaa5107d73c243444334608157817c55dd1388df9"} Nov 27 16:22:50 crc kubenswrapper[4707]: I1127 16:22:50.992075 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Nov 27 16:22:51 crc kubenswrapper[4707]: I1127 16:22:51.069661 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"7f1272cc-2e78-4c7e-abf5-56c30b17a717","Type":"ContainerStarted","Data":"bc2e266c1a533e2d150cd20661defbc1ddcfc750d68590d297dea9099a539d06"} Nov 27 16:22:51 crc kubenswrapper[4707]: I1127 16:22:51.070030 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"7f1272cc-2e78-4c7e-abf5-56c30b17a717","Type":"ContainerStarted","Data":"8c80e27b4be2b856c64ad038bb4e813389a32360df4a1f008482ea68bf8b70ab"} Nov 27 16:22:51 crc kubenswrapper[4707]: I1127 16:22:51.096343 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.096323201 podStartE2EDuration="2.096323201s" podCreationTimestamp="2025-11-27 16:22:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-27 16:22:51.094001002 +0000 UTC 
m=+1146.725449810" watchObservedRunningTime="2025-11-27 16:22:51.096323201 +0000 UTC m=+1146.727771979" Nov 27 16:22:52 crc kubenswrapper[4707]: I1127 16:22:52.711409 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Nov 27 16:22:53 crc kubenswrapper[4707]: I1127 16:22:53.727804 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0" Nov 27 16:22:53 crc kubenswrapper[4707]: I1127 16:22:53.795737 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Nov 27 16:22:53 crc kubenswrapper[4707]: I1127 16:22:53.795788 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Nov 27 16:22:54 crc kubenswrapper[4707]: I1127 16:22:54.612475 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Nov 27 16:22:54 crc kubenswrapper[4707]: I1127 16:22:54.612734 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/kube-state-metrics-0" podUID="da629d4e-1b93-4623-a577-419c6dca0f14" containerName="kube-state-metrics" containerID="cri-o://02800a9fdfba6d6e27d3309db4a1bf5af39a66efb48603760535b9c096826729" gracePeriod=30 Nov 27 16:22:54 crc kubenswrapper[4707]: I1127 16:22:54.805501 4707 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="cbc32b0f-9bae-4aed-a868-6b06465b570f" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.198:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Nov 27 16:22:54 crc kubenswrapper[4707]: I1127 16:22:54.805583 4707 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="cbc32b0f-9bae-4aed-a868-6b06465b570f" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.198:8775/\": net/http: request 
canceled (Client.Timeout exceeded while awaiting headers)" Nov 27 16:22:55 crc kubenswrapper[4707]: I1127 16:22:55.118228 4707 generic.go:334] "Generic (PLEG): container finished" podID="da629d4e-1b93-4623-a577-419c6dca0f14" containerID="02800a9fdfba6d6e27d3309db4a1bf5af39a66efb48603760535b9c096826729" exitCode=2 Nov 27 16:22:55 crc kubenswrapper[4707]: I1127 16:22:55.118687 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"da629d4e-1b93-4623-a577-419c6dca0f14","Type":"ContainerDied","Data":"02800a9fdfba6d6e27d3309db4a1bf5af39a66efb48603760535b9c096826729"} Nov 27 16:22:55 crc kubenswrapper[4707]: I1127 16:22:55.118718 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"da629d4e-1b93-4623-a577-419c6dca0f14","Type":"ContainerDied","Data":"74835a2247bf6b616df3b3f8299ac949241a86ff395a5463aeea31a66efc5660"} Nov 27 16:22:55 crc kubenswrapper[4707]: I1127 16:22:55.118731 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="74835a2247bf6b616df3b3f8299ac949241a86ff395a5463aeea31a66efc5660" Nov 27 16:22:55 crc kubenswrapper[4707]: I1127 16:22:55.158903 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Nov 27 16:22:55 crc kubenswrapper[4707]: I1127 16:22:55.318413 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8cjwd\" (UniqueName: \"kubernetes.io/projected/da629d4e-1b93-4623-a577-419c6dca0f14-kube-api-access-8cjwd\") pod \"da629d4e-1b93-4623-a577-419c6dca0f14\" (UID: \"da629d4e-1b93-4623-a577-419c6dca0f14\") " Nov 27 16:22:55 crc kubenswrapper[4707]: I1127 16:22:55.325508 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/da629d4e-1b93-4623-a577-419c6dca0f14-kube-api-access-8cjwd" (OuterVolumeSpecName: "kube-api-access-8cjwd") pod "da629d4e-1b93-4623-a577-419c6dca0f14" (UID: "da629d4e-1b93-4623-a577-419c6dca0f14"). InnerVolumeSpecName "kube-api-access-8cjwd". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 16:22:55 crc kubenswrapper[4707]: I1127 16:22:55.421261 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8cjwd\" (UniqueName: \"kubernetes.io/projected/da629d4e-1b93-4623-a577-419c6dca0f14-kube-api-access-8cjwd\") on node \"crc\" DevicePath \"\"" Nov 27 16:22:56 crc kubenswrapper[4707]: I1127 16:22:56.125689 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Nov 27 16:22:56 crc kubenswrapper[4707]: I1127 16:22:56.152047 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Nov 27 16:22:56 crc kubenswrapper[4707]: I1127 16:22:56.161398 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/kube-state-metrics-0"] Nov 27 16:22:56 crc kubenswrapper[4707]: I1127 16:22:56.175157 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Nov 27 16:22:56 crc kubenswrapper[4707]: E1127 16:22:56.175597 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="da629d4e-1b93-4623-a577-419c6dca0f14" containerName="kube-state-metrics" Nov 27 16:22:56 crc kubenswrapper[4707]: I1127 16:22:56.175709 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="da629d4e-1b93-4623-a577-419c6dca0f14" containerName="kube-state-metrics" Nov 27 16:22:56 crc kubenswrapper[4707]: I1127 16:22:56.175932 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="da629d4e-1b93-4623-a577-419c6dca0f14" containerName="kube-state-metrics" Nov 27 16:22:56 crc kubenswrapper[4707]: I1127 16:22:56.176653 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Nov 27 16:22:56 crc kubenswrapper[4707]: I1127 16:22:56.180721 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"kube-state-metrics-tls-config" Nov 27 16:22:56 crc kubenswrapper[4707]: I1127 16:22:56.180912 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-kube-state-metrics-svc" Nov 27 16:22:56 crc kubenswrapper[4707]: I1127 16:22:56.184072 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Nov 27 16:22:56 crc kubenswrapper[4707]: I1127 16:22:56.340203 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/fb6d94a9-d059-4cba-a6f3-8590d2491bb2-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"fb6d94a9-d059-4cba-a6f3-8590d2491bb2\") " pod="openstack/kube-state-metrics-0" Nov 27 16:22:56 crc kubenswrapper[4707]: I1127 16:22:56.340275 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fb6d94a9-d059-4cba-a6f3-8590d2491bb2-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"fb6d94a9-d059-4cba-a6f3-8590d2491bb2\") " pod="openstack/kube-state-metrics-0" Nov 27 16:22:56 crc kubenswrapper[4707]: I1127 16:22:56.340355 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tt4w2\" (UniqueName: \"kubernetes.io/projected/fb6d94a9-d059-4cba-a6f3-8590d2491bb2-kube-api-access-tt4w2\") pod \"kube-state-metrics-0\" (UID: \"fb6d94a9-d059-4cba-a6f3-8590d2491bb2\") " pod="openstack/kube-state-metrics-0" Nov 27 16:22:56 crc kubenswrapper[4707]: I1127 16:22:56.340457 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/fb6d94a9-d059-4cba-a6f3-8590d2491bb2-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"fb6d94a9-d059-4cba-a6f3-8590d2491bb2\") " pod="openstack/kube-state-metrics-0" Nov 27 16:22:56 crc kubenswrapper[4707]: I1127 16:22:56.442925 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/fb6d94a9-d059-4cba-a6f3-8590d2491bb2-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"fb6d94a9-d059-4cba-a6f3-8590d2491bb2\") " pod="openstack/kube-state-metrics-0" Nov 27 16:22:56 crc kubenswrapper[4707]: I1127 16:22:56.443041 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fb6d94a9-d059-4cba-a6f3-8590d2491bb2-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"fb6d94a9-d059-4cba-a6f3-8590d2491bb2\") " pod="openstack/kube-state-metrics-0" Nov 27 16:22:56 crc kubenswrapper[4707]: I1127 16:22:56.443148 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tt4w2\" (UniqueName: \"kubernetes.io/projected/fb6d94a9-d059-4cba-a6f3-8590d2491bb2-kube-api-access-tt4w2\") pod \"kube-state-metrics-0\" (UID: \"fb6d94a9-d059-4cba-a6f3-8590d2491bb2\") " pod="openstack/kube-state-metrics-0" Nov 27 16:22:56 crc kubenswrapper[4707]: I1127 16:22:56.443269 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/fb6d94a9-d059-4cba-a6f3-8590d2491bb2-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"fb6d94a9-d059-4cba-a6f3-8590d2491bb2\") " pod="openstack/kube-state-metrics-0" Nov 27 16:22:56 crc kubenswrapper[4707]: I1127 16:22:56.450187 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/fb6d94a9-d059-4cba-a6f3-8590d2491bb2-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"fb6d94a9-d059-4cba-a6f3-8590d2491bb2\") " pod="openstack/kube-state-metrics-0" Nov 27 16:22:56 crc kubenswrapper[4707]: I1127 16:22:56.450223 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/fb6d94a9-d059-4cba-a6f3-8590d2491bb2-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"fb6d94a9-d059-4cba-a6f3-8590d2491bb2\") " pod="openstack/kube-state-metrics-0" Nov 27 16:22:56 crc kubenswrapper[4707]: I1127 16:22:56.455131 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/fb6d94a9-d059-4cba-a6f3-8590d2491bb2-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"fb6d94a9-d059-4cba-a6f3-8590d2491bb2\") " pod="openstack/kube-state-metrics-0" Nov 27 16:22:56 crc kubenswrapper[4707]: I1127 16:22:56.474709 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tt4w2\" (UniqueName: \"kubernetes.io/projected/fb6d94a9-d059-4cba-a6f3-8590d2491bb2-kube-api-access-tt4w2\") pod \"kube-state-metrics-0\" (UID: \"fb6d94a9-d059-4cba-a6f3-8590d2491bb2\") " pod="openstack/kube-state-metrics-0" Nov 27 16:22:56 crc kubenswrapper[4707]: I1127 16:22:56.491582 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Nov 27 16:22:56 crc kubenswrapper[4707]: I1127 16:22:56.555422 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Nov 27 16:22:56 crc kubenswrapper[4707]: I1127 16:22:56.555867 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="f3677d27-3ac0-4b22-a3c0-ffe82b37b437" containerName="ceilometer-central-agent" containerID="cri-o://5fc9393f0398128968478a010f734a44a7d80a0a41530387e666885746655785" gracePeriod=30 Nov 27 16:22:56 crc kubenswrapper[4707]: I1127 16:22:56.555946 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="f3677d27-3ac0-4b22-a3c0-ffe82b37b437" containerName="proxy-httpd" containerID="cri-o://54b0d26da6d3938b0707501bba757eec138a57173cbf49ddd52608443a2b6d6b" gracePeriod=30 Nov 27 16:22:56 crc kubenswrapper[4707]: I1127 16:22:56.556031 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="f3677d27-3ac0-4b22-a3c0-ffe82b37b437" containerName="sg-core" containerID="cri-o://c5a304ab8003e81706e7b2bc635b4f2adaa9b26594b89f6545d1197761735b7a" gracePeriod=30 Nov 27 16:22:56 crc kubenswrapper[4707]: I1127 16:22:56.556111 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="f3677d27-3ac0-4b22-a3c0-ffe82b37b437" containerName="ceilometer-notification-agent" containerID="cri-o://084cc809119de06cc4756181a129e012ab9a699af74b7760e9de52913ee9d79a" gracePeriod=30 Nov 27 16:22:57 crc kubenswrapper[4707]: I1127 16:22:57.022456 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Nov 27 16:22:57 crc kubenswrapper[4707]: I1127 16:22:57.139974 4707 generic.go:334] "Generic (PLEG): container finished" podID="f3677d27-3ac0-4b22-a3c0-ffe82b37b437" 
containerID="54b0d26da6d3938b0707501bba757eec138a57173cbf49ddd52608443a2b6d6b" exitCode=0 Nov 27 16:22:57 crc kubenswrapper[4707]: I1127 16:22:57.140011 4707 generic.go:334] "Generic (PLEG): container finished" podID="f3677d27-3ac0-4b22-a3c0-ffe82b37b437" containerID="c5a304ab8003e81706e7b2bc635b4f2adaa9b26594b89f6545d1197761735b7a" exitCode=2 Nov 27 16:22:57 crc kubenswrapper[4707]: I1127 16:22:57.140024 4707 generic.go:334] "Generic (PLEG): container finished" podID="f3677d27-3ac0-4b22-a3c0-ffe82b37b437" containerID="5fc9393f0398128968478a010f734a44a7d80a0a41530387e666885746655785" exitCode=0 Nov 27 16:22:57 crc kubenswrapper[4707]: I1127 16:22:57.140063 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f3677d27-3ac0-4b22-a3c0-ffe82b37b437","Type":"ContainerDied","Data":"54b0d26da6d3938b0707501bba757eec138a57173cbf49ddd52608443a2b6d6b"} Nov 27 16:22:57 crc kubenswrapper[4707]: I1127 16:22:57.140111 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f3677d27-3ac0-4b22-a3c0-ffe82b37b437","Type":"ContainerDied","Data":"c5a304ab8003e81706e7b2bc635b4f2adaa9b26594b89f6545d1197761735b7a"} Nov 27 16:22:57 crc kubenswrapper[4707]: I1127 16:22:57.140585 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f3677d27-3ac0-4b22-a3c0-ffe82b37b437","Type":"ContainerDied","Data":"5fc9393f0398128968478a010f734a44a7d80a0a41530387e666885746655785"} Nov 27 16:22:57 crc kubenswrapper[4707]: I1127 16:22:57.143456 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"fb6d94a9-d059-4cba-a6f3-8590d2491bb2","Type":"ContainerStarted","Data":"469950bb0c0b8b6eae6a71dc20f789c8a2ec0e1d5195110ec9c18dc1df563a47"} Nov 27 16:22:57 crc kubenswrapper[4707]: I1127 16:22:57.208786 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="da629d4e-1b93-4623-a577-419c6dca0f14" 
path="/var/lib/kubelet/pods/da629d4e-1b93-4623-a577-419c6dca0f14/volumes" Nov 27 16:22:57 crc kubenswrapper[4707]: I1127 16:22:57.712024 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Nov 27 16:22:57 crc kubenswrapper[4707]: I1127 16:22:57.749448 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Nov 27 16:22:58 crc kubenswrapper[4707]: I1127 16:22:58.157657 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"fb6d94a9-d059-4cba-a6f3-8590d2491bb2","Type":"ContainerStarted","Data":"4acceb9a4c8f0f80f921a0e7a5d5560c9ee637a9947768cee7b027ae3c1487f3"} Nov 27 16:22:58 crc kubenswrapper[4707]: I1127 16:22:58.183882 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=1.832858927 podStartE2EDuration="2.183856236s" podCreationTimestamp="2025-11-27 16:22:56 +0000 UTC" firstStartedPulling="2025-11-27 16:22:57.033024938 +0000 UTC m=+1152.664473716" lastFinishedPulling="2025-11-27 16:22:57.384022217 +0000 UTC m=+1153.015471025" observedRunningTime="2025-11-27 16:22:58.177183547 +0000 UTC m=+1153.808632355" watchObservedRunningTime="2025-11-27 16:22:58.183856236 +0000 UTC m=+1153.815305034" Nov 27 16:22:58 crc kubenswrapper[4707]: I1127 16:22:58.212691 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Nov 27 16:22:59 crc kubenswrapper[4707]: I1127 16:22:59.183627 4707 generic.go:334] "Generic (PLEG): container finished" podID="f3677d27-3ac0-4b22-a3c0-ffe82b37b437" containerID="084cc809119de06cc4756181a129e012ab9a699af74b7760e9de52913ee9d79a" exitCode=0 Nov 27 16:22:59 crc kubenswrapper[4707]: I1127 16:22:59.186209 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"f3677d27-3ac0-4b22-a3c0-ffe82b37b437","Type":"ContainerDied","Data":"084cc809119de06cc4756181a129e012ab9a699af74b7760e9de52913ee9d79a"} Nov 27 16:22:59 crc kubenswrapper[4707]: I1127 16:22:59.187519 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Nov 27 16:22:59 crc kubenswrapper[4707]: I1127 16:22:59.322087 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Nov 27 16:22:59 crc kubenswrapper[4707]: I1127 16:22:59.399039 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f3677d27-3ac0-4b22-a3c0-ffe82b37b437-log-httpd\") pod \"f3677d27-3ac0-4b22-a3c0-ffe82b37b437\" (UID: \"f3677d27-3ac0-4b22-a3c0-ffe82b37b437\") " Nov 27 16:22:59 crc kubenswrapper[4707]: I1127 16:22:59.399094 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qsfcn\" (UniqueName: \"kubernetes.io/projected/f3677d27-3ac0-4b22-a3c0-ffe82b37b437-kube-api-access-qsfcn\") pod \"f3677d27-3ac0-4b22-a3c0-ffe82b37b437\" (UID: \"f3677d27-3ac0-4b22-a3c0-ffe82b37b437\") " Nov 27 16:22:59 crc kubenswrapper[4707]: I1127 16:22:59.399171 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f3677d27-3ac0-4b22-a3c0-ffe82b37b437-sg-core-conf-yaml\") pod \"f3677d27-3ac0-4b22-a3c0-ffe82b37b437\" (UID: \"f3677d27-3ac0-4b22-a3c0-ffe82b37b437\") " Nov 27 16:22:59 crc kubenswrapper[4707]: I1127 16:22:59.399194 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f3677d27-3ac0-4b22-a3c0-ffe82b37b437-run-httpd\") pod \"f3677d27-3ac0-4b22-a3c0-ffe82b37b437\" (UID: \"f3677d27-3ac0-4b22-a3c0-ffe82b37b437\") " Nov 27 16:22:59 crc kubenswrapper[4707]: I1127 16:22:59.399282 4707 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f3677d27-3ac0-4b22-a3c0-ffe82b37b437-config-data\") pod \"f3677d27-3ac0-4b22-a3c0-ffe82b37b437\" (UID: \"f3677d27-3ac0-4b22-a3c0-ffe82b37b437\") " Nov 27 16:22:59 crc kubenswrapper[4707]: I1127 16:22:59.399321 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f3677d27-3ac0-4b22-a3c0-ffe82b37b437-combined-ca-bundle\") pod \"f3677d27-3ac0-4b22-a3c0-ffe82b37b437\" (UID: \"f3677d27-3ac0-4b22-a3c0-ffe82b37b437\") " Nov 27 16:22:59 crc kubenswrapper[4707]: I1127 16:22:59.399489 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f3677d27-3ac0-4b22-a3c0-ffe82b37b437-scripts\") pod \"f3677d27-3ac0-4b22-a3c0-ffe82b37b437\" (UID: \"f3677d27-3ac0-4b22-a3c0-ffe82b37b437\") " Nov 27 16:22:59 crc kubenswrapper[4707]: I1127 16:22:59.399505 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f3677d27-3ac0-4b22-a3c0-ffe82b37b437-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "f3677d27-3ac0-4b22-a3c0-ffe82b37b437" (UID: "f3677d27-3ac0-4b22-a3c0-ffe82b37b437"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 27 16:22:59 crc kubenswrapper[4707]: I1127 16:22:59.400074 4707 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f3677d27-3ac0-4b22-a3c0-ffe82b37b437-run-httpd\") on node \"crc\" DevicePath \"\"" Nov 27 16:22:59 crc kubenswrapper[4707]: I1127 16:22:59.400213 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f3677d27-3ac0-4b22-a3c0-ffe82b37b437-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "f3677d27-3ac0-4b22-a3c0-ffe82b37b437" (UID: "f3677d27-3ac0-4b22-a3c0-ffe82b37b437"). 
InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 27 16:22:59 crc kubenswrapper[4707]: I1127 16:22:59.404026 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f3677d27-3ac0-4b22-a3c0-ffe82b37b437-scripts" (OuterVolumeSpecName: "scripts") pod "f3677d27-3ac0-4b22-a3c0-ffe82b37b437" (UID: "f3677d27-3ac0-4b22-a3c0-ffe82b37b437"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 16:22:59 crc kubenswrapper[4707]: I1127 16:22:59.412149 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f3677d27-3ac0-4b22-a3c0-ffe82b37b437-kube-api-access-qsfcn" (OuterVolumeSpecName: "kube-api-access-qsfcn") pod "f3677d27-3ac0-4b22-a3c0-ffe82b37b437" (UID: "f3677d27-3ac0-4b22-a3c0-ffe82b37b437"). InnerVolumeSpecName "kube-api-access-qsfcn". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 16:22:59 crc kubenswrapper[4707]: I1127 16:22:59.440276 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f3677d27-3ac0-4b22-a3c0-ffe82b37b437-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "f3677d27-3ac0-4b22-a3c0-ffe82b37b437" (UID: "f3677d27-3ac0-4b22-a3c0-ffe82b37b437"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 16:22:59 crc kubenswrapper[4707]: I1127 16:22:59.480980 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Nov 27 16:22:59 crc kubenswrapper[4707]: I1127 16:22:59.481038 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Nov 27 16:22:59 crc kubenswrapper[4707]: I1127 16:22:59.495169 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f3677d27-3ac0-4b22-a3c0-ffe82b37b437-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f3677d27-3ac0-4b22-a3c0-ffe82b37b437" (UID: "f3677d27-3ac0-4b22-a3c0-ffe82b37b437"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 16:22:59 crc kubenswrapper[4707]: I1127 16:22:59.505449 4707 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f3677d27-3ac0-4b22-a3c0-ffe82b37b437-log-httpd\") on node \"crc\" DevicePath \"\"" Nov 27 16:22:59 crc kubenswrapper[4707]: I1127 16:22:59.505626 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qsfcn\" (UniqueName: \"kubernetes.io/projected/f3677d27-3ac0-4b22-a3c0-ffe82b37b437-kube-api-access-qsfcn\") on node \"crc\" DevicePath \"\"" Nov 27 16:22:59 crc kubenswrapper[4707]: I1127 16:22:59.505688 4707 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f3677d27-3ac0-4b22-a3c0-ffe82b37b437-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Nov 27 16:22:59 crc kubenswrapper[4707]: I1127 16:22:59.505760 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f3677d27-3ac0-4b22-a3c0-ffe82b37b437-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 27 16:22:59 crc kubenswrapper[4707]: I1127 16:22:59.505815 4707 
reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f3677d27-3ac0-4b22-a3c0-ffe82b37b437-scripts\") on node \"crc\" DevicePath \"\"" Nov 27 16:22:59 crc kubenswrapper[4707]: I1127 16:22:59.515549 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f3677d27-3ac0-4b22-a3c0-ffe82b37b437-config-data" (OuterVolumeSpecName: "config-data") pod "f3677d27-3ac0-4b22-a3c0-ffe82b37b437" (UID: "f3677d27-3ac0-4b22-a3c0-ffe82b37b437"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 16:22:59 crc kubenswrapper[4707]: I1127 16:22:59.607538 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f3677d27-3ac0-4b22-a3c0-ffe82b37b437-config-data\") on node \"crc\" DevicePath \"\"" Nov 27 16:23:00 crc kubenswrapper[4707]: I1127 16:23:00.196784 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f3677d27-3ac0-4b22-a3c0-ffe82b37b437","Type":"ContainerDied","Data":"b5aa0adfaeb0ef8de3a51e09e850ef3abd88bbc443f4ec114f5250886a209ef6"} Nov 27 16:23:00 crc kubenswrapper[4707]: I1127 16:23:00.197173 4707 scope.go:117] "RemoveContainer" containerID="54b0d26da6d3938b0707501bba757eec138a57173cbf49ddd52608443a2b6d6b" Nov 27 16:23:00 crc kubenswrapper[4707]: I1127 16:23:00.196919 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Nov 27 16:23:00 crc kubenswrapper[4707]: I1127 16:23:00.219881 4707 scope.go:117] "RemoveContainer" containerID="c5a304ab8003e81706e7b2bc635b4f2adaa9b26594b89f6545d1197761735b7a" Nov 27 16:23:00 crc kubenswrapper[4707]: I1127 16:23:00.228024 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Nov 27 16:23:00 crc kubenswrapper[4707]: I1127 16:23:00.239464 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Nov 27 16:23:00 crc kubenswrapper[4707]: I1127 16:23:00.241524 4707 scope.go:117] "RemoveContainer" containerID="084cc809119de06cc4756181a129e012ab9a699af74b7760e9de52913ee9d79a" Nov 27 16:23:00 crc kubenswrapper[4707]: I1127 16:23:00.261317 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Nov 27 16:23:00 crc kubenswrapper[4707]: E1127 16:23:00.261674 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f3677d27-3ac0-4b22-a3c0-ffe82b37b437" containerName="ceilometer-central-agent" Nov 27 16:23:00 crc kubenswrapper[4707]: I1127 16:23:00.261690 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="f3677d27-3ac0-4b22-a3c0-ffe82b37b437" containerName="ceilometer-central-agent" Nov 27 16:23:00 crc kubenswrapper[4707]: E1127 16:23:00.261707 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f3677d27-3ac0-4b22-a3c0-ffe82b37b437" containerName="proxy-httpd" Nov 27 16:23:00 crc kubenswrapper[4707]: I1127 16:23:00.261713 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="f3677d27-3ac0-4b22-a3c0-ffe82b37b437" containerName="proxy-httpd" Nov 27 16:23:00 crc kubenswrapper[4707]: E1127 16:23:00.261726 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f3677d27-3ac0-4b22-a3c0-ffe82b37b437" containerName="sg-core" Nov 27 16:23:00 crc kubenswrapper[4707]: I1127 16:23:00.261732 4707 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="f3677d27-3ac0-4b22-a3c0-ffe82b37b437" containerName="sg-core" Nov 27 16:23:00 crc kubenswrapper[4707]: E1127 16:23:00.261750 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f3677d27-3ac0-4b22-a3c0-ffe82b37b437" containerName="ceilometer-notification-agent" Nov 27 16:23:00 crc kubenswrapper[4707]: I1127 16:23:00.261756 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="f3677d27-3ac0-4b22-a3c0-ffe82b37b437" containerName="ceilometer-notification-agent" Nov 27 16:23:00 crc kubenswrapper[4707]: I1127 16:23:00.261940 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="f3677d27-3ac0-4b22-a3c0-ffe82b37b437" containerName="sg-core" Nov 27 16:23:00 crc kubenswrapper[4707]: I1127 16:23:00.261956 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="f3677d27-3ac0-4b22-a3c0-ffe82b37b437" containerName="ceilometer-central-agent" Nov 27 16:23:00 crc kubenswrapper[4707]: I1127 16:23:00.261970 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="f3677d27-3ac0-4b22-a3c0-ffe82b37b437" containerName="proxy-httpd" Nov 27 16:23:00 crc kubenswrapper[4707]: I1127 16:23:00.261979 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="f3677d27-3ac0-4b22-a3c0-ffe82b37b437" containerName="ceilometer-notification-agent" Nov 27 16:23:00 crc kubenswrapper[4707]: I1127 16:23:00.263484 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Nov 27 16:23:00 crc kubenswrapper[4707]: I1127 16:23:00.265131 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Nov 27 16:23:00 crc kubenswrapper[4707]: I1127 16:23:00.270828 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Nov 27 16:23:00 crc kubenswrapper[4707]: I1127 16:23:00.274679 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Nov 27 16:23:00 crc kubenswrapper[4707]: I1127 16:23:00.274898 4707 scope.go:117] "RemoveContainer" containerID="5fc9393f0398128968478a010f734a44a7d80a0a41530387e666885746655785" Nov 27 16:23:00 crc kubenswrapper[4707]: I1127 16:23:00.288752 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Nov 27 16:23:00 crc kubenswrapper[4707]: I1127 16:23:00.421432 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0e3db829-032c-4a56-9e60-5a332017caca-scripts\") pod \"ceilometer-0\" (UID: \"0e3db829-032c-4a56-9e60-5a332017caca\") " pod="openstack/ceilometer-0" Nov 27 16:23:00 crc kubenswrapper[4707]: I1127 16:23:00.421521 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/0e3db829-032c-4a56-9e60-5a332017caca-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"0e3db829-032c-4a56-9e60-5a332017caca\") " pod="openstack/ceilometer-0" Nov 27 16:23:00 crc kubenswrapper[4707]: I1127 16:23:00.421553 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vjk68\" (UniqueName: \"kubernetes.io/projected/0e3db829-032c-4a56-9e60-5a332017caca-kube-api-access-vjk68\") pod \"ceilometer-0\" (UID: \"0e3db829-032c-4a56-9e60-5a332017caca\") " 
pod="openstack/ceilometer-0" Nov 27 16:23:00 crc kubenswrapper[4707]: I1127 16:23:00.421585 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0e3db829-032c-4a56-9e60-5a332017caca-log-httpd\") pod \"ceilometer-0\" (UID: \"0e3db829-032c-4a56-9e60-5a332017caca\") " pod="openstack/ceilometer-0" Nov 27 16:23:00 crc kubenswrapper[4707]: I1127 16:23:00.421684 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/0e3db829-032c-4a56-9e60-5a332017caca-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"0e3db829-032c-4a56-9e60-5a332017caca\") " pod="openstack/ceilometer-0" Nov 27 16:23:00 crc kubenswrapper[4707]: I1127 16:23:00.421897 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0e3db829-032c-4a56-9e60-5a332017caca-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"0e3db829-032c-4a56-9e60-5a332017caca\") " pod="openstack/ceilometer-0" Nov 27 16:23:00 crc kubenswrapper[4707]: I1127 16:23:00.421984 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0e3db829-032c-4a56-9e60-5a332017caca-config-data\") pod \"ceilometer-0\" (UID: \"0e3db829-032c-4a56-9e60-5a332017caca\") " pod="openstack/ceilometer-0" Nov 27 16:23:00 crc kubenswrapper[4707]: I1127 16:23:00.422105 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0e3db829-032c-4a56-9e60-5a332017caca-run-httpd\") pod \"ceilometer-0\" (UID: \"0e3db829-032c-4a56-9e60-5a332017caca\") " pod="openstack/ceilometer-0" Nov 27 16:23:00 crc kubenswrapper[4707]: I1127 16:23:00.521740 4707 prober.go:107] "Probe failed" 
probeType="Startup" pod="openstack/nova-api-0" podUID="7f1272cc-2e78-4c7e-abf5-56c30b17a717" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.200:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Nov 27 16:23:00 crc kubenswrapper[4707]: I1127 16:23:00.528339 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0e3db829-032c-4a56-9e60-5a332017caca-scripts\") pod \"ceilometer-0\" (UID: \"0e3db829-032c-4a56-9e60-5a332017caca\") " pod="openstack/ceilometer-0" Nov 27 16:23:00 crc kubenswrapper[4707]: I1127 16:23:00.528526 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/0e3db829-032c-4a56-9e60-5a332017caca-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"0e3db829-032c-4a56-9e60-5a332017caca\") " pod="openstack/ceilometer-0" Nov 27 16:23:00 crc kubenswrapper[4707]: I1127 16:23:00.528582 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vjk68\" (UniqueName: \"kubernetes.io/projected/0e3db829-032c-4a56-9e60-5a332017caca-kube-api-access-vjk68\") pod \"ceilometer-0\" (UID: \"0e3db829-032c-4a56-9e60-5a332017caca\") " pod="openstack/ceilometer-0" Nov 27 16:23:00 crc kubenswrapper[4707]: I1127 16:23:00.528655 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0e3db829-032c-4a56-9e60-5a332017caca-log-httpd\") pod \"ceilometer-0\" (UID: \"0e3db829-032c-4a56-9e60-5a332017caca\") " pod="openstack/ceilometer-0" Nov 27 16:23:00 crc kubenswrapper[4707]: I1127 16:23:00.528693 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/0e3db829-032c-4a56-9e60-5a332017caca-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: 
\"0e3db829-032c-4a56-9e60-5a332017caca\") " pod="openstack/ceilometer-0" Nov 27 16:23:00 crc kubenswrapper[4707]: I1127 16:23:00.528855 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0e3db829-032c-4a56-9e60-5a332017caca-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"0e3db829-032c-4a56-9e60-5a332017caca\") " pod="openstack/ceilometer-0" Nov 27 16:23:00 crc kubenswrapper[4707]: I1127 16:23:00.528956 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0e3db829-032c-4a56-9e60-5a332017caca-config-data\") pod \"ceilometer-0\" (UID: \"0e3db829-032c-4a56-9e60-5a332017caca\") " pod="openstack/ceilometer-0" Nov 27 16:23:00 crc kubenswrapper[4707]: I1127 16:23:00.528983 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0e3db829-032c-4a56-9e60-5a332017caca-run-httpd\") pod \"ceilometer-0\" (UID: \"0e3db829-032c-4a56-9e60-5a332017caca\") " pod="openstack/ceilometer-0" Nov 27 16:23:00 crc kubenswrapper[4707]: I1127 16:23:00.529574 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0e3db829-032c-4a56-9e60-5a332017caca-log-httpd\") pod \"ceilometer-0\" (UID: \"0e3db829-032c-4a56-9e60-5a332017caca\") " pod="openstack/ceilometer-0" Nov 27 16:23:00 crc kubenswrapper[4707]: I1127 16:23:00.529596 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0e3db829-032c-4a56-9e60-5a332017caca-run-httpd\") pod \"ceilometer-0\" (UID: \"0e3db829-032c-4a56-9e60-5a332017caca\") " pod="openstack/ceilometer-0" Nov 27 16:23:00 crc kubenswrapper[4707]: I1127 16:23:00.541232 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/0e3db829-032c-4a56-9e60-5a332017caca-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"0e3db829-032c-4a56-9e60-5a332017caca\") " pod="openstack/ceilometer-0" Nov 27 16:23:00 crc kubenswrapper[4707]: I1127 16:23:00.541467 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0e3db829-032c-4a56-9e60-5a332017caca-scripts\") pod \"ceilometer-0\" (UID: \"0e3db829-032c-4a56-9e60-5a332017caca\") " pod="openstack/ceilometer-0" Nov 27 16:23:00 crc kubenswrapper[4707]: I1127 16:23:00.541460 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/0e3db829-032c-4a56-9e60-5a332017caca-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"0e3db829-032c-4a56-9e60-5a332017caca\") " pod="openstack/ceilometer-0" Nov 27 16:23:00 crc kubenswrapper[4707]: I1127 16:23:00.541807 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0e3db829-032c-4a56-9e60-5a332017caca-config-data\") pod \"ceilometer-0\" (UID: \"0e3db829-032c-4a56-9e60-5a332017caca\") " pod="openstack/ceilometer-0" Nov 27 16:23:00 crc kubenswrapper[4707]: I1127 16:23:00.543107 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0e3db829-032c-4a56-9e60-5a332017caca-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"0e3db829-032c-4a56-9e60-5a332017caca\") " pod="openstack/ceilometer-0" Nov 27 16:23:00 crc kubenswrapper[4707]: I1127 16:23:00.547040 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vjk68\" (UniqueName: \"kubernetes.io/projected/0e3db829-032c-4a56-9e60-5a332017caca-kube-api-access-vjk68\") pod \"ceilometer-0\" (UID: \"0e3db829-032c-4a56-9e60-5a332017caca\") " pod="openstack/ceilometer-0" Nov 27 16:23:00 crc kubenswrapper[4707]: I1127 16:23:00.563588 4707 
prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="7f1272cc-2e78-4c7e-abf5-56c30b17a717" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.200:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Nov 27 16:23:00 crc kubenswrapper[4707]: I1127 16:23:00.607858 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Nov 27 16:23:01 crc kubenswrapper[4707]: I1127 16:23:01.104857 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Nov 27 16:23:01 crc kubenswrapper[4707]: I1127 16:23:01.206808 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f3677d27-3ac0-4b22-a3c0-ffe82b37b437" path="/var/lib/kubelet/pods/f3677d27-3ac0-4b22-a3c0-ffe82b37b437/volumes" Nov 27 16:23:01 crc kubenswrapper[4707]: I1127 16:23:01.213915 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0e3db829-032c-4a56-9e60-5a332017caca","Type":"ContainerStarted","Data":"20e9c16484deaecf0a42ea847e69aef1e03e1e05d990a18867a56ada17a40ee8"} Nov 27 16:23:03 crc kubenswrapper[4707]: I1127 16:23:03.268304 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0e3db829-032c-4a56-9e60-5a332017caca","Type":"ContainerStarted","Data":"55d37a8b3b4be70ef636c65913b4849cb19adb2c17d55c0a42e840cea7e1945a"} Nov 27 16:23:03 crc kubenswrapper[4707]: I1127 16:23:03.268858 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0e3db829-032c-4a56-9e60-5a332017caca","Type":"ContainerStarted","Data":"28634932b66e265c49c1508d9a23e640e66c7a7ab04ca772f655fccd3d803a0e"} Nov 27 16:23:03 crc kubenswrapper[4707]: I1127 16:23:03.623469 4707 patch_prober.go:28] interesting pod/machine-config-daemon-c995m container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure 
output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 27 16:23:03 crc kubenswrapper[4707]: I1127 16:23:03.623531 4707 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-c995m" podUID="a83beb0d-8dd1-434a-ace2-933f98e3956f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 27 16:23:03 crc kubenswrapper[4707]: I1127 16:23:03.811569 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Nov 27 16:23:03 crc kubenswrapper[4707]: I1127 16:23:03.812002 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Nov 27 16:23:03 crc kubenswrapper[4707]: I1127 16:23:03.820643 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Nov 27 16:23:04 crc kubenswrapper[4707]: I1127 16:23:04.281529 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0e3db829-032c-4a56-9e60-5a332017caca","Type":"ContainerStarted","Data":"8c72ad460900530242585d00f443a8e0435f503ecd19c89e3b5d7ffeca2d08ae"} Nov 27 16:23:04 crc kubenswrapper[4707]: I1127 16:23:04.287445 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Nov 27 16:23:06 crc kubenswrapper[4707]: E1127 16:23:06.141958 4707 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf3677d27_3ac0_4b22_a3c0_ffe82b37b437.slice/crio-conmon-084cc809119de06cc4756181a129e012ab9a699af74b7760e9de52913ee9d79a.scope\": RecentStats: unable to find data in memory cache], 
[\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf3677d27_3ac0_4b22_a3c0_ffe82b37b437.slice/crio-54b0d26da6d3938b0707501bba757eec138a57173cbf49ddd52608443a2b6d6b.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf3677d27_3ac0_4b22_a3c0_ffe82b37b437.slice/crio-conmon-5fc9393f0398128968478a010f734a44a7d80a0a41530387e666885746655785.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf3677d27_3ac0_4b22_a3c0_ffe82b37b437.slice/crio-conmon-54b0d26da6d3938b0707501bba757eec138a57173cbf49ddd52608443a2b6d6b.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf3677d27_3ac0_4b22_a3c0_ffe82b37b437.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf3677d27_3ac0_4b22_a3c0_ffe82b37b437.slice/crio-b5aa0adfaeb0ef8de3a51e09e850ef3abd88bbc443f4ec114f5250886a209ef6\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf3677d27_3ac0_4b22_a3c0_ffe82b37b437.slice/crio-084cc809119de06cc4756181a129e012ab9a699af74b7760e9de52913ee9d79a.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podeb68c751_abf5_49ac_985b_e0e9c3a7493f.slice/crio-conmon-39c7cd8971622424854d0f1fb1a723fac75d832275310708817d4bc2f00c3445.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc280ae8f_cb0b_4035_a7a2_fedfb044a6f0.slice/crio-54db668a5b818364925e4a6f0b61e5c19e947ee3510c3469d13f649fc2608566\": RecentStats: unable to find data in memory cache], 
[\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf3677d27_3ac0_4b22_a3c0_ffe82b37b437.slice/crio-c5a304ab8003e81706e7b2bc635b4f2adaa9b26594b89f6545d1197761735b7a.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf3677d27_3ac0_4b22_a3c0_ffe82b37b437.slice/crio-5fc9393f0398128968478a010f734a44a7d80a0a41530387e666885746655785.scope\": RecentStats: unable to find data in memory cache]" Nov 27 16:23:06 crc kubenswrapper[4707]: I1127 16:23:06.268091 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Nov 27 16:23:06 crc kubenswrapper[4707]: I1127 16:23:06.306213 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0e3db829-032c-4a56-9e60-5a332017caca","Type":"ContainerStarted","Data":"feedec96cb5bdd33a7b85e1368f5179496601df842e20d24a395d0f8f25efe36"} Nov 27 16:23:06 crc kubenswrapper[4707]: I1127 16:23:06.306328 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Nov 27 16:23:06 crc kubenswrapper[4707]: I1127 16:23:06.313616 4707 generic.go:334] "Generic (PLEG): container finished" podID="eb68c751-abf5-49ac-985b-e0e9c3a7493f" containerID="39c7cd8971622424854d0f1fb1a723fac75d832275310708817d4bc2f00c3445" exitCode=137 Nov 27 16:23:06 crc kubenswrapper[4707]: I1127 16:23:06.313646 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"eb68c751-abf5-49ac-985b-e0e9c3a7493f","Type":"ContainerDied","Data":"39c7cd8971622424854d0f1fb1a723fac75d832275310708817d4bc2f00c3445"} Nov 27 16:23:06 crc kubenswrapper[4707]: I1127 16:23:06.313673 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"eb68c751-abf5-49ac-985b-e0e9c3a7493f","Type":"ContainerDied","Data":"8a749f1aee0961603663152a93d69884067112ada39ab9fc4d50c8c071cf3cfe"} 
Nov 27 16:23:06 crc kubenswrapper[4707]: I1127 16:23:06.313689 4707 scope.go:117] "RemoveContainer" containerID="39c7cd8971622424854d0f1fb1a723fac75d832275310708817d4bc2f00c3445" Nov 27 16:23:06 crc kubenswrapper[4707]: I1127 16:23:06.313685 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Nov 27 16:23:06 crc kubenswrapper[4707]: I1127 16:23:06.335757 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.585697289 podStartE2EDuration="6.33573888s" podCreationTimestamp="2025-11-27 16:23:00 +0000 UTC" firstStartedPulling="2025-11-27 16:23:01.124920448 +0000 UTC m=+1156.756369216" lastFinishedPulling="2025-11-27 16:23:05.874962019 +0000 UTC m=+1161.506410807" observedRunningTime="2025-11-27 16:23:06.327736647 +0000 UTC m=+1161.959185415" watchObservedRunningTime="2025-11-27 16:23:06.33573888 +0000 UTC m=+1161.967187648" Nov 27 16:23:06 crc kubenswrapper[4707]: I1127 16:23:06.343784 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xmwmj\" (UniqueName: \"kubernetes.io/projected/eb68c751-abf5-49ac-985b-e0e9c3a7493f-kube-api-access-xmwmj\") pod \"eb68c751-abf5-49ac-985b-e0e9c3a7493f\" (UID: \"eb68c751-abf5-49ac-985b-e0e9c3a7493f\") " Nov 27 16:23:06 crc kubenswrapper[4707]: I1127 16:23:06.343947 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eb68c751-abf5-49ac-985b-e0e9c3a7493f-combined-ca-bundle\") pod \"eb68c751-abf5-49ac-985b-e0e9c3a7493f\" (UID: \"eb68c751-abf5-49ac-985b-e0e9c3a7493f\") " Nov 27 16:23:06 crc kubenswrapper[4707]: I1127 16:23:06.344107 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eb68c751-abf5-49ac-985b-e0e9c3a7493f-config-data\") pod \"eb68c751-abf5-49ac-985b-e0e9c3a7493f\" (UID: 
\"eb68c751-abf5-49ac-985b-e0e9c3a7493f\") " Nov 27 16:23:06 crc kubenswrapper[4707]: I1127 16:23:06.345119 4707 scope.go:117] "RemoveContainer" containerID="39c7cd8971622424854d0f1fb1a723fac75d832275310708817d4bc2f00c3445" Nov 27 16:23:06 crc kubenswrapper[4707]: E1127 16:23:06.349519 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"39c7cd8971622424854d0f1fb1a723fac75d832275310708817d4bc2f00c3445\": container with ID starting with 39c7cd8971622424854d0f1fb1a723fac75d832275310708817d4bc2f00c3445 not found: ID does not exist" containerID="39c7cd8971622424854d0f1fb1a723fac75d832275310708817d4bc2f00c3445" Nov 27 16:23:06 crc kubenswrapper[4707]: I1127 16:23:06.349557 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"39c7cd8971622424854d0f1fb1a723fac75d832275310708817d4bc2f00c3445"} err="failed to get container status \"39c7cd8971622424854d0f1fb1a723fac75d832275310708817d4bc2f00c3445\": rpc error: code = NotFound desc = could not find container \"39c7cd8971622424854d0f1fb1a723fac75d832275310708817d4bc2f00c3445\": container with ID starting with 39c7cd8971622424854d0f1fb1a723fac75d832275310708817d4bc2f00c3445 not found: ID does not exist" Nov 27 16:23:06 crc kubenswrapper[4707]: I1127 16:23:06.349530 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eb68c751-abf5-49ac-985b-e0e9c3a7493f-kube-api-access-xmwmj" (OuterVolumeSpecName: "kube-api-access-xmwmj") pod "eb68c751-abf5-49ac-985b-e0e9c3a7493f" (UID: "eb68c751-abf5-49ac-985b-e0e9c3a7493f"). InnerVolumeSpecName "kube-api-access-xmwmj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 16:23:06 crc kubenswrapper[4707]: I1127 16:23:06.368456 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eb68c751-abf5-49ac-985b-e0e9c3a7493f-config-data" (OuterVolumeSpecName: "config-data") pod "eb68c751-abf5-49ac-985b-e0e9c3a7493f" (UID: "eb68c751-abf5-49ac-985b-e0e9c3a7493f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 16:23:06 crc kubenswrapper[4707]: I1127 16:23:06.370119 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eb68c751-abf5-49ac-985b-e0e9c3a7493f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "eb68c751-abf5-49ac-985b-e0e9c3a7493f" (UID: "eb68c751-abf5-49ac-985b-e0e9c3a7493f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 16:23:06 crc kubenswrapper[4707]: I1127 16:23:06.446643 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eb68c751-abf5-49ac-985b-e0e9c3a7493f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 27 16:23:06 crc kubenswrapper[4707]: I1127 16:23:06.446871 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eb68c751-abf5-49ac-985b-e0e9c3a7493f-config-data\") on node \"crc\" DevicePath \"\"" Nov 27 16:23:06 crc kubenswrapper[4707]: I1127 16:23:06.446881 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xmwmj\" (UniqueName: \"kubernetes.io/projected/eb68c751-abf5-49ac-985b-e0e9c3a7493f-kube-api-access-xmwmj\") on node \"crc\" DevicePath \"\"" Nov 27 16:23:06 crc kubenswrapper[4707]: I1127 16:23:06.503903 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Nov 27 16:23:06 crc kubenswrapper[4707]: I1127 16:23:06.646435 4707 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Nov 27 16:23:06 crc kubenswrapper[4707]: I1127 16:23:06.649685 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Nov 27 16:23:06 crc kubenswrapper[4707]: I1127 16:23:06.667417 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Nov 27 16:23:06 crc kubenswrapper[4707]: E1127 16:23:06.667842 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eb68c751-abf5-49ac-985b-e0e9c3a7493f" containerName="nova-cell1-novncproxy-novncproxy" Nov 27 16:23:06 crc kubenswrapper[4707]: I1127 16:23:06.667859 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="eb68c751-abf5-49ac-985b-e0e9c3a7493f" containerName="nova-cell1-novncproxy-novncproxy" Nov 27 16:23:06 crc kubenswrapper[4707]: I1127 16:23:06.668042 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="eb68c751-abf5-49ac-985b-e0e9c3a7493f" containerName="nova-cell1-novncproxy-novncproxy" Nov 27 16:23:06 crc kubenswrapper[4707]: I1127 16:23:06.668685 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Nov 27 16:23:06 crc kubenswrapper[4707]: I1127 16:23:06.670600 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-public-svc" Nov 27 16:23:06 crc kubenswrapper[4707]: I1127 16:23:06.672610 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-vencrypt" Nov 27 16:23:06 crc kubenswrapper[4707]: I1127 16:23:06.672718 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Nov 27 16:23:06 crc kubenswrapper[4707]: I1127 16:23:06.685235 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Nov 27 16:23:06 crc kubenswrapper[4707]: I1127 16:23:06.751998 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bd58d9e8-77d0-412d-b866-c10f989dc824-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"bd58d9e8-77d0-412d-b866-c10f989dc824\") " pod="openstack/nova-cell1-novncproxy-0" Nov 27 16:23:06 crc kubenswrapper[4707]: I1127 16:23:06.752441 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/bd58d9e8-77d0-412d-b866-c10f989dc824-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"bd58d9e8-77d0-412d-b866-c10f989dc824\") " pod="openstack/nova-cell1-novncproxy-0" Nov 27 16:23:06 crc kubenswrapper[4707]: I1127 16:23:06.752733 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7rrms\" (UniqueName: \"kubernetes.io/projected/bd58d9e8-77d0-412d-b866-c10f989dc824-kube-api-access-7rrms\") pod \"nova-cell1-novncproxy-0\" (UID: \"bd58d9e8-77d0-412d-b866-c10f989dc824\") " pod="openstack/nova-cell1-novncproxy-0" Nov 27 16:23:06 crc 
kubenswrapper[4707]: I1127 16:23:06.752867 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bd58d9e8-77d0-412d-b866-c10f989dc824-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"bd58d9e8-77d0-412d-b866-c10f989dc824\") " pod="openstack/nova-cell1-novncproxy-0" Nov 27 16:23:06 crc kubenswrapper[4707]: I1127 16:23:06.752989 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/bd58d9e8-77d0-412d-b866-c10f989dc824-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"bd58d9e8-77d0-412d-b866-c10f989dc824\") " pod="openstack/nova-cell1-novncproxy-0" Nov 27 16:23:06 crc kubenswrapper[4707]: I1127 16:23:06.854785 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/bd58d9e8-77d0-412d-b866-c10f989dc824-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"bd58d9e8-77d0-412d-b866-c10f989dc824\") " pod="openstack/nova-cell1-novncproxy-0" Nov 27 16:23:06 crc kubenswrapper[4707]: I1127 16:23:06.854849 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7rrms\" (UniqueName: \"kubernetes.io/projected/bd58d9e8-77d0-412d-b866-c10f989dc824-kube-api-access-7rrms\") pod \"nova-cell1-novncproxy-0\" (UID: \"bd58d9e8-77d0-412d-b866-c10f989dc824\") " pod="openstack/nova-cell1-novncproxy-0" Nov 27 16:23:06 crc kubenswrapper[4707]: I1127 16:23:06.854910 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bd58d9e8-77d0-412d-b866-c10f989dc824-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"bd58d9e8-77d0-412d-b866-c10f989dc824\") " pod="openstack/nova-cell1-novncproxy-0" Nov 27 16:23:06 crc 
kubenswrapper[4707]: I1127 16:23:06.854956 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/bd58d9e8-77d0-412d-b866-c10f989dc824-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"bd58d9e8-77d0-412d-b866-c10f989dc824\") " pod="openstack/nova-cell1-novncproxy-0" Nov 27 16:23:06 crc kubenswrapper[4707]: I1127 16:23:06.854990 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bd58d9e8-77d0-412d-b866-c10f989dc824-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"bd58d9e8-77d0-412d-b866-c10f989dc824\") " pod="openstack/nova-cell1-novncproxy-0" Nov 27 16:23:06 crc kubenswrapper[4707]: I1127 16:23:06.861544 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bd58d9e8-77d0-412d-b866-c10f989dc824-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"bd58d9e8-77d0-412d-b866-c10f989dc824\") " pod="openstack/nova-cell1-novncproxy-0" Nov 27 16:23:06 crc kubenswrapper[4707]: I1127 16:23:06.864629 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/bd58d9e8-77d0-412d-b866-c10f989dc824-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"bd58d9e8-77d0-412d-b866-c10f989dc824\") " pod="openstack/nova-cell1-novncproxy-0" Nov 27 16:23:06 crc kubenswrapper[4707]: I1127 16:23:06.865849 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/bd58d9e8-77d0-412d-b866-c10f989dc824-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"bd58d9e8-77d0-412d-b866-c10f989dc824\") " pod="openstack/nova-cell1-novncproxy-0" Nov 27 16:23:06 crc kubenswrapper[4707]: I1127 16:23:06.866316 4707 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bd58d9e8-77d0-412d-b866-c10f989dc824-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"bd58d9e8-77d0-412d-b866-c10f989dc824\") " pod="openstack/nova-cell1-novncproxy-0" Nov 27 16:23:06 crc kubenswrapper[4707]: I1127 16:23:06.876273 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7rrms\" (UniqueName: \"kubernetes.io/projected/bd58d9e8-77d0-412d-b866-c10f989dc824-kube-api-access-7rrms\") pod \"nova-cell1-novncproxy-0\" (UID: \"bd58d9e8-77d0-412d-b866-c10f989dc824\") " pod="openstack/nova-cell1-novncproxy-0" Nov 27 16:23:06 crc kubenswrapper[4707]: I1127 16:23:06.990895 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Nov 27 16:23:07 crc kubenswrapper[4707]: I1127 16:23:07.210012 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eb68c751-abf5-49ac-985b-e0e9c3a7493f" path="/var/lib/kubelet/pods/eb68c751-abf5-49ac-985b-e0e9c3a7493f/volumes" Nov 27 16:23:07 crc kubenswrapper[4707]: I1127 16:23:07.500257 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Nov 27 16:23:07 crc kubenswrapper[4707]: W1127 16:23:07.502486 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbd58d9e8_77d0_412d_b866_c10f989dc824.slice/crio-41a8d2c136e643029fa9dd5340f9618cd129c1ca9f840a0e8b64f247c9806b64 WatchSource:0}: Error finding container 41a8d2c136e643029fa9dd5340f9618cd129c1ca9f840a0e8b64f247c9806b64: Status 404 returned error can't find the container with id 41a8d2c136e643029fa9dd5340f9618cd129c1ca9f840a0e8b64f247c9806b64 Nov 27 16:23:08 crc kubenswrapper[4707]: I1127 16:23:08.335933 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" 
event={"ID":"bd58d9e8-77d0-412d-b866-c10f989dc824","Type":"ContainerStarted","Data":"03f69499a1c8f3332e15008d2e91d57033f23c12d498e83f3ff396cf63e2457c"} Nov 27 16:23:08 crc kubenswrapper[4707]: I1127 16:23:08.336204 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"bd58d9e8-77d0-412d-b866-c10f989dc824","Type":"ContainerStarted","Data":"41a8d2c136e643029fa9dd5340f9618cd129c1ca9f840a0e8b64f247c9806b64"} Nov 27 16:23:08 crc kubenswrapper[4707]: I1127 16:23:08.359800 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.359775755 podStartE2EDuration="2.359775755s" podCreationTimestamp="2025-11-27 16:23:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-27 16:23:08.350071349 +0000 UTC m=+1163.981520147" watchObservedRunningTime="2025-11-27 16:23:08.359775755 +0000 UTC m=+1163.991224563" Nov 27 16:23:09 crc kubenswrapper[4707]: I1127 16:23:09.488113 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Nov 27 16:23:09 crc kubenswrapper[4707]: I1127 16:23:09.488729 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Nov 27 16:23:09 crc kubenswrapper[4707]: I1127 16:23:09.490718 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Nov 27 16:23:09 crc kubenswrapper[4707]: I1127 16:23:09.494258 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Nov 27 16:23:10 crc kubenswrapper[4707]: I1127 16:23:10.361225 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Nov 27 16:23:10 crc kubenswrapper[4707]: I1127 16:23:10.367266 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack/nova-api-0" Nov 27 16:23:10 crc kubenswrapper[4707]: I1127 16:23:10.568321 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6b7bbf7cf9-hmvth"] Nov 27 16:23:10 crc kubenswrapper[4707]: I1127 16:23:10.570249 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6b7bbf7cf9-hmvth" Nov 27 16:23:10 crc kubenswrapper[4707]: I1127 16:23:10.581291 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6b7bbf7cf9-hmvth"] Nov 27 16:23:10 crc kubenswrapper[4707]: I1127 16:23:10.635155 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pndvt\" (UniqueName: \"kubernetes.io/projected/bc688de8-7455-42c0-94e5-caada732ba70-kube-api-access-pndvt\") pod \"dnsmasq-dns-6b7bbf7cf9-hmvth\" (UID: \"bc688de8-7455-42c0-94e5-caada732ba70\") " pod="openstack/dnsmasq-dns-6b7bbf7cf9-hmvth" Nov 27 16:23:10 crc kubenswrapper[4707]: I1127 16:23:10.635202 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bc688de8-7455-42c0-94e5-caada732ba70-dns-svc\") pod \"dnsmasq-dns-6b7bbf7cf9-hmvth\" (UID: \"bc688de8-7455-42c0-94e5-caada732ba70\") " pod="openstack/dnsmasq-dns-6b7bbf7cf9-hmvth" Nov 27 16:23:10 crc kubenswrapper[4707]: I1127 16:23:10.635237 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bc688de8-7455-42c0-94e5-caada732ba70-config\") pod \"dnsmasq-dns-6b7bbf7cf9-hmvth\" (UID: \"bc688de8-7455-42c0-94e5-caada732ba70\") " pod="openstack/dnsmasq-dns-6b7bbf7cf9-hmvth" Nov 27 16:23:10 crc kubenswrapper[4707]: I1127 16:23:10.635733 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/bc688de8-7455-42c0-94e5-caada732ba70-ovsdbserver-nb\") pod \"dnsmasq-dns-6b7bbf7cf9-hmvth\" (UID: \"bc688de8-7455-42c0-94e5-caada732ba70\") " pod="openstack/dnsmasq-dns-6b7bbf7cf9-hmvth" Nov 27 16:23:10 crc kubenswrapper[4707]: I1127 16:23:10.635934 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bc688de8-7455-42c0-94e5-caada732ba70-ovsdbserver-sb\") pod \"dnsmasq-dns-6b7bbf7cf9-hmvth\" (UID: \"bc688de8-7455-42c0-94e5-caada732ba70\") " pod="openstack/dnsmasq-dns-6b7bbf7cf9-hmvth" Nov 27 16:23:10 crc kubenswrapper[4707]: I1127 16:23:10.635978 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/bc688de8-7455-42c0-94e5-caada732ba70-dns-swift-storage-0\") pod \"dnsmasq-dns-6b7bbf7cf9-hmvth\" (UID: \"bc688de8-7455-42c0-94e5-caada732ba70\") " pod="openstack/dnsmasq-dns-6b7bbf7cf9-hmvth" Nov 27 16:23:10 crc kubenswrapper[4707]: I1127 16:23:10.737897 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bc688de8-7455-42c0-94e5-caada732ba70-ovsdbserver-nb\") pod \"dnsmasq-dns-6b7bbf7cf9-hmvth\" (UID: \"bc688de8-7455-42c0-94e5-caada732ba70\") " pod="openstack/dnsmasq-dns-6b7bbf7cf9-hmvth" Nov 27 16:23:10 crc kubenswrapper[4707]: I1127 16:23:10.737996 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bc688de8-7455-42c0-94e5-caada732ba70-ovsdbserver-sb\") pod \"dnsmasq-dns-6b7bbf7cf9-hmvth\" (UID: \"bc688de8-7455-42c0-94e5-caada732ba70\") " pod="openstack/dnsmasq-dns-6b7bbf7cf9-hmvth" Nov 27 16:23:10 crc kubenswrapper[4707]: I1127 16:23:10.738030 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: 
\"kubernetes.io/configmap/bc688de8-7455-42c0-94e5-caada732ba70-dns-swift-storage-0\") pod \"dnsmasq-dns-6b7bbf7cf9-hmvth\" (UID: \"bc688de8-7455-42c0-94e5-caada732ba70\") " pod="openstack/dnsmasq-dns-6b7bbf7cf9-hmvth" Nov 27 16:23:10 crc kubenswrapper[4707]: I1127 16:23:10.738076 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pndvt\" (UniqueName: \"kubernetes.io/projected/bc688de8-7455-42c0-94e5-caada732ba70-kube-api-access-pndvt\") pod \"dnsmasq-dns-6b7bbf7cf9-hmvth\" (UID: \"bc688de8-7455-42c0-94e5-caada732ba70\") " pod="openstack/dnsmasq-dns-6b7bbf7cf9-hmvth" Nov 27 16:23:10 crc kubenswrapper[4707]: I1127 16:23:10.738103 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bc688de8-7455-42c0-94e5-caada732ba70-dns-svc\") pod \"dnsmasq-dns-6b7bbf7cf9-hmvth\" (UID: \"bc688de8-7455-42c0-94e5-caada732ba70\") " pod="openstack/dnsmasq-dns-6b7bbf7cf9-hmvth" Nov 27 16:23:10 crc kubenswrapper[4707]: I1127 16:23:10.738142 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bc688de8-7455-42c0-94e5-caada732ba70-config\") pod \"dnsmasq-dns-6b7bbf7cf9-hmvth\" (UID: \"bc688de8-7455-42c0-94e5-caada732ba70\") " pod="openstack/dnsmasq-dns-6b7bbf7cf9-hmvth" Nov 27 16:23:10 crc kubenswrapper[4707]: I1127 16:23:10.739171 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bc688de8-7455-42c0-94e5-caada732ba70-dns-svc\") pod \"dnsmasq-dns-6b7bbf7cf9-hmvth\" (UID: \"bc688de8-7455-42c0-94e5-caada732ba70\") " pod="openstack/dnsmasq-dns-6b7bbf7cf9-hmvth" Nov 27 16:23:10 crc kubenswrapper[4707]: I1127 16:23:10.739314 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bc688de8-7455-42c0-94e5-caada732ba70-ovsdbserver-nb\") pod 
\"dnsmasq-dns-6b7bbf7cf9-hmvth\" (UID: \"bc688de8-7455-42c0-94e5-caada732ba70\") " pod="openstack/dnsmasq-dns-6b7bbf7cf9-hmvth" Nov 27 16:23:10 crc kubenswrapper[4707]: I1127 16:23:10.741174 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bc688de8-7455-42c0-94e5-caada732ba70-ovsdbserver-sb\") pod \"dnsmasq-dns-6b7bbf7cf9-hmvth\" (UID: \"bc688de8-7455-42c0-94e5-caada732ba70\") " pod="openstack/dnsmasq-dns-6b7bbf7cf9-hmvth" Nov 27 16:23:10 crc kubenswrapper[4707]: I1127 16:23:10.741192 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bc688de8-7455-42c0-94e5-caada732ba70-config\") pod \"dnsmasq-dns-6b7bbf7cf9-hmvth\" (UID: \"bc688de8-7455-42c0-94e5-caada732ba70\") " pod="openstack/dnsmasq-dns-6b7bbf7cf9-hmvth" Nov 27 16:23:10 crc kubenswrapper[4707]: I1127 16:23:10.741511 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/bc688de8-7455-42c0-94e5-caada732ba70-dns-swift-storage-0\") pod \"dnsmasq-dns-6b7bbf7cf9-hmvth\" (UID: \"bc688de8-7455-42c0-94e5-caada732ba70\") " pod="openstack/dnsmasq-dns-6b7bbf7cf9-hmvth" Nov 27 16:23:10 crc kubenswrapper[4707]: I1127 16:23:10.783247 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pndvt\" (UniqueName: \"kubernetes.io/projected/bc688de8-7455-42c0-94e5-caada732ba70-kube-api-access-pndvt\") pod \"dnsmasq-dns-6b7bbf7cf9-hmvth\" (UID: \"bc688de8-7455-42c0-94e5-caada732ba70\") " pod="openstack/dnsmasq-dns-6b7bbf7cf9-hmvth" Nov 27 16:23:10 crc kubenswrapper[4707]: I1127 16:23:10.887848 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6b7bbf7cf9-hmvth" Nov 27 16:23:11 crc kubenswrapper[4707]: I1127 16:23:11.349259 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6b7bbf7cf9-hmvth"] Nov 27 16:23:11 crc kubenswrapper[4707]: I1127 16:23:11.375554 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b7bbf7cf9-hmvth" event={"ID":"bc688de8-7455-42c0-94e5-caada732ba70","Type":"ContainerStarted","Data":"482c1ae93b202df8cc3477b968ee316b94d576887154ff4a4ee49e2589868b42"} Nov 27 16:23:11 crc kubenswrapper[4707]: I1127 16:23:11.991954 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Nov 27 16:23:12 crc kubenswrapper[4707]: I1127 16:23:12.384297 4707 generic.go:334] "Generic (PLEG): container finished" podID="bc688de8-7455-42c0-94e5-caada732ba70" containerID="b3f6c840f5d846fd8e791f1857a616a91aa57cfc1ca9f5c2c25b810183220d56" exitCode=0 Nov 27 16:23:12 crc kubenswrapper[4707]: I1127 16:23:12.384391 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b7bbf7cf9-hmvth" event={"ID":"bc688de8-7455-42c0-94e5-caada732ba70","Type":"ContainerDied","Data":"b3f6c840f5d846fd8e791f1857a616a91aa57cfc1ca9f5c2c25b810183220d56"} Nov 27 16:23:12 crc kubenswrapper[4707]: I1127 16:23:12.767151 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Nov 27 16:23:12 crc kubenswrapper[4707]: I1127 16:23:12.767970 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="0e3db829-032c-4a56-9e60-5a332017caca" containerName="ceilometer-central-agent" containerID="cri-o://28634932b66e265c49c1508d9a23e640e66c7a7ab04ca772f655fccd3d803a0e" gracePeriod=30 Nov 27 16:23:12 crc kubenswrapper[4707]: I1127 16:23:12.768166 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" 
podUID="0e3db829-032c-4a56-9e60-5a332017caca" containerName="proxy-httpd" containerID="cri-o://feedec96cb5bdd33a7b85e1368f5179496601df842e20d24a395d0f8f25efe36" gracePeriod=30 Nov 27 16:23:12 crc kubenswrapper[4707]: I1127 16:23:12.768214 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="0e3db829-032c-4a56-9e60-5a332017caca" containerName="sg-core" containerID="cri-o://8c72ad460900530242585d00f443a8e0435f503ecd19c89e3b5d7ffeca2d08ae" gracePeriod=30 Nov 27 16:23:12 crc kubenswrapper[4707]: I1127 16:23:12.768308 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="0e3db829-032c-4a56-9e60-5a332017caca" containerName="ceilometer-notification-agent" containerID="cri-o://55d37a8b3b4be70ef636c65913b4849cb19adb2c17d55c0a42e840cea7e1945a" gracePeriod=30 Nov 27 16:23:13 crc kubenswrapper[4707]: I1127 16:23:13.398414 4707 generic.go:334] "Generic (PLEG): container finished" podID="0e3db829-032c-4a56-9e60-5a332017caca" containerID="feedec96cb5bdd33a7b85e1368f5179496601df842e20d24a395d0f8f25efe36" exitCode=0 Nov 27 16:23:13 crc kubenswrapper[4707]: I1127 16:23:13.398723 4707 generic.go:334] "Generic (PLEG): container finished" podID="0e3db829-032c-4a56-9e60-5a332017caca" containerID="8c72ad460900530242585d00f443a8e0435f503ecd19c89e3b5d7ffeca2d08ae" exitCode=2 Nov 27 16:23:13 crc kubenswrapper[4707]: I1127 16:23:13.398735 4707 generic.go:334] "Generic (PLEG): container finished" podID="0e3db829-032c-4a56-9e60-5a332017caca" containerID="28634932b66e265c49c1508d9a23e640e66c7a7ab04ca772f655fccd3d803a0e" exitCode=0 Nov 27 16:23:13 crc kubenswrapper[4707]: I1127 16:23:13.398545 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0e3db829-032c-4a56-9e60-5a332017caca","Type":"ContainerDied","Data":"feedec96cb5bdd33a7b85e1368f5179496601df842e20d24a395d0f8f25efe36"} Nov 27 16:23:13 crc kubenswrapper[4707]: I1127 
16:23:13.398798 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0e3db829-032c-4a56-9e60-5a332017caca","Type":"ContainerDied","Data":"8c72ad460900530242585d00f443a8e0435f503ecd19c89e3b5d7ffeca2d08ae"} Nov 27 16:23:13 crc kubenswrapper[4707]: I1127 16:23:13.398825 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0e3db829-032c-4a56-9e60-5a332017caca","Type":"ContainerDied","Data":"28634932b66e265c49c1508d9a23e640e66c7a7ab04ca772f655fccd3d803a0e"} Nov 27 16:23:13 crc kubenswrapper[4707]: I1127 16:23:13.400996 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b7bbf7cf9-hmvth" event={"ID":"bc688de8-7455-42c0-94e5-caada732ba70","Type":"ContainerStarted","Data":"edde7c5ce1c8c5097444a549a213ef6e054ab156c5f1996408df99f38e640eac"} Nov 27 16:23:13 crc kubenswrapper[4707]: I1127 16:23:13.401203 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6b7bbf7cf9-hmvth" Nov 27 16:23:13 crc kubenswrapper[4707]: I1127 16:23:13.432960 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6b7bbf7cf9-hmvth" podStartSLOduration=3.432933544 podStartE2EDuration="3.432933544s" podCreationTimestamp="2025-11-27 16:23:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-27 16:23:13.424725565 +0000 UTC m=+1169.056174373" watchObservedRunningTime="2025-11-27 16:23:13.432933544 +0000 UTC m=+1169.064382352" Nov 27 16:23:14 crc kubenswrapper[4707]: I1127 16:23:14.134235 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Nov 27 16:23:14 crc kubenswrapper[4707]: I1127 16:23:14.134828 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="7f1272cc-2e78-4c7e-abf5-56c30b17a717" containerName="nova-api-log" 
containerID="cri-o://8c80e27b4be2b856c64ad038bb4e813389a32360df4a1f008482ea68bf8b70ab" gracePeriod=30 Nov 27 16:23:14 crc kubenswrapper[4707]: I1127 16:23:14.134906 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="7f1272cc-2e78-4c7e-abf5-56c30b17a717" containerName="nova-api-api" containerID="cri-o://bc2e266c1a533e2d150cd20661defbc1ddcfc750d68590d297dea9099a539d06" gracePeriod=30 Nov 27 16:23:14 crc kubenswrapper[4707]: I1127 16:23:14.422398 4707 generic.go:334] "Generic (PLEG): container finished" podID="7f1272cc-2e78-4c7e-abf5-56c30b17a717" containerID="8c80e27b4be2b856c64ad038bb4e813389a32360df4a1f008482ea68bf8b70ab" exitCode=143 Nov 27 16:23:14 crc kubenswrapper[4707]: I1127 16:23:14.422477 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"7f1272cc-2e78-4c7e-abf5-56c30b17a717","Type":"ContainerDied","Data":"8c80e27b4be2b856c64ad038bb4e813389a32360df4a1f008482ea68bf8b70ab"} Nov 27 16:23:14 crc kubenswrapper[4707]: I1127 16:23:14.887466 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Nov 27 16:23:15 crc kubenswrapper[4707]: I1127 16:23:15.017905 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/0e3db829-032c-4a56-9e60-5a332017caca-sg-core-conf-yaml\") pod \"0e3db829-032c-4a56-9e60-5a332017caca\" (UID: \"0e3db829-032c-4a56-9e60-5a332017caca\") " Nov 27 16:23:15 crc kubenswrapper[4707]: I1127 16:23:15.017978 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0e3db829-032c-4a56-9e60-5a332017caca-scripts\") pod \"0e3db829-032c-4a56-9e60-5a332017caca\" (UID: \"0e3db829-032c-4a56-9e60-5a332017caca\") " Nov 27 16:23:15 crc kubenswrapper[4707]: I1127 16:23:15.018050 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vjk68\" (UniqueName: \"kubernetes.io/projected/0e3db829-032c-4a56-9e60-5a332017caca-kube-api-access-vjk68\") pod \"0e3db829-032c-4a56-9e60-5a332017caca\" (UID: \"0e3db829-032c-4a56-9e60-5a332017caca\") " Nov 27 16:23:15 crc kubenswrapper[4707]: I1127 16:23:15.018090 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/0e3db829-032c-4a56-9e60-5a332017caca-ceilometer-tls-certs\") pod \"0e3db829-032c-4a56-9e60-5a332017caca\" (UID: \"0e3db829-032c-4a56-9e60-5a332017caca\") " Nov 27 16:23:15 crc kubenswrapper[4707]: I1127 16:23:15.018153 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0e3db829-032c-4a56-9e60-5a332017caca-run-httpd\") pod \"0e3db829-032c-4a56-9e60-5a332017caca\" (UID: \"0e3db829-032c-4a56-9e60-5a332017caca\") " Nov 27 16:23:15 crc kubenswrapper[4707]: I1127 16:23:15.018183 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/0e3db829-032c-4a56-9e60-5a332017caca-log-httpd\") pod \"0e3db829-032c-4a56-9e60-5a332017caca\" (UID: \"0e3db829-032c-4a56-9e60-5a332017caca\") " Nov 27 16:23:15 crc kubenswrapper[4707]: I1127 16:23:15.018298 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0e3db829-032c-4a56-9e60-5a332017caca-combined-ca-bundle\") pod \"0e3db829-032c-4a56-9e60-5a332017caca\" (UID: \"0e3db829-032c-4a56-9e60-5a332017caca\") " Nov 27 16:23:15 crc kubenswrapper[4707]: I1127 16:23:15.018325 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0e3db829-032c-4a56-9e60-5a332017caca-config-data\") pod \"0e3db829-032c-4a56-9e60-5a332017caca\" (UID: \"0e3db829-032c-4a56-9e60-5a332017caca\") " Nov 27 16:23:15 crc kubenswrapper[4707]: I1127 16:23:15.018718 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0e3db829-032c-4a56-9e60-5a332017caca-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "0e3db829-032c-4a56-9e60-5a332017caca" (UID: "0e3db829-032c-4a56-9e60-5a332017caca"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 27 16:23:15 crc kubenswrapper[4707]: I1127 16:23:15.018824 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0e3db829-032c-4a56-9e60-5a332017caca-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "0e3db829-032c-4a56-9e60-5a332017caca" (UID: "0e3db829-032c-4a56-9e60-5a332017caca"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 27 16:23:15 crc kubenswrapper[4707]: I1127 16:23:15.023900 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0e3db829-032c-4a56-9e60-5a332017caca-kube-api-access-vjk68" (OuterVolumeSpecName: "kube-api-access-vjk68") pod "0e3db829-032c-4a56-9e60-5a332017caca" (UID: "0e3db829-032c-4a56-9e60-5a332017caca"). InnerVolumeSpecName "kube-api-access-vjk68". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 16:23:15 crc kubenswrapper[4707]: I1127 16:23:15.031161 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0e3db829-032c-4a56-9e60-5a332017caca-scripts" (OuterVolumeSpecName: "scripts") pod "0e3db829-032c-4a56-9e60-5a332017caca" (UID: "0e3db829-032c-4a56-9e60-5a332017caca"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 16:23:15 crc kubenswrapper[4707]: I1127 16:23:15.055030 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0e3db829-032c-4a56-9e60-5a332017caca-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "0e3db829-032c-4a56-9e60-5a332017caca" (UID: "0e3db829-032c-4a56-9e60-5a332017caca"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 16:23:15 crc kubenswrapper[4707]: I1127 16:23:15.087436 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0e3db829-032c-4a56-9e60-5a332017caca-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "0e3db829-032c-4a56-9e60-5a332017caca" (UID: "0e3db829-032c-4a56-9e60-5a332017caca"). InnerVolumeSpecName "ceilometer-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 16:23:15 crc kubenswrapper[4707]: I1127 16:23:15.120847 4707 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/0e3db829-032c-4a56-9e60-5a332017caca-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Nov 27 16:23:15 crc kubenswrapper[4707]: I1127 16:23:15.120873 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0e3db829-032c-4a56-9e60-5a332017caca-scripts\") on node \"crc\" DevicePath \"\"" Nov 27 16:23:15 crc kubenswrapper[4707]: I1127 16:23:15.120883 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vjk68\" (UniqueName: \"kubernetes.io/projected/0e3db829-032c-4a56-9e60-5a332017caca-kube-api-access-vjk68\") on node \"crc\" DevicePath \"\"" Nov 27 16:23:15 crc kubenswrapper[4707]: I1127 16:23:15.120895 4707 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/0e3db829-032c-4a56-9e60-5a332017caca-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Nov 27 16:23:15 crc kubenswrapper[4707]: I1127 16:23:15.120903 4707 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0e3db829-032c-4a56-9e60-5a332017caca-run-httpd\") on node \"crc\" DevicePath \"\"" Nov 27 16:23:15 crc kubenswrapper[4707]: I1127 16:23:15.120911 4707 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0e3db829-032c-4a56-9e60-5a332017caca-log-httpd\") on node \"crc\" DevicePath \"\"" Nov 27 16:23:15 crc kubenswrapper[4707]: I1127 16:23:15.136772 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0e3db829-032c-4a56-9e60-5a332017caca-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0e3db829-032c-4a56-9e60-5a332017caca" (UID: 
"0e3db829-032c-4a56-9e60-5a332017caca"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 16:23:15 crc kubenswrapper[4707]: I1127 16:23:15.167471 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0e3db829-032c-4a56-9e60-5a332017caca-config-data" (OuterVolumeSpecName: "config-data") pod "0e3db829-032c-4a56-9e60-5a332017caca" (UID: "0e3db829-032c-4a56-9e60-5a332017caca"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 16:23:15 crc kubenswrapper[4707]: I1127 16:23:15.223043 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0e3db829-032c-4a56-9e60-5a332017caca-config-data\") on node \"crc\" DevicePath \"\"" Nov 27 16:23:15 crc kubenswrapper[4707]: I1127 16:23:15.223092 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0e3db829-032c-4a56-9e60-5a332017caca-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 27 16:23:15 crc kubenswrapper[4707]: I1127 16:23:15.433285 4707 generic.go:334] "Generic (PLEG): container finished" podID="0e3db829-032c-4a56-9e60-5a332017caca" containerID="55d37a8b3b4be70ef636c65913b4849cb19adb2c17d55c0a42e840cea7e1945a" exitCode=0 Nov 27 16:23:15 crc kubenswrapper[4707]: I1127 16:23:15.433319 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0e3db829-032c-4a56-9e60-5a332017caca","Type":"ContainerDied","Data":"55d37a8b3b4be70ef636c65913b4849cb19adb2c17d55c0a42e840cea7e1945a"} Nov 27 16:23:15 crc kubenswrapper[4707]: I1127 16:23:15.433343 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0e3db829-032c-4a56-9e60-5a332017caca","Type":"ContainerDied","Data":"20e9c16484deaecf0a42ea847e69aef1e03e1e05d990a18867a56ada17a40ee8"} Nov 27 16:23:15 crc kubenswrapper[4707]: I1127 
16:23:15.433359 4707 scope.go:117] "RemoveContainer" containerID="feedec96cb5bdd33a7b85e1368f5179496601df842e20d24a395d0f8f25efe36" Nov 27 16:23:15 crc kubenswrapper[4707]: I1127 16:23:15.433499 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Nov 27 16:23:15 crc kubenswrapper[4707]: I1127 16:23:15.464044 4707 scope.go:117] "RemoveContainer" containerID="8c72ad460900530242585d00f443a8e0435f503ecd19c89e3b5d7ffeca2d08ae" Nov 27 16:23:15 crc kubenswrapper[4707]: I1127 16:23:15.474554 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Nov 27 16:23:15 crc kubenswrapper[4707]: I1127 16:23:15.499012 4707 scope.go:117] "RemoveContainer" containerID="55d37a8b3b4be70ef636c65913b4849cb19adb2c17d55c0a42e840cea7e1945a" Nov 27 16:23:15 crc kubenswrapper[4707]: I1127 16:23:15.501566 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Nov 27 16:23:15 crc kubenswrapper[4707]: I1127 16:23:15.514432 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Nov 27 16:23:15 crc kubenswrapper[4707]: E1127 16:23:15.514951 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0e3db829-032c-4a56-9e60-5a332017caca" containerName="ceilometer-notification-agent" Nov 27 16:23:15 crc kubenswrapper[4707]: I1127 16:23:15.514966 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="0e3db829-032c-4a56-9e60-5a332017caca" containerName="ceilometer-notification-agent" Nov 27 16:23:15 crc kubenswrapper[4707]: E1127 16:23:15.514983 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0e3db829-032c-4a56-9e60-5a332017caca" containerName="proxy-httpd" Nov 27 16:23:15 crc kubenswrapper[4707]: I1127 16:23:15.514989 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="0e3db829-032c-4a56-9e60-5a332017caca" containerName="proxy-httpd" Nov 27 16:23:15 crc kubenswrapper[4707]: E1127 16:23:15.515018 4707 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0e3db829-032c-4a56-9e60-5a332017caca" containerName="ceilometer-central-agent" Nov 27 16:23:15 crc kubenswrapper[4707]: I1127 16:23:15.515061 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="0e3db829-032c-4a56-9e60-5a332017caca" containerName="ceilometer-central-agent" Nov 27 16:23:15 crc kubenswrapper[4707]: E1127 16:23:15.515086 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0e3db829-032c-4a56-9e60-5a332017caca" containerName="sg-core" Nov 27 16:23:15 crc kubenswrapper[4707]: I1127 16:23:15.515091 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="0e3db829-032c-4a56-9e60-5a332017caca" containerName="sg-core" Nov 27 16:23:15 crc kubenswrapper[4707]: I1127 16:23:15.515273 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="0e3db829-032c-4a56-9e60-5a332017caca" containerName="proxy-httpd" Nov 27 16:23:15 crc kubenswrapper[4707]: I1127 16:23:15.515289 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="0e3db829-032c-4a56-9e60-5a332017caca" containerName="sg-core" Nov 27 16:23:15 crc kubenswrapper[4707]: I1127 16:23:15.515300 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="0e3db829-032c-4a56-9e60-5a332017caca" containerName="ceilometer-notification-agent" Nov 27 16:23:15 crc kubenswrapper[4707]: I1127 16:23:15.515314 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="0e3db829-032c-4a56-9e60-5a332017caca" containerName="ceilometer-central-agent" Nov 27 16:23:15 crc kubenswrapper[4707]: I1127 16:23:15.522055 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Nov 27 16:23:15 crc kubenswrapper[4707]: I1127 16:23:15.526542 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Nov 27 16:23:15 crc kubenswrapper[4707]: I1127 16:23:15.526629 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Nov 27 16:23:15 crc kubenswrapper[4707]: I1127 16:23:15.526700 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Nov 27 16:23:15 crc kubenswrapper[4707]: I1127 16:23:15.526843 4707 scope.go:117] "RemoveContainer" containerID="28634932b66e265c49c1508d9a23e640e66c7a7ab04ca772f655fccd3d803a0e" Nov 27 16:23:15 crc kubenswrapper[4707]: I1127 16:23:15.531513 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Nov 27 16:23:15 crc kubenswrapper[4707]: I1127 16:23:15.548437 4707 scope.go:117] "RemoveContainer" containerID="feedec96cb5bdd33a7b85e1368f5179496601df842e20d24a395d0f8f25efe36" Nov 27 16:23:15 crc kubenswrapper[4707]: E1127 16:23:15.551144 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"feedec96cb5bdd33a7b85e1368f5179496601df842e20d24a395d0f8f25efe36\": container with ID starting with feedec96cb5bdd33a7b85e1368f5179496601df842e20d24a395d0f8f25efe36 not found: ID does not exist" containerID="feedec96cb5bdd33a7b85e1368f5179496601df842e20d24a395d0f8f25efe36" Nov 27 16:23:15 crc kubenswrapper[4707]: I1127 16:23:15.551211 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"feedec96cb5bdd33a7b85e1368f5179496601df842e20d24a395d0f8f25efe36"} err="failed to get container status \"feedec96cb5bdd33a7b85e1368f5179496601df842e20d24a395d0f8f25efe36\": rpc error: code = NotFound desc = could not find container \"feedec96cb5bdd33a7b85e1368f5179496601df842e20d24a395d0f8f25efe36\": 
container with ID starting with feedec96cb5bdd33a7b85e1368f5179496601df842e20d24a395d0f8f25efe36 not found: ID does not exist" Nov 27 16:23:15 crc kubenswrapper[4707]: I1127 16:23:15.551250 4707 scope.go:117] "RemoveContainer" containerID="8c72ad460900530242585d00f443a8e0435f503ecd19c89e3b5d7ffeca2d08ae" Nov 27 16:23:15 crc kubenswrapper[4707]: E1127 16:23:15.551902 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8c72ad460900530242585d00f443a8e0435f503ecd19c89e3b5d7ffeca2d08ae\": container with ID starting with 8c72ad460900530242585d00f443a8e0435f503ecd19c89e3b5d7ffeca2d08ae not found: ID does not exist" containerID="8c72ad460900530242585d00f443a8e0435f503ecd19c89e3b5d7ffeca2d08ae" Nov 27 16:23:15 crc kubenswrapper[4707]: I1127 16:23:15.551933 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8c72ad460900530242585d00f443a8e0435f503ecd19c89e3b5d7ffeca2d08ae"} err="failed to get container status \"8c72ad460900530242585d00f443a8e0435f503ecd19c89e3b5d7ffeca2d08ae\": rpc error: code = NotFound desc = could not find container \"8c72ad460900530242585d00f443a8e0435f503ecd19c89e3b5d7ffeca2d08ae\": container with ID starting with 8c72ad460900530242585d00f443a8e0435f503ecd19c89e3b5d7ffeca2d08ae not found: ID does not exist" Nov 27 16:23:15 crc kubenswrapper[4707]: I1127 16:23:15.551960 4707 scope.go:117] "RemoveContainer" containerID="55d37a8b3b4be70ef636c65913b4849cb19adb2c17d55c0a42e840cea7e1945a" Nov 27 16:23:15 crc kubenswrapper[4707]: E1127 16:23:15.552347 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"55d37a8b3b4be70ef636c65913b4849cb19adb2c17d55c0a42e840cea7e1945a\": container with ID starting with 55d37a8b3b4be70ef636c65913b4849cb19adb2c17d55c0a42e840cea7e1945a not found: ID does not exist" 
containerID="55d37a8b3b4be70ef636c65913b4849cb19adb2c17d55c0a42e840cea7e1945a" Nov 27 16:23:15 crc kubenswrapper[4707]: I1127 16:23:15.552448 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"55d37a8b3b4be70ef636c65913b4849cb19adb2c17d55c0a42e840cea7e1945a"} err="failed to get container status \"55d37a8b3b4be70ef636c65913b4849cb19adb2c17d55c0a42e840cea7e1945a\": rpc error: code = NotFound desc = could not find container \"55d37a8b3b4be70ef636c65913b4849cb19adb2c17d55c0a42e840cea7e1945a\": container with ID starting with 55d37a8b3b4be70ef636c65913b4849cb19adb2c17d55c0a42e840cea7e1945a not found: ID does not exist" Nov 27 16:23:15 crc kubenswrapper[4707]: I1127 16:23:15.552476 4707 scope.go:117] "RemoveContainer" containerID="28634932b66e265c49c1508d9a23e640e66c7a7ab04ca772f655fccd3d803a0e" Nov 27 16:23:15 crc kubenswrapper[4707]: E1127 16:23:15.552863 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"28634932b66e265c49c1508d9a23e640e66c7a7ab04ca772f655fccd3d803a0e\": container with ID starting with 28634932b66e265c49c1508d9a23e640e66c7a7ab04ca772f655fccd3d803a0e not found: ID does not exist" containerID="28634932b66e265c49c1508d9a23e640e66c7a7ab04ca772f655fccd3d803a0e" Nov 27 16:23:15 crc kubenswrapper[4707]: I1127 16:23:15.552903 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"28634932b66e265c49c1508d9a23e640e66c7a7ab04ca772f655fccd3d803a0e"} err="failed to get container status \"28634932b66e265c49c1508d9a23e640e66c7a7ab04ca772f655fccd3d803a0e\": rpc error: code = NotFound desc = could not find container \"28634932b66e265c49c1508d9a23e640e66c7a7ab04ca772f655fccd3d803a0e\": container with ID starting with 28634932b66e265c49c1508d9a23e640e66c7a7ab04ca772f655fccd3d803a0e not found: ID does not exist" Nov 27 16:23:15 crc kubenswrapper[4707]: I1127 16:23:15.630876 4707 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/55ac6b2b-7b17-4ce7-9a82-1253cd1523df-run-httpd\") pod \"ceilometer-0\" (UID: \"55ac6b2b-7b17-4ce7-9a82-1253cd1523df\") " pod="openstack/ceilometer-0" Nov 27 16:23:15 crc kubenswrapper[4707]: I1127 16:23:15.630938 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tcjpp\" (UniqueName: \"kubernetes.io/projected/55ac6b2b-7b17-4ce7-9a82-1253cd1523df-kube-api-access-tcjpp\") pod \"ceilometer-0\" (UID: \"55ac6b2b-7b17-4ce7-9a82-1253cd1523df\") " pod="openstack/ceilometer-0" Nov 27 16:23:15 crc kubenswrapper[4707]: I1127 16:23:15.631006 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/55ac6b2b-7b17-4ce7-9a82-1253cd1523df-log-httpd\") pod \"ceilometer-0\" (UID: \"55ac6b2b-7b17-4ce7-9a82-1253cd1523df\") " pod="openstack/ceilometer-0" Nov 27 16:23:15 crc kubenswrapper[4707]: I1127 16:23:15.631029 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/55ac6b2b-7b17-4ce7-9a82-1253cd1523df-config-data\") pod \"ceilometer-0\" (UID: \"55ac6b2b-7b17-4ce7-9a82-1253cd1523df\") " pod="openstack/ceilometer-0" Nov 27 16:23:15 crc kubenswrapper[4707]: I1127 16:23:15.631048 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/55ac6b2b-7b17-4ce7-9a82-1253cd1523df-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"55ac6b2b-7b17-4ce7-9a82-1253cd1523df\") " pod="openstack/ceilometer-0" Nov 27 16:23:15 crc kubenswrapper[4707]: I1127 16:23:15.631085 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/55ac6b2b-7b17-4ce7-9a82-1253cd1523df-scripts\") pod \"ceilometer-0\" (UID: \"55ac6b2b-7b17-4ce7-9a82-1253cd1523df\") " pod="openstack/ceilometer-0" Nov 27 16:23:15 crc kubenswrapper[4707]: I1127 16:23:15.631116 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/55ac6b2b-7b17-4ce7-9a82-1253cd1523df-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"55ac6b2b-7b17-4ce7-9a82-1253cd1523df\") " pod="openstack/ceilometer-0" Nov 27 16:23:15 crc kubenswrapper[4707]: I1127 16:23:15.631144 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/55ac6b2b-7b17-4ce7-9a82-1253cd1523df-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"55ac6b2b-7b17-4ce7-9a82-1253cd1523df\") " pod="openstack/ceilometer-0" Nov 27 16:23:15 crc kubenswrapper[4707]: I1127 16:23:15.732715 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/55ac6b2b-7b17-4ce7-9a82-1253cd1523df-log-httpd\") pod \"ceilometer-0\" (UID: \"55ac6b2b-7b17-4ce7-9a82-1253cd1523df\") " pod="openstack/ceilometer-0" Nov 27 16:23:15 crc kubenswrapper[4707]: I1127 16:23:15.733025 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/55ac6b2b-7b17-4ce7-9a82-1253cd1523df-config-data\") pod \"ceilometer-0\" (UID: \"55ac6b2b-7b17-4ce7-9a82-1253cd1523df\") " pod="openstack/ceilometer-0" Nov 27 16:23:15 crc kubenswrapper[4707]: I1127 16:23:15.733166 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/55ac6b2b-7b17-4ce7-9a82-1253cd1523df-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"55ac6b2b-7b17-4ce7-9a82-1253cd1523df\") " pod="openstack/ceilometer-0" Nov 27 
16:23:15 crc kubenswrapper[4707]: I1127 16:23:15.733335 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/55ac6b2b-7b17-4ce7-9a82-1253cd1523df-scripts\") pod \"ceilometer-0\" (UID: \"55ac6b2b-7b17-4ce7-9a82-1253cd1523df\") " pod="openstack/ceilometer-0" Nov 27 16:23:15 crc kubenswrapper[4707]: I1127 16:23:15.733555 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/55ac6b2b-7b17-4ce7-9a82-1253cd1523df-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"55ac6b2b-7b17-4ce7-9a82-1253cd1523df\") " pod="openstack/ceilometer-0" Nov 27 16:23:15 crc kubenswrapper[4707]: I1127 16:23:15.733724 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/55ac6b2b-7b17-4ce7-9a82-1253cd1523df-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"55ac6b2b-7b17-4ce7-9a82-1253cd1523df\") " pod="openstack/ceilometer-0" Nov 27 16:23:15 crc kubenswrapper[4707]: I1127 16:23:15.734022 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/55ac6b2b-7b17-4ce7-9a82-1253cd1523df-run-httpd\") pod \"ceilometer-0\" (UID: \"55ac6b2b-7b17-4ce7-9a82-1253cd1523df\") " pod="openstack/ceilometer-0" Nov 27 16:23:15 crc kubenswrapper[4707]: I1127 16:23:15.734197 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tcjpp\" (UniqueName: \"kubernetes.io/projected/55ac6b2b-7b17-4ce7-9a82-1253cd1523df-kube-api-access-tcjpp\") pod \"ceilometer-0\" (UID: \"55ac6b2b-7b17-4ce7-9a82-1253cd1523df\") " pod="openstack/ceilometer-0" Nov 27 16:23:15 crc kubenswrapper[4707]: I1127 16:23:15.735588 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/55ac6b2b-7b17-4ce7-9a82-1253cd1523df-log-httpd\") pod \"ceilometer-0\" (UID: \"55ac6b2b-7b17-4ce7-9a82-1253cd1523df\") " pod="openstack/ceilometer-0" Nov 27 16:23:15 crc kubenswrapper[4707]: I1127 16:23:15.738845 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/55ac6b2b-7b17-4ce7-9a82-1253cd1523df-run-httpd\") pod \"ceilometer-0\" (UID: \"55ac6b2b-7b17-4ce7-9a82-1253cd1523df\") " pod="openstack/ceilometer-0" Nov 27 16:23:15 crc kubenswrapper[4707]: I1127 16:23:15.743937 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/55ac6b2b-7b17-4ce7-9a82-1253cd1523df-config-data\") pod \"ceilometer-0\" (UID: \"55ac6b2b-7b17-4ce7-9a82-1253cd1523df\") " pod="openstack/ceilometer-0" Nov 27 16:23:15 crc kubenswrapper[4707]: I1127 16:23:15.745592 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/55ac6b2b-7b17-4ce7-9a82-1253cd1523df-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"55ac6b2b-7b17-4ce7-9a82-1253cd1523df\") " pod="openstack/ceilometer-0" Nov 27 16:23:15 crc kubenswrapper[4707]: I1127 16:23:15.747398 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/55ac6b2b-7b17-4ce7-9a82-1253cd1523df-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"55ac6b2b-7b17-4ce7-9a82-1253cd1523df\") " pod="openstack/ceilometer-0" Nov 27 16:23:15 crc kubenswrapper[4707]: I1127 16:23:15.751179 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/55ac6b2b-7b17-4ce7-9a82-1253cd1523df-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"55ac6b2b-7b17-4ce7-9a82-1253cd1523df\") " pod="openstack/ceilometer-0" Nov 27 16:23:15 crc kubenswrapper[4707]: I1127 16:23:15.752562 4707 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/55ac6b2b-7b17-4ce7-9a82-1253cd1523df-scripts\") pod \"ceilometer-0\" (UID: \"55ac6b2b-7b17-4ce7-9a82-1253cd1523df\") " pod="openstack/ceilometer-0" Nov 27 16:23:15 crc kubenswrapper[4707]: I1127 16:23:15.764473 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tcjpp\" (UniqueName: \"kubernetes.io/projected/55ac6b2b-7b17-4ce7-9a82-1253cd1523df-kube-api-access-tcjpp\") pod \"ceilometer-0\" (UID: \"55ac6b2b-7b17-4ce7-9a82-1253cd1523df\") " pod="openstack/ceilometer-0" Nov 27 16:23:15 crc kubenswrapper[4707]: I1127 16:23:15.846364 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Nov 27 16:23:16 crc kubenswrapper[4707]: I1127 16:23:16.345049 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Nov 27 16:23:16 crc kubenswrapper[4707]: W1127 16:23:16.362790 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod55ac6b2b_7b17_4ce7_9a82_1253cd1523df.slice/crio-fc6e2c4a906333f2ec18bdcd931381208be3e10a2a60586783dcaa76a59fef93 WatchSource:0}: Error finding container fc6e2c4a906333f2ec18bdcd931381208be3e10a2a60586783dcaa76a59fef93: Status 404 returned error can't find the container with id fc6e2c4a906333f2ec18bdcd931381208be3e10a2a60586783dcaa76a59fef93 Nov 27 16:23:16 crc kubenswrapper[4707]: E1127 16:23:16.383546 4707 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc280ae8f_cb0b_4035_a7a2_fedfb044a6f0.slice/crio-54db668a5b818364925e4a6f0b61e5c19e947ee3510c3469d13f649fc2608566\": RecentStats: unable to find data in memory cache]" Nov 27 16:23:16 crc kubenswrapper[4707]: I1127 16:23:16.404823 4707 kubelet.go:2437] "SyncLoop DELETE" 
source="api" pods=["openstack/ceilometer-0"] Nov 27 16:23:16 crc kubenswrapper[4707]: I1127 16:23:16.443843 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"55ac6b2b-7b17-4ce7-9a82-1253cd1523df","Type":"ContainerStarted","Data":"fc6e2c4a906333f2ec18bdcd931381208be3e10a2a60586783dcaa76a59fef93"} Nov 27 16:23:16 crc kubenswrapper[4707]: I1127 16:23:16.991635 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-cell1-novncproxy-0" Nov 27 16:23:17 crc kubenswrapper[4707]: I1127 16:23:17.021463 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-cell1-novncproxy-0" Nov 27 16:23:17 crc kubenswrapper[4707]: I1127 16:23:17.219511 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0e3db829-032c-4a56-9e60-5a332017caca" path="/var/lib/kubelet/pods/0e3db829-032c-4a56-9e60-5a332017caca/volumes" Nov 27 16:23:17 crc kubenswrapper[4707]: I1127 16:23:17.464627 4707 generic.go:334] "Generic (PLEG): container finished" podID="7f1272cc-2e78-4c7e-abf5-56c30b17a717" containerID="bc2e266c1a533e2d150cd20661defbc1ddcfc750d68590d297dea9099a539d06" exitCode=0 Nov 27 16:23:17 crc kubenswrapper[4707]: I1127 16:23:17.464744 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"7f1272cc-2e78-4c7e-abf5-56c30b17a717","Type":"ContainerDied","Data":"bc2e266c1a533e2d150cd20661defbc1ddcfc750d68590d297dea9099a539d06"} Nov 27 16:23:17 crc kubenswrapper[4707]: I1127 16:23:17.466877 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"55ac6b2b-7b17-4ce7-9a82-1253cd1523df","Type":"ContainerStarted","Data":"3260a1516b2d39f6339e628cac19f9e356aa36ae0cc8d1b10858c63af8c8c830"} Nov 27 16:23:17 crc kubenswrapper[4707]: I1127 16:23:17.493655 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-novncproxy-0" Nov 27 16:23:17 crc 
kubenswrapper[4707]: I1127 16:23:17.646359 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-cell-mapping-crzc9"] Nov 27 16:23:17 crc kubenswrapper[4707]: I1127 16:23:17.647590 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-crzc9" Nov 27 16:23:17 crc kubenswrapper[4707]: I1127 16:23:17.651883 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-scripts" Nov 27 16:23:17 crc kubenswrapper[4707]: I1127 16:23:17.653724 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-config-data" Nov 27 16:23:17 crc kubenswrapper[4707]: I1127 16:23:17.683502 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-crzc9"] Nov 27 16:23:17 crc kubenswrapper[4707]: I1127 16:23:17.781270 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/28d07a58-b515-4c7b-a75e-ce5d89a9c9f5-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-crzc9\" (UID: \"28d07a58-b515-4c7b-a75e-ce5d89a9c9f5\") " pod="openstack/nova-cell1-cell-mapping-crzc9" Nov 27 16:23:17 crc kubenswrapper[4707]: I1127 16:23:17.781671 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6xwvt\" (UniqueName: \"kubernetes.io/projected/28d07a58-b515-4c7b-a75e-ce5d89a9c9f5-kube-api-access-6xwvt\") pod \"nova-cell1-cell-mapping-crzc9\" (UID: \"28d07a58-b515-4c7b-a75e-ce5d89a9c9f5\") " pod="openstack/nova-cell1-cell-mapping-crzc9" Nov 27 16:23:17 crc kubenswrapper[4707]: I1127 16:23:17.781947 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/28d07a58-b515-4c7b-a75e-ce5d89a9c9f5-scripts\") pod \"nova-cell1-cell-mapping-crzc9\" (UID: 
\"28d07a58-b515-4c7b-a75e-ce5d89a9c9f5\") " pod="openstack/nova-cell1-cell-mapping-crzc9" Nov 27 16:23:17 crc kubenswrapper[4707]: I1127 16:23:17.782098 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/28d07a58-b515-4c7b-a75e-ce5d89a9c9f5-config-data\") pod \"nova-cell1-cell-mapping-crzc9\" (UID: \"28d07a58-b515-4c7b-a75e-ce5d89a9c9f5\") " pod="openstack/nova-cell1-cell-mapping-crzc9" Nov 27 16:23:17 crc kubenswrapper[4707]: I1127 16:23:17.788763 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Nov 27 16:23:17 crc kubenswrapper[4707]: I1127 16:23:17.884126 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7f1272cc-2e78-4c7e-abf5-56c30b17a717-config-data\") pod \"7f1272cc-2e78-4c7e-abf5-56c30b17a717\" (UID: \"7f1272cc-2e78-4c7e-abf5-56c30b17a717\") " Nov 27 16:23:17 crc kubenswrapper[4707]: I1127 16:23:17.884168 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7f1272cc-2e78-4c7e-abf5-56c30b17a717-combined-ca-bundle\") pod \"7f1272cc-2e78-4c7e-abf5-56c30b17a717\" (UID: \"7f1272cc-2e78-4c7e-abf5-56c30b17a717\") " Nov 27 16:23:17 crc kubenswrapper[4707]: I1127 16:23:17.884230 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7f1272cc-2e78-4c7e-abf5-56c30b17a717-logs\") pod \"7f1272cc-2e78-4c7e-abf5-56c30b17a717\" (UID: \"7f1272cc-2e78-4c7e-abf5-56c30b17a717\") " Nov 27 16:23:17 crc kubenswrapper[4707]: I1127 16:23:17.884256 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4m62\" (UniqueName: \"kubernetes.io/projected/7f1272cc-2e78-4c7e-abf5-56c30b17a717-kube-api-access-d4m62\") pod 
\"7f1272cc-2e78-4c7e-abf5-56c30b17a717\" (UID: \"7f1272cc-2e78-4c7e-abf5-56c30b17a717\") " Nov 27 16:23:17 crc kubenswrapper[4707]: I1127 16:23:17.884591 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6xwvt\" (UniqueName: \"kubernetes.io/projected/28d07a58-b515-4c7b-a75e-ce5d89a9c9f5-kube-api-access-6xwvt\") pod \"nova-cell1-cell-mapping-crzc9\" (UID: \"28d07a58-b515-4c7b-a75e-ce5d89a9c9f5\") " pod="openstack/nova-cell1-cell-mapping-crzc9" Nov 27 16:23:17 crc kubenswrapper[4707]: I1127 16:23:17.884660 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/28d07a58-b515-4c7b-a75e-ce5d89a9c9f5-scripts\") pod \"nova-cell1-cell-mapping-crzc9\" (UID: \"28d07a58-b515-4c7b-a75e-ce5d89a9c9f5\") " pod="openstack/nova-cell1-cell-mapping-crzc9" Nov 27 16:23:17 crc kubenswrapper[4707]: I1127 16:23:17.884710 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/28d07a58-b515-4c7b-a75e-ce5d89a9c9f5-config-data\") pod \"nova-cell1-cell-mapping-crzc9\" (UID: \"28d07a58-b515-4c7b-a75e-ce5d89a9c9f5\") " pod="openstack/nova-cell1-cell-mapping-crzc9" Nov 27 16:23:17 crc kubenswrapper[4707]: I1127 16:23:17.884790 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/28d07a58-b515-4c7b-a75e-ce5d89a9c9f5-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-crzc9\" (UID: \"28d07a58-b515-4c7b-a75e-ce5d89a9c9f5\") " pod="openstack/nova-cell1-cell-mapping-crzc9" Nov 27 16:23:17 crc kubenswrapper[4707]: I1127 16:23:17.884926 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7f1272cc-2e78-4c7e-abf5-56c30b17a717-logs" (OuterVolumeSpecName: "logs") pod "7f1272cc-2e78-4c7e-abf5-56c30b17a717" (UID: "7f1272cc-2e78-4c7e-abf5-56c30b17a717"). 
InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 27 16:23:17 crc kubenswrapper[4707]: I1127 16:23:17.889773 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/28d07a58-b515-4c7b-a75e-ce5d89a9c9f5-scripts\") pod \"nova-cell1-cell-mapping-crzc9\" (UID: \"28d07a58-b515-4c7b-a75e-ce5d89a9c9f5\") " pod="openstack/nova-cell1-cell-mapping-crzc9" Nov 27 16:23:17 crc kubenswrapper[4707]: I1127 16:23:17.892520 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7f1272cc-2e78-4c7e-abf5-56c30b17a717-kube-api-access-d4m62" (OuterVolumeSpecName: "kube-api-access-d4m62") pod "7f1272cc-2e78-4c7e-abf5-56c30b17a717" (UID: "7f1272cc-2e78-4c7e-abf5-56c30b17a717"). InnerVolumeSpecName "kube-api-access-d4m62". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 16:23:17 crc kubenswrapper[4707]: I1127 16:23:17.893455 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/28d07a58-b515-4c7b-a75e-ce5d89a9c9f5-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-crzc9\" (UID: \"28d07a58-b515-4c7b-a75e-ce5d89a9c9f5\") " pod="openstack/nova-cell1-cell-mapping-crzc9" Nov 27 16:23:17 crc kubenswrapper[4707]: I1127 16:23:17.903306 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6xwvt\" (UniqueName: \"kubernetes.io/projected/28d07a58-b515-4c7b-a75e-ce5d89a9c9f5-kube-api-access-6xwvt\") pod \"nova-cell1-cell-mapping-crzc9\" (UID: \"28d07a58-b515-4c7b-a75e-ce5d89a9c9f5\") " pod="openstack/nova-cell1-cell-mapping-crzc9" Nov 27 16:23:17 crc kubenswrapper[4707]: I1127 16:23:17.910050 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/28d07a58-b515-4c7b-a75e-ce5d89a9c9f5-config-data\") pod \"nova-cell1-cell-mapping-crzc9\" (UID: 
\"28d07a58-b515-4c7b-a75e-ce5d89a9c9f5\") " pod="openstack/nova-cell1-cell-mapping-crzc9" Nov 27 16:23:17 crc kubenswrapper[4707]: I1127 16:23:17.921259 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7f1272cc-2e78-4c7e-abf5-56c30b17a717-config-data" (OuterVolumeSpecName: "config-data") pod "7f1272cc-2e78-4c7e-abf5-56c30b17a717" (UID: "7f1272cc-2e78-4c7e-abf5-56c30b17a717"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 16:23:17 crc kubenswrapper[4707]: I1127 16:23:17.936168 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7f1272cc-2e78-4c7e-abf5-56c30b17a717-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7f1272cc-2e78-4c7e-abf5-56c30b17a717" (UID: "7f1272cc-2e78-4c7e-abf5-56c30b17a717"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 16:23:17 crc kubenswrapper[4707]: I1127 16:23:17.978634 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-crzc9" Nov 27 16:23:17 crc kubenswrapper[4707]: I1127 16:23:17.986346 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7f1272cc-2e78-4c7e-abf5-56c30b17a717-config-data\") on node \"crc\" DevicePath \"\"" Nov 27 16:23:17 crc kubenswrapper[4707]: I1127 16:23:17.986389 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7f1272cc-2e78-4c7e-abf5-56c30b17a717-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 27 16:23:17 crc kubenswrapper[4707]: I1127 16:23:17.986422 4707 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7f1272cc-2e78-4c7e-abf5-56c30b17a717-logs\") on node \"crc\" DevicePath \"\"" Nov 27 16:23:17 crc kubenswrapper[4707]: I1127 16:23:17.986430 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4m62\" (UniqueName: \"kubernetes.io/projected/7f1272cc-2e78-4c7e-abf5-56c30b17a717-kube-api-access-d4m62\") on node \"crc\" DevicePath \"\"" Nov 27 16:23:18 crc kubenswrapper[4707]: I1127 16:23:18.480406 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"7f1272cc-2e78-4c7e-abf5-56c30b17a717","Type":"ContainerDied","Data":"09b28ea7ce0f884e6de418cfaa5107d73c243444334608157817c55dd1388df9"} Nov 27 16:23:18 crc kubenswrapper[4707]: I1127 16:23:18.480706 4707 scope.go:117] "RemoveContainer" containerID="bc2e266c1a533e2d150cd20661defbc1ddcfc750d68590d297dea9099a539d06" Nov 27 16:23:18 crc kubenswrapper[4707]: I1127 16:23:18.480451 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Nov 27 16:23:18 crc kubenswrapper[4707]: I1127 16:23:18.565450 4707 scope.go:117] "RemoveContainer" containerID="8c80e27b4be2b856c64ad038bb4e813389a32360df4a1f008482ea68bf8b70ab" Nov 27 16:23:18 crc kubenswrapper[4707]: I1127 16:23:18.584059 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Nov 27 16:23:18 crc kubenswrapper[4707]: I1127 16:23:18.606787 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Nov 27 16:23:18 crc kubenswrapper[4707]: I1127 16:23:18.633471 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Nov 27 16:23:18 crc kubenswrapper[4707]: E1127 16:23:18.633940 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7f1272cc-2e78-4c7e-abf5-56c30b17a717" containerName="nova-api-api" Nov 27 16:23:18 crc kubenswrapper[4707]: I1127 16:23:18.633957 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="7f1272cc-2e78-4c7e-abf5-56c30b17a717" containerName="nova-api-api" Nov 27 16:23:18 crc kubenswrapper[4707]: E1127 16:23:18.633994 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7f1272cc-2e78-4c7e-abf5-56c30b17a717" containerName="nova-api-log" Nov 27 16:23:18 crc kubenswrapper[4707]: I1127 16:23:18.633999 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="7f1272cc-2e78-4c7e-abf5-56c30b17a717" containerName="nova-api-log" Nov 27 16:23:18 crc kubenswrapper[4707]: I1127 16:23:18.634179 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="7f1272cc-2e78-4c7e-abf5-56c30b17a717" containerName="nova-api-api" Nov 27 16:23:18 crc kubenswrapper[4707]: I1127 16:23:18.634208 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="7f1272cc-2e78-4c7e-abf5-56c30b17a717" containerName="nova-api-log" Nov 27 16:23:18 crc kubenswrapper[4707]: I1127 16:23:18.635256 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Nov 27 16:23:18 crc kubenswrapper[4707]: I1127 16:23:18.639676 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Nov 27 16:23:18 crc kubenswrapper[4707]: I1127 16:23:18.639772 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Nov 27 16:23:18 crc kubenswrapper[4707]: I1127 16:23:18.649576 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Nov 27 16:23:18 crc kubenswrapper[4707]: I1127 16:23:18.651293 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Nov 27 16:23:18 crc kubenswrapper[4707]: I1127 16:23:18.661269 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-crzc9"] Nov 27 16:23:18 crc kubenswrapper[4707]: I1127 16:23:18.696876 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c67f74c7-c75f-4370-980d-c7a9fa5263a0-config-data\") pod \"nova-api-0\" (UID: \"c67f74c7-c75f-4370-980d-c7a9fa5263a0\") " pod="openstack/nova-api-0" Nov 27 16:23:18 crc kubenswrapper[4707]: I1127 16:23:18.696911 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c67f74c7-c75f-4370-980d-c7a9fa5263a0-logs\") pod \"nova-api-0\" (UID: \"c67f74c7-c75f-4370-980d-c7a9fa5263a0\") " pod="openstack/nova-api-0" Nov 27 16:23:18 crc kubenswrapper[4707]: I1127 16:23:18.697009 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c67f74c7-c75f-4370-980d-c7a9fa5263a0-public-tls-certs\") pod \"nova-api-0\" (UID: \"c67f74c7-c75f-4370-980d-c7a9fa5263a0\") " pod="openstack/nova-api-0" Nov 27 16:23:18 crc kubenswrapper[4707]: I1127 
16:23:18.697186 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jc8kc\" (UniqueName: \"kubernetes.io/projected/c67f74c7-c75f-4370-980d-c7a9fa5263a0-kube-api-access-jc8kc\") pod \"nova-api-0\" (UID: \"c67f74c7-c75f-4370-980d-c7a9fa5263a0\") " pod="openstack/nova-api-0" Nov 27 16:23:18 crc kubenswrapper[4707]: I1127 16:23:18.697245 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c67f74c7-c75f-4370-980d-c7a9fa5263a0-internal-tls-certs\") pod \"nova-api-0\" (UID: \"c67f74c7-c75f-4370-980d-c7a9fa5263a0\") " pod="openstack/nova-api-0" Nov 27 16:23:18 crc kubenswrapper[4707]: I1127 16:23:18.697314 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c67f74c7-c75f-4370-980d-c7a9fa5263a0-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"c67f74c7-c75f-4370-980d-c7a9fa5263a0\") " pod="openstack/nova-api-0" Nov 27 16:23:18 crc kubenswrapper[4707]: I1127 16:23:18.799166 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c67f74c7-c75f-4370-980d-c7a9fa5263a0-public-tls-certs\") pod \"nova-api-0\" (UID: \"c67f74c7-c75f-4370-980d-c7a9fa5263a0\") " pod="openstack/nova-api-0" Nov 27 16:23:18 crc kubenswrapper[4707]: I1127 16:23:18.799232 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jc8kc\" (UniqueName: \"kubernetes.io/projected/c67f74c7-c75f-4370-980d-c7a9fa5263a0-kube-api-access-jc8kc\") pod \"nova-api-0\" (UID: \"c67f74c7-c75f-4370-980d-c7a9fa5263a0\") " pod="openstack/nova-api-0" Nov 27 16:23:18 crc kubenswrapper[4707]: I1127 16:23:18.799263 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/c67f74c7-c75f-4370-980d-c7a9fa5263a0-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"c67f74c7-c75f-4370-980d-c7a9fa5263a0\") " pod="openstack/nova-api-0" Nov 27 16:23:18 crc kubenswrapper[4707]: I1127 16:23:18.799280 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c67f74c7-c75f-4370-980d-c7a9fa5263a0-internal-tls-certs\") pod \"nova-api-0\" (UID: \"c67f74c7-c75f-4370-980d-c7a9fa5263a0\") " pod="openstack/nova-api-0" Nov 27 16:23:18 crc kubenswrapper[4707]: I1127 16:23:18.799360 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c67f74c7-c75f-4370-980d-c7a9fa5263a0-config-data\") pod \"nova-api-0\" (UID: \"c67f74c7-c75f-4370-980d-c7a9fa5263a0\") " pod="openstack/nova-api-0" Nov 27 16:23:18 crc kubenswrapper[4707]: I1127 16:23:18.799390 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c67f74c7-c75f-4370-980d-c7a9fa5263a0-logs\") pod \"nova-api-0\" (UID: \"c67f74c7-c75f-4370-980d-c7a9fa5263a0\") " pod="openstack/nova-api-0" Nov 27 16:23:18 crc kubenswrapper[4707]: I1127 16:23:18.800089 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c67f74c7-c75f-4370-980d-c7a9fa5263a0-logs\") pod \"nova-api-0\" (UID: \"c67f74c7-c75f-4370-980d-c7a9fa5263a0\") " pod="openstack/nova-api-0" Nov 27 16:23:18 crc kubenswrapper[4707]: I1127 16:23:18.804168 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c67f74c7-c75f-4370-980d-c7a9fa5263a0-public-tls-certs\") pod \"nova-api-0\" (UID: \"c67f74c7-c75f-4370-980d-c7a9fa5263a0\") " pod="openstack/nova-api-0" Nov 27 16:23:18 crc kubenswrapper[4707]: I1127 16:23:18.804637 4707 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c67f74c7-c75f-4370-980d-c7a9fa5263a0-config-data\") pod \"nova-api-0\" (UID: \"c67f74c7-c75f-4370-980d-c7a9fa5263a0\") " pod="openstack/nova-api-0" Nov 27 16:23:18 crc kubenswrapper[4707]: I1127 16:23:18.805539 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c67f74c7-c75f-4370-980d-c7a9fa5263a0-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"c67f74c7-c75f-4370-980d-c7a9fa5263a0\") " pod="openstack/nova-api-0" Nov 27 16:23:18 crc kubenswrapper[4707]: I1127 16:23:18.806102 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c67f74c7-c75f-4370-980d-c7a9fa5263a0-internal-tls-certs\") pod \"nova-api-0\" (UID: \"c67f74c7-c75f-4370-980d-c7a9fa5263a0\") " pod="openstack/nova-api-0" Nov 27 16:23:18 crc kubenswrapper[4707]: I1127 16:23:18.822418 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jc8kc\" (UniqueName: \"kubernetes.io/projected/c67f74c7-c75f-4370-980d-c7a9fa5263a0-kube-api-access-jc8kc\") pod \"nova-api-0\" (UID: \"c67f74c7-c75f-4370-980d-c7a9fa5263a0\") " pod="openstack/nova-api-0" Nov 27 16:23:18 crc kubenswrapper[4707]: I1127 16:23:18.985632 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Nov 27 16:23:19 crc kubenswrapper[4707]: I1127 16:23:19.206745 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7f1272cc-2e78-4c7e-abf5-56c30b17a717" path="/var/lib/kubelet/pods/7f1272cc-2e78-4c7e-abf5-56c30b17a717/volumes" Nov 27 16:23:19 crc kubenswrapper[4707]: I1127 16:23:19.492641 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-crzc9" event={"ID":"28d07a58-b515-4c7b-a75e-ce5d89a9c9f5","Type":"ContainerStarted","Data":"dc692054023f378f46ac0708e97db7997e183cbc95de67455c38d5a3dd56bd73"} Nov 27 16:23:19 crc kubenswrapper[4707]: I1127 16:23:19.492903 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-crzc9" event={"ID":"28d07a58-b515-4c7b-a75e-ce5d89a9c9f5","Type":"ContainerStarted","Data":"df914b4496e73a689cb91efe22c60f009af0ce03d2c01de74e84d1b1ea9aaf2d"} Nov 27 16:23:19 crc kubenswrapper[4707]: I1127 16:23:19.496229 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"55ac6b2b-7b17-4ce7-9a82-1253cd1523df","Type":"ContainerStarted","Data":"ba6fca8fdba84e5bc25c3ae5bb2945fad9a8684ce9d8cff9e0a15b2324e700e8"} Nov 27 16:23:19 crc kubenswrapper[4707]: I1127 16:23:19.496266 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"55ac6b2b-7b17-4ce7-9a82-1253cd1523df","Type":"ContainerStarted","Data":"e59cfe3cc9dbb07684b786fc8b9f05e67dc627c7b667e2784d74eff76f696e06"} Nov 27 16:23:19 crc kubenswrapper[4707]: I1127 16:23:19.500444 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Nov 27 16:23:19 crc kubenswrapper[4707]: I1127 16:23:19.515983 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-cell-mapping-crzc9" podStartSLOduration=2.515965939 podStartE2EDuration="2.515965939s" podCreationTimestamp="2025-11-27 16:23:17 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-27 16:23:19.507906413 +0000 UTC m=+1175.139355191" watchObservedRunningTime="2025-11-27 16:23:19.515965939 +0000 UTC m=+1175.147414707" Nov 27 16:23:20 crc kubenswrapper[4707]: I1127 16:23:20.506426 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"c67f74c7-c75f-4370-980d-c7a9fa5263a0","Type":"ContainerStarted","Data":"52f19ee1283e22c6e4a0af879d164b933b757202bfd96e313b8a164df908264d"} Nov 27 16:23:20 crc kubenswrapper[4707]: I1127 16:23:20.506792 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"c67f74c7-c75f-4370-980d-c7a9fa5263a0","Type":"ContainerStarted","Data":"df2f9874f7a9767a97ccc647e045387267e621a409db29dfa534e2eba70ee0ee"} Nov 27 16:23:20 crc kubenswrapper[4707]: I1127 16:23:20.506804 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"c67f74c7-c75f-4370-980d-c7a9fa5263a0","Type":"ContainerStarted","Data":"c74dc345e8a50e22f5b1b9ac17c8faa18550c9e0f3f0b83c2be2f215f68f4b11"} Nov 27 16:23:20 crc kubenswrapper[4707]: I1127 16:23:20.528373 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.5283424500000002 podStartE2EDuration="2.52834245s" podCreationTimestamp="2025-11-27 16:23:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-27 16:23:20.523568063 +0000 UTC m=+1176.155016831" watchObservedRunningTime="2025-11-27 16:23:20.52834245 +0000 UTC m=+1176.159791228" Nov 27 16:23:20 crc kubenswrapper[4707]: I1127 16:23:20.889298 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6b7bbf7cf9-hmvth" Nov 27 16:23:20 crc kubenswrapper[4707]: I1127 16:23:20.977975 4707 kubelet.go:2437] "SyncLoop 
DELETE" source="api" pods=["openstack/dnsmasq-dns-9b86998b5-rs6pv"] Nov 27 16:23:20 crc kubenswrapper[4707]: I1127 16:23:20.978227 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-9b86998b5-rs6pv" podUID="34748232-abe0-4c63-9fe6-68118b7fed04" containerName="dnsmasq-dns" containerID="cri-o://850ffd2ac4f6fc750a1234dc25593a820b170122351d923e194369828bba3f32" gracePeriod=10 Nov 27 16:23:21 crc kubenswrapper[4707]: I1127 16:23:21.517369 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"55ac6b2b-7b17-4ce7-9a82-1253cd1523df","Type":"ContainerStarted","Data":"f74dc715c9edd862d53c5fbd6c19ea4b15a0693812a608c970136f5addad4a2e"} Nov 27 16:23:21 crc kubenswrapper[4707]: I1127 16:23:21.519078 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Nov 27 16:23:21 crc kubenswrapper[4707]: I1127 16:23:21.519210 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="55ac6b2b-7b17-4ce7-9a82-1253cd1523df" containerName="ceilometer-central-agent" containerID="cri-o://3260a1516b2d39f6339e628cac19f9e356aa36ae0cc8d1b10858c63af8c8c830" gracePeriod=30 Nov 27 16:23:21 crc kubenswrapper[4707]: I1127 16:23:21.519414 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="55ac6b2b-7b17-4ce7-9a82-1253cd1523df" containerName="ceilometer-notification-agent" containerID="cri-o://e59cfe3cc9dbb07684b786fc8b9f05e67dc627c7b667e2784d74eff76f696e06" gracePeriod=30 Nov 27 16:23:21 crc kubenswrapper[4707]: I1127 16:23:21.519422 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="55ac6b2b-7b17-4ce7-9a82-1253cd1523df" containerName="sg-core" containerID="cri-o://ba6fca8fdba84e5bc25c3ae5bb2945fad9a8684ce9d8cff9e0a15b2324e700e8" gracePeriod=30 Nov 27 16:23:21 crc kubenswrapper[4707]: I1127 
16:23:21.519699 4707 generic.go:334] "Generic (PLEG): container finished" podID="34748232-abe0-4c63-9fe6-68118b7fed04" containerID="850ffd2ac4f6fc750a1234dc25593a820b170122351d923e194369828bba3f32" exitCode=0 Nov 27 16:23:21 crc kubenswrapper[4707]: I1127 16:23:21.520268 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-9b86998b5-rs6pv" event={"ID":"34748232-abe0-4c63-9fe6-68118b7fed04","Type":"ContainerDied","Data":"850ffd2ac4f6fc750a1234dc25593a820b170122351d923e194369828bba3f32"} Nov 27 16:23:21 crc kubenswrapper[4707]: I1127 16:23:21.520409 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-9b86998b5-rs6pv" event={"ID":"34748232-abe0-4c63-9fe6-68118b7fed04","Type":"ContainerDied","Data":"ad9c371673e96a18190933bc27f03950abea6847d7066cc5ab802b04581e8256"} Nov 27 16:23:21 crc kubenswrapper[4707]: I1127 16:23:21.520508 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ad9c371673e96a18190933bc27f03950abea6847d7066cc5ab802b04581e8256" Nov 27 16:23:21 crc kubenswrapper[4707]: I1127 16:23:21.519701 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="55ac6b2b-7b17-4ce7-9a82-1253cd1523df" containerName="proxy-httpd" containerID="cri-o://f74dc715c9edd862d53c5fbd6c19ea4b15a0693812a608c970136f5addad4a2e" gracePeriod=30 Nov 27 16:23:21 crc kubenswrapper[4707]: I1127 16:23:21.549066 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-9b86998b5-rs6pv" Nov 27 16:23:21 crc kubenswrapper[4707]: I1127 16:23:21.564701 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.552403898 podStartE2EDuration="6.564684463s" podCreationTimestamp="2025-11-27 16:23:15 +0000 UTC" firstStartedPulling="2025-11-27 16:23:16.365232363 +0000 UTC m=+1171.996681131" lastFinishedPulling="2025-11-27 16:23:20.377512928 +0000 UTC m=+1176.008961696" observedRunningTime="2025-11-27 16:23:21.558122804 +0000 UTC m=+1177.189571572" watchObservedRunningTime="2025-11-27 16:23:21.564684463 +0000 UTC m=+1177.196133231" Nov 27 16:23:21 crc kubenswrapper[4707]: I1127 16:23:21.676534 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/34748232-abe0-4c63-9fe6-68118b7fed04-dns-svc\") pod \"34748232-abe0-4c63-9fe6-68118b7fed04\" (UID: \"34748232-abe0-4c63-9fe6-68118b7fed04\") " Nov 27 16:23:21 crc kubenswrapper[4707]: I1127 16:23:21.676623 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/34748232-abe0-4c63-9fe6-68118b7fed04-config\") pod \"34748232-abe0-4c63-9fe6-68118b7fed04\" (UID: \"34748232-abe0-4c63-9fe6-68118b7fed04\") " Nov 27 16:23:21 crc kubenswrapper[4707]: I1127 16:23:21.676726 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7hbvd\" (UniqueName: \"kubernetes.io/projected/34748232-abe0-4c63-9fe6-68118b7fed04-kube-api-access-7hbvd\") pod \"34748232-abe0-4c63-9fe6-68118b7fed04\" (UID: \"34748232-abe0-4c63-9fe6-68118b7fed04\") " Nov 27 16:23:21 crc kubenswrapper[4707]: I1127 16:23:21.676779 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/34748232-abe0-4c63-9fe6-68118b7fed04-ovsdbserver-nb\") pod 
\"34748232-abe0-4c63-9fe6-68118b7fed04\" (UID: \"34748232-abe0-4c63-9fe6-68118b7fed04\") " Nov 27 16:23:21 crc kubenswrapper[4707]: I1127 16:23:21.676817 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/34748232-abe0-4c63-9fe6-68118b7fed04-dns-swift-storage-0\") pod \"34748232-abe0-4c63-9fe6-68118b7fed04\" (UID: \"34748232-abe0-4c63-9fe6-68118b7fed04\") " Nov 27 16:23:21 crc kubenswrapper[4707]: I1127 16:23:21.676862 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/34748232-abe0-4c63-9fe6-68118b7fed04-ovsdbserver-sb\") pod \"34748232-abe0-4c63-9fe6-68118b7fed04\" (UID: \"34748232-abe0-4c63-9fe6-68118b7fed04\") " Nov 27 16:23:21 crc kubenswrapper[4707]: I1127 16:23:21.698601 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/34748232-abe0-4c63-9fe6-68118b7fed04-kube-api-access-7hbvd" (OuterVolumeSpecName: "kube-api-access-7hbvd") pod "34748232-abe0-4c63-9fe6-68118b7fed04" (UID: "34748232-abe0-4c63-9fe6-68118b7fed04"). InnerVolumeSpecName "kube-api-access-7hbvd". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 16:23:21 crc kubenswrapper[4707]: I1127 16:23:21.733140 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/34748232-abe0-4c63-9fe6-68118b7fed04-config" (OuterVolumeSpecName: "config") pod "34748232-abe0-4c63-9fe6-68118b7fed04" (UID: "34748232-abe0-4c63-9fe6-68118b7fed04"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 16:23:21 crc kubenswrapper[4707]: I1127 16:23:21.775531 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/34748232-abe0-4c63-9fe6-68118b7fed04-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "34748232-abe0-4c63-9fe6-68118b7fed04" (UID: "34748232-abe0-4c63-9fe6-68118b7fed04"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 16:23:21 crc kubenswrapper[4707]: I1127 16:23:21.779323 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7hbvd\" (UniqueName: \"kubernetes.io/projected/34748232-abe0-4c63-9fe6-68118b7fed04-kube-api-access-7hbvd\") on node \"crc\" DevicePath \"\"" Nov 27 16:23:21 crc kubenswrapper[4707]: I1127 16:23:21.779353 4707 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/34748232-abe0-4c63-9fe6-68118b7fed04-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Nov 27 16:23:21 crc kubenswrapper[4707]: I1127 16:23:21.779362 4707 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/34748232-abe0-4c63-9fe6-68118b7fed04-config\") on node \"crc\" DevicePath \"\"" Nov 27 16:23:21 crc kubenswrapper[4707]: I1127 16:23:21.783195 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/34748232-abe0-4c63-9fe6-68118b7fed04-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "34748232-abe0-4c63-9fe6-68118b7fed04" (UID: "34748232-abe0-4c63-9fe6-68118b7fed04"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 16:23:21 crc kubenswrapper[4707]: I1127 16:23:21.792846 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/34748232-abe0-4c63-9fe6-68118b7fed04-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "34748232-abe0-4c63-9fe6-68118b7fed04" (UID: "34748232-abe0-4c63-9fe6-68118b7fed04"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 16:23:21 crc kubenswrapper[4707]: I1127 16:23:21.804841 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/34748232-abe0-4c63-9fe6-68118b7fed04-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "34748232-abe0-4c63-9fe6-68118b7fed04" (UID: "34748232-abe0-4c63-9fe6-68118b7fed04"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 16:23:21 crc kubenswrapper[4707]: I1127 16:23:21.881320 4707 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/34748232-abe0-4c63-9fe6-68118b7fed04-dns-svc\") on node \"crc\" DevicePath \"\"" Nov 27 16:23:21 crc kubenswrapper[4707]: I1127 16:23:21.881359 4707 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/34748232-abe0-4c63-9fe6-68118b7fed04-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Nov 27 16:23:21 crc kubenswrapper[4707]: I1127 16:23:21.881375 4707 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/34748232-abe0-4c63-9fe6-68118b7fed04-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Nov 27 16:23:22 crc kubenswrapper[4707]: I1127 16:23:22.531630 4707 generic.go:334] "Generic (PLEG): container finished" podID="55ac6b2b-7b17-4ce7-9a82-1253cd1523df" containerID="f74dc715c9edd862d53c5fbd6c19ea4b15a0693812a608c970136f5addad4a2e" exitCode=0 Nov 27 
16:23:22 crc kubenswrapper[4707]: I1127 16:23:22.531662 4707 generic.go:334] "Generic (PLEG): container finished" podID="55ac6b2b-7b17-4ce7-9a82-1253cd1523df" containerID="ba6fca8fdba84e5bc25c3ae5bb2945fad9a8684ce9d8cff9e0a15b2324e700e8" exitCode=2 Nov 27 16:23:22 crc kubenswrapper[4707]: I1127 16:23:22.531670 4707 generic.go:334] "Generic (PLEG): container finished" podID="55ac6b2b-7b17-4ce7-9a82-1253cd1523df" containerID="e59cfe3cc9dbb07684b786fc8b9f05e67dc627c7b667e2784d74eff76f696e06" exitCode=0 Nov 27 16:23:22 crc kubenswrapper[4707]: I1127 16:23:22.531735 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-9b86998b5-rs6pv" Nov 27 16:23:22 crc kubenswrapper[4707]: I1127 16:23:22.531812 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"55ac6b2b-7b17-4ce7-9a82-1253cd1523df","Type":"ContainerDied","Data":"f74dc715c9edd862d53c5fbd6c19ea4b15a0693812a608c970136f5addad4a2e"} Nov 27 16:23:22 crc kubenswrapper[4707]: I1127 16:23:22.531870 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"55ac6b2b-7b17-4ce7-9a82-1253cd1523df","Type":"ContainerDied","Data":"ba6fca8fdba84e5bc25c3ae5bb2945fad9a8684ce9d8cff9e0a15b2324e700e8"} Nov 27 16:23:22 crc kubenswrapper[4707]: I1127 16:23:22.531892 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"55ac6b2b-7b17-4ce7-9a82-1253cd1523df","Type":"ContainerDied","Data":"e59cfe3cc9dbb07684b786fc8b9f05e67dc627c7b667e2784d74eff76f696e06"} Nov 27 16:23:22 crc kubenswrapper[4707]: I1127 16:23:22.565897 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-9b86998b5-rs6pv"] Nov 27 16:23:22 crc kubenswrapper[4707]: I1127 16:23:22.575375 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-9b86998b5-rs6pv"] Nov 27 16:23:23 crc kubenswrapper[4707]: I1127 16:23:23.213669 4707 kubelet_volumes.go:163] 
"Cleaned up orphaned pod volumes dir" podUID="34748232-abe0-4c63-9fe6-68118b7fed04" path="/var/lib/kubelet/pods/34748232-abe0-4c63-9fe6-68118b7fed04/volumes" Nov 27 16:23:23 crc kubenswrapper[4707]: I1127 16:23:23.545165 4707 generic.go:334] "Generic (PLEG): container finished" podID="28d07a58-b515-4c7b-a75e-ce5d89a9c9f5" containerID="dc692054023f378f46ac0708e97db7997e183cbc95de67455c38d5a3dd56bd73" exitCode=0 Nov 27 16:23:23 crc kubenswrapper[4707]: I1127 16:23:23.545278 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-crzc9" event={"ID":"28d07a58-b515-4c7b-a75e-ce5d89a9c9f5","Type":"ContainerDied","Data":"dc692054023f378f46ac0708e97db7997e183cbc95de67455c38d5a3dd56bd73"} Nov 27 16:23:24 crc kubenswrapper[4707]: I1127 16:23:24.561390 4707 generic.go:334] "Generic (PLEG): container finished" podID="55ac6b2b-7b17-4ce7-9a82-1253cd1523df" containerID="3260a1516b2d39f6339e628cac19f9e356aa36ae0cc8d1b10858c63af8c8c830" exitCode=0 Nov 27 16:23:24 crc kubenswrapper[4707]: I1127 16:23:24.561524 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"55ac6b2b-7b17-4ce7-9a82-1253cd1523df","Type":"ContainerDied","Data":"3260a1516b2d39f6339e628cac19f9e356aa36ae0cc8d1b10858c63af8c8c830"} Nov 27 16:23:24 crc kubenswrapper[4707]: I1127 16:23:24.699872 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Nov 27 16:23:24 crc kubenswrapper[4707]: I1127 16:23:24.745245 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tcjpp\" (UniqueName: \"kubernetes.io/projected/55ac6b2b-7b17-4ce7-9a82-1253cd1523df-kube-api-access-tcjpp\") pod \"55ac6b2b-7b17-4ce7-9a82-1253cd1523df\" (UID: \"55ac6b2b-7b17-4ce7-9a82-1253cd1523df\") " Nov 27 16:23:24 crc kubenswrapper[4707]: I1127 16:23:24.745408 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/55ac6b2b-7b17-4ce7-9a82-1253cd1523df-ceilometer-tls-certs\") pod \"55ac6b2b-7b17-4ce7-9a82-1253cd1523df\" (UID: \"55ac6b2b-7b17-4ce7-9a82-1253cd1523df\") " Nov 27 16:23:24 crc kubenswrapper[4707]: I1127 16:23:24.745496 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/55ac6b2b-7b17-4ce7-9a82-1253cd1523df-log-httpd\") pod \"55ac6b2b-7b17-4ce7-9a82-1253cd1523df\" (UID: \"55ac6b2b-7b17-4ce7-9a82-1253cd1523df\") " Nov 27 16:23:24 crc kubenswrapper[4707]: I1127 16:23:24.745554 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/55ac6b2b-7b17-4ce7-9a82-1253cd1523df-scripts\") pod \"55ac6b2b-7b17-4ce7-9a82-1253cd1523df\" (UID: \"55ac6b2b-7b17-4ce7-9a82-1253cd1523df\") " Nov 27 16:23:24 crc kubenswrapper[4707]: I1127 16:23:24.745588 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/55ac6b2b-7b17-4ce7-9a82-1253cd1523df-run-httpd\") pod \"55ac6b2b-7b17-4ce7-9a82-1253cd1523df\" (UID: \"55ac6b2b-7b17-4ce7-9a82-1253cd1523df\") " Nov 27 16:23:24 crc kubenswrapper[4707]: I1127 16:23:24.745693 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/55ac6b2b-7b17-4ce7-9a82-1253cd1523df-combined-ca-bundle\") pod \"55ac6b2b-7b17-4ce7-9a82-1253cd1523df\" (UID: \"55ac6b2b-7b17-4ce7-9a82-1253cd1523df\") " Nov 27 16:23:24 crc kubenswrapper[4707]: I1127 16:23:24.745756 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/55ac6b2b-7b17-4ce7-9a82-1253cd1523df-sg-core-conf-yaml\") pod \"55ac6b2b-7b17-4ce7-9a82-1253cd1523df\" (UID: \"55ac6b2b-7b17-4ce7-9a82-1253cd1523df\") " Nov 27 16:23:24 crc kubenswrapper[4707]: I1127 16:23:24.745842 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/55ac6b2b-7b17-4ce7-9a82-1253cd1523df-config-data\") pod \"55ac6b2b-7b17-4ce7-9a82-1253cd1523df\" (UID: \"55ac6b2b-7b17-4ce7-9a82-1253cd1523df\") " Nov 27 16:23:24 crc kubenswrapper[4707]: I1127 16:23:24.746193 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/55ac6b2b-7b17-4ce7-9a82-1253cd1523df-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "55ac6b2b-7b17-4ce7-9a82-1253cd1523df" (UID: "55ac6b2b-7b17-4ce7-9a82-1253cd1523df"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 27 16:23:24 crc kubenswrapper[4707]: I1127 16:23:24.746420 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/55ac6b2b-7b17-4ce7-9a82-1253cd1523df-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "55ac6b2b-7b17-4ce7-9a82-1253cd1523df" (UID: "55ac6b2b-7b17-4ce7-9a82-1253cd1523df"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 27 16:23:24 crc kubenswrapper[4707]: I1127 16:23:24.747242 4707 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/55ac6b2b-7b17-4ce7-9a82-1253cd1523df-log-httpd\") on node \"crc\" DevicePath \"\"" Nov 27 16:23:24 crc kubenswrapper[4707]: I1127 16:23:24.747279 4707 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/55ac6b2b-7b17-4ce7-9a82-1253cd1523df-run-httpd\") on node \"crc\" DevicePath \"\"" Nov 27 16:23:24 crc kubenswrapper[4707]: I1127 16:23:24.756876 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/55ac6b2b-7b17-4ce7-9a82-1253cd1523df-scripts" (OuterVolumeSpecName: "scripts") pod "55ac6b2b-7b17-4ce7-9a82-1253cd1523df" (UID: "55ac6b2b-7b17-4ce7-9a82-1253cd1523df"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 16:23:24 crc kubenswrapper[4707]: I1127 16:23:24.757145 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/55ac6b2b-7b17-4ce7-9a82-1253cd1523df-kube-api-access-tcjpp" (OuterVolumeSpecName: "kube-api-access-tcjpp") pod "55ac6b2b-7b17-4ce7-9a82-1253cd1523df" (UID: "55ac6b2b-7b17-4ce7-9a82-1253cd1523df"). InnerVolumeSpecName "kube-api-access-tcjpp". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 16:23:24 crc kubenswrapper[4707]: I1127 16:23:24.773169 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/55ac6b2b-7b17-4ce7-9a82-1253cd1523df-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "55ac6b2b-7b17-4ce7-9a82-1253cd1523df" (UID: "55ac6b2b-7b17-4ce7-9a82-1253cd1523df"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 16:23:24 crc kubenswrapper[4707]: I1127 16:23:24.817047 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/55ac6b2b-7b17-4ce7-9a82-1253cd1523df-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "55ac6b2b-7b17-4ce7-9a82-1253cd1523df" (UID: "55ac6b2b-7b17-4ce7-9a82-1253cd1523df"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 16:23:24 crc kubenswrapper[4707]: I1127 16:23:24.851003 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tcjpp\" (UniqueName: \"kubernetes.io/projected/55ac6b2b-7b17-4ce7-9a82-1253cd1523df-kube-api-access-tcjpp\") on node \"crc\" DevicePath \"\"" Nov 27 16:23:24 crc kubenswrapper[4707]: I1127 16:23:24.851051 4707 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/55ac6b2b-7b17-4ce7-9a82-1253cd1523df-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Nov 27 16:23:24 crc kubenswrapper[4707]: I1127 16:23:24.851076 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/55ac6b2b-7b17-4ce7-9a82-1253cd1523df-scripts\") on node \"crc\" DevicePath \"\"" Nov 27 16:23:24 crc kubenswrapper[4707]: I1127 16:23:24.851085 4707 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/55ac6b2b-7b17-4ce7-9a82-1253cd1523df-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Nov 27 16:23:24 crc kubenswrapper[4707]: I1127 16:23:24.858177 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/55ac6b2b-7b17-4ce7-9a82-1253cd1523df-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "55ac6b2b-7b17-4ce7-9a82-1253cd1523df" (UID: "55ac6b2b-7b17-4ce7-9a82-1253cd1523df"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 16:23:24 crc kubenswrapper[4707]: I1127 16:23:24.871857 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/55ac6b2b-7b17-4ce7-9a82-1253cd1523df-config-data" (OuterVolumeSpecName: "config-data") pod "55ac6b2b-7b17-4ce7-9a82-1253cd1523df" (UID: "55ac6b2b-7b17-4ce7-9a82-1253cd1523df"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 16:23:24 crc kubenswrapper[4707]: I1127 16:23:24.923270 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-crzc9" Nov 27 16:23:24 crc kubenswrapper[4707]: I1127 16:23:24.951875 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/28d07a58-b515-4c7b-a75e-ce5d89a9c9f5-combined-ca-bundle\") pod \"28d07a58-b515-4c7b-a75e-ce5d89a9c9f5\" (UID: \"28d07a58-b515-4c7b-a75e-ce5d89a9c9f5\") " Nov 27 16:23:24 crc kubenswrapper[4707]: I1127 16:23:24.951951 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/28d07a58-b515-4c7b-a75e-ce5d89a9c9f5-config-data\") pod \"28d07a58-b515-4c7b-a75e-ce5d89a9c9f5\" (UID: \"28d07a58-b515-4c7b-a75e-ce5d89a9c9f5\") " Nov 27 16:23:24 crc kubenswrapper[4707]: I1127 16:23:24.952035 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/28d07a58-b515-4c7b-a75e-ce5d89a9c9f5-scripts\") pod \"28d07a58-b515-4c7b-a75e-ce5d89a9c9f5\" (UID: \"28d07a58-b515-4c7b-a75e-ce5d89a9c9f5\") " Nov 27 16:23:24 crc kubenswrapper[4707]: I1127 16:23:24.952071 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6xwvt\" (UniqueName: \"kubernetes.io/projected/28d07a58-b515-4c7b-a75e-ce5d89a9c9f5-kube-api-access-6xwvt\") pod 
\"28d07a58-b515-4c7b-a75e-ce5d89a9c9f5\" (UID: \"28d07a58-b515-4c7b-a75e-ce5d89a9c9f5\") " Nov 27 16:23:24 crc kubenswrapper[4707]: I1127 16:23:24.952502 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/55ac6b2b-7b17-4ce7-9a82-1253cd1523df-config-data\") on node \"crc\" DevicePath \"\"" Nov 27 16:23:24 crc kubenswrapper[4707]: I1127 16:23:24.952521 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/55ac6b2b-7b17-4ce7-9a82-1253cd1523df-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 27 16:23:24 crc kubenswrapper[4707]: I1127 16:23:24.955709 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/28d07a58-b515-4c7b-a75e-ce5d89a9c9f5-scripts" (OuterVolumeSpecName: "scripts") pod "28d07a58-b515-4c7b-a75e-ce5d89a9c9f5" (UID: "28d07a58-b515-4c7b-a75e-ce5d89a9c9f5"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 16:23:24 crc kubenswrapper[4707]: I1127 16:23:24.955889 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/28d07a58-b515-4c7b-a75e-ce5d89a9c9f5-kube-api-access-6xwvt" (OuterVolumeSpecName: "kube-api-access-6xwvt") pod "28d07a58-b515-4c7b-a75e-ce5d89a9c9f5" (UID: "28d07a58-b515-4c7b-a75e-ce5d89a9c9f5"). InnerVolumeSpecName "kube-api-access-6xwvt". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 16:23:24 crc kubenswrapper[4707]: I1127 16:23:24.976317 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/28d07a58-b515-4c7b-a75e-ce5d89a9c9f5-config-data" (OuterVolumeSpecName: "config-data") pod "28d07a58-b515-4c7b-a75e-ce5d89a9c9f5" (UID: "28d07a58-b515-4c7b-a75e-ce5d89a9c9f5"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 16:23:24 crc kubenswrapper[4707]: I1127 16:23:24.980857 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/28d07a58-b515-4c7b-a75e-ce5d89a9c9f5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "28d07a58-b515-4c7b-a75e-ce5d89a9c9f5" (UID: "28d07a58-b515-4c7b-a75e-ce5d89a9c9f5"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 16:23:25 crc kubenswrapper[4707]: I1127 16:23:25.053375 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/28d07a58-b515-4c7b-a75e-ce5d89a9c9f5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 27 16:23:25 crc kubenswrapper[4707]: I1127 16:23:25.053422 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/28d07a58-b515-4c7b-a75e-ce5d89a9c9f5-config-data\") on node \"crc\" DevicePath \"\"" Nov 27 16:23:25 crc kubenswrapper[4707]: I1127 16:23:25.053431 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/28d07a58-b515-4c7b-a75e-ce5d89a9c9f5-scripts\") on node \"crc\" DevicePath \"\"" Nov 27 16:23:25 crc kubenswrapper[4707]: I1127 16:23:25.053441 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6xwvt\" (UniqueName: \"kubernetes.io/projected/28d07a58-b515-4c7b-a75e-ce5d89a9c9f5-kube-api-access-6xwvt\") on node \"crc\" DevicePath \"\"" Nov 27 16:23:25 crc kubenswrapper[4707]: I1127 16:23:25.571593 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"55ac6b2b-7b17-4ce7-9a82-1253cd1523df","Type":"ContainerDied","Data":"fc6e2c4a906333f2ec18bdcd931381208be3e10a2a60586783dcaa76a59fef93"} Nov 27 16:23:25 crc kubenswrapper[4707]: I1127 16:23:25.571635 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Nov 27 16:23:25 crc kubenswrapper[4707]: I1127 16:23:25.571870 4707 scope.go:117] "RemoveContainer" containerID="f74dc715c9edd862d53c5fbd6c19ea4b15a0693812a608c970136f5addad4a2e" Nov 27 16:23:25 crc kubenswrapper[4707]: I1127 16:23:25.577875 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-crzc9" event={"ID":"28d07a58-b515-4c7b-a75e-ce5d89a9c9f5","Type":"ContainerDied","Data":"df914b4496e73a689cb91efe22c60f009af0ce03d2c01de74e84d1b1ea9aaf2d"} Nov 27 16:23:25 crc kubenswrapper[4707]: I1127 16:23:25.577912 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="df914b4496e73a689cb91efe22c60f009af0ce03d2c01de74e84d1b1ea9aaf2d" Nov 27 16:23:25 crc kubenswrapper[4707]: I1127 16:23:25.578079 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-crzc9" Nov 27 16:23:25 crc kubenswrapper[4707]: I1127 16:23:25.600671 4707 scope.go:117] "RemoveContainer" containerID="ba6fca8fdba84e5bc25c3ae5bb2945fad9a8684ce9d8cff9e0a15b2324e700e8" Nov 27 16:23:25 crc kubenswrapper[4707]: I1127 16:23:25.635721 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Nov 27 16:23:25 crc kubenswrapper[4707]: I1127 16:23:25.637479 4707 scope.go:117] "RemoveContainer" containerID="e59cfe3cc9dbb07684b786fc8b9f05e67dc627c7b667e2784d74eff76f696e06" Nov 27 16:23:25 crc kubenswrapper[4707]: I1127 16:23:25.644033 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Nov 27 16:23:25 crc kubenswrapper[4707]: I1127 16:23:25.657541 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Nov 27 16:23:25 crc kubenswrapper[4707]: E1127 16:23:25.657969 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="55ac6b2b-7b17-4ce7-9a82-1253cd1523df" containerName="sg-core" Nov 27 16:23:25 crc kubenswrapper[4707]: I1127 
16:23:25.657985 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="55ac6b2b-7b17-4ce7-9a82-1253cd1523df" containerName="sg-core" Nov 27 16:23:25 crc kubenswrapper[4707]: E1127 16:23:25.658000 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="28d07a58-b515-4c7b-a75e-ce5d89a9c9f5" containerName="nova-manage" Nov 27 16:23:25 crc kubenswrapper[4707]: I1127 16:23:25.658009 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="28d07a58-b515-4c7b-a75e-ce5d89a9c9f5" containerName="nova-manage" Nov 27 16:23:25 crc kubenswrapper[4707]: E1127 16:23:25.658052 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="34748232-abe0-4c63-9fe6-68118b7fed04" containerName="dnsmasq-dns" Nov 27 16:23:25 crc kubenswrapper[4707]: I1127 16:23:25.658060 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="34748232-abe0-4c63-9fe6-68118b7fed04" containerName="dnsmasq-dns" Nov 27 16:23:25 crc kubenswrapper[4707]: E1127 16:23:25.658074 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="55ac6b2b-7b17-4ce7-9a82-1253cd1523df" containerName="ceilometer-notification-agent" Nov 27 16:23:25 crc kubenswrapper[4707]: I1127 16:23:25.658080 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="55ac6b2b-7b17-4ce7-9a82-1253cd1523df" containerName="ceilometer-notification-agent" Nov 27 16:23:25 crc kubenswrapper[4707]: E1127 16:23:25.658092 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="34748232-abe0-4c63-9fe6-68118b7fed04" containerName="init" Nov 27 16:23:25 crc kubenswrapper[4707]: I1127 16:23:25.658098 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="34748232-abe0-4c63-9fe6-68118b7fed04" containerName="init" Nov 27 16:23:25 crc kubenswrapper[4707]: E1127 16:23:25.658114 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="55ac6b2b-7b17-4ce7-9a82-1253cd1523df" containerName="ceilometer-central-agent" Nov 27 16:23:25 crc kubenswrapper[4707]: I1127 16:23:25.658119 4707 
state_mem.go:107] "Deleted CPUSet assignment" podUID="55ac6b2b-7b17-4ce7-9a82-1253cd1523df" containerName="ceilometer-central-agent" Nov 27 16:23:25 crc kubenswrapper[4707]: E1127 16:23:25.658133 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="55ac6b2b-7b17-4ce7-9a82-1253cd1523df" containerName="proxy-httpd" Nov 27 16:23:25 crc kubenswrapper[4707]: I1127 16:23:25.658139 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="55ac6b2b-7b17-4ce7-9a82-1253cd1523df" containerName="proxy-httpd" Nov 27 16:23:25 crc kubenswrapper[4707]: I1127 16:23:25.658302 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="55ac6b2b-7b17-4ce7-9a82-1253cd1523df" containerName="sg-core" Nov 27 16:23:25 crc kubenswrapper[4707]: I1127 16:23:25.658317 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="34748232-abe0-4c63-9fe6-68118b7fed04" containerName="dnsmasq-dns" Nov 27 16:23:25 crc kubenswrapper[4707]: I1127 16:23:25.658332 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="55ac6b2b-7b17-4ce7-9a82-1253cd1523df" containerName="proxy-httpd" Nov 27 16:23:25 crc kubenswrapper[4707]: I1127 16:23:25.658347 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="55ac6b2b-7b17-4ce7-9a82-1253cd1523df" containerName="ceilometer-notification-agent" Nov 27 16:23:25 crc kubenswrapper[4707]: I1127 16:23:25.658359 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="28d07a58-b515-4c7b-a75e-ce5d89a9c9f5" containerName="nova-manage" Nov 27 16:23:25 crc kubenswrapper[4707]: I1127 16:23:25.658372 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="55ac6b2b-7b17-4ce7-9a82-1253cd1523df" containerName="ceilometer-central-agent" Nov 27 16:23:25 crc kubenswrapper[4707]: I1127 16:23:25.660095 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Nov 27 16:23:25 crc kubenswrapper[4707]: I1127 16:23:25.663743 4707 scope.go:117] "RemoveContainer" containerID="3260a1516b2d39f6339e628cac19f9e356aa36ae0cc8d1b10858c63af8c8c830" Nov 27 16:23:25 crc kubenswrapper[4707]: I1127 16:23:25.664039 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Nov 27 16:23:25 crc kubenswrapper[4707]: I1127 16:23:25.664211 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Nov 27 16:23:25 crc kubenswrapper[4707]: I1127 16:23:25.664326 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Nov 27 16:23:25 crc kubenswrapper[4707]: I1127 16:23:25.672497 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Nov 27 16:23:25 crc kubenswrapper[4707]: I1127 16:23:25.730337 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Nov 27 16:23:25 crc kubenswrapper[4707]: I1127 16:23:25.730585 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="a431ee7a-2703-4dea-bba3-2bbdf489e218" containerName="nova-scheduler-scheduler" containerID="cri-o://196432ce75d91f6078033019944cd5cc7c22ba2600ee9338c8e07f65f6237ca8" gracePeriod=30 Nov 27 16:23:25 crc kubenswrapper[4707]: I1127 16:23:25.740417 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Nov 27 16:23:25 crc kubenswrapper[4707]: I1127 16:23:25.740666 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="c67f74c7-c75f-4370-980d-c7a9fa5263a0" containerName="nova-api-log" containerID="cri-o://df2f9874f7a9767a97ccc647e045387267e621a409db29dfa534e2eba70ee0ee" gracePeriod=30 Nov 27 16:23:25 crc kubenswrapper[4707]: I1127 16:23:25.740806 4707 kuberuntime_container.go:808] "Killing 
container with a grace period" pod="openstack/nova-api-0" podUID="c67f74c7-c75f-4370-980d-c7a9fa5263a0" containerName="nova-api-api" containerID="cri-o://52f19ee1283e22c6e4a0af879d164b933b757202bfd96e313b8a164df908264d" gracePeriod=30 Nov 27 16:23:25 crc kubenswrapper[4707]: I1127 16:23:25.752315 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Nov 27 16:23:25 crc kubenswrapper[4707]: I1127 16:23:25.752575 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="cbc32b0f-9bae-4aed-a868-6b06465b570f" containerName="nova-metadata-log" containerID="cri-o://ee5f9d9897f7baa3856bc1b61fe57c15b00c56dd798cf39127b8baa5a6cfcf16" gracePeriod=30 Nov 27 16:23:25 crc kubenswrapper[4707]: I1127 16:23:25.752666 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="cbc32b0f-9bae-4aed-a868-6b06465b570f" containerName="nova-metadata-metadata" containerID="cri-o://238d8c5feee8c378354af7a1316f4d24c030ec1710cac648345decf354baf627" gracePeriod=30 Nov 27 16:23:25 crc kubenswrapper[4707]: I1127 16:23:25.765713 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2aeefda1-503b-4869-b318-d9ebfa19337c-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"2aeefda1-503b-4869-b318-d9ebfa19337c\") " pod="openstack/ceilometer-0" Nov 27 16:23:25 crc kubenswrapper[4707]: I1127 16:23:25.765762 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2aeefda1-503b-4869-b318-d9ebfa19337c-config-data\") pod \"ceilometer-0\" (UID: \"2aeefda1-503b-4869-b318-d9ebfa19337c\") " pod="openstack/ceilometer-0" Nov 27 16:23:25 crc kubenswrapper[4707]: I1127 16:23:25.765804 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2aeefda1-503b-4869-b318-d9ebfa19337c-log-httpd\") pod \"ceilometer-0\" (UID: \"2aeefda1-503b-4869-b318-d9ebfa19337c\") " pod="openstack/ceilometer-0" Nov 27 16:23:25 crc kubenswrapper[4707]: I1127 16:23:25.765853 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cbgm5\" (UniqueName: \"kubernetes.io/projected/2aeefda1-503b-4869-b318-d9ebfa19337c-kube-api-access-cbgm5\") pod \"ceilometer-0\" (UID: \"2aeefda1-503b-4869-b318-d9ebfa19337c\") " pod="openstack/ceilometer-0" Nov 27 16:23:25 crc kubenswrapper[4707]: I1127 16:23:25.765875 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/2aeefda1-503b-4869-b318-d9ebfa19337c-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"2aeefda1-503b-4869-b318-d9ebfa19337c\") " pod="openstack/ceilometer-0" Nov 27 16:23:25 crc kubenswrapper[4707]: I1127 16:23:25.765909 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2aeefda1-503b-4869-b318-d9ebfa19337c-run-httpd\") pod \"ceilometer-0\" (UID: \"2aeefda1-503b-4869-b318-d9ebfa19337c\") " pod="openstack/ceilometer-0" Nov 27 16:23:25 crc kubenswrapper[4707]: I1127 16:23:25.765928 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2aeefda1-503b-4869-b318-d9ebfa19337c-scripts\") pod \"ceilometer-0\" (UID: \"2aeefda1-503b-4869-b318-d9ebfa19337c\") " pod="openstack/ceilometer-0" Nov 27 16:23:25 crc kubenswrapper[4707]: I1127 16:23:25.765944 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2aeefda1-503b-4869-b318-d9ebfa19337c-combined-ca-bundle\") pod 
\"ceilometer-0\" (UID: \"2aeefda1-503b-4869-b318-d9ebfa19337c\") " pod="openstack/ceilometer-0" Nov 27 16:23:25 crc kubenswrapper[4707]: I1127 16:23:25.867315 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2aeefda1-503b-4869-b318-d9ebfa19337c-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"2aeefda1-503b-4869-b318-d9ebfa19337c\") " pod="openstack/ceilometer-0" Nov 27 16:23:25 crc kubenswrapper[4707]: I1127 16:23:25.867356 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2aeefda1-503b-4869-b318-d9ebfa19337c-config-data\") pod \"ceilometer-0\" (UID: \"2aeefda1-503b-4869-b318-d9ebfa19337c\") " pod="openstack/ceilometer-0" Nov 27 16:23:25 crc kubenswrapper[4707]: I1127 16:23:25.867411 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2aeefda1-503b-4869-b318-d9ebfa19337c-log-httpd\") pod \"ceilometer-0\" (UID: \"2aeefda1-503b-4869-b318-d9ebfa19337c\") " pod="openstack/ceilometer-0" Nov 27 16:23:25 crc kubenswrapper[4707]: I1127 16:23:25.867457 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cbgm5\" (UniqueName: \"kubernetes.io/projected/2aeefda1-503b-4869-b318-d9ebfa19337c-kube-api-access-cbgm5\") pod \"ceilometer-0\" (UID: \"2aeefda1-503b-4869-b318-d9ebfa19337c\") " pod="openstack/ceilometer-0" Nov 27 16:23:25 crc kubenswrapper[4707]: I1127 16:23:25.867481 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/2aeefda1-503b-4869-b318-d9ebfa19337c-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"2aeefda1-503b-4869-b318-d9ebfa19337c\") " pod="openstack/ceilometer-0" Nov 27 16:23:25 crc kubenswrapper[4707]: I1127 16:23:25.867513 4707 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2aeefda1-503b-4869-b318-d9ebfa19337c-run-httpd\") pod \"ceilometer-0\" (UID: \"2aeefda1-503b-4869-b318-d9ebfa19337c\") " pod="openstack/ceilometer-0" Nov 27 16:23:25 crc kubenswrapper[4707]: I1127 16:23:25.867534 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2aeefda1-503b-4869-b318-d9ebfa19337c-scripts\") pod \"ceilometer-0\" (UID: \"2aeefda1-503b-4869-b318-d9ebfa19337c\") " pod="openstack/ceilometer-0" Nov 27 16:23:25 crc kubenswrapper[4707]: I1127 16:23:25.867548 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2aeefda1-503b-4869-b318-d9ebfa19337c-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"2aeefda1-503b-4869-b318-d9ebfa19337c\") " pod="openstack/ceilometer-0" Nov 27 16:23:25 crc kubenswrapper[4707]: I1127 16:23:25.868272 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2aeefda1-503b-4869-b318-d9ebfa19337c-run-httpd\") pod \"ceilometer-0\" (UID: \"2aeefda1-503b-4869-b318-d9ebfa19337c\") " pod="openstack/ceilometer-0" Nov 27 16:23:25 crc kubenswrapper[4707]: I1127 16:23:25.868356 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2aeefda1-503b-4869-b318-d9ebfa19337c-log-httpd\") pod \"ceilometer-0\" (UID: \"2aeefda1-503b-4869-b318-d9ebfa19337c\") " pod="openstack/ceilometer-0" Nov 27 16:23:25 crc kubenswrapper[4707]: I1127 16:23:25.872068 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2aeefda1-503b-4869-b318-d9ebfa19337c-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"2aeefda1-503b-4869-b318-d9ebfa19337c\") " pod="openstack/ceilometer-0" Nov 27 
16:23:25 crc kubenswrapper[4707]: I1127 16:23:25.874033 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2aeefda1-503b-4869-b318-d9ebfa19337c-scripts\") pod \"ceilometer-0\" (UID: \"2aeefda1-503b-4869-b318-d9ebfa19337c\") " pod="openstack/ceilometer-0" Nov 27 16:23:25 crc kubenswrapper[4707]: I1127 16:23:25.874492 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2aeefda1-503b-4869-b318-d9ebfa19337c-config-data\") pod \"ceilometer-0\" (UID: \"2aeefda1-503b-4869-b318-d9ebfa19337c\") " pod="openstack/ceilometer-0" Nov 27 16:23:25 crc kubenswrapper[4707]: I1127 16:23:25.874884 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2aeefda1-503b-4869-b318-d9ebfa19337c-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"2aeefda1-503b-4869-b318-d9ebfa19337c\") " pod="openstack/ceilometer-0" Nov 27 16:23:25 crc kubenswrapper[4707]: I1127 16:23:25.875204 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/2aeefda1-503b-4869-b318-d9ebfa19337c-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"2aeefda1-503b-4869-b318-d9ebfa19337c\") " pod="openstack/ceilometer-0" Nov 27 16:23:25 crc kubenswrapper[4707]: I1127 16:23:25.884534 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cbgm5\" (UniqueName: \"kubernetes.io/projected/2aeefda1-503b-4869-b318-d9ebfa19337c-kube-api-access-cbgm5\") pod \"ceilometer-0\" (UID: \"2aeefda1-503b-4869-b318-d9ebfa19337c\") " pod="openstack/ceilometer-0" Nov 27 16:23:25 crc kubenswrapper[4707]: I1127 16:23:25.990348 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Nov 27 16:23:26 crc kubenswrapper[4707]: I1127 16:23:26.431797 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Nov 27 16:23:26 crc kubenswrapper[4707]: I1127 16:23:26.472887 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Nov 27 16:23:26 crc kubenswrapper[4707]: W1127 16:23:26.473791 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2aeefda1_503b_4869_b318_d9ebfa19337c.slice/crio-15c526768386dd29c83ab3149e0a28089add685e19268ec22077d2a4408a93db WatchSource:0}: Error finding container 15c526768386dd29c83ab3149e0a28089add685e19268ec22077d2a4408a93db: Status 404 returned error can't find the container with id 15c526768386dd29c83ab3149e0a28089add685e19268ec22077d2a4408a93db Nov 27 16:23:26 crc kubenswrapper[4707]: I1127 16:23:26.489330 4707 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-9b86998b5-rs6pv" podUID="34748232-abe0-4c63-9fe6-68118b7fed04" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.194:5353: i/o timeout" Nov 27 16:23:26 crc kubenswrapper[4707]: I1127 16:23:26.584634 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c67f74c7-c75f-4370-980d-c7a9fa5263a0-combined-ca-bundle\") pod \"c67f74c7-c75f-4370-980d-c7a9fa5263a0\" (UID: \"c67f74c7-c75f-4370-980d-c7a9fa5263a0\") " Nov 27 16:23:26 crc kubenswrapper[4707]: I1127 16:23:26.584677 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c67f74c7-c75f-4370-980d-c7a9fa5263a0-internal-tls-certs\") pod \"c67f74c7-c75f-4370-980d-c7a9fa5263a0\" (UID: \"c67f74c7-c75f-4370-980d-c7a9fa5263a0\") " Nov 27 16:23:26 crc kubenswrapper[4707]: I1127 
16:23:26.584750 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jc8kc\" (UniqueName: \"kubernetes.io/projected/c67f74c7-c75f-4370-980d-c7a9fa5263a0-kube-api-access-jc8kc\") pod \"c67f74c7-c75f-4370-980d-c7a9fa5263a0\" (UID: \"c67f74c7-c75f-4370-980d-c7a9fa5263a0\") " Nov 27 16:23:26 crc kubenswrapper[4707]: I1127 16:23:26.584787 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c67f74c7-c75f-4370-980d-c7a9fa5263a0-config-data\") pod \"c67f74c7-c75f-4370-980d-c7a9fa5263a0\" (UID: \"c67f74c7-c75f-4370-980d-c7a9fa5263a0\") " Nov 27 16:23:26 crc kubenswrapper[4707]: I1127 16:23:26.584810 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c67f74c7-c75f-4370-980d-c7a9fa5263a0-public-tls-certs\") pod \"c67f74c7-c75f-4370-980d-c7a9fa5263a0\" (UID: \"c67f74c7-c75f-4370-980d-c7a9fa5263a0\") " Nov 27 16:23:26 crc kubenswrapper[4707]: I1127 16:23:26.584870 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c67f74c7-c75f-4370-980d-c7a9fa5263a0-logs\") pod \"c67f74c7-c75f-4370-980d-c7a9fa5263a0\" (UID: \"c67f74c7-c75f-4370-980d-c7a9fa5263a0\") " Nov 27 16:23:26 crc kubenswrapper[4707]: I1127 16:23:26.586129 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c67f74c7-c75f-4370-980d-c7a9fa5263a0-logs" (OuterVolumeSpecName: "logs") pod "c67f74c7-c75f-4370-980d-c7a9fa5263a0" (UID: "c67f74c7-c75f-4370-980d-c7a9fa5263a0"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 27 16:23:26 crc kubenswrapper[4707]: I1127 16:23:26.590524 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c67f74c7-c75f-4370-980d-c7a9fa5263a0-kube-api-access-jc8kc" (OuterVolumeSpecName: "kube-api-access-jc8kc") pod "c67f74c7-c75f-4370-980d-c7a9fa5263a0" (UID: "c67f74c7-c75f-4370-980d-c7a9fa5263a0"). InnerVolumeSpecName "kube-api-access-jc8kc". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 16:23:26 crc kubenswrapper[4707]: I1127 16:23:26.593361 4707 generic.go:334] "Generic (PLEG): container finished" podID="c67f74c7-c75f-4370-980d-c7a9fa5263a0" containerID="52f19ee1283e22c6e4a0af879d164b933b757202bfd96e313b8a164df908264d" exitCode=0 Nov 27 16:23:26 crc kubenswrapper[4707]: I1127 16:23:26.593430 4707 generic.go:334] "Generic (PLEG): container finished" podID="c67f74c7-c75f-4370-980d-c7a9fa5263a0" containerID="df2f9874f7a9767a97ccc647e045387267e621a409db29dfa534e2eba70ee0ee" exitCode=143 Nov 27 16:23:26 crc kubenswrapper[4707]: I1127 16:23:26.593434 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Nov 27 16:23:26 crc kubenswrapper[4707]: I1127 16:23:26.593489 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"c67f74c7-c75f-4370-980d-c7a9fa5263a0","Type":"ContainerDied","Data":"52f19ee1283e22c6e4a0af879d164b933b757202bfd96e313b8a164df908264d"} Nov 27 16:23:26 crc kubenswrapper[4707]: I1127 16:23:26.593519 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"c67f74c7-c75f-4370-980d-c7a9fa5263a0","Type":"ContainerDied","Data":"df2f9874f7a9767a97ccc647e045387267e621a409db29dfa534e2eba70ee0ee"} Nov 27 16:23:26 crc kubenswrapper[4707]: I1127 16:23:26.594211 4707 scope.go:117] "RemoveContainer" containerID="52f19ee1283e22c6e4a0af879d164b933b757202bfd96e313b8a164df908264d" Nov 27 16:23:26 crc kubenswrapper[4707]: I1127 16:23:26.598303 4707 generic.go:334] "Generic (PLEG): container finished" podID="cbc32b0f-9bae-4aed-a868-6b06465b570f" containerID="ee5f9d9897f7baa3856bc1b61fe57c15b00c56dd798cf39127b8baa5a6cfcf16" exitCode=143 Nov 27 16:23:26 crc kubenswrapper[4707]: I1127 16:23:26.600731 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"c67f74c7-c75f-4370-980d-c7a9fa5263a0","Type":"ContainerDied","Data":"c74dc345e8a50e22f5b1b9ac17c8faa18550c9e0f3f0b83c2be2f215f68f4b11"} Nov 27 16:23:26 crc kubenswrapper[4707]: I1127 16:23:26.600808 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2aeefda1-503b-4869-b318-d9ebfa19337c","Type":"ContainerStarted","Data":"15c526768386dd29c83ab3149e0a28089add685e19268ec22077d2a4408a93db"} Nov 27 16:23:26 crc kubenswrapper[4707]: I1127 16:23:26.600825 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"cbc32b0f-9bae-4aed-a868-6b06465b570f","Type":"ContainerDied","Data":"ee5f9d9897f7baa3856bc1b61fe57c15b00c56dd798cf39127b8baa5a6cfcf16"} Nov 27 16:23:26 crc 
kubenswrapper[4707]: I1127 16:23:26.617494 4707 generic.go:334] "Generic (PLEG): container finished" podID="a431ee7a-2703-4dea-bba3-2bbdf489e218" containerID="196432ce75d91f6078033019944cd5cc7c22ba2600ee9338c8e07f65f6237ca8" exitCode=0 Nov 27 16:23:26 crc kubenswrapper[4707]: I1127 16:23:26.617535 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"a431ee7a-2703-4dea-bba3-2bbdf489e218","Type":"ContainerDied","Data":"196432ce75d91f6078033019944cd5cc7c22ba2600ee9338c8e07f65f6237ca8"} Nov 27 16:23:26 crc kubenswrapper[4707]: I1127 16:23:26.648235 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c67f74c7-c75f-4370-980d-c7a9fa5263a0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c67f74c7-c75f-4370-980d-c7a9fa5263a0" (UID: "c67f74c7-c75f-4370-980d-c7a9fa5263a0"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 16:23:26 crc kubenswrapper[4707]: I1127 16:23:26.663061 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c67f74c7-c75f-4370-980d-c7a9fa5263a0-config-data" (OuterVolumeSpecName: "config-data") pod "c67f74c7-c75f-4370-980d-c7a9fa5263a0" (UID: "c67f74c7-c75f-4370-980d-c7a9fa5263a0"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 16:23:26 crc kubenswrapper[4707]: I1127 16:23:26.671945 4707 scope.go:117] "RemoveContainer" containerID="df2f9874f7a9767a97ccc647e045387267e621a409db29dfa534e2eba70ee0ee" Nov 27 16:23:26 crc kubenswrapper[4707]: I1127 16:23:26.672809 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Nov 27 16:23:26 crc kubenswrapper[4707]: I1127 16:23:26.679284 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c67f74c7-c75f-4370-980d-c7a9fa5263a0-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "c67f74c7-c75f-4370-980d-c7a9fa5263a0" (UID: "c67f74c7-c75f-4370-980d-c7a9fa5263a0"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 16:23:26 crc kubenswrapper[4707]: I1127 16:23:26.690275 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c67f74c7-c75f-4370-980d-c7a9fa5263a0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 27 16:23:26 crc kubenswrapper[4707]: I1127 16:23:26.690300 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jc8kc\" (UniqueName: \"kubernetes.io/projected/c67f74c7-c75f-4370-980d-c7a9fa5263a0-kube-api-access-jc8kc\") on node \"crc\" DevicePath \"\"" Nov 27 16:23:26 crc kubenswrapper[4707]: I1127 16:23:26.690311 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c67f74c7-c75f-4370-980d-c7a9fa5263a0-config-data\") on node \"crc\" DevicePath \"\"" Nov 27 16:23:26 crc kubenswrapper[4707]: I1127 16:23:26.690319 4707 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c67f74c7-c75f-4370-980d-c7a9fa5263a0-public-tls-certs\") on node \"crc\" DevicePath \"\"" Nov 27 16:23:26 crc kubenswrapper[4707]: I1127 16:23:26.690329 4707 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c67f74c7-c75f-4370-980d-c7a9fa5263a0-logs\") on node \"crc\" DevicePath \"\"" Nov 27 16:23:26 crc kubenswrapper[4707]: I1127 16:23:26.692452 4707 scope.go:117] "RemoveContainer" 
containerID="52f19ee1283e22c6e4a0af879d164b933b757202bfd96e313b8a164df908264d" Nov 27 16:23:26 crc kubenswrapper[4707]: E1127 16:23:26.693390 4707 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc280ae8f_cb0b_4035_a7a2_fedfb044a6f0.slice/crio-54db668a5b818364925e4a6f0b61e5c19e947ee3510c3469d13f649fc2608566\": RecentStats: unable to find data in memory cache]" Nov 27 16:23:26 crc kubenswrapper[4707]: E1127 16:23:26.695568 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"52f19ee1283e22c6e4a0af879d164b933b757202bfd96e313b8a164df908264d\": container with ID starting with 52f19ee1283e22c6e4a0af879d164b933b757202bfd96e313b8a164df908264d not found: ID does not exist" containerID="52f19ee1283e22c6e4a0af879d164b933b757202bfd96e313b8a164df908264d" Nov 27 16:23:26 crc kubenswrapper[4707]: I1127 16:23:26.695596 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"52f19ee1283e22c6e4a0af879d164b933b757202bfd96e313b8a164df908264d"} err="failed to get container status \"52f19ee1283e22c6e4a0af879d164b933b757202bfd96e313b8a164df908264d\": rpc error: code = NotFound desc = could not find container \"52f19ee1283e22c6e4a0af879d164b933b757202bfd96e313b8a164df908264d\": container with ID starting with 52f19ee1283e22c6e4a0af879d164b933b757202bfd96e313b8a164df908264d not found: ID does not exist" Nov 27 16:23:26 crc kubenswrapper[4707]: I1127 16:23:26.695618 4707 scope.go:117] "RemoveContainer" containerID="df2f9874f7a9767a97ccc647e045387267e621a409db29dfa534e2eba70ee0ee" Nov 27 16:23:26 crc kubenswrapper[4707]: E1127 16:23:26.695864 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"df2f9874f7a9767a97ccc647e045387267e621a409db29dfa534e2eba70ee0ee\": container with 
ID starting with df2f9874f7a9767a97ccc647e045387267e621a409db29dfa534e2eba70ee0ee not found: ID does not exist" containerID="df2f9874f7a9767a97ccc647e045387267e621a409db29dfa534e2eba70ee0ee" Nov 27 16:23:26 crc kubenswrapper[4707]: I1127 16:23:26.695885 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"df2f9874f7a9767a97ccc647e045387267e621a409db29dfa534e2eba70ee0ee"} err="failed to get container status \"df2f9874f7a9767a97ccc647e045387267e621a409db29dfa534e2eba70ee0ee\": rpc error: code = NotFound desc = could not find container \"df2f9874f7a9767a97ccc647e045387267e621a409db29dfa534e2eba70ee0ee\": container with ID starting with df2f9874f7a9767a97ccc647e045387267e621a409db29dfa534e2eba70ee0ee not found: ID does not exist" Nov 27 16:23:26 crc kubenswrapper[4707]: I1127 16:23:26.695987 4707 scope.go:117] "RemoveContainer" containerID="52f19ee1283e22c6e4a0af879d164b933b757202bfd96e313b8a164df908264d" Nov 27 16:23:26 crc kubenswrapper[4707]: I1127 16:23:26.696178 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"52f19ee1283e22c6e4a0af879d164b933b757202bfd96e313b8a164df908264d"} err="failed to get container status \"52f19ee1283e22c6e4a0af879d164b933b757202bfd96e313b8a164df908264d\": rpc error: code = NotFound desc = could not find container \"52f19ee1283e22c6e4a0af879d164b933b757202bfd96e313b8a164df908264d\": container with ID starting with 52f19ee1283e22c6e4a0af879d164b933b757202bfd96e313b8a164df908264d not found: ID does not exist" Nov 27 16:23:26 crc kubenswrapper[4707]: I1127 16:23:26.696193 4707 scope.go:117] "RemoveContainer" containerID="df2f9874f7a9767a97ccc647e045387267e621a409db29dfa534e2eba70ee0ee" Nov 27 16:23:26 crc kubenswrapper[4707]: I1127 16:23:26.696339 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"df2f9874f7a9767a97ccc647e045387267e621a409db29dfa534e2eba70ee0ee"} err="failed to get container 
status \"df2f9874f7a9767a97ccc647e045387267e621a409db29dfa534e2eba70ee0ee\": rpc error: code = NotFound desc = could not find container \"df2f9874f7a9767a97ccc647e045387267e621a409db29dfa534e2eba70ee0ee\": container with ID starting with df2f9874f7a9767a97ccc647e045387267e621a409db29dfa534e2eba70ee0ee not found: ID does not exist" Nov 27 16:23:26 crc kubenswrapper[4707]: I1127 16:23:26.714565 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c67f74c7-c75f-4370-980d-c7a9fa5263a0-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "c67f74c7-c75f-4370-980d-c7a9fa5263a0" (UID: "c67f74c7-c75f-4370-980d-c7a9fa5263a0"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 16:23:26 crc kubenswrapper[4707]: I1127 16:23:26.791856 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a431ee7a-2703-4dea-bba3-2bbdf489e218-combined-ca-bundle\") pod \"a431ee7a-2703-4dea-bba3-2bbdf489e218\" (UID: \"a431ee7a-2703-4dea-bba3-2bbdf489e218\") " Nov 27 16:23:26 crc kubenswrapper[4707]: I1127 16:23:26.791970 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ckbnd\" (UniqueName: \"kubernetes.io/projected/a431ee7a-2703-4dea-bba3-2bbdf489e218-kube-api-access-ckbnd\") pod \"a431ee7a-2703-4dea-bba3-2bbdf489e218\" (UID: \"a431ee7a-2703-4dea-bba3-2bbdf489e218\") " Nov 27 16:23:26 crc kubenswrapper[4707]: I1127 16:23:26.791999 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a431ee7a-2703-4dea-bba3-2bbdf489e218-config-data\") pod \"a431ee7a-2703-4dea-bba3-2bbdf489e218\" (UID: \"a431ee7a-2703-4dea-bba3-2bbdf489e218\") " Nov 27 16:23:26 crc kubenswrapper[4707]: I1127 16:23:26.792321 4707 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" 
(UniqueName: \"kubernetes.io/secret/c67f74c7-c75f-4370-980d-c7a9fa5263a0-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Nov 27 16:23:26 crc kubenswrapper[4707]: I1127 16:23:26.796906 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a431ee7a-2703-4dea-bba3-2bbdf489e218-kube-api-access-ckbnd" (OuterVolumeSpecName: "kube-api-access-ckbnd") pod "a431ee7a-2703-4dea-bba3-2bbdf489e218" (UID: "a431ee7a-2703-4dea-bba3-2bbdf489e218"). InnerVolumeSpecName "kube-api-access-ckbnd". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 16:23:26 crc kubenswrapper[4707]: I1127 16:23:26.814606 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a431ee7a-2703-4dea-bba3-2bbdf489e218-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a431ee7a-2703-4dea-bba3-2bbdf489e218" (UID: "a431ee7a-2703-4dea-bba3-2bbdf489e218"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 16:23:26 crc kubenswrapper[4707]: I1127 16:23:26.817490 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a431ee7a-2703-4dea-bba3-2bbdf489e218-config-data" (OuterVolumeSpecName: "config-data") pod "a431ee7a-2703-4dea-bba3-2bbdf489e218" (UID: "a431ee7a-2703-4dea-bba3-2bbdf489e218"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 16:23:26 crc kubenswrapper[4707]: I1127 16:23:26.894120 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a431ee7a-2703-4dea-bba3-2bbdf489e218-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 27 16:23:26 crc kubenswrapper[4707]: I1127 16:23:26.894149 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ckbnd\" (UniqueName: \"kubernetes.io/projected/a431ee7a-2703-4dea-bba3-2bbdf489e218-kube-api-access-ckbnd\") on node \"crc\" DevicePath \"\"" Nov 27 16:23:26 crc kubenswrapper[4707]: I1127 16:23:26.894161 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a431ee7a-2703-4dea-bba3-2bbdf489e218-config-data\") on node \"crc\" DevicePath \"\"" Nov 27 16:23:26 crc kubenswrapper[4707]: I1127 16:23:26.923922 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Nov 27 16:23:26 crc kubenswrapper[4707]: I1127 16:23:26.937258 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Nov 27 16:23:26 crc kubenswrapper[4707]: I1127 16:23:26.961062 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Nov 27 16:23:26 crc kubenswrapper[4707]: E1127 16:23:26.961697 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a431ee7a-2703-4dea-bba3-2bbdf489e218" containerName="nova-scheduler-scheduler" Nov 27 16:23:26 crc kubenswrapper[4707]: I1127 16:23:26.961764 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="a431ee7a-2703-4dea-bba3-2bbdf489e218" containerName="nova-scheduler-scheduler" Nov 27 16:23:26 crc kubenswrapper[4707]: E1127 16:23:26.961814 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c67f74c7-c75f-4370-980d-c7a9fa5263a0" containerName="nova-api-api" Nov 27 16:23:26 crc kubenswrapper[4707]: I1127 16:23:26.961871 4707 
state_mem.go:107] "Deleted CPUSet assignment" podUID="c67f74c7-c75f-4370-980d-c7a9fa5263a0" containerName="nova-api-api" Nov 27 16:23:26 crc kubenswrapper[4707]: E1127 16:23:26.961932 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c67f74c7-c75f-4370-980d-c7a9fa5263a0" containerName="nova-api-log" Nov 27 16:23:26 crc kubenswrapper[4707]: I1127 16:23:26.961987 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="c67f74c7-c75f-4370-980d-c7a9fa5263a0" containerName="nova-api-log" Nov 27 16:23:26 crc kubenswrapper[4707]: I1127 16:23:26.962224 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="c67f74c7-c75f-4370-980d-c7a9fa5263a0" containerName="nova-api-api" Nov 27 16:23:26 crc kubenswrapper[4707]: I1127 16:23:26.962290 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="c67f74c7-c75f-4370-980d-c7a9fa5263a0" containerName="nova-api-log" Nov 27 16:23:26 crc kubenswrapper[4707]: I1127 16:23:26.962349 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="a431ee7a-2703-4dea-bba3-2bbdf489e218" containerName="nova-scheduler-scheduler" Nov 27 16:23:26 crc kubenswrapper[4707]: I1127 16:23:26.963388 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Nov 27 16:23:26 crc kubenswrapper[4707]: I1127 16:23:26.965835 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Nov 27 16:23:26 crc kubenswrapper[4707]: I1127 16:23:26.966324 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Nov 27 16:23:26 crc kubenswrapper[4707]: I1127 16:23:26.966461 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Nov 27 16:23:26 crc kubenswrapper[4707]: I1127 16:23:26.993024 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Nov 27 16:23:27 crc kubenswrapper[4707]: I1127 16:23:27.007644 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0e31f3ae-2d72-4ba4-bf44-660b172c5066-internal-tls-certs\") pod \"nova-api-0\" (UID: \"0e31f3ae-2d72-4ba4-bf44-660b172c5066\") " pod="openstack/nova-api-0" Nov 27 16:23:27 crc kubenswrapper[4707]: I1127 16:23:27.007756 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0e31f3ae-2d72-4ba4-bf44-660b172c5066-public-tls-certs\") pod \"nova-api-0\" (UID: \"0e31f3ae-2d72-4ba4-bf44-660b172c5066\") " pod="openstack/nova-api-0" Nov 27 16:23:27 crc kubenswrapper[4707]: I1127 16:23:27.007873 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k4nl2\" (UniqueName: \"kubernetes.io/projected/0e31f3ae-2d72-4ba4-bf44-660b172c5066-kube-api-access-k4nl2\") pod \"nova-api-0\" (UID: \"0e31f3ae-2d72-4ba4-bf44-660b172c5066\") " pod="openstack/nova-api-0" Nov 27 16:23:27 crc kubenswrapper[4707]: I1127 16:23:27.007917 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"logs\" (UniqueName: \"kubernetes.io/empty-dir/0e31f3ae-2d72-4ba4-bf44-660b172c5066-logs\") pod \"nova-api-0\" (UID: \"0e31f3ae-2d72-4ba4-bf44-660b172c5066\") " pod="openstack/nova-api-0" Nov 27 16:23:27 crc kubenswrapper[4707]: I1127 16:23:27.007983 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0e31f3ae-2d72-4ba4-bf44-660b172c5066-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"0e31f3ae-2d72-4ba4-bf44-660b172c5066\") " pod="openstack/nova-api-0" Nov 27 16:23:27 crc kubenswrapper[4707]: I1127 16:23:27.008007 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0e31f3ae-2d72-4ba4-bf44-660b172c5066-config-data\") pod \"nova-api-0\" (UID: \"0e31f3ae-2d72-4ba4-bf44-660b172c5066\") " pod="openstack/nova-api-0" Nov 27 16:23:27 crc kubenswrapper[4707]: I1127 16:23:27.109809 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k4nl2\" (UniqueName: \"kubernetes.io/projected/0e31f3ae-2d72-4ba4-bf44-660b172c5066-kube-api-access-k4nl2\") pod \"nova-api-0\" (UID: \"0e31f3ae-2d72-4ba4-bf44-660b172c5066\") " pod="openstack/nova-api-0" Nov 27 16:23:27 crc kubenswrapper[4707]: I1127 16:23:27.109865 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0e31f3ae-2d72-4ba4-bf44-660b172c5066-logs\") pod \"nova-api-0\" (UID: \"0e31f3ae-2d72-4ba4-bf44-660b172c5066\") " pod="openstack/nova-api-0" Nov 27 16:23:27 crc kubenswrapper[4707]: I1127 16:23:27.109921 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0e31f3ae-2d72-4ba4-bf44-660b172c5066-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"0e31f3ae-2d72-4ba4-bf44-660b172c5066\") " pod="openstack/nova-api-0" Nov 27 
16:23:27 crc kubenswrapper[4707]: I1127 16:23:27.109940 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0e31f3ae-2d72-4ba4-bf44-660b172c5066-config-data\") pod \"nova-api-0\" (UID: \"0e31f3ae-2d72-4ba4-bf44-660b172c5066\") " pod="openstack/nova-api-0" Nov 27 16:23:27 crc kubenswrapper[4707]: I1127 16:23:27.109960 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0e31f3ae-2d72-4ba4-bf44-660b172c5066-internal-tls-certs\") pod \"nova-api-0\" (UID: \"0e31f3ae-2d72-4ba4-bf44-660b172c5066\") " pod="openstack/nova-api-0" Nov 27 16:23:27 crc kubenswrapper[4707]: I1127 16:23:27.110006 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0e31f3ae-2d72-4ba4-bf44-660b172c5066-public-tls-certs\") pod \"nova-api-0\" (UID: \"0e31f3ae-2d72-4ba4-bf44-660b172c5066\") " pod="openstack/nova-api-0" Nov 27 16:23:27 crc kubenswrapper[4707]: I1127 16:23:27.111119 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0e31f3ae-2d72-4ba4-bf44-660b172c5066-logs\") pod \"nova-api-0\" (UID: \"0e31f3ae-2d72-4ba4-bf44-660b172c5066\") " pod="openstack/nova-api-0" Nov 27 16:23:27 crc kubenswrapper[4707]: I1127 16:23:27.113353 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0e31f3ae-2d72-4ba4-bf44-660b172c5066-public-tls-certs\") pod \"nova-api-0\" (UID: \"0e31f3ae-2d72-4ba4-bf44-660b172c5066\") " pod="openstack/nova-api-0" Nov 27 16:23:27 crc kubenswrapper[4707]: I1127 16:23:27.114074 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0e31f3ae-2d72-4ba4-bf44-660b172c5066-combined-ca-bundle\") pod \"nova-api-0\" (UID: 
\"0e31f3ae-2d72-4ba4-bf44-660b172c5066\") " pod="openstack/nova-api-0" Nov 27 16:23:27 crc kubenswrapper[4707]: I1127 16:23:27.114166 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0e31f3ae-2d72-4ba4-bf44-660b172c5066-config-data\") pod \"nova-api-0\" (UID: \"0e31f3ae-2d72-4ba4-bf44-660b172c5066\") " pod="openstack/nova-api-0" Nov 27 16:23:27 crc kubenswrapper[4707]: I1127 16:23:27.114246 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0e31f3ae-2d72-4ba4-bf44-660b172c5066-internal-tls-certs\") pod \"nova-api-0\" (UID: \"0e31f3ae-2d72-4ba4-bf44-660b172c5066\") " pod="openstack/nova-api-0" Nov 27 16:23:27 crc kubenswrapper[4707]: I1127 16:23:27.127980 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k4nl2\" (UniqueName: \"kubernetes.io/projected/0e31f3ae-2d72-4ba4-bf44-660b172c5066-kube-api-access-k4nl2\") pod \"nova-api-0\" (UID: \"0e31f3ae-2d72-4ba4-bf44-660b172c5066\") " pod="openstack/nova-api-0" Nov 27 16:23:27 crc kubenswrapper[4707]: I1127 16:23:27.206161 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="55ac6b2b-7b17-4ce7-9a82-1253cd1523df" path="/var/lib/kubelet/pods/55ac6b2b-7b17-4ce7-9a82-1253cd1523df/volumes" Nov 27 16:23:27 crc kubenswrapper[4707]: I1127 16:23:27.207069 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c67f74c7-c75f-4370-980d-c7a9fa5263a0" path="/var/lib/kubelet/pods/c67f74c7-c75f-4370-980d-c7a9fa5263a0/volumes" Nov 27 16:23:27 crc kubenswrapper[4707]: I1127 16:23:27.285184 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Nov 27 16:23:27 crc kubenswrapper[4707]: I1127 16:23:27.629510 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"a431ee7a-2703-4dea-bba3-2bbdf489e218","Type":"ContainerDied","Data":"3daa6645f127bac9aec43e53c32b26e69749a63b7eae898b08ad144eeadf16a8"} Nov 27 16:23:27 crc kubenswrapper[4707]: I1127 16:23:27.629522 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Nov 27 16:23:27 crc kubenswrapper[4707]: I1127 16:23:27.629821 4707 scope.go:117] "RemoveContainer" containerID="196432ce75d91f6078033019944cd5cc7c22ba2600ee9338c8e07f65f6237ca8" Nov 27 16:23:27 crc kubenswrapper[4707]: I1127 16:23:27.632772 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2aeefda1-503b-4869-b318-d9ebfa19337c","Type":"ContainerStarted","Data":"77b44c5281ceaa5647f057bece49a5964f448db50aa579d4bc93f2b822ce8154"} Nov 27 16:23:27 crc kubenswrapper[4707]: I1127 16:23:27.664846 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Nov 27 16:23:27 crc kubenswrapper[4707]: I1127 16:23:27.670901 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Nov 27 16:23:27 crc kubenswrapper[4707]: I1127 16:23:27.681862 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Nov 27 16:23:27 crc kubenswrapper[4707]: I1127 16:23:27.683081 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Nov 27 16:23:27 crc kubenswrapper[4707]: I1127 16:23:27.685290 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Nov 27 16:23:27 crc kubenswrapper[4707]: I1127 16:23:27.690253 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Nov 27 16:23:27 crc kubenswrapper[4707]: I1127 16:23:27.726765 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-27fc6\" (UniqueName: \"kubernetes.io/projected/b2d8aab4-7e14-47dd-83ad-80e0272c12cc-kube-api-access-27fc6\") pod \"nova-scheduler-0\" (UID: \"b2d8aab4-7e14-47dd-83ad-80e0272c12cc\") " pod="openstack/nova-scheduler-0" Nov 27 16:23:27 crc kubenswrapper[4707]: I1127 16:23:27.726838 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b2d8aab4-7e14-47dd-83ad-80e0272c12cc-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"b2d8aab4-7e14-47dd-83ad-80e0272c12cc\") " pod="openstack/nova-scheduler-0" Nov 27 16:23:27 crc kubenswrapper[4707]: I1127 16:23:27.726917 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b2d8aab4-7e14-47dd-83ad-80e0272c12cc-config-data\") pod \"nova-scheduler-0\" (UID: \"b2d8aab4-7e14-47dd-83ad-80e0272c12cc\") " pod="openstack/nova-scheduler-0" Nov 27 16:23:27 crc kubenswrapper[4707]: I1127 16:23:27.754001 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Nov 27 16:23:27 crc kubenswrapper[4707]: W1127 16:23:27.767067 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0e31f3ae_2d72_4ba4_bf44_660b172c5066.slice/crio-f1c50df97b156e1df72d2e763216a4feedb99c36642e0a9b338b7072c0278a01 
WatchSource:0}: Error finding container f1c50df97b156e1df72d2e763216a4feedb99c36642e0a9b338b7072c0278a01: Status 404 returned error can't find the container with id f1c50df97b156e1df72d2e763216a4feedb99c36642e0a9b338b7072c0278a01 Nov 27 16:23:27 crc kubenswrapper[4707]: I1127 16:23:27.828615 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-27fc6\" (UniqueName: \"kubernetes.io/projected/b2d8aab4-7e14-47dd-83ad-80e0272c12cc-kube-api-access-27fc6\") pod \"nova-scheduler-0\" (UID: \"b2d8aab4-7e14-47dd-83ad-80e0272c12cc\") " pod="openstack/nova-scheduler-0" Nov 27 16:23:27 crc kubenswrapper[4707]: I1127 16:23:27.829019 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b2d8aab4-7e14-47dd-83ad-80e0272c12cc-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"b2d8aab4-7e14-47dd-83ad-80e0272c12cc\") " pod="openstack/nova-scheduler-0" Nov 27 16:23:27 crc kubenswrapper[4707]: I1127 16:23:27.829075 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b2d8aab4-7e14-47dd-83ad-80e0272c12cc-config-data\") pod \"nova-scheduler-0\" (UID: \"b2d8aab4-7e14-47dd-83ad-80e0272c12cc\") " pod="openstack/nova-scheduler-0" Nov 27 16:23:27 crc kubenswrapper[4707]: I1127 16:23:27.833556 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b2d8aab4-7e14-47dd-83ad-80e0272c12cc-config-data\") pod \"nova-scheduler-0\" (UID: \"b2d8aab4-7e14-47dd-83ad-80e0272c12cc\") " pod="openstack/nova-scheduler-0" Nov 27 16:23:27 crc kubenswrapper[4707]: I1127 16:23:27.835643 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b2d8aab4-7e14-47dd-83ad-80e0272c12cc-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: 
\"b2d8aab4-7e14-47dd-83ad-80e0272c12cc\") " pod="openstack/nova-scheduler-0" Nov 27 16:23:27 crc kubenswrapper[4707]: I1127 16:23:27.845282 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-27fc6\" (UniqueName: \"kubernetes.io/projected/b2d8aab4-7e14-47dd-83ad-80e0272c12cc-kube-api-access-27fc6\") pod \"nova-scheduler-0\" (UID: \"b2d8aab4-7e14-47dd-83ad-80e0272c12cc\") " pod="openstack/nova-scheduler-0" Nov 27 16:23:27 crc kubenswrapper[4707]: I1127 16:23:27.998931 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Nov 27 16:23:28 crc kubenswrapper[4707]: I1127 16:23:28.497687 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Nov 27 16:23:28 crc kubenswrapper[4707]: W1127 16:23:28.525097 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb2d8aab4_7e14_47dd_83ad_80e0272c12cc.slice/crio-6b03a6fd65ae92685bd28d42d13bea54234127d1d797215603bbcf9d5db67a5f WatchSource:0}: Error finding container 6b03a6fd65ae92685bd28d42d13bea54234127d1d797215603bbcf9d5db67a5f: Status 404 returned error can't find the container with id 6b03a6fd65ae92685bd28d42d13bea54234127d1d797215603bbcf9d5db67a5f Nov 27 16:23:28 crc kubenswrapper[4707]: I1127 16:23:28.647053 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"b2d8aab4-7e14-47dd-83ad-80e0272c12cc","Type":"ContainerStarted","Data":"6b03a6fd65ae92685bd28d42d13bea54234127d1d797215603bbcf9d5db67a5f"} Nov 27 16:23:28 crc kubenswrapper[4707]: I1127 16:23:28.649637 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"0e31f3ae-2d72-4ba4-bf44-660b172c5066","Type":"ContainerStarted","Data":"a710e412b8f08c5093455ba004942dea0b93e41a96371c23643130c7356a9cb0"} Nov 27 16:23:28 crc kubenswrapper[4707]: I1127 16:23:28.649685 4707 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"0e31f3ae-2d72-4ba4-bf44-660b172c5066","Type":"ContainerStarted","Data":"b2164a18f8d8da7dd7e5d3195d9c8aa32c9fa59fdc6a5428666f3c2962c7f0f4"} Nov 27 16:23:28 crc kubenswrapper[4707]: I1127 16:23:28.649695 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"0e31f3ae-2d72-4ba4-bf44-660b172c5066","Type":"ContainerStarted","Data":"f1c50df97b156e1df72d2e763216a4feedb99c36642e0a9b338b7072c0278a01"} Nov 27 16:23:28 crc kubenswrapper[4707]: I1127 16:23:28.652729 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2aeefda1-503b-4869-b318-d9ebfa19337c","Type":"ContainerStarted","Data":"f2a5ab4e33ddfc181ee61c8113d3f272b54fac6fc72c241dfc474eb8dadd32fd"} Nov 27 16:23:28 crc kubenswrapper[4707]: I1127 16:23:28.667636 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.667620062 podStartE2EDuration="2.667620062s" podCreationTimestamp="2025-11-27 16:23:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-27 16:23:28.664208419 +0000 UTC m=+1184.295657197" watchObservedRunningTime="2025-11-27 16:23:28.667620062 +0000 UTC m=+1184.299068830" Nov 27 16:23:28 crc kubenswrapper[4707]: I1127 16:23:28.908015 4707 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="cbc32b0f-9bae-4aed-a868-6b06465b570f" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.198:8775/\": read tcp 10.217.0.2:46636->10.217.0.198:8775: read: connection reset by peer" Nov 27 16:23:28 crc kubenswrapper[4707]: I1127 16:23:28.908146 4707 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="cbc32b0f-9bae-4aed-a868-6b06465b570f" containerName="nova-metadata-metadata" probeResult="failure" 
output="Get \"https://10.217.0.198:8775/\": read tcp 10.217.0.2:46622->10.217.0.198:8775: read: connection reset by peer" Nov 27 16:23:29 crc kubenswrapper[4707]: I1127 16:23:29.207267 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a431ee7a-2703-4dea-bba3-2bbdf489e218" path="/var/lib/kubelet/pods/a431ee7a-2703-4dea-bba3-2bbdf489e218/volumes" Nov 27 16:23:29 crc kubenswrapper[4707]: I1127 16:23:29.375119 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Nov 27 16:23:29 crc kubenswrapper[4707]: I1127 16:23:29.564977 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cbc32b0f-9bae-4aed-a868-6b06465b570f-config-data\") pod \"cbc32b0f-9bae-4aed-a868-6b06465b570f\" (UID: \"cbc32b0f-9bae-4aed-a868-6b06465b570f\") " Nov 27 16:23:29 crc kubenswrapper[4707]: I1127 16:23:29.565018 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/cbc32b0f-9bae-4aed-a868-6b06465b570f-nova-metadata-tls-certs\") pod \"cbc32b0f-9bae-4aed-a868-6b06465b570f\" (UID: \"cbc32b0f-9bae-4aed-a868-6b06465b570f\") " Nov 27 16:23:29 crc kubenswrapper[4707]: I1127 16:23:29.565083 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cbc32b0f-9bae-4aed-a868-6b06465b570f-logs\") pod \"cbc32b0f-9bae-4aed-a868-6b06465b570f\" (UID: \"cbc32b0f-9bae-4aed-a868-6b06465b570f\") " Nov 27 16:23:29 crc kubenswrapper[4707]: I1127 16:23:29.565134 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cbc32b0f-9bae-4aed-a868-6b06465b570f-combined-ca-bundle\") pod \"cbc32b0f-9bae-4aed-a868-6b06465b570f\" (UID: \"cbc32b0f-9bae-4aed-a868-6b06465b570f\") " Nov 27 16:23:29 crc kubenswrapper[4707]: 
I1127 16:23:29.565285 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-82dwt\" (UniqueName: \"kubernetes.io/projected/cbc32b0f-9bae-4aed-a868-6b06465b570f-kube-api-access-82dwt\") pod \"cbc32b0f-9bae-4aed-a868-6b06465b570f\" (UID: \"cbc32b0f-9bae-4aed-a868-6b06465b570f\") " Nov 27 16:23:29 crc kubenswrapper[4707]: I1127 16:23:29.565885 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cbc32b0f-9bae-4aed-a868-6b06465b570f-logs" (OuterVolumeSpecName: "logs") pod "cbc32b0f-9bae-4aed-a868-6b06465b570f" (UID: "cbc32b0f-9bae-4aed-a868-6b06465b570f"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 27 16:23:29 crc kubenswrapper[4707]: I1127 16:23:29.566287 4707 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cbc32b0f-9bae-4aed-a868-6b06465b570f-logs\") on node \"crc\" DevicePath \"\"" Nov 27 16:23:29 crc kubenswrapper[4707]: I1127 16:23:29.573532 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cbc32b0f-9bae-4aed-a868-6b06465b570f-kube-api-access-82dwt" (OuterVolumeSpecName: "kube-api-access-82dwt") pod "cbc32b0f-9bae-4aed-a868-6b06465b570f" (UID: "cbc32b0f-9bae-4aed-a868-6b06465b570f"). InnerVolumeSpecName "kube-api-access-82dwt". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 16:23:29 crc kubenswrapper[4707]: I1127 16:23:29.603610 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cbc32b0f-9bae-4aed-a868-6b06465b570f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "cbc32b0f-9bae-4aed-a868-6b06465b570f" (UID: "cbc32b0f-9bae-4aed-a868-6b06465b570f"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 16:23:29 crc kubenswrapper[4707]: I1127 16:23:29.620041 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cbc32b0f-9bae-4aed-a868-6b06465b570f-config-data" (OuterVolumeSpecName: "config-data") pod "cbc32b0f-9bae-4aed-a868-6b06465b570f" (UID: "cbc32b0f-9bae-4aed-a868-6b06465b570f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 16:23:29 crc kubenswrapper[4707]: I1127 16:23:29.631464 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cbc32b0f-9bae-4aed-a868-6b06465b570f-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "cbc32b0f-9bae-4aed-a868-6b06465b570f" (UID: "cbc32b0f-9bae-4aed-a868-6b06465b570f"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 16:23:29 crc kubenswrapper[4707]: I1127 16:23:29.661440 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"b2d8aab4-7e14-47dd-83ad-80e0272c12cc","Type":"ContainerStarted","Data":"cf02cb59accaf219785a2fda83af8483f870e06073a4384479c0fb7d2fb164de"} Nov 27 16:23:29 crc kubenswrapper[4707]: I1127 16:23:29.663840 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2aeefda1-503b-4869-b318-d9ebfa19337c","Type":"ContainerStarted","Data":"e646cb3f50035b3bf7cd4cc72abf9fd43e6a670a8582bc28dcda596f78c004be"} Nov 27 16:23:29 crc kubenswrapper[4707]: I1127 16:23:29.665433 4707 generic.go:334] "Generic (PLEG): container finished" podID="cbc32b0f-9bae-4aed-a868-6b06465b570f" containerID="238d8c5feee8c378354af7a1316f4d24c030ec1710cac648345decf354baf627" exitCode=0 Nov 27 16:23:29 crc kubenswrapper[4707]: I1127 16:23:29.665470 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Nov 27 16:23:29 crc kubenswrapper[4707]: I1127 16:23:29.665492 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"cbc32b0f-9bae-4aed-a868-6b06465b570f","Type":"ContainerDied","Data":"238d8c5feee8c378354af7a1316f4d24c030ec1710cac648345decf354baf627"} Nov 27 16:23:29 crc kubenswrapper[4707]: I1127 16:23:29.665530 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"cbc32b0f-9bae-4aed-a868-6b06465b570f","Type":"ContainerDied","Data":"44a022643a2f8e0df222056e29419364aaf4426f5259df405a9e259e2b841c62"} Nov 27 16:23:29 crc kubenswrapper[4707]: I1127 16:23:29.665549 4707 scope.go:117] "RemoveContainer" containerID="238d8c5feee8c378354af7a1316f4d24c030ec1710cac648345decf354baf627" Nov 27 16:23:29 crc kubenswrapper[4707]: I1127 16:23:29.667488 4707 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/cbc32b0f-9bae-4aed-a868-6b06465b570f-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Nov 27 16:23:29 crc kubenswrapper[4707]: I1127 16:23:29.667527 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cbc32b0f-9bae-4aed-a868-6b06465b570f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 27 16:23:29 crc kubenswrapper[4707]: I1127 16:23:29.667539 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-82dwt\" (UniqueName: \"kubernetes.io/projected/cbc32b0f-9bae-4aed-a868-6b06465b570f-kube-api-access-82dwt\") on node \"crc\" DevicePath \"\"" Nov 27 16:23:29 crc kubenswrapper[4707]: I1127 16:23:29.667549 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cbc32b0f-9bae-4aed-a868-6b06465b570f-config-data\") on node \"crc\" DevicePath \"\"" Nov 27 16:23:29 crc kubenswrapper[4707]: I1127 16:23:29.688062 4707 
scope.go:117] "RemoveContainer" containerID="ee5f9d9897f7baa3856bc1b61fe57c15b00c56dd798cf39127b8baa5a6cfcf16" Nov 27 16:23:29 crc kubenswrapper[4707]: I1127 16:23:29.691322 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.691302136 podStartE2EDuration="2.691302136s" podCreationTimestamp="2025-11-27 16:23:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-27 16:23:29.679432558 +0000 UTC m=+1185.310881326" watchObservedRunningTime="2025-11-27 16:23:29.691302136 +0000 UTC m=+1185.322750904" Nov 27 16:23:29 crc kubenswrapper[4707]: I1127 16:23:29.705446 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Nov 27 16:23:29 crc kubenswrapper[4707]: I1127 16:23:29.733146 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Nov 27 16:23:29 crc kubenswrapper[4707]: I1127 16:23:29.739003 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Nov 27 16:23:29 crc kubenswrapper[4707]: E1127 16:23:29.739598 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cbc32b0f-9bae-4aed-a868-6b06465b570f" containerName="nova-metadata-metadata" Nov 27 16:23:29 crc kubenswrapper[4707]: I1127 16:23:29.739677 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="cbc32b0f-9bae-4aed-a868-6b06465b570f" containerName="nova-metadata-metadata" Nov 27 16:23:29 crc kubenswrapper[4707]: E1127 16:23:29.739770 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cbc32b0f-9bae-4aed-a868-6b06465b570f" containerName="nova-metadata-log" Nov 27 16:23:29 crc kubenswrapper[4707]: I1127 16:23:29.739822 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="cbc32b0f-9bae-4aed-a868-6b06465b570f" containerName="nova-metadata-log" Nov 27 16:23:29 crc kubenswrapper[4707]: I1127 16:23:29.740071 4707 
memory_manager.go:354] "RemoveStaleState removing state" podUID="cbc32b0f-9bae-4aed-a868-6b06465b570f" containerName="nova-metadata-metadata" Nov 27 16:23:29 crc kubenswrapper[4707]: I1127 16:23:29.740149 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="cbc32b0f-9bae-4aed-a868-6b06465b570f" containerName="nova-metadata-log" Nov 27 16:23:29 crc kubenswrapper[4707]: I1127 16:23:29.741146 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Nov 27 16:23:29 crc kubenswrapper[4707]: I1127 16:23:29.744711 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Nov 27 16:23:29 crc kubenswrapper[4707]: I1127 16:23:29.747624 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Nov 27 16:23:29 crc kubenswrapper[4707]: I1127 16:23:29.747789 4707 scope.go:117] "RemoveContainer" containerID="238d8c5feee8c378354af7a1316f4d24c030ec1710cac648345decf354baf627" Nov 27 16:23:29 crc kubenswrapper[4707]: I1127 16:23:29.748157 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Nov 27 16:23:29 crc kubenswrapper[4707]: E1127 16:23:29.759027 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"238d8c5feee8c378354af7a1316f4d24c030ec1710cac648345decf354baf627\": container with ID starting with 238d8c5feee8c378354af7a1316f4d24c030ec1710cac648345decf354baf627 not found: ID does not exist" containerID="238d8c5feee8c378354af7a1316f4d24c030ec1710cac648345decf354baf627" Nov 27 16:23:29 crc kubenswrapper[4707]: I1127 16:23:29.759058 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"238d8c5feee8c378354af7a1316f4d24c030ec1710cac648345decf354baf627"} err="failed to get container status \"238d8c5feee8c378354af7a1316f4d24c030ec1710cac648345decf354baf627\": rpc 
error: code = NotFound desc = could not find container \"238d8c5feee8c378354af7a1316f4d24c030ec1710cac648345decf354baf627\": container with ID starting with 238d8c5feee8c378354af7a1316f4d24c030ec1710cac648345decf354baf627 not found: ID does not exist" Nov 27 16:23:29 crc kubenswrapper[4707]: I1127 16:23:29.759079 4707 scope.go:117] "RemoveContainer" containerID="ee5f9d9897f7baa3856bc1b61fe57c15b00c56dd798cf39127b8baa5a6cfcf16" Nov 27 16:23:29 crc kubenswrapper[4707]: E1127 16:23:29.759956 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ee5f9d9897f7baa3856bc1b61fe57c15b00c56dd798cf39127b8baa5a6cfcf16\": container with ID starting with ee5f9d9897f7baa3856bc1b61fe57c15b00c56dd798cf39127b8baa5a6cfcf16 not found: ID does not exist" containerID="ee5f9d9897f7baa3856bc1b61fe57c15b00c56dd798cf39127b8baa5a6cfcf16" Nov 27 16:23:29 crc kubenswrapper[4707]: I1127 16:23:29.759981 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ee5f9d9897f7baa3856bc1b61fe57c15b00c56dd798cf39127b8baa5a6cfcf16"} err="failed to get container status \"ee5f9d9897f7baa3856bc1b61fe57c15b00c56dd798cf39127b8baa5a6cfcf16\": rpc error: code = NotFound desc = could not find container \"ee5f9d9897f7baa3856bc1b61fe57c15b00c56dd798cf39127b8baa5a6cfcf16\": container with ID starting with ee5f9d9897f7baa3856bc1b61fe57c15b00c56dd798cf39127b8baa5a6cfcf16 not found: ID does not exist" Nov 27 16:23:29 crc kubenswrapper[4707]: I1127 16:23:29.870534 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2g84f\" (UniqueName: \"kubernetes.io/projected/3ee5c40c-527c-45b7-af82-93d55d4709c9-kube-api-access-2g84f\") pod \"nova-metadata-0\" (UID: \"3ee5c40c-527c-45b7-af82-93d55d4709c9\") " pod="openstack/nova-metadata-0" Nov 27 16:23:29 crc kubenswrapper[4707]: I1127 16:23:29.870615 4707 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3ee5c40c-527c-45b7-af82-93d55d4709c9-logs\") pod \"nova-metadata-0\" (UID: \"3ee5c40c-527c-45b7-af82-93d55d4709c9\") " pod="openstack/nova-metadata-0" Nov 27 16:23:29 crc kubenswrapper[4707]: I1127 16:23:29.870986 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3ee5c40c-527c-45b7-af82-93d55d4709c9-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"3ee5c40c-527c-45b7-af82-93d55d4709c9\") " pod="openstack/nova-metadata-0" Nov 27 16:23:29 crc kubenswrapper[4707]: I1127 16:23:29.871055 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3ee5c40c-527c-45b7-af82-93d55d4709c9-config-data\") pod \"nova-metadata-0\" (UID: \"3ee5c40c-527c-45b7-af82-93d55d4709c9\") " pod="openstack/nova-metadata-0" Nov 27 16:23:29 crc kubenswrapper[4707]: I1127 16:23:29.871174 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/3ee5c40c-527c-45b7-af82-93d55d4709c9-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"3ee5c40c-527c-45b7-af82-93d55d4709c9\") " pod="openstack/nova-metadata-0" Nov 27 16:23:29 crc kubenswrapper[4707]: I1127 16:23:29.972756 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/3ee5c40c-527c-45b7-af82-93d55d4709c9-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"3ee5c40c-527c-45b7-af82-93d55d4709c9\") " pod="openstack/nova-metadata-0" Nov 27 16:23:29 crc kubenswrapper[4707]: I1127 16:23:29.972859 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2g84f\" (UniqueName: 
\"kubernetes.io/projected/3ee5c40c-527c-45b7-af82-93d55d4709c9-kube-api-access-2g84f\") pod \"nova-metadata-0\" (UID: \"3ee5c40c-527c-45b7-af82-93d55d4709c9\") " pod="openstack/nova-metadata-0" Nov 27 16:23:29 crc kubenswrapper[4707]: I1127 16:23:29.972910 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3ee5c40c-527c-45b7-af82-93d55d4709c9-logs\") pod \"nova-metadata-0\" (UID: \"3ee5c40c-527c-45b7-af82-93d55d4709c9\") " pod="openstack/nova-metadata-0" Nov 27 16:23:29 crc kubenswrapper[4707]: I1127 16:23:29.972955 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3ee5c40c-527c-45b7-af82-93d55d4709c9-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"3ee5c40c-527c-45b7-af82-93d55d4709c9\") " pod="openstack/nova-metadata-0" Nov 27 16:23:29 crc kubenswrapper[4707]: I1127 16:23:29.972973 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3ee5c40c-527c-45b7-af82-93d55d4709c9-config-data\") pod \"nova-metadata-0\" (UID: \"3ee5c40c-527c-45b7-af82-93d55d4709c9\") " pod="openstack/nova-metadata-0" Nov 27 16:23:29 crc kubenswrapper[4707]: I1127 16:23:29.973839 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3ee5c40c-527c-45b7-af82-93d55d4709c9-logs\") pod \"nova-metadata-0\" (UID: \"3ee5c40c-527c-45b7-af82-93d55d4709c9\") " pod="openstack/nova-metadata-0" Nov 27 16:23:29 crc kubenswrapper[4707]: I1127 16:23:29.975990 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3ee5c40c-527c-45b7-af82-93d55d4709c9-config-data\") pod \"nova-metadata-0\" (UID: \"3ee5c40c-527c-45b7-af82-93d55d4709c9\") " pod="openstack/nova-metadata-0" Nov 27 16:23:29 crc kubenswrapper[4707]: I1127 
16:23:29.977272 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3ee5c40c-527c-45b7-af82-93d55d4709c9-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"3ee5c40c-527c-45b7-af82-93d55d4709c9\") " pod="openstack/nova-metadata-0" Nov 27 16:23:29 crc kubenswrapper[4707]: I1127 16:23:29.978473 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/3ee5c40c-527c-45b7-af82-93d55d4709c9-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"3ee5c40c-527c-45b7-af82-93d55d4709c9\") " pod="openstack/nova-metadata-0" Nov 27 16:23:29 crc kubenswrapper[4707]: I1127 16:23:29.988336 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2g84f\" (UniqueName: \"kubernetes.io/projected/3ee5c40c-527c-45b7-af82-93d55d4709c9-kube-api-access-2g84f\") pod \"nova-metadata-0\" (UID: \"3ee5c40c-527c-45b7-af82-93d55d4709c9\") " pod="openstack/nova-metadata-0" Nov 27 16:23:30 crc kubenswrapper[4707]: I1127 16:23:30.065927 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Nov 27 16:23:30 crc kubenswrapper[4707]: I1127 16:23:30.540436 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Nov 27 16:23:30 crc kubenswrapper[4707]: I1127 16:23:30.681152 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2aeefda1-503b-4869-b318-d9ebfa19337c","Type":"ContainerStarted","Data":"9b5acc17c26a2a462eeddcc607f92057f253b00834a450bd007e5b0734d85aaa"} Nov 27 16:23:30 crc kubenswrapper[4707]: I1127 16:23:30.681312 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Nov 27 16:23:30 crc kubenswrapper[4707]: I1127 16:23:30.683248 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"3ee5c40c-527c-45b7-af82-93d55d4709c9","Type":"ContainerStarted","Data":"ed378e69c9b3f6462cf5eb1d03c2a042c13b958a896c88016241c615cf81bf42"} Nov 27 16:23:30 crc kubenswrapper[4707]: I1127 16:23:30.712641 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.279244169 podStartE2EDuration="5.712615424s" podCreationTimestamp="2025-11-27 16:23:25 +0000 UTC" firstStartedPulling="2025-11-27 16:23:26.47932573 +0000 UTC m=+1182.110774488" lastFinishedPulling="2025-11-27 16:23:29.912696975 +0000 UTC m=+1185.544145743" observedRunningTime="2025-11-27 16:23:30.706004444 +0000 UTC m=+1186.337453212" watchObservedRunningTime="2025-11-27 16:23:30.712615424 +0000 UTC m=+1186.344064192" Nov 27 16:23:31 crc kubenswrapper[4707]: I1127 16:23:31.206525 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cbc32b0f-9bae-4aed-a868-6b06465b570f" path="/var/lib/kubelet/pods/cbc32b0f-9bae-4aed-a868-6b06465b570f/volumes" Nov 27 16:23:31 crc kubenswrapper[4707]: I1127 16:23:31.694837 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" 
event={"ID":"3ee5c40c-527c-45b7-af82-93d55d4709c9","Type":"ContainerStarted","Data":"c9f41ae77dd4d06cd547a56aabdcac9b5585ea2629ed3848e0fd8d233bb1f986"} Nov 27 16:23:31 crc kubenswrapper[4707]: I1127 16:23:31.694906 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"3ee5c40c-527c-45b7-af82-93d55d4709c9","Type":"ContainerStarted","Data":"2606955b384f3c15cd54f39ee6aed0eb3a2053faa4f9743a5ad17eeb1f4f0d13"} Nov 27 16:23:31 crc kubenswrapper[4707]: I1127 16:23:31.759513 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.759491215 podStartE2EDuration="2.759491215s" podCreationTimestamp="2025-11-27 16:23:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-27 16:23:31.724579535 +0000 UTC m=+1187.356028303" watchObservedRunningTime="2025-11-27 16:23:31.759491215 +0000 UTC m=+1187.390939993" Nov 27 16:23:33 crc kubenswrapper[4707]: I1127 16:23:33.000051 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Nov 27 16:23:33 crc kubenswrapper[4707]: I1127 16:23:33.624286 4707 patch_prober.go:28] interesting pod/machine-config-daemon-c995m container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 27 16:23:33 crc kubenswrapper[4707]: I1127 16:23:33.624397 4707 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-c995m" podUID="a83beb0d-8dd1-434a-ace2-933f98e3956f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 27 16:23:33 crc kubenswrapper[4707]: I1127 16:23:33.624460 4707 
kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-c995m" Nov 27 16:23:33 crc kubenswrapper[4707]: I1127 16:23:33.625444 4707 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"a80c79e17bd42a14677ea7ec5718ee1c93082c9c4030211d42f0b9a8e6591e20"} pod="openshift-machine-config-operator/machine-config-daemon-c995m" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Nov 27 16:23:33 crc kubenswrapper[4707]: I1127 16:23:33.625545 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-c995m" podUID="a83beb0d-8dd1-434a-ace2-933f98e3956f" containerName="machine-config-daemon" containerID="cri-o://a80c79e17bd42a14677ea7ec5718ee1c93082c9c4030211d42f0b9a8e6591e20" gracePeriod=600 Nov 27 16:23:34 crc kubenswrapper[4707]: I1127 16:23:34.734266 4707 generic.go:334] "Generic (PLEG): container finished" podID="a83beb0d-8dd1-434a-ace2-933f98e3956f" containerID="a80c79e17bd42a14677ea7ec5718ee1c93082c9c4030211d42f0b9a8e6591e20" exitCode=0 Nov 27 16:23:34 crc kubenswrapper[4707]: I1127 16:23:34.734317 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-c995m" event={"ID":"a83beb0d-8dd1-434a-ace2-933f98e3956f","Type":"ContainerDied","Data":"a80c79e17bd42a14677ea7ec5718ee1c93082c9c4030211d42f0b9a8e6591e20"} Nov 27 16:23:34 crc kubenswrapper[4707]: I1127 16:23:34.734787 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-c995m" event={"ID":"a83beb0d-8dd1-434a-ace2-933f98e3956f","Type":"ContainerStarted","Data":"6c7e6d0aacc40003f1cf38f6710c940eb56f156be22a726feeaa22ebce682c5d"} Nov 27 16:23:34 crc kubenswrapper[4707]: I1127 16:23:34.734803 4707 scope.go:117] "RemoveContainer" 
containerID="8aa0ec55553e2030c537e5b750cef10ee68d7cb3cbe0ae6f95e1e594b84cdc37" Nov 27 16:23:35 crc kubenswrapper[4707]: I1127 16:23:35.066768 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Nov 27 16:23:35 crc kubenswrapper[4707]: I1127 16:23:35.066822 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Nov 27 16:23:37 crc kubenswrapper[4707]: E1127 16:23:37.019679 4707 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc280ae8f_cb0b_4035_a7a2_fedfb044a6f0.slice/crio-54db668a5b818364925e4a6f0b61e5c19e947ee3510c3469d13f649fc2608566\": RecentStats: unable to find data in memory cache]" Nov 27 16:23:37 crc kubenswrapper[4707]: I1127 16:23:37.286172 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Nov 27 16:23:37 crc kubenswrapper[4707]: I1127 16:23:37.286240 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Nov 27 16:23:38 crc kubenswrapper[4707]: I1127 16:23:38.000512 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Nov 27 16:23:38 crc kubenswrapper[4707]: I1127 16:23:38.059042 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Nov 27 16:23:38 crc kubenswrapper[4707]: I1127 16:23:38.302537 4707 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="0e31f3ae-2d72-4ba4-bf44-660b172c5066" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.209:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Nov 27 16:23:38 crc kubenswrapper[4707]: I1127 16:23:38.302777 4707 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" 
podUID="0e31f3ae-2d72-4ba4-bf44-660b172c5066" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.209:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Nov 27 16:23:38 crc kubenswrapper[4707]: I1127 16:23:38.843045 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Nov 27 16:23:40 crc kubenswrapper[4707]: I1127 16:23:40.066947 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Nov 27 16:23:40 crc kubenswrapper[4707]: I1127 16:23:40.067078 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Nov 27 16:23:41 crc kubenswrapper[4707]: I1127 16:23:41.090991 4707 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="3ee5c40c-527c-45b7-af82-93d55d4709c9" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.211:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Nov 27 16:23:41 crc kubenswrapper[4707]: I1127 16:23:41.090914 4707 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="3ee5c40c-527c-45b7-af82-93d55d4709c9" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.211:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Nov 27 16:23:47 crc kubenswrapper[4707]: I1127 16:23:47.306729 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Nov 27 16:23:47 crc kubenswrapper[4707]: I1127 16:23:47.307629 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Nov 27 16:23:47 crc kubenswrapper[4707]: I1127 16:23:47.311151 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Nov 27 16:23:47 crc 
kubenswrapper[4707]: I1127 16:23:47.316177 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Nov 27 16:23:47 crc kubenswrapper[4707]: I1127 16:23:47.898941 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Nov 27 16:23:47 crc kubenswrapper[4707]: I1127 16:23:47.908951 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Nov 27 16:23:50 crc kubenswrapper[4707]: I1127 16:23:50.071781 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Nov 27 16:23:50 crc kubenswrapper[4707]: I1127 16:23:50.077769 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Nov 27 16:23:50 crc kubenswrapper[4707]: I1127 16:23:50.083255 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Nov 27 16:23:50 crc kubenswrapper[4707]: I1127 16:23:50.941073 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Nov 27 16:23:56 crc kubenswrapper[4707]: I1127 16:23:56.002660 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Nov 27 16:24:06 crc kubenswrapper[4707]: I1127 16:24:06.289281 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Nov 27 16:24:07 crc kubenswrapper[4707]: I1127 16:24:07.350669 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Nov 27 16:24:10 crc kubenswrapper[4707]: I1127 16:24:10.396452 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-server-0" podUID="b31a7b86-c43f-4123-a33d-ffba2ee3d015" containerName="rabbitmq" containerID="cri-o://9be83be108fd0cd2b2af945d1754e6348621417599bb74d5879c690e62c8885e" gracePeriod=604796 Nov 27 16:24:11 crc 
kubenswrapper[4707]: I1127 16:24:11.579786 4707 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-server-0" podUID="b31a7b86-c43f-4123-a33d-ffba2ee3d015" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.95:5671: connect: connection refused" Nov 27 16:24:11 crc kubenswrapper[4707]: I1127 16:24:11.709979 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-cell1-server-0" podUID="517a2efb-7c9f-4c93-876b-5962da604ef8" containerName="rabbitmq" containerID="cri-o://a5155d0fa33d5e609f9f6c6cbec47aa989b67e73bf27a2ce322263b0929f14da" gracePeriod=604796 Nov 27 16:24:11 crc kubenswrapper[4707]: I1127 16:24:11.863550 4707 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-cell1-server-0" podUID="517a2efb-7c9f-4c93-876b-5962da604ef8" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.96:5671: connect: connection refused" Nov 27 16:24:17 crc kubenswrapper[4707]: I1127 16:24:17.077726 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Nov 27 16:24:17 crc kubenswrapper[4707]: I1127 16:24:17.194216 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/b31a7b86-c43f-4123-a33d-ffba2ee3d015-server-conf\") pod \"b31a7b86-c43f-4123-a33d-ffba2ee3d015\" (UID: \"b31a7b86-c43f-4123-a33d-ffba2ee3d015\") " Nov 27 16:24:17 crc kubenswrapper[4707]: I1127 16:24:17.194323 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/b31a7b86-c43f-4123-a33d-ffba2ee3d015-rabbitmq-confd\") pod \"b31a7b86-c43f-4123-a33d-ffba2ee3d015\" (UID: \"b31a7b86-c43f-4123-a33d-ffba2ee3d015\") " Nov 27 16:24:17 crc kubenswrapper[4707]: I1127 16:24:17.194386 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/b31a7b86-c43f-4123-a33d-ffba2ee3d015-rabbitmq-tls\") pod \"b31a7b86-c43f-4123-a33d-ffba2ee3d015\" (UID: \"b31a7b86-c43f-4123-a33d-ffba2ee3d015\") " Nov 27 16:24:17 crc kubenswrapper[4707]: I1127 16:24:17.194412 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"b31a7b86-c43f-4123-a33d-ffba2ee3d015\" (UID: \"b31a7b86-c43f-4123-a33d-ffba2ee3d015\") " Nov 27 16:24:17 crc kubenswrapper[4707]: I1127 16:24:17.194469 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q74dh\" (UniqueName: \"kubernetes.io/projected/b31a7b86-c43f-4123-a33d-ffba2ee3d015-kube-api-access-q74dh\") pod \"b31a7b86-c43f-4123-a33d-ffba2ee3d015\" (UID: \"b31a7b86-c43f-4123-a33d-ffba2ee3d015\") " Nov 27 16:24:17 crc kubenswrapper[4707]: I1127 16:24:17.194516 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: 
\"kubernetes.io/configmap/b31a7b86-c43f-4123-a33d-ffba2ee3d015-plugins-conf\") pod \"b31a7b86-c43f-4123-a33d-ffba2ee3d015\" (UID: \"b31a7b86-c43f-4123-a33d-ffba2ee3d015\") " Nov 27 16:24:17 crc kubenswrapper[4707]: I1127 16:24:17.194543 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/b31a7b86-c43f-4123-a33d-ffba2ee3d015-rabbitmq-erlang-cookie\") pod \"b31a7b86-c43f-4123-a33d-ffba2ee3d015\" (UID: \"b31a7b86-c43f-4123-a33d-ffba2ee3d015\") " Nov 27 16:24:17 crc kubenswrapper[4707]: I1127 16:24:17.194605 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/b31a7b86-c43f-4123-a33d-ffba2ee3d015-erlang-cookie-secret\") pod \"b31a7b86-c43f-4123-a33d-ffba2ee3d015\" (UID: \"b31a7b86-c43f-4123-a33d-ffba2ee3d015\") " Nov 27 16:24:17 crc kubenswrapper[4707]: I1127 16:24:17.194684 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/b31a7b86-c43f-4123-a33d-ffba2ee3d015-pod-info\") pod \"b31a7b86-c43f-4123-a33d-ffba2ee3d015\" (UID: \"b31a7b86-c43f-4123-a33d-ffba2ee3d015\") " Nov 27 16:24:17 crc kubenswrapper[4707]: I1127 16:24:17.194731 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b31a7b86-c43f-4123-a33d-ffba2ee3d015-config-data\") pod \"b31a7b86-c43f-4123-a33d-ffba2ee3d015\" (UID: \"b31a7b86-c43f-4123-a33d-ffba2ee3d015\") " Nov 27 16:24:17 crc kubenswrapper[4707]: I1127 16:24:17.194780 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/b31a7b86-c43f-4123-a33d-ffba2ee3d015-rabbitmq-plugins\") pod \"b31a7b86-c43f-4123-a33d-ffba2ee3d015\" (UID: \"b31a7b86-c43f-4123-a33d-ffba2ee3d015\") " Nov 27 16:24:17 crc kubenswrapper[4707]: 
I1127 16:24:17.195548 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b31a7b86-c43f-4123-a33d-ffba2ee3d015-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "b31a7b86-c43f-4123-a33d-ffba2ee3d015" (UID: "b31a7b86-c43f-4123-a33d-ffba2ee3d015"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 16:24:17 crc kubenswrapper[4707]: I1127 16:24:17.195672 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b31a7b86-c43f-4123-a33d-ffba2ee3d015-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "b31a7b86-c43f-4123-a33d-ffba2ee3d015" (UID: "b31a7b86-c43f-4123-a33d-ffba2ee3d015"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 27 16:24:17 crc kubenswrapper[4707]: I1127 16:24:17.196085 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b31a7b86-c43f-4123-a33d-ffba2ee3d015-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "b31a7b86-c43f-4123-a33d-ffba2ee3d015" (UID: "b31a7b86-c43f-4123-a33d-ffba2ee3d015"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 27 16:24:17 crc kubenswrapper[4707]: I1127 16:24:17.208545 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b31a7b86-c43f-4123-a33d-ffba2ee3d015-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "b31a7b86-c43f-4123-a33d-ffba2ee3d015" (UID: "b31a7b86-c43f-4123-a33d-ffba2ee3d015"). InnerVolumeSpecName "erlang-cookie-secret". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 16:24:17 crc kubenswrapper[4707]: I1127 16:24:17.212795 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b31a7b86-c43f-4123-a33d-ffba2ee3d015-kube-api-access-q74dh" (OuterVolumeSpecName: "kube-api-access-q74dh") pod "b31a7b86-c43f-4123-a33d-ffba2ee3d015" (UID: "b31a7b86-c43f-4123-a33d-ffba2ee3d015"). InnerVolumeSpecName "kube-api-access-q74dh". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 16:24:17 crc kubenswrapper[4707]: I1127 16:24:17.218555 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b31a7b86-c43f-4123-a33d-ffba2ee3d015-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "b31a7b86-c43f-4123-a33d-ffba2ee3d015" (UID: "b31a7b86-c43f-4123-a33d-ffba2ee3d015"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 16:24:17 crc kubenswrapper[4707]: I1127 16:24:17.218561 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/b31a7b86-c43f-4123-a33d-ffba2ee3d015-pod-info" (OuterVolumeSpecName: "pod-info") pod "b31a7b86-c43f-4123-a33d-ffba2ee3d015" (UID: "b31a7b86-c43f-4123-a33d-ffba2ee3d015"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Nov 27 16:24:17 crc kubenswrapper[4707]: I1127 16:24:17.225247 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage05-crc" (OuterVolumeSpecName: "persistence") pod "b31a7b86-c43f-4123-a33d-ffba2ee3d015" (UID: "b31a7b86-c43f-4123-a33d-ffba2ee3d015"). InnerVolumeSpecName "local-storage05-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue ""
Nov 27 16:24:17 crc kubenswrapper[4707]: I1127 16:24:17.245659 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b31a7b86-c43f-4123-a33d-ffba2ee3d015-config-data" (OuterVolumeSpecName: "config-data") pod "b31a7b86-c43f-4123-a33d-ffba2ee3d015" (UID: "b31a7b86-c43f-4123-a33d-ffba2ee3d015"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 27 16:24:17 crc kubenswrapper[4707]: I1127 16:24:17.255550 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b31a7b86-c43f-4123-a33d-ffba2ee3d015-server-conf" (OuterVolumeSpecName: "server-conf") pod "b31a7b86-c43f-4123-a33d-ffba2ee3d015" (UID: "b31a7b86-c43f-4123-a33d-ffba2ee3d015"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 27 16:24:17 crc kubenswrapper[4707]: I1127 16:24:17.298617 4707 generic.go:334] "Generic (PLEG): container finished" podID="b31a7b86-c43f-4123-a33d-ffba2ee3d015" containerID="9be83be108fd0cd2b2af945d1754e6348621417599bb74d5879c690e62c8885e" exitCode=0
Nov 27 16:24:17 crc kubenswrapper[4707]: I1127 16:24:17.298697 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0"
Nov 27 16:24:17 crc kubenswrapper[4707]: I1127 16:24:17.299860 4707 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/b31a7b86-c43f-4123-a33d-ffba2ee3d015-rabbitmq-tls\") on node \"crc\" DevicePath \"\""
Nov 27 16:24:17 crc kubenswrapper[4707]: I1127 16:24:17.299903 4707 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" "
Nov 27 16:24:17 crc kubenswrapper[4707]: I1127 16:24:17.299913 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q74dh\" (UniqueName: \"kubernetes.io/projected/b31a7b86-c43f-4123-a33d-ffba2ee3d015-kube-api-access-q74dh\") on node \"crc\" DevicePath \"\""
Nov 27 16:24:17 crc kubenswrapper[4707]: I1127 16:24:17.299925 4707 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/b31a7b86-c43f-4123-a33d-ffba2ee3d015-plugins-conf\") on node \"crc\" DevicePath \"\""
Nov 27 16:24:17 crc kubenswrapper[4707]: I1127 16:24:17.299935 4707 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/b31a7b86-c43f-4123-a33d-ffba2ee3d015-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\""
Nov 27 16:24:17 crc kubenswrapper[4707]: I1127 16:24:17.299943 4707 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/b31a7b86-c43f-4123-a33d-ffba2ee3d015-erlang-cookie-secret\") on node \"crc\" DevicePath \"\""
Nov 27 16:24:17 crc kubenswrapper[4707]: I1127 16:24:17.299950 4707 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/b31a7b86-c43f-4123-a33d-ffba2ee3d015-pod-info\") on node \"crc\" DevicePath \"\""
Nov 27 16:24:17 crc kubenswrapper[4707]: I1127 16:24:17.299960 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b31a7b86-c43f-4123-a33d-ffba2ee3d015-config-data\") on node \"crc\" DevicePath \"\""
Nov 27 16:24:17 crc kubenswrapper[4707]: I1127 16:24:17.299969 4707 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/b31a7b86-c43f-4123-a33d-ffba2ee3d015-rabbitmq-plugins\") on node \"crc\" DevicePath \"\""
Nov 27 16:24:17 crc kubenswrapper[4707]: I1127 16:24:17.299976 4707 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/b31a7b86-c43f-4123-a33d-ffba2ee3d015-server-conf\") on node \"crc\" DevicePath \"\""
Nov 27 16:24:17 crc kubenswrapper[4707]: I1127 16:24:17.304679 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"b31a7b86-c43f-4123-a33d-ffba2ee3d015","Type":"ContainerDied","Data":"9be83be108fd0cd2b2af945d1754e6348621417599bb74d5879c690e62c8885e"}
Nov 27 16:24:17 crc kubenswrapper[4707]: I1127 16:24:17.304783 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"b31a7b86-c43f-4123-a33d-ffba2ee3d015","Type":"ContainerDied","Data":"df01efea3bad1d8159514b124f5783efe2cc1351406f89eef8a9867fc08130c6"}
Nov 27 16:24:17 crc kubenswrapper[4707]: I1127 16:24:17.304859 4707 scope.go:117] "RemoveContainer" containerID="9be83be108fd0cd2b2af945d1754e6348621417599bb74d5879c690e62c8885e"
Nov 27 16:24:17 crc kubenswrapper[4707]: I1127 16:24:17.326808 4707 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage05-crc" (UniqueName: "kubernetes.io/local-volume/local-storage05-crc") on node "crc"
Nov 27 16:24:17 crc kubenswrapper[4707]: I1127 16:24:17.348406 4707 scope.go:117] "RemoveContainer" containerID="422f2d77f62c72279e75322c43beedda7bb264d013ad32e8514fa1cdf5f2c05e"
Nov 27 16:24:17 crc kubenswrapper[4707]: I1127 16:24:17.355823 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b31a7b86-c43f-4123-a33d-ffba2ee3d015-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "b31a7b86-c43f-4123-a33d-ffba2ee3d015" (UID: "b31a7b86-c43f-4123-a33d-ffba2ee3d015"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 27 16:24:17 crc kubenswrapper[4707]: I1127 16:24:17.368358 4707 scope.go:117] "RemoveContainer" containerID="9be83be108fd0cd2b2af945d1754e6348621417599bb74d5879c690e62c8885e"
Nov 27 16:24:17 crc kubenswrapper[4707]: E1127 16:24:17.368830 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9be83be108fd0cd2b2af945d1754e6348621417599bb74d5879c690e62c8885e\": container with ID starting with 9be83be108fd0cd2b2af945d1754e6348621417599bb74d5879c690e62c8885e not found: ID does not exist" containerID="9be83be108fd0cd2b2af945d1754e6348621417599bb74d5879c690e62c8885e"
Nov 27 16:24:17 crc kubenswrapper[4707]: I1127 16:24:17.368916 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9be83be108fd0cd2b2af945d1754e6348621417599bb74d5879c690e62c8885e"} err="failed to get container status \"9be83be108fd0cd2b2af945d1754e6348621417599bb74d5879c690e62c8885e\": rpc error: code = NotFound desc = could not find container \"9be83be108fd0cd2b2af945d1754e6348621417599bb74d5879c690e62c8885e\": container with ID starting with 9be83be108fd0cd2b2af945d1754e6348621417599bb74d5879c690e62c8885e not found: ID does not exist"
Nov 27 16:24:17 crc kubenswrapper[4707]: I1127 16:24:17.369006 4707 scope.go:117] "RemoveContainer" containerID="422f2d77f62c72279e75322c43beedda7bb264d013ad32e8514fa1cdf5f2c05e"
Nov 27 16:24:17 crc kubenswrapper[4707]: E1127 16:24:17.369358 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"422f2d77f62c72279e75322c43beedda7bb264d013ad32e8514fa1cdf5f2c05e\": container with ID starting with 422f2d77f62c72279e75322c43beedda7bb264d013ad32e8514fa1cdf5f2c05e not found: ID does not exist" containerID="422f2d77f62c72279e75322c43beedda7bb264d013ad32e8514fa1cdf5f2c05e"
Nov 27 16:24:17 crc kubenswrapper[4707]: I1127 16:24:17.369447 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"422f2d77f62c72279e75322c43beedda7bb264d013ad32e8514fa1cdf5f2c05e"} err="failed to get container status \"422f2d77f62c72279e75322c43beedda7bb264d013ad32e8514fa1cdf5f2c05e\": rpc error: code = NotFound desc = could not find container \"422f2d77f62c72279e75322c43beedda7bb264d013ad32e8514fa1cdf5f2c05e\": container with ID starting with 422f2d77f62c72279e75322c43beedda7bb264d013ad32e8514fa1cdf5f2c05e not found: ID does not exist"
Nov 27 16:24:17 crc kubenswrapper[4707]: I1127 16:24:17.401455 4707 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/b31a7b86-c43f-4123-a33d-ffba2ee3d015-rabbitmq-confd\") on node \"crc\" DevicePath \"\""
Nov 27 16:24:17 crc kubenswrapper[4707]: I1127 16:24:17.401666 4707 reconciler_common.go:293] "Volume detached for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" DevicePath \"\""
Nov 27 16:24:17 crc kubenswrapper[4707]: I1127 16:24:17.702529 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"]
Nov 27 16:24:17 crc kubenswrapper[4707]: I1127 16:24:17.715582 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-server-0"]
Nov 27 16:24:17 crc kubenswrapper[4707]: I1127 16:24:17.735707 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"]
Nov 27 16:24:17 crc kubenswrapper[4707]: E1127 16:24:17.736103 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b31a7b86-c43f-4123-a33d-ffba2ee3d015" containerName="rabbitmq"
Nov 27 16:24:17 crc kubenswrapper[4707]: I1127 16:24:17.736125 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="b31a7b86-c43f-4123-a33d-ffba2ee3d015" containerName="rabbitmq"
Nov 27 16:24:17 crc kubenswrapper[4707]: E1127 16:24:17.736170 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b31a7b86-c43f-4123-a33d-ffba2ee3d015" containerName="setup-container"
Nov 27 16:24:17 crc kubenswrapper[4707]: I1127 16:24:17.736180 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="b31a7b86-c43f-4123-a33d-ffba2ee3d015" containerName="setup-container"
Nov 27 16:24:17 crc kubenswrapper[4707]: I1127 16:24:17.736386 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="b31a7b86-c43f-4123-a33d-ffba2ee3d015" containerName="rabbitmq"
Nov 27 16:24:17 crc kubenswrapper[4707]: I1127 16:24:17.737307 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0"
Nov 27 16:24:17 crc kubenswrapper[4707]: I1127 16:24:17.739937 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data"
Nov 27 16:24:17 crc kubenswrapper[4707]: I1127 16:24:17.740094 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf"
Nov 27 16:24:17 crc kubenswrapper[4707]: I1127 16:24:17.740135 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user"
Nov 27 16:24:17 crc kubenswrapper[4707]: I1127 16:24:17.741340 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie"
Nov 27 16:24:17 crc kubenswrapper[4707]: I1127 16:24:17.741835 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf"
Nov 27 16:24:17 crc kubenswrapper[4707]: I1127 16:24:17.742437 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc"
Nov 27 16:24:17 crc kubenswrapper[4707]: I1127 16:24:17.742572 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-wwft8"
Nov 27 16:24:17 crc kubenswrapper[4707]: I1127 16:24:17.785100 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"]
Nov 27 16:24:17 crc kubenswrapper[4707]: I1127 16:24:17.809433 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/82ba4b51-2b4f-4ed6-8ef9-453386ff71da-pod-info\") pod \"rabbitmq-server-0\" (UID: \"82ba4b51-2b4f-4ed6-8ef9-453386ff71da\") " pod="openstack/rabbitmq-server-0"
Nov 27 16:24:17 crc kubenswrapper[4707]: I1127 16:24:17.809745 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/82ba4b51-2b4f-4ed6-8ef9-453386ff71da-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"82ba4b51-2b4f-4ed6-8ef9-453386ff71da\") " pod="openstack/rabbitmq-server-0"
Nov 27 16:24:17 crc kubenswrapper[4707]: I1127 16:24:17.809912 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/82ba4b51-2b4f-4ed6-8ef9-453386ff71da-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"82ba4b51-2b4f-4ed6-8ef9-453386ff71da\") " pod="openstack/rabbitmq-server-0"
Nov 27 16:24:17 crc kubenswrapper[4707]: I1127 16:24:17.810007 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/82ba4b51-2b4f-4ed6-8ef9-453386ff71da-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"82ba4b51-2b4f-4ed6-8ef9-453386ff71da\") " pod="openstack/rabbitmq-server-0"
Nov 27 16:24:17 crc kubenswrapper[4707]: I1127 16:24:17.810135 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bnpvh\" (UniqueName: \"kubernetes.io/projected/82ba4b51-2b4f-4ed6-8ef9-453386ff71da-kube-api-access-bnpvh\") pod \"rabbitmq-server-0\" (UID: \"82ba4b51-2b4f-4ed6-8ef9-453386ff71da\") " pod="openstack/rabbitmq-server-0"
Nov 27 16:24:17 crc kubenswrapper[4707]: I1127 16:24:17.810210 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"rabbitmq-server-0\" (UID: \"82ba4b51-2b4f-4ed6-8ef9-453386ff71da\") " pod="openstack/rabbitmq-server-0"
Nov 27 16:24:17 crc kubenswrapper[4707]: I1127 16:24:17.811198 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/82ba4b51-2b4f-4ed6-8ef9-453386ff71da-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"82ba4b51-2b4f-4ed6-8ef9-453386ff71da\") " pod="openstack/rabbitmq-server-0"
Nov 27 16:24:17 crc kubenswrapper[4707]: I1127 16:24:17.811314 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/82ba4b51-2b4f-4ed6-8ef9-453386ff71da-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"82ba4b51-2b4f-4ed6-8ef9-453386ff71da\") " pod="openstack/rabbitmq-server-0"
Nov 27 16:24:17 crc kubenswrapper[4707]: I1127 16:24:17.811493 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/82ba4b51-2b4f-4ed6-8ef9-453386ff71da-config-data\") pod \"rabbitmq-server-0\" (UID: \"82ba4b51-2b4f-4ed6-8ef9-453386ff71da\") " pod="openstack/rabbitmq-server-0"
Nov 27 16:24:17 crc kubenswrapper[4707]: I1127 16:24:17.811649 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/82ba4b51-2b4f-4ed6-8ef9-453386ff71da-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"82ba4b51-2b4f-4ed6-8ef9-453386ff71da\") " pod="openstack/rabbitmq-server-0"
Nov 27 16:24:17 crc kubenswrapper[4707]: I1127 16:24:17.811759 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/82ba4b51-2b4f-4ed6-8ef9-453386ff71da-server-conf\") pod \"rabbitmq-server-0\" (UID: \"82ba4b51-2b4f-4ed6-8ef9-453386ff71da\") " pod="openstack/rabbitmq-server-0"
Nov 27 16:24:17 crc kubenswrapper[4707]: I1127 16:24:17.913124 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/82ba4b51-2b4f-4ed6-8ef9-453386ff71da-pod-info\") pod \"rabbitmq-server-0\" (UID: \"82ba4b51-2b4f-4ed6-8ef9-453386ff71da\") " pod="openstack/rabbitmq-server-0"
Nov 27 16:24:17 crc kubenswrapper[4707]: I1127 16:24:17.913193 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/82ba4b51-2b4f-4ed6-8ef9-453386ff71da-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"82ba4b51-2b4f-4ed6-8ef9-453386ff71da\") " pod="openstack/rabbitmq-server-0"
Nov 27 16:24:17 crc kubenswrapper[4707]: I1127 16:24:17.913249 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/82ba4b51-2b4f-4ed6-8ef9-453386ff71da-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"82ba4b51-2b4f-4ed6-8ef9-453386ff71da\") " pod="openstack/rabbitmq-server-0"
Nov 27 16:24:17 crc kubenswrapper[4707]: I1127 16:24:17.913271 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/82ba4b51-2b4f-4ed6-8ef9-453386ff71da-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"82ba4b51-2b4f-4ed6-8ef9-453386ff71da\") " pod="openstack/rabbitmq-server-0"
Nov 27 16:24:17 crc kubenswrapper[4707]: I1127 16:24:17.913316 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bnpvh\" (UniqueName: \"kubernetes.io/projected/82ba4b51-2b4f-4ed6-8ef9-453386ff71da-kube-api-access-bnpvh\") pod \"rabbitmq-server-0\" (UID: \"82ba4b51-2b4f-4ed6-8ef9-453386ff71da\") " pod="openstack/rabbitmq-server-0"
Nov 27 16:24:17 crc kubenswrapper[4707]: I1127 16:24:17.913339 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"rabbitmq-server-0\" (UID: \"82ba4b51-2b4f-4ed6-8ef9-453386ff71da\") " pod="openstack/rabbitmq-server-0"
Nov 27 16:24:17 crc kubenswrapper[4707]: I1127 16:24:17.913418 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/82ba4b51-2b4f-4ed6-8ef9-453386ff71da-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"82ba4b51-2b4f-4ed6-8ef9-453386ff71da\") " pod="openstack/rabbitmq-server-0"
Nov 27 16:24:17 crc kubenswrapper[4707]: I1127 16:24:17.913451 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/82ba4b51-2b4f-4ed6-8ef9-453386ff71da-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"82ba4b51-2b4f-4ed6-8ef9-453386ff71da\") " pod="openstack/rabbitmq-server-0"
Nov 27 16:24:17 crc kubenswrapper[4707]: I1127 16:24:17.913479 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/82ba4b51-2b4f-4ed6-8ef9-453386ff71da-config-data\") pod \"rabbitmq-server-0\" (UID: \"82ba4b51-2b4f-4ed6-8ef9-453386ff71da\") " pod="openstack/rabbitmq-server-0"
Nov 27 16:24:17 crc kubenswrapper[4707]: I1127 16:24:17.913535 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/82ba4b51-2b4f-4ed6-8ef9-453386ff71da-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"82ba4b51-2b4f-4ed6-8ef9-453386ff71da\") " pod="openstack/rabbitmq-server-0"
Nov 27 16:24:17 crc kubenswrapper[4707]: I1127 16:24:17.913582 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/82ba4b51-2b4f-4ed6-8ef9-453386ff71da-server-conf\") pod \"rabbitmq-server-0\" (UID: \"82ba4b51-2b4f-4ed6-8ef9-453386ff71da\") " pod="openstack/rabbitmq-server-0"
Nov 27 16:24:17 crc kubenswrapper[4707]: I1127 16:24:17.914581 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/82ba4b51-2b4f-4ed6-8ef9-453386ff71da-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"82ba4b51-2b4f-4ed6-8ef9-453386ff71da\") " pod="openstack/rabbitmq-server-0"
Nov 27 16:24:17 crc kubenswrapper[4707]: I1127 16:24:17.914803 4707 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"rabbitmq-server-0\" (UID: \"82ba4b51-2b4f-4ed6-8ef9-453386ff71da\") device mount path \"/mnt/openstack/pv05\"" pod="openstack/rabbitmq-server-0"
Nov 27 16:24:17 crc kubenswrapper[4707]: I1127 16:24:17.915507 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/82ba4b51-2b4f-4ed6-8ef9-453386ff71da-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"82ba4b51-2b4f-4ed6-8ef9-453386ff71da\") " pod="openstack/rabbitmq-server-0"
Nov 27 16:24:17 crc kubenswrapper[4707]: I1127 16:24:17.916193 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/82ba4b51-2b4f-4ed6-8ef9-453386ff71da-config-data\") pod \"rabbitmq-server-0\" (UID: \"82ba4b51-2b4f-4ed6-8ef9-453386ff71da\") " pod="openstack/rabbitmq-server-0"
Nov 27 16:24:17 crc kubenswrapper[4707]: I1127 16:24:17.916313 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/82ba4b51-2b4f-4ed6-8ef9-453386ff71da-server-conf\") pod \"rabbitmq-server-0\" (UID: \"82ba4b51-2b4f-4ed6-8ef9-453386ff71da\") " pod="openstack/rabbitmq-server-0"
Nov 27 16:24:17 crc kubenswrapper[4707]: I1127 16:24:17.916476 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/82ba4b51-2b4f-4ed6-8ef9-453386ff71da-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"82ba4b51-2b4f-4ed6-8ef9-453386ff71da\") " pod="openstack/rabbitmq-server-0"
Nov 27 16:24:17 crc kubenswrapper[4707]: I1127 16:24:17.918943 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/82ba4b51-2b4f-4ed6-8ef9-453386ff71da-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"82ba4b51-2b4f-4ed6-8ef9-453386ff71da\") " pod="openstack/rabbitmq-server-0"
Nov 27 16:24:17 crc kubenswrapper[4707]: I1127 16:24:17.919850 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/82ba4b51-2b4f-4ed6-8ef9-453386ff71da-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"82ba4b51-2b4f-4ed6-8ef9-453386ff71da\") " pod="openstack/rabbitmq-server-0"
Nov 27 16:24:17 crc kubenswrapper[4707]: I1127 16:24:17.920016 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/82ba4b51-2b4f-4ed6-8ef9-453386ff71da-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"82ba4b51-2b4f-4ed6-8ef9-453386ff71da\") " pod="openstack/rabbitmq-server-0"
Nov 27 16:24:17 crc kubenswrapper[4707]: I1127 16:24:17.920795 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/82ba4b51-2b4f-4ed6-8ef9-453386ff71da-pod-info\") pod \"rabbitmq-server-0\" (UID: \"82ba4b51-2b4f-4ed6-8ef9-453386ff71da\") " pod="openstack/rabbitmq-server-0"
Nov 27 16:24:17 crc kubenswrapper[4707]: I1127 16:24:17.939763 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bnpvh\" (UniqueName: \"kubernetes.io/projected/82ba4b51-2b4f-4ed6-8ef9-453386ff71da-kube-api-access-bnpvh\") pod \"rabbitmq-server-0\" (UID: \"82ba4b51-2b4f-4ed6-8ef9-453386ff71da\") " pod="openstack/rabbitmq-server-0"
Nov 27 16:24:17 crc kubenswrapper[4707]: I1127 16:24:17.958937 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"rabbitmq-server-0\" (UID: \"82ba4b51-2b4f-4ed6-8ef9-453386ff71da\") " pod="openstack/rabbitmq-server-0"
Nov 27 16:24:18 crc kubenswrapper[4707]: I1127 16:24:18.058764 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0"
Nov 27 16:24:18 crc kubenswrapper[4707]: E1127 16:24:18.072179 4707 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod517a2efb_7c9f_4c93_876b_5962da604ef8.slice/crio-a5155d0fa33d5e609f9f6c6cbec47aa989b67e73bf27a2ce322263b0929f14da.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod517a2efb_7c9f_4c93_876b_5962da604ef8.slice/crio-conmon-a5155d0fa33d5e609f9f6c6cbec47aa989b67e73bf27a2ce322263b0929f14da.scope\": RecentStats: unable to find data in memory cache]"
Nov 27 16:24:18 crc kubenswrapper[4707]: I1127 16:24:18.202103 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0"
Nov 27 16:24:18 crc kubenswrapper[4707]: I1127 16:24:18.312916 4707 generic.go:334] "Generic (PLEG): container finished" podID="517a2efb-7c9f-4c93-876b-5962da604ef8" containerID="a5155d0fa33d5e609f9f6c6cbec47aa989b67e73bf27a2ce322263b0929f14da" exitCode=0
Nov 27 16:24:18 crc kubenswrapper[4707]: I1127 16:24:18.312980 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"517a2efb-7c9f-4c93-876b-5962da604ef8","Type":"ContainerDied","Data":"a5155d0fa33d5e609f9f6c6cbec47aa989b67e73bf27a2ce322263b0929f14da"}
Nov 27 16:24:18 crc kubenswrapper[4707]: I1127 16:24:18.313001 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0"
Nov 27 16:24:18 crc kubenswrapper[4707]: I1127 16:24:18.313005 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"517a2efb-7c9f-4c93-876b-5962da604ef8","Type":"ContainerDied","Data":"c53e6c79d3296ea4c3efd8e59f1e7a6e8cc4c6130b53b6eb4b6cb8b7c8cf5785"}
Nov 27 16:24:18 crc kubenswrapper[4707]: I1127 16:24:18.313421 4707 scope.go:117] "RemoveContainer" containerID="a5155d0fa33d5e609f9f6c6cbec47aa989b67e73bf27a2ce322263b0929f14da"
Nov 27 16:24:18 crc kubenswrapper[4707]: I1127 16:24:18.326278 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/517a2efb-7c9f-4c93-876b-5962da604ef8-rabbitmq-tls\") pod \"517a2efb-7c9f-4c93-876b-5962da604ef8\" (UID: \"517a2efb-7c9f-4c93-876b-5962da604ef8\") "
Nov 27 16:24:18 crc kubenswrapper[4707]: I1127 16:24:18.326330 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/517a2efb-7c9f-4c93-876b-5962da604ef8-server-conf\") pod \"517a2efb-7c9f-4c93-876b-5962da604ef8\" (UID: \"517a2efb-7c9f-4c93-876b-5962da604ef8\") "
Nov 27 16:24:18 crc kubenswrapper[4707]: I1127 16:24:18.326430 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/517a2efb-7c9f-4c93-876b-5962da604ef8-rabbitmq-plugins\") pod \"517a2efb-7c9f-4c93-876b-5962da604ef8\" (UID: \"517a2efb-7c9f-4c93-876b-5962da604ef8\") "
Nov 27 16:24:18 crc kubenswrapper[4707]: I1127 16:24:18.326463 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/517a2efb-7c9f-4c93-876b-5962da604ef8-config-data\") pod \"517a2efb-7c9f-4c93-876b-5962da604ef8\" (UID: \"517a2efb-7c9f-4c93-876b-5962da604ef8\") "
Nov 27 16:24:18 crc kubenswrapper[4707]: I1127 16:24:18.326484 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/517a2efb-7c9f-4c93-876b-5962da604ef8-erlang-cookie-secret\") pod \"517a2efb-7c9f-4c93-876b-5962da604ef8\" (UID: \"517a2efb-7c9f-4c93-876b-5962da604ef8\") "
Nov 27 16:24:18 crc kubenswrapper[4707]: I1127 16:24:18.326586 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"517a2efb-7c9f-4c93-876b-5962da604ef8\" (UID: \"517a2efb-7c9f-4c93-876b-5962da604ef8\") "
Nov 27 16:24:18 crc kubenswrapper[4707]: I1127 16:24:18.326603 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/517a2efb-7c9f-4c93-876b-5962da604ef8-rabbitmq-confd\") pod \"517a2efb-7c9f-4c93-876b-5962da604ef8\" (UID: \"517a2efb-7c9f-4c93-876b-5962da604ef8\") "
Nov 27 16:24:18 crc kubenswrapper[4707]: I1127 16:24:18.326655 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/517a2efb-7c9f-4c93-876b-5962da604ef8-rabbitmq-erlang-cookie\") pod \"517a2efb-7c9f-4c93-876b-5962da604ef8\" (UID: \"517a2efb-7c9f-4c93-876b-5962da604ef8\") "
Nov 27 16:24:18 crc kubenswrapper[4707]: I1127 16:24:18.326715 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hqwl4\" (UniqueName: \"kubernetes.io/projected/517a2efb-7c9f-4c93-876b-5962da604ef8-kube-api-access-hqwl4\") pod \"517a2efb-7c9f-4c93-876b-5962da604ef8\" (UID: \"517a2efb-7c9f-4c93-876b-5962da604ef8\") "
Nov 27 16:24:18 crc kubenswrapper[4707]: I1127 16:24:18.326733 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/517a2efb-7c9f-4c93-876b-5962da604ef8-pod-info\") pod \"517a2efb-7c9f-4c93-876b-5962da604ef8\" (UID: \"517a2efb-7c9f-4c93-876b-5962da604ef8\") "
Nov 27 16:24:18 crc kubenswrapper[4707]: I1127 16:24:18.326804 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/517a2efb-7c9f-4c93-876b-5962da604ef8-plugins-conf\") pod \"517a2efb-7c9f-4c93-876b-5962da604ef8\" (UID: \"517a2efb-7c9f-4c93-876b-5962da604ef8\") "
Nov 27 16:24:18 crc kubenswrapper[4707]: I1127 16:24:18.333518 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/517a2efb-7c9f-4c93-876b-5962da604ef8-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "517a2efb-7c9f-4c93-876b-5962da604ef8" (UID: "517a2efb-7c9f-4c93-876b-5962da604ef8"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Nov 27 16:24:18 crc kubenswrapper[4707]: I1127 16:24:18.334269 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/517a2efb-7c9f-4c93-876b-5962da604ef8-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "517a2efb-7c9f-4c93-876b-5962da604ef8" (UID: "517a2efb-7c9f-4c93-876b-5962da604ef8"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Nov 27 16:24:18 crc kubenswrapper[4707]: I1127 16:24:18.335448 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/517a2efb-7c9f-4c93-876b-5962da604ef8-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "517a2efb-7c9f-4c93-876b-5962da604ef8" (UID: "517a2efb-7c9f-4c93-876b-5962da604ef8"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 27 16:24:18 crc kubenswrapper[4707]: I1127 16:24:18.336191 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/517a2efb-7c9f-4c93-876b-5962da604ef8-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "517a2efb-7c9f-4c93-876b-5962da604ef8" (UID: "517a2efb-7c9f-4c93-876b-5962da604ef8"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 27 16:24:18 crc kubenswrapper[4707]: I1127 16:24:18.339306 4707 scope.go:117] "RemoveContainer" containerID="df03a765a612e1232ddf145ad82f405beba655bd4286069693c18c9067e1ee60"
Nov 27 16:24:18 crc kubenswrapper[4707]: I1127 16:24:18.340486 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/517a2efb-7c9f-4c93-876b-5962da604ef8-pod-info" (OuterVolumeSpecName: "pod-info") pod "517a2efb-7c9f-4c93-876b-5962da604ef8" (UID: "517a2efb-7c9f-4c93-876b-5962da604ef8"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue ""
Nov 27 16:24:18 crc kubenswrapper[4707]: I1127 16:24:18.341696 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/517a2efb-7c9f-4c93-876b-5962da604ef8-kube-api-access-hqwl4" (OuterVolumeSpecName: "kube-api-access-hqwl4") pod "517a2efb-7c9f-4c93-876b-5962da604ef8" (UID: "517a2efb-7c9f-4c93-876b-5962da604ef8"). InnerVolumeSpecName "kube-api-access-hqwl4". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 27 16:24:18 crc kubenswrapper[4707]: I1127 16:24:18.341862 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage07-crc" (OuterVolumeSpecName: "persistence") pod "517a2efb-7c9f-4c93-876b-5962da604ef8" (UID: "517a2efb-7c9f-4c93-876b-5962da604ef8"). InnerVolumeSpecName "local-storage07-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue ""
Nov 27 16:24:18 crc kubenswrapper[4707]: I1127 16:24:18.345062 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/517a2efb-7c9f-4c93-876b-5962da604ef8-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "517a2efb-7c9f-4c93-876b-5962da604ef8" (UID: "517a2efb-7c9f-4c93-876b-5962da604ef8"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 27 16:24:18 crc kubenswrapper[4707]: I1127 16:24:18.361563 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/517a2efb-7c9f-4c93-876b-5962da604ef8-config-data" (OuterVolumeSpecName: "config-data") pod "517a2efb-7c9f-4c93-876b-5962da604ef8" (UID: "517a2efb-7c9f-4c93-876b-5962da604ef8"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 27 16:24:18 crc kubenswrapper[4707]: I1127 16:24:18.368048 4707 scope.go:117] "RemoveContainer" containerID="a5155d0fa33d5e609f9f6c6cbec47aa989b67e73bf27a2ce322263b0929f14da"
Nov 27 16:24:18 crc kubenswrapper[4707]: E1127 16:24:18.368528 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a5155d0fa33d5e609f9f6c6cbec47aa989b67e73bf27a2ce322263b0929f14da\": container with ID starting with a5155d0fa33d5e609f9f6c6cbec47aa989b67e73bf27a2ce322263b0929f14da not found: ID does not exist" containerID="a5155d0fa33d5e609f9f6c6cbec47aa989b67e73bf27a2ce322263b0929f14da"
Nov 27 16:24:18 crc kubenswrapper[4707]: I1127 16:24:18.368563 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a5155d0fa33d5e609f9f6c6cbec47aa989b67e73bf27a2ce322263b0929f14da"} err="failed to get container status \"a5155d0fa33d5e609f9f6c6cbec47aa989b67e73bf27a2ce322263b0929f14da\": rpc error: code = NotFound desc = could not find container \"a5155d0fa33d5e609f9f6c6cbec47aa989b67e73bf27a2ce322263b0929f14da\": container with ID starting with a5155d0fa33d5e609f9f6c6cbec47aa989b67e73bf27a2ce322263b0929f14da not found: ID does not exist"
Nov 27 16:24:18 crc kubenswrapper[4707]: I1127 16:24:18.368585 4707 scope.go:117] "RemoveContainer" containerID="df03a765a612e1232ddf145ad82f405beba655bd4286069693c18c9067e1ee60"
Nov 27 16:24:18 crc kubenswrapper[4707]: E1127 16:24:18.368891 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"df03a765a612e1232ddf145ad82f405beba655bd4286069693c18c9067e1ee60\": container with ID starting with df03a765a612e1232ddf145ad82f405beba655bd4286069693c18c9067e1ee60 not found: ID does not exist" containerID="df03a765a612e1232ddf145ad82f405beba655bd4286069693c18c9067e1ee60"
Nov 27 16:24:18 crc kubenswrapper[4707]: I1127 16:24:18.368927 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"df03a765a612e1232ddf145ad82f405beba655bd4286069693c18c9067e1ee60"} err="failed to get container status \"df03a765a612e1232ddf145ad82f405beba655bd4286069693c18c9067e1ee60\": rpc error: code = NotFound desc = could not find container \"df03a765a612e1232ddf145ad82f405beba655bd4286069693c18c9067e1ee60\": container with ID starting with df03a765a612e1232ddf145ad82f405beba655bd4286069693c18c9067e1ee60 not found: ID does not exist"
Nov 27 16:24:18 crc kubenswrapper[4707]: I1127 16:24:18.408724 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/517a2efb-7c9f-4c93-876b-5962da604ef8-server-conf" (OuterVolumeSpecName: "server-conf") pod "517a2efb-7c9f-4c93-876b-5962da604ef8" (UID: "517a2efb-7c9f-4c93-876b-5962da604ef8"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 27 16:24:18 crc kubenswrapper[4707]: I1127 16:24:18.429640 4707 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" "
Nov 27 16:24:18 crc kubenswrapper[4707]: I1127 16:24:18.429671 4707 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/517a2efb-7c9f-4c93-876b-5962da604ef8-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\""
Nov 27 16:24:18 crc kubenswrapper[4707]: I1127 16:24:18.429983 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hqwl4\" (UniqueName: \"kubernetes.io/projected/517a2efb-7c9f-4c93-876b-5962da604ef8-kube-api-access-hqwl4\") on node \"crc\" DevicePath \"\""
Nov 27 16:24:18 crc kubenswrapper[4707]: I1127 16:24:18.430069 4707 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/517a2efb-7c9f-4c93-876b-5962da604ef8-pod-info\") on node \"crc\" DevicePath \"\""
Nov 27 16:24:18 crc kubenswrapper[4707]: I1127 16:24:18.430149 4707 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/517a2efb-7c9f-4c93-876b-5962da604ef8-plugins-conf\") on node \"crc\" DevicePath \"\""
Nov 27 16:24:18 crc kubenswrapper[4707]: I1127 16:24:18.430164 4707 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/517a2efb-7c9f-4c93-876b-5962da604ef8-rabbitmq-tls\") on node \"crc\" DevicePath \"\""
Nov 27 16:24:18 crc kubenswrapper[4707]: I1127 16:24:18.430174 4707 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/517a2efb-7c9f-4c93-876b-5962da604ef8-server-conf\") on node \"crc\" DevicePath \"\""
Nov 27 16:24:18 crc kubenswrapper[4707]: I1127 16:24:18.430182 4707
reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/517a2efb-7c9f-4c93-876b-5962da604ef8-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Nov 27 16:24:18 crc kubenswrapper[4707]: I1127 16:24:18.430190 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/517a2efb-7c9f-4c93-876b-5962da604ef8-config-data\") on node \"crc\" DevicePath \"\"" Nov 27 16:24:18 crc kubenswrapper[4707]: I1127 16:24:18.430198 4707 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/517a2efb-7c9f-4c93-876b-5962da604ef8-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Nov 27 16:24:18 crc kubenswrapper[4707]: I1127 16:24:18.459052 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/517a2efb-7c9f-4c93-876b-5962da604ef8-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "517a2efb-7c9f-4c93-876b-5962da604ef8" (UID: "517a2efb-7c9f-4c93-876b-5962da604ef8"). InnerVolumeSpecName "rabbitmq-confd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 16:24:18 crc kubenswrapper[4707]: I1127 16:24:18.460694 4707 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage07-crc" (UniqueName: "kubernetes.io/local-volume/local-storage07-crc") on node "crc" Nov 27 16:24:18 crc kubenswrapper[4707]: W1127 16:24:18.494122 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod82ba4b51_2b4f_4ed6_8ef9_453386ff71da.slice/crio-75366eb9278fe9d66f731548d94d20b4c576d8354980375725b34b9f6b9852ed WatchSource:0}: Error finding container 75366eb9278fe9d66f731548d94d20b4c576d8354980375725b34b9f6b9852ed: Status 404 returned error can't find the container with id 75366eb9278fe9d66f731548d94d20b4c576d8354980375725b34b9f6b9852ed Nov 27 16:24:18 crc kubenswrapper[4707]: I1127 16:24:18.502102 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Nov 27 16:24:18 crc kubenswrapper[4707]: I1127 16:24:18.532036 4707 reconciler_common.go:293] "Volume detached for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" DevicePath \"\"" Nov 27 16:24:18 crc kubenswrapper[4707]: I1127 16:24:18.532070 4707 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/517a2efb-7c9f-4c93-876b-5962da604ef8-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Nov 27 16:24:18 crc kubenswrapper[4707]: I1127 16:24:18.697532 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Nov 27 16:24:18 crc kubenswrapper[4707]: I1127 16:24:18.705398 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Nov 27 16:24:18 crc kubenswrapper[4707]: I1127 16:24:18.731250 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Nov 27 16:24:18 crc 
kubenswrapper[4707]: E1127 16:24:18.731847 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="517a2efb-7c9f-4c93-876b-5962da604ef8" containerName="setup-container" Nov 27 16:24:18 crc kubenswrapper[4707]: I1127 16:24:18.731915 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="517a2efb-7c9f-4c93-876b-5962da604ef8" containerName="setup-container" Nov 27 16:24:18 crc kubenswrapper[4707]: E1127 16:24:18.732004 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="517a2efb-7c9f-4c93-876b-5962da604ef8" containerName="rabbitmq" Nov 27 16:24:18 crc kubenswrapper[4707]: I1127 16:24:18.732063 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="517a2efb-7c9f-4c93-876b-5962da604ef8" containerName="rabbitmq" Nov 27 16:24:18 crc kubenswrapper[4707]: I1127 16:24:18.732302 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="517a2efb-7c9f-4c93-876b-5962da604ef8" containerName="rabbitmq" Nov 27 16:24:18 crc kubenswrapper[4707]: I1127 16:24:18.733282 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Nov 27 16:24:18 crc kubenswrapper[4707]: I1127 16:24:18.735397 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Nov 27 16:24:18 crc kubenswrapper[4707]: I1127 16:24:18.735570 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Nov 27 16:24:18 crc kubenswrapper[4707]: I1127 16:24:18.735809 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Nov 27 16:24:18 crc kubenswrapper[4707]: I1127 16:24:18.740168 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Nov 27 16:24:18 crc kubenswrapper[4707]: I1127 16:24:18.740520 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Nov 27 16:24:18 crc kubenswrapper[4707]: I1127 16:24:18.740771 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Nov 27 16:24:18 crc kubenswrapper[4707]: I1127 16:24:18.741024 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-p26dh" Nov 27 16:24:18 crc kubenswrapper[4707]: I1127 16:24:18.743790 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Nov 27 16:24:18 crc kubenswrapper[4707]: I1127 16:24:18.837134 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/4990dbfc-6c12-4964-9d50-b8fd331cc123-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"4990dbfc-6c12-4964-9d50-b8fd331cc123\") " pod="openstack/rabbitmq-cell1-server-0" Nov 27 16:24:18 crc kubenswrapper[4707]: I1127 16:24:18.837180 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/4990dbfc-6c12-4964-9d50-b8fd331cc123-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"4990dbfc-6c12-4964-9d50-b8fd331cc123\") " pod="openstack/rabbitmq-cell1-server-0" Nov 27 16:24:18 crc kubenswrapper[4707]: I1127 16:24:18.837296 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/4990dbfc-6c12-4964-9d50-b8fd331cc123-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"4990dbfc-6c12-4964-9d50-b8fd331cc123\") " pod="openstack/rabbitmq-cell1-server-0" Nov 27 16:24:18 crc kubenswrapper[4707]: I1127 16:24:18.837403 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/4990dbfc-6c12-4964-9d50-b8fd331cc123-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"4990dbfc-6c12-4964-9d50-b8fd331cc123\") " pod="openstack/rabbitmq-cell1-server-0" Nov 27 16:24:18 crc kubenswrapper[4707]: I1127 16:24:18.837561 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/4990dbfc-6c12-4964-9d50-b8fd331cc123-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"4990dbfc-6c12-4964-9d50-b8fd331cc123\") " pod="openstack/rabbitmq-cell1-server-0" Nov 27 16:24:18 crc kubenswrapper[4707]: I1127 16:24:18.837634 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hx5pl\" (UniqueName: \"kubernetes.io/projected/4990dbfc-6c12-4964-9d50-b8fd331cc123-kube-api-access-hx5pl\") pod \"rabbitmq-cell1-server-0\" (UID: \"4990dbfc-6c12-4964-9d50-b8fd331cc123\") " pod="openstack/rabbitmq-cell1-server-0" Nov 27 16:24:18 crc kubenswrapper[4707]: I1127 16:24:18.837848 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/4990dbfc-6c12-4964-9d50-b8fd331cc123-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"4990dbfc-6c12-4964-9d50-b8fd331cc123\") " pod="openstack/rabbitmq-cell1-server-0" Nov 27 16:24:18 crc kubenswrapper[4707]: I1127 16:24:18.837927 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/4990dbfc-6c12-4964-9d50-b8fd331cc123-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"4990dbfc-6c12-4964-9d50-b8fd331cc123\") " pod="openstack/rabbitmq-cell1-server-0" Nov 27 16:24:18 crc kubenswrapper[4707]: I1127 16:24:18.838095 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"4990dbfc-6c12-4964-9d50-b8fd331cc123\") " pod="openstack/rabbitmq-cell1-server-0" Nov 27 16:24:18 crc kubenswrapper[4707]: I1127 16:24:18.838131 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/4990dbfc-6c12-4964-9d50-b8fd331cc123-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"4990dbfc-6c12-4964-9d50-b8fd331cc123\") " pod="openstack/rabbitmq-cell1-server-0" Nov 27 16:24:18 crc kubenswrapper[4707]: I1127 16:24:18.838165 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/4990dbfc-6c12-4964-9d50-b8fd331cc123-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"4990dbfc-6c12-4964-9d50-b8fd331cc123\") " pod="openstack/rabbitmq-cell1-server-0" Nov 27 16:24:18 crc kubenswrapper[4707]: I1127 16:24:18.939871 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage07-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage07-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"4990dbfc-6c12-4964-9d50-b8fd331cc123\") " pod="openstack/rabbitmq-cell1-server-0" Nov 27 16:24:18 crc kubenswrapper[4707]: I1127 16:24:18.939915 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/4990dbfc-6c12-4964-9d50-b8fd331cc123-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"4990dbfc-6c12-4964-9d50-b8fd331cc123\") " pod="openstack/rabbitmq-cell1-server-0" Nov 27 16:24:18 crc kubenswrapper[4707]: I1127 16:24:18.939939 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/4990dbfc-6c12-4964-9d50-b8fd331cc123-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"4990dbfc-6c12-4964-9d50-b8fd331cc123\") " pod="openstack/rabbitmq-cell1-server-0" Nov 27 16:24:18 crc kubenswrapper[4707]: I1127 16:24:18.939972 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/4990dbfc-6c12-4964-9d50-b8fd331cc123-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"4990dbfc-6c12-4964-9d50-b8fd331cc123\") " pod="openstack/rabbitmq-cell1-server-0" Nov 27 16:24:18 crc kubenswrapper[4707]: I1127 16:24:18.939992 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/4990dbfc-6c12-4964-9d50-b8fd331cc123-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"4990dbfc-6c12-4964-9d50-b8fd331cc123\") " pod="openstack/rabbitmq-cell1-server-0" Nov 27 16:24:18 crc kubenswrapper[4707]: I1127 16:24:18.940021 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/4990dbfc-6c12-4964-9d50-b8fd331cc123-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: 
\"4990dbfc-6c12-4964-9d50-b8fd331cc123\") " pod="openstack/rabbitmq-cell1-server-0" Nov 27 16:24:18 crc kubenswrapper[4707]: I1127 16:24:18.940047 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/4990dbfc-6c12-4964-9d50-b8fd331cc123-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"4990dbfc-6c12-4964-9d50-b8fd331cc123\") " pod="openstack/rabbitmq-cell1-server-0" Nov 27 16:24:18 crc kubenswrapper[4707]: I1127 16:24:18.940088 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/4990dbfc-6c12-4964-9d50-b8fd331cc123-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"4990dbfc-6c12-4964-9d50-b8fd331cc123\") " pod="openstack/rabbitmq-cell1-server-0" Nov 27 16:24:18 crc kubenswrapper[4707]: I1127 16:24:18.940112 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hx5pl\" (UniqueName: \"kubernetes.io/projected/4990dbfc-6c12-4964-9d50-b8fd331cc123-kube-api-access-hx5pl\") pod \"rabbitmq-cell1-server-0\" (UID: \"4990dbfc-6c12-4964-9d50-b8fd331cc123\") " pod="openstack/rabbitmq-cell1-server-0" Nov 27 16:24:18 crc kubenswrapper[4707]: I1127 16:24:18.940142 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/4990dbfc-6c12-4964-9d50-b8fd331cc123-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"4990dbfc-6c12-4964-9d50-b8fd331cc123\") " pod="openstack/rabbitmq-cell1-server-0" Nov 27 16:24:18 crc kubenswrapper[4707]: I1127 16:24:18.940160 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/4990dbfc-6c12-4964-9d50-b8fd331cc123-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"4990dbfc-6c12-4964-9d50-b8fd331cc123\") " pod="openstack/rabbitmq-cell1-server-0" Nov 27 16:24:18 
crc kubenswrapper[4707]: I1127 16:24:18.940286 4707 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"4990dbfc-6c12-4964-9d50-b8fd331cc123\") device mount path \"/mnt/openstack/pv07\"" pod="openstack/rabbitmq-cell1-server-0" Nov 27 16:24:18 crc kubenswrapper[4707]: I1127 16:24:18.941363 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/4990dbfc-6c12-4964-9d50-b8fd331cc123-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"4990dbfc-6c12-4964-9d50-b8fd331cc123\") " pod="openstack/rabbitmq-cell1-server-0" Nov 27 16:24:18 crc kubenswrapper[4707]: I1127 16:24:18.941547 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/4990dbfc-6c12-4964-9d50-b8fd331cc123-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"4990dbfc-6c12-4964-9d50-b8fd331cc123\") " pod="openstack/rabbitmq-cell1-server-0" Nov 27 16:24:18 crc kubenswrapper[4707]: I1127 16:24:18.941850 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/4990dbfc-6c12-4964-9d50-b8fd331cc123-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"4990dbfc-6c12-4964-9d50-b8fd331cc123\") " pod="openstack/rabbitmq-cell1-server-0" Nov 27 16:24:18 crc kubenswrapper[4707]: I1127 16:24:18.942044 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/4990dbfc-6c12-4964-9d50-b8fd331cc123-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"4990dbfc-6c12-4964-9d50-b8fd331cc123\") " pod="openstack/rabbitmq-cell1-server-0" Nov 27 16:24:18 crc kubenswrapper[4707]: I1127 16:24:18.942217 4707 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/4990dbfc-6c12-4964-9d50-b8fd331cc123-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"4990dbfc-6c12-4964-9d50-b8fd331cc123\") " pod="openstack/rabbitmq-cell1-server-0" Nov 27 16:24:18 crc kubenswrapper[4707]: I1127 16:24:18.946099 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/4990dbfc-6c12-4964-9d50-b8fd331cc123-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"4990dbfc-6c12-4964-9d50-b8fd331cc123\") " pod="openstack/rabbitmq-cell1-server-0" Nov 27 16:24:18 crc kubenswrapper[4707]: I1127 16:24:18.948188 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/4990dbfc-6c12-4964-9d50-b8fd331cc123-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"4990dbfc-6c12-4964-9d50-b8fd331cc123\") " pod="openstack/rabbitmq-cell1-server-0" Nov 27 16:24:18 crc kubenswrapper[4707]: I1127 16:24:18.950286 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/4990dbfc-6c12-4964-9d50-b8fd331cc123-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"4990dbfc-6c12-4964-9d50-b8fd331cc123\") " pod="openstack/rabbitmq-cell1-server-0" Nov 27 16:24:18 crc kubenswrapper[4707]: I1127 16:24:18.956083 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/4990dbfc-6c12-4964-9d50-b8fd331cc123-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"4990dbfc-6c12-4964-9d50-b8fd331cc123\") " pod="openstack/rabbitmq-cell1-server-0" Nov 27 16:24:18 crc kubenswrapper[4707]: I1127 16:24:18.956957 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hx5pl\" (UniqueName: \"kubernetes.io/projected/4990dbfc-6c12-4964-9d50-b8fd331cc123-kube-api-access-hx5pl\") pod 
\"rabbitmq-cell1-server-0\" (UID: \"4990dbfc-6c12-4964-9d50-b8fd331cc123\") " pod="openstack/rabbitmq-cell1-server-0" Nov 27 16:24:18 crc kubenswrapper[4707]: I1127 16:24:18.984017 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"4990dbfc-6c12-4964-9d50-b8fd331cc123\") " pod="openstack/rabbitmq-cell1-server-0" Nov 27 16:24:19 crc kubenswrapper[4707]: I1127 16:24:19.097171 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Nov 27 16:24:19 crc kubenswrapper[4707]: I1127 16:24:19.209602 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="517a2efb-7c9f-4c93-876b-5962da604ef8" path="/var/lib/kubelet/pods/517a2efb-7c9f-4c93-876b-5962da604ef8/volumes" Nov 27 16:24:19 crc kubenswrapper[4707]: I1127 16:24:19.210626 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b31a7b86-c43f-4123-a33d-ffba2ee3d015" path="/var/lib/kubelet/pods/b31a7b86-c43f-4123-a33d-ffba2ee3d015/volumes" Nov 27 16:24:19 crc kubenswrapper[4707]: I1127 16:24:19.329079 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"82ba4b51-2b4f-4ed6-8ef9-453386ff71da","Type":"ContainerStarted","Data":"75366eb9278fe9d66f731548d94d20b4c576d8354980375725b34b9f6b9852ed"} Nov 27 16:24:19 crc kubenswrapper[4707]: I1127 16:24:19.569296 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Nov 27 16:24:19 crc kubenswrapper[4707]: W1127 16:24:19.665567 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4990dbfc_6c12_4964_9d50_b8fd331cc123.slice/crio-8a296bd984e62f7d1cea8e15d8e1474fb60a7e5ec7ccef97792e23ed61e663bb WatchSource:0}: Error finding container 
8a296bd984e62f7d1cea8e15d8e1474fb60a7e5ec7ccef97792e23ed61e663bb: Status 404 returned error can't find the container with id 8a296bd984e62f7d1cea8e15d8e1474fb60a7e5ec7ccef97792e23ed61e663bb Nov 27 16:24:20 crc kubenswrapper[4707]: I1127 16:24:20.338693 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"4990dbfc-6c12-4964-9d50-b8fd331cc123","Type":"ContainerStarted","Data":"8a296bd984e62f7d1cea8e15d8e1474fb60a7e5ec7ccef97792e23ed61e663bb"} Nov 27 16:24:20 crc kubenswrapper[4707]: I1127 16:24:20.340115 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"82ba4b51-2b4f-4ed6-8ef9-453386ff71da","Type":"ContainerStarted","Data":"52620c7359f5301cd909ca7b342538e5438835d89183d54ab2d1d91bc559434c"} Nov 27 16:24:21 crc kubenswrapper[4707]: I1127 16:24:21.355654 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"4990dbfc-6c12-4964-9d50-b8fd331cc123","Type":"ContainerStarted","Data":"e78e9bcd20e51dbbc2520ac3d46bf17d5696c6f81c1e882218d7b9763e80b761"} Nov 27 16:24:22 crc kubenswrapper[4707]: I1127 16:24:22.253035 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7d84b4d45c-27q24"] Nov 27 16:24:22 crc kubenswrapper[4707]: I1127 16:24:22.266084 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7d84b4d45c-27q24"] Nov 27 16:24:22 crc kubenswrapper[4707]: I1127 16:24:22.266196 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7d84b4d45c-27q24" Nov 27 16:24:22 crc kubenswrapper[4707]: I1127 16:24:22.270444 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-edpm-ipam" Nov 27 16:24:22 crc kubenswrapper[4707]: I1127 16:24:22.298799 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c1e61dc3-dfef-48ca-92bf-d9ee48ffcdba-ovsdbserver-nb\") pod \"dnsmasq-dns-7d84b4d45c-27q24\" (UID: \"c1e61dc3-dfef-48ca-92bf-d9ee48ffcdba\") " pod="openstack/dnsmasq-dns-7d84b4d45c-27q24" Nov 27 16:24:22 crc kubenswrapper[4707]: I1127 16:24:22.298935 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/c1e61dc3-dfef-48ca-92bf-d9ee48ffcdba-openstack-edpm-ipam\") pod \"dnsmasq-dns-7d84b4d45c-27q24\" (UID: \"c1e61dc3-dfef-48ca-92bf-d9ee48ffcdba\") " pod="openstack/dnsmasq-dns-7d84b4d45c-27q24" Nov 27 16:24:22 crc kubenswrapper[4707]: I1127 16:24:22.298970 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c1e61dc3-dfef-48ca-92bf-d9ee48ffcdba-dns-svc\") pod \"dnsmasq-dns-7d84b4d45c-27q24\" (UID: \"c1e61dc3-dfef-48ca-92bf-d9ee48ffcdba\") " pod="openstack/dnsmasq-dns-7d84b4d45c-27q24" Nov 27 16:24:22 crc kubenswrapper[4707]: I1127 16:24:22.299036 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x72mc\" (UniqueName: \"kubernetes.io/projected/c1e61dc3-dfef-48ca-92bf-d9ee48ffcdba-kube-api-access-x72mc\") pod \"dnsmasq-dns-7d84b4d45c-27q24\" (UID: \"c1e61dc3-dfef-48ca-92bf-d9ee48ffcdba\") " pod="openstack/dnsmasq-dns-7d84b4d45c-27q24" Nov 27 16:24:22 crc kubenswrapper[4707]: I1127 16:24:22.299197 4707 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c1e61dc3-dfef-48ca-92bf-d9ee48ffcdba-config\") pod \"dnsmasq-dns-7d84b4d45c-27q24\" (UID: \"c1e61dc3-dfef-48ca-92bf-d9ee48ffcdba\") " pod="openstack/dnsmasq-dns-7d84b4d45c-27q24" Nov 27 16:24:22 crc kubenswrapper[4707]: I1127 16:24:22.299567 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c1e61dc3-dfef-48ca-92bf-d9ee48ffcdba-dns-swift-storage-0\") pod \"dnsmasq-dns-7d84b4d45c-27q24\" (UID: \"c1e61dc3-dfef-48ca-92bf-d9ee48ffcdba\") " pod="openstack/dnsmasq-dns-7d84b4d45c-27q24" Nov 27 16:24:22 crc kubenswrapper[4707]: I1127 16:24:22.299763 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c1e61dc3-dfef-48ca-92bf-d9ee48ffcdba-ovsdbserver-sb\") pod \"dnsmasq-dns-7d84b4d45c-27q24\" (UID: \"c1e61dc3-dfef-48ca-92bf-d9ee48ffcdba\") " pod="openstack/dnsmasq-dns-7d84b4d45c-27q24" Nov 27 16:24:22 crc kubenswrapper[4707]: I1127 16:24:22.401915 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c1e61dc3-dfef-48ca-92bf-d9ee48ffcdba-ovsdbserver-nb\") pod \"dnsmasq-dns-7d84b4d45c-27q24\" (UID: \"c1e61dc3-dfef-48ca-92bf-d9ee48ffcdba\") " pod="openstack/dnsmasq-dns-7d84b4d45c-27q24" Nov 27 16:24:22 crc kubenswrapper[4707]: I1127 16:24:22.402622 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/c1e61dc3-dfef-48ca-92bf-d9ee48ffcdba-openstack-edpm-ipam\") pod \"dnsmasq-dns-7d84b4d45c-27q24\" (UID: \"c1e61dc3-dfef-48ca-92bf-d9ee48ffcdba\") " pod="openstack/dnsmasq-dns-7d84b4d45c-27q24" Nov 27 16:24:22 crc kubenswrapper[4707]: I1127 16:24:22.402749 4707 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c1e61dc3-dfef-48ca-92bf-d9ee48ffcdba-dns-svc\") pod \"dnsmasq-dns-7d84b4d45c-27q24\" (UID: \"c1e61dc3-dfef-48ca-92bf-d9ee48ffcdba\") " pod="openstack/dnsmasq-dns-7d84b4d45c-27q24" Nov 27 16:24:22 crc kubenswrapper[4707]: I1127 16:24:22.402956 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x72mc\" (UniqueName: \"kubernetes.io/projected/c1e61dc3-dfef-48ca-92bf-d9ee48ffcdba-kube-api-access-x72mc\") pod \"dnsmasq-dns-7d84b4d45c-27q24\" (UID: \"c1e61dc3-dfef-48ca-92bf-d9ee48ffcdba\") " pod="openstack/dnsmasq-dns-7d84b4d45c-27q24" Nov 27 16:24:22 crc kubenswrapper[4707]: I1127 16:24:22.403084 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c1e61dc3-dfef-48ca-92bf-d9ee48ffcdba-config\") pod \"dnsmasq-dns-7d84b4d45c-27q24\" (UID: \"c1e61dc3-dfef-48ca-92bf-d9ee48ffcdba\") " pod="openstack/dnsmasq-dns-7d84b4d45c-27q24" Nov 27 16:24:22 crc kubenswrapper[4707]: I1127 16:24:22.403265 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c1e61dc3-dfef-48ca-92bf-d9ee48ffcdba-dns-swift-storage-0\") pod \"dnsmasq-dns-7d84b4d45c-27q24\" (UID: \"c1e61dc3-dfef-48ca-92bf-d9ee48ffcdba\") " pod="openstack/dnsmasq-dns-7d84b4d45c-27q24" Nov 27 16:24:22 crc kubenswrapper[4707]: I1127 16:24:22.403465 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c1e61dc3-dfef-48ca-92bf-d9ee48ffcdba-ovsdbserver-sb\") pod \"dnsmasq-dns-7d84b4d45c-27q24\" (UID: \"c1e61dc3-dfef-48ca-92bf-d9ee48ffcdba\") " pod="openstack/dnsmasq-dns-7d84b4d45c-27q24" Nov 27 16:24:22 crc kubenswrapper[4707]: I1127 16:24:22.403504 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c1e61dc3-dfef-48ca-92bf-d9ee48ffcdba-dns-svc\") pod \"dnsmasq-dns-7d84b4d45c-27q24\" (UID: \"c1e61dc3-dfef-48ca-92bf-d9ee48ffcdba\") " pod="openstack/dnsmasq-dns-7d84b4d45c-27q24" Nov 27 16:24:22 crc kubenswrapper[4707]: I1127 16:24:22.403508 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/c1e61dc3-dfef-48ca-92bf-d9ee48ffcdba-openstack-edpm-ipam\") pod \"dnsmasq-dns-7d84b4d45c-27q24\" (UID: \"c1e61dc3-dfef-48ca-92bf-d9ee48ffcdba\") " pod="openstack/dnsmasq-dns-7d84b4d45c-27q24" Nov 27 16:24:22 crc kubenswrapper[4707]: I1127 16:24:22.404099 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c1e61dc3-dfef-48ca-92bf-d9ee48ffcdba-dns-swift-storage-0\") pod \"dnsmasq-dns-7d84b4d45c-27q24\" (UID: \"c1e61dc3-dfef-48ca-92bf-d9ee48ffcdba\") " pod="openstack/dnsmasq-dns-7d84b4d45c-27q24" Nov 27 16:24:22 crc kubenswrapper[4707]: I1127 16:24:22.404168 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c1e61dc3-dfef-48ca-92bf-d9ee48ffcdba-config\") pod \"dnsmasq-dns-7d84b4d45c-27q24\" (UID: \"c1e61dc3-dfef-48ca-92bf-d9ee48ffcdba\") " pod="openstack/dnsmasq-dns-7d84b4d45c-27q24" Nov 27 16:24:22 crc kubenswrapper[4707]: I1127 16:24:22.404289 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c1e61dc3-dfef-48ca-92bf-d9ee48ffcdba-ovsdbserver-sb\") pod \"dnsmasq-dns-7d84b4d45c-27q24\" (UID: \"c1e61dc3-dfef-48ca-92bf-d9ee48ffcdba\") " pod="openstack/dnsmasq-dns-7d84b4d45c-27q24" Nov 27 16:24:22 crc kubenswrapper[4707]: I1127 16:24:22.404662 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/c1e61dc3-dfef-48ca-92bf-d9ee48ffcdba-ovsdbserver-nb\") pod \"dnsmasq-dns-7d84b4d45c-27q24\" (UID: \"c1e61dc3-dfef-48ca-92bf-d9ee48ffcdba\") " pod="openstack/dnsmasq-dns-7d84b4d45c-27q24" Nov 27 16:24:22 crc kubenswrapper[4707]: I1127 16:24:22.434662 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x72mc\" (UniqueName: \"kubernetes.io/projected/c1e61dc3-dfef-48ca-92bf-d9ee48ffcdba-kube-api-access-x72mc\") pod \"dnsmasq-dns-7d84b4d45c-27q24\" (UID: \"c1e61dc3-dfef-48ca-92bf-d9ee48ffcdba\") " pod="openstack/dnsmasq-dns-7d84b4d45c-27q24" Nov 27 16:24:22 crc kubenswrapper[4707]: I1127 16:24:22.613214 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7d84b4d45c-27q24" Nov 27 16:24:23 crc kubenswrapper[4707]: I1127 16:24:23.135356 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7d84b4d45c-27q24"] Nov 27 16:24:23 crc kubenswrapper[4707]: W1127 16:24:23.136303 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc1e61dc3_dfef_48ca_92bf_d9ee48ffcdba.slice/crio-5c60bccb42b045d97b014a2c67c75ae59bf4dabdc0152482c3c8070613eb75ca WatchSource:0}: Error finding container 5c60bccb42b045d97b014a2c67c75ae59bf4dabdc0152482c3c8070613eb75ca: Status 404 returned error can't find the container with id 5c60bccb42b045d97b014a2c67c75ae59bf4dabdc0152482c3c8070613eb75ca Nov 27 16:24:23 crc kubenswrapper[4707]: I1127 16:24:23.378376 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7d84b4d45c-27q24" event={"ID":"c1e61dc3-dfef-48ca-92bf-d9ee48ffcdba","Type":"ContainerStarted","Data":"5c60bccb42b045d97b014a2c67c75ae59bf4dabdc0152482c3c8070613eb75ca"} Nov 27 16:24:24 crc kubenswrapper[4707]: I1127 16:24:24.389886 4707 generic.go:334] "Generic (PLEG): container finished" podID="c1e61dc3-dfef-48ca-92bf-d9ee48ffcdba" 
containerID="00e10e692ce367800843c86958eb47c038efc131735e59a0415ed9b25a7f3ce4" exitCode=0 Nov 27 16:24:24 crc kubenswrapper[4707]: I1127 16:24:24.389947 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7d84b4d45c-27q24" event={"ID":"c1e61dc3-dfef-48ca-92bf-d9ee48ffcdba","Type":"ContainerDied","Data":"00e10e692ce367800843c86958eb47c038efc131735e59a0415ed9b25a7f3ce4"} Nov 27 16:24:25 crc kubenswrapper[4707]: I1127 16:24:25.405560 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7d84b4d45c-27q24" event={"ID":"c1e61dc3-dfef-48ca-92bf-d9ee48ffcdba","Type":"ContainerStarted","Data":"5ef4631afd68879f9d3feaaa57c96cd518034edb0a3169c8fb17a4746337ea9c"} Nov 27 16:24:25 crc kubenswrapper[4707]: I1127 16:24:25.405911 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7d84b4d45c-27q24" Nov 27 16:24:25 crc kubenswrapper[4707]: I1127 16:24:25.459267 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7d84b4d45c-27q24" podStartSLOduration=3.459247354 podStartE2EDuration="3.459247354s" podCreationTimestamp="2025-11-27 16:24:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-27 16:24:25.440222551 +0000 UTC m=+1241.071671389" watchObservedRunningTime="2025-11-27 16:24:25.459247354 +0000 UTC m=+1241.090696132" Nov 27 16:24:32 crc kubenswrapper[4707]: I1127 16:24:32.614677 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-7d84b4d45c-27q24" Nov 27 16:24:32 crc kubenswrapper[4707]: I1127 16:24:32.716351 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6b7bbf7cf9-hmvth"] Nov 27 16:24:32 crc kubenswrapper[4707]: I1127 16:24:32.716921 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6b7bbf7cf9-hmvth" 
podUID="bc688de8-7455-42c0-94e5-caada732ba70" containerName="dnsmasq-dns" containerID="cri-o://edde7c5ce1c8c5097444a549a213ef6e054ab156c5f1996408df99f38e640eac" gracePeriod=10 Nov 27 16:24:32 crc kubenswrapper[4707]: I1127 16:24:32.929372 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6f6df4f56c-pxkhr"] Nov 27 16:24:32 crc kubenswrapper[4707]: I1127 16:24:32.933954 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6f6df4f56c-pxkhr" Nov 27 16:24:32 crc kubenswrapper[4707]: I1127 16:24:32.948275 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6f6df4f56c-pxkhr"] Nov 27 16:24:33 crc kubenswrapper[4707]: I1127 16:24:33.130971 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3f07f078-fb9b-425d-9575-520a406e4178-ovsdbserver-nb\") pod \"dnsmasq-dns-6f6df4f56c-pxkhr\" (UID: \"3f07f078-fb9b-425d-9575-520a406e4178\") " pod="openstack/dnsmasq-dns-6f6df4f56c-pxkhr" Nov 27 16:24:33 crc kubenswrapper[4707]: I1127 16:24:33.131305 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3f07f078-fb9b-425d-9575-520a406e4178-dns-svc\") pod \"dnsmasq-dns-6f6df4f56c-pxkhr\" (UID: \"3f07f078-fb9b-425d-9575-520a406e4178\") " pod="openstack/dnsmasq-dns-6f6df4f56c-pxkhr" Nov 27 16:24:33 crc kubenswrapper[4707]: I1127 16:24:33.131355 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/3f07f078-fb9b-425d-9575-520a406e4178-openstack-edpm-ipam\") pod \"dnsmasq-dns-6f6df4f56c-pxkhr\" (UID: \"3f07f078-fb9b-425d-9575-520a406e4178\") " pod="openstack/dnsmasq-dns-6f6df4f56c-pxkhr" Nov 27 16:24:33 crc kubenswrapper[4707]: I1127 16:24:33.131679 4707 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/3f07f078-fb9b-425d-9575-520a406e4178-dns-swift-storage-0\") pod \"dnsmasq-dns-6f6df4f56c-pxkhr\" (UID: \"3f07f078-fb9b-425d-9575-520a406e4178\") " pod="openstack/dnsmasq-dns-6f6df4f56c-pxkhr" Nov 27 16:24:33 crc kubenswrapper[4707]: I1127 16:24:33.131760 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xqv8m\" (UniqueName: \"kubernetes.io/projected/3f07f078-fb9b-425d-9575-520a406e4178-kube-api-access-xqv8m\") pod \"dnsmasq-dns-6f6df4f56c-pxkhr\" (UID: \"3f07f078-fb9b-425d-9575-520a406e4178\") " pod="openstack/dnsmasq-dns-6f6df4f56c-pxkhr" Nov 27 16:24:33 crc kubenswrapper[4707]: I1127 16:24:33.131791 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3f07f078-fb9b-425d-9575-520a406e4178-ovsdbserver-sb\") pod \"dnsmasq-dns-6f6df4f56c-pxkhr\" (UID: \"3f07f078-fb9b-425d-9575-520a406e4178\") " pod="openstack/dnsmasq-dns-6f6df4f56c-pxkhr" Nov 27 16:24:33 crc kubenswrapper[4707]: I1127 16:24:33.131942 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3f07f078-fb9b-425d-9575-520a406e4178-config\") pod \"dnsmasq-dns-6f6df4f56c-pxkhr\" (UID: \"3f07f078-fb9b-425d-9575-520a406e4178\") " pod="openstack/dnsmasq-dns-6f6df4f56c-pxkhr" Nov 27 16:24:33 crc kubenswrapper[4707]: I1127 16:24:33.231672 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6b7bbf7cf9-hmvth" Nov 27 16:24:33 crc kubenswrapper[4707]: I1127 16:24:33.234007 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/3f07f078-fb9b-425d-9575-520a406e4178-dns-swift-storage-0\") pod \"dnsmasq-dns-6f6df4f56c-pxkhr\" (UID: \"3f07f078-fb9b-425d-9575-520a406e4178\") " pod="openstack/dnsmasq-dns-6f6df4f56c-pxkhr" Nov 27 16:24:33 crc kubenswrapper[4707]: I1127 16:24:33.234424 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xqv8m\" (UniqueName: \"kubernetes.io/projected/3f07f078-fb9b-425d-9575-520a406e4178-kube-api-access-xqv8m\") pod \"dnsmasq-dns-6f6df4f56c-pxkhr\" (UID: \"3f07f078-fb9b-425d-9575-520a406e4178\") " pod="openstack/dnsmasq-dns-6f6df4f56c-pxkhr" Nov 27 16:24:33 crc kubenswrapper[4707]: I1127 16:24:33.234460 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3f07f078-fb9b-425d-9575-520a406e4178-ovsdbserver-sb\") pod \"dnsmasq-dns-6f6df4f56c-pxkhr\" (UID: \"3f07f078-fb9b-425d-9575-520a406e4178\") " pod="openstack/dnsmasq-dns-6f6df4f56c-pxkhr" Nov 27 16:24:33 crc kubenswrapper[4707]: I1127 16:24:33.234567 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3f07f078-fb9b-425d-9575-520a406e4178-config\") pod \"dnsmasq-dns-6f6df4f56c-pxkhr\" (UID: \"3f07f078-fb9b-425d-9575-520a406e4178\") " pod="openstack/dnsmasq-dns-6f6df4f56c-pxkhr" Nov 27 16:24:33 crc kubenswrapper[4707]: I1127 16:24:33.234614 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3f07f078-fb9b-425d-9575-520a406e4178-ovsdbserver-nb\") pod \"dnsmasq-dns-6f6df4f56c-pxkhr\" (UID: \"3f07f078-fb9b-425d-9575-520a406e4178\") " 
pod="openstack/dnsmasq-dns-6f6df4f56c-pxkhr" Nov 27 16:24:33 crc kubenswrapper[4707]: I1127 16:24:33.234651 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3f07f078-fb9b-425d-9575-520a406e4178-dns-svc\") pod \"dnsmasq-dns-6f6df4f56c-pxkhr\" (UID: \"3f07f078-fb9b-425d-9575-520a406e4178\") " pod="openstack/dnsmasq-dns-6f6df4f56c-pxkhr" Nov 27 16:24:33 crc kubenswrapper[4707]: I1127 16:24:33.234691 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/3f07f078-fb9b-425d-9575-520a406e4178-openstack-edpm-ipam\") pod \"dnsmasq-dns-6f6df4f56c-pxkhr\" (UID: \"3f07f078-fb9b-425d-9575-520a406e4178\") " pod="openstack/dnsmasq-dns-6f6df4f56c-pxkhr" Nov 27 16:24:33 crc kubenswrapper[4707]: I1127 16:24:33.236792 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/3f07f078-fb9b-425d-9575-520a406e4178-dns-swift-storage-0\") pod \"dnsmasq-dns-6f6df4f56c-pxkhr\" (UID: \"3f07f078-fb9b-425d-9575-520a406e4178\") " pod="openstack/dnsmasq-dns-6f6df4f56c-pxkhr" Nov 27 16:24:33 crc kubenswrapper[4707]: I1127 16:24:33.240229 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3f07f078-fb9b-425d-9575-520a406e4178-ovsdbserver-sb\") pod \"dnsmasq-dns-6f6df4f56c-pxkhr\" (UID: \"3f07f078-fb9b-425d-9575-520a406e4178\") " pod="openstack/dnsmasq-dns-6f6df4f56c-pxkhr" Nov 27 16:24:33 crc kubenswrapper[4707]: I1127 16:24:33.240264 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3f07f078-fb9b-425d-9575-520a406e4178-config\") pod \"dnsmasq-dns-6f6df4f56c-pxkhr\" (UID: \"3f07f078-fb9b-425d-9575-520a406e4178\") " pod="openstack/dnsmasq-dns-6f6df4f56c-pxkhr" Nov 27 16:24:33 crc kubenswrapper[4707]: 
I1127 16:24:33.240812 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3f07f078-fb9b-425d-9575-520a406e4178-ovsdbserver-nb\") pod \"dnsmasq-dns-6f6df4f56c-pxkhr\" (UID: \"3f07f078-fb9b-425d-9575-520a406e4178\") " pod="openstack/dnsmasq-dns-6f6df4f56c-pxkhr" Nov 27 16:24:33 crc kubenswrapper[4707]: I1127 16:24:33.240892 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3f07f078-fb9b-425d-9575-520a406e4178-dns-svc\") pod \"dnsmasq-dns-6f6df4f56c-pxkhr\" (UID: \"3f07f078-fb9b-425d-9575-520a406e4178\") " pod="openstack/dnsmasq-dns-6f6df4f56c-pxkhr" Nov 27 16:24:33 crc kubenswrapper[4707]: I1127 16:24:33.241114 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/3f07f078-fb9b-425d-9575-520a406e4178-openstack-edpm-ipam\") pod \"dnsmasq-dns-6f6df4f56c-pxkhr\" (UID: \"3f07f078-fb9b-425d-9575-520a406e4178\") " pod="openstack/dnsmasq-dns-6f6df4f56c-pxkhr" Nov 27 16:24:33 crc kubenswrapper[4707]: I1127 16:24:33.257669 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xqv8m\" (UniqueName: \"kubernetes.io/projected/3f07f078-fb9b-425d-9575-520a406e4178-kube-api-access-xqv8m\") pod \"dnsmasq-dns-6f6df4f56c-pxkhr\" (UID: \"3f07f078-fb9b-425d-9575-520a406e4178\") " pod="openstack/dnsmasq-dns-6f6df4f56c-pxkhr" Nov 27 16:24:33 crc kubenswrapper[4707]: I1127 16:24:33.336051 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bc688de8-7455-42c0-94e5-caada732ba70-dns-svc\") pod \"bc688de8-7455-42c0-94e5-caada732ba70\" (UID: \"bc688de8-7455-42c0-94e5-caada732ba70\") " Nov 27 16:24:33 crc kubenswrapper[4707]: I1127 16:24:33.336142 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" 
(UniqueName: \"kubernetes.io/configmap/bc688de8-7455-42c0-94e5-caada732ba70-ovsdbserver-sb\") pod \"bc688de8-7455-42c0-94e5-caada732ba70\" (UID: \"bc688de8-7455-42c0-94e5-caada732ba70\") " Nov 27 16:24:33 crc kubenswrapper[4707]: I1127 16:24:33.336204 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pndvt\" (UniqueName: \"kubernetes.io/projected/bc688de8-7455-42c0-94e5-caada732ba70-kube-api-access-pndvt\") pod \"bc688de8-7455-42c0-94e5-caada732ba70\" (UID: \"bc688de8-7455-42c0-94e5-caada732ba70\") " Nov 27 16:24:33 crc kubenswrapper[4707]: I1127 16:24:33.336237 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bc688de8-7455-42c0-94e5-caada732ba70-ovsdbserver-nb\") pod \"bc688de8-7455-42c0-94e5-caada732ba70\" (UID: \"bc688de8-7455-42c0-94e5-caada732ba70\") " Nov 27 16:24:33 crc kubenswrapper[4707]: I1127 16:24:33.336256 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bc688de8-7455-42c0-94e5-caada732ba70-config\") pod \"bc688de8-7455-42c0-94e5-caada732ba70\" (UID: \"bc688de8-7455-42c0-94e5-caada732ba70\") " Nov 27 16:24:33 crc kubenswrapper[4707]: I1127 16:24:33.336362 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/bc688de8-7455-42c0-94e5-caada732ba70-dns-swift-storage-0\") pod \"bc688de8-7455-42c0-94e5-caada732ba70\" (UID: \"bc688de8-7455-42c0-94e5-caada732ba70\") " Nov 27 16:24:33 crc kubenswrapper[4707]: I1127 16:24:33.351058 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc688de8-7455-42c0-94e5-caada732ba70-kube-api-access-pndvt" (OuterVolumeSpecName: "kube-api-access-pndvt") pod "bc688de8-7455-42c0-94e5-caada732ba70" (UID: "bc688de8-7455-42c0-94e5-caada732ba70"). 
InnerVolumeSpecName "kube-api-access-pndvt". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 16:24:33 crc kubenswrapper[4707]: I1127 16:24:33.380113 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bc688de8-7455-42c0-94e5-caada732ba70-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "bc688de8-7455-42c0-94e5-caada732ba70" (UID: "bc688de8-7455-42c0-94e5-caada732ba70"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 16:24:33 crc kubenswrapper[4707]: I1127 16:24:33.382503 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bc688de8-7455-42c0-94e5-caada732ba70-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "bc688de8-7455-42c0-94e5-caada732ba70" (UID: "bc688de8-7455-42c0-94e5-caada732ba70"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 16:24:33 crc kubenswrapper[4707]: I1127 16:24:33.388273 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bc688de8-7455-42c0-94e5-caada732ba70-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "bc688de8-7455-42c0-94e5-caada732ba70" (UID: "bc688de8-7455-42c0-94e5-caada732ba70"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 16:24:33 crc kubenswrapper[4707]: I1127 16:24:33.394693 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bc688de8-7455-42c0-94e5-caada732ba70-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "bc688de8-7455-42c0-94e5-caada732ba70" (UID: "bc688de8-7455-42c0-94e5-caada732ba70"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 16:24:33 crc kubenswrapper[4707]: I1127 16:24:33.408012 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bc688de8-7455-42c0-94e5-caada732ba70-config" (OuterVolumeSpecName: "config") pod "bc688de8-7455-42c0-94e5-caada732ba70" (UID: "bc688de8-7455-42c0-94e5-caada732ba70"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 16:24:33 crc kubenswrapper[4707]: I1127 16:24:33.438716 4707 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bc688de8-7455-42c0-94e5-caada732ba70-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Nov 27 16:24:33 crc kubenswrapper[4707]: I1127 16:24:33.438764 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pndvt\" (UniqueName: \"kubernetes.io/projected/bc688de8-7455-42c0-94e5-caada732ba70-kube-api-access-pndvt\") on node \"crc\" DevicePath \"\"" Nov 27 16:24:33 crc kubenswrapper[4707]: I1127 16:24:33.438777 4707 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bc688de8-7455-42c0-94e5-caada732ba70-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Nov 27 16:24:33 crc kubenswrapper[4707]: I1127 16:24:33.438787 4707 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bc688de8-7455-42c0-94e5-caada732ba70-config\") on node \"crc\" DevicePath \"\"" Nov 27 16:24:33 crc kubenswrapper[4707]: I1127 16:24:33.438799 4707 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/bc688de8-7455-42c0-94e5-caada732ba70-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Nov 27 16:24:33 crc kubenswrapper[4707]: I1127 16:24:33.438809 4707 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/bc688de8-7455-42c0-94e5-caada732ba70-dns-svc\") on node \"crc\" DevicePath \"\"" Nov 27 16:24:33 crc kubenswrapper[4707]: I1127 16:24:33.515592 4707 generic.go:334] "Generic (PLEG): container finished" podID="bc688de8-7455-42c0-94e5-caada732ba70" containerID="edde7c5ce1c8c5097444a549a213ef6e054ab156c5f1996408df99f38e640eac" exitCode=0 Nov 27 16:24:33 crc kubenswrapper[4707]: I1127 16:24:33.515647 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6b7bbf7cf9-hmvth" Nov 27 16:24:33 crc kubenswrapper[4707]: I1127 16:24:33.516030 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b7bbf7cf9-hmvth" event={"ID":"bc688de8-7455-42c0-94e5-caada732ba70","Type":"ContainerDied","Data":"edde7c5ce1c8c5097444a549a213ef6e054ab156c5f1996408df99f38e640eac"} Nov 27 16:24:33 crc kubenswrapper[4707]: I1127 16:24:33.516240 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b7bbf7cf9-hmvth" event={"ID":"bc688de8-7455-42c0-94e5-caada732ba70","Type":"ContainerDied","Data":"482c1ae93b202df8cc3477b968ee316b94d576887154ff4a4ee49e2589868b42"} Nov 27 16:24:33 crc kubenswrapper[4707]: I1127 16:24:33.516434 4707 scope.go:117] "RemoveContainer" containerID="edde7c5ce1c8c5097444a549a213ef6e054ab156c5f1996408df99f38e640eac" Nov 27 16:24:33 crc kubenswrapper[4707]: I1127 16:24:33.549465 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6f6df4f56c-pxkhr" Nov 27 16:24:33 crc kubenswrapper[4707]: I1127 16:24:33.573662 4707 scope.go:117] "RemoveContainer" containerID="b3f6c840f5d846fd8e791f1857a616a91aa57cfc1ca9f5c2c25b810183220d56" Nov 27 16:24:33 crc kubenswrapper[4707]: I1127 16:24:33.583033 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6b7bbf7cf9-hmvth"] Nov 27 16:24:33 crc kubenswrapper[4707]: I1127 16:24:33.605318 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6b7bbf7cf9-hmvth"] Nov 27 16:24:33 crc kubenswrapper[4707]: I1127 16:24:33.613408 4707 scope.go:117] "RemoveContainer" containerID="edde7c5ce1c8c5097444a549a213ef6e054ab156c5f1996408df99f38e640eac" Nov 27 16:24:33 crc kubenswrapper[4707]: E1127 16:24:33.613931 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"edde7c5ce1c8c5097444a549a213ef6e054ab156c5f1996408df99f38e640eac\": container with ID starting with edde7c5ce1c8c5097444a549a213ef6e054ab156c5f1996408df99f38e640eac not found: ID does not exist" containerID="edde7c5ce1c8c5097444a549a213ef6e054ab156c5f1996408df99f38e640eac" Nov 27 16:24:33 crc kubenswrapper[4707]: I1127 16:24:33.614070 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"edde7c5ce1c8c5097444a549a213ef6e054ab156c5f1996408df99f38e640eac"} err="failed to get container status \"edde7c5ce1c8c5097444a549a213ef6e054ab156c5f1996408df99f38e640eac\": rpc error: code = NotFound desc = could not find container \"edde7c5ce1c8c5097444a549a213ef6e054ab156c5f1996408df99f38e640eac\": container with ID starting with edde7c5ce1c8c5097444a549a213ef6e054ab156c5f1996408df99f38e640eac not found: ID does not exist" Nov 27 16:24:33 crc kubenswrapper[4707]: I1127 16:24:33.614188 4707 scope.go:117] "RemoveContainer" containerID="b3f6c840f5d846fd8e791f1857a616a91aa57cfc1ca9f5c2c25b810183220d56" Nov 27 
16:24:33 crc kubenswrapper[4707]: E1127 16:24:33.614851 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b3f6c840f5d846fd8e791f1857a616a91aa57cfc1ca9f5c2c25b810183220d56\": container with ID starting with b3f6c840f5d846fd8e791f1857a616a91aa57cfc1ca9f5c2c25b810183220d56 not found: ID does not exist" containerID="b3f6c840f5d846fd8e791f1857a616a91aa57cfc1ca9f5c2c25b810183220d56" Nov 27 16:24:33 crc kubenswrapper[4707]: I1127 16:24:33.614899 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b3f6c840f5d846fd8e791f1857a616a91aa57cfc1ca9f5c2c25b810183220d56"} err="failed to get container status \"b3f6c840f5d846fd8e791f1857a616a91aa57cfc1ca9f5c2c25b810183220d56\": rpc error: code = NotFound desc = could not find container \"b3f6c840f5d846fd8e791f1857a616a91aa57cfc1ca9f5c2c25b810183220d56\": container with ID starting with b3f6c840f5d846fd8e791f1857a616a91aa57cfc1ca9f5c2c25b810183220d56 not found: ID does not exist" Nov 27 16:24:33 crc kubenswrapper[4707]: I1127 16:24:33.982124 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6f6df4f56c-pxkhr"] Nov 27 16:24:34 crc kubenswrapper[4707]: I1127 16:24:34.532149 4707 generic.go:334] "Generic (PLEG): container finished" podID="3f07f078-fb9b-425d-9575-520a406e4178" containerID="b8ffa014f61fda3d8e35e430a3331549d15334cbba0b33394b8834f954017d20" exitCode=0 Nov 27 16:24:34 crc kubenswrapper[4707]: I1127 16:24:34.532215 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6f6df4f56c-pxkhr" event={"ID":"3f07f078-fb9b-425d-9575-520a406e4178","Type":"ContainerDied","Data":"b8ffa014f61fda3d8e35e430a3331549d15334cbba0b33394b8834f954017d20"} Nov 27 16:24:34 crc kubenswrapper[4707]: I1127 16:24:34.532498 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6f6df4f56c-pxkhr" 
event={"ID":"3f07f078-fb9b-425d-9575-520a406e4178","Type":"ContainerStarted","Data":"5646a1b9edfddc6d1bebe029b3585ccf6137466bcc7e68a1463413656e6586a9"} Nov 27 16:24:35 crc kubenswrapper[4707]: I1127 16:24:35.228535 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc688de8-7455-42c0-94e5-caada732ba70" path="/var/lib/kubelet/pods/bc688de8-7455-42c0-94e5-caada732ba70/volumes" Nov 27 16:24:35 crc kubenswrapper[4707]: I1127 16:24:35.545753 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6f6df4f56c-pxkhr" event={"ID":"3f07f078-fb9b-425d-9575-520a406e4178","Type":"ContainerStarted","Data":"6ddafe626ea315e738ee375ec053fc4f63468a741c17c9edbd3fa2382a94cb5d"} Nov 27 16:24:35 crc kubenswrapper[4707]: I1127 16:24:35.546541 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6f6df4f56c-pxkhr" Nov 27 16:24:35 crc kubenswrapper[4707]: I1127 16:24:35.586289 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6f6df4f56c-pxkhr" podStartSLOduration=3.586262395 podStartE2EDuration="3.586262395s" podCreationTimestamp="2025-11-27 16:24:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-27 16:24:35.574099049 +0000 UTC m=+1251.205547857" watchObservedRunningTime="2025-11-27 16:24:35.586262395 +0000 UTC m=+1251.217711203" Nov 27 16:24:43 crc kubenswrapper[4707]: I1127 16:24:43.552526 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6f6df4f56c-pxkhr" Nov 27 16:24:43 crc kubenswrapper[4707]: I1127 16:24:43.667558 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7d84b4d45c-27q24"] Nov 27 16:24:43 crc kubenswrapper[4707]: I1127 16:24:43.668064 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7d84b4d45c-27q24" 
podUID="c1e61dc3-dfef-48ca-92bf-d9ee48ffcdba" containerName="dnsmasq-dns" containerID="cri-o://5ef4631afd68879f9d3feaaa57c96cd518034edb0a3169c8fb17a4746337ea9c" gracePeriod=10 Nov 27 16:24:44 crc kubenswrapper[4707]: I1127 16:24:44.120688 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7d84b4d45c-27q24" Nov 27 16:24:44 crc kubenswrapper[4707]: I1127 16:24:44.189672 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/c1e61dc3-dfef-48ca-92bf-d9ee48ffcdba-openstack-edpm-ipam\") pod \"c1e61dc3-dfef-48ca-92bf-d9ee48ffcdba\" (UID: \"c1e61dc3-dfef-48ca-92bf-d9ee48ffcdba\") " Nov 27 16:24:44 crc kubenswrapper[4707]: I1127 16:24:44.189826 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x72mc\" (UniqueName: \"kubernetes.io/projected/c1e61dc3-dfef-48ca-92bf-d9ee48ffcdba-kube-api-access-x72mc\") pod \"c1e61dc3-dfef-48ca-92bf-d9ee48ffcdba\" (UID: \"c1e61dc3-dfef-48ca-92bf-d9ee48ffcdba\") " Nov 27 16:24:44 crc kubenswrapper[4707]: I1127 16:24:44.189852 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c1e61dc3-dfef-48ca-92bf-d9ee48ffcdba-ovsdbserver-sb\") pod \"c1e61dc3-dfef-48ca-92bf-d9ee48ffcdba\" (UID: \"c1e61dc3-dfef-48ca-92bf-d9ee48ffcdba\") " Nov 27 16:24:44 crc kubenswrapper[4707]: I1127 16:24:44.189876 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c1e61dc3-dfef-48ca-92bf-d9ee48ffcdba-config\") pod \"c1e61dc3-dfef-48ca-92bf-d9ee48ffcdba\" (UID: \"c1e61dc3-dfef-48ca-92bf-d9ee48ffcdba\") " Nov 27 16:24:44 crc kubenswrapper[4707]: I1127 16:24:44.190002 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/c1e61dc3-dfef-48ca-92bf-d9ee48ffcdba-dns-svc\") pod \"c1e61dc3-dfef-48ca-92bf-d9ee48ffcdba\" (UID: \"c1e61dc3-dfef-48ca-92bf-d9ee48ffcdba\") " Nov 27 16:24:44 crc kubenswrapper[4707]: I1127 16:24:44.190045 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c1e61dc3-dfef-48ca-92bf-d9ee48ffcdba-dns-swift-storage-0\") pod \"c1e61dc3-dfef-48ca-92bf-d9ee48ffcdba\" (UID: \"c1e61dc3-dfef-48ca-92bf-d9ee48ffcdba\") " Nov 27 16:24:44 crc kubenswrapper[4707]: I1127 16:24:44.190125 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c1e61dc3-dfef-48ca-92bf-d9ee48ffcdba-ovsdbserver-nb\") pod \"c1e61dc3-dfef-48ca-92bf-d9ee48ffcdba\" (UID: \"c1e61dc3-dfef-48ca-92bf-d9ee48ffcdba\") " Nov 27 16:24:44 crc kubenswrapper[4707]: I1127 16:24:44.198117 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c1e61dc3-dfef-48ca-92bf-d9ee48ffcdba-kube-api-access-x72mc" (OuterVolumeSpecName: "kube-api-access-x72mc") pod "c1e61dc3-dfef-48ca-92bf-d9ee48ffcdba" (UID: "c1e61dc3-dfef-48ca-92bf-d9ee48ffcdba"). InnerVolumeSpecName "kube-api-access-x72mc". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 16:24:44 crc kubenswrapper[4707]: I1127 16:24:44.255513 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c1e61dc3-dfef-48ca-92bf-d9ee48ffcdba-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "c1e61dc3-dfef-48ca-92bf-d9ee48ffcdba" (UID: "c1e61dc3-dfef-48ca-92bf-d9ee48ffcdba"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 16:24:44 crc kubenswrapper[4707]: I1127 16:24:44.259670 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c1e61dc3-dfef-48ca-92bf-d9ee48ffcdba-config" (OuterVolumeSpecName: "config") pod "c1e61dc3-dfef-48ca-92bf-d9ee48ffcdba" (UID: "c1e61dc3-dfef-48ca-92bf-d9ee48ffcdba"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 16:24:44 crc kubenswrapper[4707]: I1127 16:24:44.262337 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c1e61dc3-dfef-48ca-92bf-d9ee48ffcdba-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "c1e61dc3-dfef-48ca-92bf-d9ee48ffcdba" (UID: "c1e61dc3-dfef-48ca-92bf-d9ee48ffcdba"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 16:24:44 crc kubenswrapper[4707]: I1127 16:24:44.271023 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c1e61dc3-dfef-48ca-92bf-d9ee48ffcdba-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "c1e61dc3-dfef-48ca-92bf-d9ee48ffcdba" (UID: "c1e61dc3-dfef-48ca-92bf-d9ee48ffcdba"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 16:24:44 crc kubenswrapper[4707]: I1127 16:24:44.287176 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c1e61dc3-dfef-48ca-92bf-d9ee48ffcdba-openstack-edpm-ipam" (OuterVolumeSpecName: "openstack-edpm-ipam") pod "c1e61dc3-dfef-48ca-92bf-d9ee48ffcdba" (UID: "c1e61dc3-dfef-48ca-92bf-d9ee48ffcdba"). InnerVolumeSpecName "openstack-edpm-ipam". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 16:24:44 crc kubenswrapper[4707]: I1127 16:24:44.288264 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c1e61dc3-dfef-48ca-92bf-d9ee48ffcdba-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "c1e61dc3-dfef-48ca-92bf-d9ee48ffcdba" (UID: "c1e61dc3-dfef-48ca-92bf-d9ee48ffcdba"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 16:24:44 crc kubenswrapper[4707]: I1127 16:24:44.292199 4707 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c1e61dc3-dfef-48ca-92bf-d9ee48ffcdba-dns-svc\") on node \"crc\" DevicePath \"\"" Nov 27 16:24:44 crc kubenswrapper[4707]: I1127 16:24:44.292220 4707 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c1e61dc3-dfef-48ca-92bf-d9ee48ffcdba-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Nov 27 16:24:44 crc kubenswrapper[4707]: I1127 16:24:44.292232 4707 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c1e61dc3-dfef-48ca-92bf-d9ee48ffcdba-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Nov 27 16:24:44 crc kubenswrapper[4707]: I1127 16:24:44.292240 4707 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/c1e61dc3-dfef-48ca-92bf-d9ee48ffcdba-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Nov 27 16:24:44 crc kubenswrapper[4707]: I1127 16:24:44.292249 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x72mc\" (UniqueName: \"kubernetes.io/projected/c1e61dc3-dfef-48ca-92bf-d9ee48ffcdba-kube-api-access-x72mc\") on node \"crc\" DevicePath \"\"" Nov 27 16:24:44 crc kubenswrapper[4707]: I1127 16:24:44.292258 4707 reconciler_common.go:293] "Volume detached for volume 
\"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c1e61dc3-dfef-48ca-92bf-d9ee48ffcdba-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Nov 27 16:24:44 crc kubenswrapper[4707]: I1127 16:24:44.292266 4707 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c1e61dc3-dfef-48ca-92bf-d9ee48ffcdba-config\") on node \"crc\" DevicePath \"\"" Nov 27 16:24:44 crc kubenswrapper[4707]: I1127 16:24:44.714603 4707 generic.go:334] "Generic (PLEG): container finished" podID="c1e61dc3-dfef-48ca-92bf-d9ee48ffcdba" containerID="5ef4631afd68879f9d3feaaa57c96cd518034edb0a3169c8fb17a4746337ea9c" exitCode=0 Nov 27 16:24:44 crc kubenswrapper[4707]: I1127 16:24:44.714668 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7d84b4d45c-27q24" event={"ID":"c1e61dc3-dfef-48ca-92bf-d9ee48ffcdba","Type":"ContainerDied","Data":"5ef4631afd68879f9d3feaaa57c96cd518034edb0a3169c8fb17a4746337ea9c"} Nov 27 16:24:44 crc kubenswrapper[4707]: I1127 16:24:44.714708 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7d84b4d45c-27q24" event={"ID":"c1e61dc3-dfef-48ca-92bf-d9ee48ffcdba","Type":"ContainerDied","Data":"5c60bccb42b045d97b014a2c67c75ae59bf4dabdc0152482c3c8070613eb75ca"} Nov 27 16:24:44 crc kubenswrapper[4707]: I1127 16:24:44.714735 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7d84b4d45c-27q24" Nov 27 16:24:44 crc kubenswrapper[4707]: I1127 16:24:44.714750 4707 scope.go:117] "RemoveContainer" containerID="5ef4631afd68879f9d3feaaa57c96cd518034edb0a3169c8fb17a4746337ea9c" Nov 27 16:24:44 crc kubenswrapper[4707]: I1127 16:24:44.754438 4707 scope.go:117] "RemoveContainer" containerID="00e10e692ce367800843c86958eb47c038efc131735e59a0415ed9b25a7f3ce4" Nov 27 16:24:44 crc kubenswrapper[4707]: I1127 16:24:44.760442 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7d84b4d45c-27q24"] Nov 27 16:24:44 crc kubenswrapper[4707]: I1127 16:24:44.768701 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7d84b4d45c-27q24"] Nov 27 16:24:44 crc kubenswrapper[4707]: I1127 16:24:44.785002 4707 scope.go:117] "RemoveContainer" containerID="5ef4631afd68879f9d3feaaa57c96cd518034edb0a3169c8fb17a4746337ea9c" Nov 27 16:24:44 crc kubenswrapper[4707]: E1127 16:24:44.785520 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5ef4631afd68879f9d3feaaa57c96cd518034edb0a3169c8fb17a4746337ea9c\": container with ID starting with 5ef4631afd68879f9d3feaaa57c96cd518034edb0a3169c8fb17a4746337ea9c not found: ID does not exist" containerID="5ef4631afd68879f9d3feaaa57c96cd518034edb0a3169c8fb17a4746337ea9c" Nov 27 16:24:44 crc kubenswrapper[4707]: I1127 16:24:44.785580 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5ef4631afd68879f9d3feaaa57c96cd518034edb0a3169c8fb17a4746337ea9c"} err="failed to get container status \"5ef4631afd68879f9d3feaaa57c96cd518034edb0a3169c8fb17a4746337ea9c\": rpc error: code = NotFound desc = could not find container \"5ef4631afd68879f9d3feaaa57c96cd518034edb0a3169c8fb17a4746337ea9c\": container with ID starting with 5ef4631afd68879f9d3feaaa57c96cd518034edb0a3169c8fb17a4746337ea9c not found: ID does not exist" Nov 27 
16:24:44 crc kubenswrapper[4707]: I1127 16:24:44.785610 4707 scope.go:117] "RemoveContainer" containerID="00e10e692ce367800843c86958eb47c038efc131735e59a0415ed9b25a7f3ce4" Nov 27 16:24:44 crc kubenswrapper[4707]: E1127 16:24:44.786109 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"00e10e692ce367800843c86958eb47c038efc131735e59a0415ed9b25a7f3ce4\": container with ID starting with 00e10e692ce367800843c86958eb47c038efc131735e59a0415ed9b25a7f3ce4 not found: ID does not exist" containerID="00e10e692ce367800843c86958eb47c038efc131735e59a0415ed9b25a7f3ce4" Nov 27 16:24:44 crc kubenswrapper[4707]: I1127 16:24:44.786159 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"00e10e692ce367800843c86958eb47c038efc131735e59a0415ed9b25a7f3ce4"} err="failed to get container status \"00e10e692ce367800843c86958eb47c038efc131735e59a0415ed9b25a7f3ce4\": rpc error: code = NotFound desc = could not find container \"00e10e692ce367800843c86958eb47c038efc131735e59a0415ed9b25a7f3ce4\": container with ID starting with 00e10e692ce367800843c86958eb47c038efc131735e59a0415ed9b25a7f3ce4 not found: ID does not exist" Nov 27 16:24:45 crc kubenswrapper[4707]: I1127 16:24:45.214583 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c1e61dc3-dfef-48ca-92bf-d9ee48ffcdba" path="/var/lib/kubelet/pods/c1e61dc3-dfef-48ca-92bf-d9ee48ffcdba/volumes" Nov 27 16:24:52 crc kubenswrapper[4707]: I1127 16:24:52.821649 4707 generic.go:334] "Generic (PLEG): container finished" podID="82ba4b51-2b4f-4ed6-8ef9-453386ff71da" containerID="52620c7359f5301cd909ca7b342538e5438835d89183d54ab2d1d91bc559434c" exitCode=0 Nov 27 16:24:52 crc kubenswrapper[4707]: I1127 16:24:52.821733 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" 
event={"ID":"82ba4b51-2b4f-4ed6-8ef9-453386ff71da","Type":"ContainerDied","Data":"52620c7359f5301cd909ca7b342538e5438835d89183d54ab2d1d91bc559434c"} Nov 27 16:24:53 crc kubenswrapper[4707]: I1127 16:24:53.832843 4707 generic.go:334] "Generic (PLEG): container finished" podID="4990dbfc-6c12-4964-9d50-b8fd331cc123" containerID="e78e9bcd20e51dbbc2520ac3d46bf17d5696c6f81c1e882218d7b9763e80b761" exitCode=0 Nov 27 16:24:53 crc kubenswrapper[4707]: I1127 16:24:53.832931 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"4990dbfc-6c12-4964-9d50-b8fd331cc123","Type":"ContainerDied","Data":"e78e9bcd20e51dbbc2520ac3d46bf17d5696c6f81c1e882218d7b9763e80b761"} Nov 27 16:24:53 crc kubenswrapper[4707]: I1127 16:24:53.835710 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"82ba4b51-2b4f-4ed6-8ef9-453386ff71da","Type":"ContainerStarted","Data":"d27d93ef2ab28bd6d4fb0657c919c44ab19ec445b29ea187af13f7d53796568d"} Nov 27 16:24:53 crc kubenswrapper[4707]: I1127 16:24:53.835913 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Nov 27 16:24:53 crc kubenswrapper[4707]: I1127 16:24:53.894056 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=36.894040198 podStartE2EDuration="36.894040198s" podCreationTimestamp="2025-11-27 16:24:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-27 16:24:53.892944221 +0000 UTC m=+1269.524393009" watchObservedRunningTime="2025-11-27 16:24:53.894040198 +0000 UTC m=+1269.525488966" Nov 27 16:24:54 crc kubenswrapper[4707]: I1127 16:24:54.845226 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" 
event={"ID":"4990dbfc-6c12-4964-9d50-b8fd331cc123","Type":"ContainerStarted","Data":"5629060ee720d829a4464b2ed10b441d04b89dc7e345a5266fa49cacd02bbf32"} Nov 27 16:24:54 crc kubenswrapper[4707]: I1127 16:24:54.846078 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Nov 27 16:24:54 crc kubenswrapper[4707]: I1127 16:24:54.879204 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=36.879188436 podStartE2EDuration="36.879188436s" podCreationTimestamp="2025-11-27 16:24:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-27 16:24:54.877551196 +0000 UTC m=+1270.509000004" watchObservedRunningTime="2025-11-27 16:24:54.879188436 +0000 UTC m=+1270.510637204" Nov 27 16:24:56 crc kubenswrapper[4707]: I1127 16:24:56.573913 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-np9sx"] Nov 27 16:24:56 crc kubenswrapper[4707]: E1127 16:24:56.574924 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c1e61dc3-dfef-48ca-92bf-d9ee48ffcdba" containerName="dnsmasq-dns" Nov 27 16:24:56 crc kubenswrapper[4707]: I1127 16:24:56.574954 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="c1e61dc3-dfef-48ca-92bf-d9ee48ffcdba" containerName="dnsmasq-dns" Nov 27 16:24:56 crc kubenswrapper[4707]: E1127 16:24:56.575014 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bc688de8-7455-42c0-94e5-caada732ba70" containerName="init" Nov 27 16:24:56 crc kubenswrapper[4707]: I1127 16:24:56.575031 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="bc688de8-7455-42c0-94e5-caada732ba70" containerName="init" Nov 27 16:24:56 crc kubenswrapper[4707]: E1127 16:24:56.575055 4707 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="bc688de8-7455-42c0-94e5-caada732ba70" containerName="dnsmasq-dns" Nov 27 16:24:56 crc kubenswrapper[4707]: I1127 16:24:56.575068 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="bc688de8-7455-42c0-94e5-caada732ba70" containerName="dnsmasq-dns" Nov 27 16:24:56 crc kubenswrapper[4707]: E1127 16:24:56.575099 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c1e61dc3-dfef-48ca-92bf-d9ee48ffcdba" containerName="init" Nov 27 16:24:56 crc kubenswrapper[4707]: I1127 16:24:56.575112 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="c1e61dc3-dfef-48ca-92bf-d9ee48ffcdba" containerName="init" Nov 27 16:24:56 crc kubenswrapper[4707]: I1127 16:24:56.575493 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="bc688de8-7455-42c0-94e5-caada732ba70" containerName="dnsmasq-dns" Nov 27 16:24:56 crc kubenswrapper[4707]: I1127 16:24:56.575552 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="c1e61dc3-dfef-48ca-92bf-d9ee48ffcdba" containerName="dnsmasq-dns" Nov 27 16:24:56 crc kubenswrapper[4707]: I1127 16:24:56.576831 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-np9sx" Nov 27 16:24:56 crc kubenswrapper[4707]: I1127 16:24:56.579155 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-h79s2" Nov 27 16:24:56 crc kubenswrapper[4707]: I1127 16:24:56.579991 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Nov 27 16:24:56 crc kubenswrapper[4707]: I1127 16:24:56.580310 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Nov 27 16:24:56 crc kubenswrapper[4707]: I1127 16:24:56.581301 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Nov 27 16:24:56 crc kubenswrapper[4707]: I1127 16:24:56.594771 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-np9sx"] Nov 27 16:24:56 crc kubenswrapper[4707]: I1127 16:24:56.773038 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2k7r2\" (UniqueName: \"kubernetes.io/projected/92f7fdf9-f4c8-443c-9ff6-b6a45719b9a7-kube-api-access-2k7r2\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-np9sx\" (UID: \"92f7fdf9-f4c8-443c-9ff6-b6a45719b9a7\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-np9sx" Nov 27 16:24:56 crc kubenswrapper[4707]: I1127 16:24:56.773101 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/92f7fdf9-f4c8-443c-9ff6-b6a45719b9a7-ssh-key\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-np9sx\" (UID: \"92f7fdf9-f4c8-443c-9ff6-b6a45719b9a7\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-np9sx" Nov 27 16:24:56 crc kubenswrapper[4707]: I1127 16:24:56.773152 4707 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/92f7fdf9-f4c8-443c-9ff6-b6a45719b9a7-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-np9sx\" (UID: \"92f7fdf9-f4c8-443c-9ff6-b6a45719b9a7\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-np9sx" Nov 27 16:24:56 crc kubenswrapper[4707]: I1127 16:24:56.773257 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/92f7fdf9-f4c8-443c-9ff6-b6a45719b9a7-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-np9sx\" (UID: \"92f7fdf9-f4c8-443c-9ff6-b6a45719b9a7\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-np9sx" Nov 27 16:24:56 crc kubenswrapper[4707]: I1127 16:24:56.875199 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2k7r2\" (UniqueName: \"kubernetes.io/projected/92f7fdf9-f4c8-443c-9ff6-b6a45719b9a7-kube-api-access-2k7r2\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-np9sx\" (UID: \"92f7fdf9-f4c8-443c-9ff6-b6a45719b9a7\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-np9sx" Nov 27 16:24:56 crc kubenswrapper[4707]: I1127 16:24:56.875251 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/92f7fdf9-f4c8-443c-9ff6-b6a45719b9a7-ssh-key\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-np9sx\" (UID: \"92f7fdf9-f4c8-443c-9ff6-b6a45719b9a7\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-np9sx" Nov 27 16:24:56 crc kubenswrapper[4707]: I1127 16:24:56.875299 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/92f7fdf9-f4c8-443c-9ff6-b6a45719b9a7-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-np9sx\" (UID: \"92f7fdf9-f4c8-443c-9ff6-b6a45719b9a7\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-np9sx" Nov 27 16:24:56 crc kubenswrapper[4707]: I1127 16:24:56.875407 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/92f7fdf9-f4c8-443c-9ff6-b6a45719b9a7-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-np9sx\" (UID: \"92f7fdf9-f4c8-443c-9ff6-b6a45719b9a7\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-np9sx" Nov 27 16:24:56 crc kubenswrapper[4707]: I1127 16:24:56.881460 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/92f7fdf9-f4c8-443c-9ff6-b6a45719b9a7-ssh-key\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-np9sx\" (UID: \"92f7fdf9-f4c8-443c-9ff6-b6a45719b9a7\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-np9sx" Nov 27 16:24:56 crc kubenswrapper[4707]: I1127 16:24:56.881622 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/92f7fdf9-f4c8-443c-9ff6-b6a45719b9a7-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-np9sx\" (UID: \"92f7fdf9-f4c8-443c-9ff6-b6a45719b9a7\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-np9sx" Nov 27 16:24:56 crc kubenswrapper[4707]: I1127 16:24:56.882038 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/92f7fdf9-f4c8-443c-9ff6-b6a45719b9a7-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-np9sx\" (UID: \"92f7fdf9-f4c8-443c-9ff6-b6a45719b9a7\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-np9sx" Nov 27 16:24:56 
crc kubenswrapper[4707]: I1127 16:24:56.896830 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2k7r2\" (UniqueName: \"kubernetes.io/projected/92f7fdf9-f4c8-443c-9ff6-b6a45719b9a7-kube-api-access-2k7r2\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-np9sx\" (UID: \"92f7fdf9-f4c8-443c-9ff6-b6a45719b9a7\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-np9sx" Nov 27 16:24:56 crc kubenswrapper[4707]: I1127 16:24:56.933238 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-np9sx" Nov 27 16:24:57 crc kubenswrapper[4707]: I1127 16:24:57.538148 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-np9sx"] Nov 27 16:24:57 crc kubenswrapper[4707]: W1127 16:24:57.550241 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod92f7fdf9_f4c8_443c_9ff6_b6a45719b9a7.slice/crio-959f4c1698a541c24a77c358a77af6c42851edd8283de2a2dd06909c9f8b79e0 WatchSource:0}: Error finding container 959f4c1698a541c24a77c358a77af6c42851edd8283de2a2dd06909c9f8b79e0: Status 404 returned error can't find the container with id 959f4c1698a541c24a77c358a77af6c42851edd8283de2a2dd06909c9f8b79e0 Nov 27 16:24:57 crc kubenswrapper[4707]: I1127 16:24:57.555242 4707 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Nov 27 16:24:57 crc kubenswrapper[4707]: I1127 16:24:57.876163 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-np9sx" event={"ID":"92f7fdf9-f4c8-443c-9ff6-b6a45719b9a7","Type":"ContainerStarted","Data":"959f4c1698a541c24a77c358a77af6c42851edd8283de2a2dd06909c9f8b79e0"} Nov 27 16:25:07 crc kubenswrapper[4707]: I1127 16:25:07.994613 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-np9sx" event={"ID":"92f7fdf9-f4c8-443c-9ff6-b6a45719b9a7","Type":"ContainerStarted","Data":"bc4cd6fb764ee0a00676d1c905c75d7a05ef7f5d6de0d4154ec6e9aab895b39b"} Nov 27 16:25:08 crc kubenswrapper[4707]: I1127 16:25:08.016924 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-np9sx" podStartSLOduration=2.646938888 podStartE2EDuration="12.016905284s" podCreationTimestamp="2025-11-27 16:24:56 +0000 UTC" firstStartedPulling="2025-11-27 16:24:57.554883599 +0000 UTC m=+1273.186332377" lastFinishedPulling="2025-11-27 16:25:06.924849995 +0000 UTC m=+1282.556298773" observedRunningTime="2025-11-27 16:25:08.015348647 +0000 UTC m=+1283.646797435" watchObservedRunningTime="2025-11-27 16:25:08.016905284 +0000 UTC m=+1283.648354063" Nov 27 16:25:08 crc kubenswrapper[4707]: I1127 16:25:08.061631 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Nov 27 16:25:09 crc kubenswrapper[4707]: I1127 16:25:09.101698 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Nov 27 16:25:19 crc kubenswrapper[4707]: I1127 16:25:19.113757 4707 generic.go:334] "Generic (PLEG): container finished" podID="92f7fdf9-f4c8-443c-9ff6-b6a45719b9a7" containerID="bc4cd6fb764ee0a00676d1c905c75d7a05ef7f5d6de0d4154ec6e9aab895b39b" exitCode=0 Nov 27 16:25:19 crc kubenswrapper[4707]: I1127 16:25:19.113837 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-np9sx" event={"ID":"92f7fdf9-f4c8-443c-9ff6-b6a45719b9a7","Type":"ContainerDied","Data":"bc4cd6fb764ee0a00676d1c905c75d7a05ef7f5d6de0d4154ec6e9aab895b39b"} Nov 27 16:25:20 crc kubenswrapper[4707]: I1127 16:25:20.682608 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-np9sx" Nov 27 16:25:20 crc kubenswrapper[4707]: I1127 16:25:20.881060 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/92f7fdf9-f4c8-443c-9ff6-b6a45719b9a7-ssh-key\") pod \"92f7fdf9-f4c8-443c-9ff6-b6a45719b9a7\" (UID: \"92f7fdf9-f4c8-443c-9ff6-b6a45719b9a7\") " Nov 27 16:25:20 crc kubenswrapper[4707]: I1127 16:25:20.881821 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2k7r2\" (UniqueName: \"kubernetes.io/projected/92f7fdf9-f4c8-443c-9ff6-b6a45719b9a7-kube-api-access-2k7r2\") pod \"92f7fdf9-f4c8-443c-9ff6-b6a45719b9a7\" (UID: \"92f7fdf9-f4c8-443c-9ff6-b6a45719b9a7\") " Nov 27 16:25:20 crc kubenswrapper[4707]: I1127 16:25:20.882051 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/92f7fdf9-f4c8-443c-9ff6-b6a45719b9a7-repo-setup-combined-ca-bundle\") pod \"92f7fdf9-f4c8-443c-9ff6-b6a45719b9a7\" (UID: \"92f7fdf9-f4c8-443c-9ff6-b6a45719b9a7\") " Nov 27 16:25:20 crc kubenswrapper[4707]: I1127 16:25:20.882219 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/92f7fdf9-f4c8-443c-9ff6-b6a45719b9a7-inventory\") pod \"92f7fdf9-f4c8-443c-9ff6-b6a45719b9a7\" (UID: \"92f7fdf9-f4c8-443c-9ff6-b6a45719b9a7\") " Nov 27 16:25:20 crc kubenswrapper[4707]: I1127 16:25:20.888931 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/92f7fdf9-f4c8-443c-9ff6-b6a45719b9a7-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "92f7fdf9-f4c8-443c-9ff6-b6a45719b9a7" (UID: "92f7fdf9-f4c8-443c-9ff6-b6a45719b9a7"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 16:25:20 crc kubenswrapper[4707]: I1127 16:25:20.889200 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/92f7fdf9-f4c8-443c-9ff6-b6a45719b9a7-kube-api-access-2k7r2" (OuterVolumeSpecName: "kube-api-access-2k7r2") pod "92f7fdf9-f4c8-443c-9ff6-b6a45719b9a7" (UID: "92f7fdf9-f4c8-443c-9ff6-b6a45719b9a7"). InnerVolumeSpecName "kube-api-access-2k7r2". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 16:25:20 crc kubenswrapper[4707]: I1127 16:25:20.934242 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/92f7fdf9-f4c8-443c-9ff6-b6a45719b9a7-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "92f7fdf9-f4c8-443c-9ff6-b6a45719b9a7" (UID: "92f7fdf9-f4c8-443c-9ff6-b6a45719b9a7"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 16:25:20 crc kubenswrapper[4707]: I1127 16:25:20.935012 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/92f7fdf9-f4c8-443c-9ff6-b6a45719b9a7-inventory" (OuterVolumeSpecName: "inventory") pod "92f7fdf9-f4c8-443c-9ff6-b6a45719b9a7" (UID: "92f7fdf9-f4c8-443c-9ff6-b6a45719b9a7"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 16:25:20 crc kubenswrapper[4707]: I1127 16:25:20.986016 4707 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/92f7fdf9-f4c8-443c-9ff6-b6a45719b9a7-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 27 16:25:20 crc kubenswrapper[4707]: I1127 16:25:20.986068 4707 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/92f7fdf9-f4c8-443c-9ff6-b6a45719b9a7-inventory\") on node \"crc\" DevicePath \"\"" Nov 27 16:25:20 crc kubenswrapper[4707]: I1127 16:25:20.986089 4707 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/92f7fdf9-f4c8-443c-9ff6-b6a45719b9a7-ssh-key\") on node \"crc\" DevicePath \"\"" Nov 27 16:25:20 crc kubenswrapper[4707]: I1127 16:25:20.986108 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2k7r2\" (UniqueName: \"kubernetes.io/projected/92f7fdf9-f4c8-443c-9ff6-b6a45719b9a7-kube-api-access-2k7r2\") on node \"crc\" DevicePath \"\"" Nov 27 16:25:21 crc kubenswrapper[4707]: I1127 16:25:21.141937 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-np9sx" event={"ID":"92f7fdf9-f4c8-443c-9ff6-b6a45719b9a7","Type":"ContainerDied","Data":"959f4c1698a541c24a77c358a77af6c42851edd8283de2a2dd06909c9f8b79e0"} Nov 27 16:25:21 crc kubenswrapper[4707]: I1127 16:25:21.141995 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="959f4c1698a541c24a77c358a77af6c42851edd8283de2a2dd06909c9f8b79e0" Nov 27 16:25:21 crc kubenswrapper[4707]: I1127 16:25:21.142027 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-np9sx" Nov 27 16:25:21 crc kubenswrapper[4707]: I1127 16:25:21.247047 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-p954g"] Nov 27 16:25:21 crc kubenswrapper[4707]: E1127 16:25:21.247506 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="92f7fdf9-f4c8-443c-9ff6-b6a45719b9a7" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Nov 27 16:25:21 crc kubenswrapper[4707]: I1127 16:25:21.247527 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="92f7fdf9-f4c8-443c-9ff6-b6a45719b9a7" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Nov 27 16:25:21 crc kubenswrapper[4707]: I1127 16:25:21.247760 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="92f7fdf9-f4c8-443c-9ff6-b6a45719b9a7" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Nov 27 16:25:21 crc kubenswrapper[4707]: I1127 16:25:21.251169 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-p954g" Nov 27 16:25:21 crc kubenswrapper[4707]: I1127 16:25:21.253359 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-h79s2" Nov 27 16:25:21 crc kubenswrapper[4707]: I1127 16:25:21.255803 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Nov 27 16:25:21 crc kubenswrapper[4707]: I1127 16:25:21.257318 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Nov 27 16:25:21 crc kubenswrapper[4707]: I1127 16:25:21.257523 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Nov 27 16:25:21 crc kubenswrapper[4707]: I1127 16:25:21.265119 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-p954g"] Nov 27 16:25:21 crc kubenswrapper[4707]: I1127 16:25:21.391526 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-flnbk\" (UniqueName: \"kubernetes.io/projected/6f4ca349-556f-4de4-b23d-b00a59768241-kube-api-access-flnbk\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-p954g\" (UID: \"6f4ca349-556f-4de4-b23d-b00a59768241\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-p954g" Nov 27 16:25:21 crc kubenswrapper[4707]: I1127 16:25:21.391631 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/6f4ca349-556f-4de4-b23d-b00a59768241-ssh-key\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-p954g\" (UID: \"6f4ca349-556f-4de4-b23d-b00a59768241\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-p954g" Nov 27 16:25:21 crc kubenswrapper[4707]: I1127 16:25:21.391656 4707 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6f4ca349-556f-4de4-b23d-b00a59768241-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-p954g\" (UID: \"6f4ca349-556f-4de4-b23d-b00a59768241\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-p954g" Nov 27 16:25:21 crc kubenswrapper[4707]: I1127 16:25:21.492918 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/6f4ca349-556f-4de4-b23d-b00a59768241-ssh-key\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-p954g\" (UID: \"6f4ca349-556f-4de4-b23d-b00a59768241\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-p954g" Nov 27 16:25:21 crc kubenswrapper[4707]: I1127 16:25:21.493201 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6f4ca349-556f-4de4-b23d-b00a59768241-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-p954g\" (UID: \"6f4ca349-556f-4de4-b23d-b00a59768241\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-p954g" Nov 27 16:25:21 crc kubenswrapper[4707]: I1127 16:25:21.493727 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-flnbk\" (UniqueName: \"kubernetes.io/projected/6f4ca349-556f-4de4-b23d-b00a59768241-kube-api-access-flnbk\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-p954g\" (UID: \"6f4ca349-556f-4de4-b23d-b00a59768241\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-p954g" Nov 27 16:25:21 crc kubenswrapper[4707]: I1127 16:25:21.500018 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6f4ca349-556f-4de4-b23d-b00a59768241-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-p954g\" (UID: \"6f4ca349-556f-4de4-b23d-b00a59768241\") " 
pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-p954g" Nov 27 16:25:21 crc kubenswrapper[4707]: I1127 16:25:21.500279 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/6f4ca349-556f-4de4-b23d-b00a59768241-ssh-key\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-p954g\" (UID: \"6f4ca349-556f-4de4-b23d-b00a59768241\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-p954g" Nov 27 16:25:21 crc kubenswrapper[4707]: I1127 16:25:21.511031 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-flnbk\" (UniqueName: \"kubernetes.io/projected/6f4ca349-556f-4de4-b23d-b00a59768241-kube-api-access-flnbk\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-p954g\" (UID: \"6f4ca349-556f-4de4-b23d-b00a59768241\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-p954g" Nov 27 16:25:21 crc kubenswrapper[4707]: I1127 16:25:21.568131 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-p954g" Nov 27 16:25:22 crc kubenswrapper[4707]: I1127 16:25:22.083008 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-p954g"] Nov 27 16:25:22 crc kubenswrapper[4707]: I1127 16:25:22.153998 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-p954g" event={"ID":"6f4ca349-556f-4de4-b23d-b00a59768241","Type":"ContainerStarted","Data":"eba06005477c4aca6295fb35ba0693bd25ebeb822b93bfc744ae6d3df7f2f9f4"} Nov 27 16:25:23 crc kubenswrapper[4707]: I1127 16:25:23.167081 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-p954g" event={"ID":"6f4ca349-556f-4de4-b23d-b00a59768241","Type":"ContainerStarted","Data":"f29b61d2b9022adf11e75c4de1639846267fd130c88d8aa20c5edbdbb4034062"} Nov 27 16:25:23 crc kubenswrapper[4707]: I1127 16:25:23.192972 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-p954g" podStartSLOduration=1.590529283 podStartE2EDuration="2.192953475s" podCreationTimestamp="2025-11-27 16:25:21 +0000 UTC" firstStartedPulling="2025-11-27 16:25:22.087219532 +0000 UTC m=+1297.718668290" lastFinishedPulling="2025-11-27 16:25:22.689643684 +0000 UTC m=+1298.321092482" observedRunningTime="2025-11-27 16:25:23.184487729 +0000 UTC m=+1298.815936507" watchObservedRunningTime="2025-11-27 16:25:23.192953475 +0000 UTC m=+1298.824402253" Nov 27 16:25:26 crc kubenswrapper[4707]: I1127 16:25:26.204792 4707 generic.go:334] "Generic (PLEG): container finished" podID="6f4ca349-556f-4de4-b23d-b00a59768241" containerID="f29b61d2b9022adf11e75c4de1639846267fd130c88d8aa20c5edbdbb4034062" exitCode=0 Nov 27 16:25:26 crc kubenswrapper[4707]: I1127 16:25:26.204924 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-p954g" event={"ID":"6f4ca349-556f-4de4-b23d-b00a59768241","Type":"ContainerDied","Data":"f29b61d2b9022adf11e75c4de1639846267fd130c88d8aa20c5edbdbb4034062"} Nov 27 16:25:27 crc kubenswrapper[4707]: I1127 16:25:27.704910 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-p954g" Nov 27 16:25:27 crc kubenswrapper[4707]: I1127 16:25:27.819439 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-flnbk\" (UniqueName: \"kubernetes.io/projected/6f4ca349-556f-4de4-b23d-b00a59768241-kube-api-access-flnbk\") pod \"6f4ca349-556f-4de4-b23d-b00a59768241\" (UID: \"6f4ca349-556f-4de4-b23d-b00a59768241\") " Nov 27 16:25:27 crc kubenswrapper[4707]: I1127 16:25:27.819583 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6f4ca349-556f-4de4-b23d-b00a59768241-inventory\") pod \"6f4ca349-556f-4de4-b23d-b00a59768241\" (UID: \"6f4ca349-556f-4de4-b23d-b00a59768241\") " Nov 27 16:25:27 crc kubenswrapper[4707]: I1127 16:25:27.819682 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/6f4ca349-556f-4de4-b23d-b00a59768241-ssh-key\") pod \"6f4ca349-556f-4de4-b23d-b00a59768241\" (UID: \"6f4ca349-556f-4de4-b23d-b00a59768241\") " Nov 27 16:25:27 crc kubenswrapper[4707]: I1127 16:25:27.825824 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6f4ca349-556f-4de4-b23d-b00a59768241-kube-api-access-flnbk" (OuterVolumeSpecName: "kube-api-access-flnbk") pod "6f4ca349-556f-4de4-b23d-b00a59768241" (UID: "6f4ca349-556f-4de4-b23d-b00a59768241"). InnerVolumeSpecName "kube-api-access-flnbk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 16:25:27 crc kubenswrapper[4707]: I1127 16:25:27.873613 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6f4ca349-556f-4de4-b23d-b00a59768241-inventory" (OuterVolumeSpecName: "inventory") pod "6f4ca349-556f-4de4-b23d-b00a59768241" (UID: "6f4ca349-556f-4de4-b23d-b00a59768241"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 16:25:27 crc kubenswrapper[4707]: I1127 16:25:27.876415 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6f4ca349-556f-4de4-b23d-b00a59768241-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "6f4ca349-556f-4de4-b23d-b00a59768241" (UID: "6f4ca349-556f-4de4-b23d-b00a59768241"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 16:25:27 crc kubenswrapper[4707]: I1127 16:25:27.922800 4707 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6f4ca349-556f-4de4-b23d-b00a59768241-inventory\") on node \"crc\" DevicePath \"\"" Nov 27 16:25:27 crc kubenswrapper[4707]: I1127 16:25:27.922846 4707 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/6f4ca349-556f-4de4-b23d-b00a59768241-ssh-key\") on node \"crc\" DevicePath \"\"" Nov 27 16:25:27 crc kubenswrapper[4707]: I1127 16:25:27.922870 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-flnbk\" (UniqueName: \"kubernetes.io/projected/6f4ca349-556f-4de4-b23d-b00a59768241-kube-api-access-flnbk\") on node \"crc\" DevicePath \"\"" Nov 27 16:25:28 crc kubenswrapper[4707]: I1127 16:25:28.230843 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-p954g" 
event={"ID":"6f4ca349-556f-4de4-b23d-b00a59768241","Type":"ContainerDied","Data":"eba06005477c4aca6295fb35ba0693bd25ebeb822b93bfc744ae6d3df7f2f9f4"} Nov 27 16:25:28 crc kubenswrapper[4707]: I1127 16:25:28.230897 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="eba06005477c4aca6295fb35ba0693bd25ebeb822b93bfc744ae6d3df7f2f9f4" Nov 27 16:25:28 crc kubenswrapper[4707]: I1127 16:25:28.230928 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-p954g" Nov 27 16:25:28 crc kubenswrapper[4707]: I1127 16:25:28.337821 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-fmszh"] Nov 27 16:25:28 crc kubenswrapper[4707]: E1127 16:25:28.338585 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6f4ca349-556f-4de4-b23d-b00a59768241" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Nov 27 16:25:28 crc kubenswrapper[4707]: I1127 16:25:28.338617 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f4ca349-556f-4de4-b23d-b00a59768241" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Nov 27 16:25:28 crc kubenswrapper[4707]: I1127 16:25:28.338976 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="6f4ca349-556f-4de4-b23d-b00a59768241" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Nov 27 16:25:28 crc kubenswrapper[4707]: I1127 16:25:28.340077 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-fmszh" Nov 27 16:25:28 crc kubenswrapper[4707]: I1127 16:25:28.360676 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-h79s2" Nov 27 16:25:28 crc kubenswrapper[4707]: I1127 16:25:28.361049 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Nov 27 16:25:28 crc kubenswrapper[4707]: I1127 16:25:28.361346 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Nov 27 16:25:28 crc kubenswrapper[4707]: I1127 16:25:28.361634 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Nov 27 16:25:28 crc kubenswrapper[4707]: I1127 16:25:28.386619 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-fmszh"] Nov 27 16:25:28 crc kubenswrapper[4707]: I1127 16:25:28.434017 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vkrs7\" (UniqueName: \"kubernetes.io/projected/a0d367b5-ffe7-4a0d-9216-ddeb28aa8c8f-kube-api-access-vkrs7\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-fmszh\" (UID: \"a0d367b5-ffe7-4a0d-9216-ddeb28aa8c8f\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-fmszh" Nov 27 16:25:28 crc kubenswrapper[4707]: I1127 16:25:28.434093 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a0d367b5-ffe7-4a0d-9216-ddeb28aa8c8f-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-fmszh\" (UID: \"a0d367b5-ffe7-4a0d-9216-ddeb28aa8c8f\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-fmszh" Nov 27 16:25:28 crc kubenswrapper[4707]: I1127 16:25:28.434128 4707 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a0d367b5-ffe7-4a0d-9216-ddeb28aa8c8f-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-fmszh\" (UID: \"a0d367b5-ffe7-4a0d-9216-ddeb28aa8c8f\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-fmszh" Nov 27 16:25:28 crc kubenswrapper[4707]: I1127 16:25:28.434187 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a0d367b5-ffe7-4a0d-9216-ddeb28aa8c8f-ssh-key\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-fmszh\" (UID: \"a0d367b5-ffe7-4a0d-9216-ddeb28aa8c8f\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-fmszh" Nov 27 16:25:28 crc kubenswrapper[4707]: I1127 16:25:28.536194 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a0d367b5-ffe7-4a0d-9216-ddeb28aa8c8f-ssh-key\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-fmszh\" (UID: \"a0d367b5-ffe7-4a0d-9216-ddeb28aa8c8f\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-fmszh" Nov 27 16:25:28 crc kubenswrapper[4707]: I1127 16:25:28.536287 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vkrs7\" (UniqueName: \"kubernetes.io/projected/a0d367b5-ffe7-4a0d-9216-ddeb28aa8c8f-kube-api-access-vkrs7\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-fmszh\" (UID: \"a0d367b5-ffe7-4a0d-9216-ddeb28aa8c8f\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-fmszh" Nov 27 16:25:28 crc kubenswrapper[4707]: I1127 16:25:28.536341 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a0d367b5-ffe7-4a0d-9216-ddeb28aa8c8f-inventory\") pod 
\"bootstrap-edpm-deployment-openstack-edpm-ipam-fmszh\" (UID: \"a0d367b5-ffe7-4a0d-9216-ddeb28aa8c8f\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-fmszh" Nov 27 16:25:28 crc kubenswrapper[4707]: I1127 16:25:28.536376 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a0d367b5-ffe7-4a0d-9216-ddeb28aa8c8f-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-fmszh\" (UID: \"a0d367b5-ffe7-4a0d-9216-ddeb28aa8c8f\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-fmszh" Nov 27 16:25:28 crc kubenswrapper[4707]: I1127 16:25:28.539498 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a0d367b5-ffe7-4a0d-9216-ddeb28aa8c8f-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-fmszh\" (UID: \"a0d367b5-ffe7-4a0d-9216-ddeb28aa8c8f\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-fmszh" Nov 27 16:25:28 crc kubenswrapper[4707]: I1127 16:25:28.539778 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a0d367b5-ffe7-4a0d-9216-ddeb28aa8c8f-ssh-key\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-fmszh\" (UID: \"a0d367b5-ffe7-4a0d-9216-ddeb28aa8c8f\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-fmszh" Nov 27 16:25:28 crc kubenswrapper[4707]: I1127 16:25:28.540059 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a0d367b5-ffe7-4a0d-9216-ddeb28aa8c8f-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-fmszh\" (UID: \"a0d367b5-ffe7-4a0d-9216-ddeb28aa8c8f\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-fmszh" Nov 27 16:25:28 crc kubenswrapper[4707]: I1127 16:25:28.563102 4707 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vkrs7\" (UniqueName: \"kubernetes.io/projected/a0d367b5-ffe7-4a0d-9216-ddeb28aa8c8f-kube-api-access-vkrs7\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-fmszh\" (UID: \"a0d367b5-ffe7-4a0d-9216-ddeb28aa8c8f\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-fmszh" Nov 27 16:25:28 crc kubenswrapper[4707]: I1127 16:25:28.663834 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-fmszh" Nov 27 16:25:29 crc kubenswrapper[4707]: I1127 16:25:29.218537 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-fmszh"] Nov 27 16:25:29 crc kubenswrapper[4707]: I1127 16:25:29.245473 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-fmszh" event={"ID":"a0d367b5-ffe7-4a0d-9216-ddeb28aa8c8f","Type":"ContainerStarted","Data":"35222f37ca6e532b4bddd02f6e1a766fc4e1fed442f63482bfbd28bf055d5b7d"} Nov 27 16:25:30 crc kubenswrapper[4707]: I1127 16:25:30.256347 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-fmszh" event={"ID":"a0d367b5-ffe7-4a0d-9216-ddeb28aa8c8f","Type":"ContainerStarted","Data":"e4a640a22be91e3b1c1ac40c8e2a448ec9a4823f28002aecdf8c42aba2b06e46"} Nov 27 16:25:30 crc kubenswrapper[4707]: I1127 16:25:30.297319 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-fmszh" podStartSLOduration=1.8480659739999998 podStartE2EDuration="2.297294867s" podCreationTimestamp="2025-11-27 16:25:28 +0000 UTC" firstStartedPulling="2025-11-27 16:25:29.217446795 +0000 UTC m=+1304.848895573" lastFinishedPulling="2025-11-27 16:25:29.666675698 +0000 UTC m=+1305.298124466" observedRunningTime="2025-11-27 16:25:30.278717805 +0000 UTC 
m=+1305.910166573" watchObservedRunningTime="2025-11-27 16:25:30.297294867 +0000 UTC m=+1305.928743675" Nov 27 16:26:03 crc kubenswrapper[4707]: I1127 16:26:03.623765 4707 patch_prober.go:28] interesting pod/machine-config-daemon-c995m container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 27 16:26:03 crc kubenswrapper[4707]: I1127 16:26:03.624401 4707 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-c995m" podUID="a83beb0d-8dd1-434a-ace2-933f98e3956f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 27 16:26:10 crc kubenswrapper[4707]: I1127 16:26:10.606865 4707 scope.go:117] "RemoveContainer" containerID="02800a9fdfba6d6e27d3309db4a1bf5af39a66efb48603760535b9c096826729" Nov 27 16:26:10 crc kubenswrapper[4707]: I1127 16:26:10.638046 4707 scope.go:117] "RemoveContainer" containerID="879c1f13cbf8a83e5d4ee2dcc71894cb59b883cd2f050b66804241c0411b4167" Nov 27 16:26:33 crc kubenswrapper[4707]: I1127 16:26:33.624275 4707 patch_prober.go:28] interesting pod/machine-config-daemon-c995m container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 27 16:26:33 crc kubenswrapper[4707]: I1127 16:26:33.625086 4707 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-c995m" podUID="a83beb0d-8dd1-434a-ace2-933f98e3956f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 27 16:27:03 crc kubenswrapper[4707]: 
I1127 16:27:03.624314 4707 patch_prober.go:28] interesting pod/machine-config-daemon-c995m container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 27 16:27:03 crc kubenswrapper[4707]: I1127 16:27:03.626585 4707 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-c995m" podUID="a83beb0d-8dd1-434a-ace2-933f98e3956f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 27 16:27:03 crc kubenswrapper[4707]: I1127 16:27:03.626709 4707 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-c995m" Nov 27 16:27:03 crc kubenswrapper[4707]: I1127 16:27:03.628091 4707 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"6c7e6d0aacc40003f1cf38f6710c940eb56f156be22a726feeaa22ebce682c5d"} pod="openshift-machine-config-operator/machine-config-daemon-c995m" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Nov 27 16:27:03 crc kubenswrapper[4707]: I1127 16:27:03.628227 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-c995m" podUID="a83beb0d-8dd1-434a-ace2-933f98e3956f" containerName="machine-config-daemon" containerID="cri-o://6c7e6d0aacc40003f1cf38f6710c940eb56f156be22a726feeaa22ebce682c5d" gracePeriod=600 Nov 27 16:27:04 crc kubenswrapper[4707]: I1127 16:27:04.489596 4707 generic.go:334] "Generic (PLEG): container finished" podID="a83beb0d-8dd1-434a-ace2-933f98e3956f" containerID="6c7e6d0aacc40003f1cf38f6710c940eb56f156be22a726feeaa22ebce682c5d" exitCode=0 Nov 
27 16:27:04 crc kubenswrapper[4707]: I1127 16:27:04.489693 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-c995m" event={"ID":"a83beb0d-8dd1-434a-ace2-933f98e3956f","Type":"ContainerDied","Data":"6c7e6d0aacc40003f1cf38f6710c940eb56f156be22a726feeaa22ebce682c5d"} Nov 27 16:27:04 crc kubenswrapper[4707]: I1127 16:27:04.490194 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-c995m" event={"ID":"a83beb0d-8dd1-434a-ace2-933f98e3956f","Type":"ContainerStarted","Data":"98accd0e2044449ac7795f23cc0c9b20e1cb9fd7b29855e0dc1d78765b3e2e59"} Nov 27 16:27:04 crc kubenswrapper[4707]: I1127 16:27:04.490235 4707 scope.go:117] "RemoveContainer" containerID="a80c79e17bd42a14677ea7ec5718ee1c93082c9c4030211d42f0b9a8e6591e20" Nov 27 16:27:10 crc kubenswrapper[4707]: I1127 16:27:10.754594 4707 scope.go:117] "RemoveContainer" containerID="7d261fcd6ef13459b3a61a1ccfa03a9eb1612fb9e8ff130111490a131872ecca" Nov 27 16:27:10 crc kubenswrapper[4707]: I1127 16:27:10.793536 4707 scope.go:117] "RemoveContainer" containerID="caa589fec436214639fa34128f727e3785d722dfd159296bfe4a33bbbc85191d" Nov 27 16:27:10 crc kubenswrapper[4707]: I1127 16:27:10.837010 4707 scope.go:117] "RemoveContainer" containerID="712557d6b5ee3c01c306eeb1e653fa4c69f6c8eec74fff38fbaa551d2cf27f5c" Nov 27 16:28:11 crc kubenswrapper[4707]: I1127 16:28:11.007672 4707 scope.go:117] "RemoveContainer" containerID="4ab47acaf3bf564824c619ceb7e5ef231b9cb38b5c5c0d098f26ed9e27ffacc3" Nov 27 16:28:11 crc kubenswrapper[4707]: I1127 16:28:11.055757 4707 scope.go:117] "RemoveContainer" containerID="afea46a79746181e6cf61422509dd29eb29e49f31d8630ee5db82a5af9487ddd" Nov 27 16:28:11 crc kubenswrapper[4707]: I1127 16:28:11.083023 4707 scope.go:117] "RemoveContainer" containerID="f6ed0b65fd1bb3f7cfed1acf04eae96d1456431c717f0f39cdfe7f83243b2e4d" Nov 27 16:28:11 crc kubenswrapper[4707]: I1127 16:28:11.116084 4707 
scope.go:117] "RemoveContainer" containerID="622e5ed3f0f124984e4d55cbfaa4e258b2bbb08221ab94c92cddb1927c240800" Nov 27 16:28:11 crc kubenswrapper[4707]: I1127 16:28:11.145367 4707 scope.go:117] "RemoveContainer" containerID="055310bb79129fbd5f9e58df2876727ead3143c79299861fb6c74bacf0d6a460" Nov 27 16:28:11 crc kubenswrapper[4707]: I1127 16:28:11.182503 4707 scope.go:117] "RemoveContainer" containerID="21d8e8ec5b44e44701f27c4c99a16782f4f6ec9bf3b2151993581016bdaa83ba" Nov 27 16:28:11 crc kubenswrapper[4707]: I1127 16:28:11.230133 4707 scope.go:117] "RemoveContainer" containerID="4ee81a19f1bfdf194cc1b3f636355dd46362f92fc02a5d961d4327980003b8ab" Nov 27 16:28:46 crc kubenswrapper[4707]: I1127 16:28:46.772659 4707 generic.go:334] "Generic (PLEG): container finished" podID="a0d367b5-ffe7-4a0d-9216-ddeb28aa8c8f" containerID="e4a640a22be91e3b1c1ac40c8e2a448ec9a4823f28002aecdf8c42aba2b06e46" exitCode=0 Nov 27 16:28:46 crc kubenswrapper[4707]: I1127 16:28:46.772765 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-fmszh" event={"ID":"a0d367b5-ffe7-4a0d-9216-ddeb28aa8c8f","Type":"ContainerDied","Data":"e4a640a22be91e3b1c1ac40c8e2a448ec9a4823f28002aecdf8c42aba2b06e46"} Nov 27 16:28:48 crc kubenswrapper[4707]: I1127 16:28:48.158959 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-fmszh" Nov 27 16:28:48 crc kubenswrapper[4707]: I1127 16:28:48.329221 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vkrs7\" (UniqueName: \"kubernetes.io/projected/a0d367b5-ffe7-4a0d-9216-ddeb28aa8c8f-kube-api-access-vkrs7\") pod \"a0d367b5-ffe7-4a0d-9216-ddeb28aa8c8f\" (UID: \"a0d367b5-ffe7-4a0d-9216-ddeb28aa8c8f\") " Nov 27 16:28:48 crc kubenswrapper[4707]: I1127 16:28:48.329513 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a0d367b5-ffe7-4a0d-9216-ddeb28aa8c8f-inventory\") pod \"a0d367b5-ffe7-4a0d-9216-ddeb28aa8c8f\" (UID: \"a0d367b5-ffe7-4a0d-9216-ddeb28aa8c8f\") " Nov 27 16:28:48 crc kubenswrapper[4707]: I1127 16:28:48.329557 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a0d367b5-ffe7-4a0d-9216-ddeb28aa8c8f-ssh-key\") pod \"a0d367b5-ffe7-4a0d-9216-ddeb28aa8c8f\" (UID: \"a0d367b5-ffe7-4a0d-9216-ddeb28aa8c8f\") " Nov 27 16:28:48 crc kubenswrapper[4707]: I1127 16:28:48.329730 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a0d367b5-ffe7-4a0d-9216-ddeb28aa8c8f-bootstrap-combined-ca-bundle\") pod \"a0d367b5-ffe7-4a0d-9216-ddeb28aa8c8f\" (UID: \"a0d367b5-ffe7-4a0d-9216-ddeb28aa8c8f\") " Nov 27 16:28:48 crc kubenswrapper[4707]: I1127 16:28:48.346187 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0d367b5-ffe7-4a0d-9216-ddeb28aa8c8f-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "a0d367b5-ffe7-4a0d-9216-ddeb28aa8c8f" (UID: "a0d367b5-ffe7-4a0d-9216-ddeb28aa8c8f"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 16:28:48 crc kubenswrapper[4707]: I1127 16:28:48.346425 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0d367b5-ffe7-4a0d-9216-ddeb28aa8c8f-kube-api-access-vkrs7" (OuterVolumeSpecName: "kube-api-access-vkrs7") pod "a0d367b5-ffe7-4a0d-9216-ddeb28aa8c8f" (UID: "a0d367b5-ffe7-4a0d-9216-ddeb28aa8c8f"). InnerVolumeSpecName "kube-api-access-vkrs7". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 16:28:48 crc kubenswrapper[4707]: I1127 16:28:48.371806 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0d367b5-ffe7-4a0d-9216-ddeb28aa8c8f-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "a0d367b5-ffe7-4a0d-9216-ddeb28aa8c8f" (UID: "a0d367b5-ffe7-4a0d-9216-ddeb28aa8c8f"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 16:28:48 crc kubenswrapper[4707]: I1127 16:28:48.385365 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0d367b5-ffe7-4a0d-9216-ddeb28aa8c8f-inventory" (OuterVolumeSpecName: "inventory") pod "a0d367b5-ffe7-4a0d-9216-ddeb28aa8c8f" (UID: "a0d367b5-ffe7-4a0d-9216-ddeb28aa8c8f"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 16:28:48 crc kubenswrapper[4707]: I1127 16:28:48.439562 4707 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a0d367b5-ffe7-4a0d-9216-ddeb28aa8c8f-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 27 16:28:48 crc kubenswrapper[4707]: I1127 16:28:48.439609 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vkrs7\" (UniqueName: \"kubernetes.io/projected/a0d367b5-ffe7-4a0d-9216-ddeb28aa8c8f-kube-api-access-vkrs7\") on node \"crc\" DevicePath \"\"" Nov 27 16:28:48 crc kubenswrapper[4707]: I1127 16:28:48.439628 4707 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a0d367b5-ffe7-4a0d-9216-ddeb28aa8c8f-inventory\") on node \"crc\" DevicePath \"\"" Nov 27 16:28:48 crc kubenswrapper[4707]: I1127 16:28:48.439646 4707 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a0d367b5-ffe7-4a0d-9216-ddeb28aa8c8f-ssh-key\") on node \"crc\" DevicePath \"\"" Nov 27 16:28:48 crc kubenswrapper[4707]: I1127 16:28:48.793406 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-fmszh" event={"ID":"a0d367b5-ffe7-4a0d-9216-ddeb28aa8c8f","Type":"ContainerDied","Data":"35222f37ca6e532b4bddd02f6e1a766fc4e1fed442f63482bfbd28bf055d5b7d"} Nov 27 16:28:48 crc kubenswrapper[4707]: I1127 16:28:48.793453 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="35222f37ca6e532b4bddd02f6e1a766fc4e1fed442f63482bfbd28bf055d5b7d" Nov 27 16:28:48 crc kubenswrapper[4707]: I1127 16:28:48.793609 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-fmszh" Nov 27 16:28:48 crc kubenswrapper[4707]: I1127 16:28:48.908094 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-mwkpl"] Nov 27 16:28:48 crc kubenswrapper[4707]: E1127 16:28:48.908526 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a0d367b5-ffe7-4a0d-9216-ddeb28aa8c8f" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Nov 27 16:28:48 crc kubenswrapper[4707]: I1127 16:28:48.908549 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="a0d367b5-ffe7-4a0d-9216-ddeb28aa8c8f" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Nov 27 16:28:48 crc kubenswrapper[4707]: I1127 16:28:48.908789 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="a0d367b5-ffe7-4a0d-9216-ddeb28aa8c8f" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Nov 27 16:28:48 crc kubenswrapper[4707]: I1127 16:28:48.909607 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-mwkpl" Nov 27 16:28:48 crc kubenswrapper[4707]: I1127 16:28:48.913223 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-h79s2" Nov 27 16:28:48 crc kubenswrapper[4707]: I1127 16:28:48.914296 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Nov 27 16:28:48 crc kubenswrapper[4707]: I1127 16:28:48.914388 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Nov 27 16:28:48 crc kubenswrapper[4707]: I1127 16:28:48.914517 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Nov 27 16:28:48 crc kubenswrapper[4707]: I1127 16:28:48.917579 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-mwkpl"] Nov 27 16:28:49 crc kubenswrapper[4707]: I1127 16:28:49.050792 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/540a037b-eddb-4f11-8ed0-209cebfc0ee1-ssh-key\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-mwkpl\" (UID: \"540a037b-eddb-4f11-8ed0-209cebfc0ee1\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-mwkpl" Nov 27 16:28:49 crc kubenswrapper[4707]: I1127 16:28:49.051356 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/540a037b-eddb-4f11-8ed0-209cebfc0ee1-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-mwkpl\" (UID: \"540a037b-eddb-4f11-8ed0-209cebfc0ee1\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-mwkpl" Nov 27 16:28:49 crc kubenswrapper[4707]: I1127 16:28:49.051687 4707 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-npps7\" (UniqueName: \"kubernetes.io/projected/540a037b-eddb-4f11-8ed0-209cebfc0ee1-kube-api-access-npps7\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-mwkpl\" (UID: \"540a037b-eddb-4f11-8ed0-209cebfc0ee1\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-mwkpl" Nov 27 16:28:49 crc kubenswrapper[4707]: I1127 16:28:49.153713 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/540a037b-eddb-4f11-8ed0-209cebfc0ee1-ssh-key\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-mwkpl\" (UID: \"540a037b-eddb-4f11-8ed0-209cebfc0ee1\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-mwkpl" Nov 27 16:28:49 crc kubenswrapper[4707]: I1127 16:28:49.153782 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/540a037b-eddb-4f11-8ed0-209cebfc0ee1-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-mwkpl\" (UID: \"540a037b-eddb-4f11-8ed0-209cebfc0ee1\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-mwkpl" Nov 27 16:28:49 crc kubenswrapper[4707]: I1127 16:28:49.153879 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-npps7\" (UniqueName: \"kubernetes.io/projected/540a037b-eddb-4f11-8ed0-209cebfc0ee1-kube-api-access-npps7\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-mwkpl\" (UID: \"540a037b-eddb-4f11-8ed0-209cebfc0ee1\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-mwkpl" Nov 27 16:28:49 crc kubenswrapper[4707]: I1127 16:28:49.160814 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/540a037b-eddb-4f11-8ed0-209cebfc0ee1-ssh-key\") pod 
\"download-cache-edpm-deployment-openstack-edpm-ipam-mwkpl\" (UID: \"540a037b-eddb-4f11-8ed0-209cebfc0ee1\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-mwkpl" Nov 27 16:28:49 crc kubenswrapper[4707]: I1127 16:28:49.166896 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/540a037b-eddb-4f11-8ed0-209cebfc0ee1-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-mwkpl\" (UID: \"540a037b-eddb-4f11-8ed0-209cebfc0ee1\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-mwkpl" Nov 27 16:28:49 crc kubenswrapper[4707]: I1127 16:28:49.181609 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-npps7\" (UniqueName: \"kubernetes.io/projected/540a037b-eddb-4f11-8ed0-209cebfc0ee1-kube-api-access-npps7\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-mwkpl\" (UID: \"540a037b-eddb-4f11-8ed0-209cebfc0ee1\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-mwkpl" Nov 27 16:28:49 crc kubenswrapper[4707]: I1127 16:28:49.240943 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-mwkpl" Nov 27 16:28:49 crc kubenswrapper[4707]: I1127 16:28:49.809307 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-mwkpl"] Nov 27 16:28:49 crc kubenswrapper[4707]: W1127 16:28:49.817613 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod540a037b_eddb_4f11_8ed0_209cebfc0ee1.slice/crio-a0dda95b299d715e595a4c5ffaf58077c1be16b02954e376ce4ddca7efa105f1 WatchSource:0}: Error finding container a0dda95b299d715e595a4c5ffaf58077c1be16b02954e376ce4ddca7efa105f1: Status 404 returned error can't find the container with id a0dda95b299d715e595a4c5ffaf58077c1be16b02954e376ce4ddca7efa105f1 Nov 27 16:28:50 crc kubenswrapper[4707]: I1127 16:28:50.810565 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-mwkpl" event={"ID":"540a037b-eddb-4f11-8ed0-209cebfc0ee1","Type":"ContainerStarted","Data":"60e193ade138067a07fda6ddebb4c390e5399d923e1693621a2a087d60d45d7e"} Nov 27 16:28:50 crc kubenswrapper[4707]: I1127 16:28:50.810928 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-mwkpl" event={"ID":"540a037b-eddb-4f11-8ed0-209cebfc0ee1","Type":"ContainerStarted","Data":"a0dda95b299d715e595a4c5ffaf58077c1be16b02954e376ce4ddca7efa105f1"} Nov 27 16:28:50 crc kubenswrapper[4707]: I1127 16:28:50.834319 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-mwkpl" podStartSLOduration=2.254945518 podStartE2EDuration="2.834302335s" podCreationTimestamp="2025-11-27 16:28:48 +0000 UTC" firstStartedPulling="2025-11-27 16:28:49.81951084 +0000 UTC m=+1505.450959608" lastFinishedPulling="2025-11-27 16:28:50.398867657 +0000 UTC 
m=+1506.030316425" observedRunningTime="2025-11-27 16:28:50.831045285 +0000 UTC m=+1506.462494063" watchObservedRunningTime="2025-11-27 16:28:50.834302335 +0000 UTC m=+1506.465751103" Nov 27 16:29:11 crc kubenswrapper[4707]: I1127 16:29:11.348607 4707 scope.go:117] "RemoveContainer" containerID="2d506a98aef89c7aa4420965fcb0a6314d9b8710484a1f60f42596ee04be0983" Nov 27 16:29:11 crc kubenswrapper[4707]: I1127 16:29:11.378828 4707 scope.go:117] "RemoveContainer" containerID="eeaf1d3be99ba07fc12759afdd6dc44013ba560cc093c6ca1634b8dd4174ed1e" Nov 27 16:29:11 crc kubenswrapper[4707]: I1127 16:29:11.445711 4707 scope.go:117] "RemoveContainer" containerID="850ffd2ac4f6fc750a1234dc25593a820b170122351d923e194369828bba3f32" Nov 27 16:29:11 crc kubenswrapper[4707]: I1127 16:29:11.495816 4707 scope.go:117] "RemoveContainer" containerID="b203f67cd1cbf3949f13ad1cbceb6aaa0cd33cca5ba912d48e685e6d831542d0" Nov 27 16:29:33 crc kubenswrapper[4707]: I1127 16:29:33.625622 4707 patch_prober.go:28] interesting pod/machine-config-daemon-c995m container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 27 16:29:33 crc kubenswrapper[4707]: I1127 16:29:33.626260 4707 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-c995m" podUID="a83beb0d-8dd1-434a-ace2-933f98e3956f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 27 16:29:41 crc kubenswrapper[4707]: I1127 16:29:41.078530 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-create-dr4hp"] Nov 27 16:29:41 crc kubenswrapper[4707]: I1127 16:29:41.088487 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-create-9dglp"] Nov 27 16:29:41 crc 
kubenswrapper[4707]: I1127 16:29:41.098179 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-b279-account-create-update-ndbrb"] Nov 27 16:29:41 crc kubenswrapper[4707]: I1127 16:29:41.109649 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-8a5e-account-create-update-knzqk"] Nov 27 16:29:41 crc kubenswrapper[4707]: I1127 16:29:41.118050 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-create-9dglp"] Nov 27 16:29:41 crc kubenswrapper[4707]: I1127 16:29:41.126689 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-create-dr4hp"] Nov 27 16:29:41 crc kubenswrapper[4707]: I1127 16:29:41.134845 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-8a5e-account-create-update-knzqk"] Nov 27 16:29:41 crc kubenswrapper[4707]: I1127 16:29:41.142982 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-b279-account-create-update-ndbrb"] Nov 27 16:29:41 crc kubenswrapper[4707]: I1127 16:29:41.211809 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="252b2a8c-9a8b-4883-9ae7-27fee7760d5f" path="/var/lib/kubelet/pods/252b2a8c-9a8b-4883-9ae7-27fee7760d5f/volumes" Nov 27 16:29:41 crc kubenswrapper[4707]: I1127 16:29:41.213354 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4f5ca70a-f09d-4baf-a71e-21c7aeefb867" path="/var/lib/kubelet/pods/4f5ca70a-f09d-4baf-a71e-21c7aeefb867/volumes" Nov 27 16:29:41 crc kubenswrapper[4707]: I1127 16:29:41.214690 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5844334a-ffad-42d6-a2cb-b714fe85f90f" path="/var/lib/kubelet/pods/5844334a-ffad-42d6-a2cb-b714fe85f90f/volumes" Nov 27 16:29:41 crc kubenswrapper[4707]: I1127 16:29:41.215837 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8dcab406-6c2d-497a-aad3-86162558c506" 
path="/var/lib/kubelet/pods/8dcab406-6c2d-497a-aad3-86162558c506/volumes" Nov 27 16:29:44 crc kubenswrapper[4707]: I1127 16:29:44.029037 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-create-4g7sb"] Nov 27 16:29:44 crc kubenswrapper[4707]: I1127 16:29:44.037151 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-create-4g7sb"] Nov 27 16:29:45 crc kubenswrapper[4707]: I1127 16:29:45.030536 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-1a8e-account-create-update-qhr8n"] Nov 27 16:29:45 crc kubenswrapper[4707]: I1127 16:29:45.043031 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-1a8e-account-create-update-qhr8n"] Nov 27 16:29:45 crc kubenswrapper[4707]: I1127 16:29:45.211201 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="62d0aaa3-bc36-4408-ae01-14d362740486" path="/var/lib/kubelet/pods/62d0aaa3-bc36-4408-ae01-14d362740486/volumes" Nov 27 16:29:45 crc kubenswrapper[4707]: I1127 16:29:45.211980 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="62dcf83c-a8a2-4362-b412-8243d19bd711" path="/var/lib/kubelet/pods/62dcf83c-a8a2-4362-b412-8243d19bd711/volumes" Nov 27 16:30:00 crc kubenswrapper[4707]: I1127 16:30:00.165341 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29404350-t6ghp"] Nov 27 16:30:00 crc kubenswrapper[4707]: I1127 16:30:00.168801 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29404350-t6ghp" Nov 27 16:30:00 crc kubenswrapper[4707]: I1127 16:30:00.171440 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Nov 27 16:30:00 crc kubenswrapper[4707]: I1127 16:30:00.171932 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Nov 27 16:30:00 crc kubenswrapper[4707]: I1127 16:30:00.177528 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29404350-t6ghp"] Nov 27 16:30:00 crc kubenswrapper[4707]: I1127 16:30:00.331266 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/834bfdd5-e6b0-4f86-b805-867827ea250e-secret-volume\") pod \"collect-profiles-29404350-t6ghp\" (UID: \"834bfdd5-e6b0-4f86-b805-867827ea250e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29404350-t6ghp" Nov 27 16:30:00 crc kubenswrapper[4707]: I1127 16:30:00.331447 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/834bfdd5-e6b0-4f86-b805-867827ea250e-config-volume\") pod \"collect-profiles-29404350-t6ghp\" (UID: \"834bfdd5-e6b0-4f86-b805-867827ea250e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29404350-t6ghp" Nov 27 16:30:00 crc kubenswrapper[4707]: I1127 16:30:00.332043 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lq8n7\" (UniqueName: \"kubernetes.io/projected/834bfdd5-e6b0-4f86-b805-867827ea250e-kube-api-access-lq8n7\") pod \"collect-profiles-29404350-t6ghp\" (UID: \"834bfdd5-e6b0-4f86-b805-867827ea250e\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29404350-t6ghp" Nov 27 16:30:00 crc kubenswrapper[4707]: I1127 16:30:00.434076 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lq8n7\" (UniqueName: \"kubernetes.io/projected/834bfdd5-e6b0-4f86-b805-867827ea250e-kube-api-access-lq8n7\") pod \"collect-profiles-29404350-t6ghp\" (UID: \"834bfdd5-e6b0-4f86-b805-867827ea250e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29404350-t6ghp" Nov 27 16:30:00 crc kubenswrapper[4707]: I1127 16:30:00.434136 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/834bfdd5-e6b0-4f86-b805-867827ea250e-secret-volume\") pod \"collect-profiles-29404350-t6ghp\" (UID: \"834bfdd5-e6b0-4f86-b805-867827ea250e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29404350-t6ghp" Nov 27 16:30:00 crc kubenswrapper[4707]: I1127 16:30:00.434223 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/834bfdd5-e6b0-4f86-b805-867827ea250e-config-volume\") pod \"collect-profiles-29404350-t6ghp\" (UID: \"834bfdd5-e6b0-4f86-b805-867827ea250e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29404350-t6ghp" Nov 27 16:30:00 crc kubenswrapper[4707]: I1127 16:30:00.435060 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/834bfdd5-e6b0-4f86-b805-867827ea250e-config-volume\") pod \"collect-profiles-29404350-t6ghp\" (UID: \"834bfdd5-e6b0-4f86-b805-867827ea250e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29404350-t6ghp" Nov 27 16:30:00 crc kubenswrapper[4707]: I1127 16:30:00.441392 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/834bfdd5-e6b0-4f86-b805-867827ea250e-secret-volume\") pod \"collect-profiles-29404350-t6ghp\" (UID: \"834bfdd5-e6b0-4f86-b805-867827ea250e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29404350-t6ghp" Nov 27 16:30:00 crc kubenswrapper[4707]: I1127 16:30:00.451603 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lq8n7\" (UniqueName: \"kubernetes.io/projected/834bfdd5-e6b0-4f86-b805-867827ea250e-kube-api-access-lq8n7\") pod \"collect-profiles-29404350-t6ghp\" (UID: \"834bfdd5-e6b0-4f86-b805-867827ea250e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29404350-t6ghp" Nov 27 16:30:00 crc kubenswrapper[4707]: I1127 16:30:00.495478 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29404350-t6ghp" Nov 27 16:30:01 crc kubenswrapper[4707]: I1127 16:30:01.004896 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29404350-t6ghp"] Nov 27 16:30:01 crc kubenswrapper[4707]: I1127 16:30:01.677634 4707 generic.go:334] "Generic (PLEG): container finished" podID="834bfdd5-e6b0-4f86-b805-867827ea250e" containerID="0120f09f12a9a10173c2fabe5c119a649d640afbb06cd8614a7ce6fd9e15315b" exitCode=0 Nov 27 16:30:01 crc kubenswrapper[4707]: I1127 16:30:01.677713 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29404350-t6ghp" event={"ID":"834bfdd5-e6b0-4f86-b805-867827ea250e","Type":"ContainerDied","Data":"0120f09f12a9a10173c2fabe5c119a649d640afbb06cd8614a7ce6fd9e15315b"} Nov 27 16:30:01 crc kubenswrapper[4707]: I1127 16:30:01.678157 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29404350-t6ghp" 
event={"ID":"834bfdd5-e6b0-4f86-b805-867827ea250e","Type":"ContainerStarted","Data":"0b93ab7a4b191c45bcfec2797805f27afe1cc3670230b00873ef3986462ab612"} Nov 27 16:30:03 crc kubenswrapper[4707]: I1127 16:30:03.139409 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29404350-t6ghp" Nov 27 16:30:03 crc kubenswrapper[4707]: I1127 16:30:03.338395 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/834bfdd5-e6b0-4f86-b805-867827ea250e-secret-volume\") pod \"834bfdd5-e6b0-4f86-b805-867827ea250e\" (UID: \"834bfdd5-e6b0-4f86-b805-867827ea250e\") " Nov 27 16:30:03 crc kubenswrapper[4707]: I1127 16:30:03.338481 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/834bfdd5-e6b0-4f86-b805-867827ea250e-config-volume\") pod \"834bfdd5-e6b0-4f86-b805-867827ea250e\" (UID: \"834bfdd5-e6b0-4f86-b805-867827ea250e\") " Nov 27 16:30:03 crc kubenswrapper[4707]: I1127 16:30:03.338504 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lq8n7\" (UniqueName: \"kubernetes.io/projected/834bfdd5-e6b0-4f86-b805-867827ea250e-kube-api-access-lq8n7\") pod \"834bfdd5-e6b0-4f86-b805-867827ea250e\" (UID: \"834bfdd5-e6b0-4f86-b805-867827ea250e\") " Nov 27 16:30:03 crc kubenswrapper[4707]: I1127 16:30:03.338980 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/834bfdd5-e6b0-4f86-b805-867827ea250e-config-volume" (OuterVolumeSpecName: "config-volume") pod "834bfdd5-e6b0-4f86-b805-867827ea250e" (UID: "834bfdd5-e6b0-4f86-b805-867827ea250e"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 16:30:03 crc kubenswrapper[4707]: I1127 16:30:03.339614 4707 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/834bfdd5-e6b0-4f86-b805-867827ea250e-config-volume\") on node \"crc\" DevicePath \"\"" Nov 27 16:30:03 crc kubenswrapper[4707]: I1127 16:30:03.352689 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/834bfdd5-e6b0-4f86-b805-867827ea250e-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "834bfdd5-e6b0-4f86-b805-867827ea250e" (UID: "834bfdd5-e6b0-4f86-b805-867827ea250e"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 16:30:03 crc kubenswrapper[4707]: I1127 16:30:03.352830 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/834bfdd5-e6b0-4f86-b805-867827ea250e-kube-api-access-lq8n7" (OuterVolumeSpecName: "kube-api-access-lq8n7") pod "834bfdd5-e6b0-4f86-b805-867827ea250e" (UID: "834bfdd5-e6b0-4f86-b805-867827ea250e"). InnerVolumeSpecName "kube-api-access-lq8n7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 16:30:03 crc kubenswrapper[4707]: I1127 16:30:03.441262 4707 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/834bfdd5-e6b0-4f86-b805-867827ea250e-secret-volume\") on node \"crc\" DevicePath \"\"" Nov 27 16:30:03 crc kubenswrapper[4707]: I1127 16:30:03.441300 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lq8n7\" (UniqueName: \"kubernetes.io/projected/834bfdd5-e6b0-4f86-b805-867827ea250e-kube-api-access-lq8n7\") on node \"crc\" DevicePath \"\"" Nov 27 16:30:03 crc kubenswrapper[4707]: I1127 16:30:03.623555 4707 patch_prober.go:28] interesting pod/machine-config-daemon-c995m container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 27 16:30:03 crc kubenswrapper[4707]: I1127 16:30:03.623628 4707 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-c995m" podUID="a83beb0d-8dd1-434a-ace2-933f98e3956f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 27 16:30:03 crc kubenswrapper[4707]: I1127 16:30:03.704703 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29404350-t6ghp" event={"ID":"834bfdd5-e6b0-4f86-b805-867827ea250e","Type":"ContainerDied","Data":"0b93ab7a4b191c45bcfec2797805f27afe1cc3670230b00873ef3986462ab612"} Nov 27 16:30:03 crc kubenswrapper[4707]: I1127 16:30:03.704769 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0b93ab7a4b191c45bcfec2797805f27afe1cc3670230b00873ef3986462ab612" Nov 27 16:30:03 crc kubenswrapper[4707]: I1127 16:30:03.704777 4707 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29404350-t6ghp" Nov 27 16:30:09 crc kubenswrapper[4707]: I1127 16:30:09.047893 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-create-7757b"] Nov 27 16:30:09 crc kubenswrapper[4707]: I1127 16:30:09.066763 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-create-7757b"] Nov 27 16:30:09 crc kubenswrapper[4707]: I1127 16:30:09.232358 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="92423d7b-7233-418c-b1f2-5516a6c7c2a3" path="/var/lib/kubelet/pods/92423d7b-7233-418c-b1f2-5516a6c7c2a3/volumes" Nov 27 16:30:10 crc kubenswrapper[4707]: I1127 16:30:10.093397 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-db-create-j98v9"] Nov 27 16:30:10 crc kubenswrapper[4707]: I1127 16:30:10.108295 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-create-h4wdg"] Nov 27 16:30:10 crc kubenswrapper[4707]: I1127 16:30:10.119812 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-6f3e-account-create-update-95dd4"] Nov 27 16:30:10 crc kubenswrapper[4707]: I1127 16:30:10.130668 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-336f-account-create-update-gtvgd"] Nov 27 16:30:10 crc kubenswrapper[4707]: I1127 16:30:10.139672 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-b0e5-account-create-update-k86pp"] Nov 27 16:30:10 crc kubenswrapper[4707]: I1127 16:30:10.149664 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-a71a-account-create-update-tqs56"] Nov 27 16:30:10 crc kubenswrapper[4707]: I1127 16:30:10.157701 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-create-74xmp"] Nov 27 16:30:10 crc kubenswrapper[4707]: I1127 16:30:10.164044 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/cinder-db-create-h4wdg"] Nov 27 16:30:10 crc kubenswrapper[4707]: I1127 16:30:10.171613 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-6f3e-account-create-update-95dd4"] Nov 27 16:30:10 crc kubenswrapper[4707]: I1127 16:30:10.213818 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-336f-account-create-update-gtvgd"] Nov 27 16:30:10 crc kubenswrapper[4707]: I1127 16:30:10.231559 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-db-create-j98v9"] Nov 27 16:30:10 crc kubenswrapper[4707]: I1127 16:30:10.269538 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-a71a-account-create-update-tqs56"] Nov 27 16:30:10 crc kubenswrapper[4707]: I1127 16:30:10.280593 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-create-74xmp"] Nov 27 16:30:10 crc kubenswrapper[4707]: I1127 16:30:10.306266 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-b0e5-account-create-update-k86pp"] Nov 27 16:30:11 crc kubenswrapper[4707]: I1127 16:30:11.207893 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="24172997-039c-47c2-b908-087dec03273f" path="/var/lib/kubelet/pods/24172997-039c-47c2-b908-087dec03273f/volumes" Nov 27 16:30:11 crc kubenswrapper[4707]: I1127 16:30:11.208904 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2d232067-8a9f-4c42-a934-60560ad7d65c" path="/var/lib/kubelet/pods/2d232067-8a9f-4c42-a934-60560ad7d65c/volumes" Nov 27 16:30:11 crc kubenswrapper[4707]: I1127 16:30:11.209581 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5bef91e6-e5d4-4bef-bab3-f3c3239e8cbb" path="/var/lib/kubelet/pods/5bef91e6-e5d4-4bef-bab3-f3c3239e8cbb/volumes" Nov 27 16:30:11 crc kubenswrapper[4707]: I1127 16:30:11.210287 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="82a48402-5e5b-4427-b4f3-d28d7eb0e61e" 
path="/var/lib/kubelet/pods/82a48402-5e5b-4427-b4f3-d28d7eb0e61e/volumes" Nov 27 16:30:11 crc kubenswrapper[4707]: I1127 16:30:11.211592 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a7fcab9a-5390-407b-afe4-753ec3be0120" path="/var/lib/kubelet/pods/a7fcab9a-5390-407b-afe4-753ec3be0120/volumes" Nov 27 16:30:11 crc kubenswrapper[4707]: I1127 16:30:11.212231 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ed7891f1-2643-4a2a-938b-80c0f25dac7c" path="/var/lib/kubelet/pods/ed7891f1-2643-4a2a-938b-80c0f25dac7c/volumes" Nov 27 16:30:11 crc kubenswrapper[4707]: I1127 16:30:11.212878 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f0b458d9-3955-421a-bfa6-30fca174692a" path="/var/lib/kubelet/pods/f0b458d9-3955-421a-bfa6-30fca174692a/volumes" Nov 27 16:30:11 crc kubenswrapper[4707]: I1127 16:30:11.563982 4707 scope.go:117] "RemoveContainer" containerID="5c6ab1ed28c867d7cd8dc1309c3bdd74c8abca787c49cf0f050d4af0806195b9" Nov 27 16:30:11 crc kubenswrapper[4707]: I1127 16:30:11.593904 4707 scope.go:117] "RemoveContainer" containerID="c4c452ab3786d1b744f5004cfd9c13fa3ab7ef5a04611203494ecd7d442544e1" Nov 27 16:30:11 crc kubenswrapper[4707]: I1127 16:30:11.677310 4707 scope.go:117] "RemoveContainer" containerID="ca393106b78acc7aea102559bc24dc04a67e20c03b0fdf075acac63412827afd" Nov 27 16:30:11 crc kubenswrapper[4707]: I1127 16:30:11.737622 4707 scope.go:117] "RemoveContainer" containerID="85b3642d337b8c6304faa955e1fc471d88aecc0647076f13e6822c3bed85ebd8" Nov 27 16:30:11 crc kubenswrapper[4707]: I1127 16:30:11.796019 4707 scope.go:117] "RemoveContainer" containerID="0c77e1e676d1eda2676c31f70f9fa0e22adc040d185d4a1eb3328f049783e768" Nov 27 16:30:11 crc kubenswrapper[4707]: I1127 16:30:11.836357 4707 scope.go:117] "RemoveContainer" containerID="3579e385d9a10937a03ea4bc515cf6b23a2e7a9f2485752294c492450ea8dd58" Nov 27 16:30:11 crc kubenswrapper[4707]: I1127 16:30:11.870577 4707 scope.go:117] "RemoveContainer" 
containerID="9fd33f95f4dd2b21d7cb3472a03f5e3aacded216d66201c5195eac855a4eceea" Nov 27 16:30:11 crc kubenswrapper[4707]: I1127 16:30:11.888796 4707 scope.go:117] "RemoveContainer" containerID="507c61a4ac7e9bb2559135e7ef3b743a40b7c972c7d32f87c7cd7db9c9a5d70b" Nov 27 16:30:11 crc kubenswrapper[4707]: I1127 16:30:11.912118 4707 scope.go:117] "RemoveContainer" containerID="8391a045d787fbef1ee1469d40cf848820d1aee0bcb66251f8664f5d9ff77e8b" Nov 27 16:30:11 crc kubenswrapper[4707]: I1127 16:30:11.931270 4707 scope.go:117] "RemoveContainer" containerID="ebbdaa5f9b26cac2e63eed7ff9d651633ee19f0ea2720c58ef299eba83859bd5" Nov 27 16:30:11 crc kubenswrapper[4707]: I1127 16:30:11.969461 4707 scope.go:117] "RemoveContainer" containerID="01234a10d3b06021338fa24e1eb201a928dace72113bfba0d7e0f34ba6abcb40" Nov 27 16:30:11 crc kubenswrapper[4707]: I1127 16:30:11.997965 4707 scope.go:117] "RemoveContainer" containerID="d1d9eeabeb86b57cf9580ba9008e25b2c8f38628b857cac062a81dc81853676f" Nov 27 16:30:12 crc kubenswrapper[4707]: I1127 16:30:12.028746 4707 scope.go:117] "RemoveContainer" containerID="7f7e1053888c9270e40a5f69e0f5023906503e0c60ed7a7b58ae3cc7e7f935f2" Nov 27 16:30:12 crc kubenswrapper[4707]: I1127 16:30:12.054212 4707 scope.go:117] "RemoveContainer" containerID="02af4020f961083759e82e6bf1c4afc8f5d8931d14de9d0fe97a32a4c76df59d" Nov 27 16:30:16 crc kubenswrapper[4707]: I1127 16:30:16.053043 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-sync-kdv4k"] Nov 27 16:30:16 crc kubenswrapper[4707]: I1127 16:30:16.066988 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-sync-kdv4k"] Nov 27 16:30:17 crc kubenswrapper[4707]: I1127 16:30:17.216242 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09fb5cc3-1c55-459c-aa89-63f13e8f97f4" path="/var/lib/kubelet/pods/09fb5cc3-1c55-459c-aa89-63f13e8f97f4/volumes" Nov 27 16:30:19 crc kubenswrapper[4707]: I1127 16:30:19.044470 4707 kubelet.go:2437] "SyncLoop DELETE" 
source="api" pods=["openstack/keystone-db-sync-ktrrp"] Nov 27 16:30:19 crc kubenswrapper[4707]: I1127 16:30:19.063732 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-sync-ktrrp"] Nov 27 16:30:19 crc kubenswrapper[4707]: I1127 16:30:19.215938 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a6c5bb46-81d2-4478-8a44-42c46ddaaffa" path="/var/lib/kubelet/pods/a6c5bb46-81d2-4478-8a44-42c46ddaaffa/volumes" Nov 27 16:30:33 crc kubenswrapper[4707]: I1127 16:30:33.623413 4707 patch_prober.go:28] interesting pod/machine-config-daemon-c995m container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 27 16:30:33 crc kubenswrapper[4707]: I1127 16:30:33.624071 4707 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-c995m" podUID="a83beb0d-8dd1-434a-ace2-933f98e3956f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 27 16:30:33 crc kubenswrapper[4707]: I1127 16:30:33.624132 4707 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-c995m" Nov 27 16:30:33 crc kubenswrapper[4707]: I1127 16:30:33.625498 4707 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"98accd0e2044449ac7795f23cc0c9b20e1cb9fd7b29855e0dc1d78765b3e2e59"} pod="openshift-machine-config-operator/machine-config-daemon-c995m" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Nov 27 16:30:33 crc kubenswrapper[4707]: I1127 16:30:33.625605 4707 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-machine-config-operator/machine-config-daemon-c995m" podUID="a83beb0d-8dd1-434a-ace2-933f98e3956f" containerName="machine-config-daemon" containerID="cri-o://98accd0e2044449ac7795f23cc0c9b20e1cb9fd7b29855e0dc1d78765b3e2e59" gracePeriod=600 Nov 27 16:30:33 crc kubenswrapper[4707]: E1127 16:30:33.745088 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-c995m_openshift-machine-config-operator(a83beb0d-8dd1-434a-ace2-933f98e3956f)\"" pod="openshift-machine-config-operator/machine-config-daemon-c995m" podUID="a83beb0d-8dd1-434a-ace2-933f98e3956f" Nov 27 16:30:34 crc kubenswrapper[4707]: I1127 16:30:34.020434 4707 generic.go:334] "Generic (PLEG): container finished" podID="a83beb0d-8dd1-434a-ace2-933f98e3956f" containerID="98accd0e2044449ac7795f23cc0c9b20e1cb9fd7b29855e0dc1d78765b3e2e59" exitCode=0 Nov 27 16:30:34 crc kubenswrapper[4707]: I1127 16:30:34.020483 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-c995m" event={"ID":"a83beb0d-8dd1-434a-ace2-933f98e3956f","Type":"ContainerDied","Data":"98accd0e2044449ac7795f23cc0c9b20e1cb9fd7b29855e0dc1d78765b3e2e59"} Nov 27 16:30:34 crc kubenswrapper[4707]: I1127 16:30:34.020520 4707 scope.go:117] "RemoveContainer" containerID="6c7e6d0aacc40003f1cf38f6710c940eb56f156be22a726feeaa22ebce682c5d" Nov 27 16:30:34 crc kubenswrapper[4707]: I1127 16:30:34.021189 4707 scope.go:117] "RemoveContainer" containerID="98accd0e2044449ac7795f23cc0c9b20e1cb9fd7b29855e0dc1d78765b3e2e59" Nov 27 16:30:34 crc kubenswrapper[4707]: E1127 16:30:34.021517 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-c995m_openshift-machine-config-operator(a83beb0d-8dd1-434a-ace2-933f98e3956f)\"" pod="openshift-machine-config-operator/machine-config-daemon-c995m" podUID="a83beb0d-8dd1-434a-ace2-933f98e3956f" Nov 27 16:30:35 crc kubenswrapper[4707]: I1127 16:30:35.034773 4707 generic.go:334] "Generic (PLEG): container finished" podID="540a037b-eddb-4f11-8ed0-209cebfc0ee1" containerID="60e193ade138067a07fda6ddebb4c390e5399d923e1693621a2a087d60d45d7e" exitCode=0 Nov 27 16:30:35 crc kubenswrapper[4707]: I1127 16:30:35.034846 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-mwkpl" event={"ID":"540a037b-eddb-4f11-8ed0-209cebfc0ee1","Type":"ContainerDied","Data":"60e193ade138067a07fda6ddebb4c390e5399d923e1693621a2a087d60d45d7e"} Nov 27 16:30:36 crc kubenswrapper[4707]: I1127 16:30:36.606210 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-mwkpl" Nov 27 16:30:36 crc kubenswrapper[4707]: I1127 16:30:36.744767 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/540a037b-eddb-4f11-8ed0-209cebfc0ee1-inventory\") pod \"540a037b-eddb-4f11-8ed0-209cebfc0ee1\" (UID: \"540a037b-eddb-4f11-8ed0-209cebfc0ee1\") " Nov 27 16:30:36 crc kubenswrapper[4707]: I1127 16:30:36.744897 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/540a037b-eddb-4f11-8ed0-209cebfc0ee1-ssh-key\") pod \"540a037b-eddb-4f11-8ed0-209cebfc0ee1\" (UID: \"540a037b-eddb-4f11-8ed0-209cebfc0ee1\") " Nov 27 16:30:36 crc kubenswrapper[4707]: I1127 16:30:36.745702 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-npps7\" (UniqueName: \"kubernetes.io/projected/540a037b-eddb-4f11-8ed0-209cebfc0ee1-kube-api-access-npps7\") pod 
\"540a037b-eddb-4f11-8ed0-209cebfc0ee1\" (UID: \"540a037b-eddb-4f11-8ed0-209cebfc0ee1\") " Nov 27 16:30:36 crc kubenswrapper[4707]: I1127 16:30:36.760609 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/540a037b-eddb-4f11-8ed0-209cebfc0ee1-kube-api-access-npps7" (OuterVolumeSpecName: "kube-api-access-npps7") pod "540a037b-eddb-4f11-8ed0-209cebfc0ee1" (UID: "540a037b-eddb-4f11-8ed0-209cebfc0ee1"). InnerVolumeSpecName "kube-api-access-npps7". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 16:30:36 crc kubenswrapper[4707]: I1127 16:30:36.788086 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/540a037b-eddb-4f11-8ed0-209cebfc0ee1-inventory" (OuterVolumeSpecName: "inventory") pod "540a037b-eddb-4f11-8ed0-209cebfc0ee1" (UID: "540a037b-eddb-4f11-8ed0-209cebfc0ee1"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 16:30:36 crc kubenswrapper[4707]: I1127 16:30:36.799551 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/540a037b-eddb-4f11-8ed0-209cebfc0ee1-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "540a037b-eddb-4f11-8ed0-209cebfc0ee1" (UID: "540a037b-eddb-4f11-8ed0-209cebfc0ee1"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 16:30:36 crc kubenswrapper[4707]: I1127 16:30:36.847449 4707 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/540a037b-eddb-4f11-8ed0-209cebfc0ee1-inventory\") on node \"crc\" DevicePath \"\"" Nov 27 16:30:36 crc kubenswrapper[4707]: I1127 16:30:36.847492 4707 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/540a037b-eddb-4f11-8ed0-209cebfc0ee1-ssh-key\") on node \"crc\" DevicePath \"\"" Nov 27 16:30:36 crc kubenswrapper[4707]: I1127 16:30:36.847508 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-npps7\" (UniqueName: \"kubernetes.io/projected/540a037b-eddb-4f11-8ed0-209cebfc0ee1-kube-api-access-npps7\") on node \"crc\" DevicePath \"\"" Nov 27 16:30:37 crc kubenswrapper[4707]: I1127 16:30:37.057842 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-mwkpl" event={"ID":"540a037b-eddb-4f11-8ed0-209cebfc0ee1","Type":"ContainerDied","Data":"a0dda95b299d715e595a4c5ffaf58077c1be16b02954e376ce4ddca7efa105f1"} Nov 27 16:30:37 crc kubenswrapper[4707]: I1127 16:30:37.057884 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a0dda95b299d715e595a4c5ffaf58077c1be16b02954e376ce4ddca7efa105f1" Nov 27 16:30:37 crc kubenswrapper[4707]: I1127 16:30:37.057945 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-mwkpl" Nov 27 16:30:37 crc kubenswrapper[4707]: I1127 16:30:37.158885 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-4b6vs"] Nov 27 16:30:37 crc kubenswrapper[4707]: E1127 16:30:37.159917 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="834bfdd5-e6b0-4f86-b805-867827ea250e" containerName="collect-profiles" Nov 27 16:30:37 crc kubenswrapper[4707]: I1127 16:30:37.160035 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="834bfdd5-e6b0-4f86-b805-867827ea250e" containerName="collect-profiles" Nov 27 16:30:37 crc kubenswrapper[4707]: E1127 16:30:37.160147 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="540a037b-eddb-4f11-8ed0-209cebfc0ee1" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Nov 27 16:30:37 crc kubenswrapper[4707]: I1127 16:30:37.160242 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="540a037b-eddb-4f11-8ed0-209cebfc0ee1" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Nov 27 16:30:37 crc kubenswrapper[4707]: I1127 16:30:37.160598 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="540a037b-eddb-4f11-8ed0-209cebfc0ee1" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Nov 27 16:30:37 crc kubenswrapper[4707]: I1127 16:30:37.160716 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="834bfdd5-e6b0-4f86-b805-867827ea250e" containerName="collect-profiles" Nov 27 16:30:37 crc kubenswrapper[4707]: I1127 16:30:37.161652 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-4b6vs" Nov 27 16:30:37 crc kubenswrapper[4707]: I1127 16:30:37.163705 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Nov 27 16:30:37 crc kubenswrapper[4707]: I1127 16:30:37.164694 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Nov 27 16:30:37 crc kubenswrapper[4707]: I1127 16:30:37.164961 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Nov 27 16:30:37 crc kubenswrapper[4707]: I1127 16:30:37.167653 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-h79s2" Nov 27 16:30:37 crc kubenswrapper[4707]: I1127 16:30:37.167704 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-4b6vs"] Nov 27 16:30:37 crc kubenswrapper[4707]: I1127 16:30:37.256175 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6gw5l\" (UniqueName: \"kubernetes.io/projected/86c589a6-19e5-48cc-8db8-42af5ae0f078-kube-api-access-6gw5l\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-4b6vs\" (UID: \"86c589a6-19e5-48cc-8db8-42af5ae0f078\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-4b6vs" Nov 27 16:30:37 crc kubenswrapper[4707]: I1127 16:30:37.256244 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/86c589a6-19e5-48cc-8db8-42af5ae0f078-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-4b6vs\" (UID: \"86c589a6-19e5-48cc-8db8-42af5ae0f078\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-4b6vs" Nov 27 16:30:37 crc kubenswrapper[4707]: 
I1127 16:30:37.256296 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/86c589a6-19e5-48cc-8db8-42af5ae0f078-ssh-key\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-4b6vs\" (UID: \"86c589a6-19e5-48cc-8db8-42af5ae0f078\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-4b6vs" Nov 27 16:30:37 crc kubenswrapper[4707]: I1127 16:30:37.358036 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6gw5l\" (UniqueName: \"kubernetes.io/projected/86c589a6-19e5-48cc-8db8-42af5ae0f078-kube-api-access-6gw5l\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-4b6vs\" (UID: \"86c589a6-19e5-48cc-8db8-42af5ae0f078\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-4b6vs" Nov 27 16:30:37 crc kubenswrapper[4707]: I1127 16:30:37.358689 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/86c589a6-19e5-48cc-8db8-42af5ae0f078-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-4b6vs\" (UID: \"86c589a6-19e5-48cc-8db8-42af5ae0f078\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-4b6vs" Nov 27 16:30:37 crc kubenswrapper[4707]: I1127 16:30:37.358915 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/86c589a6-19e5-48cc-8db8-42af5ae0f078-ssh-key\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-4b6vs\" (UID: \"86c589a6-19e5-48cc-8db8-42af5ae0f078\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-4b6vs" Nov 27 16:30:37 crc kubenswrapper[4707]: I1127 16:30:37.364733 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/86c589a6-19e5-48cc-8db8-42af5ae0f078-inventory\") pod 
\"configure-network-edpm-deployment-openstack-edpm-ipam-4b6vs\" (UID: \"86c589a6-19e5-48cc-8db8-42af5ae0f078\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-4b6vs" Nov 27 16:30:37 crc kubenswrapper[4707]: I1127 16:30:37.364949 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/86c589a6-19e5-48cc-8db8-42af5ae0f078-ssh-key\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-4b6vs\" (UID: \"86c589a6-19e5-48cc-8db8-42af5ae0f078\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-4b6vs" Nov 27 16:30:37 crc kubenswrapper[4707]: I1127 16:30:37.387162 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6gw5l\" (UniqueName: \"kubernetes.io/projected/86c589a6-19e5-48cc-8db8-42af5ae0f078-kube-api-access-6gw5l\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-4b6vs\" (UID: \"86c589a6-19e5-48cc-8db8-42af5ae0f078\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-4b6vs" Nov 27 16:30:37 crc kubenswrapper[4707]: I1127 16:30:37.489545 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-4b6vs" Nov 27 16:30:38 crc kubenswrapper[4707]: I1127 16:30:38.167210 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-4b6vs"] Nov 27 16:30:38 crc kubenswrapper[4707]: I1127 16:30:38.172521 4707 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Nov 27 16:30:39 crc kubenswrapper[4707]: I1127 16:30:39.079664 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-4b6vs" event={"ID":"86c589a6-19e5-48cc-8db8-42af5ae0f078","Type":"ContainerStarted","Data":"c7a06a1be2e990892a937af57c69e537f274257613a967627a7d1f2127560b16"} Nov 27 16:30:40 crc kubenswrapper[4707]: I1127 16:30:40.093937 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-4b6vs" event={"ID":"86c589a6-19e5-48cc-8db8-42af5ae0f078","Type":"ContainerStarted","Data":"3aebad07869fea6d53f94450a3ab39494a24f52ddf279465442591074b413a66"} Nov 27 16:30:40 crc kubenswrapper[4707]: I1127 16:30:40.112170 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-4b6vs" podStartSLOduration=2.373599742 podStartE2EDuration="3.112153792s" podCreationTimestamp="2025-11-27 16:30:37 +0000 UTC" firstStartedPulling="2025-11-27 16:30:38.17201611 +0000 UTC m=+1613.803464898" lastFinishedPulling="2025-11-27 16:30:38.91057018 +0000 UTC m=+1614.542018948" observedRunningTime="2025-11-27 16:30:40.112088021 +0000 UTC m=+1615.743536829" watchObservedRunningTime="2025-11-27 16:30:40.112153792 +0000 UTC m=+1615.743602560" Nov 27 16:30:47 crc kubenswrapper[4707]: I1127 16:30:47.195743 4707 scope.go:117] "RemoveContainer" containerID="98accd0e2044449ac7795f23cc0c9b20e1cb9fd7b29855e0dc1d78765b3e2e59" Nov 27 
16:30:47 crc kubenswrapper[4707]: E1127 16:30:47.196809 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-c995m_openshift-machine-config-operator(a83beb0d-8dd1-434a-ace2-933f98e3956f)\"" pod="openshift-machine-config-operator/machine-config-daemon-c995m" podUID="a83beb0d-8dd1-434a-ace2-933f98e3956f" Nov 27 16:30:51 crc kubenswrapper[4707]: I1127 16:30:51.071463 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-sync-ftcdx"] Nov 27 16:30:51 crc kubenswrapper[4707]: I1127 16:30:51.096621 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-sync-ftcdx"] Nov 27 16:30:51 crc kubenswrapper[4707]: I1127 16:30:51.207607 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ffe6108f-9182-4d5b-b877-977da419fc7c" path="/var/lib/kubelet/pods/ffe6108f-9182-4d5b-b877-977da419fc7c/volumes" Nov 27 16:30:57 crc kubenswrapper[4707]: I1127 16:30:57.046782 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-sync-qzltn"] Nov 27 16:30:57 crc kubenswrapper[4707]: I1127 16:30:57.060269 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-fh47h"] Nov 27 16:30:57 crc kubenswrapper[4707]: I1127 16:30:57.075966 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-fh47h"] Nov 27 16:30:57 crc kubenswrapper[4707]: I1127 16:30:57.086351 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-sync-qzltn"] Nov 27 16:30:57 crc kubenswrapper[4707]: I1127 16:30:57.210857 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4db357ae-5dd8-47c3-8f13-fd888df4fd42" path="/var/lib/kubelet/pods/4db357ae-5dd8-47c3-8f13-fd888df4fd42/volumes" Nov 27 16:30:57 crc kubenswrapper[4707]: I1127 16:30:57.211431 
4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="525c317f-60e5-4359-bdd4-62caf9f54b38" path="/var/lib/kubelet/pods/525c317f-60e5-4359-bdd4-62caf9f54b38/volumes" Nov 27 16:31:01 crc kubenswrapper[4707]: I1127 16:31:01.195061 4707 scope.go:117] "RemoveContainer" containerID="98accd0e2044449ac7795f23cc0c9b20e1cb9fd7b29855e0dc1d78765b3e2e59" Nov 27 16:31:01 crc kubenswrapper[4707]: E1127 16:31:01.195905 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-c995m_openshift-machine-config-operator(a83beb0d-8dd1-434a-ace2-933f98e3956f)\"" pod="openshift-machine-config-operator/machine-config-daemon-c995m" podUID="a83beb0d-8dd1-434a-ace2-933f98e3956f" Nov 27 16:31:03 crc kubenswrapper[4707]: I1127 16:31:03.823150 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-rzxx2"] Nov 27 16:31:03 crc kubenswrapper[4707]: I1127 16:31:03.868812 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-rzxx2" Nov 27 16:31:03 crc kubenswrapper[4707]: I1127 16:31:03.901728 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-rzxx2"] Nov 27 16:31:03 crc kubenswrapper[4707]: I1127 16:31:03.910400 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lgctk\" (UniqueName: \"kubernetes.io/projected/0217b698-ad65-433c-8895-a21db1b6ca3d-kube-api-access-lgctk\") pod \"redhat-operators-rzxx2\" (UID: \"0217b698-ad65-433c-8895-a21db1b6ca3d\") " pod="openshift-marketplace/redhat-operators-rzxx2" Nov 27 16:31:03 crc kubenswrapper[4707]: I1127 16:31:03.910540 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0217b698-ad65-433c-8895-a21db1b6ca3d-utilities\") pod \"redhat-operators-rzxx2\" (UID: \"0217b698-ad65-433c-8895-a21db1b6ca3d\") " pod="openshift-marketplace/redhat-operators-rzxx2" Nov 27 16:31:03 crc kubenswrapper[4707]: I1127 16:31:03.910642 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0217b698-ad65-433c-8895-a21db1b6ca3d-catalog-content\") pod \"redhat-operators-rzxx2\" (UID: \"0217b698-ad65-433c-8895-a21db1b6ca3d\") " pod="openshift-marketplace/redhat-operators-rzxx2" Nov 27 16:31:04 crc kubenswrapper[4707]: I1127 16:31:04.015533 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0217b698-ad65-433c-8895-a21db1b6ca3d-catalog-content\") pod \"redhat-operators-rzxx2\" (UID: \"0217b698-ad65-433c-8895-a21db1b6ca3d\") " pod="openshift-marketplace/redhat-operators-rzxx2" Nov 27 16:31:04 crc kubenswrapper[4707]: I1127 16:31:04.015730 4707 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"kube-api-access-lgctk\" (UniqueName: \"kubernetes.io/projected/0217b698-ad65-433c-8895-a21db1b6ca3d-kube-api-access-lgctk\") pod \"redhat-operators-rzxx2\" (UID: \"0217b698-ad65-433c-8895-a21db1b6ca3d\") " pod="openshift-marketplace/redhat-operators-rzxx2" Nov 27 16:31:04 crc kubenswrapper[4707]: I1127 16:31:04.015808 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0217b698-ad65-433c-8895-a21db1b6ca3d-utilities\") pod \"redhat-operators-rzxx2\" (UID: \"0217b698-ad65-433c-8895-a21db1b6ca3d\") " pod="openshift-marketplace/redhat-operators-rzxx2" Nov 27 16:31:04 crc kubenswrapper[4707]: I1127 16:31:04.016195 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0217b698-ad65-433c-8895-a21db1b6ca3d-catalog-content\") pod \"redhat-operators-rzxx2\" (UID: \"0217b698-ad65-433c-8895-a21db1b6ca3d\") " pod="openshift-marketplace/redhat-operators-rzxx2" Nov 27 16:31:04 crc kubenswrapper[4707]: I1127 16:31:04.016414 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0217b698-ad65-433c-8895-a21db1b6ca3d-utilities\") pod \"redhat-operators-rzxx2\" (UID: \"0217b698-ad65-433c-8895-a21db1b6ca3d\") " pod="openshift-marketplace/redhat-operators-rzxx2" Nov 27 16:31:04 crc kubenswrapper[4707]: I1127 16:31:04.039236 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lgctk\" (UniqueName: \"kubernetes.io/projected/0217b698-ad65-433c-8895-a21db1b6ca3d-kube-api-access-lgctk\") pod \"redhat-operators-rzxx2\" (UID: \"0217b698-ad65-433c-8895-a21db1b6ca3d\") " pod="openshift-marketplace/redhat-operators-rzxx2" Nov 27 16:31:04 crc kubenswrapper[4707]: I1127 16:31:04.198969 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-rzxx2" Nov 27 16:31:05 crc kubenswrapper[4707]: I1127 16:31:05.079157 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-rzxx2"] Nov 27 16:31:05 crc kubenswrapper[4707]: I1127 16:31:05.371642 4707 generic.go:334] "Generic (PLEG): container finished" podID="0217b698-ad65-433c-8895-a21db1b6ca3d" containerID="c9fe5a1dabe623babb13e75273c6455802450277783fd1b7ca5d18fb54f77d7f" exitCode=0 Nov 27 16:31:05 crc kubenswrapper[4707]: I1127 16:31:05.371900 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rzxx2" event={"ID":"0217b698-ad65-433c-8895-a21db1b6ca3d","Type":"ContainerDied","Data":"c9fe5a1dabe623babb13e75273c6455802450277783fd1b7ca5d18fb54f77d7f"} Nov 27 16:31:05 crc kubenswrapper[4707]: I1127 16:31:05.371973 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rzxx2" event={"ID":"0217b698-ad65-433c-8895-a21db1b6ca3d","Type":"ContainerStarted","Data":"4da2fd4f4b00d9872d65689a0fd64a13af002b7da09c5acaa402c625b08fd013"} Nov 27 16:31:06 crc kubenswrapper[4707]: I1127 16:31:06.385486 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rzxx2" event={"ID":"0217b698-ad65-433c-8895-a21db1b6ca3d","Type":"ContainerStarted","Data":"e4cc2e93029ccdb610fb4dbc40be97459927bdc1854162ad6b7382823557f67b"} Nov 27 16:31:07 crc kubenswrapper[4707]: I1127 16:31:07.049196 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-sync-hmbzj"] Nov 27 16:31:07 crc kubenswrapper[4707]: I1127 16:31:07.065225 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-sync-hmbzj"] Nov 27 16:31:07 crc kubenswrapper[4707]: I1127 16:31:07.210126 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6661447e-fc28-4c9a-bcd1-66e15b0ca3fd" 
path="/var/lib/kubelet/pods/6661447e-fc28-4c9a-bcd1-66e15b0ca3fd/volumes" Nov 27 16:31:08 crc kubenswrapper[4707]: I1127 16:31:08.413873 4707 generic.go:334] "Generic (PLEG): container finished" podID="0217b698-ad65-433c-8895-a21db1b6ca3d" containerID="e4cc2e93029ccdb610fb4dbc40be97459927bdc1854162ad6b7382823557f67b" exitCode=0 Nov 27 16:31:08 crc kubenswrapper[4707]: I1127 16:31:08.413964 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rzxx2" event={"ID":"0217b698-ad65-433c-8895-a21db1b6ca3d","Type":"ContainerDied","Data":"e4cc2e93029ccdb610fb4dbc40be97459927bdc1854162ad6b7382823557f67b"} Nov 27 16:31:10 crc kubenswrapper[4707]: I1127 16:31:10.443136 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rzxx2" event={"ID":"0217b698-ad65-433c-8895-a21db1b6ca3d","Type":"ContainerStarted","Data":"fd35ea4b0f6c5a07f2c9bf469cd37bb5dc7c87adef2eff05a43fd52c6111669c"} Nov 27 16:31:10 crc kubenswrapper[4707]: I1127 16:31:10.473018 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-rzxx2" podStartSLOduration=3.3906293290000002 podStartE2EDuration="7.472990546s" podCreationTimestamp="2025-11-27 16:31:03 +0000 UTC" firstStartedPulling="2025-11-27 16:31:05.374165442 +0000 UTC m=+1641.005614210" lastFinishedPulling="2025-11-27 16:31:09.456526659 +0000 UTC m=+1645.087975427" observedRunningTime="2025-11-27 16:31:10.465242736 +0000 UTC m=+1646.096691504" watchObservedRunningTime="2025-11-27 16:31:10.472990546 +0000 UTC m=+1646.104439354" Nov 27 16:31:11 crc kubenswrapper[4707]: I1127 16:31:11.716351 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-cknzd"] Nov 27 16:31:11 crc kubenswrapper[4707]: I1127 16:31:11.719008 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-cknzd" Nov 27 16:31:11 crc kubenswrapper[4707]: I1127 16:31:11.755526 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-cknzd"] Nov 27 16:31:11 crc kubenswrapper[4707]: I1127 16:31:11.888422 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8w9ll\" (UniqueName: \"kubernetes.io/projected/78d0125e-40cb-4838-a127-d5626de4eff4-kube-api-access-8w9ll\") pod \"redhat-marketplace-cknzd\" (UID: \"78d0125e-40cb-4838-a127-d5626de4eff4\") " pod="openshift-marketplace/redhat-marketplace-cknzd" Nov 27 16:31:11 crc kubenswrapper[4707]: I1127 16:31:11.888506 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/78d0125e-40cb-4838-a127-d5626de4eff4-utilities\") pod \"redhat-marketplace-cknzd\" (UID: \"78d0125e-40cb-4838-a127-d5626de4eff4\") " pod="openshift-marketplace/redhat-marketplace-cknzd" Nov 27 16:31:11 crc kubenswrapper[4707]: I1127 16:31:11.888636 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/78d0125e-40cb-4838-a127-d5626de4eff4-catalog-content\") pod \"redhat-marketplace-cknzd\" (UID: \"78d0125e-40cb-4838-a127-d5626de4eff4\") " pod="openshift-marketplace/redhat-marketplace-cknzd" Nov 27 16:31:11 crc kubenswrapper[4707]: I1127 16:31:11.990973 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8w9ll\" (UniqueName: \"kubernetes.io/projected/78d0125e-40cb-4838-a127-d5626de4eff4-kube-api-access-8w9ll\") pod \"redhat-marketplace-cknzd\" (UID: \"78d0125e-40cb-4838-a127-d5626de4eff4\") " pod="openshift-marketplace/redhat-marketplace-cknzd" Nov 27 16:31:11 crc kubenswrapper[4707]: I1127 16:31:11.991092 4707 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/78d0125e-40cb-4838-a127-d5626de4eff4-utilities\") pod \"redhat-marketplace-cknzd\" (UID: \"78d0125e-40cb-4838-a127-d5626de4eff4\") " pod="openshift-marketplace/redhat-marketplace-cknzd" Nov 27 16:31:11 crc kubenswrapper[4707]: I1127 16:31:11.991270 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/78d0125e-40cb-4838-a127-d5626de4eff4-catalog-content\") pod \"redhat-marketplace-cknzd\" (UID: \"78d0125e-40cb-4838-a127-d5626de4eff4\") " pod="openshift-marketplace/redhat-marketplace-cknzd" Nov 27 16:31:11 crc kubenswrapper[4707]: I1127 16:31:11.991718 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/78d0125e-40cb-4838-a127-d5626de4eff4-utilities\") pod \"redhat-marketplace-cknzd\" (UID: \"78d0125e-40cb-4838-a127-d5626de4eff4\") " pod="openshift-marketplace/redhat-marketplace-cknzd" Nov 27 16:31:11 crc kubenswrapper[4707]: I1127 16:31:11.991783 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/78d0125e-40cb-4838-a127-d5626de4eff4-catalog-content\") pod \"redhat-marketplace-cknzd\" (UID: \"78d0125e-40cb-4838-a127-d5626de4eff4\") " pod="openshift-marketplace/redhat-marketplace-cknzd" Nov 27 16:31:12 crc kubenswrapper[4707]: I1127 16:31:12.023061 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8w9ll\" (UniqueName: \"kubernetes.io/projected/78d0125e-40cb-4838-a127-d5626de4eff4-kube-api-access-8w9ll\") pod \"redhat-marketplace-cknzd\" (UID: \"78d0125e-40cb-4838-a127-d5626de4eff4\") " pod="openshift-marketplace/redhat-marketplace-cknzd" Nov 27 16:31:12 crc kubenswrapper[4707]: I1127 16:31:12.052703 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-cknzd" Nov 27 16:31:12 crc kubenswrapper[4707]: I1127 16:31:12.196947 4707 scope.go:117] "RemoveContainer" containerID="98accd0e2044449ac7795f23cc0c9b20e1cb9fd7b29855e0dc1d78765b3e2e59" Nov 27 16:31:12 crc kubenswrapper[4707]: E1127 16:31:12.197700 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-c995m_openshift-machine-config-operator(a83beb0d-8dd1-434a-ace2-933f98e3956f)\"" pod="openshift-machine-config-operator/machine-config-daemon-c995m" podUID="a83beb0d-8dd1-434a-ace2-933f98e3956f" Nov 27 16:31:12 crc kubenswrapper[4707]: I1127 16:31:12.354251 4707 scope.go:117] "RemoveContainer" containerID="4cebbc21162ddd05ea2b220c368d5661cfb0b3e37d8da5c199a0c260c2bd13f1" Nov 27 16:31:12 crc kubenswrapper[4707]: I1127 16:31:12.390094 4707 scope.go:117] "RemoveContainer" containerID="60a448fb7d203b6dd28abfaf4bdcaa8d82a224071ef2206cfa3fc5917f6c6ebf" Nov 27 16:31:12 crc kubenswrapper[4707]: I1127 16:31:12.442754 4707 scope.go:117] "RemoveContainer" containerID="4abe399b9e6ebf2086362330961b71743ba3799fce812e482717eaa3ffa4e1b2" Nov 27 16:31:12 crc kubenswrapper[4707]: I1127 16:31:12.472698 4707 scope.go:117] "RemoveContainer" containerID="672c7b06415373aac80def8df626058032b536c97c6101380abddd2763491cbe" Nov 27 16:31:12 crc kubenswrapper[4707]: I1127 16:31:12.493499 4707 scope.go:117] "RemoveContainer" containerID="550123f8283cb9092f7db11db6ceecbd9443975883983e2c8b5d913c7e618bd1" Nov 27 16:31:12 crc kubenswrapper[4707]: I1127 16:31:12.556426 4707 scope.go:117] "RemoveContainer" containerID="4ee16c740caac328bd7fff69ecf2e731609e164c15d991981d2a81d2b9d40c1c" Nov 27 16:31:12 crc kubenswrapper[4707]: I1127 16:31:12.577186 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-cknzd"] Nov 
27 16:31:12 crc kubenswrapper[4707]: W1127 16:31:12.589235 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod78d0125e_40cb_4838_a127_d5626de4eff4.slice/crio-bb042675eb150a9c353b13468f4ce2520f5d88c09041b229db7a2656f483fd5b WatchSource:0}: Error finding container bb042675eb150a9c353b13468f4ce2520f5d88c09041b229db7a2656f483fd5b: Status 404 returned error can't find the container with id bb042675eb150a9c353b13468f4ce2520f5d88c09041b229db7a2656f483fd5b Nov 27 16:31:13 crc kubenswrapper[4707]: I1127 16:31:13.067215 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-db-sync-5k4zv"] Nov 27 16:31:13 crc kubenswrapper[4707]: I1127 16:31:13.081985 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-db-sync-5k4zv"] Nov 27 16:31:13 crc kubenswrapper[4707]: I1127 16:31:13.220747 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1c445b1d-7e63-48cc-83ee-c4841074701c" path="/var/lib/kubelet/pods/1c445b1d-7e63-48cc-83ee-c4841074701c/volumes" Nov 27 16:31:13 crc kubenswrapper[4707]: I1127 16:31:13.506068 4707 generic.go:334] "Generic (PLEG): container finished" podID="78d0125e-40cb-4838-a127-d5626de4eff4" containerID="a3cf66a6f3ef65690c8f7ab24b6b2ae69b34be7fc9e54f587bcc192e5fa329f4" exitCode=0 Nov 27 16:31:13 crc kubenswrapper[4707]: I1127 16:31:13.506113 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-cknzd" event={"ID":"78d0125e-40cb-4838-a127-d5626de4eff4","Type":"ContainerDied","Data":"a3cf66a6f3ef65690c8f7ab24b6b2ae69b34be7fc9e54f587bcc192e5fa329f4"} Nov 27 16:31:13 crc kubenswrapper[4707]: I1127 16:31:13.506136 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-cknzd" event={"ID":"78d0125e-40cb-4838-a127-d5626de4eff4","Type":"ContainerStarted","Data":"bb042675eb150a9c353b13468f4ce2520f5d88c09041b229db7a2656f483fd5b"} Nov 27 
16:31:14 crc kubenswrapper[4707]: I1127 16:31:14.037038 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-sync-gb8rd"] Nov 27 16:31:14 crc kubenswrapper[4707]: I1127 16:31:14.053363 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-sync-gb8rd"] Nov 27 16:31:14 crc kubenswrapper[4707]: I1127 16:31:14.200266 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-rzxx2" Nov 27 16:31:14 crc kubenswrapper[4707]: I1127 16:31:14.200731 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-rzxx2" Nov 27 16:31:15 crc kubenswrapper[4707]: I1127 16:31:15.216342 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1b8ce5ef-c2af-4e15-9677-8e878b96c4de" path="/var/lib/kubelet/pods/1b8ce5ef-c2af-4e15-9677-8e878b96c4de/volumes" Nov 27 16:31:15 crc kubenswrapper[4707]: I1127 16:31:15.254909 4707 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-rzxx2" podUID="0217b698-ad65-433c-8895-a21db1b6ca3d" containerName="registry-server" probeResult="failure" output=< Nov 27 16:31:15 crc kubenswrapper[4707]: timeout: failed to connect service ":50051" within 1s Nov 27 16:31:15 crc kubenswrapper[4707]: > Nov 27 16:31:15 crc kubenswrapper[4707]: I1127 16:31:15.533940 4707 generic.go:334] "Generic (PLEG): container finished" podID="78d0125e-40cb-4838-a127-d5626de4eff4" containerID="a80a756fff326e11994c7843aa1e86bce282c65ef239a4987879352f046076e6" exitCode=0 Nov 27 16:31:15 crc kubenswrapper[4707]: I1127 16:31:15.534068 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-cknzd" event={"ID":"78d0125e-40cb-4838-a127-d5626de4eff4","Type":"ContainerDied","Data":"a80a756fff326e11994c7843aa1e86bce282c65ef239a4987879352f046076e6"} Nov 27 16:31:16 crc kubenswrapper[4707]: I1127 16:31:16.548632 4707 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-cknzd" event={"ID":"78d0125e-40cb-4838-a127-d5626de4eff4","Type":"ContainerStarted","Data":"5cb552fd14c85c0ff4d478147478c3a63e01155a206c92292b101ab6a1f795cd"} Nov 27 16:31:16 crc kubenswrapper[4707]: I1127 16:31:16.568964 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-cknzd" podStartSLOduration=3.104113799 podStartE2EDuration="5.568942221s" podCreationTimestamp="2025-11-27 16:31:11 +0000 UTC" firstStartedPulling="2025-11-27 16:31:13.507720995 +0000 UTC m=+1649.139169773" lastFinishedPulling="2025-11-27 16:31:15.972549417 +0000 UTC m=+1651.603998195" observedRunningTime="2025-11-27 16:31:16.567348362 +0000 UTC m=+1652.198797140" watchObservedRunningTime="2025-11-27 16:31:16.568942221 +0000 UTC m=+1652.200390999" Nov 27 16:31:22 crc kubenswrapper[4707]: I1127 16:31:22.053193 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-cknzd" Nov 27 16:31:22 crc kubenswrapper[4707]: I1127 16:31:22.053685 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-cknzd" Nov 27 16:31:22 crc kubenswrapper[4707]: I1127 16:31:22.103909 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-cknzd" Nov 27 16:31:22 crc kubenswrapper[4707]: I1127 16:31:22.690467 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-cknzd" Nov 27 16:31:22 crc kubenswrapper[4707]: I1127 16:31:22.757291 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-cknzd"] Nov 27 16:31:24 crc kubenswrapper[4707]: I1127 16:31:24.293878 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-marketplace/redhat-operators-rzxx2" Nov 27 16:31:24 crc kubenswrapper[4707]: I1127 16:31:24.380456 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-rzxx2" Nov 27 16:31:24 crc kubenswrapper[4707]: I1127 16:31:24.640385 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-cknzd" podUID="78d0125e-40cb-4838-a127-d5626de4eff4" containerName="registry-server" containerID="cri-o://5cb552fd14c85c0ff4d478147478c3a63e01155a206c92292b101ab6a1f795cd" gracePeriod=2 Nov 27 16:31:24 crc kubenswrapper[4707]: I1127 16:31:24.754681 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-rzxx2"] Nov 27 16:31:25 crc kubenswrapper[4707]: I1127 16:31:25.126560 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-cknzd" Nov 27 16:31:25 crc kubenswrapper[4707]: I1127 16:31:25.192347 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8w9ll\" (UniqueName: \"kubernetes.io/projected/78d0125e-40cb-4838-a127-d5626de4eff4-kube-api-access-8w9ll\") pod \"78d0125e-40cb-4838-a127-d5626de4eff4\" (UID: \"78d0125e-40cb-4838-a127-d5626de4eff4\") " Nov 27 16:31:25 crc kubenswrapper[4707]: I1127 16:31:25.192443 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/78d0125e-40cb-4838-a127-d5626de4eff4-catalog-content\") pod \"78d0125e-40cb-4838-a127-d5626de4eff4\" (UID: \"78d0125e-40cb-4838-a127-d5626de4eff4\") " Nov 27 16:31:25 crc kubenswrapper[4707]: I1127 16:31:25.192467 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/78d0125e-40cb-4838-a127-d5626de4eff4-utilities\") pod \"78d0125e-40cb-4838-a127-d5626de4eff4\" 
(UID: \"78d0125e-40cb-4838-a127-d5626de4eff4\") " Nov 27 16:31:25 crc kubenswrapper[4707]: I1127 16:31:25.193239 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/78d0125e-40cb-4838-a127-d5626de4eff4-utilities" (OuterVolumeSpecName: "utilities") pod "78d0125e-40cb-4838-a127-d5626de4eff4" (UID: "78d0125e-40cb-4838-a127-d5626de4eff4"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 27 16:31:25 crc kubenswrapper[4707]: I1127 16:31:25.200874 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/78d0125e-40cb-4838-a127-d5626de4eff4-kube-api-access-8w9ll" (OuterVolumeSpecName: "kube-api-access-8w9ll") pod "78d0125e-40cb-4838-a127-d5626de4eff4" (UID: "78d0125e-40cb-4838-a127-d5626de4eff4"). InnerVolumeSpecName "kube-api-access-8w9ll". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 16:31:25 crc kubenswrapper[4707]: I1127 16:31:25.210953 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/78d0125e-40cb-4838-a127-d5626de4eff4-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "78d0125e-40cb-4838-a127-d5626de4eff4" (UID: "78d0125e-40cb-4838-a127-d5626de4eff4"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 27 16:31:25 crc kubenswrapper[4707]: I1127 16:31:25.294537 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8w9ll\" (UniqueName: \"kubernetes.io/projected/78d0125e-40cb-4838-a127-d5626de4eff4-kube-api-access-8w9ll\") on node \"crc\" DevicePath \"\"" Nov 27 16:31:25 crc kubenswrapper[4707]: I1127 16:31:25.294670 4707 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/78d0125e-40cb-4838-a127-d5626de4eff4-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 27 16:31:25 crc kubenswrapper[4707]: I1127 16:31:25.294764 4707 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/78d0125e-40cb-4838-a127-d5626de4eff4-utilities\") on node \"crc\" DevicePath \"\"" Nov 27 16:31:25 crc kubenswrapper[4707]: I1127 16:31:25.659585 4707 generic.go:334] "Generic (PLEG): container finished" podID="78d0125e-40cb-4838-a127-d5626de4eff4" containerID="5cb552fd14c85c0ff4d478147478c3a63e01155a206c92292b101ab6a1f795cd" exitCode=0 Nov 27 16:31:25 crc kubenswrapper[4707]: I1127 16:31:25.659654 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-cknzd" event={"ID":"78d0125e-40cb-4838-a127-d5626de4eff4","Type":"ContainerDied","Data":"5cb552fd14c85c0ff4d478147478c3a63e01155a206c92292b101ab6a1f795cd"} Nov 27 16:31:25 crc kubenswrapper[4707]: I1127 16:31:25.659748 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-cknzd" event={"ID":"78d0125e-40cb-4838-a127-d5626de4eff4","Type":"ContainerDied","Data":"bb042675eb150a9c353b13468f4ce2520f5d88c09041b229db7a2656f483fd5b"} Nov 27 16:31:25 crc kubenswrapper[4707]: I1127 16:31:25.659781 4707 scope.go:117] "RemoveContainer" containerID="5cb552fd14c85c0ff4d478147478c3a63e01155a206c92292b101ab6a1f795cd" Nov 27 16:31:25 crc kubenswrapper[4707]: I1127 
16:31:25.659680 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-cknzd" Nov 27 16:31:25 crc kubenswrapper[4707]: I1127 16:31:25.659917 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-rzxx2" podUID="0217b698-ad65-433c-8895-a21db1b6ca3d" containerName="registry-server" containerID="cri-o://fd35ea4b0f6c5a07f2c9bf469cd37bb5dc7c87adef2eff05a43fd52c6111669c" gracePeriod=2 Nov 27 16:31:25 crc kubenswrapper[4707]: I1127 16:31:25.706612 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-cknzd"] Nov 27 16:31:25 crc kubenswrapper[4707]: I1127 16:31:25.712215 4707 scope.go:117] "RemoveContainer" containerID="a80a756fff326e11994c7843aa1e86bce282c65ef239a4987879352f046076e6" Nov 27 16:31:25 crc kubenswrapper[4707]: I1127 16:31:25.731289 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-cknzd"] Nov 27 16:31:25 crc kubenswrapper[4707]: I1127 16:31:25.754953 4707 scope.go:117] "RemoveContainer" containerID="a3cf66a6f3ef65690c8f7ab24b6b2ae69b34be7fc9e54f587bcc192e5fa329f4" Nov 27 16:31:25 crc kubenswrapper[4707]: I1127 16:31:25.939711 4707 scope.go:117] "RemoveContainer" containerID="5cb552fd14c85c0ff4d478147478c3a63e01155a206c92292b101ab6a1f795cd" Nov 27 16:31:25 crc kubenswrapper[4707]: E1127 16:31:25.940984 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5cb552fd14c85c0ff4d478147478c3a63e01155a206c92292b101ab6a1f795cd\": container with ID starting with 5cb552fd14c85c0ff4d478147478c3a63e01155a206c92292b101ab6a1f795cd not found: ID does not exist" containerID="5cb552fd14c85c0ff4d478147478c3a63e01155a206c92292b101ab6a1f795cd" Nov 27 16:31:25 crc kubenswrapper[4707]: I1127 16:31:25.941064 4707 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"5cb552fd14c85c0ff4d478147478c3a63e01155a206c92292b101ab6a1f795cd"} err="failed to get container status \"5cb552fd14c85c0ff4d478147478c3a63e01155a206c92292b101ab6a1f795cd\": rpc error: code = NotFound desc = could not find container \"5cb552fd14c85c0ff4d478147478c3a63e01155a206c92292b101ab6a1f795cd\": container with ID starting with 5cb552fd14c85c0ff4d478147478c3a63e01155a206c92292b101ab6a1f795cd not found: ID does not exist" Nov 27 16:31:25 crc kubenswrapper[4707]: I1127 16:31:25.941105 4707 scope.go:117] "RemoveContainer" containerID="a80a756fff326e11994c7843aa1e86bce282c65ef239a4987879352f046076e6" Nov 27 16:31:25 crc kubenswrapper[4707]: E1127 16:31:25.941683 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a80a756fff326e11994c7843aa1e86bce282c65ef239a4987879352f046076e6\": container with ID starting with a80a756fff326e11994c7843aa1e86bce282c65ef239a4987879352f046076e6 not found: ID does not exist" containerID="a80a756fff326e11994c7843aa1e86bce282c65ef239a4987879352f046076e6" Nov 27 16:31:25 crc kubenswrapper[4707]: I1127 16:31:25.941744 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a80a756fff326e11994c7843aa1e86bce282c65ef239a4987879352f046076e6"} err="failed to get container status \"a80a756fff326e11994c7843aa1e86bce282c65ef239a4987879352f046076e6\": rpc error: code = NotFound desc = could not find container \"a80a756fff326e11994c7843aa1e86bce282c65ef239a4987879352f046076e6\": container with ID starting with a80a756fff326e11994c7843aa1e86bce282c65ef239a4987879352f046076e6 not found: ID does not exist" Nov 27 16:31:25 crc kubenswrapper[4707]: I1127 16:31:25.941782 4707 scope.go:117] "RemoveContainer" containerID="a3cf66a6f3ef65690c8f7ab24b6b2ae69b34be7fc9e54f587bcc192e5fa329f4" Nov 27 16:31:25 crc kubenswrapper[4707]: E1127 16:31:25.942201 4707 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"a3cf66a6f3ef65690c8f7ab24b6b2ae69b34be7fc9e54f587bcc192e5fa329f4\": container with ID starting with a3cf66a6f3ef65690c8f7ab24b6b2ae69b34be7fc9e54f587bcc192e5fa329f4 not found: ID does not exist" containerID="a3cf66a6f3ef65690c8f7ab24b6b2ae69b34be7fc9e54f587bcc192e5fa329f4" Nov 27 16:31:25 crc kubenswrapper[4707]: I1127 16:31:25.942253 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a3cf66a6f3ef65690c8f7ab24b6b2ae69b34be7fc9e54f587bcc192e5fa329f4"} err="failed to get container status \"a3cf66a6f3ef65690c8f7ab24b6b2ae69b34be7fc9e54f587bcc192e5fa329f4\": rpc error: code = NotFound desc = could not find container \"a3cf66a6f3ef65690c8f7ab24b6b2ae69b34be7fc9e54f587bcc192e5fa329f4\": container with ID starting with a3cf66a6f3ef65690c8f7ab24b6b2ae69b34be7fc9e54f587bcc192e5fa329f4 not found: ID does not exist" Nov 27 16:31:26 crc kubenswrapper[4707]: I1127 16:31:26.142962 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-rzxx2" Nov 27 16:31:26 crc kubenswrapper[4707]: I1127 16:31:26.195342 4707 scope.go:117] "RemoveContainer" containerID="98accd0e2044449ac7795f23cc0c9b20e1cb9fd7b29855e0dc1d78765b3e2e59" Nov 27 16:31:26 crc kubenswrapper[4707]: E1127 16:31:26.195679 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-c995m_openshift-machine-config-operator(a83beb0d-8dd1-434a-ace2-933f98e3956f)\"" pod="openshift-machine-config-operator/machine-config-daemon-c995m" podUID="a83beb0d-8dd1-434a-ace2-933f98e3956f" Nov 27 16:31:26 crc kubenswrapper[4707]: I1127 16:31:26.215102 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0217b698-ad65-433c-8895-a21db1b6ca3d-utilities\") pod \"0217b698-ad65-433c-8895-a21db1b6ca3d\" (UID: \"0217b698-ad65-433c-8895-a21db1b6ca3d\") " Nov 27 16:31:26 crc kubenswrapper[4707]: I1127 16:31:26.215166 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lgctk\" (UniqueName: \"kubernetes.io/projected/0217b698-ad65-433c-8895-a21db1b6ca3d-kube-api-access-lgctk\") pod \"0217b698-ad65-433c-8895-a21db1b6ca3d\" (UID: \"0217b698-ad65-433c-8895-a21db1b6ca3d\") " Nov 27 16:31:26 crc kubenswrapper[4707]: I1127 16:31:26.215438 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0217b698-ad65-433c-8895-a21db1b6ca3d-catalog-content\") pod \"0217b698-ad65-433c-8895-a21db1b6ca3d\" (UID: \"0217b698-ad65-433c-8895-a21db1b6ca3d\") " Nov 27 16:31:26 crc kubenswrapper[4707]: I1127 16:31:26.215976 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/0217b698-ad65-433c-8895-a21db1b6ca3d-utilities" (OuterVolumeSpecName: "utilities") pod "0217b698-ad65-433c-8895-a21db1b6ca3d" (UID: "0217b698-ad65-433c-8895-a21db1b6ca3d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 27 16:31:26 crc kubenswrapper[4707]: I1127 16:31:26.220034 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0217b698-ad65-433c-8895-a21db1b6ca3d-kube-api-access-lgctk" (OuterVolumeSpecName: "kube-api-access-lgctk") pod "0217b698-ad65-433c-8895-a21db1b6ca3d" (UID: "0217b698-ad65-433c-8895-a21db1b6ca3d"). InnerVolumeSpecName "kube-api-access-lgctk". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 16:31:26 crc kubenswrapper[4707]: I1127 16:31:26.305350 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0217b698-ad65-433c-8895-a21db1b6ca3d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0217b698-ad65-433c-8895-a21db1b6ca3d" (UID: "0217b698-ad65-433c-8895-a21db1b6ca3d"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 27 16:31:26 crc kubenswrapper[4707]: I1127 16:31:26.317923 4707 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0217b698-ad65-433c-8895-a21db1b6ca3d-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 27 16:31:26 crc kubenswrapper[4707]: I1127 16:31:26.318377 4707 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0217b698-ad65-433c-8895-a21db1b6ca3d-utilities\") on node \"crc\" DevicePath \"\"" Nov 27 16:31:26 crc kubenswrapper[4707]: I1127 16:31:26.318411 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lgctk\" (UniqueName: \"kubernetes.io/projected/0217b698-ad65-433c-8895-a21db1b6ca3d-kube-api-access-lgctk\") on node \"crc\" DevicePath \"\"" Nov 27 16:31:26 crc kubenswrapper[4707]: I1127 16:31:26.677469 4707 generic.go:334] "Generic (PLEG): container finished" podID="0217b698-ad65-433c-8895-a21db1b6ca3d" containerID="fd35ea4b0f6c5a07f2c9bf469cd37bb5dc7c87adef2eff05a43fd52c6111669c" exitCode=0 Nov 27 16:31:26 crc kubenswrapper[4707]: I1127 16:31:26.677548 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rzxx2" event={"ID":"0217b698-ad65-433c-8895-a21db1b6ca3d","Type":"ContainerDied","Data":"fd35ea4b0f6c5a07f2c9bf469cd37bb5dc7c87adef2eff05a43fd52c6111669c"} Nov 27 16:31:26 crc kubenswrapper[4707]: I1127 16:31:26.677601 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rzxx2" event={"ID":"0217b698-ad65-433c-8895-a21db1b6ca3d","Type":"ContainerDied","Data":"4da2fd4f4b00d9872d65689a0fd64a13af002b7da09c5acaa402c625b08fd013"} Nov 27 16:31:26 crc kubenswrapper[4707]: I1127 16:31:26.677623 4707 scope.go:117] "RemoveContainer" containerID="fd35ea4b0f6c5a07f2c9bf469cd37bb5dc7c87adef2eff05a43fd52c6111669c" Nov 27 16:31:26 crc kubenswrapper[4707]: I1127 16:31:26.677646 
4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-rzxx2" Nov 27 16:31:26 crc kubenswrapper[4707]: I1127 16:31:26.710718 4707 scope.go:117] "RemoveContainer" containerID="e4cc2e93029ccdb610fb4dbc40be97459927bdc1854162ad6b7382823557f67b" Nov 27 16:31:26 crc kubenswrapper[4707]: I1127 16:31:26.725434 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-rzxx2"] Nov 27 16:31:26 crc kubenswrapper[4707]: I1127 16:31:26.735017 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-rzxx2"] Nov 27 16:31:26 crc kubenswrapper[4707]: I1127 16:31:26.745371 4707 scope.go:117] "RemoveContainer" containerID="c9fe5a1dabe623babb13e75273c6455802450277783fd1b7ca5d18fb54f77d7f" Nov 27 16:31:26 crc kubenswrapper[4707]: I1127 16:31:26.766849 4707 scope.go:117] "RemoveContainer" containerID="fd35ea4b0f6c5a07f2c9bf469cd37bb5dc7c87adef2eff05a43fd52c6111669c" Nov 27 16:31:26 crc kubenswrapper[4707]: E1127 16:31:26.767693 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fd35ea4b0f6c5a07f2c9bf469cd37bb5dc7c87adef2eff05a43fd52c6111669c\": container with ID starting with fd35ea4b0f6c5a07f2c9bf469cd37bb5dc7c87adef2eff05a43fd52c6111669c not found: ID does not exist" containerID="fd35ea4b0f6c5a07f2c9bf469cd37bb5dc7c87adef2eff05a43fd52c6111669c" Nov 27 16:31:26 crc kubenswrapper[4707]: I1127 16:31:26.767731 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fd35ea4b0f6c5a07f2c9bf469cd37bb5dc7c87adef2eff05a43fd52c6111669c"} err="failed to get container status \"fd35ea4b0f6c5a07f2c9bf469cd37bb5dc7c87adef2eff05a43fd52c6111669c\": rpc error: code = NotFound desc = could not find container \"fd35ea4b0f6c5a07f2c9bf469cd37bb5dc7c87adef2eff05a43fd52c6111669c\": container with ID starting with 
fd35ea4b0f6c5a07f2c9bf469cd37bb5dc7c87adef2eff05a43fd52c6111669c not found: ID does not exist" Nov 27 16:31:26 crc kubenswrapper[4707]: I1127 16:31:26.767762 4707 scope.go:117] "RemoveContainer" containerID="e4cc2e93029ccdb610fb4dbc40be97459927bdc1854162ad6b7382823557f67b" Nov 27 16:31:26 crc kubenswrapper[4707]: E1127 16:31:26.768018 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e4cc2e93029ccdb610fb4dbc40be97459927bdc1854162ad6b7382823557f67b\": container with ID starting with e4cc2e93029ccdb610fb4dbc40be97459927bdc1854162ad6b7382823557f67b not found: ID does not exist" containerID="e4cc2e93029ccdb610fb4dbc40be97459927bdc1854162ad6b7382823557f67b" Nov 27 16:31:26 crc kubenswrapper[4707]: I1127 16:31:26.768044 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e4cc2e93029ccdb610fb4dbc40be97459927bdc1854162ad6b7382823557f67b"} err="failed to get container status \"e4cc2e93029ccdb610fb4dbc40be97459927bdc1854162ad6b7382823557f67b\": rpc error: code = NotFound desc = could not find container \"e4cc2e93029ccdb610fb4dbc40be97459927bdc1854162ad6b7382823557f67b\": container with ID starting with e4cc2e93029ccdb610fb4dbc40be97459927bdc1854162ad6b7382823557f67b not found: ID does not exist" Nov 27 16:31:26 crc kubenswrapper[4707]: I1127 16:31:26.768057 4707 scope.go:117] "RemoveContainer" containerID="c9fe5a1dabe623babb13e75273c6455802450277783fd1b7ca5d18fb54f77d7f" Nov 27 16:31:26 crc kubenswrapper[4707]: E1127 16:31:26.768357 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c9fe5a1dabe623babb13e75273c6455802450277783fd1b7ca5d18fb54f77d7f\": container with ID starting with c9fe5a1dabe623babb13e75273c6455802450277783fd1b7ca5d18fb54f77d7f not found: ID does not exist" containerID="c9fe5a1dabe623babb13e75273c6455802450277783fd1b7ca5d18fb54f77d7f" Nov 27 16:31:26 crc 
kubenswrapper[4707]: I1127 16:31:26.768446 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c9fe5a1dabe623babb13e75273c6455802450277783fd1b7ca5d18fb54f77d7f"} err="failed to get container status \"c9fe5a1dabe623babb13e75273c6455802450277783fd1b7ca5d18fb54f77d7f\": rpc error: code = NotFound desc = could not find container \"c9fe5a1dabe623babb13e75273c6455802450277783fd1b7ca5d18fb54f77d7f\": container with ID starting with c9fe5a1dabe623babb13e75273c6455802450277783fd1b7ca5d18fb54f77d7f not found: ID does not exist" Nov 27 16:31:27 crc kubenswrapper[4707]: I1127 16:31:27.211108 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0217b698-ad65-433c-8895-a21db1b6ca3d" path="/var/lib/kubelet/pods/0217b698-ad65-433c-8895-a21db1b6ca3d/volumes" Nov 27 16:31:27 crc kubenswrapper[4707]: I1127 16:31:27.212310 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="78d0125e-40cb-4838-a127-d5626de4eff4" path="/var/lib/kubelet/pods/78d0125e-40cb-4838-a127-d5626de4eff4/volumes" Nov 27 16:31:40 crc kubenswrapper[4707]: I1127 16:31:40.195905 4707 scope.go:117] "RemoveContainer" containerID="98accd0e2044449ac7795f23cc0c9b20e1cb9fd7b29855e0dc1d78765b3e2e59" Nov 27 16:31:40 crc kubenswrapper[4707]: E1127 16:31:40.196730 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-c995m_openshift-machine-config-operator(a83beb0d-8dd1-434a-ace2-933f98e3956f)\"" pod="openshift-machine-config-operator/machine-config-daemon-c995m" podUID="a83beb0d-8dd1-434a-ace2-933f98e3956f" Nov 27 16:31:51 crc kubenswrapper[4707]: I1127 16:31:51.196112 4707 scope.go:117] "RemoveContainer" containerID="98accd0e2044449ac7795f23cc0c9b20e1cb9fd7b29855e0dc1d78765b3e2e59" Nov 27 16:31:51 crc kubenswrapper[4707]: E1127 16:31:51.197398 4707 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-c995m_openshift-machine-config-operator(a83beb0d-8dd1-434a-ace2-933f98e3956f)\"" pod="openshift-machine-config-operator/machine-config-daemon-c995m" podUID="a83beb0d-8dd1-434a-ace2-933f98e3956f" Nov 27 16:31:53 crc kubenswrapper[4707]: I1127 16:31:53.062617 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-db-create-qd5lp"] Nov 27 16:31:53 crc kubenswrapper[4707]: I1127 16:31:53.079045 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-db-create-qd5lp"] Nov 27 16:31:53 crc kubenswrapper[4707]: I1127 16:31:53.216651 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="63ce055f-5c4d-43ae-895c-4632afdacd87" path="/var/lib/kubelet/pods/63ce055f-5c4d-43ae-895c-4632afdacd87/volumes" Nov 27 16:31:54 crc kubenswrapper[4707]: I1127 16:31:54.047344 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-9fba-account-create-update-bb8t5"] Nov 27 16:31:54 crc kubenswrapper[4707]: I1127 16:31:54.059054 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-b684-account-create-update-7n56j"] Nov 27 16:31:54 crc kubenswrapper[4707]: I1127 16:31:54.070131 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-54ea-account-create-update-lm8hw"] Nov 27 16:31:54 crc kubenswrapper[4707]: I1127 16:31:54.079474 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-db-create-rz6lr"] Nov 27 16:31:54 crc kubenswrapper[4707]: I1127 16:31:54.087680 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-db-create-tmmj9"] Nov 27 16:31:54 crc kubenswrapper[4707]: I1127 16:31:54.094580 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/nova-cell1-9fba-account-create-update-bb8t5"] Nov 27 16:31:54 crc kubenswrapper[4707]: I1127 16:31:54.100733 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-b684-account-create-update-7n56j"] Nov 27 16:31:54 crc kubenswrapper[4707]: I1127 16:31:54.106704 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-db-create-tmmj9"] Nov 27 16:31:54 crc kubenswrapper[4707]: I1127 16:31:54.113968 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-db-create-rz6lr"] Nov 27 16:31:54 crc kubenswrapper[4707]: I1127 16:31:54.120480 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-54ea-account-create-update-lm8hw"] Nov 27 16:31:55 crc kubenswrapper[4707]: I1127 16:31:55.023458 4707 generic.go:334] "Generic (PLEG): container finished" podID="86c589a6-19e5-48cc-8db8-42af5ae0f078" containerID="3aebad07869fea6d53f94450a3ab39494a24f52ddf279465442591074b413a66" exitCode=0 Nov 27 16:31:55 crc kubenswrapper[4707]: I1127 16:31:55.023565 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-4b6vs" event={"ID":"86c589a6-19e5-48cc-8db8-42af5ae0f078","Type":"ContainerDied","Data":"3aebad07869fea6d53f94450a3ab39494a24f52ddf279465442591074b413a66"} Nov 27 16:31:55 crc kubenswrapper[4707]: I1127 16:31:55.209080 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ca24dad-8ffd-41b5-9379-b05c90193e9e" path="/var/lib/kubelet/pods/3ca24dad-8ffd-41b5-9379-b05c90193e9e/volumes" Nov 27 16:31:55 crc kubenswrapper[4707]: I1127 16:31:55.209744 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="54b3a42f-b2e8-46ae-b500-3b2de0b501c7" path="/var/lib/kubelet/pods/54b3a42f-b2e8-46ae-b500-3b2de0b501c7/volumes" Nov 27 16:31:55 crc kubenswrapper[4707]: I1127 16:31:55.210273 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="581592db-9e10-4a98-a03d-598ce54b0c74" path="/var/lib/kubelet/pods/581592db-9e10-4a98-a03d-598ce54b0c74/volumes" Nov 27 16:31:55 crc kubenswrapper[4707]: I1127 16:31:55.210968 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6a68c0c-cdcb-4e30-bb21-051e9bdbf395" path="/var/lib/kubelet/pods/b6a68c0c-cdcb-4e30-bb21-051e9bdbf395/volumes" Nov 27 16:31:55 crc kubenswrapper[4707]: I1127 16:31:55.211959 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c33b6312-639a-429c-88ae-5c60ec56280c" path="/var/lib/kubelet/pods/c33b6312-639a-429c-88ae-5c60ec56280c/volumes" Nov 27 16:31:56 crc kubenswrapper[4707]: I1127 16:31:56.505449 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-4b6vs" Nov 27 16:31:56 crc kubenswrapper[4707]: I1127 16:31:56.570286 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/86c589a6-19e5-48cc-8db8-42af5ae0f078-ssh-key\") pod \"86c589a6-19e5-48cc-8db8-42af5ae0f078\" (UID: \"86c589a6-19e5-48cc-8db8-42af5ae0f078\") " Nov 27 16:31:56 crc kubenswrapper[4707]: I1127 16:31:56.570433 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6gw5l\" (UniqueName: \"kubernetes.io/projected/86c589a6-19e5-48cc-8db8-42af5ae0f078-kube-api-access-6gw5l\") pod \"86c589a6-19e5-48cc-8db8-42af5ae0f078\" (UID: \"86c589a6-19e5-48cc-8db8-42af5ae0f078\") " Nov 27 16:31:56 crc kubenswrapper[4707]: I1127 16:31:56.570619 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/86c589a6-19e5-48cc-8db8-42af5ae0f078-inventory\") pod \"86c589a6-19e5-48cc-8db8-42af5ae0f078\" (UID: \"86c589a6-19e5-48cc-8db8-42af5ae0f078\") " Nov 27 16:31:56 crc kubenswrapper[4707]: I1127 16:31:56.577674 4707 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/86c589a6-19e5-48cc-8db8-42af5ae0f078-kube-api-access-6gw5l" (OuterVolumeSpecName: "kube-api-access-6gw5l") pod "86c589a6-19e5-48cc-8db8-42af5ae0f078" (UID: "86c589a6-19e5-48cc-8db8-42af5ae0f078"). InnerVolumeSpecName "kube-api-access-6gw5l". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 16:31:56 crc kubenswrapper[4707]: I1127 16:31:56.598539 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/86c589a6-19e5-48cc-8db8-42af5ae0f078-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "86c589a6-19e5-48cc-8db8-42af5ae0f078" (UID: "86c589a6-19e5-48cc-8db8-42af5ae0f078"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 16:31:56 crc kubenswrapper[4707]: I1127 16:31:56.621831 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/86c589a6-19e5-48cc-8db8-42af5ae0f078-inventory" (OuterVolumeSpecName: "inventory") pod "86c589a6-19e5-48cc-8db8-42af5ae0f078" (UID: "86c589a6-19e5-48cc-8db8-42af5ae0f078"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 16:31:56 crc kubenswrapper[4707]: I1127 16:31:56.674814 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6gw5l\" (UniqueName: \"kubernetes.io/projected/86c589a6-19e5-48cc-8db8-42af5ae0f078-kube-api-access-6gw5l\") on node \"crc\" DevicePath \"\"" Nov 27 16:31:56 crc kubenswrapper[4707]: I1127 16:31:56.674863 4707 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/86c589a6-19e5-48cc-8db8-42af5ae0f078-inventory\") on node \"crc\" DevicePath \"\"" Nov 27 16:31:56 crc kubenswrapper[4707]: I1127 16:31:56.674891 4707 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/86c589a6-19e5-48cc-8db8-42af5ae0f078-ssh-key\") on node \"crc\" DevicePath \"\"" Nov 27 16:31:57 crc kubenswrapper[4707]: I1127 16:31:57.066884 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-4b6vs" event={"ID":"86c589a6-19e5-48cc-8db8-42af5ae0f078","Type":"ContainerDied","Data":"c7a06a1be2e990892a937af57c69e537f274257613a967627a7d1f2127560b16"} Nov 27 16:31:57 crc kubenswrapper[4707]: I1127 16:31:57.066933 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c7a06a1be2e990892a937af57c69e537f274257613a967627a7d1f2127560b16" Nov 27 16:31:57 crc kubenswrapper[4707]: I1127 16:31:57.066997 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-4b6vs" Nov 27 16:31:57 crc kubenswrapper[4707]: I1127 16:31:57.167388 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-wnhjm"] Nov 27 16:31:57 crc kubenswrapper[4707]: E1127 16:31:57.168010 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0217b698-ad65-433c-8895-a21db1b6ca3d" containerName="extract-content" Nov 27 16:31:57 crc kubenswrapper[4707]: I1127 16:31:57.168039 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="0217b698-ad65-433c-8895-a21db1b6ca3d" containerName="extract-content" Nov 27 16:31:57 crc kubenswrapper[4707]: E1127 16:31:57.168093 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="86c589a6-19e5-48cc-8db8-42af5ae0f078" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Nov 27 16:31:57 crc kubenswrapper[4707]: I1127 16:31:57.168110 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="86c589a6-19e5-48cc-8db8-42af5ae0f078" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Nov 27 16:31:57 crc kubenswrapper[4707]: E1127 16:31:57.168176 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="78d0125e-40cb-4838-a127-d5626de4eff4" containerName="registry-server" Nov 27 16:31:57 crc kubenswrapper[4707]: I1127 16:31:57.168197 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="78d0125e-40cb-4838-a127-d5626de4eff4" containerName="registry-server" Nov 27 16:31:57 crc kubenswrapper[4707]: E1127 16:31:57.168221 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="78d0125e-40cb-4838-a127-d5626de4eff4" containerName="extract-content" Nov 27 16:31:57 crc kubenswrapper[4707]: I1127 16:31:57.168239 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="78d0125e-40cb-4838-a127-d5626de4eff4" containerName="extract-content" Nov 27 16:31:57 crc kubenswrapper[4707]: 
E1127 16:31:57.168263 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0217b698-ad65-433c-8895-a21db1b6ca3d" containerName="extract-utilities" Nov 27 16:31:57 crc kubenswrapper[4707]: I1127 16:31:57.168282 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="0217b698-ad65-433c-8895-a21db1b6ca3d" containerName="extract-utilities" Nov 27 16:31:57 crc kubenswrapper[4707]: E1127 16:31:57.168333 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0217b698-ad65-433c-8895-a21db1b6ca3d" containerName="registry-server" Nov 27 16:31:57 crc kubenswrapper[4707]: I1127 16:31:57.168350 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="0217b698-ad65-433c-8895-a21db1b6ca3d" containerName="registry-server" Nov 27 16:31:57 crc kubenswrapper[4707]: E1127 16:31:57.168402 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="78d0125e-40cb-4838-a127-d5626de4eff4" containerName="extract-utilities" Nov 27 16:31:57 crc kubenswrapper[4707]: I1127 16:31:57.168421 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="78d0125e-40cb-4838-a127-d5626de4eff4" containerName="extract-utilities" Nov 27 16:31:57 crc kubenswrapper[4707]: I1127 16:31:57.168782 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="86c589a6-19e5-48cc-8db8-42af5ae0f078" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Nov 27 16:31:57 crc kubenswrapper[4707]: I1127 16:31:57.168851 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="78d0125e-40cb-4838-a127-d5626de4eff4" containerName="registry-server" Nov 27 16:31:57 crc kubenswrapper[4707]: I1127 16:31:57.168886 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="0217b698-ad65-433c-8895-a21db1b6ca3d" containerName="registry-server" Nov 27 16:31:57 crc kubenswrapper[4707]: I1127 16:31:57.170032 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-wnhjm" Nov 27 16:31:57 crc kubenswrapper[4707]: I1127 16:31:57.172355 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-h79s2" Nov 27 16:31:57 crc kubenswrapper[4707]: I1127 16:31:57.173418 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Nov 27 16:31:57 crc kubenswrapper[4707]: I1127 16:31:57.173935 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Nov 27 16:31:57 crc kubenswrapper[4707]: I1127 16:31:57.174355 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Nov 27 16:31:57 crc kubenswrapper[4707]: I1127 16:31:57.177906 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-wnhjm"] Nov 27 16:31:57 crc kubenswrapper[4707]: I1127 16:31:57.297287 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bm89m\" (UniqueName: \"kubernetes.io/projected/4e28fa93-baff-4fad-91cc-7ef262dcd775-kube-api-access-bm89m\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-wnhjm\" (UID: \"4e28fa93-baff-4fad-91cc-7ef262dcd775\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-wnhjm" Nov 27 16:31:57 crc kubenswrapper[4707]: I1127 16:31:57.297446 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4e28fa93-baff-4fad-91cc-7ef262dcd775-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-wnhjm\" (UID: \"4e28fa93-baff-4fad-91cc-7ef262dcd775\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-wnhjm" Nov 27 16:31:57 crc kubenswrapper[4707]: I1127 
16:31:57.297514 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4e28fa93-baff-4fad-91cc-7ef262dcd775-ssh-key\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-wnhjm\" (UID: \"4e28fa93-baff-4fad-91cc-7ef262dcd775\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-wnhjm" Nov 27 16:31:57 crc kubenswrapper[4707]: I1127 16:31:57.399268 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4e28fa93-baff-4fad-91cc-7ef262dcd775-ssh-key\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-wnhjm\" (UID: \"4e28fa93-baff-4fad-91cc-7ef262dcd775\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-wnhjm" Nov 27 16:31:57 crc kubenswrapper[4707]: I1127 16:31:57.399547 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bm89m\" (UniqueName: \"kubernetes.io/projected/4e28fa93-baff-4fad-91cc-7ef262dcd775-kube-api-access-bm89m\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-wnhjm\" (UID: \"4e28fa93-baff-4fad-91cc-7ef262dcd775\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-wnhjm" Nov 27 16:31:57 crc kubenswrapper[4707]: I1127 16:31:57.399793 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4e28fa93-baff-4fad-91cc-7ef262dcd775-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-wnhjm\" (UID: \"4e28fa93-baff-4fad-91cc-7ef262dcd775\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-wnhjm" Nov 27 16:31:57 crc kubenswrapper[4707]: I1127 16:31:57.404152 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4e28fa93-baff-4fad-91cc-7ef262dcd775-ssh-key\") pod 
\"validate-network-edpm-deployment-openstack-edpm-ipam-wnhjm\" (UID: \"4e28fa93-baff-4fad-91cc-7ef262dcd775\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-wnhjm" Nov 27 16:31:57 crc kubenswrapper[4707]: I1127 16:31:57.405984 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4e28fa93-baff-4fad-91cc-7ef262dcd775-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-wnhjm\" (UID: \"4e28fa93-baff-4fad-91cc-7ef262dcd775\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-wnhjm" Nov 27 16:31:57 crc kubenswrapper[4707]: I1127 16:31:57.415058 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bm89m\" (UniqueName: \"kubernetes.io/projected/4e28fa93-baff-4fad-91cc-7ef262dcd775-kube-api-access-bm89m\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-wnhjm\" (UID: \"4e28fa93-baff-4fad-91cc-7ef262dcd775\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-wnhjm" Nov 27 16:31:57 crc kubenswrapper[4707]: I1127 16:31:57.498168 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-wnhjm" Nov 27 16:31:58 crc kubenswrapper[4707]: I1127 16:31:58.134611 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-wnhjm"] Nov 27 16:31:59 crc kubenswrapper[4707]: I1127 16:31:59.100859 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-wnhjm" event={"ID":"4e28fa93-baff-4fad-91cc-7ef262dcd775","Type":"ContainerStarted","Data":"f768c9eb0585161455b87a2424556dc714fc47aab77199bbd1c5f30fe504f552"} Nov 27 16:31:59 crc kubenswrapper[4707]: I1127 16:31:59.101487 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-wnhjm" event={"ID":"4e28fa93-baff-4fad-91cc-7ef262dcd775","Type":"ContainerStarted","Data":"bc2bd462164fc2914cefbabc6755827be8a302ed2adf597a9464b0d1b159b00a"} Nov 27 16:31:59 crc kubenswrapper[4707]: I1127 16:31:59.125597 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-wnhjm" podStartSLOduration=1.534422756 podStartE2EDuration="2.125566256s" podCreationTimestamp="2025-11-27 16:31:57 +0000 UTC" firstStartedPulling="2025-11-27 16:31:58.141019403 +0000 UTC m=+1693.772468181" lastFinishedPulling="2025-11-27 16:31:58.732162873 +0000 UTC m=+1694.363611681" observedRunningTime="2025-11-27 16:31:59.122014649 +0000 UTC m=+1694.753463457" watchObservedRunningTime="2025-11-27 16:31:59.125566256 +0000 UTC m=+1694.757015064" Nov 27 16:32:04 crc kubenswrapper[4707]: I1127 16:32:04.148141 4707 generic.go:334] "Generic (PLEG): container finished" podID="4e28fa93-baff-4fad-91cc-7ef262dcd775" containerID="f768c9eb0585161455b87a2424556dc714fc47aab77199bbd1c5f30fe504f552" exitCode=0 Nov 27 16:32:04 crc kubenswrapper[4707]: I1127 16:32:04.148273 4707 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-wnhjm" event={"ID":"4e28fa93-baff-4fad-91cc-7ef262dcd775","Type":"ContainerDied","Data":"f768c9eb0585161455b87a2424556dc714fc47aab77199bbd1c5f30fe504f552"} Nov 27 16:32:04 crc kubenswrapper[4707]: I1127 16:32:04.195876 4707 scope.go:117] "RemoveContainer" containerID="98accd0e2044449ac7795f23cc0c9b20e1cb9fd7b29855e0dc1d78765b3e2e59" Nov 27 16:32:04 crc kubenswrapper[4707]: E1127 16:32:04.196514 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-c995m_openshift-machine-config-operator(a83beb0d-8dd1-434a-ace2-933f98e3956f)\"" pod="openshift-machine-config-operator/machine-config-daemon-c995m" podUID="a83beb0d-8dd1-434a-ace2-933f98e3956f" Nov 27 16:32:05 crc kubenswrapper[4707]: I1127 16:32:05.616529 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-wnhjm" Nov 27 16:32:05 crc kubenswrapper[4707]: I1127 16:32:05.674313 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4e28fa93-baff-4fad-91cc-7ef262dcd775-ssh-key\") pod \"4e28fa93-baff-4fad-91cc-7ef262dcd775\" (UID: \"4e28fa93-baff-4fad-91cc-7ef262dcd775\") " Nov 27 16:32:05 crc kubenswrapper[4707]: I1127 16:32:05.674404 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bm89m\" (UniqueName: \"kubernetes.io/projected/4e28fa93-baff-4fad-91cc-7ef262dcd775-kube-api-access-bm89m\") pod \"4e28fa93-baff-4fad-91cc-7ef262dcd775\" (UID: \"4e28fa93-baff-4fad-91cc-7ef262dcd775\") " Nov 27 16:32:05 crc kubenswrapper[4707]: I1127 16:32:05.674426 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4e28fa93-baff-4fad-91cc-7ef262dcd775-inventory\") pod \"4e28fa93-baff-4fad-91cc-7ef262dcd775\" (UID: \"4e28fa93-baff-4fad-91cc-7ef262dcd775\") " Nov 27 16:32:05 crc kubenswrapper[4707]: I1127 16:32:05.685808 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4e28fa93-baff-4fad-91cc-7ef262dcd775-kube-api-access-bm89m" (OuterVolumeSpecName: "kube-api-access-bm89m") pod "4e28fa93-baff-4fad-91cc-7ef262dcd775" (UID: "4e28fa93-baff-4fad-91cc-7ef262dcd775"). InnerVolumeSpecName "kube-api-access-bm89m". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 16:32:05 crc kubenswrapper[4707]: I1127 16:32:05.724978 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4e28fa93-baff-4fad-91cc-7ef262dcd775-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "4e28fa93-baff-4fad-91cc-7ef262dcd775" (UID: "4e28fa93-baff-4fad-91cc-7ef262dcd775"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 16:32:05 crc kubenswrapper[4707]: I1127 16:32:05.726763 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4e28fa93-baff-4fad-91cc-7ef262dcd775-inventory" (OuterVolumeSpecName: "inventory") pod "4e28fa93-baff-4fad-91cc-7ef262dcd775" (UID: "4e28fa93-baff-4fad-91cc-7ef262dcd775"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 16:32:05 crc kubenswrapper[4707]: I1127 16:32:05.776939 4707 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4e28fa93-baff-4fad-91cc-7ef262dcd775-ssh-key\") on node \"crc\" DevicePath \"\"" Nov 27 16:32:05 crc kubenswrapper[4707]: I1127 16:32:05.776994 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bm89m\" (UniqueName: \"kubernetes.io/projected/4e28fa93-baff-4fad-91cc-7ef262dcd775-kube-api-access-bm89m\") on node \"crc\" DevicePath \"\"" Nov 27 16:32:05 crc kubenswrapper[4707]: I1127 16:32:05.777015 4707 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4e28fa93-baff-4fad-91cc-7ef262dcd775-inventory\") on node \"crc\" DevicePath \"\"" Nov 27 16:32:06 crc kubenswrapper[4707]: I1127 16:32:06.175899 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-wnhjm" event={"ID":"4e28fa93-baff-4fad-91cc-7ef262dcd775","Type":"ContainerDied","Data":"bc2bd462164fc2914cefbabc6755827be8a302ed2adf597a9464b0d1b159b00a"} Nov 27 16:32:06 crc kubenswrapper[4707]: I1127 16:32:06.175945 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-wnhjm" Nov 27 16:32:06 crc kubenswrapper[4707]: I1127 16:32:06.175975 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bc2bd462164fc2914cefbabc6755827be8a302ed2adf597a9464b0d1b159b00a" Nov 27 16:32:06 crc kubenswrapper[4707]: I1127 16:32:06.263123 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-wgj2d"] Nov 27 16:32:06 crc kubenswrapper[4707]: E1127 16:32:06.265657 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4e28fa93-baff-4fad-91cc-7ef262dcd775" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Nov 27 16:32:06 crc kubenswrapper[4707]: I1127 16:32:06.265678 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="4e28fa93-baff-4fad-91cc-7ef262dcd775" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Nov 27 16:32:06 crc kubenswrapper[4707]: I1127 16:32:06.265863 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="4e28fa93-baff-4fad-91cc-7ef262dcd775" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Nov 27 16:32:06 crc kubenswrapper[4707]: I1127 16:32:06.266482 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-wgj2d" Nov 27 16:32:06 crc kubenswrapper[4707]: I1127 16:32:06.268512 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Nov 27 16:32:06 crc kubenswrapper[4707]: I1127 16:32:06.268639 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Nov 27 16:32:06 crc kubenswrapper[4707]: I1127 16:32:06.268737 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Nov 27 16:32:06 crc kubenswrapper[4707]: I1127 16:32:06.281078 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-h79s2" Nov 27 16:32:06 crc kubenswrapper[4707]: I1127 16:32:06.283076 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-wgj2d"] Nov 27 16:32:06 crc kubenswrapper[4707]: I1127 16:32:06.287609 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f3e34a79-7842-4f97-91f4-040a1b4e5b2b-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-wgj2d\" (UID: \"f3e34a79-7842-4f97-91f4-040a1b4e5b2b\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-wgj2d" Nov 27 16:32:06 crc kubenswrapper[4707]: I1127 16:32:06.287697 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f3e34a79-7842-4f97-91f4-040a1b4e5b2b-ssh-key\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-wgj2d\" (UID: \"f3e34a79-7842-4f97-91f4-040a1b4e5b2b\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-wgj2d" Nov 27 16:32:06 crc kubenswrapper[4707]: I1127 16:32:06.288006 4707 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m4npj\" (UniqueName: \"kubernetes.io/projected/f3e34a79-7842-4f97-91f4-040a1b4e5b2b-kube-api-access-m4npj\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-wgj2d\" (UID: \"f3e34a79-7842-4f97-91f4-040a1b4e5b2b\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-wgj2d" Nov 27 16:32:06 crc kubenswrapper[4707]: I1127 16:32:06.389418 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m4npj\" (UniqueName: \"kubernetes.io/projected/f3e34a79-7842-4f97-91f4-040a1b4e5b2b-kube-api-access-m4npj\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-wgj2d\" (UID: \"f3e34a79-7842-4f97-91f4-040a1b4e5b2b\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-wgj2d" Nov 27 16:32:06 crc kubenswrapper[4707]: I1127 16:32:06.389517 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f3e34a79-7842-4f97-91f4-040a1b4e5b2b-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-wgj2d\" (UID: \"f3e34a79-7842-4f97-91f4-040a1b4e5b2b\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-wgj2d" Nov 27 16:32:06 crc kubenswrapper[4707]: I1127 16:32:06.389580 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f3e34a79-7842-4f97-91f4-040a1b4e5b2b-ssh-key\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-wgj2d\" (UID: \"f3e34a79-7842-4f97-91f4-040a1b4e5b2b\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-wgj2d" Nov 27 16:32:06 crc kubenswrapper[4707]: I1127 16:32:06.394671 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f3e34a79-7842-4f97-91f4-040a1b4e5b2b-ssh-key\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-wgj2d\" (UID: 
\"f3e34a79-7842-4f97-91f4-040a1b4e5b2b\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-wgj2d" Nov 27 16:32:06 crc kubenswrapper[4707]: I1127 16:32:06.403427 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f3e34a79-7842-4f97-91f4-040a1b4e5b2b-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-wgj2d\" (UID: \"f3e34a79-7842-4f97-91f4-040a1b4e5b2b\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-wgj2d" Nov 27 16:32:06 crc kubenswrapper[4707]: I1127 16:32:06.409750 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m4npj\" (UniqueName: \"kubernetes.io/projected/f3e34a79-7842-4f97-91f4-040a1b4e5b2b-kube-api-access-m4npj\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-wgj2d\" (UID: \"f3e34a79-7842-4f97-91f4-040a1b4e5b2b\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-wgj2d" Nov 27 16:32:06 crc kubenswrapper[4707]: I1127 16:32:06.589213 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-wgj2d" Nov 27 16:32:06 crc kubenswrapper[4707]: I1127 16:32:06.943631 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-wgj2d"] Nov 27 16:32:07 crc kubenswrapper[4707]: I1127 16:32:07.190132 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-wgj2d" event={"ID":"f3e34a79-7842-4f97-91f4-040a1b4e5b2b","Type":"ContainerStarted","Data":"285326576a9c7fc119e53a62ec514c0a8f04d3a2910fdf6f00fad8cc38eee9de"} Nov 27 16:32:08 crc kubenswrapper[4707]: I1127 16:32:08.202509 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-wgj2d" event={"ID":"f3e34a79-7842-4f97-91f4-040a1b4e5b2b","Type":"ContainerStarted","Data":"99fc914707e84ffab29aead824172d8071d4d7235a19b1bb9155b4ef1d494558"} Nov 27 16:32:08 crc kubenswrapper[4707]: I1127 16:32:08.235241 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-wgj2d" podStartSLOduration=1.396652296 podStartE2EDuration="2.235214405s" podCreationTimestamp="2025-11-27 16:32:06 +0000 UTC" firstStartedPulling="2025-11-27 16:32:06.949640944 +0000 UTC m=+1702.581089722" lastFinishedPulling="2025-11-27 16:32:07.788203023 +0000 UTC m=+1703.419651831" observedRunningTime="2025-11-27 16:32:08.222898375 +0000 UTC m=+1703.854347143" watchObservedRunningTime="2025-11-27 16:32:08.235214405 +0000 UTC m=+1703.866663203" Nov 27 16:32:10 crc kubenswrapper[4707]: I1127 16:32:10.115736 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-9n86h"] Nov 27 16:32:10 crc kubenswrapper[4707]: I1127 16:32:10.118344 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-9n86h" Nov 27 16:32:10 crc kubenswrapper[4707]: I1127 16:32:10.143491 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-9n86h"] Nov 27 16:32:10 crc kubenswrapper[4707]: I1127 16:32:10.168930 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fcb8cbad-0cab-4a21-add3-7663d4d063c2-utilities\") pod \"community-operators-9n86h\" (UID: \"fcb8cbad-0cab-4a21-add3-7663d4d063c2\") " pod="openshift-marketplace/community-operators-9n86h" Nov 27 16:32:10 crc kubenswrapper[4707]: I1127 16:32:10.169074 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fcb8cbad-0cab-4a21-add3-7663d4d063c2-catalog-content\") pod \"community-operators-9n86h\" (UID: \"fcb8cbad-0cab-4a21-add3-7663d4d063c2\") " pod="openshift-marketplace/community-operators-9n86h" Nov 27 16:32:10 crc kubenswrapper[4707]: I1127 16:32:10.169144 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nt6bf\" (UniqueName: \"kubernetes.io/projected/fcb8cbad-0cab-4a21-add3-7663d4d063c2-kube-api-access-nt6bf\") pod \"community-operators-9n86h\" (UID: \"fcb8cbad-0cab-4a21-add3-7663d4d063c2\") " pod="openshift-marketplace/community-operators-9n86h" Nov 27 16:32:10 crc kubenswrapper[4707]: I1127 16:32:10.271363 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nt6bf\" (UniqueName: \"kubernetes.io/projected/fcb8cbad-0cab-4a21-add3-7663d4d063c2-kube-api-access-nt6bf\") pod \"community-operators-9n86h\" (UID: \"fcb8cbad-0cab-4a21-add3-7663d4d063c2\") " pod="openshift-marketplace/community-operators-9n86h" Nov 27 16:32:10 crc kubenswrapper[4707]: I1127 16:32:10.271556 4707 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fcb8cbad-0cab-4a21-add3-7663d4d063c2-utilities\") pod \"community-operators-9n86h\" (UID: \"fcb8cbad-0cab-4a21-add3-7663d4d063c2\") " pod="openshift-marketplace/community-operators-9n86h" Nov 27 16:32:10 crc kubenswrapper[4707]: I1127 16:32:10.272294 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fcb8cbad-0cab-4a21-add3-7663d4d063c2-catalog-content\") pod \"community-operators-9n86h\" (UID: \"fcb8cbad-0cab-4a21-add3-7663d4d063c2\") " pod="openshift-marketplace/community-operators-9n86h" Nov 27 16:32:10 crc kubenswrapper[4707]: I1127 16:32:10.272382 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fcb8cbad-0cab-4a21-add3-7663d4d063c2-utilities\") pod \"community-operators-9n86h\" (UID: \"fcb8cbad-0cab-4a21-add3-7663d4d063c2\") " pod="openshift-marketplace/community-operators-9n86h" Nov 27 16:32:10 crc kubenswrapper[4707]: I1127 16:32:10.272955 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fcb8cbad-0cab-4a21-add3-7663d4d063c2-catalog-content\") pod \"community-operators-9n86h\" (UID: \"fcb8cbad-0cab-4a21-add3-7663d4d063c2\") " pod="openshift-marketplace/community-operators-9n86h" Nov 27 16:32:10 crc kubenswrapper[4707]: I1127 16:32:10.295479 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nt6bf\" (UniqueName: \"kubernetes.io/projected/fcb8cbad-0cab-4a21-add3-7663d4d063c2-kube-api-access-nt6bf\") pod \"community-operators-9n86h\" (UID: \"fcb8cbad-0cab-4a21-add3-7663d4d063c2\") " pod="openshift-marketplace/community-operators-9n86h" Nov 27 16:32:10 crc kubenswrapper[4707]: I1127 16:32:10.446971 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-9n86h" Nov 27 16:32:11 crc kubenswrapper[4707]: I1127 16:32:11.004323 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-9n86h"] Nov 27 16:32:11 crc kubenswrapper[4707]: I1127 16:32:11.232644 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9n86h" event={"ID":"fcb8cbad-0cab-4a21-add3-7663d4d063c2","Type":"ContainerStarted","Data":"05f6cf3ab547472ca45bf39ceb27c46c4307885fdcb2c86f8ebd634ff4df9885"} Nov 27 16:32:11 crc kubenswrapper[4707]: I1127 16:32:11.233095 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9n86h" event={"ID":"fcb8cbad-0cab-4a21-add3-7663d4d063c2","Type":"ContainerStarted","Data":"2e832a40e932aad7ee6d7b972f5fc395d7a7bf4080768d0c01bf14a3fbd44d20"} Nov 27 16:32:12 crc kubenswrapper[4707]: I1127 16:32:12.246974 4707 generic.go:334] "Generic (PLEG): container finished" podID="fcb8cbad-0cab-4a21-add3-7663d4d063c2" containerID="05f6cf3ab547472ca45bf39ceb27c46c4307885fdcb2c86f8ebd634ff4df9885" exitCode=0 Nov 27 16:32:12 crc kubenswrapper[4707]: I1127 16:32:12.247140 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9n86h" event={"ID":"fcb8cbad-0cab-4a21-add3-7663d4d063c2","Type":"ContainerDied","Data":"05f6cf3ab547472ca45bf39ceb27c46c4307885fdcb2c86f8ebd634ff4df9885"} Nov 27 16:32:12 crc kubenswrapper[4707]: I1127 16:32:12.740019 4707 scope.go:117] "RemoveContainer" containerID="51971a8cac2fe3fe988afa9603a96e0609318f38a80b8c9d39888e823d1f3ea3" Nov 27 16:32:12 crc kubenswrapper[4707]: I1127 16:32:12.774833 4707 scope.go:117] "RemoveContainer" containerID="e9df8dd031cc4dac2d9ab317241860be6d935a1ea342b1f2a92a31676e2cf3b1" Nov 27 16:32:12 crc kubenswrapper[4707]: I1127 16:32:12.854345 4707 scope.go:117] "RemoveContainer" 
containerID="7c9ca57692645ecd6590f8d90bf57e228ca418c5d7aa57dfcd502d32f6aa8032" Nov 27 16:32:12 crc kubenswrapper[4707]: I1127 16:32:12.972836 4707 scope.go:117] "RemoveContainer" containerID="cf31e44bf7ed767adb9470ab61b062790ba67711c1575d013ef009d2a36f11b1" Nov 27 16:32:13 crc kubenswrapper[4707]: I1127 16:32:13.000724 4707 scope.go:117] "RemoveContainer" containerID="18aced7ffc8c3afdb0f5d4df0d9b04e688a1696a570ff8576f5c3b4fe2b0968f" Nov 27 16:32:13 crc kubenswrapper[4707]: I1127 16:32:13.047046 4707 scope.go:117] "RemoveContainer" containerID="70748516c6edb297226908178a5f41325b0eaeb83dcfddeedf8cd3e0fe7c91d6" Nov 27 16:32:13 crc kubenswrapper[4707]: I1127 16:32:13.080641 4707 scope.go:117] "RemoveContainer" containerID="ebc874000223527b4eeb598365e5b96ae24efcb1aed3fa9e2f28aca453f514c6" Nov 27 16:32:13 crc kubenswrapper[4707]: I1127 16:32:13.117855 4707 scope.go:117] "RemoveContainer" containerID="c65c0b619b3ced482b63ee64a2e4708e62b90c0f87f26da5cd3143f9b28bdda2" Nov 27 16:32:14 crc kubenswrapper[4707]: I1127 16:32:14.292441 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9n86h" event={"ID":"fcb8cbad-0cab-4a21-add3-7663d4d063c2","Type":"ContainerStarted","Data":"2d8aa075a8fde6e9362a98043018108ce23dcfc7e707984e2ca7517ed2fc190e"} Nov 27 16:32:15 crc kubenswrapper[4707]: I1127 16:32:15.307475 4707 generic.go:334] "Generic (PLEG): container finished" podID="fcb8cbad-0cab-4a21-add3-7663d4d063c2" containerID="2d8aa075a8fde6e9362a98043018108ce23dcfc7e707984e2ca7517ed2fc190e" exitCode=0 Nov 27 16:32:15 crc kubenswrapper[4707]: I1127 16:32:15.307576 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9n86h" event={"ID":"fcb8cbad-0cab-4a21-add3-7663d4d063c2","Type":"ContainerDied","Data":"2d8aa075a8fde6e9362a98043018108ce23dcfc7e707984e2ca7517ed2fc190e"} Nov 27 16:32:17 crc kubenswrapper[4707]: I1127 16:32:17.334515 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/community-operators-9n86h" event={"ID":"fcb8cbad-0cab-4a21-add3-7663d4d063c2","Type":"ContainerStarted","Data":"ca96a89be69ae3bba7ae5f078c1e9762fc173d5b8338c75705c2aba2e7ea0d6a"} Nov 27 16:32:17 crc kubenswrapper[4707]: I1127 16:32:17.378007 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-9n86h" podStartSLOduration=3.30177317 podStartE2EDuration="7.377979042s" podCreationTimestamp="2025-11-27 16:32:10 +0000 UTC" firstStartedPulling="2025-11-27 16:32:12.251434471 +0000 UTC m=+1707.882883269" lastFinishedPulling="2025-11-27 16:32:16.327640373 +0000 UTC m=+1711.959089141" observedRunningTime="2025-11-27 16:32:17.364080903 +0000 UTC m=+1712.995529771" watchObservedRunningTime="2025-11-27 16:32:17.377979042 +0000 UTC m=+1713.009427820" Nov 27 16:32:19 crc kubenswrapper[4707]: I1127 16:32:19.196496 4707 scope.go:117] "RemoveContainer" containerID="98accd0e2044449ac7795f23cc0c9b20e1cb9fd7b29855e0dc1d78765b3e2e59" Nov 27 16:32:19 crc kubenswrapper[4707]: E1127 16:32:19.196893 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-c995m_openshift-machine-config-operator(a83beb0d-8dd1-434a-ace2-933f98e3956f)\"" pod="openshift-machine-config-operator/machine-config-daemon-c995m" podUID="a83beb0d-8dd1-434a-ace2-933f98e3956f" Nov 27 16:32:20 crc kubenswrapper[4707]: I1127 16:32:20.448739 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-9n86h" Nov 27 16:32:20 crc kubenswrapper[4707]: I1127 16:32:20.449029 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-9n86h" Nov 27 16:32:20 crc kubenswrapper[4707]: I1127 16:32:20.520649 4707 kubelet.go:2542] "SyncLoop (probe)" 
probe="startup" status="started" pod="openshift-marketplace/community-operators-9n86h" Nov 27 16:32:21 crc kubenswrapper[4707]: I1127 16:32:21.476491 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-9n86h" Nov 27 16:32:21 crc kubenswrapper[4707]: I1127 16:32:21.557794 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-9n86h"] Nov 27 16:32:23 crc kubenswrapper[4707]: I1127 16:32:23.412362 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-9n86h" podUID="fcb8cbad-0cab-4a21-add3-7663d4d063c2" containerName="registry-server" containerID="cri-o://ca96a89be69ae3bba7ae5f078c1e9762fc173d5b8338c75705c2aba2e7ea0d6a" gracePeriod=2 Nov 27 16:32:23 crc kubenswrapper[4707]: I1127 16:32:23.904412 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-9n86h" Nov 27 16:32:24 crc kubenswrapper[4707]: I1127 16:32:24.046935 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-hq68q"] Nov 27 16:32:24 crc kubenswrapper[4707]: I1127 16:32:24.057850 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-hq68q"] Nov 27 16:32:24 crc kubenswrapper[4707]: I1127 16:32:24.067223 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nt6bf\" (UniqueName: \"kubernetes.io/projected/fcb8cbad-0cab-4a21-add3-7663d4d063c2-kube-api-access-nt6bf\") pod \"fcb8cbad-0cab-4a21-add3-7663d4d063c2\" (UID: \"fcb8cbad-0cab-4a21-add3-7663d4d063c2\") " Nov 27 16:32:24 crc kubenswrapper[4707]: I1127 16:32:24.067324 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fcb8cbad-0cab-4a21-add3-7663d4d063c2-catalog-content\") pod 
\"fcb8cbad-0cab-4a21-add3-7663d4d063c2\" (UID: \"fcb8cbad-0cab-4a21-add3-7663d4d063c2\") " Nov 27 16:32:24 crc kubenswrapper[4707]: I1127 16:32:24.067463 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fcb8cbad-0cab-4a21-add3-7663d4d063c2-utilities\") pod \"fcb8cbad-0cab-4a21-add3-7663d4d063c2\" (UID: \"fcb8cbad-0cab-4a21-add3-7663d4d063c2\") " Nov 27 16:32:24 crc kubenswrapper[4707]: I1127 16:32:24.068569 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fcb8cbad-0cab-4a21-add3-7663d4d063c2-utilities" (OuterVolumeSpecName: "utilities") pod "fcb8cbad-0cab-4a21-add3-7663d4d063c2" (UID: "fcb8cbad-0cab-4a21-add3-7663d4d063c2"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 27 16:32:24 crc kubenswrapper[4707]: I1127 16:32:24.073100 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fcb8cbad-0cab-4a21-add3-7663d4d063c2-kube-api-access-nt6bf" (OuterVolumeSpecName: "kube-api-access-nt6bf") pod "fcb8cbad-0cab-4a21-add3-7663d4d063c2" (UID: "fcb8cbad-0cab-4a21-add3-7663d4d063c2"). InnerVolumeSpecName "kube-api-access-nt6bf". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 16:32:24 crc kubenswrapper[4707]: I1127 16:32:24.126530 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fcb8cbad-0cab-4a21-add3-7663d4d063c2-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "fcb8cbad-0cab-4a21-add3-7663d4d063c2" (UID: "fcb8cbad-0cab-4a21-add3-7663d4d063c2"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 27 16:32:24 crc kubenswrapper[4707]: I1127 16:32:24.171670 4707 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fcb8cbad-0cab-4a21-add3-7663d4d063c2-utilities\") on node \"crc\" DevicePath \"\"" Nov 27 16:32:24 crc kubenswrapper[4707]: I1127 16:32:24.171715 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nt6bf\" (UniqueName: \"kubernetes.io/projected/fcb8cbad-0cab-4a21-add3-7663d4d063c2-kube-api-access-nt6bf\") on node \"crc\" DevicePath \"\"" Nov 27 16:32:24 crc kubenswrapper[4707]: I1127 16:32:24.171729 4707 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fcb8cbad-0cab-4a21-add3-7663d4d063c2-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 27 16:32:24 crc kubenswrapper[4707]: I1127 16:32:24.423227 4707 generic.go:334] "Generic (PLEG): container finished" podID="fcb8cbad-0cab-4a21-add3-7663d4d063c2" containerID="ca96a89be69ae3bba7ae5f078c1e9762fc173d5b8338c75705c2aba2e7ea0d6a" exitCode=0 Nov 27 16:32:24 crc kubenswrapper[4707]: I1127 16:32:24.423267 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9n86h" event={"ID":"fcb8cbad-0cab-4a21-add3-7663d4d063c2","Type":"ContainerDied","Data":"ca96a89be69ae3bba7ae5f078c1e9762fc173d5b8338c75705c2aba2e7ea0d6a"} Nov 27 16:32:24 crc kubenswrapper[4707]: I1127 16:32:24.423293 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9n86h" event={"ID":"fcb8cbad-0cab-4a21-add3-7663d4d063c2","Type":"ContainerDied","Data":"2e832a40e932aad7ee6d7b972f5fc395d7a7bf4080768d0c01bf14a3fbd44d20"} Nov 27 16:32:24 crc kubenswrapper[4707]: I1127 16:32:24.423311 4707 scope.go:117] "RemoveContainer" containerID="ca96a89be69ae3bba7ae5f078c1e9762fc173d5b8338c75705c2aba2e7ea0d6a" Nov 27 16:32:24 crc kubenswrapper[4707]: I1127 
16:32:24.423351 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-9n86h" Nov 27 16:32:24 crc kubenswrapper[4707]: I1127 16:32:24.460788 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-9n86h"] Nov 27 16:32:24 crc kubenswrapper[4707]: I1127 16:32:24.470304 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-9n86h"] Nov 27 16:32:24 crc kubenswrapper[4707]: I1127 16:32:24.472076 4707 scope.go:117] "RemoveContainer" containerID="2d8aa075a8fde6e9362a98043018108ce23dcfc7e707984e2ca7517ed2fc190e" Nov 27 16:32:24 crc kubenswrapper[4707]: I1127 16:32:24.509571 4707 scope.go:117] "RemoveContainer" containerID="05f6cf3ab547472ca45bf39ceb27c46c4307885fdcb2c86f8ebd634ff4df9885" Nov 27 16:32:24 crc kubenswrapper[4707]: I1127 16:32:24.540241 4707 scope.go:117] "RemoveContainer" containerID="ca96a89be69ae3bba7ae5f078c1e9762fc173d5b8338c75705c2aba2e7ea0d6a" Nov 27 16:32:24 crc kubenswrapper[4707]: E1127 16:32:24.540682 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ca96a89be69ae3bba7ae5f078c1e9762fc173d5b8338c75705c2aba2e7ea0d6a\": container with ID starting with ca96a89be69ae3bba7ae5f078c1e9762fc173d5b8338c75705c2aba2e7ea0d6a not found: ID does not exist" containerID="ca96a89be69ae3bba7ae5f078c1e9762fc173d5b8338c75705c2aba2e7ea0d6a" Nov 27 16:32:24 crc kubenswrapper[4707]: I1127 16:32:24.540714 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ca96a89be69ae3bba7ae5f078c1e9762fc173d5b8338c75705c2aba2e7ea0d6a"} err="failed to get container status \"ca96a89be69ae3bba7ae5f078c1e9762fc173d5b8338c75705c2aba2e7ea0d6a\": rpc error: code = NotFound desc = could not find container \"ca96a89be69ae3bba7ae5f078c1e9762fc173d5b8338c75705c2aba2e7ea0d6a\": container with ID starting with 
ca96a89be69ae3bba7ae5f078c1e9762fc173d5b8338c75705c2aba2e7ea0d6a not found: ID does not exist" Nov 27 16:32:24 crc kubenswrapper[4707]: I1127 16:32:24.540739 4707 scope.go:117] "RemoveContainer" containerID="2d8aa075a8fde6e9362a98043018108ce23dcfc7e707984e2ca7517ed2fc190e" Nov 27 16:32:24 crc kubenswrapper[4707]: E1127 16:32:24.541101 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2d8aa075a8fde6e9362a98043018108ce23dcfc7e707984e2ca7517ed2fc190e\": container with ID starting with 2d8aa075a8fde6e9362a98043018108ce23dcfc7e707984e2ca7517ed2fc190e not found: ID does not exist" containerID="2d8aa075a8fde6e9362a98043018108ce23dcfc7e707984e2ca7517ed2fc190e" Nov 27 16:32:24 crc kubenswrapper[4707]: I1127 16:32:24.541134 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2d8aa075a8fde6e9362a98043018108ce23dcfc7e707984e2ca7517ed2fc190e"} err="failed to get container status \"2d8aa075a8fde6e9362a98043018108ce23dcfc7e707984e2ca7517ed2fc190e\": rpc error: code = NotFound desc = could not find container \"2d8aa075a8fde6e9362a98043018108ce23dcfc7e707984e2ca7517ed2fc190e\": container with ID starting with 2d8aa075a8fde6e9362a98043018108ce23dcfc7e707984e2ca7517ed2fc190e not found: ID does not exist" Nov 27 16:32:24 crc kubenswrapper[4707]: I1127 16:32:24.541152 4707 scope.go:117] "RemoveContainer" containerID="05f6cf3ab547472ca45bf39ceb27c46c4307885fdcb2c86f8ebd634ff4df9885" Nov 27 16:32:24 crc kubenswrapper[4707]: E1127 16:32:24.541472 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"05f6cf3ab547472ca45bf39ceb27c46c4307885fdcb2c86f8ebd634ff4df9885\": container with ID starting with 05f6cf3ab547472ca45bf39ceb27c46c4307885fdcb2c86f8ebd634ff4df9885 not found: ID does not exist" containerID="05f6cf3ab547472ca45bf39ceb27c46c4307885fdcb2c86f8ebd634ff4df9885" Nov 27 16:32:24 crc 
kubenswrapper[4707]: I1127 16:32:24.541493 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"05f6cf3ab547472ca45bf39ceb27c46c4307885fdcb2c86f8ebd634ff4df9885"} err="failed to get container status \"05f6cf3ab547472ca45bf39ceb27c46c4307885fdcb2c86f8ebd634ff4df9885\": rpc error: code = NotFound desc = could not find container \"05f6cf3ab547472ca45bf39ceb27c46c4307885fdcb2c86f8ebd634ff4df9885\": container with ID starting with 05f6cf3ab547472ca45bf39ceb27c46c4307885fdcb2c86f8ebd634ff4df9885 not found: ID does not exist" Nov 27 16:32:25 crc kubenswrapper[4707]: I1127 16:32:25.211007 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="11d822ec-0bcb-4a76-a135-2a140d0618c8" path="/var/lib/kubelet/pods/11d822ec-0bcb-4a76-a135-2a140d0618c8/volumes" Nov 27 16:32:25 crc kubenswrapper[4707]: I1127 16:32:25.212115 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fcb8cbad-0cab-4a21-add3-7663d4d063c2" path="/var/lib/kubelet/pods/fcb8cbad-0cab-4a21-add3-7663d4d063c2/volumes" Nov 27 16:32:32 crc kubenswrapper[4707]: I1127 16:32:32.197127 4707 scope.go:117] "RemoveContainer" containerID="98accd0e2044449ac7795f23cc0c9b20e1cb9fd7b29855e0dc1d78765b3e2e59" Nov 27 16:32:32 crc kubenswrapper[4707]: E1127 16:32:32.198325 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-c995m_openshift-machine-config-operator(a83beb0d-8dd1-434a-ace2-933f98e3956f)\"" pod="openshift-machine-config-operator/machine-config-daemon-c995m" podUID="a83beb0d-8dd1-434a-ace2-933f98e3956f" Nov 27 16:32:41 crc kubenswrapper[4707]: I1127 16:32:41.041800 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-cell-mapping-krw8h"] Nov 27 16:32:41 crc kubenswrapper[4707]: I1127 16:32:41.054146 4707 kubelet.go:2431] 
"SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-cell-mapping-krw8h"] Nov 27 16:32:41 crc kubenswrapper[4707]: I1127 16:32:41.205155 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="831d6788-637b-4a22-8ed9-1f39e8e277a0" path="/var/lib/kubelet/pods/831d6788-637b-4a22-8ed9-1f39e8e277a0/volumes" Nov 27 16:32:42 crc kubenswrapper[4707]: I1127 16:32:42.033267 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-xb5wv"] Nov 27 16:32:42 crc kubenswrapper[4707]: I1127 16:32:42.043755 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-xb5wv"] Nov 27 16:32:43 crc kubenswrapper[4707]: I1127 16:32:43.209952 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e14300b4-67fd-4da5-8741-879007e38268" path="/var/lib/kubelet/pods/e14300b4-67fd-4da5-8741-879007e38268/volumes" Nov 27 16:32:46 crc kubenswrapper[4707]: I1127 16:32:46.195147 4707 scope.go:117] "RemoveContainer" containerID="98accd0e2044449ac7795f23cc0c9b20e1cb9fd7b29855e0dc1d78765b3e2e59" Nov 27 16:32:46 crc kubenswrapper[4707]: E1127 16:32:46.195655 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-c995m_openshift-machine-config-operator(a83beb0d-8dd1-434a-ace2-933f98e3956f)\"" pod="openshift-machine-config-operator/machine-config-daemon-c995m" podUID="a83beb0d-8dd1-434a-ace2-933f98e3956f" Nov 27 16:32:50 crc kubenswrapper[4707]: I1127 16:32:50.721389 4707 generic.go:334] "Generic (PLEG): container finished" podID="f3e34a79-7842-4f97-91f4-040a1b4e5b2b" containerID="99fc914707e84ffab29aead824172d8071d4d7235a19b1bb9155b4ef1d494558" exitCode=0 Nov 27 16:32:50 crc kubenswrapper[4707]: I1127 16:32:50.721414 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-wgj2d" event={"ID":"f3e34a79-7842-4f97-91f4-040a1b4e5b2b","Type":"ContainerDied","Data":"99fc914707e84ffab29aead824172d8071d4d7235a19b1bb9155b4ef1d494558"} Nov 27 16:32:52 crc kubenswrapper[4707]: I1127 16:32:52.197159 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-wgj2d" Nov 27 16:32:52 crc kubenswrapper[4707]: I1127 16:32:52.306294 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f3e34a79-7842-4f97-91f4-040a1b4e5b2b-inventory\") pod \"f3e34a79-7842-4f97-91f4-040a1b4e5b2b\" (UID: \"f3e34a79-7842-4f97-91f4-040a1b4e5b2b\") " Nov 27 16:32:52 crc kubenswrapper[4707]: I1127 16:32:52.307078 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f3e34a79-7842-4f97-91f4-040a1b4e5b2b-ssh-key\") pod \"f3e34a79-7842-4f97-91f4-040a1b4e5b2b\" (UID: \"f3e34a79-7842-4f97-91f4-040a1b4e5b2b\") " Nov 27 16:32:52 crc kubenswrapper[4707]: I1127 16:32:52.307322 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m4npj\" (UniqueName: \"kubernetes.io/projected/f3e34a79-7842-4f97-91f4-040a1b4e5b2b-kube-api-access-m4npj\") pod \"f3e34a79-7842-4f97-91f4-040a1b4e5b2b\" (UID: \"f3e34a79-7842-4f97-91f4-040a1b4e5b2b\") " Nov 27 16:32:52 crc kubenswrapper[4707]: I1127 16:32:52.312576 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f3e34a79-7842-4f97-91f4-040a1b4e5b2b-kube-api-access-m4npj" (OuterVolumeSpecName: "kube-api-access-m4npj") pod "f3e34a79-7842-4f97-91f4-040a1b4e5b2b" (UID: "f3e34a79-7842-4f97-91f4-040a1b4e5b2b"). InnerVolumeSpecName "kube-api-access-m4npj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 16:32:52 crc kubenswrapper[4707]: I1127 16:32:52.333154 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f3e34a79-7842-4f97-91f4-040a1b4e5b2b-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "f3e34a79-7842-4f97-91f4-040a1b4e5b2b" (UID: "f3e34a79-7842-4f97-91f4-040a1b4e5b2b"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 16:32:52 crc kubenswrapper[4707]: I1127 16:32:52.342489 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f3e34a79-7842-4f97-91f4-040a1b4e5b2b-inventory" (OuterVolumeSpecName: "inventory") pod "f3e34a79-7842-4f97-91f4-040a1b4e5b2b" (UID: "f3e34a79-7842-4f97-91f4-040a1b4e5b2b"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 16:32:52 crc kubenswrapper[4707]: I1127 16:32:52.429253 4707 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f3e34a79-7842-4f97-91f4-040a1b4e5b2b-ssh-key\") on node \"crc\" DevicePath \"\"" Nov 27 16:32:52 crc kubenswrapper[4707]: I1127 16:32:52.429334 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m4npj\" (UniqueName: \"kubernetes.io/projected/f3e34a79-7842-4f97-91f4-040a1b4e5b2b-kube-api-access-m4npj\") on node \"crc\" DevicePath \"\"" Nov 27 16:32:52 crc kubenswrapper[4707]: I1127 16:32:52.429356 4707 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f3e34a79-7842-4f97-91f4-040a1b4e5b2b-inventory\") on node \"crc\" DevicePath \"\"" Nov 27 16:32:52 crc kubenswrapper[4707]: I1127 16:32:52.746305 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-wgj2d" 
event={"ID":"f3e34a79-7842-4f97-91f4-040a1b4e5b2b","Type":"ContainerDied","Data":"285326576a9c7fc119e53a62ec514c0a8f04d3a2910fdf6f00fad8cc38eee9de"} Nov 27 16:32:52 crc kubenswrapper[4707]: I1127 16:32:52.746357 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="285326576a9c7fc119e53a62ec514c0a8f04d3a2910fdf6f00fad8cc38eee9de" Nov 27 16:32:52 crc kubenswrapper[4707]: I1127 16:32:52.746923 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-wgj2d" Nov 27 16:32:52 crc kubenswrapper[4707]: I1127 16:32:52.860085 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-4kq59"] Nov 27 16:32:52 crc kubenswrapper[4707]: E1127 16:32:52.860868 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fcb8cbad-0cab-4a21-add3-7663d4d063c2" containerName="registry-server" Nov 27 16:32:52 crc kubenswrapper[4707]: I1127 16:32:52.860893 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="fcb8cbad-0cab-4a21-add3-7663d4d063c2" containerName="registry-server" Nov 27 16:32:52 crc kubenswrapper[4707]: E1127 16:32:52.860909 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fcb8cbad-0cab-4a21-add3-7663d4d063c2" containerName="extract-content" Nov 27 16:32:52 crc kubenswrapper[4707]: I1127 16:32:52.860917 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="fcb8cbad-0cab-4a21-add3-7663d4d063c2" containerName="extract-content" Nov 27 16:32:52 crc kubenswrapper[4707]: E1127 16:32:52.860929 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f3e34a79-7842-4f97-91f4-040a1b4e5b2b" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Nov 27 16:32:52 crc kubenswrapper[4707]: I1127 16:32:52.860939 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="f3e34a79-7842-4f97-91f4-040a1b4e5b2b" 
containerName="install-os-edpm-deployment-openstack-edpm-ipam" Nov 27 16:32:52 crc kubenswrapper[4707]: E1127 16:32:52.860961 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fcb8cbad-0cab-4a21-add3-7663d4d063c2" containerName="extract-utilities" Nov 27 16:32:52 crc kubenswrapper[4707]: I1127 16:32:52.860969 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="fcb8cbad-0cab-4a21-add3-7663d4d063c2" containerName="extract-utilities" Nov 27 16:32:52 crc kubenswrapper[4707]: I1127 16:32:52.861181 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="f3e34a79-7842-4f97-91f4-040a1b4e5b2b" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Nov 27 16:32:52 crc kubenswrapper[4707]: I1127 16:32:52.861205 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="fcb8cbad-0cab-4a21-add3-7663d4d063c2" containerName="registry-server" Nov 27 16:32:52 crc kubenswrapper[4707]: I1127 16:32:52.862026 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-4kq59" Nov 27 16:32:52 crc kubenswrapper[4707]: I1127 16:32:52.864700 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-h79s2" Nov 27 16:32:52 crc kubenswrapper[4707]: I1127 16:32:52.865670 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Nov 27 16:32:52 crc kubenswrapper[4707]: I1127 16:32:52.866051 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Nov 27 16:32:52 crc kubenswrapper[4707]: I1127 16:32:52.866118 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Nov 27 16:32:52 crc kubenswrapper[4707]: I1127 16:32:52.876203 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-4kq59"] Nov 27 16:32:52 crc kubenswrapper[4707]: I1127 16:32:52.939474 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/26ad523c-9a7e-437b-a8e5-1b72a0a90d19-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-4kq59\" (UID: \"26ad523c-9a7e-437b-a8e5-1b72a0a90d19\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-4kq59" Nov 27 16:32:52 crc kubenswrapper[4707]: I1127 16:32:52.939567 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/26ad523c-9a7e-437b-a8e5-1b72a0a90d19-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-4kq59\" (UID: \"26ad523c-9a7e-437b-a8e5-1b72a0a90d19\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-4kq59" Nov 27 16:32:52 crc kubenswrapper[4707]: I1127 16:32:52.939662 4707 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t7x6k\" (UniqueName: \"kubernetes.io/projected/26ad523c-9a7e-437b-a8e5-1b72a0a90d19-kube-api-access-t7x6k\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-4kq59\" (UID: \"26ad523c-9a7e-437b-a8e5-1b72a0a90d19\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-4kq59" Nov 27 16:32:53 crc kubenswrapper[4707]: I1127 16:32:53.042265 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/26ad523c-9a7e-437b-a8e5-1b72a0a90d19-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-4kq59\" (UID: \"26ad523c-9a7e-437b-a8e5-1b72a0a90d19\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-4kq59" Nov 27 16:32:53 crc kubenswrapper[4707]: I1127 16:32:53.042361 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/26ad523c-9a7e-437b-a8e5-1b72a0a90d19-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-4kq59\" (UID: \"26ad523c-9a7e-437b-a8e5-1b72a0a90d19\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-4kq59" Nov 27 16:32:53 crc kubenswrapper[4707]: I1127 16:32:53.042533 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t7x6k\" (UniqueName: \"kubernetes.io/projected/26ad523c-9a7e-437b-a8e5-1b72a0a90d19-kube-api-access-t7x6k\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-4kq59\" (UID: \"26ad523c-9a7e-437b-a8e5-1b72a0a90d19\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-4kq59" Nov 27 16:32:53 crc kubenswrapper[4707]: I1127 16:32:53.046833 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/26ad523c-9a7e-437b-a8e5-1b72a0a90d19-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-4kq59\" (UID: 
\"26ad523c-9a7e-437b-a8e5-1b72a0a90d19\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-4kq59" Nov 27 16:32:53 crc kubenswrapper[4707]: I1127 16:32:53.048125 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/26ad523c-9a7e-437b-a8e5-1b72a0a90d19-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-4kq59\" (UID: \"26ad523c-9a7e-437b-a8e5-1b72a0a90d19\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-4kq59" Nov 27 16:32:53 crc kubenswrapper[4707]: I1127 16:32:53.073040 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t7x6k\" (UniqueName: \"kubernetes.io/projected/26ad523c-9a7e-437b-a8e5-1b72a0a90d19-kube-api-access-t7x6k\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-4kq59\" (UID: \"26ad523c-9a7e-437b-a8e5-1b72a0a90d19\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-4kq59" Nov 27 16:32:53 crc kubenswrapper[4707]: I1127 16:32:53.181333 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-4kq59" Nov 27 16:32:53 crc kubenswrapper[4707]: I1127 16:32:53.795615 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-4kq59"] Nov 27 16:32:54 crc kubenswrapper[4707]: I1127 16:32:54.771975 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-4kq59" event={"ID":"26ad523c-9a7e-437b-a8e5-1b72a0a90d19","Type":"ContainerStarted","Data":"ce97e1ef13cd9ce051a0952c0abd8479317311b63d11d1d27757c2d3f257070c"} Nov 27 16:32:55 crc kubenswrapper[4707]: I1127 16:32:55.789588 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-4kq59" event={"ID":"26ad523c-9a7e-437b-a8e5-1b72a0a90d19","Type":"ContainerStarted","Data":"67d85a28df44176e0afd3d5c74e87dc9d916919e1c1ed8a7f5f3c824baadcd6c"} Nov 27 16:32:55 crc kubenswrapper[4707]: I1127 16:32:55.824954 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-4kq59" podStartSLOduration=2.869606761 podStartE2EDuration="3.824926821s" podCreationTimestamp="2025-11-27 16:32:52 +0000 UTC" firstStartedPulling="2025-11-27 16:32:53.80729747 +0000 UTC m=+1749.438746248" lastFinishedPulling="2025-11-27 16:32:54.76261751 +0000 UTC m=+1750.394066308" observedRunningTime="2025-11-27 16:32:55.815719496 +0000 UTC m=+1751.447168314" watchObservedRunningTime="2025-11-27 16:32:55.824926821 +0000 UTC m=+1751.456375629" Nov 27 16:33:00 crc kubenswrapper[4707]: I1127 16:33:00.195650 4707 scope.go:117] "RemoveContainer" containerID="98accd0e2044449ac7795f23cc0c9b20e1cb9fd7b29855e0dc1d78765b3e2e59" Nov 27 16:33:00 crc kubenswrapper[4707]: E1127 16:33:00.196815 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with 
CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-c995m_openshift-machine-config-operator(a83beb0d-8dd1-434a-ace2-933f98e3956f)\"" pod="openshift-machine-config-operator/machine-config-daemon-c995m" podUID="a83beb0d-8dd1-434a-ace2-933f98e3956f" Nov 27 16:33:13 crc kubenswrapper[4707]: I1127 16:33:13.414316 4707 scope.go:117] "RemoveContainer" containerID="ae09cce4caea83674a979e764de79df6336a8740b86cf3c0607b4da1bd0031ff" Nov 27 16:33:13 crc kubenswrapper[4707]: I1127 16:33:13.491179 4707 scope.go:117] "RemoveContainer" containerID="25ab784e1dfdc9d441fbd5bd3464d8122fb0d1c63df76173637c82adef4308e9" Nov 27 16:33:13 crc kubenswrapper[4707]: I1127 16:33:13.568026 4707 scope.go:117] "RemoveContainer" containerID="7ee3574ba594912d92a16ed965dc1dc2ac99166f448892527b86ac7d840a5dab" Nov 27 16:33:15 crc kubenswrapper[4707]: I1127 16:33:15.208188 4707 scope.go:117] "RemoveContainer" containerID="98accd0e2044449ac7795f23cc0c9b20e1cb9fd7b29855e0dc1d78765b3e2e59" Nov 27 16:33:15 crc kubenswrapper[4707]: E1127 16:33:15.209120 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-c995m_openshift-machine-config-operator(a83beb0d-8dd1-434a-ace2-933f98e3956f)\"" pod="openshift-machine-config-operator/machine-config-daemon-c995m" podUID="a83beb0d-8dd1-434a-ace2-933f98e3956f" Nov 27 16:33:25 crc kubenswrapper[4707]: I1127 16:33:25.077939 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-cell-mapping-crzc9"] Nov 27 16:33:25 crc kubenswrapper[4707]: I1127 16:33:25.089401 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-cell-mapping-crzc9"] Nov 27 16:33:25 crc kubenswrapper[4707]: I1127 16:33:25.206858 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="28d07a58-b515-4c7b-a75e-ce5d89a9c9f5" path="/var/lib/kubelet/pods/28d07a58-b515-4c7b-a75e-ce5d89a9c9f5/volumes" Nov 27 16:33:29 crc kubenswrapper[4707]: I1127 16:33:29.203954 4707 scope.go:117] "RemoveContainer" containerID="98accd0e2044449ac7795f23cc0c9b20e1cb9fd7b29855e0dc1d78765b3e2e59" Nov 27 16:33:29 crc kubenswrapper[4707]: E1127 16:33:29.204662 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-c995m_openshift-machine-config-operator(a83beb0d-8dd1-434a-ace2-933f98e3956f)\"" pod="openshift-machine-config-operator/machine-config-daemon-c995m" podUID="a83beb0d-8dd1-434a-ace2-933f98e3956f" Nov 27 16:33:44 crc kubenswrapper[4707]: I1127 16:33:44.196462 4707 scope.go:117] "RemoveContainer" containerID="98accd0e2044449ac7795f23cc0c9b20e1cb9fd7b29855e0dc1d78765b3e2e59" Nov 27 16:33:44 crc kubenswrapper[4707]: E1127 16:33:44.199553 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-c995m_openshift-machine-config-operator(a83beb0d-8dd1-434a-ace2-933f98e3956f)\"" pod="openshift-machine-config-operator/machine-config-daemon-c995m" podUID="a83beb0d-8dd1-434a-ace2-933f98e3956f" Nov 27 16:33:57 crc kubenswrapper[4707]: I1127 16:33:57.495897 4707 generic.go:334] "Generic (PLEG): container finished" podID="26ad523c-9a7e-437b-a8e5-1b72a0a90d19" containerID="67d85a28df44176e0afd3d5c74e87dc9d916919e1c1ed8a7f5f3c824baadcd6c" exitCode=0 Nov 27 16:33:57 crc kubenswrapper[4707]: I1127 16:33:57.495951 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-4kq59" 
event={"ID":"26ad523c-9a7e-437b-a8e5-1b72a0a90d19","Type":"ContainerDied","Data":"67d85a28df44176e0afd3d5c74e87dc9d916919e1c1ed8a7f5f3c824baadcd6c"} Nov 27 16:33:58 crc kubenswrapper[4707]: I1127 16:33:58.195355 4707 scope.go:117] "RemoveContainer" containerID="98accd0e2044449ac7795f23cc0c9b20e1cb9fd7b29855e0dc1d78765b3e2e59" Nov 27 16:33:58 crc kubenswrapper[4707]: E1127 16:33:58.195789 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-c995m_openshift-machine-config-operator(a83beb0d-8dd1-434a-ace2-933f98e3956f)\"" pod="openshift-machine-config-operator/machine-config-daemon-c995m" podUID="a83beb0d-8dd1-434a-ace2-933f98e3956f" Nov 27 16:33:59 crc kubenswrapper[4707]: I1127 16:33:59.027830 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-4kq59" Nov 27 16:33:59 crc kubenswrapper[4707]: I1127 16:33:59.188511 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t7x6k\" (UniqueName: \"kubernetes.io/projected/26ad523c-9a7e-437b-a8e5-1b72a0a90d19-kube-api-access-t7x6k\") pod \"26ad523c-9a7e-437b-a8e5-1b72a0a90d19\" (UID: \"26ad523c-9a7e-437b-a8e5-1b72a0a90d19\") " Nov 27 16:33:59 crc kubenswrapper[4707]: I1127 16:33:59.188609 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/26ad523c-9a7e-437b-a8e5-1b72a0a90d19-inventory\") pod \"26ad523c-9a7e-437b-a8e5-1b72a0a90d19\" (UID: \"26ad523c-9a7e-437b-a8e5-1b72a0a90d19\") " Nov 27 16:33:59 crc kubenswrapper[4707]: I1127 16:33:59.188914 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/26ad523c-9a7e-437b-a8e5-1b72a0a90d19-ssh-key\") pod 
\"26ad523c-9a7e-437b-a8e5-1b72a0a90d19\" (UID: \"26ad523c-9a7e-437b-a8e5-1b72a0a90d19\") " Nov 27 16:33:59 crc kubenswrapper[4707]: I1127 16:33:59.198635 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/26ad523c-9a7e-437b-a8e5-1b72a0a90d19-kube-api-access-t7x6k" (OuterVolumeSpecName: "kube-api-access-t7x6k") pod "26ad523c-9a7e-437b-a8e5-1b72a0a90d19" (UID: "26ad523c-9a7e-437b-a8e5-1b72a0a90d19"). InnerVolumeSpecName "kube-api-access-t7x6k". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 16:33:59 crc kubenswrapper[4707]: I1127 16:33:59.218865 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/26ad523c-9a7e-437b-a8e5-1b72a0a90d19-inventory" (OuterVolumeSpecName: "inventory") pod "26ad523c-9a7e-437b-a8e5-1b72a0a90d19" (UID: "26ad523c-9a7e-437b-a8e5-1b72a0a90d19"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 16:33:59 crc kubenswrapper[4707]: I1127 16:33:59.240647 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/26ad523c-9a7e-437b-a8e5-1b72a0a90d19-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "26ad523c-9a7e-437b-a8e5-1b72a0a90d19" (UID: "26ad523c-9a7e-437b-a8e5-1b72a0a90d19"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 16:33:59 crc kubenswrapper[4707]: I1127 16:33:59.291844 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t7x6k\" (UniqueName: \"kubernetes.io/projected/26ad523c-9a7e-437b-a8e5-1b72a0a90d19-kube-api-access-t7x6k\") on node \"crc\" DevicePath \"\"" Nov 27 16:33:59 crc kubenswrapper[4707]: I1127 16:33:59.291880 4707 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/26ad523c-9a7e-437b-a8e5-1b72a0a90d19-inventory\") on node \"crc\" DevicePath \"\"" Nov 27 16:33:59 crc kubenswrapper[4707]: I1127 16:33:59.291895 4707 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/26ad523c-9a7e-437b-a8e5-1b72a0a90d19-ssh-key\") on node \"crc\" DevicePath \"\"" Nov 27 16:33:59 crc kubenswrapper[4707]: I1127 16:33:59.521911 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-4kq59" event={"ID":"26ad523c-9a7e-437b-a8e5-1b72a0a90d19","Type":"ContainerDied","Data":"ce97e1ef13cd9ce051a0952c0abd8479317311b63d11d1d27757c2d3f257070c"} Nov 27 16:33:59 crc kubenswrapper[4707]: I1127 16:33:59.521957 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ce97e1ef13cd9ce051a0952c0abd8479317311b63d11d1d27757c2d3f257070c" Nov 27 16:33:59 crc kubenswrapper[4707]: I1127 16:33:59.522003 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-4kq59" Nov 27 16:33:59 crc kubenswrapper[4707]: I1127 16:33:59.634520 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-9kfcd"] Nov 27 16:33:59 crc kubenswrapper[4707]: E1127 16:33:59.635501 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="26ad523c-9a7e-437b-a8e5-1b72a0a90d19" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Nov 27 16:33:59 crc kubenswrapper[4707]: I1127 16:33:59.635532 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="26ad523c-9a7e-437b-a8e5-1b72a0a90d19" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Nov 27 16:33:59 crc kubenswrapper[4707]: I1127 16:33:59.635818 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="26ad523c-9a7e-437b-a8e5-1b72a0a90d19" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Nov 27 16:33:59 crc kubenswrapper[4707]: I1127 16:33:59.636824 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-9kfcd" Nov 27 16:33:59 crc kubenswrapper[4707]: I1127 16:33:59.642285 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Nov 27 16:33:59 crc kubenswrapper[4707]: I1127 16:33:59.642664 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Nov 27 16:33:59 crc kubenswrapper[4707]: I1127 16:33:59.647679 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-h79s2" Nov 27 16:33:59 crc kubenswrapper[4707]: I1127 16:33:59.649361 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Nov 27 16:33:59 crc kubenswrapper[4707]: I1127 16:33:59.669155 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-9kfcd"] Nov 27 16:33:59 crc kubenswrapper[4707]: I1127 16:33:59.802177 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/29ac3bd9-0e37-4e00-aa44-a09c01019b96-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-9kfcd\" (UID: \"29ac3bd9-0e37-4e00-aa44-a09c01019b96\") " pod="openstack/ssh-known-hosts-edpm-deployment-9kfcd" Nov 27 16:33:59 crc kubenswrapper[4707]: I1127 16:33:59.802608 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9lbw6\" (UniqueName: \"kubernetes.io/projected/29ac3bd9-0e37-4e00-aa44-a09c01019b96-kube-api-access-9lbw6\") pod \"ssh-known-hosts-edpm-deployment-9kfcd\" (UID: \"29ac3bd9-0e37-4e00-aa44-a09c01019b96\") " pod="openstack/ssh-known-hosts-edpm-deployment-9kfcd" Nov 27 16:33:59 crc kubenswrapper[4707]: I1127 16:33:59.802790 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/29ac3bd9-0e37-4e00-aa44-a09c01019b96-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-9kfcd\" (UID: \"29ac3bd9-0e37-4e00-aa44-a09c01019b96\") " pod="openstack/ssh-known-hosts-edpm-deployment-9kfcd" Nov 27 16:33:59 crc kubenswrapper[4707]: I1127 16:33:59.905200 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/29ac3bd9-0e37-4e00-aa44-a09c01019b96-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-9kfcd\" (UID: \"29ac3bd9-0e37-4e00-aa44-a09c01019b96\") " pod="openstack/ssh-known-hosts-edpm-deployment-9kfcd" Nov 27 16:33:59 crc kubenswrapper[4707]: I1127 16:33:59.905297 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/29ac3bd9-0e37-4e00-aa44-a09c01019b96-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-9kfcd\" (UID: \"29ac3bd9-0e37-4e00-aa44-a09c01019b96\") " pod="openstack/ssh-known-hosts-edpm-deployment-9kfcd" Nov 27 16:33:59 crc kubenswrapper[4707]: I1127 16:33:59.905521 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9lbw6\" (UniqueName: \"kubernetes.io/projected/29ac3bd9-0e37-4e00-aa44-a09c01019b96-kube-api-access-9lbw6\") pod \"ssh-known-hosts-edpm-deployment-9kfcd\" (UID: \"29ac3bd9-0e37-4e00-aa44-a09c01019b96\") " pod="openstack/ssh-known-hosts-edpm-deployment-9kfcd" Nov 27 16:33:59 crc kubenswrapper[4707]: I1127 16:33:59.908950 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/29ac3bd9-0e37-4e00-aa44-a09c01019b96-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-9kfcd\" (UID: \"29ac3bd9-0e37-4e00-aa44-a09c01019b96\") " pod="openstack/ssh-known-hosts-edpm-deployment-9kfcd" Nov 27 16:33:59 crc kubenswrapper[4707]: I1127 16:33:59.923274 4707 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/29ac3bd9-0e37-4e00-aa44-a09c01019b96-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-9kfcd\" (UID: \"29ac3bd9-0e37-4e00-aa44-a09c01019b96\") " pod="openstack/ssh-known-hosts-edpm-deployment-9kfcd" Nov 27 16:33:59 crc kubenswrapper[4707]: I1127 16:33:59.926322 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9lbw6\" (UniqueName: \"kubernetes.io/projected/29ac3bd9-0e37-4e00-aa44-a09c01019b96-kube-api-access-9lbw6\") pod \"ssh-known-hosts-edpm-deployment-9kfcd\" (UID: \"29ac3bd9-0e37-4e00-aa44-a09c01019b96\") " pod="openstack/ssh-known-hosts-edpm-deployment-9kfcd" Nov 27 16:33:59 crc kubenswrapper[4707]: I1127 16:33:59.959314 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-9kfcd" Nov 27 16:34:00 crc kubenswrapper[4707]: I1127 16:34:00.545414 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-9kfcd"] Nov 27 16:34:01 crc kubenswrapper[4707]: I1127 16:34:01.547905 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-9kfcd" event={"ID":"29ac3bd9-0e37-4e00-aa44-a09c01019b96","Type":"ContainerStarted","Data":"898ba38dd4155308c84892e88f95de789cf04f7d3785c077484dd274b8da3740"} Nov 27 16:34:01 crc kubenswrapper[4707]: I1127 16:34:01.548405 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-9kfcd" event={"ID":"29ac3bd9-0e37-4e00-aa44-a09c01019b96","Type":"ContainerStarted","Data":"e91f5a0d2a460d00f9e34c4aa26c74b7f164e2d41adea7e31f2b4d12fd8a6378"} Nov 27 16:34:01 crc kubenswrapper[4707]: I1127 16:34:01.567788 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ssh-known-hosts-edpm-deployment-9kfcd" 
podStartSLOduration=2.048545191 podStartE2EDuration="2.567760395s" podCreationTimestamp="2025-11-27 16:33:59 +0000 UTC" firstStartedPulling="2025-11-27 16:34:00.546086635 +0000 UTC m=+1816.177535403" lastFinishedPulling="2025-11-27 16:34:01.065301819 +0000 UTC m=+1816.696750607" observedRunningTime="2025-11-27 16:34:01.562563068 +0000 UTC m=+1817.194011876" watchObservedRunningTime="2025-11-27 16:34:01.567760395 +0000 UTC m=+1817.199209193" Nov 27 16:34:09 crc kubenswrapper[4707]: I1127 16:34:09.652912 4707 generic.go:334] "Generic (PLEG): container finished" podID="29ac3bd9-0e37-4e00-aa44-a09c01019b96" containerID="898ba38dd4155308c84892e88f95de789cf04f7d3785c077484dd274b8da3740" exitCode=0 Nov 27 16:34:09 crc kubenswrapper[4707]: I1127 16:34:09.653012 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-9kfcd" event={"ID":"29ac3bd9-0e37-4e00-aa44-a09c01019b96","Type":"ContainerDied","Data":"898ba38dd4155308c84892e88f95de789cf04f7d3785c077484dd274b8da3740"} Nov 27 16:34:11 crc kubenswrapper[4707]: I1127 16:34:11.175163 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-9kfcd" Nov 27 16:34:11 crc kubenswrapper[4707]: I1127 16:34:11.370719 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/29ac3bd9-0e37-4e00-aa44-a09c01019b96-inventory-0\") pod \"29ac3bd9-0e37-4e00-aa44-a09c01019b96\" (UID: \"29ac3bd9-0e37-4e00-aa44-a09c01019b96\") " Nov 27 16:34:11 crc kubenswrapper[4707]: I1127 16:34:11.370788 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/29ac3bd9-0e37-4e00-aa44-a09c01019b96-ssh-key-openstack-edpm-ipam\") pod \"29ac3bd9-0e37-4e00-aa44-a09c01019b96\" (UID: \"29ac3bd9-0e37-4e00-aa44-a09c01019b96\") " Nov 27 16:34:11 crc kubenswrapper[4707]: I1127 16:34:11.370984 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9lbw6\" (UniqueName: \"kubernetes.io/projected/29ac3bd9-0e37-4e00-aa44-a09c01019b96-kube-api-access-9lbw6\") pod \"29ac3bd9-0e37-4e00-aa44-a09c01019b96\" (UID: \"29ac3bd9-0e37-4e00-aa44-a09c01019b96\") " Nov 27 16:34:11 crc kubenswrapper[4707]: I1127 16:34:11.376830 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/29ac3bd9-0e37-4e00-aa44-a09c01019b96-kube-api-access-9lbw6" (OuterVolumeSpecName: "kube-api-access-9lbw6") pod "29ac3bd9-0e37-4e00-aa44-a09c01019b96" (UID: "29ac3bd9-0e37-4e00-aa44-a09c01019b96"). InnerVolumeSpecName "kube-api-access-9lbw6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 16:34:11 crc kubenswrapper[4707]: I1127 16:34:11.402748 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/29ac3bd9-0e37-4e00-aa44-a09c01019b96-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "29ac3bd9-0e37-4e00-aa44-a09c01019b96" (UID: "29ac3bd9-0e37-4e00-aa44-a09c01019b96"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 16:34:11 crc kubenswrapper[4707]: I1127 16:34:11.408578 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/29ac3bd9-0e37-4e00-aa44-a09c01019b96-inventory-0" (OuterVolumeSpecName: "inventory-0") pod "29ac3bd9-0e37-4e00-aa44-a09c01019b96" (UID: "29ac3bd9-0e37-4e00-aa44-a09c01019b96"). InnerVolumeSpecName "inventory-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 16:34:11 crc kubenswrapper[4707]: I1127 16:34:11.474651 4707 reconciler_common.go:293] "Volume detached for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/29ac3bd9-0e37-4e00-aa44-a09c01019b96-inventory-0\") on node \"crc\" DevicePath \"\"" Nov 27 16:34:11 crc kubenswrapper[4707]: I1127 16:34:11.474901 4707 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/29ac3bd9-0e37-4e00-aa44-a09c01019b96-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Nov 27 16:34:11 crc kubenswrapper[4707]: I1127 16:34:11.474913 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9lbw6\" (UniqueName: \"kubernetes.io/projected/29ac3bd9-0e37-4e00-aa44-a09c01019b96-kube-api-access-9lbw6\") on node \"crc\" DevicePath \"\"" Nov 27 16:34:11 crc kubenswrapper[4707]: I1127 16:34:11.681360 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-9kfcd" 
event={"ID":"29ac3bd9-0e37-4e00-aa44-a09c01019b96","Type":"ContainerDied","Data":"e91f5a0d2a460d00f9e34c4aa26c74b7f164e2d41adea7e31f2b4d12fd8a6378"} Nov 27 16:34:11 crc kubenswrapper[4707]: I1127 16:34:11.681733 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e91f5a0d2a460d00f9e34c4aa26c74b7f164e2d41adea7e31f2b4d12fd8a6378" Nov 27 16:34:11 crc kubenswrapper[4707]: I1127 16:34:11.681470 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-9kfcd" Nov 27 16:34:11 crc kubenswrapper[4707]: I1127 16:34:11.771998 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-bf2x7"] Nov 27 16:34:11 crc kubenswrapper[4707]: E1127 16:34:11.772766 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="29ac3bd9-0e37-4e00-aa44-a09c01019b96" containerName="ssh-known-hosts-edpm-deployment" Nov 27 16:34:11 crc kubenswrapper[4707]: I1127 16:34:11.772896 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="29ac3bd9-0e37-4e00-aa44-a09c01019b96" containerName="ssh-known-hosts-edpm-deployment" Nov 27 16:34:11 crc kubenswrapper[4707]: I1127 16:34:11.773323 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="29ac3bd9-0e37-4e00-aa44-a09c01019b96" containerName="ssh-known-hosts-edpm-deployment" Nov 27 16:34:11 crc kubenswrapper[4707]: I1127 16:34:11.774220 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-bf2x7" Nov 27 16:34:11 crc kubenswrapper[4707]: I1127 16:34:11.778511 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-h79s2" Nov 27 16:34:11 crc kubenswrapper[4707]: I1127 16:34:11.778841 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Nov 27 16:34:11 crc kubenswrapper[4707]: I1127 16:34:11.778982 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Nov 27 16:34:11 crc kubenswrapper[4707]: I1127 16:34:11.781832 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dhjsr\" (UniqueName: \"kubernetes.io/projected/a03595a4-c76f-4642-b492-17f393096888-kube-api-access-dhjsr\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-bf2x7\" (UID: \"a03595a4-c76f-4642-b492-17f393096888\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-bf2x7" Nov 27 16:34:11 crc kubenswrapper[4707]: I1127 16:34:11.781983 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a03595a4-c76f-4642-b492-17f393096888-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-bf2x7\" (UID: \"a03595a4-c76f-4642-b492-17f393096888\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-bf2x7" Nov 27 16:34:11 crc kubenswrapper[4707]: I1127 16:34:11.782122 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a03595a4-c76f-4642-b492-17f393096888-ssh-key\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-bf2x7\" (UID: \"a03595a4-c76f-4642-b492-17f393096888\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-bf2x7" Nov 27 16:34:11 crc kubenswrapper[4707]: 
I1127 16:34:11.791217 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-bf2x7"] Nov 27 16:34:11 crc kubenswrapper[4707]: I1127 16:34:11.809845 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Nov 27 16:34:11 crc kubenswrapper[4707]: I1127 16:34:11.883763 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dhjsr\" (UniqueName: \"kubernetes.io/projected/a03595a4-c76f-4642-b492-17f393096888-kube-api-access-dhjsr\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-bf2x7\" (UID: \"a03595a4-c76f-4642-b492-17f393096888\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-bf2x7" Nov 27 16:34:11 crc kubenswrapper[4707]: I1127 16:34:11.884191 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a03595a4-c76f-4642-b492-17f393096888-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-bf2x7\" (UID: \"a03595a4-c76f-4642-b492-17f393096888\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-bf2x7" Nov 27 16:34:11 crc kubenswrapper[4707]: I1127 16:34:11.884585 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a03595a4-c76f-4642-b492-17f393096888-ssh-key\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-bf2x7\" (UID: \"a03595a4-c76f-4642-b492-17f393096888\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-bf2x7" Nov 27 16:34:11 crc kubenswrapper[4707]: I1127 16:34:11.889306 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a03595a4-c76f-4642-b492-17f393096888-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-bf2x7\" (UID: \"a03595a4-c76f-4642-b492-17f393096888\") " 
pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-bf2x7" Nov 27 16:34:11 crc kubenswrapper[4707]: I1127 16:34:11.891767 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a03595a4-c76f-4642-b492-17f393096888-ssh-key\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-bf2x7\" (UID: \"a03595a4-c76f-4642-b492-17f393096888\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-bf2x7" Nov 27 16:34:11 crc kubenswrapper[4707]: I1127 16:34:11.913464 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dhjsr\" (UniqueName: \"kubernetes.io/projected/a03595a4-c76f-4642-b492-17f393096888-kube-api-access-dhjsr\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-bf2x7\" (UID: \"a03595a4-c76f-4642-b492-17f393096888\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-bf2x7" Nov 27 16:34:12 crc kubenswrapper[4707]: I1127 16:34:12.127708 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-bf2x7" Nov 27 16:34:12 crc kubenswrapper[4707]: I1127 16:34:12.822271 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-bf2x7"] Nov 27 16:34:13 crc kubenswrapper[4707]: I1127 16:34:13.197571 4707 scope.go:117] "RemoveContainer" containerID="98accd0e2044449ac7795f23cc0c9b20e1cb9fd7b29855e0dc1d78765b3e2e59" Nov 27 16:34:13 crc kubenswrapper[4707]: E1127 16:34:13.200616 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-c995m_openshift-machine-config-operator(a83beb0d-8dd1-434a-ace2-933f98e3956f)\"" pod="openshift-machine-config-operator/machine-config-daemon-c995m" podUID="a83beb0d-8dd1-434a-ace2-933f98e3956f" Nov 27 16:34:13 crc kubenswrapper[4707]: I1127 16:34:13.669868 4707 scope.go:117] "RemoveContainer" containerID="dc692054023f378f46ac0708e97db7997e183cbc95de67455c38d5a3dd56bd73" Nov 27 16:34:13 crc kubenswrapper[4707]: I1127 16:34:13.701071 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-bf2x7" event={"ID":"a03595a4-c76f-4642-b492-17f393096888","Type":"ContainerStarted","Data":"4ea892566f6a66ddb425c6985ac6ff1f49e498d0f03d9fd8c46ecf382d17a56e"} Nov 27 16:34:13 crc kubenswrapper[4707]: I1127 16:34:13.701321 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-bf2x7" event={"ID":"a03595a4-c76f-4642-b492-17f393096888","Type":"ContainerStarted","Data":"23912075d62a75aa737aee46dae91f345b7925b47f002ccf169fe0f15fa2cb6f"} Nov 27 16:34:13 crc kubenswrapper[4707]: I1127 16:34:13.727695 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-bf2x7" 
podStartSLOduration=2.212971177 podStartE2EDuration="2.727671231s" podCreationTimestamp="2025-11-27 16:34:11 +0000 UTC" firstStartedPulling="2025-11-27 16:34:12.848620824 +0000 UTC m=+1828.480069612" lastFinishedPulling="2025-11-27 16:34:13.363320888 +0000 UTC m=+1828.994769666" observedRunningTime="2025-11-27 16:34:13.723621083 +0000 UTC m=+1829.355069891" watchObservedRunningTime="2025-11-27 16:34:13.727671231 +0000 UTC m=+1829.359119999" Nov 27 16:34:22 crc kubenswrapper[4707]: I1127 16:34:22.803759 4707 generic.go:334] "Generic (PLEG): container finished" podID="a03595a4-c76f-4642-b492-17f393096888" containerID="4ea892566f6a66ddb425c6985ac6ff1f49e498d0f03d9fd8c46ecf382d17a56e" exitCode=0 Nov 27 16:34:22 crc kubenswrapper[4707]: I1127 16:34:22.803890 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-bf2x7" event={"ID":"a03595a4-c76f-4642-b492-17f393096888","Type":"ContainerDied","Data":"4ea892566f6a66ddb425c6985ac6ff1f49e498d0f03d9fd8c46ecf382d17a56e"} Nov 27 16:34:24 crc kubenswrapper[4707]: I1127 16:34:24.280105 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-bf2x7" Nov 27 16:34:24 crc kubenswrapper[4707]: I1127 16:34:24.368576 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a03595a4-c76f-4642-b492-17f393096888-ssh-key\") pod \"a03595a4-c76f-4642-b492-17f393096888\" (UID: \"a03595a4-c76f-4642-b492-17f393096888\") " Nov 27 16:34:24 crc kubenswrapper[4707]: I1127 16:34:24.368726 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a03595a4-c76f-4642-b492-17f393096888-inventory\") pod \"a03595a4-c76f-4642-b492-17f393096888\" (UID: \"a03595a4-c76f-4642-b492-17f393096888\") " Nov 27 16:34:24 crc kubenswrapper[4707]: I1127 16:34:24.368811 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dhjsr\" (UniqueName: \"kubernetes.io/projected/a03595a4-c76f-4642-b492-17f393096888-kube-api-access-dhjsr\") pod \"a03595a4-c76f-4642-b492-17f393096888\" (UID: \"a03595a4-c76f-4642-b492-17f393096888\") " Nov 27 16:34:24 crc kubenswrapper[4707]: I1127 16:34:24.375805 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a03595a4-c76f-4642-b492-17f393096888-kube-api-access-dhjsr" (OuterVolumeSpecName: "kube-api-access-dhjsr") pod "a03595a4-c76f-4642-b492-17f393096888" (UID: "a03595a4-c76f-4642-b492-17f393096888"). InnerVolumeSpecName "kube-api-access-dhjsr". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 16:34:24 crc kubenswrapper[4707]: I1127 16:34:24.406153 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a03595a4-c76f-4642-b492-17f393096888-inventory" (OuterVolumeSpecName: "inventory") pod "a03595a4-c76f-4642-b492-17f393096888" (UID: "a03595a4-c76f-4642-b492-17f393096888"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 16:34:24 crc kubenswrapper[4707]: I1127 16:34:24.423767 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a03595a4-c76f-4642-b492-17f393096888-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "a03595a4-c76f-4642-b492-17f393096888" (UID: "a03595a4-c76f-4642-b492-17f393096888"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 16:34:24 crc kubenswrapper[4707]: I1127 16:34:24.471190 4707 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a03595a4-c76f-4642-b492-17f393096888-ssh-key\") on node \"crc\" DevicePath \"\"" Nov 27 16:34:24 crc kubenswrapper[4707]: I1127 16:34:24.471233 4707 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a03595a4-c76f-4642-b492-17f393096888-inventory\") on node \"crc\" DevicePath \"\"" Nov 27 16:34:24 crc kubenswrapper[4707]: I1127 16:34:24.471248 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dhjsr\" (UniqueName: \"kubernetes.io/projected/a03595a4-c76f-4642-b492-17f393096888-kube-api-access-dhjsr\") on node \"crc\" DevicePath \"\"" Nov 27 16:34:24 crc kubenswrapper[4707]: I1127 16:34:24.829977 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-bf2x7" event={"ID":"a03595a4-c76f-4642-b492-17f393096888","Type":"ContainerDied","Data":"23912075d62a75aa737aee46dae91f345b7925b47f002ccf169fe0f15fa2cb6f"} Nov 27 16:34:24 crc kubenswrapper[4707]: I1127 16:34:24.830024 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="23912075d62a75aa737aee46dae91f345b7925b47f002ccf169fe0f15fa2cb6f" Nov 27 16:34:24 crc kubenswrapper[4707]: I1127 16:34:24.830033 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-bf2x7" Nov 27 16:34:24 crc kubenswrapper[4707]: I1127 16:34:24.907334 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-xdhhm"] Nov 27 16:34:24 crc kubenswrapper[4707]: E1127 16:34:24.907794 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a03595a4-c76f-4642-b492-17f393096888" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Nov 27 16:34:24 crc kubenswrapper[4707]: I1127 16:34:24.907813 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="a03595a4-c76f-4642-b492-17f393096888" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Nov 27 16:34:24 crc kubenswrapper[4707]: I1127 16:34:24.908008 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="a03595a4-c76f-4642-b492-17f393096888" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Nov 27 16:34:24 crc kubenswrapper[4707]: I1127 16:34:24.908659 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-xdhhm" Nov 27 16:34:24 crc kubenswrapper[4707]: I1127 16:34:24.913413 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Nov 27 16:34:24 crc kubenswrapper[4707]: I1127 16:34:24.913838 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-h79s2" Nov 27 16:34:24 crc kubenswrapper[4707]: I1127 16:34:24.914040 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Nov 27 16:34:24 crc kubenswrapper[4707]: I1127 16:34:24.914446 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Nov 27 16:34:24 crc kubenswrapper[4707]: I1127 16:34:24.930050 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-xdhhm"] Nov 27 16:34:24 crc kubenswrapper[4707]: I1127 16:34:24.984865 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-66vgc\" (UniqueName: \"kubernetes.io/projected/d89ddbb0-c0d3-46a8-a81e-ce2809f1352a-kube-api-access-66vgc\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-xdhhm\" (UID: \"d89ddbb0-c0d3-46a8-a81e-ce2809f1352a\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-xdhhm" Nov 27 16:34:24 crc kubenswrapper[4707]: I1127 16:34:24.984913 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d89ddbb0-c0d3-46a8-a81e-ce2809f1352a-ssh-key\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-xdhhm\" (UID: \"d89ddbb0-c0d3-46a8-a81e-ce2809f1352a\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-xdhhm" Nov 27 16:34:24 crc kubenswrapper[4707]: I1127 16:34:24.985180 4707 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d89ddbb0-c0d3-46a8-a81e-ce2809f1352a-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-xdhhm\" (UID: \"d89ddbb0-c0d3-46a8-a81e-ce2809f1352a\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-xdhhm"
Nov 27 16:34:25 crc kubenswrapper[4707]: I1127 16:34:25.087049 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-66vgc\" (UniqueName: \"kubernetes.io/projected/d89ddbb0-c0d3-46a8-a81e-ce2809f1352a-kube-api-access-66vgc\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-xdhhm\" (UID: \"d89ddbb0-c0d3-46a8-a81e-ce2809f1352a\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-xdhhm"
Nov 27 16:34:25 crc kubenswrapper[4707]: I1127 16:34:25.087101 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d89ddbb0-c0d3-46a8-a81e-ce2809f1352a-ssh-key\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-xdhhm\" (UID: \"d89ddbb0-c0d3-46a8-a81e-ce2809f1352a\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-xdhhm"
Nov 27 16:34:25 crc kubenswrapper[4707]: I1127 16:34:25.087191 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d89ddbb0-c0d3-46a8-a81e-ce2809f1352a-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-xdhhm\" (UID: \"d89ddbb0-c0d3-46a8-a81e-ce2809f1352a\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-xdhhm"
Nov 27 16:34:25 crc kubenswrapper[4707]: I1127 16:34:25.092227 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d89ddbb0-c0d3-46a8-a81e-ce2809f1352a-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-xdhhm\" (UID: \"d89ddbb0-c0d3-46a8-a81e-ce2809f1352a\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-xdhhm"
Nov 27 16:34:25 crc kubenswrapper[4707]: I1127 16:34:25.092769 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d89ddbb0-c0d3-46a8-a81e-ce2809f1352a-ssh-key\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-xdhhm\" (UID: \"d89ddbb0-c0d3-46a8-a81e-ce2809f1352a\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-xdhhm"
Nov 27 16:34:25 crc kubenswrapper[4707]: I1127 16:34:25.109148 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-66vgc\" (UniqueName: \"kubernetes.io/projected/d89ddbb0-c0d3-46a8-a81e-ce2809f1352a-kube-api-access-66vgc\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-xdhhm\" (UID: \"d89ddbb0-c0d3-46a8-a81e-ce2809f1352a\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-xdhhm"
Nov 27 16:34:25 crc kubenswrapper[4707]: I1127 16:34:25.226126 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-xdhhm"
Nov 27 16:34:25 crc kubenswrapper[4707]: I1127 16:34:25.786039 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-xdhhm"]
Nov 27 16:34:25 crc kubenswrapper[4707]: I1127 16:34:25.841457 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-xdhhm" event={"ID":"d89ddbb0-c0d3-46a8-a81e-ce2809f1352a","Type":"ContainerStarted","Data":"96e93f14e416d0069fe790fddf7f2bd69c966e7158f9df2b453b8a271c942394"}
Nov 27 16:34:26 crc kubenswrapper[4707]: I1127 16:34:26.854971 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-xdhhm" event={"ID":"d89ddbb0-c0d3-46a8-a81e-ce2809f1352a","Type":"ContainerStarted","Data":"052c2f324abbc4282043b2ff13c413b2e9659cf886e5d4eb6eb6e94623172b8d"}
Nov 27 16:34:26 crc kubenswrapper[4707]: I1127 16:34:26.885225 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-xdhhm" podStartSLOduration=2.408219776 podStartE2EDuration="2.88520578s" podCreationTimestamp="2025-11-27 16:34:24 +0000 UTC" firstStartedPulling="2025-11-27 16:34:25.783053906 +0000 UTC m=+1841.414502684" lastFinishedPulling="2025-11-27 16:34:26.26003991 +0000 UTC m=+1841.891488688" observedRunningTime="2025-11-27 16:34:26.874067789 +0000 UTC m=+1842.505516587" watchObservedRunningTime="2025-11-27 16:34:26.88520578 +0000 UTC m=+1842.516654558"
Nov 27 16:34:27 crc kubenswrapper[4707]: I1127 16:34:27.195477 4707 scope.go:117] "RemoveContainer" containerID="98accd0e2044449ac7795f23cc0c9b20e1cb9fd7b29855e0dc1d78765b3e2e59"
Nov 27 16:34:27 crc kubenswrapper[4707]: E1127 16:34:27.195748 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-c995m_openshift-machine-config-operator(a83beb0d-8dd1-434a-ace2-933f98e3956f)\"" pod="openshift-machine-config-operator/machine-config-daemon-c995m" podUID="a83beb0d-8dd1-434a-ace2-933f98e3956f"
Nov 27 16:34:36 crc kubenswrapper[4707]: I1127 16:34:36.969764 4707 generic.go:334] "Generic (PLEG): container finished" podID="d89ddbb0-c0d3-46a8-a81e-ce2809f1352a" containerID="052c2f324abbc4282043b2ff13c413b2e9659cf886e5d4eb6eb6e94623172b8d" exitCode=0
Nov 27 16:34:36 crc kubenswrapper[4707]: I1127 16:34:36.969897 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-xdhhm" event={"ID":"d89ddbb0-c0d3-46a8-a81e-ce2809f1352a","Type":"ContainerDied","Data":"052c2f324abbc4282043b2ff13c413b2e9659cf886e5d4eb6eb6e94623172b8d"}
Nov 27 16:34:38 crc kubenswrapper[4707]: I1127 16:34:38.547618 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-xdhhm"
Nov 27 16:34:38 crc kubenswrapper[4707]: I1127 16:34:38.631514 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-66vgc\" (UniqueName: \"kubernetes.io/projected/d89ddbb0-c0d3-46a8-a81e-ce2809f1352a-kube-api-access-66vgc\") pod \"d89ddbb0-c0d3-46a8-a81e-ce2809f1352a\" (UID: \"d89ddbb0-c0d3-46a8-a81e-ce2809f1352a\") "
Nov 27 16:34:38 crc kubenswrapper[4707]: I1127 16:34:38.631594 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d89ddbb0-c0d3-46a8-a81e-ce2809f1352a-ssh-key\") pod \"d89ddbb0-c0d3-46a8-a81e-ce2809f1352a\" (UID: \"d89ddbb0-c0d3-46a8-a81e-ce2809f1352a\") "
Nov 27 16:34:38 crc kubenswrapper[4707]: I1127 16:34:38.631713 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d89ddbb0-c0d3-46a8-a81e-ce2809f1352a-inventory\") pod \"d89ddbb0-c0d3-46a8-a81e-ce2809f1352a\" (UID: \"d89ddbb0-c0d3-46a8-a81e-ce2809f1352a\") "
Nov 27 16:34:38 crc kubenswrapper[4707]: I1127 16:34:38.637159 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d89ddbb0-c0d3-46a8-a81e-ce2809f1352a-kube-api-access-66vgc" (OuterVolumeSpecName: "kube-api-access-66vgc") pod "d89ddbb0-c0d3-46a8-a81e-ce2809f1352a" (UID: "d89ddbb0-c0d3-46a8-a81e-ce2809f1352a"). InnerVolumeSpecName "kube-api-access-66vgc". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 27 16:34:38 crc kubenswrapper[4707]: I1127 16:34:38.681116 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d89ddbb0-c0d3-46a8-a81e-ce2809f1352a-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "d89ddbb0-c0d3-46a8-a81e-ce2809f1352a" (UID: "d89ddbb0-c0d3-46a8-a81e-ce2809f1352a"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 27 16:34:38 crc kubenswrapper[4707]: I1127 16:34:38.682576 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d89ddbb0-c0d3-46a8-a81e-ce2809f1352a-inventory" (OuterVolumeSpecName: "inventory") pod "d89ddbb0-c0d3-46a8-a81e-ce2809f1352a" (UID: "d89ddbb0-c0d3-46a8-a81e-ce2809f1352a"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 27 16:34:38 crc kubenswrapper[4707]: I1127 16:34:38.733541 4707 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d89ddbb0-c0d3-46a8-a81e-ce2809f1352a-inventory\") on node \"crc\" DevicePath \"\""
Nov 27 16:34:38 crc kubenswrapper[4707]: I1127 16:34:38.733580 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-66vgc\" (UniqueName: \"kubernetes.io/projected/d89ddbb0-c0d3-46a8-a81e-ce2809f1352a-kube-api-access-66vgc\") on node \"crc\" DevicePath \"\""
Nov 27 16:34:38 crc kubenswrapper[4707]: I1127 16:34:38.733595 4707 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d89ddbb0-c0d3-46a8-a81e-ce2809f1352a-ssh-key\") on node \"crc\" DevicePath \"\""
Nov 27 16:34:38 crc kubenswrapper[4707]: I1127 16:34:38.991338 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-xdhhm" event={"ID":"d89ddbb0-c0d3-46a8-a81e-ce2809f1352a","Type":"ContainerDied","Data":"96e93f14e416d0069fe790fddf7f2bd69c966e7158f9df2b453b8a271c942394"}
Nov 27 16:34:38 crc kubenswrapper[4707]: I1127 16:34:38.991396 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="96e93f14e416d0069fe790fddf7f2bd69c966e7158f9df2b453b8a271c942394"
Nov 27 16:34:38 crc kubenswrapper[4707]: I1127 16:34:38.991480 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-xdhhm"
Nov 27 16:34:39 crc kubenswrapper[4707]: I1127 16:34:39.136735 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-bf29c"]
Nov 27 16:34:39 crc kubenswrapper[4707]: E1127 16:34:39.137475 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d89ddbb0-c0d3-46a8-a81e-ce2809f1352a" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam"
Nov 27 16:34:39 crc kubenswrapper[4707]: I1127 16:34:39.137570 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="d89ddbb0-c0d3-46a8-a81e-ce2809f1352a" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam"
Nov 27 16:34:39 crc kubenswrapper[4707]: I1127 16:34:39.138106 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="d89ddbb0-c0d3-46a8-a81e-ce2809f1352a" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam"
Nov 27 16:34:39 crc kubenswrapper[4707]: I1127 16:34:39.139127 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-bf29c"
Nov 27 16:34:39 crc kubenswrapper[4707]: I1127 16:34:39.142193 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Nov 27 16:34:39 crc kubenswrapper[4707]: I1127 16:34:39.142583 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-neutron-metadata-default-certs-0"
Nov 27 16:34:39 crc kubenswrapper[4707]: I1127 16:34:39.142708 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Nov 27 16:34:39 crc kubenswrapper[4707]: I1127 16:34:39.142817 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-libvirt-default-certs-0"
Nov 27 16:34:39 crc kubenswrapper[4707]: I1127 16:34:39.143062 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7382c94e-e799-4343-8548-7efd92ed66e8-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-bf29c\" (UID: \"7382c94e-e799-4343-8548-7efd92ed66e8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-bf29c"
Nov 27 16:34:39 crc kubenswrapper[4707]: I1127 16:34:39.143100 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7382c94e-e799-4343-8548-7efd92ed66e8-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-bf29c\" (UID: \"7382c94e-e799-4343-8548-7efd92ed66e8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-bf29c"
Nov 27 16:34:39 crc kubenswrapper[4707]: I1127 16:34:39.143122 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7382c94e-e799-4343-8548-7efd92ed66e8-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-bf29c\" (UID: \"7382c94e-e799-4343-8548-7efd92ed66e8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-bf29c"
Nov 27 16:34:39 crc kubenswrapper[4707]: I1127 16:34:39.143144 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7382c94e-e799-4343-8548-7efd92ed66e8-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-bf29c\" (UID: \"7382c94e-e799-4343-8548-7efd92ed66e8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-bf29c"
Nov 27 16:34:39 crc kubenswrapper[4707]: I1127 16:34:39.143163 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7382c94e-e799-4343-8548-7efd92ed66e8-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-bf29c\" (UID: \"7382c94e-e799-4343-8548-7efd92ed66e8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-bf29c"
Nov 27 16:34:39 crc kubenswrapper[4707]: I1127 16:34:39.143177 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/7382c94e-e799-4343-8548-7efd92ed66e8-ssh-key\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-bf29c\" (UID: \"7382c94e-e799-4343-8548-7efd92ed66e8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-bf29c"
Nov 27 16:34:39 crc kubenswrapper[4707]: I1127 16:34:39.143253 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8xbx6\" (UniqueName: \"kubernetes.io/projected/7382c94e-e799-4343-8548-7efd92ed66e8-kube-api-access-8xbx6\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-bf29c\" (UID: \"7382c94e-e799-4343-8548-7efd92ed66e8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-bf29c"
Nov 27 16:34:39 crc kubenswrapper[4707]: I1127 16:34:39.143334 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/7382c94e-e799-4343-8548-7efd92ed66e8-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-bf29c\" (UID: \"7382c94e-e799-4343-8548-7efd92ed66e8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-bf29c"
Nov 27 16:34:39 crc kubenswrapper[4707]: I1127 16:34:39.143402 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7382c94e-e799-4343-8548-7efd92ed66e8-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-bf29c\" (UID: \"7382c94e-e799-4343-8548-7efd92ed66e8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-bf29c"
Nov 27 16:34:39 crc kubenswrapper[4707]: I1127 16:34:39.143428 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/7382c94e-e799-4343-8548-7efd92ed66e8-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-bf29c\" (UID: \"7382c94e-e799-4343-8548-7efd92ed66e8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-bf29c"
Nov 27 16:34:39 crc kubenswrapper[4707]: I1127 16:34:39.143463 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/7382c94e-e799-4343-8548-7efd92ed66e8-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-bf29c\" (UID: \"7382c94e-e799-4343-8548-7efd92ed66e8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-bf29c"
Nov 27 16:34:39 crc kubenswrapper[4707]: I1127 16:34:39.143503 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/7382c94e-e799-4343-8548-7efd92ed66e8-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-bf29c\" (UID: \"7382c94e-e799-4343-8548-7efd92ed66e8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-bf29c"
Nov 27 16:34:39 crc kubenswrapper[4707]: I1127 16:34:39.143546 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7382c94e-e799-4343-8548-7efd92ed66e8-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-bf29c\" (UID: \"7382c94e-e799-4343-8548-7efd92ed66e8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-bf29c"
Nov 27 16:34:39 crc kubenswrapper[4707]: I1127 16:34:39.143597 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7382c94e-e799-4343-8548-7efd92ed66e8-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-bf29c\" (UID: \"7382c94e-e799-4343-8548-7efd92ed66e8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-bf29c"
Nov 27 16:34:39 crc kubenswrapper[4707]: I1127 16:34:39.144104 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-h79s2"
Nov 27 16:34:39 crc kubenswrapper[4707]: I1127 16:34:39.144212 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-ovn-default-certs-0"
Nov 27 16:34:39 crc kubenswrapper[4707]: I1127 16:34:39.147049 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Nov 27 16:34:39 crc kubenswrapper[4707]: I1127 16:34:39.147189 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-telemetry-default-certs-0"
Nov 27 16:34:39 crc kubenswrapper[4707]: I1127 16:34:39.161499 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-bf29c"]
Nov 27 16:34:39 crc kubenswrapper[4707]: I1127 16:34:39.195300 4707 scope.go:117] "RemoveContainer" containerID="98accd0e2044449ac7795f23cc0c9b20e1cb9fd7b29855e0dc1d78765b3e2e59"
Nov 27 16:34:39 crc kubenswrapper[4707]: E1127 16:34:39.195544 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-c995m_openshift-machine-config-operator(a83beb0d-8dd1-434a-ace2-933f98e3956f)\"" pod="openshift-machine-config-operator/machine-config-daemon-c995m" podUID="a83beb0d-8dd1-434a-ace2-933f98e3956f"
Nov 27 16:34:39 crc kubenswrapper[4707]: I1127 16:34:39.245252 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7382c94e-e799-4343-8548-7efd92ed66e8-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-bf29c\" (UID: \"7382c94e-e799-4343-8548-7efd92ed66e8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-bf29c"
Nov 27 16:34:39 crc kubenswrapper[4707]: I1127 16:34:39.245314 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7382c94e-e799-4343-8548-7efd92ed66e8-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-bf29c\" (UID: \"7382c94e-e799-4343-8548-7efd92ed66e8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-bf29c"
Nov 27 16:34:39 crc kubenswrapper[4707]: I1127 16:34:39.245357 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7382c94e-e799-4343-8548-7efd92ed66e8-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-bf29c\" (UID: \"7382c94e-e799-4343-8548-7efd92ed66e8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-bf29c"
Nov 27 16:34:39 crc kubenswrapper[4707]: I1127 16:34:39.245393 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7382c94e-e799-4343-8548-7efd92ed66e8-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-bf29c\" (UID: \"7382c94e-e799-4343-8548-7efd92ed66e8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-bf29c"
Nov 27 16:34:39 crc kubenswrapper[4707]: I1127 16:34:39.245417 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7382c94e-e799-4343-8548-7efd92ed66e8-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-bf29c\" (UID: \"7382c94e-e799-4343-8548-7efd92ed66e8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-bf29c"
Nov 27 16:34:39 crc kubenswrapper[4707]: I1127 16:34:39.245447 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7382c94e-e799-4343-8548-7efd92ed66e8-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-bf29c\" (UID: \"7382c94e-e799-4343-8548-7efd92ed66e8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-bf29c"
Nov 27 16:34:39 crc kubenswrapper[4707]: I1127 16:34:39.245476 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7382c94e-e799-4343-8548-7efd92ed66e8-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-bf29c\" (UID: \"7382c94e-e799-4343-8548-7efd92ed66e8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-bf29c"
Nov 27 16:34:39 crc kubenswrapper[4707]: I1127 16:34:39.245498 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/7382c94e-e799-4343-8548-7efd92ed66e8-ssh-key\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-bf29c\" (UID: \"7382c94e-e799-4343-8548-7efd92ed66e8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-bf29c"
Nov 27 16:34:39 crc kubenswrapper[4707]: I1127 16:34:39.245549 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8xbx6\" (UniqueName: \"kubernetes.io/projected/7382c94e-e799-4343-8548-7efd92ed66e8-kube-api-access-8xbx6\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-bf29c\" (UID: \"7382c94e-e799-4343-8548-7efd92ed66e8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-bf29c"
Nov 27 16:34:39 crc kubenswrapper[4707]: I1127 16:34:39.245636 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/7382c94e-e799-4343-8548-7efd92ed66e8-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-bf29c\" (UID: \"7382c94e-e799-4343-8548-7efd92ed66e8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-bf29c"
Nov 27 16:34:39 crc kubenswrapper[4707]: I1127 16:34:39.245712 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7382c94e-e799-4343-8548-7efd92ed66e8-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-bf29c\" (UID: \"7382c94e-e799-4343-8548-7efd92ed66e8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-bf29c"
Nov 27 16:34:39 crc kubenswrapper[4707]: I1127 16:34:39.245740 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/7382c94e-e799-4343-8548-7efd92ed66e8-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-bf29c\" (UID: \"7382c94e-e799-4343-8548-7efd92ed66e8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-bf29c"
Nov 27 16:34:39 crc kubenswrapper[4707]: I1127 16:34:39.245774 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/7382c94e-e799-4343-8548-7efd92ed66e8-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-bf29c\" (UID: \"7382c94e-e799-4343-8548-7efd92ed66e8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-bf29c"
Nov 27 16:34:39 crc kubenswrapper[4707]: I1127 16:34:39.245825 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/7382c94e-e799-4343-8548-7efd92ed66e8-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-bf29c\" (UID: \"7382c94e-e799-4343-8548-7efd92ed66e8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-bf29c"
Nov 27 16:34:39 crc kubenswrapper[4707]: I1127 16:34:39.248989 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7382c94e-e799-4343-8548-7efd92ed66e8-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-bf29c\" (UID: \"7382c94e-e799-4343-8548-7efd92ed66e8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-bf29c"
Nov 27 16:34:39 crc kubenswrapper[4707]: I1127 16:34:39.250222 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7382c94e-e799-4343-8548-7efd92ed66e8-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-bf29c\" (UID: \"7382c94e-e799-4343-8548-7efd92ed66e8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-bf29c"
Nov 27 16:34:39 crc kubenswrapper[4707]: I1127 16:34:39.252943 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7382c94e-e799-4343-8548-7efd92ed66e8-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-bf29c\" (UID: \"7382c94e-e799-4343-8548-7efd92ed66e8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-bf29c"
Nov 27 16:34:39 crc kubenswrapper[4707]: I1127 16:34:39.254880 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7382c94e-e799-4343-8548-7efd92ed66e8-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-bf29c\" (UID: \"7382c94e-e799-4343-8548-7efd92ed66e8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-bf29c"
Nov 27 16:34:39 crc kubenswrapper[4707]: I1127 16:34:39.255053 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/7382c94e-e799-4343-8548-7efd92ed66e8-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-bf29c\" (UID: \"7382c94e-e799-4343-8548-7efd92ed66e8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-bf29c"
Nov 27 16:34:39 crc kubenswrapper[4707]: I1127 16:34:39.255200 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7382c94e-e799-4343-8548-7efd92ed66e8-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-bf29c\" (UID: \"7382c94e-e799-4343-8548-7efd92ed66e8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-bf29c"
Nov 27 16:34:39 crc kubenswrapper[4707]: I1127 16:34:39.255760 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7382c94e-e799-4343-8548-7efd92ed66e8-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-bf29c\" (UID: \"7382c94e-e799-4343-8548-7efd92ed66e8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-bf29c"
Nov 27 16:34:39 crc kubenswrapper[4707]: I1127 16:34:39.255778 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/7382c94e-e799-4343-8548-7efd92ed66e8-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-bf29c\" (UID: \"7382c94e-e799-4343-8548-7efd92ed66e8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-bf29c"
Nov 27 16:34:39 crc kubenswrapper[4707]: I1127 16:34:39.256674 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/7382c94e-e799-4343-8548-7efd92ed66e8-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-bf29c\" (UID: \"7382c94e-e799-4343-8548-7efd92ed66e8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-bf29c"
Nov 27 16:34:39 crc kubenswrapper[4707]: I1127 16:34:39.256763 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7382c94e-e799-4343-8548-7efd92ed66e8-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-bf29c\" (UID: \"7382c94e-e799-4343-8548-7efd92ed66e8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-bf29c"
Nov 27 16:34:39 crc kubenswrapper[4707]: I1127 16:34:39.257210 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7382c94e-e799-4343-8548-7efd92ed66e8-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-bf29c\" (UID: \"7382c94e-e799-4343-8548-7efd92ed66e8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-bf29c"
Nov 27 16:34:39 crc kubenswrapper[4707]: I1127 16:34:39.258612 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/7382c94e-e799-4343-8548-7efd92ed66e8-ssh-key\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-bf29c\" (UID: \"7382c94e-e799-4343-8548-7efd92ed66e8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-bf29c"
Nov 27 16:34:39 crc kubenswrapper[4707]: I1127 16:34:39.266462 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/7382c94e-e799-4343-8548-7efd92ed66e8-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-bf29c\" (UID: \"7382c94e-e799-4343-8548-7efd92ed66e8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-bf29c"
Nov 27 16:34:39 crc kubenswrapper[4707]: I1127 16:34:39.269726 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8xbx6\" (UniqueName: \"kubernetes.io/projected/7382c94e-e799-4343-8548-7efd92ed66e8-kube-api-access-8xbx6\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-bf29c\" (UID: \"7382c94e-e799-4343-8548-7efd92ed66e8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-bf29c"
Nov 27 16:34:39 crc kubenswrapper[4707]: I1127 16:34:39.463602 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-bf29c"
Nov 27 16:34:40 crc kubenswrapper[4707]: I1127 16:34:40.001705 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-bf29c"]
Nov 27 16:34:41 crc kubenswrapper[4707]: I1127 16:34:41.023489 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-bf29c" event={"ID":"7382c94e-e799-4343-8548-7efd92ed66e8","Type":"ContainerStarted","Data":"17b0ac4513e55ae6269bad86cff7fbc2693ee67b81101698fd2388a11742a1d0"}
Nov 27 16:34:41 crc kubenswrapper[4707]: I1127 16:34:41.024061 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-bf29c" event={"ID":"7382c94e-e799-4343-8548-7efd92ed66e8","Type":"ContainerStarted","Data":"bd0ed09acd8bc5b3c75d9ba42085ca82a8cdc756d6c8ce6f087d07f076e98a67"}
Nov 27 16:34:41 crc kubenswrapper[4707]: I1127 16:34:41.053176 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-bf29c" podStartSLOduration=1.345271543 podStartE2EDuration="2.053152113s" podCreationTimestamp="2025-11-27 16:34:39 +0000 UTC" firstStartedPulling="2025-11-27 16:34:40.0075625 +0000 UTC m=+1855.639011298" lastFinishedPulling="2025-11-27 16:34:40.7154431 +0000 UTC m=+1856.346891868" observedRunningTime="2025-11-27 16:34:41.044047411 +0000 UTC m=+1856.675496179" watchObservedRunningTime="2025-11-27 16:34:41.053152113 +0000 UTC m=+1856.684600881"
Nov 27 16:34:50 crc kubenswrapper[4707]: I1127 16:34:50.195194 4707 scope.go:117] "RemoveContainer" containerID="98accd0e2044449ac7795f23cc0c9b20e1cb9fd7b29855e0dc1d78765b3e2e59"
Nov 27 16:34:50 crc kubenswrapper[4707]: E1127 16:34:50.197133 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-c995m_openshift-machine-config-operator(a83beb0d-8dd1-434a-ace2-933f98e3956f)\"" pod="openshift-machine-config-operator/machine-config-daemon-c995m" podUID="a83beb0d-8dd1-434a-ace2-933f98e3956f"
Nov 27 16:35:01 crc kubenswrapper[4707]: I1127 16:35:01.195699 4707 scope.go:117] "RemoveContainer" containerID="98accd0e2044449ac7795f23cc0c9b20e1cb9fd7b29855e0dc1d78765b3e2e59"
Nov 27 16:35:01 crc kubenswrapper[4707]: E1127 16:35:01.196800 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-c995m_openshift-machine-config-operator(a83beb0d-8dd1-434a-ace2-933f98e3956f)\"" pod="openshift-machine-config-operator/machine-config-daemon-c995m" podUID="a83beb0d-8dd1-434a-ace2-933f98e3956f"
Nov 27 16:35:15 crc kubenswrapper[4707]: I1127 16:35:15.207045 4707 scope.go:117] "RemoveContainer" containerID="98accd0e2044449ac7795f23cc0c9b20e1cb9fd7b29855e0dc1d78765b3e2e59"
Nov 27 16:35:15 crc kubenswrapper[4707]: E1127 16:35:15.207857 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-c995m_openshift-machine-config-operator(a83beb0d-8dd1-434a-ace2-933f98e3956f)\"" pod="openshift-machine-config-operator/machine-config-daemon-c995m" podUID="a83beb0d-8dd1-434a-ace2-933f98e3956f"
Nov 27 16:35:23 crc kubenswrapper[4707]: I1127 16:35:23.458008 4707 generic.go:334] "Generic (PLEG): container finished" podID="7382c94e-e799-4343-8548-7efd92ed66e8" containerID="17b0ac4513e55ae6269bad86cff7fbc2693ee67b81101698fd2388a11742a1d0" exitCode=0
Nov 27 16:35:23 crc kubenswrapper[4707]: I1127 16:35:23.458133 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-bf29c" event={"ID":"7382c94e-e799-4343-8548-7efd92ed66e8","Type":"ContainerDied","Data":"17b0ac4513e55ae6269bad86cff7fbc2693ee67b81101698fd2388a11742a1d0"}
Nov 27 16:35:24 crc kubenswrapper[4707]: I1127 16:35:24.959119 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-bf29c"
Nov 27 16:35:25 crc kubenswrapper[4707]: I1127 16:35:25.112777 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7382c94e-e799-4343-8548-7efd92ed66e8-repo-setup-combined-ca-bundle\") pod \"7382c94e-e799-4343-8548-7efd92ed66e8\" (UID: \"7382c94e-e799-4343-8548-7efd92ed66e8\") "
Nov 27 16:35:25 crc kubenswrapper[4707]: I1127 16:35:25.112816 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7382c94e-e799-4343-8548-7efd92ed66e8-telemetry-combined-ca-bundle\") pod \"7382c94e-e799-4343-8548-7efd92ed66e8\" (UID: \"7382c94e-e799-4343-8548-7efd92ed66e8\") "
Nov 27 16:35:25 crc kubenswrapper[4707]: I1127 16:35:25.112852 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/7382c94e-e799-4343-8548-7efd92ed66e8-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"7382c94e-e799-4343-8548-7efd92ed66e8\" (UID: \"7382c94e-e799-4343-8548-7efd92ed66e8\") "
Nov 27 16:35:25 crc kubenswrapper[4707]: I1127 16:35:25.112895 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/7382c94e-e799-4343-8548-7efd92ed66e8-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"7382c94e-e799-4343-8548-7efd92ed66e8\" (UID: \"7382c94e-e799-4343-8548-7efd92ed66e8\") "
Nov 27 16:35:25 crc kubenswrapper[4707]: I1127 16:35:25.112929 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7382c94e-e799-4343-8548-7efd92ed66e8-inventory\") pod \"7382c94e-e799-4343-8548-7efd92ed66e8\" (UID: 
\"7382c94e-e799-4343-8548-7efd92ed66e8\") " Nov 27 16:35:25 crc kubenswrapper[4707]: I1127 16:35:25.112945 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/7382c94e-e799-4343-8548-7efd92ed66e8-ssh-key\") pod \"7382c94e-e799-4343-8548-7efd92ed66e8\" (UID: \"7382c94e-e799-4343-8548-7efd92ed66e8\") " Nov 27 16:35:25 crc kubenswrapper[4707]: I1127 16:35:25.112961 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8xbx6\" (UniqueName: \"kubernetes.io/projected/7382c94e-e799-4343-8548-7efd92ed66e8-kube-api-access-8xbx6\") pod \"7382c94e-e799-4343-8548-7efd92ed66e8\" (UID: \"7382c94e-e799-4343-8548-7efd92ed66e8\") " Nov 27 16:35:25 crc kubenswrapper[4707]: I1127 16:35:25.113001 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7382c94e-e799-4343-8548-7efd92ed66e8-ovn-combined-ca-bundle\") pod \"7382c94e-e799-4343-8548-7efd92ed66e8\" (UID: \"7382c94e-e799-4343-8548-7efd92ed66e8\") " Nov 27 16:35:25 crc kubenswrapper[4707]: I1127 16:35:25.113024 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/7382c94e-e799-4343-8548-7efd92ed66e8-openstack-edpm-ipam-ovn-default-certs-0\") pod \"7382c94e-e799-4343-8548-7efd92ed66e8\" (UID: \"7382c94e-e799-4343-8548-7efd92ed66e8\") " Nov 27 16:35:25 crc kubenswrapper[4707]: I1127 16:35:25.113076 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7382c94e-e799-4343-8548-7efd92ed66e8-libvirt-combined-ca-bundle\") pod \"7382c94e-e799-4343-8548-7efd92ed66e8\" (UID: \"7382c94e-e799-4343-8548-7efd92ed66e8\") " Nov 27 16:35:25 crc kubenswrapper[4707]: I1127 16:35:25.113094 4707 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7382c94e-e799-4343-8548-7efd92ed66e8-bootstrap-combined-ca-bundle\") pod \"7382c94e-e799-4343-8548-7efd92ed66e8\" (UID: \"7382c94e-e799-4343-8548-7efd92ed66e8\") " Nov 27 16:35:25 crc kubenswrapper[4707]: I1127 16:35:25.113125 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/7382c94e-e799-4343-8548-7efd92ed66e8-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"7382c94e-e799-4343-8548-7efd92ed66e8\" (UID: \"7382c94e-e799-4343-8548-7efd92ed66e8\") " Nov 27 16:35:25 crc kubenswrapper[4707]: I1127 16:35:25.113155 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7382c94e-e799-4343-8548-7efd92ed66e8-neutron-metadata-combined-ca-bundle\") pod \"7382c94e-e799-4343-8548-7efd92ed66e8\" (UID: \"7382c94e-e799-4343-8548-7efd92ed66e8\") " Nov 27 16:35:25 crc kubenswrapper[4707]: I1127 16:35:25.113170 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7382c94e-e799-4343-8548-7efd92ed66e8-nova-combined-ca-bundle\") pod \"7382c94e-e799-4343-8548-7efd92ed66e8\" (UID: \"7382c94e-e799-4343-8548-7efd92ed66e8\") " Nov 27 16:35:25 crc kubenswrapper[4707]: I1127 16:35:25.119959 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7382c94e-e799-4343-8548-7efd92ed66e8-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "7382c94e-e799-4343-8548-7efd92ed66e8" (UID: "7382c94e-e799-4343-8548-7efd92ed66e8"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 16:35:25 crc kubenswrapper[4707]: I1127 16:35:25.120052 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7382c94e-e799-4343-8548-7efd92ed66e8-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "7382c94e-e799-4343-8548-7efd92ed66e8" (UID: "7382c94e-e799-4343-8548-7efd92ed66e8"). InnerVolumeSpecName "nova-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 16:35:25 crc kubenswrapper[4707]: I1127 16:35:25.120053 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7382c94e-e799-4343-8548-7efd92ed66e8-openstack-edpm-ipam-libvirt-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-libvirt-default-certs-0") pod "7382c94e-e799-4343-8548-7efd92ed66e8" (UID: "7382c94e-e799-4343-8548-7efd92ed66e8"). InnerVolumeSpecName "openstack-edpm-ipam-libvirt-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 16:35:25 crc kubenswrapper[4707]: I1127 16:35:25.120129 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7382c94e-e799-4343-8548-7efd92ed66e8-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "7382c94e-e799-4343-8548-7efd92ed66e8" (UID: "7382c94e-e799-4343-8548-7efd92ed66e8"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 16:35:25 crc kubenswrapper[4707]: I1127 16:35:25.120161 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7382c94e-e799-4343-8548-7efd92ed66e8-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "7382c94e-e799-4343-8548-7efd92ed66e8" (UID: "7382c94e-e799-4343-8548-7efd92ed66e8"). InnerVolumeSpecName "telemetry-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 16:35:25 crc kubenswrapper[4707]: I1127 16:35:25.120400 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7382c94e-e799-4343-8548-7efd92ed66e8-kube-api-access-8xbx6" (OuterVolumeSpecName: "kube-api-access-8xbx6") pod "7382c94e-e799-4343-8548-7efd92ed66e8" (UID: "7382c94e-e799-4343-8548-7efd92ed66e8"). InnerVolumeSpecName "kube-api-access-8xbx6". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 16:35:25 crc kubenswrapper[4707]: I1127 16:35:25.120694 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7382c94e-e799-4343-8548-7efd92ed66e8-openstack-edpm-ipam-telemetry-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-telemetry-default-certs-0") pod "7382c94e-e799-4343-8548-7efd92ed66e8" (UID: "7382c94e-e799-4343-8548-7efd92ed66e8"). InnerVolumeSpecName "openstack-edpm-ipam-telemetry-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 16:35:25 crc kubenswrapper[4707]: I1127 16:35:25.120807 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7382c94e-e799-4343-8548-7efd92ed66e8-openstack-edpm-ipam-ovn-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-ovn-default-certs-0") pod "7382c94e-e799-4343-8548-7efd92ed66e8" (UID: "7382c94e-e799-4343-8548-7efd92ed66e8"). InnerVolumeSpecName "openstack-edpm-ipam-ovn-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 16:35:25 crc kubenswrapper[4707]: I1127 16:35:25.121459 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7382c94e-e799-4343-8548-7efd92ed66e8-openstack-edpm-ipam-neutron-metadata-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-neutron-metadata-default-certs-0") pod "7382c94e-e799-4343-8548-7efd92ed66e8" (UID: "7382c94e-e799-4343-8548-7efd92ed66e8"). 
InnerVolumeSpecName "openstack-edpm-ipam-neutron-metadata-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 16:35:25 crc kubenswrapper[4707]: I1127 16:35:25.122394 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7382c94e-e799-4343-8548-7efd92ed66e8-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "7382c94e-e799-4343-8548-7efd92ed66e8" (UID: "7382c94e-e799-4343-8548-7efd92ed66e8"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 16:35:25 crc kubenswrapper[4707]: I1127 16:35:25.122804 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7382c94e-e799-4343-8548-7efd92ed66e8-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "7382c94e-e799-4343-8548-7efd92ed66e8" (UID: "7382c94e-e799-4343-8548-7efd92ed66e8"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 16:35:25 crc kubenswrapper[4707]: I1127 16:35:25.124843 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7382c94e-e799-4343-8548-7efd92ed66e8-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "7382c94e-e799-4343-8548-7efd92ed66e8" (UID: "7382c94e-e799-4343-8548-7efd92ed66e8"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 16:35:25 crc kubenswrapper[4707]: I1127 16:35:25.142136 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7382c94e-e799-4343-8548-7efd92ed66e8-inventory" (OuterVolumeSpecName: "inventory") pod "7382c94e-e799-4343-8548-7efd92ed66e8" (UID: "7382c94e-e799-4343-8548-7efd92ed66e8"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 16:35:25 crc kubenswrapper[4707]: I1127 16:35:25.144704 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7382c94e-e799-4343-8548-7efd92ed66e8-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "7382c94e-e799-4343-8548-7efd92ed66e8" (UID: "7382c94e-e799-4343-8548-7efd92ed66e8"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 16:35:25 crc kubenswrapper[4707]: I1127 16:35:25.215065 4707 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7382c94e-e799-4343-8548-7efd92ed66e8-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 27 16:35:25 crc kubenswrapper[4707]: I1127 16:35:25.215097 4707 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7382c94e-e799-4343-8548-7efd92ed66e8-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 27 16:35:25 crc kubenswrapper[4707]: I1127 16:35:25.215108 4707 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7382c94e-e799-4343-8548-7efd92ed66e8-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 27 16:35:25 crc kubenswrapper[4707]: I1127 16:35:25.215117 4707 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7382c94e-e799-4343-8548-7efd92ed66e8-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 27 16:35:25 crc kubenswrapper[4707]: I1127 16:35:25.215127 4707 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/7382c94e-e799-4343-8548-7efd92ed66e8-openstack-edpm-ipam-telemetry-default-certs-0\") on node \"crc\" DevicePath \"\"" Nov 27 16:35:25 crc 
kubenswrapper[4707]: I1127 16:35:25.215137 4707 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/7382c94e-e799-4343-8548-7efd92ed66e8-openstack-edpm-ipam-libvirt-default-certs-0\") on node \"crc\" DevicePath \"\"" Nov 27 16:35:25 crc kubenswrapper[4707]: I1127 16:35:25.215146 4707 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/7382c94e-e799-4343-8548-7efd92ed66e8-ssh-key\") on node \"crc\" DevicePath \"\"" Nov 27 16:35:25 crc kubenswrapper[4707]: I1127 16:35:25.215154 4707 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7382c94e-e799-4343-8548-7efd92ed66e8-inventory\") on node \"crc\" DevicePath \"\"" Nov 27 16:35:25 crc kubenswrapper[4707]: I1127 16:35:25.215162 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8xbx6\" (UniqueName: \"kubernetes.io/projected/7382c94e-e799-4343-8548-7efd92ed66e8-kube-api-access-8xbx6\") on node \"crc\" DevicePath \"\"" Nov 27 16:35:25 crc kubenswrapper[4707]: I1127 16:35:25.215172 4707 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7382c94e-e799-4343-8548-7efd92ed66e8-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 27 16:35:25 crc kubenswrapper[4707]: I1127 16:35:25.215181 4707 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/7382c94e-e799-4343-8548-7efd92ed66e8-openstack-edpm-ipam-ovn-default-certs-0\") on node \"crc\" DevicePath \"\"" Nov 27 16:35:25 crc kubenswrapper[4707]: I1127 16:35:25.215193 4707 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7382c94e-e799-4343-8548-7efd92ed66e8-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 27 16:35:25 
crc kubenswrapper[4707]: I1127 16:35:25.215205 4707 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7382c94e-e799-4343-8548-7efd92ed66e8-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 27 16:35:25 crc kubenswrapper[4707]: I1127 16:35:25.215214 4707 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/7382c94e-e799-4343-8548-7efd92ed66e8-openstack-edpm-ipam-neutron-metadata-default-certs-0\") on node \"crc\" DevicePath \"\"" Nov 27 16:35:25 crc kubenswrapper[4707]: I1127 16:35:25.480684 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-bf29c" event={"ID":"7382c94e-e799-4343-8548-7efd92ed66e8","Type":"ContainerDied","Data":"bd0ed09acd8bc5b3c75d9ba42085ca82a8cdc756d6c8ce6f087d07f076e98a67"} Nov 27 16:35:25 crc kubenswrapper[4707]: I1127 16:35:25.480991 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bd0ed09acd8bc5b3c75d9ba42085ca82a8cdc756d6c8ce6f087d07f076e98a67" Nov 27 16:35:25 crc kubenswrapper[4707]: I1127 16:35:25.480798 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-bf29c" Nov 27 16:35:25 crc kubenswrapper[4707]: I1127 16:35:25.625350 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-wffqs"] Nov 27 16:35:25 crc kubenswrapper[4707]: E1127 16:35:25.625821 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7382c94e-e799-4343-8548-7efd92ed66e8" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Nov 27 16:35:25 crc kubenswrapper[4707]: I1127 16:35:25.625842 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="7382c94e-e799-4343-8548-7efd92ed66e8" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Nov 27 16:35:25 crc kubenswrapper[4707]: I1127 16:35:25.626114 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="7382c94e-e799-4343-8548-7efd92ed66e8" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Nov 27 16:35:25 crc kubenswrapper[4707]: I1127 16:35:25.627076 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-wffqs" Nov 27 16:35:25 crc kubenswrapper[4707]: I1127 16:35:25.630898 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Nov 27 16:35:25 crc kubenswrapper[4707]: I1127 16:35:25.631106 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-config" Nov 27 16:35:25 crc kubenswrapper[4707]: I1127 16:35:25.631319 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Nov 27 16:35:25 crc kubenswrapper[4707]: I1127 16:35:25.631751 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-h79s2" Nov 27 16:35:25 crc kubenswrapper[4707]: I1127 16:35:25.632220 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Nov 27 16:35:25 crc kubenswrapper[4707]: I1127 16:35:25.646544 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-wffqs"] Nov 27 16:35:25 crc kubenswrapper[4707]: I1127 16:35:25.736940 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6001ceb1-ba83-4942-a49c-d7a6116f57f5-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-wffqs\" (UID: \"6001ceb1-ba83-4942-a49c-d7a6116f57f5\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-wffqs" Nov 27 16:35:25 crc kubenswrapper[4707]: I1127 16:35:25.737347 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jrtgk\" (UniqueName: \"kubernetes.io/projected/6001ceb1-ba83-4942-a49c-d7a6116f57f5-kube-api-access-jrtgk\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-wffqs\" (UID: 
\"6001ceb1-ba83-4942-a49c-d7a6116f57f5\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-wffqs" Nov 27 16:35:25 crc kubenswrapper[4707]: I1127 16:35:25.737403 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/6001ceb1-ba83-4942-a49c-d7a6116f57f5-ssh-key\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-wffqs\" (UID: \"6001ceb1-ba83-4942-a49c-d7a6116f57f5\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-wffqs" Nov 27 16:35:25 crc kubenswrapper[4707]: I1127 16:35:25.737485 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/6001ceb1-ba83-4942-a49c-d7a6116f57f5-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-wffqs\" (UID: \"6001ceb1-ba83-4942-a49c-d7a6116f57f5\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-wffqs" Nov 27 16:35:25 crc kubenswrapper[4707]: I1127 16:35:25.737606 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6001ceb1-ba83-4942-a49c-d7a6116f57f5-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-wffqs\" (UID: \"6001ceb1-ba83-4942-a49c-d7a6116f57f5\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-wffqs" Nov 27 16:35:25 crc kubenswrapper[4707]: I1127 16:35:25.839336 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6001ceb1-ba83-4942-a49c-d7a6116f57f5-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-wffqs\" (UID: \"6001ceb1-ba83-4942-a49c-d7a6116f57f5\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-wffqs" Nov 27 16:35:25 crc kubenswrapper[4707]: I1127 16:35:25.839453 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-jrtgk\" (UniqueName: \"kubernetes.io/projected/6001ceb1-ba83-4942-a49c-d7a6116f57f5-kube-api-access-jrtgk\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-wffqs\" (UID: \"6001ceb1-ba83-4942-a49c-d7a6116f57f5\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-wffqs" Nov 27 16:35:25 crc kubenswrapper[4707]: I1127 16:35:25.839494 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/6001ceb1-ba83-4942-a49c-d7a6116f57f5-ssh-key\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-wffqs\" (UID: \"6001ceb1-ba83-4942-a49c-d7a6116f57f5\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-wffqs" Nov 27 16:35:25 crc kubenswrapper[4707]: I1127 16:35:25.839562 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/6001ceb1-ba83-4942-a49c-d7a6116f57f5-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-wffqs\" (UID: \"6001ceb1-ba83-4942-a49c-d7a6116f57f5\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-wffqs" Nov 27 16:35:25 crc kubenswrapper[4707]: I1127 16:35:25.839646 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6001ceb1-ba83-4942-a49c-d7a6116f57f5-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-wffqs\" (UID: \"6001ceb1-ba83-4942-a49c-d7a6116f57f5\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-wffqs" Nov 27 16:35:25 crc kubenswrapper[4707]: I1127 16:35:25.841157 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/6001ceb1-ba83-4942-a49c-d7a6116f57f5-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-wffqs\" (UID: \"6001ceb1-ba83-4942-a49c-d7a6116f57f5\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-wffqs" Nov 27 16:35:25 crc 
kubenswrapper[4707]: I1127 16:35:25.843821 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/6001ceb1-ba83-4942-a49c-d7a6116f57f5-ssh-key\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-wffqs\" (UID: \"6001ceb1-ba83-4942-a49c-d7a6116f57f5\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-wffqs" Nov 27 16:35:25 crc kubenswrapper[4707]: I1127 16:35:25.844090 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6001ceb1-ba83-4942-a49c-d7a6116f57f5-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-wffqs\" (UID: \"6001ceb1-ba83-4942-a49c-d7a6116f57f5\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-wffqs" Nov 27 16:35:25 crc kubenswrapper[4707]: I1127 16:35:25.845728 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6001ceb1-ba83-4942-a49c-d7a6116f57f5-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-wffqs\" (UID: \"6001ceb1-ba83-4942-a49c-d7a6116f57f5\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-wffqs" Nov 27 16:35:25 crc kubenswrapper[4707]: I1127 16:35:25.857787 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jrtgk\" (UniqueName: \"kubernetes.io/projected/6001ceb1-ba83-4942-a49c-d7a6116f57f5-kube-api-access-jrtgk\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-wffqs\" (UID: \"6001ceb1-ba83-4942-a49c-d7a6116f57f5\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-wffqs" Nov 27 16:35:25 crc kubenswrapper[4707]: I1127 16:35:25.952639 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-wffqs" Nov 27 16:35:26 crc kubenswrapper[4707]: I1127 16:35:26.482510 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-wffqs"] Nov 27 16:35:26 crc kubenswrapper[4707]: I1127 16:35:26.509937 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-wffqs" event={"ID":"6001ceb1-ba83-4942-a49c-d7a6116f57f5","Type":"ContainerStarted","Data":"7c36365a9f419f59fbcba6b516a44e1b24f647cdd55e836167b9f332122e2628"} Nov 27 16:35:28 crc kubenswrapper[4707]: I1127 16:35:28.534418 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-wffqs" event={"ID":"6001ceb1-ba83-4942-a49c-d7a6116f57f5","Type":"ContainerStarted","Data":"09f979ce87ed8b44036e3af50df4e08e5351745d348211728b2e4f23e93c3788"} Nov 27 16:35:28 crc kubenswrapper[4707]: I1127 16:35:28.554023 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-wffqs" podStartSLOduration=2.3587150709999998 podStartE2EDuration="3.554005498s" podCreationTimestamp="2025-11-27 16:35:25 +0000 UTC" firstStartedPulling="2025-11-27 16:35:26.48283801 +0000 UTC m=+1902.114286778" lastFinishedPulling="2025-11-27 16:35:27.678128397 +0000 UTC m=+1903.309577205" observedRunningTime="2025-11-27 16:35:28.549883887 +0000 UTC m=+1904.181332655" watchObservedRunningTime="2025-11-27 16:35:28.554005498 +0000 UTC m=+1904.185454266" Nov 27 16:35:29 crc kubenswrapper[4707]: I1127 16:35:29.198335 4707 scope.go:117] "RemoveContainer" containerID="98accd0e2044449ac7795f23cc0c9b20e1cb9fd7b29855e0dc1d78765b3e2e59" Nov 27 16:35:29 crc kubenswrapper[4707]: E1127 16:35:29.199303 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-c995m_openshift-machine-config-operator(a83beb0d-8dd1-434a-ace2-933f98e3956f)\"" pod="openshift-machine-config-operator/machine-config-daemon-c995m" podUID="a83beb0d-8dd1-434a-ace2-933f98e3956f" Nov 27 16:35:41 crc kubenswrapper[4707]: I1127 16:35:41.195903 4707 scope.go:117] "RemoveContainer" containerID="98accd0e2044449ac7795f23cc0c9b20e1cb9fd7b29855e0dc1d78765b3e2e59" Nov 27 16:35:41 crc kubenswrapper[4707]: I1127 16:35:41.676674 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-c995m" event={"ID":"a83beb0d-8dd1-434a-ace2-933f98e3956f","Type":"ContainerStarted","Data":"72f6c6a465def83be9b7fe8fbb6add62002cd9afab0e6d69d08f283f6d9f8a4c"} Nov 27 16:36:33 crc kubenswrapper[4707]: I1127 16:36:33.196652 4707 generic.go:334] "Generic (PLEG): container finished" podID="6001ceb1-ba83-4942-a49c-d7a6116f57f5" containerID="09f979ce87ed8b44036e3af50df4e08e5351745d348211728b2e4f23e93c3788" exitCode=0 Nov 27 16:36:33 crc kubenswrapper[4707]: I1127 16:36:33.208599 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-wffqs" event={"ID":"6001ceb1-ba83-4942-a49c-d7a6116f57f5","Type":"ContainerDied","Data":"09f979ce87ed8b44036e3af50df4e08e5351745d348211728b2e4f23e93c3788"} Nov 27 16:36:34 crc kubenswrapper[4707]: I1127 16:36:34.652749 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-wffqs" Nov 27 16:36:34 crc kubenswrapper[4707]: I1127 16:36:34.709051 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/6001ceb1-ba83-4942-a49c-d7a6116f57f5-ovncontroller-config-0\") pod \"6001ceb1-ba83-4942-a49c-d7a6116f57f5\" (UID: \"6001ceb1-ba83-4942-a49c-d7a6116f57f5\") " Nov 27 16:36:34 crc kubenswrapper[4707]: I1127 16:36:34.709217 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6001ceb1-ba83-4942-a49c-d7a6116f57f5-ovn-combined-ca-bundle\") pod \"6001ceb1-ba83-4942-a49c-d7a6116f57f5\" (UID: \"6001ceb1-ba83-4942-a49c-d7a6116f57f5\") " Nov 27 16:36:34 crc kubenswrapper[4707]: I1127 16:36:34.709250 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/6001ceb1-ba83-4942-a49c-d7a6116f57f5-ssh-key\") pod \"6001ceb1-ba83-4942-a49c-d7a6116f57f5\" (UID: \"6001ceb1-ba83-4942-a49c-d7a6116f57f5\") " Nov 27 16:36:34 crc kubenswrapper[4707]: I1127 16:36:34.709275 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6001ceb1-ba83-4942-a49c-d7a6116f57f5-inventory\") pod \"6001ceb1-ba83-4942-a49c-d7a6116f57f5\" (UID: \"6001ceb1-ba83-4942-a49c-d7a6116f57f5\") " Nov 27 16:36:34 crc kubenswrapper[4707]: I1127 16:36:34.710327 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jrtgk\" (UniqueName: \"kubernetes.io/projected/6001ceb1-ba83-4942-a49c-d7a6116f57f5-kube-api-access-jrtgk\") pod \"6001ceb1-ba83-4942-a49c-d7a6116f57f5\" (UID: \"6001ceb1-ba83-4942-a49c-d7a6116f57f5\") " Nov 27 16:36:34 crc kubenswrapper[4707]: I1127 16:36:34.715756 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded 
for volume "kubernetes.io/projected/6001ceb1-ba83-4942-a49c-d7a6116f57f5-kube-api-access-jrtgk" (OuterVolumeSpecName: "kube-api-access-jrtgk") pod "6001ceb1-ba83-4942-a49c-d7a6116f57f5" (UID: "6001ceb1-ba83-4942-a49c-d7a6116f57f5"). InnerVolumeSpecName "kube-api-access-jrtgk". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 16:36:34 crc kubenswrapper[4707]: I1127 16:36:34.724097 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6001ceb1-ba83-4942-a49c-d7a6116f57f5-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "6001ceb1-ba83-4942-a49c-d7a6116f57f5" (UID: "6001ceb1-ba83-4942-a49c-d7a6116f57f5"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 16:36:34 crc kubenswrapper[4707]: I1127 16:36:34.738101 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6001ceb1-ba83-4942-a49c-d7a6116f57f5-inventory" (OuterVolumeSpecName: "inventory") pod "6001ceb1-ba83-4942-a49c-d7a6116f57f5" (UID: "6001ceb1-ba83-4942-a49c-d7a6116f57f5"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 16:36:34 crc kubenswrapper[4707]: I1127 16:36:34.740096 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6001ceb1-ba83-4942-a49c-d7a6116f57f5-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "6001ceb1-ba83-4942-a49c-d7a6116f57f5" (UID: "6001ceb1-ba83-4942-a49c-d7a6116f57f5"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 16:36:34 crc kubenswrapper[4707]: I1127 16:36:34.746863 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6001ceb1-ba83-4942-a49c-d7a6116f57f5-ovncontroller-config-0" (OuterVolumeSpecName: "ovncontroller-config-0") pod "6001ceb1-ba83-4942-a49c-d7a6116f57f5" (UID: "6001ceb1-ba83-4942-a49c-d7a6116f57f5"). 
InnerVolumeSpecName "ovncontroller-config-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 16:36:34 crc kubenswrapper[4707]: I1127 16:36:34.813254 4707 reconciler_common.go:293] "Volume detached for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/6001ceb1-ba83-4942-a49c-d7a6116f57f5-ovncontroller-config-0\") on node \"crc\" DevicePath \"\"" Nov 27 16:36:34 crc kubenswrapper[4707]: I1127 16:36:34.813293 4707 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6001ceb1-ba83-4942-a49c-d7a6116f57f5-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 27 16:36:34 crc kubenswrapper[4707]: I1127 16:36:34.813308 4707 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/6001ceb1-ba83-4942-a49c-d7a6116f57f5-ssh-key\") on node \"crc\" DevicePath \"\"" Nov 27 16:36:34 crc kubenswrapper[4707]: I1127 16:36:34.813322 4707 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6001ceb1-ba83-4942-a49c-d7a6116f57f5-inventory\") on node \"crc\" DevicePath \"\"" Nov 27 16:36:34 crc kubenswrapper[4707]: I1127 16:36:34.813334 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jrtgk\" (UniqueName: \"kubernetes.io/projected/6001ceb1-ba83-4942-a49c-d7a6116f57f5-kube-api-access-jrtgk\") on node \"crc\" DevicePath \"\"" Nov 27 16:36:35 crc kubenswrapper[4707]: I1127 16:36:35.219250 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-wffqs" event={"ID":"6001ceb1-ba83-4942-a49c-d7a6116f57f5","Type":"ContainerDied","Data":"7c36365a9f419f59fbcba6b516a44e1b24f647cdd55e836167b9f332122e2628"} Nov 27 16:36:35 crc kubenswrapper[4707]: I1127 16:36:35.219308 4707 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="7c36365a9f419f59fbcba6b516a44e1b24f647cdd55e836167b9f332122e2628" Nov 27 16:36:35 crc kubenswrapper[4707]: I1127 16:36:35.219320 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-wffqs" Nov 27 16:36:35 crc kubenswrapper[4707]: I1127 16:36:35.342668 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-jd52w"] Nov 27 16:36:35 crc kubenswrapper[4707]: E1127 16:36:35.343182 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6001ceb1-ba83-4942-a49c-d7a6116f57f5" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Nov 27 16:36:35 crc kubenswrapper[4707]: I1127 16:36:35.343199 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="6001ceb1-ba83-4942-a49c-d7a6116f57f5" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Nov 27 16:36:35 crc kubenswrapper[4707]: I1127 16:36:35.343388 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="6001ceb1-ba83-4942-a49c-d7a6116f57f5" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Nov 27 16:36:35 crc kubenswrapper[4707]: I1127 16:36:35.344141 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-jd52w" Nov 27 16:36:35 crc kubenswrapper[4707]: I1127 16:36:35.347225 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-neutron-config" Nov 27 16:36:35 crc kubenswrapper[4707]: I1127 16:36:35.348728 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Nov 27 16:36:35 crc kubenswrapper[4707]: I1127 16:36:35.350739 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Nov 27 16:36:35 crc kubenswrapper[4707]: I1127 16:36:35.350778 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Nov 27 16:36:35 crc kubenswrapper[4707]: I1127 16:36:35.350758 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-h79s2" Nov 27 16:36:35 crc kubenswrapper[4707]: I1127 16:36:35.350787 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-ovn-metadata-agent-neutron-config" Nov 27 16:36:35 crc kubenswrapper[4707]: I1127 16:36:35.372327 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-jd52w"] Nov 27 16:36:35 crc kubenswrapper[4707]: I1127 16:36:35.428200 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9f5df211-3c1b-45f3-9b61-a7fde58d8a39-ssh-key\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-jd52w\" (UID: \"9f5df211-3c1b-45f3-9b61-a7fde58d8a39\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-jd52w" Nov 27 16:36:35 crc kubenswrapper[4707]: I1127 16:36:35.428295 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/9f5df211-3c1b-45f3-9b61-a7fde58d8a39-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-jd52w\" (UID: \"9f5df211-3c1b-45f3-9b61-a7fde58d8a39\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-jd52w" Nov 27 16:36:35 crc kubenswrapper[4707]: I1127 16:36:35.428479 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mjvl2\" (UniqueName: \"kubernetes.io/projected/9f5df211-3c1b-45f3-9b61-a7fde58d8a39-kube-api-access-mjvl2\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-jd52w\" (UID: \"9f5df211-3c1b-45f3-9b61-a7fde58d8a39\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-jd52w" Nov 27 16:36:35 crc kubenswrapper[4707]: I1127 16:36:35.428545 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9f5df211-3c1b-45f3-9b61-a7fde58d8a39-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-jd52w\" (UID: \"9f5df211-3c1b-45f3-9b61-a7fde58d8a39\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-jd52w" Nov 27 16:36:35 crc kubenswrapper[4707]: I1127 16:36:35.428714 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/9f5df211-3c1b-45f3-9b61-a7fde58d8a39-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-jd52w\" (UID: \"9f5df211-3c1b-45f3-9b61-a7fde58d8a39\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-jd52w" Nov 27 16:36:35 crc kubenswrapper[4707]: I1127 16:36:35.428753 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/9f5df211-3c1b-45f3-9b61-a7fde58d8a39-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-jd52w\" (UID: \"9f5df211-3c1b-45f3-9b61-a7fde58d8a39\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-jd52w" Nov 27 16:36:35 crc kubenswrapper[4707]: I1127 16:36:35.530307 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/9f5df211-3c1b-45f3-9b61-a7fde58d8a39-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-jd52w\" (UID: \"9f5df211-3c1b-45f3-9b61-a7fde58d8a39\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-jd52w" Nov 27 16:36:35 crc kubenswrapper[4707]: I1127 16:36:35.530362 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f5df211-3c1b-45f3-9b61-a7fde58d8a39-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-jd52w\" (UID: \"9f5df211-3c1b-45f3-9b61-a7fde58d8a39\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-jd52w" Nov 27 16:36:35 crc kubenswrapper[4707]: I1127 16:36:35.530447 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9f5df211-3c1b-45f3-9b61-a7fde58d8a39-ssh-key\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-jd52w\" (UID: \"9f5df211-3c1b-45f3-9b61-a7fde58d8a39\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-jd52w" Nov 27 16:36:35 crc kubenswrapper[4707]: I1127 16:36:35.530510 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: 
\"kubernetes.io/secret/9f5df211-3c1b-45f3-9b61-a7fde58d8a39-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-jd52w\" (UID: \"9f5df211-3c1b-45f3-9b61-a7fde58d8a39\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-jd52w" Nov 27 16:36:35 crc kubenswrapper[4707]: I1127 16:36:35.530549 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mjvl2\" (UniqueName: \"kubernetes.io/projected/9f5df211-3c1b-45f3-9b61-a7fde58d8a39-kube-api-access-mjvl2\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-jd52w\" (UID: \"9f5df211-3c1b-45f3-9b61-a7fde58d8a39\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-jd52w" Nov 27 16:36:35 crc kubenswrapper[4707]: I1127 16:36:35.530577 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9f5df211-3c1b-45f3-9b61-a7fde58d8a39-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-jd52w\" (UID: \"9f5df211-3c1b-45f3-9b61-a7fde58d8a39\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-jd52w" Nov 27 16:36:35 crc kubenswrapper[4707]: I1127 16:36:35.535645 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9f5df211-3c1b-45f3-9b61-a7fde58d8a39-ssh-key\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-jd52w\" (UID: \"9f5df211-3c1b-45f3-9b61-a7fde58d8a39\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-jd52w" Nov 27 16:36:35 crc kubenswrapper[4707]: I1127 16:36:35.535816 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9f5df211-3c1b-45f3-9b61-a7fde58d8a39-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-jd52w\" (UID: \"9f5df211-3c1b-45f3-9b61-a7fde58d8a39\") " 
pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-jd52w" Nov 27 16:36:35 crc kubenswrapper[4707]: I1127 16:36:35.536725 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/9f5df211-3c1b-45f3-9b61-a7fde58d8a39-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-jd52w\" (UID: \"9f5df211-3c1b-45f3-9b61-a7fde58d8a39\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-jd52w" Nov 27 16:36:35 crc kubenswrapper[4707]: I1127 16:36:35.537285 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/9f5df211-3c1b-45f3-9b61-a7fde58d8a39-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-jd52w\" (UID: \"9f5df211-3c1b-45f3-9b61-a7fde58d8a39\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-jd52w" Nov 27 16:36:35 crc kubenswrapper[4707]: I1127 16:36:35.538807 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f5df211-3c1b-45f3-9b61-a7fde58d8a39-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-jd52w\" (UID: \"9f5df211-3c1b-45f3-9b61-a7fde58d8a39\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-jd52w" Nov 27 16:36:35 crc kubenswrapper[4707]: I1127 16:36:35.547710 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mjvl2\" (UniqueName: \"kubernetes.io/projected/9f5df211-3c1b-45f3-9b61-a7fde58d8a39-kube-api-access-mjvl2\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-jd52w\" (UID: \"9f5df211-3c1b-45f3-9b61-a7fde58d8a39\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-jd52w" Nov 27 16:36:35 crc 
kubenswrapper[4707]: I1127 16:36:35.675680 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-jd52w" Nov 27 16:36:36 crc kubenswrapper[4707]: I1127 16:36:36.183667 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-jd52w"] Nov 27 16:36:36 crc kubenswrapper[4707]: I1127 16:36:36.193052 4707 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Nov 27 16:36:36 crc kubenswrapper[4707]: I1127 16:36:36.229906 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-jd52w" event={"ID":"9f5df211-3c1b-45f3-9b61-a7fde58d8a39","Type":"ContainerStarted","Data":"b940864a19d085ab32816ad8400c1931c32f31aa1757f0161ba2a585a085573f"} Nov 27 16:36:37 crc kubenswrapper[4707]: I1127 16:36:37.242581 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-jd52w" event={"ID":"9f5df211-3c1b-45f3-9b61-a7fde58d8a39","Type":"ContainerStarted","Data":"ee0614fd688fe95213408e59f8e4df140f1e08a5890756e2c89dafadc13bf21c"} Nov 27 16:36:37 crc kubenswrapper[4707]: I1127 16:36:37.259856 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-jd52w" podStartSLOduration=1.8223738269999998 podStartE2EDuration="2.259840626s" podCreationTimestamp="2025-11-27 16:36:35 +0000 UTC" firstStartedPulling="2025-11-27 16:36:36.192743693 +0000 UTC m=+1971.824192471" lastFinishedPulling="2025-11-27 16:36:36.630210502 +0000 UTC m=+1972.261659270" observedRunningTime="2025-11-27 16:36:37.25635565 +0000 UTC m=+1972.887804418" watchObservedRunningTime="2025-11-27 16:36:37.259840626 +0000 UTC m=+1972.891289394" Nov 27 16:37:26 crc kubenswrapper[4707]: I1127 16:37:26.716223 4707 generic.go:334] "Generic 
(PLEG): container finished" podID="9f5df211-3c1b-45f3-9b61-a7fde58d8a39" containerID="ee0614fd688fe95213408e59f8e4df140f1e08a5890756e2c89dafadc13bf21c" exitCode=0 Nov 27 16:37:26 crc kubenswrapper[4707]: I1127 16:37:26.716353 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-jd52w" event={"ID":"9f5df211-3c1b-45f3-9b61-a7fde58d8a39","Type":"ContainerDied","Data":"ee0614fd688fe95213408e59f8e4df140f1e08a5890756e2c89dafadc13bf21c"} Nov 27 16:37:28 crc kubenswrapper[4707]: I1127 16:37:28.181425 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-jd52w" Nov 27 16:37:28 crc kubenswrapper[4707]: I1127 16:37:28.352016 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9f5df211-3c1b-45f3-9b61-a7fde58d8a39-ssh-key\") pod \"9f5df211-3c1b-45f3-9b61-a7fde58d8a39\" (UID: \"9f5df211-3c1b-45f3-9b61-a7fde58d8a39\") " Nov 27 16:37:28 crc kubenswrapper[4707]: I1127 16:37:28.352542 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/9f5df211-3c1b-45f3-9b61-a7fde58d8a39-neutron-ovn-metadata-agent-neutron-config-0\") pod \"9f5df211-3c1b-45f3-9b61-a7fde58d8a39\" (UID: \"9f5df211-3c1b-45f3-9b61-a7fde58d8a39\") " Nov 27 16:37:28 crc kubenswrapper[4707]: I1127 16:37:28.352602 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f5df211-3c1b-45f3-9b61-a7fde58d8a39-neutron-metadata-combined-ca-bundle\") pod \"9f5df211-3c1b-45f3-9b61-a7fde58d8a39\" (UID: \"9f5df211-3c1b-45f3-9b61-a7fde58d8a39\") " Nov 27 16:37:28 crc kubenswrapper[4707]: I1127 16:37:28.352684 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started 
for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/9f5df211-3c1b-45f3-9b61-a7fde58d8a39-nova-metadata-neutron-config-0\") pod \"9f5df211-3c1b-45f3-9b61-a7fde58d8a39\" (UID: \"9f5df211-3c1b-45f3-9b61-a7fde58d8a39\") " Nov 27 16:37:28 crc kubenswrapper[4707]: I1127 16:37:28.352701 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mjvl2\" (UniqueName: \"kubernetes.io/projected/9f5df211-3c1b-45f3-9b61-a7fde58d8a39-kube-api-access-mjvl2\") pod \"9f5df211-3c1b-45f3-9b61-a7fde58d8a39\" (UID: \"9f5df211-3c1b-45f3-9b61-a7fde58d8a39\") " Nov 27 16:37:28 crc kubenswrapper[4707]: I1127 16:37:28.352743 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9f5df211-3c1b-45f3-9b61-a7fde58d8a39-inventory\") pod \"9f5df211-3c1b-45f3-9b61-a7fde58d8a39\" (UID: \"9f5df211-3c1b-45f3-9b61-a7fde58d8a39\") " Nov 27 16:37:28 crc kubenswrapper[4707]: I1127 16:37:28.361802 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9f5df211-3c1b-45f3-9b61-a7fde58d8a39-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "9f5df211-3c1b-45f3-9b61-a7fde58d8a39" (UID: "9f5df211-3c1b-45f3-9b61-a7fde58d8a39"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 16:37:28 crc kubenswrapper[4707]: I1127 16:37:28.378461 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9f5df211-3c1b-45f3-9b61-a7fde58d8a39-kube-api-access-mjvl2" (OuterVolumeSpecName: "kube-api-access-mjvl2") pod "9f5df211-3c1b-45f3-9b61-a7fde58d8a39" (UID: "9f5df211-3c1b-45f3-9b61-a7fde58d8a39"). InnerVolumeSpecName "kube-api-access-mjvl2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 16:37:28 crc kubenswrapper[4707]: I1127 16:37:28.395436 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9f5df211-3c1b-45f3-9b61-a7fde58d8a39-nova-metadata-neutron-config-0" (OuterVolumeSpecName: "nova-metadata-neutron-config-0") pod "9f5df211-3c1b-45f3-9b61-a7fde58d8a39" (UID: "9f5df211-3c1b-45f3-9b61-a7fde58d8a39"). InnerVolumeSpecName "nova-metadata-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 16:37:28 crc kubenswrapper[4707]: I1127 16:37:28.399247 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9f5df211-3c1b-45f3-9b61-a7fde58d8a39-inventory" (OuterVolumeSpecName: "inventory") pod "9f5df211-3c1b-45f3-9b61-a7fde58d8a39" (UID: "9f5df211-3c1b-45f3-9b61-a7fde58d8a39"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 16:37:28 crc kubenswrapper[4707]: I1127 16:37:28.400881 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9f5df211-3c1b-45f3-9b61-a7fde58d8a39-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "9f5df211-3c1b-45f3-9b61-a7fde58d8a39" (UID: "9f5df211-3c1b-45f3-9b61-a7fde58d8a39"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 16:37:28 crc kubenswrapper[4707]: I1127 16:37:28.404908 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9f5df211-3c1b-45f3-9b61-a7fde58d8a39-neutron-ovn-metadata-agent-neutron-config-0" (OuterVolumeSpecName: "neutron-ovn-metadata-agent-neutron-config-0") pod "9f5df211-3c1b-45f3-9b61-a7fde58d8a39" (UID: "9f5df211-3c1b-45f3-9b61-a7fde58d8a39"). InnerVolumeSpecName "neutron-ovn-metadata-agent-neutron-config-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 16:37:28 crc kubenswrapper[4707]: I1127 16:37:28.457166 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mjvl2\" (UniqueName: \"kubernetes.io/projected/9f5df211-3c1b-45f3-9b61-a7fde58d8a39-kube-api-access-mjvl2\") on node \"crc\" DevicePath \"\"" Nov 27 16:37:28 crc kubenswrapper[4707]: I1127 16:37:28.457206 4707 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/9f5df211-3c1b-45f3-9b61-a7fde58d8a39-nova-metadata-neutron-config-0\") on node \"crc\" DevicePath \"\"" Nov 27 16:37:28 crc kubenswrapper[4707]: I1127 16:37:28.457223 4707 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9f5df211-3c1b-45f3-9b61-a7fde58d8a39-inventory\") on node \"crc\" DevicePath \"\"" Nov 27 16:37:28 crc kubenswrapper[4707]: I1127 16:37:28.457238 4707 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9f5df211-3c1b-45f3-9b61-a7fde58d8a39-ssh-key\") on node \"crc\" DevicePath \"\"" Nov 27 16:37:28 crc kubenswrapper[4707]: I1127 16:37:28.457250 4707 reconciler_common.go:293] "Volume detached for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/9f5df211-3c1b-45f3-9b61-a7fde58d8a39-neutron-ovn-metadata-agent-neutron-config-0\") on node \"crc\" DevicePath \"\"" Nov 27 16:37:28 crc kubenswrapper[4707]: I1127 16:37:28.457268 4707 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f5df211-3c1b-45f3-9b61-a7fde58d8a39-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 27 16:37:28 crc kubenswrapper[4707]: I1127 16:37:28.747536 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-jd52w" 
event={"ID":"9f5df211-3c1b-45f3-9b61-a7fde58d8a39","Type":"ContainerDied","Data":"b940864a19d085ab32816ad8400c1931c32f31aa1757f0161ba2a585a085573f"} Nov 27 16:37:28 crc kubenswrapper[4707]: I1127 16:37:28.747574 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b940864a19d085ab32816ad8400c1931c32f31aa1757f0161ba2a585a085573f" Nov 27 16:37:28 crc kubenswrapper[4707]: I1127 16:37:28.747574 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-jd52w" Nov 27 16:37:28 crc kubenswrapper[4707]: I1127 16:37:28.917624 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-zjmfq"] Nov 27 16:37:28 crc kubenswrapper[4707]: E1127 16:37:28.918327 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9f5df211-3c1b-45f3-9b61-a7fde58d8a39" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Nov 27 16:37:28 crc kubenswrapper[4707]: I1127 16:37:28.918453 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="9f5df211-3c1b-45f3-9b61-a7fde58d8a39" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Nov 27 16:37:28 crc kubenswrapper[4707]: I1127 16:37:28.918818 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="9f5df211-3c1b-45f3-9b61-a7fde58d8a39" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Nov 27 16:37:28 crc kubenswrapper[4707]: I1127 16:37:28.919718 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-zjmfq" Nov 27 16:37:28 crc kubenswrapper[4707]: I1127 16:37:28.922621 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Nov 27 16:37:28 crc kubenswrapper[4707]: I1127 16:37:28.922753 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Nov 27 16:37:28 crc kubenswrapper[4707]: I1127 16:37:28.923140 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Nov 27 16:37:28 crc kubenswrapper[4707]: I1127 16:37:28.923765 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"libvirt-secret" Nov 27 16:37:28 crc kubenswrapper[4707]: I1127 16:37:28.925684 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-h79s2" Nov 27 16:37:28 crc kubenswrapper[4707]: I1127 16:37:28.929302 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-zjmfq"] Nov 27 16:37:29 crc kubenswrapper[4707]: I1127 16:37:29.072923 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/18fd2519-f36c-4817-85da-7615979c3340-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-zjmfq\" (UID: \"18fd2519-f36c-4817-85da-7615979c3340\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-zjmfq" Nov 27 16:37:29 crc kubenswrapper[4707]: I1127 16:37:29.073073 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/18fd2519-f36c-4817-85da-7615979c3340-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-zjmfq\" (UID: 
\"18fd2519-f36c-4817-85da-7615979c3340\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-zjmfq" Nov 27 16:37:29 crc kubenswrapper[4707]: I1127 16:37:29.073113 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5mpkc\" (UniqueName: \"kubernetes.io/projected/18fd2519-f36c-4817-85da-7615979c3340-kube-api-access-5mpkc\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-zjmfq\" (UID: \"18fd2519-f36c-4817-85da-7615979c3340\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-zjmfq" Nov 27 16:37:29 crc kubenswrapper[4707]: I1127 16:37:29.073171 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/18fd2519-f36c-4817-85da-7615979c3340-ssh-key\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-zjmfq\" (UID: \"18fd2519-f36c-4817-85da-7615979c3340\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-zjmfq" Nov 27 16:37:29 crc kubenswrapper[4707]: I1127 16:37:29.073209 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/18fd2519-f36c-4817-85da-7615979c3340-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-zjmfq\" (UID: \"18fd2519-f36c-4817-85da-7615979c3340\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-zjmfq" Nov 27 16:37:29 crc kubenswrapper[4707]: I1127 16:37:29.175679 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/18fd2519-f36c-4817-85da-7615979c3340-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-zjmfq\" (UID: \"18fd2519-f36c-4817-85da-7615979c3340\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-zjmfq" Nov 27 16:37:29 crc kubenswrapper[4707]: I1127 16:37:29.175825 4707 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/18fd2519-f36c-4817-85da-7615979c3340-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-zjmfq\" (UID: \"18fd2519-f36c-4817-85da-7615979c3340\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-zjmfq" Nov 27 16:37:29 crc kubenswrapper[4707]: I1127 16:37:29.175867 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5mpkc\" (UniqueName: \"kubernetes.io/projected/18fd2519-f36c-4817-85da-7615979c3340-kube-api-access-5mpkc\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-zjmfq\" (UID: \"18fd2519-f36c-4817-85da-7615979c3340\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-zjmfq" Nov 27 16:37:29 crc kubenswrapper[4707]: I1127 16:37:29.175932 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/18fd2519-f36c-4817-85da-7615979c3340-ssh-key\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-zjmfq\" (UID: \"18fd2519-f36c-4817-85da-7615979c3340\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-zjmfq" Nov 27 16:37:29 crc kubenswrapper[4707]: I1127 16:37:29.175972 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/18fd2519-f36c-4817-85da-7615979c3340-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-zjmfq\" (UID: \"18fd2519-f36c-4817-85da-7615979c3340\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-zjmfq" Nov 27 16:37:29 crc kubenswrapper[4707]: I1127 16:37:29.183786 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/18fd2519-f36c-4817-85da-7615979c3340-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-zjmfq\" (UID: \"18fd2519-f36c-4817-85da-7615979c3340\") " 
pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-zjmfq" Nov 27 16:37:29 crc kubenswrapper[4707]: I1127 16:37:29.184926 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/18fd2519-f36c-4817-85da-7615979c3340-ssh-key\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-zjmfq\" (UID: \"18fd2519-f36c-4817-85da-7615979c3340\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-zjmfq" Nov 27 16:37:29 crc kubenswrapper[4707]: I1127 16:37:29.185670 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/18fd2519-f36c-4817-85da-7615979c3340-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-zjmfq\" (UID: \"18fd2519-f36c-4817-85da-7615979c3340\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-zjmfq" Nov 27 16:37:29 crc kubenswrapper[4707]: I1127 16:37:29.192494 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/18fd2519-f36c-4817-85da-7615979c3340-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-zjmfq\" (UID: \"18fd2519-f36c-4817-85da-7615979c3340\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-zjmfq" Nov 27 16:37:29 crc kubenswrapper[4707]: I1127 16:37:29.197930 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5mpkc\" (UniqueName: \"kubernetes.io/projected/18fd2519-f36c-4817-85da-7615979c3340-kube-api-access-5mpkc\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-zjmfq\" (UID: \"18fd2519-f36c-4817-85da-7615979c3340\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-zjmfq" Nov 27 16:37:29 crc kubenswrapper[4707]: I1127 16:37:29.241094 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-zjmfq" Nov 27 16:37:29 crc kubenswrapper[4707]: I1127 16:37:29.602555 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-zjmfq"] Nov 27 16:37:29 crc kubenswrapper[4707]: I1127 16:37:29.758085 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-zjmfq" event={"ID":"18fd2519-f36c-4817-85da-7615979c3340","Type":"ContainerStarted","Data":"5faafaf85956e94bd183cbd1ce3d76ce3975ea0b41e4d4459786b73dbc8f6d14"} Nov 27 16:37:30 crc kubenswrapper[4707]: I1127 16:37:30.771045 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-zjmfq" event={"ID":"18fd2519-f36c-4817-85da-7615979c3340","Type":"ContainerStarted","Data":"47d550117d2c93fb4b4d5382b48a90a096a3bc34693686b1ca38e95816223ad3"} Nov 27 16:37:30 crc kubenswrapper[4707]: I1127 16:37:30.809115 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-zjmfq" podStartSLOduration=2.315308879 podStartE2EDuration="2.80908502s" podCreationTimestamp="2025-11-27 16:37:28 +0000 UTC" firstStartedPulling="2025-11-27 16:37:29.610055128 +0000 UTC m=+2025.241503896" lastFinishedPulling="2025-11-27 16:37:30.103831269 +0000 UTC m=+2025.735280037" observedRunningTime="2025-11-27 16:37:30.789649422 +0000 UTC m=+2026.421098230" watchObservedRunningTime="2025-11-27 16:37:30.80908502 +0000 UTC m=+2026.440533828" Nov 27 16:38:03 crc kubenswrapper[4707]: I1127 16:38:03.623841 4707 patch_prober.go:28] interesting pod/machine-config-daemon-c995m container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 27 16:38:03 crc kubenswrapper[4707]: I1127 
16:38:03.624357 4707 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-c995m" podUID="a83beb0d-8dd1-434a-ace2-933f98e3956f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 27 16:38:33 crc kubenswrapper[4707]: I1127 16:38:33.624201 4707 patch_prober.go:28] interesting pod/machine-config-daemon-c995m container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 27 16:38:33 crc kubenswrapper[4707]: I1127 16:38:33.624817 4707 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-c995m" podUID="a83beb0d-8dd1-434a-ace2-933f98e3956f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 27 16:38:39 crc kubenswrapper[4707]: I1127 16:38:39.306078 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-rbqsp"] Nov 27 16:38:39 crc kubenswrapper[4707]: I1127 16:38:39.310312 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-rbqsp" Nov 27 16:38:39 crc kubenswrapper[4707]: I1127 16:38:39.322580 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-rbqsp"] Nov 27 16:38:39 crc kubenswrapper[4707]: I1127 16:38:39.461556 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/572be6ab-1f95-41c2-8fc1-6274b952f060-utilities\") pod \"certified-operators-rbqsp\" (UID: \"572be6ab-1f95-41c2-8fc1-6274b952f060\") " pod="openshift-marketplace/certified-operators-rbqsp" Nov 27 16:38:39 crc kubenswrapper[4707]: I1127 16:38:39.461654 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/572be6ab-1f95-41c2-8fc1-6274b952f060-catalog-content\") pod \"certified-operators-rbqsp\" (UID: \"572be6ab-1f95-41c2-8fc1-6274b952f060\") " pod="openshift-marketplace/certified-operators-rbqsp" Nov 27 16:38:39 crc kubenswrapper[4707]: I1127 16:38:39.462015 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jr2vf\" (UniqueName: \"kubernetes.io/projected/572be6ab-1f95-41c2-8fc1-6274b952f060-kube-api-access-jr2vf\") pod \"certified-operators-rbqsp\" (UID: \"572be6ab-1f95-41c2-8fc1-6274b952f060\") " pod="openshift-marketplace/certified-operators-rbqsp" Nov 27 16:38:39 crc kubenswrapper[4707]: I1127 16:38:39.563882 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jr2vf\" (UniqueName: \"kubernetes.io/projected/572be6ab-1f95-41c2-8fc1-6274b952f060-kube-api-access-jr2vf\") pod \"certified-operators-rbqsp\" (UID: \"572be6ab-1f95-41c2-8fc1-6274b952f060\") " pod="openshift-marketplace/certified-operators-rbqsp" Nov 27 16:38:39 crc kubenswrapper[4707]: I1127 16:38:39.564066 4707 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/572be6ab-1f95-41c2-8fc1-6274b952f060-utilities\") pod \"certified-operators-rbqsp\" (UID: \"572be6ab-1f95-41c2-8fc1-6274b952f060\") " pod="openshift-marketplace/certified-operators-rbqsp" Nov 27 16:38:39 crc kubenswrapper[4707]: I1127 16:38:39.564093 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/572be6ab-1f95-41c2-8fc1-6274b952f060-catalog-content\") pod \"certified-operators-rbqsp\" (UID: \"572be6ab-1f95-41c2-8fc1-6274b952f060\") " pod="openshift-marketplace/certified-operators-rbqsp" Nov 27 16:38:39 crc kubenswrapper[4707]: I1127 16:38:39.564954 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/572be6ab-1f95-41c2-8fc1-6274b952f060-utilities\") pod \"certified-operators-rbqsp\" (UID: \"572be6ab-1f95-41c2-8fc1-6274b952f060\") " pod="openshift-marketplace/certified-operators-rbqsp" Nov 27 16:38:39 crc kubenswrapper[4707]: I1127 16:38:39.565000 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/572be6ab-1f95-41c2-8fc1-6274b952f060-catalog-content\") pod \"certified-operators-rbqsp\" (UID: \"572be6ab-1f95-41c2-8fc1-6274b952f060\") " pod="openshift-marketplace/certified-operators-rbqsp" Nov 27 16:38:39 crc kubenswrapper[4707]: I1127 16:38:39.599010 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jr2vf\" (UniqueName: \"kubernetes.io/projected/572be6ab-1f95-41c2-8fc1-6274b952f060-kube-api-access-jr2vf\") pod \"certified-operators-rbqsp\" (UID: \"572be6ab-1f95-41c2-8fc1-6274b952f060\") " pod="openshift-marketplace/certified-operators-rbqsp" Nov 27 16:38:39 crc kubenswrapper[4707]: I1127 16:38:39.648817 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-rbqsp" Nov 27 16:38:40 crc kubenswrapper[4707]: I1127 16:38:40.183063 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-rbqsp"] Nov 27 16:38:40 crc kubenswrapper[4707]: I1127 16:38:40.715682 4707 generic.go:334] "Generic (PLEG): container finished" podID="572be6ab-1f95-41c2-8fc1-6274b952f060" containerID="708a575e62ffdc4ad80be34b600937ff8cc62d8d192ed20858893d46b05eed05" exitCode=0 Nov 27 16:38:40 crc kubenswrapper[4707]: I1127 16:38:40.715787 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rbqsp" event={"ID":"572be6ab-1f95-41c2-8fc1-6274b952f060","Type":"ContainerDied","Data":"708a575e62ffdc4ad80be34b600937ff8cc62d8d192ed20858893d46b05eed05"} Nov 27 16:38:40 crc kubenswrapper[4707]: I1127 16:38:40.716047 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rbqsp" event={"ID":"572be6ab-1f95-41c2-8fc1-6274b952f060","Type":"ContainerStarted","Data":"2b3e1487e920c1cbb83bf9763a68c8c7d2a0d6caba2cb64618381761b67d9234"} Nov 27 16:38:42 crc kubenswrapper[4707]: I1127 16:38:42.740668 4707 generic.go:334] "Generic (PLEG): container finished" podID="572be6ab-1f95-41c2-8fc1-6274b952f060" containerID="a6742f9e13fcfedc8d7da77ae92b656df00f550548c8ac3de4eff479587e69e4" exitCode=0 Nov 27 16:38:42 crc kubenswrapper[4707]: I1127 16:38:42.740833 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rbqsp" event={"ID":"572be6ab-1f95-41c2-8fc1-6274b952f060","Type":"ContainerDied","Data":"a6742f9e13fcfedc8d7da77ae92b656df00f550548c8ac3de4eff479587e69e4"} Nov 27 16:38:44 crc kubenswrapper[4707]: I1127 16:38:44.764103 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rbqsp" 
event={"ID":"572be6ab-1f95-41c2-8fc1-6274b952f060","Type":"ContainerStarted","Data":"c8fac96bd2a16e4b87b99c3871340ce05ba5697d63e5474cdc78e97a99e50082"} Nov 27 16:38:44 crc kubenswrapper[4707]: I1127 16:38:44.784341 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-rbqsp" podStartSLOduration=2.966560527 podStartE2EDuration="5.784323291s" podCreationTimestamp="2025-11-27 16:38:39 +0000 UTC" firstStartedPulling="2025-11-27 16:38:40.720532039 +0000 UTC m=+2096.351980847" lastFinishedPulling="2025-11-27 16:38:43.538294843 +0000 UTC m=+2099.169743611" observedRunningTime="2025-11-27 16:38:44.782742092 +0000 UTC m=+2100.414190880" watchObservedRunningTime="2025-11-27 16:38:44.784323291 +0000 UTC m=+2100.415772079" Nov 27 16:38:49 crc kubenswrapper[4707]: I1127 16:38:49.649413 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-rbqsp" Nov 27 16:38:49 crc kubenswrapper[4707]: I1127 16:38:49.649979 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-rbqsp" Nov 27 16:38:49 crc kubenswrapper[4707]: I1127 16:38:49.703171 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-rbqsp" Nov 27 16:38:49 crc kubenswrapper[4707]: I1127 16:38:49.858527 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-rbqsp" Nov 27 16:38:49 crc kubenswrapper[4707]: I1127 16:38:49.945294 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-rbqsp"] Nov 27 16:38:51 crc kubenswrapper[4707]: I1127 16:38:51.842635 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-rbqsp" podUID="572be6ab-1f95-41c2-8fc1-6274b952f060" containerName="registry-server" 
containerID="cri-o://c8fac96bd2a16e4b87b99c3871340ce05ba5697d63e5474cdc78e97a99e50082" gracePeriod=2 Nov 27 16:38:52 crc kubenswrapper[4707]: I1127 16:38:52.311476 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-rbqsp" Nov 27 16:38:52 crc kubenswrapper[4707]: I1127 16:38:52.463937 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jr2vf\" (UniqueName: \"kubernetes.io/projected/572be6ab-1f95-41c2-8fc1-6274b952f060-kube-api-access-jr2vf\") pod \"572be6ab-1f95-41c2-8fc1-6274b952f060\" (UID: \"572be6ab-1f95-41c2-8fc1-6274b952f060\") " Nov 27 16:38:52 crc kubenswrapper[4707]: I1127 16:38:52.465117 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/572be6ab-1f95-41c2-8fc1-6274b952f060-utilities\") pod \"572be6ab-1f95-41c2-8fc1-6274b952f060\" (UID: \"572be6ab-1f95-41c2-8fc1-6274b952f060\") " Nov 27 16:38:52 crc kubenswrapper[4707]: I1127 16:38:52.465226 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/572be6ab-1f95-41c2-8fc1-6274b952f060-catalog-content\") pod \"572be6ab-1f95-41c2-8fc1-6274b952f060\" (UID: \"572be6ab-1f95-41c2-8fc1-6274b952f060\") " Nov 27 16:38:52 crc kubenswrapper[4707]: I1127 16:38:52.466151 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/572be6ab-1f95-41c2-8fc1-6274b952f060-utilities" (OuterVolumeSpecName: "utilities") pod "572be6ab-1f95-41c2-8fc1-6274b952f060" (UID: "572be6ab-1f95-41c2-8fc1-6274b952f060"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 27 16:38:52 crc kubenswrapper[4707]: I1127 16:38:52.473613 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/572be6ab-1f95-41c2-8fc1-6274b952f060-kube-api-access-jr2vf" (OuterVolumeSpecName: "kube-api-access-jr2vf") pod "572be6ab-1f95-41c2-8fc1-6274b952f060" (UID: "572be6ab-1f95-41c2-8fc1-6274b952f060"). InnerVolumeSpecName "kube-api-access-jr2vf". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 16:38:52 crc kubenswrapper[4707]: I1127 16:38:52.529615 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/572be6ab-1f95-41c2-8fc1-6274b952f060-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "572be6ab-1f95-41c2-8fc1-6274b952f060" (UID: "572be6ab-1f95-41c2-8fc1-6274b952f060"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 27 16:38:52 crc kubenswrapper[4707]: I1127 16:38:52.567872 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jr2vf\" (UniqueName: \"kubernetes.io/projected/572be6ab-1f95-41c2-8fc1-6274b952f060-kube-api-access-jr2vf\") on node \"crc\" DevicePath \"\"" Nov 27 16:38:52 crc kubenswrapper[4707]: I1127 16:38:52.567906 4707 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/572be6ab-1f95-41c2-8fc1-6274b952f060-utilities\") on node \"crc\" DevicePath \"\"" Nov 27 16:38:52 crc kubenswrapper[4707]: I1127 16:38:52.567915 4707 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/572be6ab-1f95-41c2-8fc1-6274b952f060-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 27 16:38:52 crc kubenswrapper[4707]: I1127 16:38:52.867800 4707 generic.go:334] "Generic (PLEG): container finished" podID="572be6ab-1f95-41c2-8fc1-6274b952f060" 
containerID="c8fac96bd2a16e4b87b99c3871340ce05ba5697d63e5474cdc78e97a99e50082" exitCode=0 Nov 27 16:38:52 crc kubenswrapper[4707]: I1127 16:38:52.867856 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-rbqsp" Nov 27 16:38:52 crc kubenswrapper[4707]: I1127 16:38:52.867845 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rbqsp" event={"ID":"572be6ab-1f95-41c2-8fc1-6274b952f060","Type":"ContainerDied","Data":"c8fac96bd2a16e4b87b99c3871340ce05ba5697d63e5474cdc78e97a99e50082"} Nov 27 16:38:52 crc kubenswrapper[4707]: I1127 16:38:52.868164 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rbqsp" event={"ID":"572be6ab-1f95-41c2-8fc1-6274b952f060","Type":"ContainerDied","Data":"2b3e1487e920c1cbb83bf9763a68c8c7d2a0d6caba2cb64618381761b67d9234"} Nov 27 16:38:52 crc kubenswrapper[4707]: I1127 16:38:52.868185 4707 scope.go:117] "RemoveContainer" containerID="c8fac96bd2a16e4b87b99c3871340ce05ba5697d63e5474cdc78e97a99e50082" Nov 27 16:38:52 crc kubenswrapper[4707]: I1127 16:38:52.886193 4707 scope.go:117] "RemoveContainer" containerID="a6742f9e13fcfedc8d7da77ae92b656df00f550548c8ac3de4eff479587e69e4" Nov 27 16:38:52 crc kubenswrapper[4707]: I1127 16:38:52.902399 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-rbqsp"] Nov 27 16:38:52 crc kubenswrapper[4707]: I1127 16:38:52.909572 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-rbqsp"] Nov 27 16:38:52 crc kubenswrapper[4707]: I1127 16:38:52.924321 4707 scope.go:117] "RemoveContainer" containerID="708a575e62ffdc4ad80be34b600937ff8cc62d8d192ed20858893d46b05eed05" Nov 27 16:38:52 crc kubenswrapper[4707]: I1127 16:38:52.944173 4707 scope.go:117] "RemoveContainer" containerID="c8fac96bd2a16e4b87b99c3871340ce05ba5697d63e5474cdc78e97a99e50082" Nov 27 
16:38:52 crc kubenswrapper[4707]: E1127 16:38:52.944625 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c8fac96bd2a16e4b87b99c3871340ce05ba5697d63e5474cdc78e97a99e50082\": container with ID starting with c8fac96bd2a16e4b87b99c3871340ce05ba5697d63e5474cdc78e97a99e50082 not found: ID does not exist" containerID="c8fac96bd2a16e4b87b99c3871340ce05ba5697d63e5474cdc78e97a99e50082" Nov 27 16:38:52 crc kubenswrapper[4707]: I1127 16:38:52.944666 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c8fac96bd2a16e4b87b99c3871340ce05ba5697d63e5474cdc78e97a99e50082"} err="failed to get container status \"c8fac96bd2a16e4b87b99c3871340ce05ba5697d63e5474cdc78e97a99e50082\": rpc error: code = NotFound desc = could not find container \"c8fac96bd2a16e4b87b99c3871340ce05ba5697d63e5474cdc78e97a99e50082\": container with ID starting with c8fac96bd2a16e4b87b99c3871340ce05ba5697d63e5474cdc78e97a99e50082 not found: ID does not exist" Nov 27 16:38:52 crc kubenswrapper[4707]: I1127 16:38:52.944692 4707 scope.go:117] "RemoveContainer" containerID="a6742f9e13fcfedc8d7da77ae92b656df00f550548c8ac3de4eff479587e69e4" Nov 27 16:38:52 crc kubenswrapper[4707]: E1127 16:38:52.944979 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a6742f9e13fcfedc8d7da77ae92b656df00f550548c8ac3de4eff479587e69e4\": container with ID starting with a6742f9e13fcfedc8d7da77ae92b656df00f550548c8ac3de4eff479587e69e4 not found: ID does not exist" containerID="a6742f9e13fcfedc8d7da77ae92b656df00f550548c8ac3de4eff479587e69e4" Nov 27 16:38:52 crc kubenswrapper[4707]: I1127 16:38:52.945010 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a6742f9e13fcfedc8d7da77ae92b656df00f550548c8ac3de4eff479587e69e4"} err="failed to get container status 
\"a6742f9e13fcfedc8d7da77ae92b656df00f550548c8ac3de4eff479587e69e4\": rpc error: code = NotFound desc = could not find container \"a6742f9e13fcfedc8d7da77ae92b656df00f550548c8ac3de4eff479587e69e4\": container with ID starting with a6742f9e13fcfedc8d7da77ae92b656df00f550548c8ac3de4eff479587e69e4 not found: ID does not exist" Nov 27 16:38:52 crc kubenswrapper[4707]: I1127 16:38:52.945034 4707 scope.go:117] "RemoveContainer" containerID="708a575e62ffdc4ad80be34b600937ff8cc62d8d192ed20858893d46b05eed05" Nov 27 16:38:52 crc kubenswrapper[4707]: E1127 16:38:52.945330 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"708a575e62ffdc4ad80be34b600937ff8cc62d8d192ed20858893d46b05eed05\": container with ID starting with 708a575e62ffdc4ad80be34b600937ff8cc62d8d192ed20858893d46b05eed05 not found: ID does not exist" containerID="708a575e62ffdc4ad80be34b600937ff8cc62d8d192ed20858893d46b05eed05" Nov 27 16:38:52 crc kubenswrapper[4707]: I1127 16:38:52.945410 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"708a575e62ffdc4ad80be34b600937ff8cc62d8d192ed20858893d46b05eed05"} err="failed to get container status \"708a575e62ffdc4ad80be34b600937ff8cc62d8d192ed20858893d46b05eed05\": rpc error: code = NotFound desc = could not find container \"708a575e62ffdc4ad80be34b600937ff8cc62d8d192ed20858893d46b05eed05\": container with ID starting with 708a575e62ffdc4ad80be34b600937ff8cc62d8d192ed20858893d46b05eed05 not found: ID does not exist" Nov 27 16:38:53 crc kubenswrapper[4707]: I1127 16:38:53.208344 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="572be6ab-1f95-41c2-8fc1-6274b952f060" path="/var/lib/kubelet/pods/572be6ab-1f95-41c2-8fc1-6274b952f060/volumes" Nov 27 16:39:03 crc kubenswrapper[4707]: I1127 16:39:03.624190 4707 patch_prober.go:28] interesting pod/machine-config-daemon-c995m container/machine-config-daemon 
namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 27 16:39:03 crc kubenswrapper[4707]: I1127 16:39:03.624851 4707 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-c995m" podUID="a83beb0d-8dd1-434a-ace2-933f98e3956f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 27 16:39:03 crc kubenswrapper[4707]: I1127 16:39:03.624920 4707 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-c995m" Nov 27 16:39:03 crc kubenswrapper[4707]: I1127 16:39:03.626146 4707 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"72f6c6a465def83be9b7fe8fbb6add62002cd9afab0e6d69d08f283f6d9f8a4c"} pod="openshift-machine-config-operator/machine-config-daemon-c995m" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Nov 27 16:39:03 crc kubenswrapper[4707]: I1127 16:39:03.626246 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-c995m" podUID="a83beb0d-8dd1-434a-ace2-933f98e3956f" containerName="machine-config-daemon" containerID="cri-o://72f6c6a465def83be9b7fe8fbb6add62002cd9afab0e6d69d08f283f6d9f8a4c" gracePeriod=600 Nov 27 16:39:03 crc kubenswrapper[4707]: I1127 16:39:03.976525 4707 generic.go:334] "Generic (PLEG): container finished" podID="a83beb0d-8dd1-434a-ace2-933f98e3956f" containerID="72f6c6a465def83be9b7fe8fbb6add62002cd9afab0e6d69d08f283f6d9f8a4c" exitCode=0 Nov 27 16:39:03 crc kubenswrapper[4707]: I1127 16:39:03.976597 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-c995m" event={"ID":"a83beb0d-8dd1-434a-ace2-933f98e3956f","Type":"ContainerDied","Data":"72f6c6a465def83be9b7fe8fbb6add62002cd9afab0e6d69d08f283f6d9f8a4c"} Nov 27 16:39:03 crc kubenswrapper[4707]: I1127 16:39:03.976969 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-c995m" event={"ID":"a83beb0d-8dd1-434a-ace2-933f98e3956f","Type":"ContainerStarted","Data":"f26ac9df9340ea2ea514252ce6b8bc1868c8afd678b5bdcc749e5a78be5c9f11"} Nov 27 16:39:03 crc kubenswrapper[4707]: I1127 16:39:03.976992 4707 scope.go:117] "RemoveContainer" containerID="98accd0e2044449ac7795f23cc0c9b20e1cb9fd7b29855e0dc1d78765b3e2e59" Nov 27 16:41:03 crc kubenswrapper[4707]: I1127 16:41:03.623502 4707 patch_prober.go:28] interesting pod/machine-config-daemon-c995m container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 27 16:41:03 crc kubenswrapper[4707]: I1127 16:41:03.624127 4707 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-c995m" podUID="a83beb0d-8dd1-434a-ace2-933f98e3956f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 27 16:41:33 crc kubenswrapper[4707]: I1127 16:41:33.623644 4707 patch_prober.go:28] interesting pod/machine-config-daemon-c995m container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 27 16:41:33 crc kubenswrapper[4707]: I1127 16:41:33.624218 4707 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-c995m" podUID="a83beb0d-8dd1-434a-ace2-933f98e3956f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 27 16:41:55 crc kubenswrapper[4707]: I1127 16:41:55.547034 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-9wxtr"] Nov 27 16:41:55 crc kubenswrapper[4707]: E1127 16:41:55.548101 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="572be6ab-1f95-41c2-8fc1-6274b952f060" containerName="extract-utilities" Nov 27 16:41:55 crc kubenswrapper[4707]: I1127 16:41:55.548117 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="572be6ab-1f95-41c2-8fc1-6274b952f060" containerName="extract-utilities" Nov 27 16:41:55 crc kubenswrapper[4707]: E1127 16:41:55.548141 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="572be6ab-1f95-41c2-8fc1-6274b952f060" containerName="extract-content" Nov 27 16:41:55 crc kubenswrapper[4707]: I1127 16:41:55.548151 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="572be6ab-1f95-41c2-8fc1-6274b952f060" containerName="extract-content" Nov 27 16:41:55 crc kubenswrapper[4707]: E1127 16:41:55.548175 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="572be6ab-1f95-41c2-8fc1-6274b952f060" containerName="registry-server" Nov 27 16:41:55 crc kubenswrapper[4707]: I1127 16:41:55.548183 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="572be6ab-1f95-41c2-8fc1-6274b952f060" containerName="registry-server" Nov 27 16:41:55 crc kubenswrapper[4707]: I1127 16:41:55.548468 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="572be6ab-1f95-41c2-8fc1-6274b952f060" containerName="registry-server" Nov 27 16:41:55 crc kubenswrapper[4707]: I1127 16:41:55.550132 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-9wxtr" Nov 27 16:41:55 crc kubenswrapper[4707]: I1127 16:41:55.562079 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-9wxtr"] Nov 27 16:41:55 crc kubenswrapper[4707]: I1127 16:41:55.721192 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a41316a9-e4d8-4406-88c2-4cf8883a79af-utilities\") pod \"redhat-marketplace-9wxtr\" (UID: \"a41316a9-e4d8-4406-88c2-4cf8883a79af\") " pod="openshift-marketplace/redhat-marketplace-9wxtr" Nov 27 16:41:55 crc kubenswrapper[4707]: I1127 16:41:55.721792 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a41316a9-e4d8-4406-88c2-4cf8883a79af-catalog-content\") pod \"redhat-marketplace-9wxtr\" (UID: \"a41316a9-e4d8-4406-88c2-4cf8883a79af\") " pod="openshift-marketplace/redhat-marketplace-9wxtr" Nov 27 16:41:55 crc kubenswrapper[4707]: I1127 16:41:55.721858 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w9fkk\" (UniqueName: \"kubernetes.io/projected/a41316a9-e4d8-4406-88c2-4cf8883a79af-kube-api-access-w9fkk\") pod \"redhat-marketplace-9wxtr\" (UID: \"a41316a9-e4d8-4406-88c2-4cf8883a79af\") " pod="openshift-marketplace/redhat-marketplace-9wxtr" Nov 27 16:41:55 crc kubenswrapper[4707]: I1127 16:41:55.824028 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a41316a9-e4d8-4406-88c2-4cf8883a79af-catalog-content\") pod \"redhat-marketplace-9wxtr\" (UID: \"a41316a9-e4d8-4406-88c2-4cf8883a79af\") " pod="openshift-marketplace/redhat-marketplace-9wxtr" Nov 27 16:41:55 crc kubenswrapper[4707]: I1127 16:41:55.824097 4707 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-w9fkk\" (UniqueName: \"kubernetes.io/projected/a41316a9-e4d8-4406-88c2-4cf8883a79af-kube-api-access-w9fkk\") pod \"redhat-marketplace-9wxtr\" (UID: \"a41316a9-e4d8-4406-88c2-4cf8883a79af\") " pod="openshift-marketplace/redhat-marketplace-9wxtr" Nov 27 16:41:55 crc kubenswrapper[4707]: I1127 16:41:55.824217 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a41316a9-e4d8-4406-88c2-4cf8883a79af-utilities\") pod \"redhat-marketplace-9wxtr\" (UID: \"a41316a9-e4d8-4406-88c2-4cf8883a79af\") " pod="openshift-marketplace/redhat-marketplace-9wxtr" Nov 27 16:41:55 crc kubenswrapper[4707]: I1127 16:41:55.824608 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a41316a9-e4d8-4406-88c2-4cf8883a79af-catalog-content\") pod \"redhat-marketplace-9wxtr\" (UID: \"a41316a9-e4d8-4406-88c2-4cf8883a79af\") " pod="openshift-marketplace/redhat-marketplace-9wxtr" Nov 27 16:41:55 crc kubenswrapper[4707]: I1127 16:41:55.824774 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a41316a9-e4d8-4406-88c2-4cf8883a79af-utilities\") pod \"redhat-marketplace-9wxtr\" (UID: \"a41316a9-e4d8-4406-88c2-4cf8883a79af\") " pod="openshift-marketplace/redhat-marketplace-9wxtr" Nov 27 16:41:55 crc kubenswrapper[4707]: I1127 16:41:55.844535 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w9fkk\" (UniqueName: \"kubernetes.io/projected/a41316a9-e4d8-4406-88c2-4cf8883a79af-kube-api-access-w9fkk\") pod \"redhat-marketplace-9wxtr\" (UID: \"a41316a9-e4d8-4406-88c2-4cf8883a79af\") " pod="openshift-marketplace/redhat-marketplace-9wxtr" Nov 27 16:41:55 crc kubenswrapper[4707]: I1127 16:41:55.876121 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-9wxtr" Nov 27 16:41:56 crc kubenswrapper[4707]: I1127 16:41:56.375924 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-9wxtr"] Nov 27 16:41:56 crc kubenswrapper[4707]: I1127 16:41:56.805098 4707 generic.go:334] "Generic (PLEG): container finished" podID="a41316a9-e4d8-4406-88c2-4cf8883a79af" containerID="5684fd0a354e731706a6823df9a04b9d7b6870758e1b9eb78a76c35962ce22a5" exitCode=0 Nov 27 16:41:56 crc kubenswrapper[4707]: I1127 16:41:56.805227 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9wxtr" event={"ID":"a41316a9-e4d8-4406-88c2-4cf8883a79af","Type":"ContainerDied","Data":"5684fd0a354e731706a6823df9a04b9d7b6870758e1b9eb78a76c35962ce22a5"} Nov 27 16:41:56 crc kubenswrapper[4707]: I1127 16:41:56.805519 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9wxtr" event={"ID":"a41316a9-e4d8-4406-88c2-4cf8883a79af","Type":"ContainerStarted","Data":"013bc8bca458d7e7eb0ab1042a2b23d844e6bb55d5ee57e7d4b496a39d8a58b6"} Nov 27 16:41:56 crc kubenswrapper[4707]: I1127 16:41:56.807039 4707 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Nov 27 16:41:59 crc kubenswrapper[4707]: I1127 16:41:59.846165 4707 generic.go:334] "Generic (PLEG): container finished" podID="a41316a9-e4d8-4406-88c2-4cf8883a79af" containerID="f659c7481e7789d424648cda43e4e3e4df12d3d064ea003c2e45738d34dd4c5a" exitCode=0 Nov 27 16:41:59 crc kubenswrapper[4707]: I1127 16:41:59.846211 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9wxtr" event={"ID":"a41316a9-e4d8-4406-88c2-4cf8883a79af","Type":"ContainerDied","Data":"f659c7481e7789d424648cda43e4e3e4df12d3d064ea003c2e45738d34dd4c5a"} Nov 27 16:42:00 crc kubenswrapper[4707]: I1127 16:42:00.932192 4707 kubelet.go:2421] "SyncLoop ADD" 
source="api" pods=["openshift-marketplace/redhat-operators-5rctg"] Nov 27 16:42:00 crc kubenswrapper[4707]: I1127 16:42:00.937215 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-5rctg" Nov 27 16:42:00 crc kubenswrapper[4707]: I1127 16:42:00.957617 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-5rctg"] Nov 27 16:42:01 crc kubenswrapper[4707]: I1127 16:42:01.032952 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6e78461c-1a57-411d-b143-0c4160e11623-catalog-content\") pod \"redhat-operators-5rctg\" (UID: \"6e78461c-1a57-411d-b143-0c4160e11623\") " pod="openshift-marketplace/redhat-operators-5rctg" Nov 27 16:42:01 crc kubenswrapper[4707]: I1127 16:42:01.033203 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6e78461c-1a57-411d-b143-0c4160e11623-utilities\") pod \"redhat-operators-5rctg\" (UID: \"6e78461c-1a57-411d-b143-0c4160e11623\") " pod="openshift-marketplace/redhat-operators-5rctg" Nov 27 16:42:01 crc kubenswrapper[4707]: I1127 16:42:01.033308 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dsx6k\" (UniqueName: \"kubernetes.io/projected/6e78461c-1a57-411d-b143-0c4160e11623-kube-api-access-dsx6k\") pod \"redhat-operators-5rctg\" (UID: \"6e78461c-1a57-411d-b143-0c4160e11623\") " pod="openshift-marketplace/redhat-operators-5rctg" Nov 27 16:42:01 crc kubenswrapper[4707]: I1127 16:42:01.135733 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6e78461c-1a57-411d-b143-0c4160e11623-utilities\") pod \"redhat-operators-5rctg\" (UID: \"6e78461c-1a57-411d-b143-0c4160e11623\") " 
pod="openshift-marketplace/redhat-operators-5rctg" Nov 27 16:42:01 crc kubenswrapper[4707]: I1127 16:42:01.135842 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dsx6k\" (UniqueName: \"kubernetes.io/projected/6e78461c-1a57-411d-b143-0c4160e11623-kube-api-access-dsx6k\") pod \"redhat-operators-5rctg\" (UID: \"6e78461c-1a57-411d-b143-0c4160e11623\") " pod="openshift-marketplace/redhat-operators-5rctg" Nov 27 16:42:01 crc kubenswrapper[4707]: I1127 16:42:01.135907 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6e78461c-1a57-411d-b143-0c4160e11623-catalog-content\") pod \"redhat-operators-5rctg\" (UID: \"6e78461c-1a57-411d-b143-0c4160e11623\") " pod="openshift-marketplace/redhat-operators-5rctg" Nov 27 16:42:01 crc kubenswrapper[4707]: I1127 16:42:01.136302 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6e78461c-1a57-411d-b143-0c4160e11623-utilities\") pod \"redhat-operators-5rctg\" (UID: \"6e78461c-1a57-411d-b143-0c4160e11623\") " pod="openshift-marketplace/redhat-operators-5rctg" Nov 27 16:42:01 crc kubenswrapper[4707]: I1127 16:42:01.136438 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6e78461c-1a57-411d-b143-0c4160e11623-catalog-content\") pod \"redhat-operators-5rctg\" (UID: \"6e78461c-1a57-411d-b143-0c4160e11623\") " pod="openshift-marketplace/redhat-operators-5rctg" Nov 27 16:42:01 crc kubenswrapper[4707]: I1127 16:42:01.159032 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dsx6k\" (UniqueName: \"kubernetes.io/projected/6e78461c-1a57-411d-b143-0c4160e11623-kube-api-access-dsx6k\") pod \"redhat-operators-5rctg\" (UID: \"6e78461c-1a57-411d-b143-0c4160e11623\") " pod="openshift-marketplace/redhat-operators-5rctg" Nov 
27 16:42:01 crc kubenswrapper[4707]: I1127 16:42:01.257940 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-5rctg" Nov 27 16:42:01 crc kubenswrapper[4707]: I1127 16:42:01.785685 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-5rctg"] Nov 27 16:42:01 crc kubenswrapper[4707]: I1127 16:42:01.866893 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9wxtr" event={"ID":"a41316a9-e4d8-4406-88c2-4cf8883a79af","Type":"ContainerStarted","Data":"d0f0f5226bad182101bc958c99ef36a78628e6dfe3d83c5cfbf68d8f2f9a897d"} Nov 27 16:42:01 crc kubenswrapper[4707]: I1127 16:42:01.868969 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5rctg" event={"ID":"6e78461c-1a57-411d-b143-0c4160e11623","Type":"ContainerStarted","Data":"95bc4dd969d97b9a5569d563e7347c421c5a7ea654d643edc69ce865028ef765"} Nov 27 16:42:01 crc kubenswrapper[4707]: I1127 16:42:01.892732 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-9wxtr" podStartSLOduration=2.984435002 podStartE2EDuration="6.892712525s" podCreationTimestamp="2025-11-27 16:41:55 +0000 UTC" firstStartedPulling="2025-11-27 16:41:56.806748721 +0000 UTC m=+2292.438197489" lastFinishedPulling="2025-11-27 16:42:00.715026204 +0000 UTC m=+2296.346475012" observedRunningTime="2025-11-27 16:42:01.883506147 +0000 UTC m=+2297.514954925" watchObservedRunningTime="2025-11-27 16:42:01.892712525 +0000 UTC m=+2297.524161293" Nov 27 16:42:02 crc kubenswrapper[4707]: I1127 16:42:02.893570 4707 generic.go:334] "Generic (PLEG): container finished" podID="6e78461c-1a57-411d-b143-0c4160e11623" containerID="fbe3f980ff7942acb0c657a0f233ea4e331c6c3d8903e7b80637af054ea2530a" exitCode=0 Nov 27 16:42:02 crc kubenswrapper[4707]: I1127 16:42:02.893662 4707 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/redhat-operators-5rctg" event={"ID":"6e78461c-1a57-411d-b143-0c4160e11623","Type":"ContainerDied","Data":"fbe3f980ff7942acb0c657a0f233ea4e331c6c3d8903e7b80637af054ea2530a"} Nov 27 16:42:03 crc kubenswrapper[4707]: I1127 16:42:03.624444 4707 patch_prober.go:28] interesting pod/machine-config-daemon-c995m container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 27 16:42:03 crc kubenswrapper[4707]: I1127 16:42:03.624926 4707 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-c995m" podUID="a83beb0d-8dd1-434a-ace2-933f98e3956f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 27 16:42:03 crc kubenswrapper[4707]: I1127 16:42:03.625004 4707 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-c995m" Nov 27 16:42:03 crc kubenswrapper[4707]: I1127 16:42:03.626208 4707 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"f26ac9df9340ea2ea514252ce6b8bc1868c8afd678b5bdcc749e5a78be5c9f11"} pod="openshift-machine-config-operator/machine-config-daemon-c995m" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Nov 27 16:42:03 crc kubenswrapper[4707]: I1127 16:42:03.626329 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-c995m" podUID="a83beb0d-8dd1-434a-ace2-933f98e3956f" containerName="machine-config-daemon" containerID="cri-o://f26ac9df9340ea2ea514252ce6b8bc1868c8afd678b5bdcc749e5a78be5c9f11" gracePeriod=600 Nov 27 
16:42:03 crc kubenswrapper[4707]: E1127 16:42:03.811742 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-c995m_openshift-machine-config-operator(a83beb0d-8dd1-434a-ace2-933f98e3956f)\"" pod="openshift-machine-config-operator/machine-config-daemon-c995m" podUID="a83beb0d-8dd1-434a-ace2-933f98e3956f" Nov 27 16:42:03 crc kubenswrapper[4707]: I1127 16:42:03.906952 4707 generic.go:334] "Generic (PLEG): container finished" podID="a83beb0d-8dd1-434a-ace2-933f98e3956f" containerID="f26ac9df9340ea2ea514252ce6b8bc1868c8afd678b5bdcc749e5a78be5c9f11" exitCode=0 Nov 27 16:42:03 crc kubenswrapper[4707]: I1127 16:42:03.907019 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-c995m" event={"ID":"a83beb0d-8dd1-434a-ace2-933f98e3956f","Type":"ContainerDied","Data":"f26ac9df9340ea2ea514252ce6b8bc1868c8afd678b5bdcc749e5a78be5c9f11"} Nov 27 16:42:03 crc kubenswrapper[4707]: I1127 16:42:03.907076 4707 scope.go:117] "RemoveContainer" containerID="72f6c6a465def83be9b7fe8fbb6add62002cd9afab0e6d69d08f283f6d9f8a4c" Nov 27 16:42:03 crc kubenswrapper[4707]: I1127 16:42:03.907884 4707 scope.go:117] "RemoveContainer" containerID="f26ac9df9340ea2ea514252ce6b8bc1868c8afd678b5bdcc749e5a78be5c9f11" Nov 27 16:42:03 crc kubenswrapper[4707]: E1127 16:42:03.909122 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-c995m_openshift-machine-config-operator(a83beb0d-8dd1-434a-ace2-933f98e3956f)\"" pod="openshift-machine-config-operator/machine-config-daemon-c995m" podUID="a83beb0d-8dd1-434a-ace2-933f98e3956f" Nov 27 16:42:04 crc kubenswrapper[4707]: I1127 16:42:04.919661 4707 
generic.go:334] "Generic (PLEG): container finished" podID="6e78461c-1a57-411d-b143-0c4160e11623" containerID="5f3ebb17d4735bc05362fa8274e467dd65bd37db3e712244a69c057922ded08e" exitCode=0 Nov 27 16:42:04 crc kubenswrapper[4707]: I1127 16:42:04.919767 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5rctg" event={"ID":"6e78461c-1a57-411d-b143-0c4160e11623","Type":"ContainerDied","Data":"5f3ebb17d4735bc05362fa8274e467dd65bd37db3e712244a69c057922ded08e"} Nov 27 16:42:05 crc kubenswrapper[4707]: I1127 16:42:05.876767 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-9wxtr" Nov 27 16:42:05 crc kubenswrapper[4707]: I1127 16:42:05.879658 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-9wxtr" Nov 27 16:42:05 crc kubenswrapper[4707]: I1127 16:42:05.939623 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-9wxtr" Nov 27 16:42:07 crc kubenswrapper[4707]: I1127 16:42:07.025559 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-9wxtr" Nov 27 16:42:07 crc kubenswrapper[4707]: I1127 16:42:07.953180 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5rctg" event={"ID":"6e78461c-1a57-411d-b143-0c4160e11623","Type":"ContainerStarted","Data":"103b59d915d51ce9cad1944ced7aaadc121b97492f5c840f7ba349c5cf5edae8"} Nov 27 16:42:07 crc kubenswrapper[4707]: I1127 16:42:07.990815 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-5rctg" podStartSLOduration=3.497984329 podStartE2EDuration="7.990787821s" podCreationTimestamp="2025-11-27 16:42:00 +0000 UTC" firstStartedPulling="2025-11-27 16:42:02.896692336 +0000 UTC m=+2298.528141144" 
lastFinishedPulling="2025-11-27 16:42:07.389495868 +0000 UTC m=+2303.020944636" observedRunningTime="2025-11-27 16:42:07.97859834 +0000 UTC m=+2303.610047188" watchObservedRunningTime="2025-11-27 16:42:07.990787821 +0000 UTC m=+2303.622236619" Nov 27 16:42:08 crc kubenswrapper[4707]: I1127 16:42:08.120687 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-9wxtr"] Nov 27 16:42:08 crc kubenswrapper[4707]: I1127 16:42:08.967164 4707 generic.go:334] "Generic (PLEG): container finished" podID="18fd2519-f36c-4817-85da-7615979c3340" containerID="47d550117d2c93fb4b4d5382b48a90a096a3bc34693686b1ca38e95816223ad3" exitCode=0 Nov 27 16:42:08 crc kubenswrapper[4707]: I1127 16:42:08.967276 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-zjmfq" event={"ID":"18fd2519-f36c-4817-85da-7615979c3340","Type":"ContainerDied","Data":"47d550117d2c93fb4b4d5382b48a90a096a3bc34693686b1ca38e95816223ad3"} Nov 27 16:42:08 crc kubenswrapper[4707]: I1127 16:42:08.967521 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-9wxtr" podUID="a41316a9-e4d8-4406-88c2-4cf8883a79af" containerName="registry-server" containerID="cri-o://d0f0f5226bad182101bc958c99ef36a78628e6dfe3d83c5cfbf68d8f2f9a897d" gracePeriod=2 Nov 27 16:42:09 crc kubenswrapper[4707]: I1127 16:42:09.464861 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-9wxtr" Nov 27 16:42:09 crc kubenswrapper[4707]: I1127 16:42:09.615295 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9fkk\" (UniqueName: \"kubernetes.io/projected/a41316a9-e4d8-4406-88c2-4cf8883a79af-kube-api-access-w9fkk\") pod \"a41316a9-e4d8-4406-88c2-4cf8883a79af\" (UID: \"a41316a9-e4d8-4406-88c2-4cf8883a79af\") " Nov 27 16:42:09 crc kubenswrapper[4707]: I1127 16:42:09.615861 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a41316a9-e4d8-4406-88c2-4cf8883a79af-catalog-content\") pod \"a41316a9-e4d8-4406-88c2-4cf8883a79af\" (UID: \"a41316a9-e4d8-4406-88c2-4cf8883a79af\") " Nov 27 16:42:09 crc kubenswrapper[4707]: I1127 16:42:09.616052 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a41316a9-e4d8-4406-88c2-4cf8883a79af-utilities\") pod \"a41316a9-e4d8-4406-88c2-4cf8883a79af\" (UID: \"a41316a9-e4d8-4406-88c2-4cf8883a79af\") " Nov 27 16:42:09 crc kubenswrapper[4707]: I1127 16:42:09.617646 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a41316a9-e4d8-4406-88c2-4cf8883a79af-utilities" (OuterVolumeSpecName: "utilities") pod "a41316a9-e4d8-4406-88c2-4cf8883a79af" (UID: "a41316a9-e4d8-4406-88c2-4cf8883a79af"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 27 16:42:09 crc kubenswrapper[4707]: I1127 16:42:09.626657 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a41316a9-e4d8-4406-88c2-4cf8883a79af-kube-api-access-w9fkk" (OuterVolumeSpecName: "kube-api-access-w9fkk") pod "a41316a9-e4d8-4406-88c2-4cf8883a79af" (UID: "a41316a9-e4d8-4406-88c2-4cf8883a79af"). InnerVolumeSpecName "kube-api-access-w9fkk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 16:42:09 crc kubenswrapper[4707]: I1127 16:42:09.657894 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a41316a9-e4d8-4406-88c2-4cf8883a79af-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a41316a9-e4d8-4406-88c2-4cf8883a79af" (UID: "a41316a9-e4d8-4406-88c2-4cf8883a79af"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 27 16:42:09 crc kubenswrapper[4707]: I1127 16:42:09.718248 4707 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a41316a9-e4d8-4406-88c2-4cf8883a79af-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 27 16:42:09 crc kubenswrapper[4707]: I1127 16:42:09.718277 4707 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a41316a9-e4d8-4406-88c2-4cf8883a79af-utilities\") on node \"crc\" DevicePath \"\"" Nov 27 16:42:09 crc kubenswrapper[4707]: I1127 16:42:09.718288 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9fkk\" (UniqueName: \"kubernetes.io/projected/a41316a9-e4d8-4406-88c2-4cf8883a79af-kube-api-access-w9fkk\") on node \"crc\" DevicePath \"\"" Nov 27 16:42:09 crc kubenswrapper[4707]: I1127 16:42:09.982633 4707 generic.go:334] "Generic (PLEG): container finished" podID="a41316a9-e4d8-4406-88c2-4cf8883a79af" containerID="d0f0f5226bad182101bc958c99ef36a78628e6dfe3d83c5cfbf68d8f2f9a897d" exitCode=0 Nov 27 16:42:09 crc kubenswrapper[4707]: I1127 16:42:09.982693 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9wxtr" event={"ID":"a41316a9-e4d8-4406-88c2-4cf8883a79af","Type":"ContainerDied","Data":"d0f0f5226bad182101bc958c99ef36a78628e6dfe3d83c5cfbf68d8f2f9a897d"} Nov 27 16:42:09 crc kubenswrapper[4707]: I1127 16:42:09.982732 4707 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-9wxtr" Nov 27 16:42:09 crc kubenswrapper[4707]: I1127 16:42:09.982759 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9wxtr" event={"ID":"a41316a9-e4d8-4406-88c2-4cf8883a79af","Type":"ContainerDied","Data":"013bc8bca458d7e7eb0ab1042a2b23d844e6bb55d5ee57e7d4b496a39d8a58b6"} Nov 27 16:42:09 crc kubenswrapper[4707]: I1127 16:42:09.982783 4707 scope.go:117] "RemoveContainer" containerID="d0f0f5226bad182101bc958c99ef36a78628e6dfe3d83c5cfbf68d8f2f9a897d" Nov 27 16:42:10 crc kubenswrapper[4707]: I1127 16:42:10.021792 4707 scope.go:117] "RemoveContainer" containerID="f659c7481e7789d424648cda43e4e3e4df12d3d064ea003c2e45738d34dd4c5a" Nov 27 16:42:10 crc kubenswrapper[4707]: I1127 16:42:10.054248 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-9wxtr"] Nov 27 16:42:10 crc kubenswrapper[4707]: I1127 16:42:10.062144 4707 scope.go:117] "RemoveContainer" containerID="5684fd0a354e731706a6823df9a04b9d7b6870758e1b9eb78a76c35962ce22a5" Nov 27 16:42:10 crc kubenswrapper[4707]: I1127 16:42:10.064382 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-9wxtr"] Nov 27 16:42:10 crc kubenswrapper[4707]: I1127 16:42:10.110363 4707 scope.go:117] "RemoveContainer" containerID="d0f0f5226bad182101bc958c99ef36a78628e6dfe3d83c5cfbf68d8f2f9a897d" Nov 27 16:42:10 crc kubenswrapper[4707]: E1127 16:42:10.110965 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d0f0f5226bad182101bc958c99ef36a78628e6dfe3d83c5cfbf68d8f2f9a897d\": container with ID starting with d0f0f5226bad182101bc958c99ef36a78628e6dfe3d83c5cfbf68d8f2f9a897d not found: ID does not exist" containerID="d0f0f5226bad182101bc958c99ef36a78628e6dfe3d83c5cfbf68d8f2f9a897d" Nov 27 16:42:10 crc kubenswrapper[4707]: I1127 16:42:10.111008 4707 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d0f0f5226bad182101bc958c99ef36a78628e6dfe3d83c5cfbf68d8f2f9a897d"} err="failed to get container status \"d0f0f5226bad182101bc958c99ef36a78628e6dfe3d83c5cfbf68d8f2f9a897d\": rpc error: code = NotFound desc = could not find container \"d0f0f5226bad182101bc958c99ef36a78628e6dfe3d83c5cfbf68d8f2f9a897d\": container with ID starting with d0f0f5226bad182101bc958c99ef36a78628e6dfe3d83c5cfbf68d8f2f9a897d not found: ID does not exist" Nov 27 16:42:10 crc kubenswrapper[4707]: I1127 16:42:10.111041 4707 scope.go:117] "RemoveContainer" containerID="f659c7481e7789d424648cda43e4e3e4df12d3d064ea003c2e45738d34dd4c5a" Nov 27 16:42:10 crc kubenswrapper[4707]: E1127 16:42:10.111418 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f659c7481e7789d424648cda43e4e3e4df12d3d064ea003c2e45738d34dd4c5a\": container with ID starting with f659c7481e7789d424648cda43e4e3e4df12d3d064ea003c2e45738d34dd4c5a not found: ID does not exist" containerID="f659c7481e7789d424648cda43e4e3e4df12d3d064ea003c2e45738d34dd4c5a" Nov 27 16:42:10 crc kubenswrapper[4707]: I1127 16:42:10.111450 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f659c7481e7789d424648cda43e4e3e4df12d3d064ea003c2e45738d34dd4c5a"} err="failed to get container status \"f659c7481e7789d424648cda43e4e3e4df12d3d064ea003c2e45738d34dd4c5a\": rpc error: code = NotFound desc = could not find container \"f659c7481e7789d424648cda43e4e3e4df12d3d064ea003c2e45738d34dd4c5a\": container with ID starting with f659c7481e7789d424648cda43e4e3e4df12d3d064ea003c2e45738d34dd4c5a not found: ID does not exist" Nov 27 16:42:10 crc kubenswrapper[4707]: I1127 16:42:10.111479 4707 scope.go:117] "RemoveContainer" containerID="5684fd0a354e731706a6823df9a04b9d7b6870758e1b9eb78a76c35962ce22a5" Nov 27 16:42:10 crc kubenswrapper[4707]: E1127 
16:42:10.112024 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5684fd0a354e731706a6823df9a04b9d7b6870758e1b9eb78a76c35962ce22a5\": container with ID starting with 5684fd0a354e731706a6823df9a04b9d7b6870758e1b9eb78a76c35962ce22a5 not found: ID does not exist" containerID="5684fd0a354e731706a6823df9a04b9d7b6870758e1b9eb78a76c35962ce22a5" Nov 27 16:42:10 crc kubenswrapper[4707]: I1127 16:42:10.112046 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5684fd0a354e731706a6823df9a04b9d7b6870758e1b9eb78a76c35962ce22a5"} err="failed to get container status \"5684fd0a354e731706a6823df9a04b9d7b6870758e1b9eb78a76c35962ce22a5\": rpc error: code = NotFound desc = could not find container \"5684fd0a354e731706a6823df9a04b9d7b6870758e1b9eb78a76c35962ce22a5\": container with ID starting with 5684fd0a354e731706a6823df9a04b9d7b6870758e1b9eb78a76c35962ce22a5 not found: ID does not exist" Nov 27 16:42:10 crc kubenswrapper[4707]: I1127 16:42:10.462841 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-zjmfq" Nov 27 16:42:10 crc kubenswrapper[4707]: I1127 16:42:10.534380 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/18fd2519-f36c-4817-85da-7615979c3340-ssh-key\") pod \"18fd2519-f36c-4817-85da-7615979c3340\" (UID: \"18fd2519-f36c-4817-85da-7615979c3340\") " Nov 27 16:42:10 crc kubenswrapper[4707]: I1127 16:42:10.534452 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5mpkc\" (UniqueName: \"kubernetes.io/projected/18fd2519-f36c-4817-85da-7615979c3340-kube-api-access-5mpkc\") pod \"18fd2519-f36c-4817-85da-7615979c3340\" (UID: \"18fd2519-f36c-4817-85da-7615979c3340\") " Nov 27 16:42:10 crc kubenswrapper[4707]: I1127 16:42:10.534533 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/18fd2519-f36c-4817-85da-7615979c3340-inventory\") pod \"18fd2519-f36c-4817-85da-7615979c3340\" (UID: \"18fd2519-f36c-4817-85da-7615979c3340\") " Nov 27 16:42:10 crc kubenswrapper[4707]: I1127 16:42:10.534609 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/18fd2519-f36c-4817-85da-7615979c3340-libvirt-combined-ca-bundle\") pod \"18fd2519-f36c-4817-85da-7615979c3340\" (UID: \"18fd2519-f36c-4817-85da-7615979c3340\") " Nov 27 16:42:10 crc kubenswrapper[4707]: I1127 16:42:10.534650 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/18fd2519-f36c-4817-85da-7615979c3340-libvirt-secret-0\") pod \"18fd2519-f36c-4817-85da-7615979c3340\" (UID: \"18fd2519-f36c-4817-85da-7615979c3340\") " Nov 27 16:42:10 crc kubenswrapper[4707]: I1127 16:42:10.540441 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for 
volume "kubernetes.io/projected/18fd2519-f36c-4817-85da-7615979c3340-kube-api-access-5mpkc" (OuterVolumeSpecName: "kube-api-access-5mpkc") pod "18fd2519-f36c-4817-85da-7615979c3340" (UID: "18fd2519-f36c-4817-85da-7615979c3340"). InnerVolumeSpecName "kube-api-access-5mpkc". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 16:42:10 crc kubenswrapper[4707]: I1127 16:42:10.542489 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/18fd2519-f36c-4817-85da-7615979c3340-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "18fd2519-f36c-4817-85da-7615979c3340" (UID: "18fd2519-f36c-4817-85da-7615979c3340"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 16:42:10 crc kubenswrapper[4707]: I1127 16:42:10.565712 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/18fd2519-f36c-4817-85da-7615979c3340-libvirt-secret-0" (OuterVolumeSpecName: "libvirt-secret-0") pod "18fd2519-f36c-4817-85da-7615979c3340" (UID: "18fd2519-f36c-4817-85da-7615979c3340"). InnerVolumeSpecName "libvirt-secret-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 16:42:10 crc kubenswrapper[4707]: I1127 16:42:10.569577 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/18fd2519-f36c-4817-85da-7615979c3340-inventory" (OuterVolumeSpecName: "inventory") pod "18fd2519-f36c-4817-85da-7615979c3340" (UID: "18fd2519-f36c-4817-85da-7615979c3340"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 16:42:10 crc kubenswrapper[4707]: I1127 16:42:10.571176 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/18fd2519-f36c-4817-85da-7615979c3340-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "18fd2519-f36c-4817-85da-7615979c3340" (UID: "18fd2519-f36c-4817-85da-7615979c3340"). 
InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 16:42:10 crc kubenswrapper[4707]: I1127 16:42:10.636770 4707 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/18fd2519-f36c-4817-85da-7615979c3340-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 27 16:42:10 crc kubenswrapper[4707]: I1127 16:42:10.636799 4707 reconciler_common.go:293] "Volume detached for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/18fd2519-f36c-4817-85da-7615979c3340-libvirt-secret-0\") on node \"crc\" DevicePath \"\"" Nov 27 16:42:10 crc kubenswrapper[4707]: I1127 16:42:10.636809 4707 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/18fd2519-f36c-4817-85da-7615979c3340-ssh-key\") on node \"crc\" DevicePath \"\"" Nov 27 16:42:10 crc kubenswrapper[4707]: I1127 16:42:10.636817 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5mpkc\" (UniqueName: \"kubernetes.io/projected/18fd2519-f36c-4817-85da-7615979c3340-kube-api-access-5mpkc\") on node \"crc\" DevicePath \"\"" Nov 27 16:42:10 crc kubenswrapper[4707]: I1127 16:42:10.636828 4707 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/18fd2519-f36c-4817-85da-7615979c3340-inventory\") on node \"crc\" DevicePath \"\"" Nov 27 16:42:11 crc kubenswrapper[4707]: I1127 16:42:11.002308 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-zjmfq" event={"ID":"18fd2519-f36c-4817-85da-7615979c3340","Type":"ContainerDied","Data":"5faafaf85956e94bd183cbd1ce3d76ce3975ea0b41e4d4459786b73dbc8f6d14"} Nov 27 16:42:11 crc kubenswrapper[4707]: I1127 16:42:11.002692 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5faafaf85956e94bd183cbd1ce3d76ce3975ea0b41e4d4459786b73dbc8f6d14" Nov 27 16:42:11 
crc kubenswrapper[4707]: I1127 16:42:11.002595 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-zjmfq" Nov 27 16:42:11 crc kubenswrapper[4707]: I1127 16:42:11.122893 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-lxzx6"] Nov 27 16:42:11 crc kubenswrapper[4707]: E1127 16:42:11.123632 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a41316a9-e4d8-4406-88c2-4cf8883a79af" containerName="registry-server" Nov 27 16:42:11 crc kubenswrapper[4707]: I1127 16:42:11.123730 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="a41316a9-e4d8-4406-88c2-4cf8883a79af" containerName="registry-server" Nov 27 16:42:11 crc kubenswrapper[4707]: E1127 16:42:11.123831 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="18fd2519-f36c-4817-85da-7615979c3340" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Nov 27 16:42:11 crc kubenswrapper[4707]: I1127 16:42:11.123917 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="18fd2519-f36c-4817-85da-7615979c3340" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Nov 27 16:42:11 crc kubenswrapper[4707]: E1127 16:42:11.123991 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a41316a9-e4d8-4406-88c2-4cf8883a79af" containerName="extract-content" Nov 27 16:42:11 crc kubenswrapper[4707]: I1127 16:42:11.124055 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="a41316a9-e4d8-4406-88c2-4cf8883a79af" containerName="extract-content" Nov 27 16:42:11 crc kubenswrapper[4707]: E1127 16:42:11.124138 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a41316a9-e4d8-4406-88c2-4cf8883a79af" containerName="extract-utilities" Nov 27 16:42:11 crc kubenswrapper[4707]: I1127 16:42:11.124213 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="a41316a9-e4d8-4406-88c2-4cf8883a79af" 
containerName="extract-utilities" Nov 27 16:42:11 crc kubenswrapper[4707]: I1127 16:42:11.125019 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="a41316a9-e4d8-4406-88c2-4cf8883a79af" containerName="registry-server" Nov 27 16:42:11 crc kubenswrapper[4707]: I1127 16:42:11.125138 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="18fd2519-f36c-4817-85da-7615979c3340" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Nov 27 16:42:11 crc kubenswrapper[4707]: I1127 16:42:11.127115 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-lxzx6" Nov 27 16:42:11 crc kubenswrapper[4707]: I1127 16:42:11.129534 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"nova-extra-config" Nov 27 16:42:11 crc kubenswrapper[4707]: I1127 16:42:11.130268 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-migration-ssh-key" Nov 27 16:42:11 crc kubenswrapper[4707]: I1127 16:42:11.130844 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Nov 27 16:42:11 crc kubenswrapper[4707]: I1127 16:42:11.131288 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-compute-config" Nov 27 16:42:11 crc kubenswrapper[4707]: I1127 16:42:11.131358 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Nov 27 16:42:11 crc kubenswrapper[4707]: I1127 16:42:11.131447 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Nov 27 16:42:11 crc kubenswrapper[4707]: I1127 16:42:11.131783 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-h79s2" Nov 27 16:42:11 crc kubenswrapper[4707]: I1127 16:42:11.157412 4707 kubelet.go:2428] "SyncLoop UPDATE" 
source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-lxzx6"] Nov 27 16:42:11 crc kubenswrapper[4707]: I1127 16:42:11.216477 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a41316a9-e4d8-4406-88c2-4cf8883a79af" path="/var/lib/kubelet/pods/a41316a9-e4d8-4406-88c2-4cf8883a79af/volumes" Nov 27 16:42:11 crc kubenswrapper[4707]: I1127 16:42:11.250188 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/c0d7830e-74a5-4ea0-b396-0095a96496be-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-lxzx6\" (UID: \"c0d7830e-74a5-4ea0-b396-0095a96496be\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-lxzx6" Nov 27 16:42:11 crc kubenswrapper[4707]: I1127 16:42:11.250253 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c0d7830e-74a5-4ea0-b396-0095a96496be-ssh-key\") pod \"nova-edpm-deployment-openstack-edpm-ipam-lxzx6\" (UID: \"c0d7830e-74a5-4ea0-b396-0095a96496be\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-lxzx6" Nov 27 16:42:11 crc kubenswrapper[4707]: I1127 16:42:11.250479 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/c0d7830e-74a5-4ea0-b396-0095a96496be-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-lxzx6\" (UID: \"c0d7830e-74a5-4ea0-b396-0095a96496be\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-lxzx6" Nov 27 16:42:11 crc kubenswrapper[4707]: I1127 16:42:11.250553 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/c0d7830e-74a5-4ea0-b396-0095a96496be-nova-extra-config-0\") pod 
\"nova-edpm-deployment-openstack-edpm-ipam-lxzx6\" (UID: \"c0d7830e-74a5-4ea0-b396-0095a96496be\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-lxzx6" Nov 27 16:42:11 crc kubenswrapper[4707]: I1127 16:42:11.250601 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c0d7830e-74a5-4ea0-b396-0095a96496be-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-lxzx6\" (UID: \"c0d7830e-74a5-4ea0-b396-0095a96496be\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-lxzx6" Nov 27 16:42:11 crc kubenswrapper[4707]: I1127 16:42:11.250646 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/c0d7830e-74a5-4ea0-b396-0095a96496be-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-lxzx6\" (UID: \"c0d7830e-74a5-4ea0-b396-0095a96496be\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-lxzx6" Nov 27 16:42:11 crc kubenswrapper[4707]: I1127 16:42:11.250884 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c0d7830e-74a5-4ea0-b396-0095a96496be-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-lxzx6\" (UID: \"c0d7830e-74a5-4ea0-b396-0095a96496be\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-lxzx6" Nov 27 16:42:11 crc kubenswrapper[4707]: I1127 16:42:11.251189 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-clj48\" (UniqueName: \"kubernetes.io/projected/c0d7830e-74a5-4ea0-b396-0095a96496be-kube-api-access-clj48\") pod \"nova-edpm-deployment-openstack-edpm-ipam-lxzx6\" (UID: \"c0d7830e-74a5-4ea0-b396-0095a96496be\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-lxzx6" Nov 27 16:42:11 crc 
kubenswrapper[4707]: I1127 16:42:11.251354 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/c0d7830e-74a5-4ea0-b396-0095a96496be-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-lxzx6\" (UID: \"c0d7830e-74a5-4ea0-b396-0095a96496be\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-lxzx6" Nov 27 16:42:11 crc kubenswrapper[4707]: I1127 16:42:11.259182 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-5rctg" Nov 27 16:42:11 crc kubenswrapper[4707]: I1127 16:42:11.261678 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-5rctg" Nov 27 16:42:11 crc kubenswrapper[4707]: I1127 16:42:11.353667 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c0d7830e-74a5-4ea0-b396-0095a96496be-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-lxzx6\" (UID: \"c0d7830e-74a5-4ea0-b396-0095a96496be\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-lxzx6" Nov 27 16:42:11 crc kubenswrapper[4707]: I1127 16:42:11.353720 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/c0d7830e-74a5-4ea0-b396-0095a96496be-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-lxzx6\" (UID: \"c0d7830e-74a5-4ea0-b396-0095a96496be\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-lxzx6" Nov 27 16:42:11 crc kubenswrapper[4707]: I1127 16:42:11.353771 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c0d7830e-74a5-4ea0-b396-0095a96496be-nova-combined-ca-bundle\") pod 
\"nova-edpm-deployment-openstack-edpm-ipam-lxzx6\" (UID: \"c0d7830e-74a5-4ea0-b396-0095a96496be\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-lxzx6" Nov 27 16:42:11 crc kubenswrapper[4707]: I1127 16:42:11.353851 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-clj48\" (UniqueName: \"kubernetes.io/projected/c0d7830e-74a5-4ea0-b396-0095a96496be-kube-api-access-clj48\") pod \"nova-edpm-deployment-openstack-edpm-ipam-lxzx6\" (UID: \"c0d7830e-74a5-4ea0-b396-0095a96496be\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-lxzx6" Nov 27 16:42:11 crc kubenswrapper[4707]: I1127 16:42:11.353902 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/c0d7830e-74a5-4ea0-b396-0095a96496be-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-lxzx6\" (UID: \"c0d7830e-74a5-4ea0-b396-0095a96496be\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-lxzx6" Nov 27 16:42:11 crc kubenswrapper[4707]: I1127 16:42:11.353923 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/c0d7830e-74a5-4ea0-b396-0095a96496be-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-lxzx6\" (UID: \"c0d7830e-74a5-4ea0-b396-0095a96496be\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-lxzx6" Nov 27 16:42:11 crc kubenswrapper[4707]: I1127 16:42:11.353939 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c0d7830e-74a5-4ea0-b396-0095a96496be-ssh-key\") pod \"nova-edpm-deployment-openstack-edpm-ipam-lxzx6\" (UID: \"c0d7830e-74a5-4ea0-b396-0095a96496be\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-lxzx6" Nov 27 16:42:11 crc kubenswrapper[4707]: I1127 16:42:11.354001 4707 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/c0d7830e-74a5-4ea0-b396-0095a96496be-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-lxzx6\" (UID: \"c0d7830e-74a5-4ea0-b396-0095a96496be\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-lxzx6" Nov 27 16:42:11 crc kubenswrapper[4707]: I1127 16:42:11.354046 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/c0d7830e-74a5-4ea0-b396-0095a96496be-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-lxzx6\" (UID: \"c0d7830e-74a5-4ea0-b396-0095a96496be\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-lxzx6" Nov 27 16:42:11 crc kubenswrapper[4707]: I1127 16:42:11.355131 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/c0d7830e-74a5-4ea0-b396-0095a96496be-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-lxzx6\" (UID: \"c0d7830e-74a5-4ea0-b396-0095a96496be\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-lxzx6" Nov 27 16:42:11 crc kubenswrapper[4707]: I1127 16:42:11.357866 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c0d7830e-74a5-4ea0-b396-0095a96496be-ssh-key\") pod \"nova-edpm-deployment-openstack-edpm-ipam-lxzx6\" (UID: \"c0d7830e-74a5-4ea0-b396-0095a96496be\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-lxzx6" Nov 27 16:42:11 crc kubenswrapper[4707]: I1127 16:42:11.358058 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c0d7830e-74a5-4ea0-b396-0095a96496be-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-lxzx6\" (UID: \"c0d7830e-74a5-4ea0-b396-0095a96496be\") " 
pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-lxzx6" Nov 27 16:42:11 crc kubenswrapper[4707]: I1127 16:42:11.358302 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/c0d7830e-74a5-4ea0-b396-0095a96496be-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-lxzx6\" (UID: \"c0d7830e-74a5-4ea0-b396-0095a96496be\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-lxzx6" Nov 27 16:42:11 crc kubenswrapper[4707]: I1127 16:42:11.358792 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c0d7830e-74a5-4ea0-b396-0095a96496be-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-lxzx6\" (UID: \"c0d7830e-74a5-4ea0-b396-0095a96496be\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-lxzx6" Nov 27 16:42:11 crc kubenswrapper[4707]: I1127 16:42:11.358812 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/c0d7830e-74a5-4ea0-b396-0095a96496be-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-lxzx6\" (UID: \"c0d7830e-74a5-4ea0-b396-0095a96496be\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-lxzx6" Nov 27 16:42:11 crc kubenswrapper[4707]: I1127 16:42:11.359618 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/c0d7830e-74a5-4ea0-b396-0095a96496be-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-lxzx6\" (UID: \"c0d7830e-74a5-4ea0-b396-0095a96496be\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-lxzx6" Nov 27 16:42:11 crc kubenswrapper[4707]: I1127 16:42:11.361186 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-0\" (UniqueName: 
\"kubernetes.io/secret/c0d7830e-74a5-4ea0-b396-0095a96496be-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-lxzx6\" (UID: \"c0d7830e-74a5-4ea0-b396-0095a96496be\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-lxzx6" Nov 27 16:42:11 crc kubenswrapper[4707]: I1127 16:42:11.375342 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-clj48\" (UniqueName: \"kubernetes.io/projected/c0d7830e-74a5-4ea0-b396-0095a96496be-kube-api-access-clj48\") pod \"nova-edpm-deployment-openstack-edpm-ipam-lxzx6\" (UID: \"c0d7830e-74a5-4ea0-b396-0095a96496be\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-lxzx6" Nov 27 16:42:11 crc kubenswrapper[4707]: I1127 16:42:11.447047 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-lxzx6" Nov 27 16:42:11 crc kubenswrapper[4707]: I1127 16:42:11.994926 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-lxzx6"] Nov 27 16:42:12 crc kubenswrapper[4707]: I1127 16:42:12.018653 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-lxzx6" event={"ID":"c0d7830e-74a5-4ea0-b396-0095a96496be","Type":"ContainerStarted","Data":"0bb487ea7fae4634b0964ceec7420ccaa866dff31c0706700ff43cd3c3af9aee"} Nov 27 16:42:12 crc kubenswrapper[4707]: I1127 16:42:12.305293 4707 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-5rctg" podUID="6e78461c-1a57-411d-b143-0c4160e11623" containerName="registry-server" probeResult="failure" output=< Nov 27 16:42:12 crc kubenswrapper[4707]: timeout: failed to connect service ":50051" within 1s Nov 27 16:42:12 crc kubenswrapper[4707]: > Nov 27 16:42:13 crc kubenswrapper[4707]: I1127 16:42:13.032178 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-lxzx6" event={"ID":"c0d7830e-74a5-4ea0-b396-0095a96496be","Type":"ContainerStarted","Data":"c74ec0bbebd25bf325a2526c2b1caf99029788e212e9bec85a13546872eb64fa"} Nov 27 16:42:13 crc kubenswrapper[4707]: I1127 16:42:13.060123 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-lxzx6" podStartSLOduration=1.436100371 podStartE2EDuration="2.060100054s" podCreationTimestamp="2025-11-27 16:42:11 +0000 UTC" firstStartedPulling="2025-11-27 16:42:12.000345194 +0000 UTC m=+2307.631793972" lastFinishedPulling="2025-11-27 16:42:12.624344887 +0000 UTC m=+2308.255793655" observedRunningTime="2025-11-27 16:42:13.054578777 +0000 UTC m=+2308.686027545" watchObservedRunningTime="2025-11-27 16:42:13.060100054 +0000 UTC m=+2308.691548842" Nov 27 16:42:17 crc kubenswrapper[4707]: I1127 16:42:17.196015 4707 scope.go:117] "RemoveContainer" containerID="f26ac9df9340ea2ea514252ce6b8bc1868c8afd678b5bdcc749e5a78be5c9f11" Nov 27 16:42:17 crc kubenswrapper[4707]: E1127 16:42:17.197349 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-c995m_openshift-machine-config-operator(a83beb0d-8dd1-434a-ace2-933f98e3956f)\"" pod="openshift-machine-config-operator/machine-config-daemon-c995m" podUID="a83beb0d-8dd1-434a-ace2-933f98e3956f" Nov 27 16:42:21 crc kubenswrapper[4707]: I1127 16:42:21.331173 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-5rctg" Nov 27 16:42:21 crc kubenswrapper[4707]: I1127 16:42:21.394846 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-5rctg" Nov 27 16:42:21 crc kubenswrapper[4707]: I1127 16:42:21.577845 4707 kubelet.go:2437] 
"SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-5rctg"] Nov 27 16:42:23 crc kubenswrapper[4707]: I1127 16:42:23.142175 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-5rctg" podUID="6e78461c-1a57-411d-b143-0c4160e11623" containerName="registry-server" containerID="cri-o://103b59d915d51ce9cad1944ced7aaadc121b97492f5c840f7ba349c5cf5edae8" gracePeriod=2 Nov 27 16:42:23 crc kubenswrapper[4707]: I1127 16:42:23.656607 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-5rctg" Nov 27 16:42:23 crc kubenswrapper[4707]: I1127 16:42:23.856589 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6e78461c-1a57-411d-b143-0c4160e11623-utilities\") pod \"6e78461c-1a57-411d-b143-0c4160e11623\" (UID: \"6e78461c-1a57-411d-b143-0c4160e11623\") " Nov 27 16:42:23 crc kubenswrapper[4707]: I1127 16:42:23.856939 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dsx6k\" (UniqueName: \"kubernetes.io/projected/6e78461c-1a57-411d-b143-0c4160e11623-kube-api-access-dsx6k\") pod \"6e78461c-1a57-411d-b143-0c4160e11623\" (UID: \"6e78461c-1a57-411d-b143-0c4160e11623\") " Nov 27 16:42:23 crc kubenswrapper[4707]: I1127 16:42:23.857116 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6e78461c-1a57-411d-b143-0c4160e11623-catalog-content\") pod \"6e78461c-1a57-411d-b143-0c4160e11623\" (UID: \"6e78461c-1a57-411d-b143-0c4160e11623\") " Nov 27 16:42:23 crc kubenswrapper[4707]: I1127 16:42:23.860108 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6e78461c-1a57-411d-b143-0c4160e11623-utilities" (OuterVolumeSpecName: "utilities") pod 
"6e78461c-1a57-411d-b143-0c4160e11623" (UID: "6e78461c-1a57-411d-b143-0c4160e11623"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 27 16:42:23 crc kubenswrapper[4707]: I1127 16:42:23.867150 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6e78461c-1a57-411d-b143-0c4160e11623-kube-api-access-dsx6k" (OuterVolumeSpecName: "kube-api-access-dsx6k") pod "6e78461c-1a57-411d-b143-0c4160e11623" (UID: "6e78461c-1a57-411d-b143-0c4160e11623"). InnerVolumeSpecName "kube-api-access-dsx6k". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 16:42:23 crc kubenswrapper[4707]: I1127 16:42:23.959110 4707 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6e78461c-1a57-411d-b143-0c4160e11623-utilities\") on node \"crc\" DevicePath \"\"" Nov 27 16:42:23 crc kubenswrapper[4707]: I1127 16:42:23.959144 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dsx6k\" (UniqueName: \"kubernetes.io/projected/6e78461c-1a57-411d-b143-0c4160e11623-kube-api-access-dsx6k\") on node \"crc\" DevicePath \"\"" Nov 27 16:42:23 crc kubenswrapper[4707]: I1127 16:42:23.989786 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6e78461c-1a57-411d-b143-0c4160e11623-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6e78461c-1a57-411d-b143-0c4160e11623" (UID: "6e78461c-1a57-411d-b143-0c4160e11623"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 27 16:42:24 crc kubenswrapper[4707]: I1127 16:42:24.060604 4707 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6e78461c-1a57-411d-b143-0c4160e11623-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 27 16:42:24 crc kubenswrapper[4707]: I1127 16:42:24.153113 4707 generic.go:334] "Generic (PLEG): container finished" podID="6e78461c-1a57-411d-b143-0c4160e11623" containerID="103b59d915d51ce9cad1944ced7aaadc121b97492f5c840f7ba349c5cf5edae8" exitCode=0 Nov 27 16:42:24 crc kubenswrapper[4707]: I1127 16:42:24.153154 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5rctg" event={"ID":"6e78461c-1a57-411d-b143-0c4160e11623","Type":"ContainerDied","Data":"103b59d915d51ce9cad1944ced7aaadc121b97492f5c840f7ba349c5cf5edae8"} Nov 27 16:42:24 crc kubenswrapper[4707]: I1127 16:42:24.153180 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5rctg" event={"ID":"6e78461c-1a57-411d-b143-0c4160e11623","Type":"ContainerDied","Data":"95bc4dd969d97b9a5569d563e7347c421c5a7ea654d643edc69ce865028ef765"} Nov 27 16:42:24 crc kubenswrapper[4707]: I1127 16:42:24.153208 4707 scope.go:117] "RemoveContainer" containerID="103b59d915d51ce9cad1944ced7aaadc121b97492f5c840f7ba349c5cf5edae8" Nov 27 16:42:24 crc kubenswrapper[4707]: I1127 16:42:24.153281 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-5rctg" Nov 27 16:42:24 crc kubenswrapper[4707]: I1127 16:42:24.186798 4707 scope.go:117] "RemoveContainer" containerID="5f3ebb17d4735bc05362fa8274e467dd65bd37db3e712244a69c057922ded08e" Nov 27 16:42:24 crc kubenswrapper[4707]: I1127 16:42:24.220120 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-5rctg"] Nov 27 16:42:24 crc kubenswrapper[4707]: I1127 16:42:24.230626 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-5rctg"] Nov 27 16:42:24 crc kubenswrapper[4707]: I1127 16:42:24.233634 4707 scope.go:117] "RemoveContainer" containerID="fbe3f980ff7942acb0c657a0f233ea4e331c6c3d8903e7b80637af054ea2530a" Nov 27 16:42:24 crc kubenswrapper[4707]: I1127 16:42:24.278307 4707 scope.go:117] "RemoveContainer" containerID="103b59d915d51ce9cad1944ced7aaadc121b97492f5c840f7ba349c5cf5edae8" Nov 27 16:42:24 crc kubenswrapper[4707]: E1127 16:42:24.278856 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"103b59d915d51ce9cad1944ced7aaadc121b97492f5c840f7ba349c5cf5edae8\": container with ID starting with 103b59d915d51ce9cad1944ced7aaadc121b97492f5c840f7ba349c5cf5edae8 not found: ID does not exist" containerID="103b59d915d51ce9cad1944ced7aaadc121b97492f5c840f7ba349c5cf5edae8" Nov 27 16:42:24 crc kubenswrapper[4707]: I1127 16:42:24.278909 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"103b59d915d51ce9cad1944ced7aaadc121b97492f5c840f7ba349c5cf5edae8"} err="failed to get container status \"103b59d915d51ce9cad1944ced7aaadc121b97492f5c840f7ba349c5cf5edae8\": rpc error: code = NotFound desc = could not find container \"103b59d915d51ce9cad1944ced7aaadc121b97492f5c840f7ba349c5cf5edae8\": container with ID starting with 103b59d915d51ce9cad1944ced7aaadc121b97492f5c840f7ba349c5cf5edae8 not found: ID does 
not exist" Nov 27 16:42:24 crc kubenswrapper[4707]: I1127 16:42:24.278968 4707 scope.go:117] "RemoveContainer" containerID="5f3ebb17d4735bc05362fa8274e467dd65bd37db3e712244a69c057922ded08e" Nov 27 16:42:24 crc kubenswrapper[4707]: E1127 16:42:24.279324 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5f3ebb17d4735bc05362fa8274e467dd65bd37db3e712244a69c057922ded08e\": container with ID starting with 5f3ebb17d4735bc05362fa8274e467dd65bd37db3e712244a69c057922ded08e not found: ID does not exist" containerID="5f3ebb17d4735bc05362fa8274e467dd65bd37db3e712244a69c057922ded08e" Nov 27 16:42:24 crc kubenswrapper[4707]: I1127 16:42:24.279408 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5f3ebb17d4735bc05362fa8274e467dd65bd37db3e712244a69c057922ded08e"} err="failed to get container status \"5f3ebb17d4735bc05362fa8274e467dd65bd37db3e712244a69c057922ded08e\": rpc error: code = NotFound desc = could not find container \"5f3ebb17d4735bc05362fa8274e467dd65bd37db3e712244a69c057922ded08e\": container with ID starting with 5f3ebb17d4735bc05362fa8274e467dd65bd37db3e712244a69c057922ded08e not found: ID does not exist" Nov 27 16:42:24 crc kubenswrapper[4707]: I1127 16:42:24.279435 4707 scope.go:117] "RemoveContainer" containerID="fbe3f980ff7942acb0c657a0f233ea4e331c6c3d8903e7b80637af054ea2530a" Nov 27 16:42:24 crc kubenswrapper[4707]: E1127 16:42:24.279765 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fbe3f980ff7942acb0c657a0f233ea4e331c6c3d8903e7b80637af054ea2530a\": container with ID starting with fbe3f980ff7942acb0c657a0f233ea4e331c6c3d8903e7b80637af054ea2530a not found: ID does not exist" containerID="fbe3f980ff7942acb0c657a0f233ea4e331c6c3d8903e7b80637af054ea2530a" Nov 27 16:42:24 crc kubenswrapper[4707]: I1127 16:42:24.279791 4707 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fbe3f980ff7942acb0c657a0f233ea4e331c6c3d8903e7b80637af054ea2530a"} err="failed to get container status \"fbe3f980ff7942acb0c657a0f233ea4e331c6c3d8903e7b80637af054ea2530a\": rpc error: code = NotFound desc = could not find container \"fbe3f980ff7942acb0c657a0f233ea4e331c6c3d8903e7b80637af054ea2530a\": container with ID starting with fbe3f980ff7942acb0c657a0f233ea4e331c6c3d8903e7b80637af054ea2530a not found: ID does not exist" Nov 27 16:42:25 crc kubenswrapper[4707]: I1127 16:42:25.208532 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6e78461c-1a57-411d-b143-0c4160e11623" path="/var/lib/kubelet/pods/6e78461c-1a57-411d-b143-0c4160e11623/volumes" Nov 27 16:42:30 crc kubenswrapper[4707]: I1127 16:42:30.195950 4707 scope.go:117] "RemoveContainer" containerID="f26ac9df9340ea2ea514252ce6b8bc1868c8afd678b5bdcc749e5a78be5c9f11" Nov 27 16:42:30 crc kubenswrapper[4707]: E1127 16:42:30.197006 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-c995m_openshift-machine-config-operator(a83beb0d-8dd1-434a-ace2-933f98e3956f)\"" pod="openshift-machine-config-operator/machine-config-daemon-c995m" podUID="a83beb0d-8dd1-434a-ace2-933f98e3956f" Nov 27 16:42:39 crc kubenswrapper[4707]: I1127 16:42:39.847673 4707 trace.go:236] Trace[981278095]: "Calculate volume metrics of registry-storage for pod openshift-image-registry/image-registry-66df7c8f76-flczg" (27-Nov-2025 16:42:38.544) (total time: 1302ms): Nov 27 16:42:39 crc kubenswrapper[4707]: Trace[981278095]: [1.302879206s] [1.302879206s] END Nov 27 16:42:42 crc kubenswrapper[4707]: I1127 16:42:42.196499 4707 scope.go:117] "RemoveContainer" containerID="f26ac9df9340ea2ea514252ce6b8bc1868c8afd678b5bdcc749e5a78be5c9f11" Nov 27 16:42:42 crc kubenswrapper[4707]: E1127 
16:42:42.197636 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-c995m_openshift-machine-config-operator(a83beb0d-8dd1-434a-ace2-933f98e3956f)\"" pod="openshift-machine-config-operator/machine-config-daemon-c995m" podUID="a83beb0d-8dd1-434a-ace2-933f98e3956f" Nov 27 16:42:55 crc kubenswrapper[4707]: I1127 16:42:55.200675 4707 scope.go:117] "RemoveContainer" containerID="f26ac9df9340ea2ea514252ce6b8bc1868c8afd678b5bdcc749e5a78be5c9f11" Nov 27 16:42:55 crc kubenswrapper[4707]: E1127 16:42:55.201629 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-c995m_openshift-machine-config-operator(a83beb0d-8dd1-434a-ace2-933f98e3956f)\"" pod="openshift-machine-config-operator/machine-config-daemon-c995m" podUID="a83beb0d-8dd1-434a-ace2-933f98e3956f" Nov 27 16:43:03 crc kubenswrapper[4707]: I1127 16:43:03.150275 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-95q6g"] Nov 27 16:43:03 crc kubenswrapper[4707]: E1127 16:43:03.151304 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6e78461c-1a57-411d-b143-0c4160e11623" containerName="extract-utilities" Nov 27 16:43:03 crc kubenswrapper[4707]: I1127 16:43:03.151319 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="6e78461c-1a57-411d-b143-0c4160e11623" containerName="extract-utilities" Nov 27 16:43:03 crc kubenswrapper[4707]: E1127 16:43:03.151354 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6e78461c-1a57-411d-b143-0c4160e11623" containerName="extract-content" Nov 27 16:43:03 crc kubenswrapper[4707]: I1127 16:43:03.151362 4707 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="6e78461c-1a57-411d-b143-0c4160e11623" containerName="extract-content" Nov 27 16:43:03 crc kubenswrapper[4707]: E1127 16:43:03.151413 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6e78461c-1a57-411d-b143-0c4160e11623" containerName="registry-server" Nov 27 16:43:03 crc kubenswrapper[4707]: I1127 16:43:03.151422 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="6e78461c-1a57-411d-b143-0c4160e11623" containerName="registry-server" Nov 27 16:43:03 crc kubenswrapper[4707]: I1127 16:43:03.151650 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="6e78461c-1a57-411d-b143-0c4160e11623" containerName="registry-server" Nov 27 16:43:03 crc kubenswrapper[4707]: I1127 16:43:03.154601 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-95q6g" Nov 27 16:43:03 crc kubenswrapper[4707]: I1127 16:43:03.174625 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-95q6g"] Nov 27 16:43:03 crc kubenswrapper[4707]: I1127 16:43:03.311118 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/35a1f5c3-13bd-43c3-ae8d-836d78c38936-catalog-content\") pod \"community-operators-95q6g\" (UID: \"35a1f5c3-13bd-43c3-ae8d-836d78c38936\") " pod="openshift-marketplace/community-operators-95q6g" Nov 27 16:43:03 crc kubenswrapper[4707]: I1127 16:43:03.311202 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l62nz\" (UniqueName: \"kubernetes.io/projected/35a1f5c3-13bd-43c3-ae8d-836d78c38936-kube-api-access-l62nz\") pod \"community-operators-95q6g\" (UID: \"35a1f5c3-13bd-43c3-ae8d-836d78c38936\") " pod="openshift-marketplace/community-operators-95q6g" Nov 27 16:43:03 crc kubenswrapper[4707]: I1127 16:43:03.312031 4707 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/35a1f5c3-13bd-43c3-ae8d-836d78c38936-utilities\") pod \"community-operators-95q6g\" (UID: \"35a1f5c3-13bd-43c3-ae8d-836d78c38936\") " pod="openshift-marketplace/community-operators-95q6g" Nov 27 16:43:03 crc kubenswrapper[4707]: I1127 16:43:03.414684 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/35a1f5c3-13bd-43c3-ae8d-836d78c38936-utilities\") pod \"community-operators-95q6g\" (UID: \"35a1f5c3-13bd-43c3-ae8d-836d78c38936\") " pod="openshift-marketplace/community-operators-95q6g" Nov 27 16:43:03 crc kubenswrapper[4707]: I1127 16:43:03.414852 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/35a1f5c3-13bd-43c3-ae8d-836d78c38936-catalog-content\") pod \"community-operators-95q6g\" (UID: \"35a1f5c3-13bd-43c3-ae8d-836d78c38936\") " pod="openshift-marketplace/community-operators-95q6g" Nov 27 16:43:03 crc kubenswrapper[4707]: I1127 16:43:03.414901 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l62nz\" (UniqueName: \"kubernetes.io/projected/35a1f5c3-13bd-43c3-ae8d-836d78c38936-kube-api-access-l62nz\") pod \"community-operators-95q6g\" (UID: \"35a1f5c3-13bd-43c3-ae8d-836d78c38936\") " pod="openshift-marketplace/community-operators-95q6g" Nov 27 16:43:03 crc kubenswrapper[4707]: I1127 16:43:03.415360 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/35a1f5c3-13bd-43c3-ae8d-836d78c38936-utilities\") pod \"community-operators-95q6g\" (UID: \"35a1f5c3-13bd-43c3-ae8d-836d78c38936\") " pod="openshift-marketplace/community-operators-95q6g" Nov 27 16:43:03 crc kubenswrapper[4707]: I1127 16:43:03.415524 4707 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/35a1f5c3-13bd-43c3-ae8d-836d78c38936-catalog-content\") pod \"community-operators-95q6g\" (UID: \"35a1f5c3-13bd-43c3-ae8d-836d78c38936\") " pod="openshift-marketplace/community-operators-95q6g" Nov 27 16:43:03 crc kubenswrapper[4707]: I1127 16:43:03.437514 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l62nz\" (UniqueName: \"kubernetes.io/projected/35a1f5c3-13bd-43c3-ae8d-836d78c38936-kube-api-access-l62nz\") pod \"community-operators-95q6g\" (UID: \"35a1f5c3-13bd-43c3-ae8d-836d78c38936\") " pod="openshift-marketplace/community-operators-95q6g" Nov 27 16:43:03 crc kubenswrapper[4707]: I1127 16:43:03.475610 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-95q6g" Nov 27 16:43:04 crc kubenswrapper[4707]: I1127 16:43:04.070403 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-95q6g"] Nov 27 16:43:04 crc kubenswrapper[4707]: I1127 16:43:04.597148 4707 generic.go:334] "Generic (PLEG): container finished" podID="35a1f5c3-13bd-43c3-ae8d-836d78c38936" containerID="c30b7372b25cf50e2c0e7e99415f99474aa704ccfe0a8e18c6a7cfa084d93ed2" exitCode=0 Nov 27 16:43:04 crc kubenswrapper[4707]: I1127 16:43:04.597211 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-95q6g" event={"ID":"35a1f5c3-13bd-43c3-ae8d-836d78c38936","Type":"ContainerDied","Data":"c30b7372b25cf50e2c0e7e99415f99474aa704ccfe0a8e18c6a7cfa084d93ed2"} Nov 27 16:43:04 crc kubenswrapper[4707]: I1127 16:43:04.597547 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-95q6g" event={"ID":"35a1f5c3-13bd-43c3-ae8d-836d78c38936","Type":"ContainerStarted","Data":"44878a34968465fecb8d0a4298771b75df2e358e768cc0d466185d512c0e7cfd"} Nov 27 16:43:05 crc kubenswrapper[4707]: I1127 
16:43:05.607888 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-95q6g" event={"ID":"35a1f5c3-13bd-43c3-ae8d-836d78c38936","Type":"ContainerStarted","Data":"42abe10cab32d6db651d0929e8b060492b3e87ca9a6eae25e6318f600cf2f62e"} Nov 27 16:43:08 crc kubenswrapper[4707]: I1127 16:43:08.194709 4707 scope.go:117] "RemoveContainer" containerID="f26ac9df9340ea2ea514252ce6b8bc1868c8afd678b5bdcc749e5a78be5c9f11" Nov 27 16:43:08 crc kubenswrapper[4707]: E1127 16:43:08.195572 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-c995m_openshift-machine-config-operator(a83beb0d-8dd1-434a-ace2-933f98e3956f)\"" pod="openshift-machine-config-operator/machine-config-daemon-c995m" podUID="a83beb0d-8dd1-434a-ace2-933f98e3956f" Nov 27 16:43:08 crc kubenswrapper[4707]: I1127 16:43:08.633628 4707 generic.go:334] "Generic (PLEG): container finished" podID="35a1f5c3-13bd-43c3-ae8d-836d78c38936" containerID="42abe10cab32d6db651d0929e8b060492b3e87ca9a6eae25e6318f600cf2f62e" exitCode=0 Nov 27 16:43:08 crc kubenswrapper[4707]: I1127 16:43:08.633671 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-95q6g" event={"ID":"35a1f5c3-13bd-43c3-ae8d-836d78c38936","Type":"ContainerDied","Data":"42abe10cab32d6db651d0929e8b060492b3e87ca9a6eae25e6318f600cf2f62e"} Nov 27 16:43:09 crc kubenswrapper[4707]: I1127 16:43:09.647505 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-95q6g" event={"ID":"35a1f5c3-13bd-43c3-ae8d-836d78c38936","Type":"ContainerStarted","Data":"3d15ce9276e66c4e01af4aee6f64ea48f7683a73a980686c59b2f2e24a7989ca"} Nov 27 16:43:09 crc kubenswrapper[4707]: I1127 16:43:09.673585 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-marketplace/community-operators-95q6g" podStartSLOduration=1.943964739 podStartE2EDuration="6.673560115s" podCreationTimestamp="2025-11-27 16:43:03 +0000 UTC" firstStartedPulling="2025-11-27 16:43:04.606200517 +0000 UTC m=+2360.237649295" lastFinishedPulling="2025-11-27 16:43:09.335795863 +0000 UTC m=+2364.967244671" observedRunningTime="2025-11-27 16:43:09.666587674 +0000 UTC m=+2365.298036452" watchObservedRunningTime="2025-11-27 16:43:09.673560115 +0000 UTC m=+2365.305008913" Nov 27 16:43:13 crc kubenswrapper[4707]: I1127 16:43:13.476021 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-95q6g" Nov 27 16:43:13 crc kubenswrapper[4707]: I1127 16:43:13.476803 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-95q6g" Nov 27 16:43:13 crc kubenswrapper[4707]: I1127 16:43:13.543988 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-95q6g" Nov 27 16:43:22 crc kubenswrapper[4707]: I1127 16:43:22.196121 4707 scope.go:117] "RemoveContainer" containerID="f26ac9df9340ea2ea514252ce6b8bc1868c8afd678b5bdcc749e5a78be5c9f11" Nov 27 16:43:22 crc kubenswrapper[4707]: E1127 16:43:22.197225 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-c995m_openshift-machine-config-operator(a83beb0d-8dd1-434a-ace2-933f98e3956f)\"" pod="openshift-machine-config-operator/machine-config-daemon-c995m" podUID="a83beb0d-8dd1-434a-ace2-933f98e3956f" Nov 27 16:43:23 crc kubenswrapper[4707]: I1127 16:43:23.543117 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-95q6g" Nov 27 16:43:23 crc kubenswrapper[4707]: I1127 16:43:23.597673 4707 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-95q6g"] Nov 27 16:43:23 crc kubenswrapper[4707]: I1127 16:43:23.773550 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-95q6g" podUID="35a1f5c3-13bd-43c3-ae8d-836d78c38936" containerName="registry-server" containerID="cri-o://3d15ce9276e66c4e01af4aee6f64ea48f7683a73a980686c59b2f2e24a7989ca" gracePeriod=2 Nov 27 16:43:24 crc kubenswrapper[4707]: I1127 16:43:24.236114 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-95q6g" Nov 27 16:43:24 crc kubenswrapper[4707]: I1127 16:43:24.343869 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/35a1f5c3-13bd-43c3-ae8d-836d78c38936-utilities\") pod \"35a1f5c3-13bd-43c3-ae8d-836d78c38936\" (UID: \"35a1f5c3-13bd-43c3-ae8d-836d78c38936\") " Nov 27 16:43:24 crc kubenswrapper[4707]: I1127 16:43:24.344055 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/35a1f5c3-13bd-43c3-ae8d-836d78c38936-catalog-content\") pod \"35a1f5c3-13bd-43c3-ae8d-836d78c38936\" (UID: \"35a1f5c3-13bd-43c3-ae8d-836d78c38936\") " Nov 27 16:43:24 crc kubenswrapper[4707]: I1127 16:43:24.344129 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l62nz\" (UniqueName: \"kubernetes.io/projected/35a1f5c3-13bd-43c3-ae8d-836d78c38936-kube-api-access-l62nz\") pod \"35a1f5c3-13bd-43c3-ae8d-836d78c38936\" (UID: \"35a1f5c3-13bd-43c3-ae8d-836d78c38936\") " Nov 27 16:43:24 crc kubenswrapper[4707]: I1127 16:43:24.344933 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/35a1f5c3-13bd-43c3-ae8d-836d78c38936-utilities" (OuterVolumeSpecName: "utilities") pod 
"35a1f5c3-13bd-43c3-ae8d-836d78c38936" (UID: "35a1f5c3-13bd-43c3-ae8d-836d78c38936"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 27 16:43:24 crc kubenswrapper[4707]: I1127 16:43:24.349687 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/35a1f5c3-13bd-43c3-ae8d-836d78c38936-kube-api-access-l62nz" (OuterVolumeSpecName: "kube-api-access-l62nz") pod "35a1f5c3-13bd-43c3-ae8d-836d78c38936" (UID: "35a1f5c3-13bd-43c3-ae8d-836d78c38936"). InnerVolumeSpecName "kube-api-access-l62nz". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 16:43:24 crc kubenswrapper[4707]: I1127 16:43:24.393310 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/35a1f5c3-13bd-43c3-ae8d-836d78c38936-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "35a1f5c3-13bd-43c3-ae8d-836d78c38936" (UID: "35a1f5c3-13bd-43c3-ae8d-836d78c38936"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 27 16:43:24 crc kubenswrapper[4707]: I1127 16:43:24.446123 4707 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/35a1f5c3-13bd-43c3-ae8d-836d78c38936-utilities\") on node \"crc\" DevicePath \"\"" Nov 27 16:43:24 crc kubenswrapper[4707]: I1127 16:43:24.446158 4707 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/35a1f5c3-13bd-43c3-ae8d-836d78c38936-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 27 16:43:24 crc kubenswrapper[4707]: I1127 16:43:24.446174 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l62nz\" (UniqueName: \"kubernetes.io/projected/35a1f5c3-13bd-43c3-ae8d-836d78c38936-kube-api-access-l62nz\") on node \"crc\" DevicePath \"\"" Nov 27 16:43:24 crc kubenswrapper[4707]: I1127 16:43:24.783893 4707 generic.go:334] "Generic (PLEG): container finished" podID="35a1f5c3-13bd-43c3-ae8d-836d78c38936" containerID="3d15ce9276e66c4e01af4aee6f64ea48f7683a73a980686c59b2f2e24a7989ca" exitCode=0 Nov 27 16:43:24 crc kubenswrapper[4707]: I1127 16:43:24.783931 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-95q6g" event={"ID":"35a1f5c3-13bd-43c3-ae8d-836d78c38936","Type":"ContainerDied","Data":"3d15ce9276e66c4e01af4aee6f64ea48f7683a73a980686c59b2f2e24a7989ca"} Nov 27 16:43:24 crc kubenswrapper[4707]: I1127 16:43:24.783958 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-95q6g" event={"ID":"35a1f5c3-13bd-43c3-ae8d-836d78c38936","Type":"ContainerDied","Data":"44878a34968465fecb8d0a4298771b75df2e358e768cc0d466185d512c0e7cfd"} Nov 27 16:43:24 crc kubenswrapper[4707]: I1127 16:43:24.783976 4707 scope.go:117] "RemoveContainer" containerID="3d15ce9276e66c4e01af4aee6f64ea48f7683a73a980686c59b2f2e24a7989ca" Nov 27 16:43:24 crc kubenswrapper[4707]: I1127 
16:43:24.783985 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-95q6g" Nov 27 16:43:24 crc kubenswrapper[4707]: I1127 16:43:24.814539 4707 scope.go:117] "RemoveContainer" containerID="42abe10cab32d6db651d0929e8b060492b3e87ca9a6eae25e6318f600cf2f62e" Nov 27 16:43:24 crc kubenswrapper[4707]: I1127 16:43:24.830203 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-95q6g"] Nov 27 16:43:24 crc kubenswrapper[4707]: I1127 16:43:24.838019 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-95q6g"] Nov 27 16:43:24 crc kubenswrapper[4707]: I1127 16:43:24.854668 4707 scope.go:117] "RemoveContainer" containerID="c30b7372b25cf50e2c0e7e99415f99474aa704ccfe0a8e18c6a7cfa084d93ed2" Nov 27 16:43:24 crc kubenswrapper[4707]: I1127 16:43:24.887501 4707 scope.go:117] "RemoveContainer" containerID="3d15ce9276e66c4e01af4aee6f64ea48f7683a73a980686c59b2f2e24a7989ca" Nov 27 16:43:24 crc kubenswrapper[4707]: E1127 16:43:24.888209 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3d15ce9276e66c4e01af4aee6f64ea48f7683a73a980686c59b2f2e24a7989ca\": container with ID starting with 3d15ce9276e66c4e01af4aee6f64ea48f7683a73a980686c59b2f2e24a7989ca not found: ID does not exist" containerID="3d15ce9276e66c4e01af4aee6f64ea48f7683a73a980686c59b2f2e24a7989ca" Nov 27 16:43:24 crc kubenswrapper[4707]: I1127 16:43:24.888252 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3d15ce9276e66c4e01af4aee6f64ea48f7683a73a980686c59b2f2e24a7989ca"} err="failed to get container status \"3d15ce9276e66c4e01af4aee6f64ea48f7683a73a980686c59b2f2e24a7989ca\": rpc error: code = NotFound desc = could not find container \"3d15ce9276e66c4e01af4aee6f64ea48f7683a73a980686c59b2f2e24a7989ca\": container with ID starting with 
3d15ce9276e66c4e01af4aee6f64ea48f7683a73a980686c59b2f2e24a7989ca not found: ID does not exist" Nov 27 16:43:24 crc kubenswrapper[4707]: I1127 16:43:24.888280 4707 scope.go:117] "RemoveContainer" containerID="42abe10cab32d6db651d0929e8b060492b3e87ca9a6eae25e6318f600cf2f62e" Nov 27 16:43:24 crc kubenswrapper[4707]: E1127 16:43:24.888683 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"42abe10cab32d6db651d0929e8b060492b3e87ca9a6eae25e6318f600cf2f62e\": container with ID starting with 42abe10cab32d6db651d0929e8b060492b3e87ca9a6eae25e6318f600cf2f62e not found: ID does not exist" containerID="42abe10cab32d6db651d0929e8b060492b3e87ca9a6eae25e6318f600cf2f62e" Nov 27 16:43:24 crc kubenswrapper[4707]: I1127 16:43:24.888704 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"42abe10cab32d6db651d0929e8b060492b3e87ca9a6eae25e6318f600cf2f62e"} err="failed to get container status \"42abe10cab32d6db651d0929e8b060492b3e87ca9a6eae25e6318f600cf2f62e\": rpc error: code = NotFound desc = could not find container \"42abe10cab32d6db651d0929e8b060492b3e87ca9a6eae25e6318f600cf2f62e\": container with ID starting with 42abe10cab32d6db651d0929e8b060492b3e87ca9a6eae25e6318f600cf2f62e not found: ID does not exist" Nov 27 16:43:24 crc kubenswrapper[4707]: I1127 16:43:24.888717 4707 scope.go:117] "RemoveContainer" containerID="c30b7372b25cf50e2c0e7e99415f99474aa704ccfe0a8e18c6a7cfa084d93ed2" Nov 27 16:43:24 crc kubenswrapper[4707]: E1127 16:43:24.889021 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c30b7372b25cf50e2c0e7e99415f99474aa704ccfe0a8e18c6a7cfa084d93ed2\": container with ID starting with c30b7372b25cf50e2c0e7e99415f99474aa704ccfe0a8e18c6a7cfa084d93ed2 not found: ID does not exist" containerID="c30b7372b25cf50e2c0e7e99415f99474aa704ccfe0a8e18c6a7cfa084d93ed2" Nov 27 16:43:24 crc 
kubenswrapper[4707]: I1127 16:43:24.889051 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c30b7372b25cf50e2c0e7e99415f99474aa704ccfe0a8e18c6a7cfa084d93ed2"} err="failed to get container status \"c30b7372b25cf50e2c0e7e99415f99474aa704ccfe0a8e18c6a7cfa084d93ed2\": rpc error: code = NotFound desc = could not find container \"c30b7372b25cf50e2c0e7e99415f99474aa704ccfe0a8e18c6a7cfa084d93ed2\": container with ID starting with c30b7372b25cf50e2c0e7e99415f99474aa704ccfe0a8e18c6a7cfa084d93ed2 not found: ID does not exist" Nov 27 16:43:25 crc kubenswrapper[4707]: I1127 16:43:25.208127 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="35a1f5c3-13bd-43c3-ae8d-836d78c38936" path="/var/lib/kubelet/pods/35a1f5c3-13bd-43c3-ae8d-836d78c38936/volumes" Nov 27 16:43:34 crc kubenswrapper[4707]: I1127 16:43:34.195707 4707 scope.go:117] "RemoveContainer" containerID="f26ac9df9340ea2ea514252ce6b8bc1868c8afd678b5bdcc749e5a78be5c9f11" Nov 27 16:43:34 crc kubenswrapper[4707]: E1127 16:43:34.196625 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-c995m_openshift-machine-config-operator(a83beb0d-8dd1-434a-ace2-933f98e3956f)\"" pod="openshift-machine-config-operator/machine-config-daemon-c995m" podUID="a83beb0d-8dd1-434a-ace2-933f98e3956f" Nov 27 16:43:49 crc kubenswrapper[4707]: I1127 16:43:49.195735 4707 scope.go:117] "RemoveContainer" containerID="f26ac9df9340ea2ea514252ce6b8bc1868c8afd678b5bdcc749e5a78be5c9f11" Nov 27 16:43:49 crc kubenswrapper[4707]: E1127 16:43:49.197743 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-c995m_openshift-machine-config-operator(a83beb0d-8dd1-434a-ace2-933f98e3956f)\"" pod="openshift-machine-config-operator/machine-config-daemon-c995m" podUID="a83beb0d-8dd1-434a-ace2-933f98e3956f" Nov 27 16:44:01 crc kubenswrapper[4707]: I1127 16:44:01.195138 4707 scope.go:117] "RemoveContainer" containerID="f26ac9df9340ea2ea514252ce6b8bc1868c8afd678b5bdcc749e5a78be5c9f11" Nov 27 16:44:01 crc kubenswrapper[4707]: E1127 16:44:01.195944 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-c995m_openshift-machine-config-operator(a83beb0d-8dd1-434a-ace2-933f98e3956f)\"" pod="openshift-machine-config-operator/machine-config-daemon-c995m" podUID="a83beb0d-8dd1-434a-ace2-933f98e3956f" Nov 27 16:44:12 crc kubenswrapper[4707]: I1127 16:44:12.195536 4707 scope.go:117] "RemoveContainer" containerID="f26ac9df9340ea2ea514252ce6b8bc1868c8afd678b5bdcc749e5a78be5c9f11" Nov 27 16:44:12 crc kubenswrapper[4707]: E1127 16:44:12.196490 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-c995m_openshift-machine-config-operator(a83beb0d-8dd1-434a-ace2-933f98e3956f)\"" pod="openshift-machine-config-operator/machine-config-daemon-c995m" podUID="a83beb0d-8dd1-434a-ace2-933f98e3956f" Nov 27 16:44:25 crc kubenswrapper[4707]: I1127 16:44:25.205613 4707 scope.go:117] "RemoveContainer" containerID="f26ac9df9340ea2ea514252ce6b8bc1868c8afd678b5bdcc749e5a78be5c9f11" Nov 27 16:44:25 crc kubenswrapper[4707]: E1127 16:44:25.206863 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-c995m_openshift-machine-config-operator(a83beb0d-8dd1-434a-ace2-933f98e3956f)\"" pod="openshift-machine-config-operator/machine-config-daemon-c995m" podUID="a83beb0d-8dd1-434a-ace2-933f98e3956f" Nov 27 16:44:36 crc kubenswrapper[4707]: I1127 16:44:36.195823 4707 scope.go:117] "RemoveContainer" containerID="f26ac9df9340ea2ea514252ce6b8bc1868c8afd678b5bdcc749e5a78be5c9f11" Nov 27 16:44:36 crc kubenswrapper[4707]: E1127 16:44:36.196806 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-c995m_openshift-machine-config-operator(a83beb0d-8dd1-434a-ace2-933f98e3956f)\"" pod="openshift-machine-config-operator/machine-config-daemon-c995m" podUID="a83beb0d-8dd1-434a-ace2-933f98e3956f" Nov 27 16:44:50 crc kubenswrapper[4707]: I1127 16:44:50.195925 4707 scope.go:117] "RemoveContainer" containerID="f26ac9df9340ea2ea514252ce6b8bc1868c8afd678b5bdcc749e5a78be5c9f11" Nov 27 16:44:50 crc kubenswrapper[4707]: E1127 16:44:50.197271 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-c995m_openshift-machine-config-operator(a83beb0d-8dd1-434a-ace2-933f98e3956f)\"" pod="openshift-machine-config-operator/machine-config-daemon-c995m" podUID="a83beb0d-8dd1-434a-ace2-933f98e3956f" Nov 27 16:45:00 crc kubenswrapper[4707]: I1127 16:45:00.176204 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29404365-6mrrb"] Nov 27 16:45:00 crc kubenswrapper[4707]: E1127 16:45:00.177239 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="35a1f5c3-13bd-43c3-ae8d-836d78c38936" containerName="extract-utilities" Nov 
27 16:45:00 crc kubenswrapper[4707]: I1127 16:45:00.177255 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="35a1f5c3-13bd-43c3-ae8d-836d78c38936" containerName="extract-utilities" Nov 27 16:45:00 crc kubenswrapper[4707]: E1127 16:45:00.177286 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="35a1f5c3-13bd-43c3-ae8d-836d78c38936" containerName="registry-server" Nov 27 16:45:00 crc kubenswrapper[4707]: I1127 16:45:00.177296 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="35a1f5c3-13bd-43c3-ae8d-836d78c38936" containerName="registry-server" Nov 27 16:45:00 crc kubenswrapper[4707]: E1127 16:45:00.177323 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="35a1f5c3-13bd-43c3-ae8d-836d78c38936" containerName="extract-content" Nov 27 16:45:00 crc kubenswrapper[4707]: I1127 16:45:00.177332 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="35a1f5c3-13bd-43c3-ae8d-836d78c38936" containerName="extract-content" Nov 27 16:45:00 crc kubenswrapper[4707]: I1127 16:45:00.177617 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="35a1f5c3-13bd-43c3-ae8d-836d78c38936" containerName="registry-server" Nov 27 16:45:00 crc kubenswrapper[4707]: I1127 16:45:00.178436 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29404365-6mrrb" Nov 27 16:45:00 crc kubenswrapper[4707]: I1127 16:45:00.182120 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Nov 27 16:45:00 crc kubenswrapper[4707]: I1127 16:45:00.182688 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Nov 27 16:45:00 crc kubenswrapper[4707]: I1127 16:45:00.186403 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29404365-6mrrb"] Nov 27 16:45:00 crc kubenswrapper[4707]: I1127 16:45:00.375831 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a9bda09e-8eef-4e16-841f-ec75f01a33b8-config-volume\") pod \"collect-profiles-29404365-6mrrb\" (UID: \"a9bda09e-8eef-4e16-841f-ec75f01a33b8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29404365-6mrrb" Nov 27 16:45:00 crc kubenswrapper[4707]: I1127 16:45:00.375888 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mjqq4\" (UniqueName: \"kubernetes.io/projected/a9bda09e-8eef-4e16-841f-ec75f01a33b8-kube-api-access-mjqq4\") pod \"collect-profiles-29404365-6mrrb\" (UID: \"a9bda09e-8eef-4e16-841f-ec75f01a33b8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29404365-6mrrb" Nov 27 16:45:00 crc kubenswrapper[4707]: I1127 16:45:00.376402 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a9bda09e-8eef-4e16-841f-ec75f01a33b8-secret-volume\") pod \"collect-profiles-29404365-6mrrb\" (UID: \"a9bda09e-8eef-4e16-841f-ec75f01a33b8\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29404365-6mrrb" Nov 27 16:45:00 crc kubenswrapper[4707]: I1127 16:45:00.478337 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a9bda09e-8eef-4e16-841f-ec75f01a33b8-secret-volume\") pod \"collect-profiles-29404365-6mrrb\" (UID: \"a9bda09e-8eef-4e16-841f-ec75f01a33b8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29404365-6mrrb" Nov 27 16:45:00 crc kubenswrapper[4707]: I1127 16:45:00.478494 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a9bda09e-8eef-4e16-841f-ec75f01a33b8-config-volume\") pod \"collect-profiles-29404365-6mrrb\" (UID: \"a9bda09e-8eef-4e16-841f-ec75f01a33b8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29404365-6mrrb" Nov 27 16:45:00 crc kubenswrapper[4707]: I1127 16:45:00.478523 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mjqq4\" (UniqueName: \"kubernetes.io/projected/a9bda09e-8eef-4e16-841f-ec75f01a33b8-kube-api-access-mjqq4\") pod \"collect-profiles-29404365-6mrrb\" (UID: \"a9bda09e-8eef-4e16-841f-ec75f01a33b8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29404365-6mrrb" Nov 27 16:45:00 crc kubenswrapper[4707]: I1127 16:45:00.479885 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a9bda09e-8eef-4e16-841f-ec75f01a33b8-config-volume\") pod \"collect-profiles-29404365-6mrrb\" (UID: \"a9bda09e-8eef-4e16-841f-ec75f01a33b8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29404365-6mrrb" Nov 27 16:45:00 crc kubenswrapper[4707]: I1127 16:45:00.490054 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/a9bda09e-8eef-4e16-841f-ec75f01a33b8-secret-volume\") pod \"collect-profiles-29404365-6mrrb\" (UID: \"a9bda09e-8eef-4e16-841f-ec75f01a33b8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29404365-6mrrb" Nov 27 16:45:00 crc kubenswrapper[4707]: I1127 16:45:00.494785 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mjqq4\" (UniqueName: \"kubernetes.io/projected/a9bda09e-8eef-4e16-841f-ec75f01a33b8-kube-api-access-mjqq4\") pod \"collect-profiles-29404365-6mrrb\" (UID: \"a9bda09e-8eef-4e16-841f-ec75f01a33b8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29404365-6mrrb" Nov 27 16:45:00 crc kubenswrapper[4707]: I1127 16:45:00.531715 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29404365-6mrrb" Nov 27 16:45:01 crc kubenswrapper[4707]: I1127 16:45:01.028297 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29404365-6mrrb"] Nov 27 16:45:01 crc kubenswrapper[4707]: I1127 16:45:01.812234 4707 generic.go:334] "Generic (PLEG): container finished" podID="a9bda09e-8eef-4e16-841f-ec75f01a33b8" containerID="47c1d02b5d5644cc95c6c2f32d2e9a390b5048baa7d2a9e5c6eda42de2676728" exitCode=0 Nov 27 16:45:01 crc kubenswrapper[4707]: I1127 16:45:01.812283 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29404365-6mrrb" event={"ID":"a9bda09e-8eef-4e16-841f-ec75f01a33b8","Type":"ContainerDied","Data":"47c1d02b5d5644cc95c6c2f32d2e9a390b5048baa7d2a9e5c6eda42de2676728"} Nov 27 16:45:01 crc kubenswrapper[4707]: I1127 16:45:01.812585 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29404365-6mrrb" 
event={"ID":"a9bda09e-8eef-4e16-841f-ec75f01a33b8","Type":"ContainerStarted","Data":"dafd659cc9883b8197b9780bbb0503e0c3b7d5c8c6fcc3cb755f8d9a4489bcdc"} Nov 27 16:45:03 crc kubenswrapper[4707]: I1127 16:45:03.150966 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29404365-6mrrb" Nov 27 16:45:03 crc kubenswrapper[4707]: I1127 16:45:03.196048 4707 scope.go:117] "RemoveContainer" containerID="f26ac9df9340ea2ea514252ce6b8bc1868c8afd678b5bdcc749e5a78be5c9f11" Nov 27 16:45:03 crc kubenswrapper[4707]: E1127 16:45:03.196588 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-c995m_openshift-machine-config-operator(a83beb0d-8dd1-434a-ace2-933f98e3956f)\"" pod="openshift-machine-config-operator/machine-config-daemon-c995m" podUID="a83beb0d-8dd1-434a-ace2-933f98e3956f" Nov 27 16:45:03 crc kubenswrapper[4707]: I1127 16:45:03.341076 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mjqq4\" (UniqueName: \"kubernetes.io/projected/a9bda09e-8eef-4e16-841f-ec75f01a33b8-kube-api-access-mjqq4\") pod \"a9bda09e-8eef-4e16-841f-ec75f01a33b8\" (UID: \"a9bda09e-8eef-4e16-841f-ec75f01a33b8\") " Nov 27 16:45:03 crc kubenswrapper[4707]: I1127 16:45:03.341234 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a9bda09e-8eef-4e16-841f-ec75f01a33b8-config-volume\") pod \"a9bda09e-8eef-4e16-841f-ec75f01a33b8\" (UID: \"a9bda09e-8eef-4e16-841f-ec75f01a33b8\") " Nov 27 16:45:03 crc kubenswrapper[4707]: I1127 16:45:03.341280 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/a9bda09e-8eef-4e16-841f-ec75f01a33b8-secret-volume\") pod \"a9bda09e-8eef-4e16-841f-ec75f01a33b8\" (UID: \"a9bda09e-8eef-4e16-841f-ec75f01a33b8\") " Nov 27 16:45:03 crc kubenswrapper[4707]: I1127 16:45:03.342274 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a9bda09e-8eef-4e16-841f-ec75f01a33b8-config-volume" (OuterVolumeSpecName: "config-volume") pod "a9bda09e-8eef-4e16-841f-ec75f01a33b8" (UID: "a9bda09e-8eef-4e16-841f-ec75f01a33b8"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 16:45:03 crc kubenswrapper[4707]: I1127 16:45:03.348875 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a9bda09e-8eef-4e16-841f-ec75f01a33b8-kube-api-access-mjqq4" (OuterVolumeSpecName: "kube-api-access-mjqq4") pod "a9bda09e-8eef-4e16-841f-ec75f01a33b8" (UID: "a9bda09e-8eef-4e16-841f-ec75f01a33b8"). InnerVolumeSpecName "kube-api-access-mjqq4". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 16:45:03 crc kubenswrapper[4707]: I1127 16:45:03.349736 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a9bda09e-8eef-4e16-841f-ec75f01a33b8-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "a9bda09e-8eef-4e16-841f-ec75f01a33b8" (UID: "a9bda09e-8eef-4e16-841f-ec75f01a33b8"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 16:45:03 crc kubenswrapper[4707]: I1127 16:45:03.443223 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mjqq4\" (UniqueName: \"kubernetes.io/projected/a9bda09e-8eef-4e16-841f-ec75f01a33b8-kube-api-access-mjqq4\") on node \"crc\" DevicePath \"\"" Nov 27 16:45:03 crc kubenswrapper[4707]: I1127 16:45:03.443254 4707 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a9bda09e-8eef-4e16-841f-ec75f01a33b8-config-volume\") on node \"crc\" DevicePath \"\"" Nov 27 16:45:03 crc kubenswrapper[4707]: I1127 16:45:03.443263 4707 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a9bda09e-8eef-4e16-841f-ec75f01a33b8-secret-volume\") on node \"crc\" DevicePath \"\"" Nov 27 16:45:03 crc kubenswrapper[4707]: I1127 16:45:03.835768 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29404365-6mrrb" event={"ID":"a9bda09e-8eef-4e16-841f-ec75f01a33b8","Type":"ContainerDied","Data":"dafd659cc9883b8197b9780bbb0503e0c3b7d5c8c6fcc3cb755f8d9a4489bcdc"} Nov 27 16:45:03 crc kubenswrapper[4707]: I1127 16:45:03.835821 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29404365-6mrrb" Nov 27 16:45:03 crc kubenswrapper[4707]: I1127 16:45:03.835829 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dafd659cc9883b8197b9780bbb0503e0c3b7d5c8c6fcc3cb755f8d9a4489bcdc" Nov 27 16:45:04 crc kubenswrapper[4707]: I1127 16:45:04.263123 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29404320-sxj55"] Nov 27 16:45:04 crc kubenswrapper[4707]: I1127 16:45:04.272613 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29404320-sxj55"] Nov 27 16:45:05 crc kubenswrapper[4707]: I1127 16:45:05.235819 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f55dc6de-bb5d-4221-a670-4b65c3992031" path="/var/lib/kubelet/pods/f55dc6de-bb5d-4221-a670-4b65c3992031/volumes" Nov 27 16:45:14 crc kubenswrapper[4707]: I1127 16:45:14.046057 4707 scope.go:117] "RemoveContainer" containerID="6f9e1859fb3d4cc72392a9151b239b734bac5c17488cfb399958b7f1f94a7520" Nov 27 16:45:16 crc kubenswrapper[4707]: I1127 16:45:16.195506 4707 scope.go:117] "RemoveContainer" containerID="f26ac9df9340ea2ea514252ce6b8bc1868c8afd678b5bdcc749e5a78be5c9f11" Nov 27 16:45:16 crc kubenswrapper[4707]: E1127 16:45:16.196228 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-c995m_openshift-machine-config-operator(a83beb0d-8dd1-434a-ace2-933f98e3956f)\"" pod="openshift-machine-config-operator/machine-config-daemon-c995m" podUID="a83beb0d-8dd1-434a-ace2-933f98e3956f" Nov 27 16:45:18 crc kubenswrapper[4707]: I1127 16:45:18.001721 4707 generic.go:334] "Generic (PLEG): container finished" podID="c0d7830e-74a5-4ea0-b396-0095a96496be" 
containerID="c74ec0bbebd25bf325a2526c2b1caf99029788e212e9bec85a13546872eb64fa" exitCode=0 Nov 27 16:45:18 crc kubenswrapper[4707]: I1127 16:45:18.002409 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-lxzx6" event={"ID":"c0d7830e-74a5-4ea0-b396-0095a96496be","Type":"ContainerDied","Data":"c74ec0bbebd25bf325a2526c2b1caf99029788e212e9bec85a13546872eb64fa"} Nov 27 16:45:19 crc kubenswrapper[4707]: I1127 16:45:19.535804 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-lxzx6" Nov 27 16:45:19 crc kubenswrapper[4707]: I1127 16:45:19.720280 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/c0d7830e-74a5-4ea0-b396-0095a96496be-nova-migration-ssh-key-1\") pod \"c0d7830e-74a5-4ea0-b396-0095a96496be\" (UID: \"c0d7830e-74a5-4ea0-b396-0095a96496be\") " Nov 27 16:45:19 crc kubenswrapper[4707]: I1127 16:45:19.720366 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c0d7830e-74a5-4ea0-b396-0095a96496be-nova-combined-ca-bundle\") pod \"c0d7830e-74a5-4ea0-b396-0095a96496be\" (UID: \"c0d7830e-74a5-4ea0-b396-0095a96496be\") " Nov 27 16:45:19 crc kubenswrapper[4707]: I1127 16:45:19.720410 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-clj48\" (UniqueName: \"kubernetes.io/projected/c0d7830e-74a5-4ea0-b396-0095a96496be-kube-api-access-clj48\") pod \"c0d7830e-74a5-4ea0-b396-0095a96496be\" (UID: \"c0d7830e-74a5-4ea0-b396-0095a96496be\") " Nov 27 16:45:19 crc kubenswrapper[4707]: I1127 16:45:19.720437 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: 
\"kubernetes.io/secret/c0d7830e-74a5-4ea0-b396-0095a96496be-nova-migration-ssh-key-0\") pod \"c0d7830e-74a5-4ea0-b396-0095a96496be\" (UID: \"c0d7830e-74a5-4ea0-b396-0095a96496be\") " Nov 27 16:45:19 crc kubenswrapper[4707]: I1127 16:45:19.720466 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/c0d7830e-74a5-4ea0-b396-0095a96496be-nova-extra-config-0\") pod \"c0d7830e-74a5-4ea0-b396-0095a96496be\" (UID: \"c0d7830e-74a5-4ea0-b396-0095a96496be\") " Nov 27 16:45:19 crc kubenswrapper[4707]: I1127 16:45:19.720627 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/c0d7830e-74a5-4ea0-b396-0095a96496be-nova-cell1-compute-config-1\") pod \"c0d7830e-74a5-4ea0-b396-0095a96496be\" (UID: \"c0d7830e-74a5-4ea0-b396-0095a96496be\") " Nov 27 16:45:19 crc kubenswrapper[4707]: I1127 16:45:19.720651 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c0d7830e-74a5-4ea0-b396-0095a96496be-inventory\") pod \"c0d7830e-74a5-4ea0-b396-0095a96496be\" (UID: \"c0d7830e-74a5-4ea0-b396-0095a96496be\") " Nov 27 16:45:19 crc kubenswrapper[4707]: I1127 16:45:19.720678 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c0d7830e-74a5-4ea0-b396-0095a96496be-ssh-key\") pod \"c0d7830e-74a5-4ea0-b396-0095a96496be\" (UID: \"c0d7830e-74a5-4ea0-b396-0095a96496be\") " Nov 27 16:45:19 crc kubenswrapper[4707]: I1127 16:45:19.720694 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/c0d7830e-74a5-4ea0-b396-0095a96496be-nova-cell1-compute-config-0\") pod \"c0d7830e-74a5-4ea0-b396-0095a96496be\" (UID: \"c0d7830e-74a5-4ea0-b396-0095a96496be\") " Nov 27 16:45:19 crc 
kubenswrapper[4707]: I1127 16:45:19.726539 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c0d7830e-74a5-4ea0-b396-0095a96496be-kube-api-access-clj48" (OuterVolumeSpecName: "kube-api-access-clj48") pod "c0d7830e-74a5-4ea0-b396-0095a96496be" (UID: "c0d7830e-74a5-4ea0-b396-0095a96496be"). InnerVolumeSpecName "kube-api-access-clj48". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 16:45:19 crc kubenswrapper[4707]: I1127 16:45:19.745888 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c0d7830e-74a5-4ea0-b396-0095a96496be-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "c0d7830e-74a5-4ea0-b396-0095a96496be" (UID: "c0d7830e-74a5-4ea0-b396-0095a96496be"). InnerVolumeSpecName "nova-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 16:45:19 crc kubenswrapper[4707]: I1127 16:45:19.749514 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c0d7830e-74a5-4ea0-b396-0095a96496be-nova-migration-ssh-key-1" (OuterVolumeSpecName: "nova-migration-ssh-key-1") pod "c0d7830e-74a5-4ea0-b396-0095a96496be" (UID: "c0d7830e-74a5-4ea0-b396-0095a96496be"). InnerVolumeSpecName "nova-migration-ssh-key-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 16:45:19 crc kubenswrapper[4707]: I1127 16:45:19.751963 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c0d7830e-74a5-4ea0-b396-0095a96496be-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "c0d7830e-74a5-4ea0-b396-0095a96496be" (UID: "c0d7830e-74a5-4ea0-b396-0095a96496be"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 16:45:19 crc kubenswrapper[4707]: I1127 16:45:19.755051 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c0d7830e-74a5-4ea0-b396-0095a96496be-nova-cell1-compute-config-1" (OuterVolumeSpecName: "nova-cell1-compute-config-1") pod "c0d7830e-74a5-4ea0-b396-0095a96496be" (UID: "c0d7830e-74a5-4ea0-b396-0095a96496be"). InnerVolumeSpecName "nova-cell1-compute-config-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 16:45:19 crc kubenswrapper[4707]: I1127 16:45:19.760130 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c0d7830e-74a5-4ea0-b396-0095a96496be-inventory" (OuterVolumeSpecName: "inventory") pod "c0d7830e-74a5-4ea0-b396-0095a96496be" (UID: "c0d7830e-74a5-4ea0-b396-0095a96496be"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 16:45:19 crc kubenswrapper[4707]: I1127 16:45:19.773427 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c0d7830e-74a5-4ea0-b396-0095a96496be-nova-extra-config-0" (OuterVolumeSpecName: "nova-extra-config-0") pod "c0d7830e-74a5-4ea0-b396-0095a96496be" (UID: "c0d7830e-74a5-4ea0-b396-0095a96496be"). InnerVolumeSpecName "nova-extra-config-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 16:45:19 crc kubenswrapper[4707]: I1127 16:45:19.785264 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c0d7830e-74a5-4ea0-b396-0095a96496be-nova-cell1-compute-config-0" (OuterVolumeSpecName: "nova-cell1-compute-config-0") pod "c0d7830e-74a5-4ea0-b396-0095a96496be" (UID: "c0d7830e-74a5-4ea0-b396-0095a96496be"). InnerVolumeSpecName "nova-cell1-compute-config-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 16:45:19 crc kubenswrapper[4707]: I1127 16:45:19.790559 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c0d7830e-74a5-4ea0-b396-0095a96496be-nova-migration-ssh-key-0" (OuterVolumeSpecName: "nova-migration-ssh-key-0") pod "c0d7830e-74a5-4ea0-b396-0095a96496be" (UID: "c0d7830e-74a5-4ea0-b396-0095a96496be"). InnerVolumeSpecName "nova-migration-ssh-key-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 16:45:19 crc kubenswrapper[4707]: I1127 16:45:19.823187 4707 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/c0d7830e-74a5-4ea0-b396-0095a96496be-nova-migration-ssh-key-1\") on node \"crc\" DevicePath \"\"" Nov 27 16:45:19 crc kubenswrapper[4707]: I1127 16:45:19.823232 4707 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c0d7830e-74a5-4ea0-b396-0095a96496be-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 27 16:45:19 crc kubenswrapper[4707]: I1127 16:45:19.823245 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-clj48\" (UniqueName: \"kubernetes.io/projected/c0d7830e-74a5-4ea0-b396-0095a96496be-kube-api-access-clj48\") on node \"crc\" DevicePath \"\"" Nov 27 16:45:19 crc kubenswrapper[4707]: I1127 16:45:19.823259 4707 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/c0d7830e-74a5-4ea0-b396-0095a96496be-nova-migration-ssh-key-0\") on node \"crc\" DevicePath \"\"" Nov 27 16:45:19 crc kubenswrapper[4707]: I1127 16:45:19.823273 4707 reconciler_common.go:293] "Volume detached for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/c0d7830e-74a5-4ea0-b396-0095a96496be-nova-extra-config-0\") on node \"crc\" DevicePath \"\"" Nov 27 16:45:19 crc kubenswrapper[4707]: I1127 16:45:19.823285 4707 
reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/c0d7830e-74a5-4ea0-b396-0095a96496be-nova-cell1-compute-config-1\") on node \"crc\" DevicePath \"\"" Nov 27 16:45:19 crc kubenswrapper[4707]: I1127 16:45:19.823297 4707 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c0d7830e-74a5-4ea0-b396-0095a96496be-inventory\") on node \"crc\" DevicePath \"\"" Nov 27 16:45:19 crc kubenswrapper[4707]: I1127 16:45:19.823307 4707 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c0d7830e-74a5-4ea0-b396-0095a96496be-ssh-key\") on node \"crc\" DevicePath \"\"" Nov 27 16:45:19 crc kubenswrapper[4707]: I1127 16:45:19.823317 4707 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/c0d7830e-74a5-4ea0-b396-0095a96496be-nova-cell1-compute-config-0\") on node \"crc\" DevicePath \"\"" Nov 27 16:45:20 crc kubenswrapper[4707]: I1127 16:45:20.035666 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-lxzx6" event={"ID":"c0d7830e-74a5-4ea0-b396-0095a96496be","Type":"ContainerDied","Data":"0bb487ea7fae4634b0964ceec7420ccaa866dff31c0706700ff43cd3c3af9aee"} Nov 27 16:45:20 crc kubenswrapper[4707]: I1127 16:45:20.035717 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0bb487ea7fae4634b0964ceec7420ccaa866dff31c0706700ff43cd3c3af9aee" Nov 27 16:45:20 crc kubenswrapper[4707]: I1127 16:45:20.035769 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-lxzx6" Nov 27 16:45:20 crc kubenswrapper[4707]: I1127 16:45:20.143936 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-pk25z"] Nov 27 16:45:20 crc kubenswrapper[4707]: E1127 16:45:20.144856 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a9bda09e-8eef-4e16-841f-ec75f01a33b8" containerName="collect-profiles" Nov 27 16:45:20 crc kubenswrapper[4707]: I1127 16:45:20.144983 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="a9bda09e-8eef-4e16-841f-ec75f01a33b8" containerName="collect-profiles" Nov 27 16:45:20 crc kubenswrapper[4707]: E1127 16:45:20.145065 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c0d7830e-74a5-4ea0-b396-0095a96496be" containerName="nova-edpm-deployment-openstack-edpm-ipam" Nov 27 16:45:20 crc kubenswrapper[4707]: I1127 16:45:20.145141 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="c0d7830e-74a5-4ea0-b396-0095a96496be" containerName="nova-edpm-deployment-openstack-edpm-ipam" Nov 27 16:45:20 crc kubenswrapper[4707]: I1127 16:45:20.145463 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="a9bda09e-8eef-4e16-841f-ec75f01a33b8" containerName="collect-profiles" Nov 27 16:45:20 crc kubenswrapper[4707]: I1127 16:45:20.145620 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="c0d7830e-74a5-4ea0-b396-0095a96496be" containerName="nova-edpm-deployment-openstack-edpm-ipam" Nov 27 16:45:20 crc kubenswrapper[4707]: I1127 16:45:20.146703 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-pk25z" Nov 27 16:45:20 crc kubenswrapper[4707]: I1127 16:45:20.155405 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-compute-config-data" Nov 27 16:45:20 crc kubenswrapper[4707]: I1127 16:45:20.155432 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Nov 27 16:45:20 crc kubenswrapper[4707]: I1127 16:45:20.155573 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Nov 27 16:45:20 crc kubenswrapper[4707]: I1127 16:45:20.155421 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Nov 27 16:45:20 crc kubenswrapper[4707]: I1127 16:45:20.156827 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-h79s2" Nov 27 16:45:20 crc kubenswrapper[4707]: I1127 16:45:20.157159 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-pk25z"] Nov 27 16:45:20 crc kubenswrapper[4707]: I1127 16:45:20.334005 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/03d3491e-8e8f-49a2-8552-f939d87bbb59-ssh-key\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-pk25z\" (UID: \"03d3491e-8e8f-49a2-8552-f939d87bbb59\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-pk25z" Nov 27 16:45:20 crc kubenswrapper[4707]: I1127 16:45:20.334103 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/03d3491e-8e8f-49a2-8552-f939d87bbb59-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-pk25z\" (UID: 
\"03d3491e-8e8f-49a2-8552-f939d87bbb59\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-pk25z" Nov 27 16:45:20 crc kubenswrapper[4707]: I1127 16:45:20.334153 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/03d3491e-8e8f-49a2-8552-f939d87bbb59-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-pk25z\" (UID: \"03d3491e-8e8f-49a2-8552-f939d87bbb59\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-pk25z" Nov 27 16:45:20 crc kubenswrapper[4707]: I1127 16:45:20.334173 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/03d3491e-8e8f-49a2-8552-f939d87bbb59-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-pk25z\" (UID: \"03d3491e-8e8f-49a2-8552-f939d87bbb59\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-pk25z" Nov 27 16:45:20 crc kubenswrapper[4707]: I1127 16:45:20.334346 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5bdxw\" (UniqueName: \"kubernetes.io/projected/03d3491e-8e8f-49a2-8552-f939d87bbb59-kube-api-access-5bdxw\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-pk25z\" (UID: \"03d3491e-8e8f-49a2-8552-f939d87bbb59\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-pk25z" Nov 27 16:45:20 crc kubenswrapper[4707]: I1127 16:45:20.334390 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/03d3491e-8e8f-49a2-8552-f939d87bbb59-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-pk25z\" (UID: \"03d3491e-8e8f-49a2-8552-f939d87bbb59\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-pk25z" Nov 27 
16:45:20 crc kubenswrapper[4707]: I1127 16:45:20.334425 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/03d3491e-8e8f-49a2-8552-f939d87bbb59-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-pk25z\" (UID: \"03d3491e-8e8f-49a2-8552-f939d87bbb59\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-pk25z" Nov 27 16:45:20 crc kubenswrapper[4707]: I1127 16:45:20.437014 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5bdxw\" (UniqueName: \"kubernetes.io/projected/03d3491e-8e8f-49a2-8552-f939d87bbb59-kube-api-access-5bdxw\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-pk25z\" (UID: \"03d3491e-8e8f-49a2-8552-f939d87bbb59\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-pk25z" Nov 27 16:45:20 crc kubenswrapper[4707]: I1127 16:45:20.437085 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/03d3491e-8e8f-49a2-8552-f939d87bbb59-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-pk25z\" (UID: \"03d3491e-8e8f-49a2-8552-f939d87bbb59\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-pk25z" Nov 27 16:45:20 crc kubenswrapper[4707]: I1127 16:45:20.437151 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/03d3491e-8e8f-49a2-8552-f939d87bbb59-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-pk25z\" (UID: \"03d3491e-8e8f-49a2-8552-f939d87bbb59\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-pk25z" Nov 27 16:45:20 crc kubenswrapper[4707]: I1127 16:45:20.437249 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/03d3491e-8e8f-49a2-8552-f939d87bbb59-ssh-key\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-pk25z\" (UID: \"03d3491e-8e8f-49a2-8552-f939d87bbb59\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-pk25z" Nov 27 16:45:20 crc kubenswrapper[4707]: I1127 16:45:20.437304 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/03d3491e-8e8f-49a2-8552-f939d87bbb59-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-pk25z\" (UID: \"03d3491e-8e8f-49a2-8552-f939d87bbb59\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-pk25z" Nov 27 16:45:20 crc kubenswrapper[4707]: I1127 16:45:20.437354 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/03d3491e-8e8f-49a2-8552-f939d87bbb59-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-pk25z\" (UID: \"03d3491e-8e8f-49a2-8552-f939d87bbb59\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-pk25z" Nov 27 16:45:20 crc kubenswrapper[4707]: I1127 16:45:20.437393 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/03d3491e-8e8f-49a2-8552-f939d87bbb59-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-pk25z\" (UID: \"03d3491e-8e8f-49a2-8552-f939d87bbb59\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-pk25z" Nov 27 16:45:20 crc kubenswrapper[4707]: I1127 16:45:20.441970 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/03d3491e-8e8f-49a2-8552-f939d87bbb59-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-pk25z\" (UID: \"03d3491e-8e8f-49a2-8552-f939d87bbb59\") " 
pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-pk25z" Nov 27 16:45:20 crc kubenswrapper[4707]: I1127 16:45:20.443014 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/03d3491e-8e8f-49a2-8552-f939d87bbb59-ssh-key\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-pk25z\" (UID: \"03d3491e-8e8f-49a2-8552-f939d87bbb59\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-pk25z" Nov 27 16:45:20 crc kubenswrapper[4707]: I1127 16:45:20.443118 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/03d3491e-8e8f-49a2-8552-f939d87bbb59-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-pk25z\" (UID: \"03d3491e-8e8f-49a2-8552-f939d87bbb59\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-pk25z" Nov 27 16:45:20 crc kubenswrapper[4707]: I1127 16:45:20.443165 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/03d3491e-8e8f-49a2-8552-f939d87bbb59-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-pk25z\" (UID: \"03d3491e-8e8f-49a2-8552-f939d87bbb59\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-pk25z" Nov 27 16:45:20 crc kubenswrapper[4707]: I1127 16:45:20.444274 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/03d3491e-8e8f-49a2-8552-f939d87bbb59-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-pk25z\" (UID: \"03d3491e-8e8f-49a2-8552-f939d87bbb59\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-pk25z" Nov 27 16:45:20 crc kubenswrapper[4707]: I1127 16:45:20.445456 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/03d3491e-8e8f-49a2-8552-f939d87bbb59-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-pk25z\" (UID: \"03d3491e-8e8f-49a2-8552-f939d87bbb59\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-pk25z" Nov 27 16:45:20 crc kubenswrapper[4707]: I1127 16:45:20.458686 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5bdxw\" (UniqueName: \"kubernetes.io/projected/03d3491e-8e8f-49a2-8552-f939d87bbb59-kube-api-access-5bdxw\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-pk25z\" (UID: \"03d3491e-8e8f-49a2-8552-f939d87bbb59\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-pk25z" Nov 27 16:45:20 crc kubenswrapper[4707]: I1127 16:45:20.466919 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-pk25z" Nov 27 16:45:21 crc kubenswrapper[4707]: I1127 16:45:21.138439 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-pk25z"] Nov 27 16:45:22 crc kubenswrapper[4707]: I1127 16:45:22.062006 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-pk25z" event={"ID":"03d3491e-8e8f-49a2-8552-f939d87bbb59","Type":"ContainerStarted","Data":"f6ac355ffce8a9b6ceb1ccc2659a94604f9591796c693f2f6b0a7081a7a18165"} Nov 27 16:45:22 crc kubenswrapper[4707]: I1127 16:45:22.062772 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-pk25z" event={"ID":"03d3491e-8e8f-49a2-8552-f939d87bbb59","Type":"ContainerStarted","Data":"073013a6fe63d289a658f35af0d260b93604af2b04e06e813123134ef2a1798f"} Nov 27 16:45:22 crc kubenswrapper[4707]: I1127 16:45:22.081228 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-pk25z" podStartSLOduration=1.442764436 podStartE2EDuration="2.08120822s" podCreationTimestamp="2025-11-27 16:45:20 +0000 UTC" firstStartedPulling="2025-11-27 16:45:21.151620647 +0000 UTC m=+2496.783069425" lastFinishedPulling="2025-11-27 16:45:21.790064431 +0000 UTC m=+2497.421513209" observedRunningTime="2025-11-27 16:45:22.080771589 +0000 UTC m=+2497.712220357" watchObservedRunningTime="2025-11-27 16:45:22.08120822 +0000 UTC m=+2497.712656998" Nov 27 16:45:29 crc kubenswrapper[4707]: I1127 16:45:29.195445 4707 scope.go:117] "RemoveContainer" containerID="f26ac9df9340ea2ea514252ce6b8bc1868c8afd678b5bdcc749e5a78be5c9f11" Nov 27 16:45:29 crc kubenswrapper[4707]: E1127 16:45:29.196360 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-c995m_openshift-machine-config-operator(a83beb0d-8dd1-434a-ace2-933f98e3956f)\"" pod="openshift-machine-config-operator/machine-config-daemon-c995m" podUID="a83beb0d-8dd1-434a-ace2-933f98e3956f" Nov 27 16:45:41 crc kubenswrapper[4707]: I1127 16:45:41.196019 4707 scope.go:117] "RemoveContainer" containerID="f26ac9df9340ea2ea514252ce6b8bc1868c8afd678b5bdcc749e5a78be5c9f11" Nov 27 16:45:41 crc kubenswrapper[4707]: E1127 16:45:41.197434 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-c995m_openshift-machine-config-operator(a83beb0d-8dd1-434a-ace2-933f98e3956f)\"" pod="openshift-machine-config-operator/machine-config-daemon-c995m" podUID="a83beb0d-8dd1-434a-ace2-933f98e3956f" Nov 27 16:45:56 crc kubenswrapper[4707]: I1127 16:45:56.197090 4707 scope.go:117] "RemoveContainer" 
containerID="f26ac9df9340ea2ea514252ce6b8bc1868c8afd678b5bdcc749e5a78be5c9f11" Nov 27 16:45:56 crc kubenswrapper[4707]: E1127 16:45:56.197921 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-c995m_openshift-machine-config-operator(a83beb0d-8dd1-434a-ace2-933f98e3956f)\"" pod="openshift-machine-config-operator/machine-config-daemon-c995m" podUID="a83beb0d-8dd1-434a-ace2-933f98e3956f" Nov 27 16:46:07 crc kubenswrapper[4707]: I1127 16:46:07.195582 4707 scope.go:117] "RemoveContainer" containerID="f26ac9df9340ea2ea514252ce6b8bc1868c8afd678b5bdcc749e5a78be5c9f11" Nov 27 16:46:07 crc kubenswrapper[4707]: E1127 16:46:07.196352 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-c995m_openshift-machine-config-operator(a83beb0d-8dd1-434a-ace2-933f98e3956f)\"" pod="openshift-machine-config-operator/machine-config-daemon-c995m" podUID="a83beb0d-8dd1-434a-ace2-933f98e3956f" Nov 27 16:46:20 crc kubenswrapper[4707]: I1127 16:46:20.195959 4707 scope.go:117] "RemoveContainer" containerID="f26ac9df9340ea2ea514252ce6b8bc1868c8afd678b5bdcc749e5a78be5c9f11" Nov 27 16:46:20 crc kubenswrapper[4707]: E1127 16:46:20.196927 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-c995m_openshift-machine-config-operator(a83beb0d-8dd1-434a-ace2-933f98e3956f)\"" pod="openshift-machine-config-operator/machine-config-daemon-c995m" podUID="a83beb0d-8dd1-434a-ace2-933f98e3956f" Nov 27 16:46:35 crc kubenswrapper[4707]: I1127 16:46:35.201435 4707 scope.go:117] 
"RemoveContainer" containerID="f26ac9df9340ea2ea514252ce6b8bc1868c8afd678b5bdcc749e5a78be5c9f11" Nov 27 16:46:35 crc kubenswrapper[4707]: E1127 16:46:35.202915 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-c995m_openshift-machine-config-operator(a83beb0d-8dd1-434a-ace2-933f98e3956f)\"" pod="openshift-machine-config-operator/machine-config-daemon-c995m" podUID="a83beb0d-8dd1-434a-ace2-933f98e3956f" Nov 27 16:46:48 crc kubenswrapper[4707]: I1127 16:46:48.195579 4707 scope.go:117] "RemoveContainer" containerID="f26ac9df9340ea2ea514252ce6b8bc1868c8afd678b5bdcc749e5a78be5c9f11" Nov 27 16:46:48 crc kubenswrapper[4707]: E1127 16:46:48.196473 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-c995m_openshift-machine-config-operator(a83beb0d-8dd1-434a-ace2-933f98e3956f)\"" pod="openshift-machine-config-operator/machine-config-daemon-c995m" podUID="a83beb0d-8dd1-434a-ace2-933f98e3956f" Nov 27 16:47:01 crc kubenswrapper[4707]: I1127 16:47:01.195833 4707 scope.go:117] "RemoveContainer" containerID="f26ac9df9340ea2ea514252ce6b8bc1868c8afd678b5bdcc749e5a78be5c9f11" Nov 27 16:47:01 crc kubenswrapper[4707]: E1127 16:47:01.197414 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-c995m_openshift-machine-config-operator(a83beb0d-8dd1-434a-ace2-933f98e3956f)\"" pod="openshift-machine-config-operator/machine-config-daemon-c995m" podUID="a83beb0d-8dd1-434a-ace2-933f98e3956f" Nov 27 16:47:15 crc kubenswrapper[4707]: I1127 16:47:15.203543 
4707 scope.go:117] "RemoveContainer" containerID="f26ac9df9340ea2ea514252ce6b8bc1868c8afd678b5bdcc749e5a78be5c9f11" Nov 27 16:47:16 crc kubenswrapper[4707]: I1127 16:47:16.203682 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-c995m" event={"ID":"a83beb0d-8dd1-434a-ace2-933f98e3956f","Type":"ContainerStarted","Data":"e81264f6a6a3db0f1a2727a11a97e72a9ebad7ddc0f52e59880a37155bb6a287"} Nov 27 16:47:59 crc kubenswrapper[4707]: I1127 16:47:59.653602 4707 generic.go:334] "Generic (PLEG): container finished" podID="03d3491e-8e8f-49a2-8552-f939d87bbb59" containerID="f6ac355ffce8a9b6ceb1ccc2659a94604f9591796c693f2f6b0a7081a7a18165" exitCode=0 Nov 27 16:47:59 crc kubenswrapper[4707]: I1127 16:47:59.653773 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-pk25z" event={"ID":"03d3491e-8e8f-49a2-8552-f939d87bbb59","Type":"ContainerDied","Data":"f6ac355ffce8a9b6ceb1ccc2659a94604f9591796c693f2f6b0a7081a7a18165"} Nov 27 16:48:01 crc kubenswrapper[4707]: I1127 16:48:01.126869 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-pk25z" Nov 27 16:48:01 crc kubenswrapper[4707]: I1127 16:48:01.212466 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/03d3491e-8e8f-49a2-8552-f939d87bbb59-ceilometer-compute-config-data-0\") pod \"03d3491e-8e8f-49a2-8552-f939d87bbb59\" (UID: \"03d3491e-8e8f-49a2-8552-f939d87bbb59\") " Nov 27 16:48:01 crc kubenswrapper[4707]: I1127 16:48:01.212545 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/03d3491e-8e8f-49a2-8552-f939d87bbb59-inventory\") pod \"03d3491e-8e8f-49a2-8552-f939d87bbb59\" (UID: \"03d3491e-8e8f-49a2-8552-f939d87bbb59\") " Nov 27 16:48:01 crc kubenswrapper[4707]: I1127 16:48:01.212598 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/03d3491e-8e8f-49a2-8552-f939d87bbb59-ceilometer-compute-config-data-1\") pod \"03d3491e-8e8f-49a2-8552-f939d87bbb59\" (UID: \"03d3491e-8e8f-49a2-8552-f939d87bbb59\") " Nov 27 16:48:01 crc kubenswrapper[4707]: I1127 16:48:01.212755 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/03d3491e-8e8f-49a2-8552-f939d87bbb59-ssh-key\") pod \"03d3491e-8e8f-49a2-8552-f939d87bbb59\" (UID: \"03d3491e-8e8f-49a2-8552-f939d87bbb59\") " Nov 27 16:48:01 crc kubenswrapper[4707]: I1127 16:48:01.212817 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/03d3491e-8e8f-49a2-8552-f939d87bbb59-telemetry-combined-ca-bundle\") pod \"03d3491e-8e8f-49a2-8552-f939d87bbb59\" (UID: \"03d3491e-8e8f-49a2-8552-f939d87bbb59\") " Nov 27 16:48:01 crc kubenswrapper[4707]: I1127 16:48:01.212847 4707 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5bdxw\" (UniqueName: \"kubernetes.io/projected/03d3491e-8e8f-49a2-8552-f939d87bbb59-kube-api-access-5bdxw\") pod \"03d3491e-8e8f-49a2-8552-f939d87bbb59\" (UID: \"03d3491e-8e8f-49a2-8552-f939d87bbb59\") " Nov 27 16:48:01 crc kubenswrapper[4707]: I1127 16:48:01.212898 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/03d3491e-8e8f-49a2-8552-f939d87bbb59-ceilometer-compute-config-data-2\") pod \"03d3491e-8e8f-49a2-8552-f939d87bbb59\" (UID: \"03d3491e-8e8f-49a2-8552-f939d87bbb59\") " Nov 27 16:48:01 crc kubenswrapper[4707]: I1127 16:48:01.218525 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/03d3491e-8e8f-49a2-8552-f939d87bbb59-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "03d3491e-8e8f-49a2-8552-f939d87bbb59" (UID: "03d3491e-8e8f-49a2-8552-f939d87bbb59"). InnerVolumeSpecName "telemetry-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 16:48:01 crc kubenswrapper[4707]: I1127 16:48:01.218525 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/03d3491e-8e8f-49a2-8552-f939d87bbb59-kube-api-access-5bdxw" (OuterVolumeSpecName: "kube-api-access-5bdxw") pod "03d3491e-8e8f-49a2-8552-f939d87bbb59" (UID: "03d3491e-8e8f-49a2-8552-f939d87bbb59"). InnerVolumeSpecName "kube-api-access-5bdxw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 16:48:01 crc kubenswrapper[4707]: I1127 16:48:01.238264 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/03d3491e-8e8f-49a2-8552-f939d87bbb59-ceilometer-compute-config-data-1" (OuterVolumeSpecName: "ceilometer-compute-config-data-1") pod "03d3491e-8e8f-49a2-8552-f939d87bbb59" (UID: "03d3491e-8e8f-49a2-8552-f939d87bbb59"). InnerVolumeSpecName "ceilometer-compute-config-data-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 16:48:01 crc kubenswrapper[4707]: I1127 16:48:01.242879 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/03d3491e-8e8f-49a2-8552-f939d87bbb59-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "03d3491e-8e8f-49a2-8552-f939d87bbb59" (UID: "03d3491e-8e8f-49a2-8552-f939d87bbb59"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 16:48:01 crc kubenswrapper[4707]: I1127 16:48:01.242974 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/03d3491e-8e8f-49a2-8552-f939d87bbb59-inventory" (OuterVolumeSpecName: "inventory") pod "03d3491e-8e8f-49a2-8552-f939d87bbb59" (UID: "03d3491e-8e8f-49a2-8552-f939d87bbb59"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 16:48:01 crc kubenswrapper[4707]: I1127 16:48:01.244924 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/03d3491e-8e8f-49a2-8552-f939d87bbb59-ceilometer-compute-config-data-2" (OuterVolumeSpecName: "ceilometer-compute-config-data-2") pod "03d3491e-8e8f-49a2-8552-f939d87bbb59" (UID: "03d3491e-8e8f-49a2-8552-f939d87bbb59"). InnerVolumeSpecName "ceilometer-compute-config-data-2". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 16:48:01 crc kubenswrapper[4707]: I1127 16:48:01.256689 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/03d3491e-8e8f-49a2-8552-f939d87bbb59-ceilometer-compute-config-data-0" (OuterVolumeSpecName: "ceilometer-compute-config-data-0") pod "03d3491e-8e8f-49a2-8552-f939d87bbb59" (UID: "03d3491e-8e8f-49a2-8552-f939d87bbb59"). InnerVolumeSpecName "ceilometer-compute-config-data-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 16:48:01 crc kubenswrapper[4707]: I1127 16:48:01.315991 4707 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/03d3491e-8e8f-49a2-8552-f939d87bbb59-ceilometer-compute-config-data-1\") on node \"crc\" DevicePath \"\"" Nov 27 16:48:01 crc kubenswrapper[4707]: I1127 16:48:01.316015 4707 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/03d3491e-8e8f-49a2-8552-f939d87bbb59-ssh-key\") on node \"crc\" DevicePath \"\"" Nov 27 16:48:01 crc kubenswrapper[4707]: I1127 16:48:01.316026 4707 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/03d3491e-8e8f-49a2-8552-f939d87bbb59-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 27 16:48:01 crc kubenswrapper[4707]: I1127 16:48:01.316036 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5bdxw\" (UniqueName: \"kubernetes.io/projected/03d3491e-8e8f-49a2-8552-f939d87bbb59-kube-api-access-5bdxw\") on node \"crc\" DevicePath \"\"" Nov 27 16:48:01 crc kubenswrapper[4707]: I1127 16:48:01.316044 4707 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/03d3491e-8e8f-49a2-8552-f939d87bbb59-ceilometer-compute-config-data-2\") on node \"crc\" DevicePath \"\"" Nov 27 16:48:01 crc 
kubenswrapper[4707]: I1127 16:48:01.316053 4707 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/03d3491e-8e8f-49a2-8552-f939d87bbb59-ceilometer-compute-config-data-0\") on node \"crc\" DevicePath \"\"" Nov 27 16:48:01 crc kubenswrapper[4707]: I1127 16:48:01.316063 4707 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/03d3491e-8e8f-49a2-8552-f939d87bbb59-inventory\") on node \"crc\" DevicePath \"\"" Nov 27 16:48:01 crc kubenswrapper[4707]: I1127 16:48:01.676824 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-pk25z" event={"ID":"03d3491e-8e8f-49a2-8552-f939d87bbb59","Type":"ContainerDied","Data":"073013a6fe63d289a658f35af0d260b93604af2b04e06e813123134ef2a1798f"} Nov 27 16:48:01 crc kubenswrapper[4707]: I1127 16:48:01.676874 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="073013a6fe63d289a658f35af0d260b93604af2b04e06e813123134ef2a1798f" Nov 27 16:48:01 crc kubenswrapper[4707]: I1127 16:48:01.676946 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-pk25z" Nov 27 16:49:33 crc kubenswrapper[4707]: I1127 16:49:33.623581 4707 patch_prober.go:28] interesting pod/machine-config-daemon-c995m container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 27 16:49:33 crc kubenswrapper[4707]: I1127 16:49:33.624306 4707 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-c995m" podUID="a83beb0d-8dd1-434a-ace2-933f98e3956f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 27 16:50:03 crc kubenswrapper[4707]: I1127 16:50:03.623658 4707 patch_prober.go:28] interesting pod/machine-config-daemon-c995m container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 27 16:50:03 crc kubenswrapper[4707]: I1127 16:50:03.624172 4707 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-c995m" podUID="a83beb0d-8dd1-434a-ace2-933f98e3956f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 27 16:50:04 crc kubenswrapper[4707]: I1127 16:50:04.851191 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-wgz6q"] Nov 27 16:50:04 crc kubenswrapper[4707]: E1127 16:50:04.851708 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="03d3491e-8e8f-49a2-8552-f939d87bbb59" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Nov 27 
16:50:04 crc kubenswrapper[4707]: I1127 16:50:04.851724 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="03d3491e-8e8f-49a2-8552-f939d87bbb59" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Nov 27 16:50:04 crc kubenswrapper[4707]: I1127 16:50:04.851993 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="03d3491e-8e8f-49a2-8552-f939d87bbb59" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Nov 27 16:50:04 crc kubenswrapper[4707]: I1127 16:50:04.854009 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-wgz6q" Nov 27 16:50:04 crc kubenswrapper[4707]: I1127 16:50:04.865051 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-wgz6q"] Nov 27 16:50:04 crc kubenswrapper[4707]: I1127 16:50:04.939145 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-87gv4\" (UniqueName: \"kubernetes.io/projected/9837eddb-3276-42b0-a378-2344edcd80d6-kube-api-access-87gv4\") pod \"certified-operators-wgz6q\" (UID: \"9837eddb-3276-42b0-a378-2344edcd80d6\") " pod="openshift-marketplace/certified-operators-wgz6q" Nov 27 16:50:04 crc kubenswrapper[4707]: I1127 16:50:04.939221 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9837eddb-3276-42b0-a378-2344edcd80d6-catalog-content\") pod \"certified-operators-wgz6q\" (UID: \"9837eddb-3276-42b0-a378-2344edcd80d6\") " pod="openshift-marketplace/certified-operators-wgz6q" Nov 27 16:50:04 crc kubenswrapper[4707]: I1127 16:50:04.939539 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9837eddb-3276-42b0-a378-2344edcd80d6-utilities\") pod \"certified-operators-wgz6q\" (UID: 
\"9837eddb-3276-42b0-a378-2344edcd80d6\") " pod="openshift-marketplace/certified-operators-wgz6q" Nov 27 16:50:05 crc kubenswrapper[4707]: I1127 16:50:05.041556 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-87gv4\" (UniqueName: \"kubernetes.io/projected/9837eddb-3276-42b0-a378-2344edcd80d6-kube-api-access-87gv4\") pod \"certified-operators-wgz6q\" (UID: \"9837eddb-3276-42b0-a378-2344edcd80d6\") " pod="openshift-marketplace/certified-operators-wgz6q" Nov 27 16:50:05 crc kubenswrapper[4707]: I1127 16:50:05.041619 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9837eddb-3276-42b0-a378-2344edcd80d6-catalog-content\") pod \"certified-operators-wgz6q\" (UID: \"9837eddb-3276-42b0-a378-2344edcd80d6\") " pod="openshift-marketplace/certified-operators-wgz6q" Nov 27 16:50:05 crc kubenswrapper[4707]: I1127 16:50:05.041754 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9837eddb-3276-42b0-a378-2344edcd80d6-utilities\") pod \"certified-operators-wgz6q\" (UID: \"9837eddb-3276-42b0-a378-2344edcd80d6\") " pod="openshift-marketplace/certified-operators-wgz6q" Nov 27 16:50:05 crc kubenswrapper[4707]: I1127 16:50:05.042360 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9837eddb-3276-42b0-a378-2344edcd80d6-utilities\") pod \"certified-operators-wgz6q\" (UID: \"9837eddb-3276-42b0-a378-2344edcd80d6\") " pod="openshift-marketplace/certified-operators-wgz6q" Nov 27 16:50:05 crc kubenswrapper[4707]: I1127 16:50:05.042475 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9837eddb-3276-42b0-a378-2344edcd80d6-catalog-content\") pod \"certified-operators-wgz6q\" (UID: \"9837eddb-3276-42b0-a378-2344edcd80d6\") 
" pod="openshift-marketplace/certified-operators-wgz6q" Nov 27 16:50:05 crc kubenswrapper[4707]: I1127 16:50:05.071481 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-87gv4\" (UniqueName: \"kubernetes.io/projected/9837eddb-3276-42b0-a378-2344edcd80d6-kube-api-access-87gv4\") pod \"certified-operators-wgz6q\" (UID: \"9837eddb-3276-42b0-a378-2344edcd80d6\") " pod="openshift-marketplace/certified-operators-wgz6q" Nov 27 16:50:05 crc kubenswrapper[4707]: I1127 16:50:05.210977 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-wgz6q" Nov 27 16:50:05 crc kubenswrapper[4707]: I1127 16:50:05.683830 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-wgz6q"] Nov 27 16:50:06 crc kubenswrapper[4707]: I1127 16:50:05.997268 4707 generic.go:334] "Generic (PLEG): container finished" podID="9837eddb-3276-42b0-a378-2344edcd80d6" containerID="54603fe32cae960a4669a55066b174ba6972f5a4a05a985d72d6c6fb44814c06" exitCode=0 Nov 27 16:50:06 crc kubenswrapper[4707]: I1127 16:50:05.997335 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wgz6q" event={"ID":"9837eddb-3276-42b0-a378-2344edcd80d6","Type":"ContainerDied","Data":"54603fe32cae960a4669a55066b174ba6972f5a4a05a985d72d6c6fb44814c06"} Nov 27 16:50:06 crc kubenswrapper[4707]: I1127 16:50:05.997728 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wgz6q" event={"ID":"9837eddb-3276-42b0-a378-2344edcd80d6","Type":"ContainerStarted","Data":"e5f1f69ee7ced0eef1a6e9abe5a14405abef8397ea28a22b43c7887c5c101a7d"} Nov 27 16:50:06 crc kubenswrapper[4707]: I1127 16:50:06.004205 4707 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Nov 27 16:50:08 crc kubenswrapper[4707]: I1127 16:50:08.024497 4707 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/certified-operators-wgz6q" event={"ID":"9837eddb-3276-42b0-a378-2344edcd80d6","Type":"ContainerStarted","Data":"78a0e833aee5737839c9b5aac48a586974c04214c832d6810f59a4601f1eaefc"} Nov 27 16:50:09 crc kubenswrapper[4707]: I1127 16:50:09.038767 4707 generic.go:334] "Generic (PLEG): container finished" podID="9837eddb-3276-42b0-a378-2344edcd80d6" containerID="78a0e833aee5737839c9b5aac48a586974c04214c832d6810f59a4601f1eaefc" exitCode=0 Nov 27 16:50:09 crc kubenswrapper[4707]: I1127 16:50:09.040444 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wgz6q" event={"ID":"9837eddb-3276-42b0-a378-2344edcd80d6","Type":"ContainerDied","Data":"78a0e833aee5737839c9b5aac48a586974c04214c832d6810f59a4601f1eaefc"} Nov 27 16:50:11 crc kubenswrapper[4707]: I1127 16:50:11.072565 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wgz6q" event={"ID":"9837eddb-3276-42b0-a378-2344edcd80d6","Type":"ContainerStarted","Data":"f6dc1c627b52c3d60961facac6db77f12b7f67e4102ada5d28c32ff0fc92ce72"} Nov 27 16:50:11 crc kubenswrapper[4707]: I1127 16:50:11.101165 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-wgz6q" podStartSLOduration=3.122657012 podStartE2EDuration="7.101147046s" podCreationTimestamp="2025-11-27 16:50:04 +0000 UTC" firstStartedPulling="2025-11-27 16:50:06.003706602 +0000 UTC m=+2781.635155410" lastFinishedPulling="2025-11-27 16:50:09.982196646 +0000 UTC m=+2785.613645444" observedRunningTime="2025-11-27 16:50:11.095416485 +0000 UTC m=+2786.726865253" watchObservedRunningTime="2025-11-27 16:50:11.101147046 +0000 UTC m=+2786.732595814" Nov 27 16:50:15 crc kubenswrapper[4707]: I1127 16:50:15.219948 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-wgz6q" Nov 27 16:50:15 crc kubenswrapper[4707]: I1127 16:50:15.220723 
4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-wgz6q" Nov 27 16:50:15 crc kubenswrapper[4707]: I1127 16:50:15.277546 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-wgz6q" Nov 27 16:50:16 crc kubenswrapper[4707]: I1127 16:50:16.188801 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-wgz6q" Nov 27 16:50:16 crc kubenswrapper[4707]: I1127 16:50:16.245766 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-wgz6q"] Nov 27 16:50:18 crc kubenswrapper[4707]: I1127 16:50:18.158116 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-wgz6q" podUID="9837eddb-3276-42b0-a378-2344edcd80d6" containerName="registry-server" containerID="cri-o://f6dc1c627b52c3d60961facac6db77f12b7f67e4102ada5d28c32ff0fc92ce72" gracePeriod=2 Nov 27 16:50:18 crc kubenswrapper[4707]: I1127 16:50:18.655108 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-wgz6q" Nov 27 16:50:18 crc kubenswrapper[4707]: I1127 16:50:18.768812 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9837eddb-3276-42b0-a378-2344edcd80d6-catalog-content\") pod \"9837eddb-3276-42b0-a378-2344edcd80d6\" (UID: \"9837eddb-3276-42b0-a378-2344edcd80d6\") " Nov 27 16:50:18 crc kubenswrapper[4707]: I1127 16:50:18.768934 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9837eddb-3276-42b0-a378-2344edcd80d6-utilities\") pod \"9837eddb-3276-42b0-a378-2344edcd80d6\" (UID: \"9837eddb-3276-42b0-a378-2344edcd80d6\") " Nov 27 16:50:18 crc kubenswrapper[4707]: I1127 16:50:18.769001 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-87gv4\" (UniqueName: \"kubernetes.io/projected/9837eddb-3276-42b0-a378-2344edcd80d6-kube-api-access-87gv4\") pod \"9837eddb-3276-42b0-a378-2344edcd80d6\" (UID: \"9837eddb-3276-42b0-a378-2344edcd80d6\") " Nov 27 16:50:18 crc kubenswrapper[4707]: I1127 16:50:18.769784 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9837eddb-3276-42b0-a378-2344edcd80d6-utilities" (OuterVolumeSpecName: "utilities") pod "9837eddb-3276-42b0-a378-2344edcd80d6" (UID: "9837eddb-3276-42b0-a378-2344edcd80d6"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 27 16:50:18 crc kubenswrapper[4707]: I1127 16:50:18.779397 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9837eddb-3276-42b0-a378-2344edcd80d6-kube-api-access-87gv4" (OuterVolumeSpecName: "kube-api-access-87gv4") pod "9837eddb-3276-42b0-a378-2344edcd80d6" (UID: "9837eddb-3276-42b0-a378-2344edcd80d6"). InnerVolumeSpecName "kube-api-access-87gv4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 16:50:18 crc kubenswrapper[4707]: I1127 16:50:18.818756 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9837eddb-3276-42b0-a378-2344edcd80d6-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9837eddb-3276-42b0-a378-2344edcd80d6" (UID: "9837eddb-3276-42b0-a378-2344edcd80d6"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 27 16:50:18 crc kubenswrapper[4707]: I1127 16:50:18.871387 4707 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9837eddb-3276-42b0-a378-2344edcd80d6-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 27 16:50:18 crc kubenswrapper[4707]: I1127 16:50:18.871422 4707 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9837eddb-3276-42b0-a378-2344edcd80d6-utilities\") on node \"crc\" DevicePath \"\"" Nov 27 16:50:18 crc kubenswrapper[4707]: I1127 16:50:18.871433 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-87gv4\" (UniqueName: \"kubernetes.io/projected/9837eddb-3276-42b0-a378-2344edcd80d6-kube-api-access-87gv4\") on node \"crc\" DevicePath \"\"" Nov 27 16:50:19 crc kubenswrapper[4707]: I1127 16:50:19.172751 4707 generic.go:334] "Generic (PLEG): container finished" podID="9837eddb-3276-42b0-a378-2344edcd80d6" containerID="f6dc1c627b52c3d60961facac6db77f12b7f67e4102ada5d28c32ff0fc92ce72" exitCode=0 Nov 27 16:50:19 crc kubenswrapper[4707]: I1127 16:50:19.172817 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wgz6q" event={"ID":"9837eddb-3276-42b0-a378-2344edcd80d6","Type":"ContainerDied","Data":"f6dc1c627b52c3d60961facac6db77f12b7f67e4102ada5d28c32ff0fc92ce72"} Nov 27 16:50:19 crc kubenswrapper[4707]: I1127 16:50:19.172839 4707 util.go:48] "No ready sandbox for pod can 
be found. Need to start a new one" pod="openshift-marketplace/certified-operators-wgz6q" Nov 27 16:50:19 crc kubenswrapper[4707]: I1127 16:50:19.172864 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wgz6q" event={"ID":"9837eddb-3276-42b0-a378-2344edcd80d6","Type":"ContainerDied","Data":"e5f1f69ee7ced0eef1a6e9abe5a14405abef8397ea28a22b43c7887c5c101a7d"} Nov 27 16:50:19 crc kubenswrapper[4707]: I1127 16:50:19.172894 4707 scope.go:117] "RemoveContainer" containerID="f6dc1c627b52c3d60961facac6db77f12b7f67e4102ada5d28c32ff0fc92ce72" Nov 27 16:50:19 crc kubenswrapper[4707]: I1127 16:50:19.217065 4707 scope.go:117] "RemoveContainer" containerID="78a0e833aee5737839c9b5aac48a586974c04214c832d6810f59a4601f1eaefc" Nov 27 16:50:19 crc kubenswrapper[4707]: I1127 16:50:19.240667 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-wgz6q"] Nov 27 16:50:19 crc kubenswrapper[4707]: I1127 16:50:19.258805 4707 scope.go:117] "RemoveContainer" containerID="54603fe32cae960a4669a55066b174ba6972f5a4a05a985d72d6c6fb44814c06" Nov 27 16:50:19 crc kubenswrapper[4707]: I1127 16:50:19.259756 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-wgz6q"] Nov 27 16:50:19 crc kubenswrapper[4707]: I1127 16:50:19.318473 4707 scope.go:117] "RemoveContainer" containerID="f6dc1c627b52c3d60961facac6db77f12b7f67e4102ada5d28c32ff0fc92ce72" Nov 27 16:50:19 crc kubenswrapper[4707]: E1127 16:50:19.319059 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f6dc1c627b52c3d60961facac6db77f12b7f67e4102ada5d28c32ff0fc92ce72\": container with ID starting with f6dc1c627b52c3d60961facac6db77f12b7f67e4102ada5d28c32ff0fc92ce72 not found: ID does not exist" containerID="f6dc1c627b52c3d60961facac6db77f12b7f67e4102ada5d28c32ff0fc92ce72" Nov 27 16:50:19 crc kubenswrapper[4707]: I1127 16:50:19.319112 
4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f6dc1c627b52c3d60961facac6db77f12b7f67e4102ada5d28c32ff0fc92ce72"} err="failed to get container status \"f6dc1c627b52c3d60961facac6db77f12b7f67e4102ada5d28c32ff0fc92ce72\": rpc error: code = NotFound desc = could not find container \"f6dc1c627b52c3d60961facac6db77f12b7f67e4102ada5d28c32ff0fc92ce72\": container with ID starting with f6dc1c627b52c3d60961facac6db77f12b7f67e4102ada5d28c32ff0fc92ce72 not found: ID does not exist" Nov 27 16:50:19 crc kubenswrapper[4707]: I1127 16:50:19.319145 4707 scope.go:117] "RemoveContainer" containerID="78a0e833aee5737839c9b5aac48a586974c04214c832d6810f59a4601f1eaefc" Nov 27 16:50:19 crc kubenswrapper[4707]: E1127 16:50:19.319587 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"78a0e833aee5737839c9b5aac48a586974c04214c832d6810f59a4601f1eaefc\": container with ID starting with 78a0e833aee5737839c9b5aac48a586974c04214c832d6810f59a4601f1eaefc not found: ID does not exist" containerID="78a0e833aee5737839c9b5aac48a586974c04214c832d6810f59a4601f1eaefc" Nov 27 16:50:19 crc kubenswrapper[4707]: I1127 16:50:19.319742 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"78a0e833aee5737839c9b5aac48a586974c04214c832d6810f59a4601f1eaefc"} err="failed to get container status \"78a0e833aee5737839c9b5aac48a586974c04214c832d6810f59a4601f1eaefc\": rpc error: code = NotFound desc = could not find container \"78a0e833aee5737839c9b5aac48a586974c04214c832d6810f59a4601f1eaefc\": container with ID starting with 78a0e833aee5737839c9b5aac48a586974c04214c832d6810f59a4601f1eaefc not found: ID does not exist" Nov 27 16:50:19 crc kubenswrapper[4707]: I1127 16:50:19.319872 4707 scope.go:117] "RemoveContainer" containerID="54603fe32cae960a4669a55066b174ba6972f5a4a05a985d72d6c6fb44814c06" Nov 27 16:50:19 crc kubenswrapper[4707]: E1127 
16:50:19.320684 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"54603fe32cae960a4669a55066b174ba6972f5a4a05a985d72d6c6fb44814c06\": container with ID starting with 54603fe32cae960a4669a55066b174ba6972f5a4a05a985d72d6c6fb44814c06 not found: ID does not exist" containerID="54603fe32cae960a4669a55066b174ba6972f5a4a05a985d72d6c6fb44814c06" Nov 27 16:50:19 crc kubenswrapper[4707]: I1127 16:50:19.320726 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"54603fe32cae960a4669a55066b174ba6972f5a4a05a985d72d6c6fb44814c06"} err="failed to get container status \"54603fe32cae960a4669a55066b174ba6972f5a4a05a985d72d6c6fb44814c06\": rpc error: code = NotFound desc = could not find container \"54603fe32cae960a4669a55066b174ba6972f5a4a05a985d72d6c6fb44814c06\": container with ID starting with 54603fe32cae960a4669a55066b174ba6972f5a4a05a985d72d6c6fb44814c06 not found: ID does not exist" Nov 27 16:50:21 crc kubenswrapper[4707]: I1127 16:50:21.209290 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9837eddb-3276-42b0-a378-2344edcd80d6" path="/var/lib/kubelet/pods/9837eddb-3276-42b0-a378-2344edcd80d6/volumes" Nov 27 16:50:33 crc kubenswrapper[4707]: I1127 16:50:33.623715 4707 patch_prober.go:28] interesting pod/machine-config-daemon-c995m container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 27 16:50:33 crc kubenswrapper[4707]: I1127 16:50:33.626070 4707 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-c995m" podUID="a83beb0d-8dd1-434a-ace2-933f98e3956f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection 
refused" Nov 27 16:50:33 crc kubenswrapper[4707]: I1127 16:50:33.626301 4707 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-c995m" Nov 27 16:50:33 crc kubenswrapper[4707]: I1127 16:50:33.627682 4707 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"e81264f6a6a3db0f1a2727a11a97e72a9ebad7ddc0f52e59880a37155bb6a287"} pod="openshift-machine-config-operator/machine-config-daemon-c995m" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Nov 27 16:50:33 crc kubenswrapper[4707]: I1127 16:50:33.628005 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-c995m" podUID="a83beb0d-8dd1-434a-ace2-933f98e3956f" containerName="machine-config-daemon" containerID="cri-o://e81264f6a6a3db0f1a2727a11a97e72a9ebad7ddc0f52e59880a37155bb6a287" gracePeriod=600 Nov 27 16:50:34 crc kubenswrapper[4707]: I1127 16:50:34.336120 4707 generic.go:334] "Generic (PLEG): container finished" podID="a83beb0d-8dd1-434a-ace2-933f98e3956f" containerID="e81264f6a6a3db0f1a2727a11a97e72a9ebad7ddc0f52e59880a37155bb6a287" exitCode=0 Nov 27 16:50:34 crc kubenswrapper[4707]: I1127 16:50:34.336225 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-c995m" event={"ID":"a83beb0d-8dd1-434a-ace2-933f98e3956f","Type":"ContainerDied","Data":"e81264f6a6a3db0f1a2727a11a97e72a9ebad7ddc0f52e59880a37155bb6a287"} Nov 27 16:50:34 crc kubenswrapper[4707]: I1127 16:50:34.337002 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-c995m" event={"ID":"a83beb0d-8dd1-434a-ace2-933f98e3956f","Type":"ContainerStarted","Data":"922f6f006f59568ee3b2de9c7061fa21de9cccddc36312f2d9ef753b4834d084"} Nov 27 16:50:34 crc 
kubenswrapper[4707]: I1127 16:50:34.337035 4707 scope.go:117] "RemoveContainer" containerID="f26ac9df9340ea2ea514252ce6b8bc1868c8afd678b5bdcc749e5a78be5c9f11" Nov 27 16:50:55 crc kubenswrapper[4707]: I1127 16:50:55.119498 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-8665cb7d49-w7xqm_94ba089c-d890-4d57-abc0-258a2b54a6f9/manager/0.log" Nov 27 16:50:56 crc kubenswrapper[4707]: I1127 16:50:56.909864 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstackclient"] Nov 27 16:50:56 crc kubenswrapper[4707]: I1127 16:50:56.910475 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/openstackclient" podUID="b281d412-34bb-4169-8c35-65318084ab97" containerName="openstackclient" containerID="cri-o://0c2c4dbbd082a4f51e11901ae546de62c1a591d4c9a34a65db9dd0fef2ec8082" gracePeriod=2 Nov 27 16:50:56 crc kubenswrapper[4707]: I1127 16:50:56.924648 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/openstackclient"] Nov 27 16:50:56 crc kubenswrapper[4707]: I1127 16:50:56.953396 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Nov 27 16:50:56 crc kubenswrapper[4707]: E1127 16:50:56.954112 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9837eddb-3276-42b0-a378-2344edcd80d6" containerName="registry-server" Nov 27 16:50:56 crc kubenswrapper[4707]: I1127 16:50:56.954207 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="9837eddb-3276-42b0-a378-2344edcd80d6" containerName="registry-server" Nov 27 16:50:56 crc kubenswrapper[4707]: E1127 16:50:56.954321 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9837eddb-3276-42b0-a378-2344edcd80d6" containerName="extract-utilities" Nov 27 16:50:56 crc kubenswrapper[4707]: I1127 16:50:56.954432 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="9837eddb-3276-42b0-a378-2344edcd80d6" 
containerName="extract-utilities" Nov 27 16:50:56 crc kubenswrapper[4707]: E1127 16:50:56.954513 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9837eddb-3276-42b0-a378-2344edcd80d6" containerName="extract-content" Nov 27 16:50:56 crc kubenswrapper[4707]: I1127 16:50:56.954582 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="9837eddb-3276-42b0-a378-2344edcd80d6" containerName="extract-content" Nov 27 16:50:56 crc kubenswrapper[4707]: E1127 16:50:56.954660 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b281d412-34bb-4169-8c35-65318084ab97" containerName="openstackclient" Nov 27 16:50:56 crc kubenswrapper[4707]: I1127 16:50:56.954719 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="b281d412-34bb-4169-8c35-65318084ab97" containerName="openstackclient" Nov 27 16:50:56 crc kubenswrapper[4707]: I1127 16:50:56.954939 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="b281d412-34bb-4169-8c35-65318084ab97" containerName="openstackclient" Nov 27 16:50:56 crc kubenswrapper[4707]: I1127 16:50:56.956049 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="9837eddb-3276-42b0-a378-2344edcd80d6" containerName="registry-server" Nov 27 16:50:56 crc kubenswrapper[4707]: I1127 16:50:56.956780 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Nov 27 16:50:56 crc kubenswrapper[4707]: I1127 16:50:56.966794 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Nov 27 16:50:56 crc kubenswrapper[4707]: I1127 16:50:56.982440 4707 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="b281d412-34bb-4169-8c35-65318084ab97" podUID="f46a6b5b-2709-4fe9-8db4-cb2df241d728" Nov 27 16:50:57 crc kubenswrapper[4707]: I1127 16:50:57.093842 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/f46a6b5b-2709-4fe9-8db4-cb2df241d728-openstack-config-secret\") pod \"openstackclient\" (UID: \"f46a6b5b-2709-4fe9-8db4-cb2df241d728\") " pod="openstack/openstackclient" Nov 27 16:50:57 crc kubenswrapper[4707]: I1127 16:50:57.093896 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/f46a6b5b-2709-4fe9-8db4-cb2df241d728-openstack-config\") pod \"openstackclient\" (UID: \"f46a6b5b-2709-4fe9-8db4-cb2df241d728\") " pod="openstack/openstackclient" Nov 27 16:50:57 crc kubenswrapper[4707]: I1127 16:50:57.093925 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s2c7r\" (UniqueName: \"kubernetes.io/projected/f46a6b5b-2709-4fe9-8db4-cb2df241d728-kube-api-access-s2c7r\") pod \"openstackclient\" (UID: \"f46a6b5b-2709-4fe9-8db4-cb2df241d728\") " pod="openstack/openstackclient" Nov 27 16:50:57 crc kubenswrapper[4707]: I1127 16:50:57.093959 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f46a6b5b-2709-4fe9-8db4-cb2df241d728-combined-ca-bundle\") pod \"openstackclient\" (UID: 
\"f46a6b5b-2709-4fe9-8db4-cb2df241d728\") " pod="openstack/openstackclient" Nov 27 16:50:57 crc kubenswrapper[4707]: I1127 16:50:57.196014 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/f46a6b5b-2709-4fe9-8db4-cb2df241d728-openstack-config-secret\") pod \"openstackclient\" (UID: \"f46a6b5b-2709-4fe9-8db4-cb2df241d728\") " pod="openstack/openstackclient" Nov 27 16:50:57 crc kubenswrapper[4707]: I1127 16:50:57.196066 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/f46a6b5b-2709-4fe9-8db4-cb2df241d728-openstack-config\") pod \"openstackclient\" (UID: \"f46a6b5b-2709-4fe9-8db4-cb2df241d728\") " pod="openstack/openstackclient" Nov 27 16:50:57 crc kubenswrapper[4707]: I1127 16:50:57.196094 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2c7r\" (UniqueName: \"kubernetes.io/projected/f46a6b5b-2709-4fe9-8db4-cb2df241d728-kube-api-access-s2c7r\") pod \"openstackclient\" (UID: \"f46a6b5b-2709-4fe9-8db4-cb2df241d728\") " pod="openstack/openstackclient" Nov 27 16:50:57 crc kubenswrapper[4707]: I1127 16:50:57.196138 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f46a6b5b-2709-4fe9-8db4-cb2df241d728-combined-ca-bundle\") pod \"openstackclient\" (UID: \"f46a6b5b-2709-4fe9-8db4-cb2df241d728\") " pod="openstack/openstackclient" Nov 27 16:50:57 crc kubenswrapper[4707]: I1127 16:50:57.198542 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/f46a6b5b-2709-4fe9-8db4-cb2df241d728-openstack-config\") pod \"openstackclient\" (UID: \"f46a6b5b-2709-4fe9-8db4-cb2df241d728\") " pod="openstack/openstackclient" Nov 27 16:50:57 crc kubenswrapper[4707]: I1127 16:50:57.204587 4707 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/f46a6b5b-2709-4fe9-8db4-cb2df241d728-openstack-config-secret\") pod \"openstackclient\" (UID: \"f46a6b5b-2709-4fe9-8db4-cb2df241d728\") " pod="openstack/openstackclient" Nov 27 16:50:57 crc kubenswrapper[4707]: I1127 16:50:57.204713 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f46a6b5b-2709-4fe9-8db4-cb2df241d728-combined-ca-bundle\") pod \"openstackclient\" (UID: \"f46a6b5b-2709-4fe9-8db4-cb2df241d728\") " pod="openstack/openstackclient" Nov 27 16:50:57 crc kubenswrapper[4707]: I1127 16:50:57.219717 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2c7r\" (UniqueName: \"kubernetes.io/projected/f46a6b5b-2709-4fe9-8db4-cb2df241d728-kube-api-access-s2c7r\") pod \"openstackclient\" (UID: \"f46a6b5b-2709-4fe9-8db4-cb2df241d728\") " pod="openstack/openstackclient" Nov 27 16:50:57 crc kubenswrapper[4707]: I1127 16:50:57.287043 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Nov 27 16:50:57 crc kubenswrapper[4707]: I1127 16:50:57.830908 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Nov 27 16:50:58 crc kubenswrapper[4707]: I1127 16:50:58.159694 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/aodh-db-create-wdw97"] Nov 27 16:50:58 crc kubenswrapper[4707]: I1127 16:50:58.161260 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-db-create-wdw97" Nov 27 16:50:58 crc kubenswrapper[4707]: I1127 16:50:58.186976 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-db-create-wdw97"] Nov 27 16:50:58 crc kubenswrapper[4707]: I1127 16:50:58.265546 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/aodh-e24b-account-create-update-8kjqj"] Nov 27 16:50:58 crc kubenswrapper[4707]: I1127 16:50:58.266782 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-e24b-account-create-update-8kjqj" Nov 27 16:50:58 crc kubenswrapper[4707]: I1127 16:50:58.268743 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-db-secret" Nov 27 16:50:58 crc kubenswrapper[4707]: I1127 16:50:58.276202 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-e24b-account-create-update-8kjqj"] Nov 27 16:50:58 crc kubenswrapper[4707]: I1127 16:50:58.314041 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r9krv\" (UniqueName: \"kubernetes.io/projected/4c7c7c12-1162-4e24-a569-46f941361752-kube-api-access-r9krv\") pod \"aodh-db-create-wdw97\" (UID: \"4c7c7c12-1162-4e24-a569-46f941361752\") " pod="openstack/aodh-db-create-wdw97" Nov 27 16:50:58 crc kubenswrapper[4707]: I1127 16:50:58.314410 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4c7c7c12-1162-4e24-a569-46f941361752-operator-scripts\") pod \"aodh-db-create-wdw97\" (UID: \"4c7c7c12-1162-4e24-a569-46f941361752\") " pod="openstack/aodh-db-create-wdw97" Nov 27 16:50:58 crc kubenswrapper[4707]: I1127 16:50:58.417737 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4c7c7c12-1162-4e24-a569-46f941361752-operator-scripts\") pod 
\"aodh-db-create-wdw97\" (UID: \"4c7c7c12-1162-4e24-a569-46f941361752\") " pod="openstack/aodh-db-create-wdw97" Nov 27 16:50:58 crc kubenswrapper[4707]: I1127 16:50:58.417852 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r9krv\" (UniqueName: \"kubernetes.io/projected/4c7c7c12-1162-4e24-a569-46f941361752-kube-api-access-r9krv\") pod \"aodh-db-create-wdw97\" (UID: \"4c7c7c12-1162-4e24-a569-46f941361752\") " pod="openstack/aodh-db-create-wdw97" Nov 27 16:50:58 crc kubenswrapper[4707]: I1127 16:50:58.417911 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4ff61a68-9ac0-4fe7-8290-5346e39a3e3c-operator-scripts\") pod \"aodh-e24b-account-create-update-8kjqj\" (UID: \"4ff61a68-9ac0-4fe7-8290-5346e39a3e3c\") " pod="openstack/aodh-e24b-account-create-update-8kjqj" Nov 27 16:50:58 crc kubenswrapper[4707]: I1127 16:50:58.418059 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7f75t\" (UniqueName: \"kubernetes.io/projected/4ff61a68-9ac0-4fe7-8290-5346e39a3e3c-kube-api-access-7f75t\") pod \"aodh-e24b-account-create-update-8kjqj\" (UID: \"4ff61a68-9ac0-4fe7-8290-5346e39a3e3c\") " pod="openstack/aodh-e24b-account-create-update-8kjqj" Nov 27 16:50:58 crc kubenswrapper[4707]: I1127 16:50:58.418910 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4c7c7c12-1162-4e24-a569-46f941361752-operator-scripts\") pod \"aodh-db-create-wdw97\" (UID: \"4c7c7c12-1162-4e24-a569-46f941361752\") " pod="openstack/aodh-db-create-wdw97" Nov 27 16:50:58 crc kubenswrapper[4707]: I1127 16:50:58.437786 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r9krv\" (UniqueName: 
\"kubernetes.io/projected/4c7c7c12-1162-4e24-a569-46f941361752-kube-api-access-r9krv\") pod \"aodh-db-create-wdw97\" (UID: \"4c7c7c12-1162-4e24-a569-46f941361752\") " pod="openstack/aodh-db-create-wdw97" Nov 27 16:50:58 crc kubenswrapper[4707]: I1127 16:50:58.485671 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-db-create-wdw97" Nov 27 16:50:58 crc kubenswrapper[4707]: I1127 16:50:58.519996 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7f75t\" (UniqueName: \"kubernetes.io/projected/4ff61a68-9ac0-4fe7-8290-5346e39a3e3c-kube-api-access-7f75t\") pod \"aodh-e24b-account-create-update-8kjqj\" (UID: \"4ff61a68-9ac0-4fe7-8290-5346e39a3e3c\") " pod="openstack/aodh-e24b-account-create-update-8kjqj" Nov 27 16:50:58 crc kubenswrapper[4707]: I1127 16:50:58.520151 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4ff61a68-9ac0-4fe7-8290-5346e39a3e3c-operator-scripts\") pod \"aodh-e24b-account-create-update-8kjqj\" (UID: \"4ff61a68-9ac0-4fe7-8290-5346e39a3e3c\") " pod="openstack/aodh-e24b-account-create-update-8kjqj" Nov 27 16:50:58 crc kubenswrapper[4707]: I1127 16:50:58.521389 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4ff61a68-9ac0-4fe7-8290-5346e39a3e3c-operator-scripts\") pod \"aodh-e24b-account-create-update-8kjqj\" (UID: \"4ff61a68-9ac0-4fe7-8290-5346e39a3e3c\") " pod="openstack/aodh-e24b-account-create-update-8kjqj" Nov 27 16:50:58 crc kubenswrapper[4707]: I1127 16:50:58.548198 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7f75t\" (UniqueName: \"kubernetes.io/projected/4ff61a68-9ac0-4fe7-8290-5346e39a3e3c-kube-api-access-7f75t\") pod \"aodh-e24b-account-create-update-8kjqj\" (UID: \"4ff61a68-9ac0-4fe7-8290-5346e39a3e3c\") " 
pod="openstack/aodh-e24b-account-create-update-8kjqj" Nov 27 16:50:58 crc kubenswrapper[4707]: I1127 16:50:58.587026 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-e24b-account-create-update-8kjqj" Nov 27 16:50:58 crc kubenswrapper[4707]: I1127 16:50:58.593801 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"f46a6b5b-2709-4fe9-8db4-cb2df241d728","Type":"ContainerStarted","Data":"a3f01a0e3d9f66af16abea35fc61876f181ebf7ed29257bc884b9eb8280bf954"} Nov 27 16:50:58 crc kubenswrapper[4707]: I1127 16:50:58.593847 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"f46a6b5b-2709-4fe9-8db4-cb2df241d728","Type":"ContainerStarted","Data":"a7db41c3c1a29a9f0c91c6e2bdb8009d78427b79ad48eca5110c24745e05d9d3"} Nov 27 16:50:58 crc kubenswrapper[4707]: I1127 16:50:58.624519 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=2.624502678 podStartE2EDuration="2.624502678s" podCreationTimestamp="2025-11-27 16:50:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-27 16:50:58.611039018 +0000 UTC m=+2834.242487806" watchObservedRunningTime="2025-11-27 16:50:58.624502678 +0000 UTC m=+2834.255951446" Nov 27 16:50:58 crc kubenswrapper[4707]: I1127 16:50:58.864766 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-e24b-account-create-update-8kjqj"] Nov 27 16:50:58 crc kubenswrapper[4707]: W1127 16:50:58.869253 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4ff61a68_9ac0_4fe7_8290_5346e39a3e3c.slice/crio-9ea92ab4b90df70436d7387c6db3bc7c1e05e6ef54a5dfc645799c3aaab6e00f WatchSource:0}: Error finding container 9ea92ab4b90df70436d7387c6db3bc7c1e05e6ef54a5dfc645799c3aaab6e00f: 
Status 404 returned error can't find the container with id 9ea92ab4b90df70436d7387c6db3bc7c1e05e6ef54a5dfc645799c3aaab6e00f Nov 27 16:50:58 crc kubenswrapper[4707]: I1127 16:50:58.946486 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-db-create-wdw97"] Nov 27 16:50:59 crc kubenswrapper[4707]: W1127 16:50:59.028614 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4c7c7c12_1162_4e24_a569_46f941361752.slice/crio-99252eeb35e30ccada5b101ca056af647036a6e8a16daabf418c787c28db5806 WatchSource:0}: Error finding container 99252eeb35e30ccada5b101ca056af647036a6e8a16daabf418c787c28db5806: Status 404 returned error can't find the container with id 99252eeb35e30ccada5b101ca056af647036a6e8a16daabf418c787c28db5806 Nov 27 16:50:59 crc kubenswrapper[4707]: I1127 16:50:59.179331 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Nov 27 16:50:59 crc kubenswrapper[4707]: I1127 16:50:59.338228 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/b281d412-34bb-4169-8c35-65318084ab97-openstack-config-secret\") pod \"b281d412-34bb-4169-8c35-65318084ab97\" (UID: \"b281d412-34bb-4169-8c35-65318084ab97\") " Nov 27 16:50:59 crc kubenswrapper[4707]: I1127 16:50:59.338474 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b281d412-34bb-4169-8c35-65318084ab97-combined-ca-bundle\") pod \"b281d412-34bb-4169-8c35-65318084ab97\" (UID: \"b281d412-34bb-4169-8c35-65318084ab97\") " Nov 27 16:50:59 crc kubenswrapper[4707]: I1127 16:50:59.338585 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gvr5w\" (UniqueName: \"kubernetes.io/projected/b281d412-34bb-4169-8c35-65318084ab97-kube-api-access-gvr5w\") 
pod \"b281d412-34bb-4169-8c35-65318084ab97\" (UID: \"b281d412-34bb-4169-8c35-65318084ab97\") " Nov 27 16:50:59 crc kubenswrapper[4707]: I1127 16:50:59.338612 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/b281d412-34bb-4169-8c35-65318084ab97-openstack-config\") pod \"b281d412-34bb-4169-8c35-65318084ab97\" (UID: \"b281d412-34bb-4169-8c35-65318084ab97\") " Nov 27 16:50:59 crc kubenswrapper[4707]: I1127 16:50:59.349766 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b281d412-34bb-4169-8c35-65318084ab97-kube-api-access-gvr5w" (OuterVolumeSpecName: "kube-api-access-gvr5w") pod "b281d412-34bb-4169-8c35-65318084ab97" (UID: "b281d412-34bb-4169-8c35-65318084ab97"). InnerVolumeSpecName "kube-api-access-gvr5w". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 16:50:59 crc kubenswrapper[4707]: I1127 16:50:59.370571 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b281d412-34bb-4169-8c35-65318084ab97-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "b281d412-34bb-4169-8c35-65318084ab97" (UID: "b281d412-34bb-4169-8c35-65318084ab97"). InnerVolumeSpecName "openstack-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 16:50:59 crc kubenswrapper[4707]: I1127 16:50:59.392812 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b281d412-34bb-4169-8c35-65318084ab97-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b281d412-34bb-4169-8c35-65318084ab97" (UID: "b281d412-34bb-4169-8c35-65318084ab97"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 16:50:59 crc kubenswrapper[4707]: I1127 16:50:59.432575 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b281d412-34bb-4169-8c35-65318084ab97-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "b281d412-34bb-4169-8c35-65318084ab97" (UID: "b281d412-34bb-4169-8c35-65318084ab97"). InnerVolumeSpecName "openstack-config-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 16:50:59 crc kubenswrapper[4707]: I1127 16:50:59.441872 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b281d412-34bb-4169-8c35-65318084ab97-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 27 16:50:59 crc kubenswrapper[4707]: I1127 16:50:59.441908 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gvr5w\" (UniqueName: \"kubernetes.io/projected/b281d412-34bb-4169-8c35-65318084ab97-kube-api-access-gvr5w\") on node \"crc\" DevicePath \"\"" Nov 27 16:50:59 crc kubenswrapper[4707]: I1127 16:50:59.441924 4707 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/b281d412-34bb-4169-8c35-65318084ab97-openstack-config\") on node \"crc\" DevicePath \"\"" Nov 27 16:50:59 crc kubenswrapper[4707]: I1127 16:50:59.441935 4707 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/b281d412-34bb-4169-8c35-65318084ab97-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Nov 27 16:50:59 crc kubenswrapper[4707]: I1127 16:50:59.605510 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-e24b-account-create-update-8kjqj" event={"ID":"4ff61a68-9ac0-4fe7-8290-5346e39a3e3c","Type":"ContainerStarted","Data":"0109580f59af85211a67e2385a53d07d7f03e717125345c1a3e72843603fe0ac"} Nov 27 16:50:59 crc kubenswrapper[4707]: I1127 
16:50:59.605573 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-e24b-account-create-update-8kjqj" event={"ID":"4ff61a68-9ac0-4fe7-8290-5346e39a3e3c","Type":"ContainerStarted","Data":"9ea92ab4b90df70436d7387c6db3bc7c1e05e6ef54a5dfc645799c3aaab6e00f"} Nov 27 16:50:59 crc kubenswrapper[4707]: I1127 16:50:59.608640 4707 generic.go:334] "Generic (PLEG): container finished" podID="b281d412-34bb-4169-8c35-65318084ab97" containerID="0c2c4dbbd082a4f51e11901ae546de62c1a591d4c9a34a65db9dd0fef2ec8082" exitCode=137 Nov 27 16:50:59 crc kubenswrapper[4707]: I1127 16:50:59.608698 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Nov 27 16:50:59 crc kubenswrapper[4707]: I1127 16:50:59.608720 4707 scope.go:117] "RemoveContainer" containerID="0c2c4dbbd082a4f51e11901ae546de62c1a591d4c9a34a65db9dd0fef2ec8082" Nov 27 16:50:59 crc kubenswrapper[4707]: I1127 16:50:59.611222 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-create-wdw97" event={"ID":"4c7c7c12-1162-4e24-a569-46f941361752","Type":"ContainerStarted","Data":"11a7cbd23cb81f2810b908b1e1022edd78a9e3202bb1c260d61c6ad8248edc16"} Nov 27 16:50:59 crc kubenswrapper[4707]: I1127 16:50:59.611289 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-create-wdw97" event={"ID":"4c7c7c12-1162-4e24-a569-46f941361752","Type":"ContainerStarted","Data":"99252eeb35e30ccada5b101ca056af647036a6e8a16daabf418c787c28db5806"} Nov 27 16:50:59 crc kubenswrapper[4707]: I1127 16:50:59.634622 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/aodh-e24b-account-create-update-8kjqj" podStartSLOduration=1.63460544 podStartE2EDuration="1.63460544s" podCreationTimestamp="2025-11-27 16:50:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-27 16:50:59.621021517 +0000 UTC m=+2835.252470305" 
watchObservedRunningTime="2025-11-27 16:50:59.63460544 +0000 UTC m=+2835.266054208" Nov 27 16:50:59 crc kubenswrapper[4707]: I1127 16:50:59.639266 4707 scope.go:117] "RemoveContainer" containerID="0c2c4dbbd082a4f51e11901ae546de62c1a591d4c9a34a65db9dd0fef2ec8082" Nov 27 16:50:59 crc kubenswrapper[4707]: E1127 16:50:59.639938 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0c2c4dbbd082a4f51e11901ae546de62c1a591d4c9a34a65db9dd0fef2ec8082\": container with ID starting with 0c2c4dbbd082a4f51e11901ae546de62c1a591d4c9a34a65db9dd0fef2ec8082 not found: ID does not exist" containerID="0c2c4dbbd082a4f51e11901ae546de62c1a591d4c9a34a65db9dd0fef2ec8082" Nov 27 16:50:59 crc kubenswrapper[4707]: I1127 16:50:59.639965 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0c2c4dbbd082a4f51e11901ae546de62c1a591d4c9a34a65db9dd0fef2ec8082"} err="failed to get container status \"0c2c4dbbd082a4f51e11901ae546de62c1a591d4c9a34a65db9dd0fef2ec8082\": rpc error: code = NotFound desc = could not find container \"0c2c4dbbd082a4f51e11901ae546de62c1a591d4c9a34a65db9dd0fef2ec8082\": container with ID starting with 0c2c4dbbd082a4f51e11901ae546de62c1a591d4c9a34a65db9dd0fef2ec8082 not found: ID does not exist" Nov 27 16:50:59 crc kubenswrapper[4707]: I1127 16:50:59.644503 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/aodh-db-create-wdw97" podStartSLOduration=1.644489393 podStartE2EDuration="1.644489393s" podCreationTimestamp="2025-11-27 16:50:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-27 16:50:59.638510316 +0000 UTC m=+2835.269959084" watchObservedRunningTime="2025-11-27 16:50:59.644489393 +0000 UTC m=+2835.275938151" Nov 27 16:51:00 crc kubenswrapper[4707]: I1127 16:51:00.620905 4707 generic.go:334] "Generic (PLEG): container 
finished" podID="4c7c7c12-1162-4e24-a569-46f941361752" containerID="11a7cbd23cb81f2810b908b1e1022edd78a9e3202bb1c260d61c6ad8248edc16" exitCode=0 Nov 27 16:51:00 crc kubenswrapper[4707]: I1127 16:51:00.621013 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-create-wdw97" event={"ID":"4c7c7c12-1162-4e24-a569-46f941361752","Type":"ContainerDied","Data":"11a7cbd23cb81f2810b908b1e1022edd78a9e3202bb1c260d61c6ad8248edc16"} Nov 27 16:51:00 crc kubenswrapper[4707]: I1127 16:51:00.631012 4707 generic.go:334] "Generic (PLEG): container finished" podID="4ff61a68-9ac0-4fe7-8290-5346e39a3e3c" containerID="0109580f59af85211a67e2385a53d07d7f03e717125345c1a3e72843603fe0ac" exitCode=0 Nov 27 16:51:00 crc kubenswrapper[4707]: I1127 16:51:00.631135 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-e24b-account-create-update-8kjqj" event={"ID":"4ff61a68-9ac0-4fe7-8290-5346e39a3e3c","Type":"ContainerDied","Data":"0109580f59af85211a67e2385a53d07d7f03e717125345c1a3e72843603fe0ac"} Nov 27 16:51:01 crc kubenswrapper[4707]: I1127 16:51:01.208934 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b281d412-34bb-4169-8c35-65318084ab97" path="/var/lib/kubelet/pods/b281d412-34bb-4169-8c35-65318084ab97/volumes" Nov 27 16:51:02 crc kubenswrapper[4707]: I1127 16:51:02.130869 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-db-create-wdw97" Nov 27 16:51:02 crc kubenswrapper[4707]: I1127 16:51:02.139073 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-e24b-account-create-update-8kjqj" Nov 27 16:51:02 crc kubenswrapper[4707]: I1127 16:51:02.295349 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4ff61a68-9ac0-4fe7-8290-5346e39a3e3c-operator-scripts\") pod \"4ff61a68-9ac0-4fe7-8290-5346e39a3e3c\" (UID: \"4ff61a68-9ac0-4fe7-8290-5346e39a3e3c\") " Nov 27 16:51:02 crc kubenswrapper[4707]: I1127 16:51:02.295821 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7f75t\" (UniqueName: \"kubernetes.io/projected/4ff61a68-9ac0-4fe7-8290-5346e39a3e3c-kube-api-access-7f75t\") pod \"4ff61a68-9ac0-4fe7-8290-5346e39a3e3c\" (UID: \"4ff61a68-9ac0-4fe7-8290-5346e39a3e3c\") " Nov 27 16:51:02 crc kubenswrapper[4707]: I1127 16:51:02.295998 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4c7c7c12-1162-4e24-a569-46f941361752-operator-scripts\") pod \"4c7c7c12-1162-4e24-a569-46f941361752\" (UID: \"4c7c7c12-1162-4e24-a569-46f941361752\") " Nov 27 16:51:02 crc kubenswrapper[4707]: I1127 16:51:02.296252 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r9krv\" (UniqueName: \"kubernetes.io/projected/4c7c7c12-1162-4e24-a569-46f941361752-kube-api-access-r9krv\") pod \"4c7c7c12-1162-4e24-a569-46f941361752\" (UID: \"4c7c7c12-1162-4e24-a569-46f941361752\") " Nov 27 16:51:02 crc kubenswrapper[4707]: I1127 16:51:02.296505 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4ff61a68-9ac0-4fe7-8290-5346e39a3e3c-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "4ff61a68-9ac0-4fe7-8290-5346e39a3e3c" (UID: "4ff61a68-9ac0-4fe7-8290-5346e39a3e3c"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 16:51:02 crc kubenswrapper[4707]: I1127 16:51:02.296650 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4c7c7c12-1162-4e24-a569-46f941361752-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "4c7c7c12-1162-4e24-a569-46f941361752" (UID: "4c7c7c12-1162-4e24-a569-46f941361752"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 16:51:02 crc kubenswrapper[4707]: I1127 16:51:02.297794 4707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4c7c7c12-1162-4e24-a569-46f941361752-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 27 16:51:02 crc kubenswrapper[4707]: I1127 16:51:02.297831 4707 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4ff61a68-9ac0-4fe7-8290-5346e39a3e3c-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 27 16:51:02 crc kubenswrapper[4707]: I1127 16:51:02.301106 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4c7c7c12-1162-4e24-a569-46f941361752-kube-api-access-r9krv" (OuterVolumeSpecName: "kube-api-access-r9krv") pod "4c7c7c12-1162-4e24-a569-46f941361752" (UID: "4c7c7c12-1162-4e24-a569-46f941361752"). InnerVolumeSpecName "kube-api-access-r9krv". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 16:51:02 crc kubenswrapper[4707]: I1127 16:51:02.302694 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4ff61a68-9ac0-4fe7-8290-5346e39a3e3c-kube-api-access-7f75t" (OuterVolumeSpecName: "kube-api-access-7f75t") pod "4ff61a68-9ac0-4fe7-8290-5346e39a3e3c" (UID: "4ff61a68-9ac0-4fe7-8290-5346e39a3e3c"). InnerVolumeSpecName "kube-api-access-7f75t". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 16:51:02 crc kubenswrapper[4707]: I1127 16:51:02.399272 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7f75t\" (UniqueName: \"kubernetes.io/projected/4ff61a68-9ac0-4fe7-8290-5346e39a3e3c-kube-api-access-7f75t\") on node \"crc\" DevicePath \"\"" Nov 27 16:51:02 crc kubenswrapper[4707]: I1127 16:51:02.399327 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r9krv\" (UniqueName: \"kubernetes.io/projected/4c7c7c12-1162-4e24-a569-46f941361752-kube-api-access-r9krv\") on node \"crc\" DevicePath \"\"" Nov 27 16:51:02 crc kubenswrapper[4707]: I1127 16:51:02.660897 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-create-wdw97" event={"ID":"4c7c7c12-1162-4e24-a569-46f941361752","Type":"ContainerDied","Data":"99252eeb35e30ccada5b101ca056af647036a6e8a16daabf418c787c28db5806"} Nov 27 16:51:02 crc kubenswrapper[4707]: I1127 16:51:02.660959 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="99252eeb35e30ccada5b101ca056af647036a6e8a16daabf418c787c28db5806" Nov 27 16:51:02 crc kubenswrapper[4707]: I1127 16:51:02.661056 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-db-create-wdw97" Nov 27 16:51:02 crc kubenswrapper[4707]: I1127 16:51:02.665561 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-e24b-account-create-update-8kjqj" event={"ID":"4ff61a68-9ac0-4fe7-8290-5346e39a3e3c","Type":"ContainerDied","Data":"9ea92ab4b90df70436d7387c6db3bc7c1e05e6ef54a5dfc645799c3aaab6e00f"} Nov 27 16:51:02 crc kubenswrapper[4707]: I1127 16:51:02.665837 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9ea92ab4b90df70436d7387c6db3bc7c1e05e6ef54a5dfc645799c3aaab6e00f" Nov 27 16:51:02 crc kubenswrapper[4707]: I1127 16:51:02.665635 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-e24b-account-create-update-8kjqj" Nov 27 16:51:03 crc kubenswrapper[4707]: I1127 16:51:03.505872 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/aodh-db-sync-b5jww"] Nov 27 16:51:03 crc kubenswrapper[4707]: E1127 16:51:03.506531 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4ff61a68-9ac0-4fe7-8290-5346e39a3e3c" containerName="mariadb-account-create-update" Nov 27 16:51:03 crc kubenswrapper[4707]: I1127 16:51:03.506543 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="4ff61a68-9ac0-4fe7-8290-5346e39a3e3c" containerName="mariadb-account-create-update" Nov 27 16:51:03 crc kubenswrapper[4707]: E1127 16:51:03.506556 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4c7c7c12-1162-4e24-a569-46f941361752" containerName="mariadb-database-create" Nov 27 16:51:03 crc kubenswrapper[4707]: I1127 16:51:03.506562 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="4c7c7c12-1162-4e24-a569-46f941361752" containerName="mariadb-database-create" Nov 27 16:51:03 crc kubenswrapper[4707]: I1127 16:51:03.506760 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="4c7c7c12-1162-4e24-a569-46f941361752" containerName="mariadb-database-create" Nov 27 16:51:03 crc kubenswrapper[4707]: I1127 16:51:03.506774 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="4ff61a68-9ac0-4fe7-8290-5346e39a3e3c" containerName="mariadb-account-create-update" Nov 27 16:51:03 crc kubenswrapper[4707]: I1127 16:51:03.507357 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-db-sync-b5jww" Nov 27 16:51:03 crc kubenswrapper[4707]: I1127 16:51:03.509587 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Nov 27 16:51:03 crc kubenswrapper[4707]: I1127 16:51:03.510246 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-scripts" Nov 27 16:51:03 crc kubenswrapper[4707]: I1127 16:51:03.510458 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-config-data" Nov 27 16:51:03 crc kubenswrapper[4707]: I1127 16:51:03.510710 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-autoscaling-dockercfg-pfvsr" Nov 27 16:51:03 crc kubenswrapper[4707]: I1127 16:51:03.522280 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-db-sync-b5jww"] Nov 27 16:51:03 crc kubenswrapper[4707]: I1127 16:51:03.539444 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/838f87e6-51f0-4745-98b6-89ecd7255df4-combined-ca-bundle\") pod \"aodh-db-sync-b5jww\" (UID: \"838f87e6-51f0-4745-98b6-89ecd7255df4\") " pod="openstack/aodh-db-sync-b5jww" Nov 27 16:51:03 crc kubenswrapper[4707]: I1127 16:51:03.539534 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/838f87e6-51f0-4745-98b6-89ecd7255df4-config-data\") pod \"aodh-db-sync-b5jww\" (UID: \"838f87e6-51f0-4745-98b6-89ecd7255df4\") " pod="openstack/aodh-db-sync-b5jww" Nov 27 16:51:03 crc kubenswrapper[4707]: I1127 16:51:03.539564 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9vc2w\" (UniqueName: \"kubernetes.io/projected/838f87e6-51f0-4745-98b6-89ecd7255df4-kube-api-access-9vc2w\") pod \"aodh-db-sync-b5jww\" (UID: 
\"838f87e6-51f0-4745-98b6-89ecd7255df4\") " pod="openstack/aodh-db-sync-b5jww" Nov 27 16:51:03 crc kubenswrapper[4707]: I1127 16:51:03.539613 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/838f87e6-51f0-4745-98b6-89ecd7255df4-scripts\") pod \"aodh-db-sync-b5jww\" (UID: \"838f87e6-51f0-4745-98b6-89ecd7255df4\") " pod="openstack/aodh-db-sync-b5jww" Nov 27 16:51:03 crc kubenswrapper[4707]: I1127 16:51:03.640834 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/838f87e6-51f0-4745-98b6-89ecd7255df4-scripts\") pod \"aodh-db-sync-b5jww\" (UID: \"838f87e6-51f0-4745-98b6-89ecd7255df4\") " pod="openstack/aodh-db-sync-b5jww" Nov 27 16:51:03 crc kubenswrapper[4707]: I1127 16:51:03.640947 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/838f87e6-51f0-4745-98b6-89ecd7255df4-combined-ca-bundle\") pod \"aodh-db-sync-b5jww\" (UID: \"838f87e6-51f0-4745-98b6-89ecd7255df4\") " pod="openstack/aodh-db-sync-b5jww" Nov 27 16:51:03 crc kubenswrapper[4707]: I1127 16:51:03.641029 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/838f87e6-51f0-4745-98b6-89ecd7255df4-config-data\") pod \"aodh-db-sync-b5jww\" (UID: \"838f87e6-51f0-4745-98b6-89ecd7255df4\") " pod="openstack/aodh-db-sync-b5jww" Nov 27 16:51:03 crc kubenswrapper[4707]: I1127 16:51:03.641054 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9vc2w\" (UniqueName: \"kubernetes.io/projected/838f87e6-51f0-4745-98b6-89ecd7255df4-kube-api-access-9vc2w\") pod \"aodh-db-sync-b5jww\" (UID: \"838f87e6-51f0-4745-98b6-89ecd7255df4\") " pod="openstack/aodh-db-sync-b5jww" Nov 27 16:51:03 crc kubenswrapper[4707]: I1127 16:51:03.647300 4707 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/838f87e6-51f0-4745-98b6-89ecd7255df4-scripts\") pod \"aodh-db-sync-b5jww\" (UID: \"838f87e6-51f0-4745-98b6-89ecd7255df4\") " pod="openstack/aodh-db-sync-b5jww" Nov 27 16:51:03 crc kubenswrapper[4707]: I1127 16:51:03.647443 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/838f87e6-51f0-4745-98b6-89ecd7255df4-combined-ca-bundle\") pod \"aodh-db-sync-b5jww\" (UID: \"838f87e6-51f0-4745-98b6-89ecd7255df4\") " pod="openstack/aodh-db-sync-b5jww" Nov 27 16:51:03 crc kubenswrapper[4707]: I1127 16:51:03.647664 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/838f87e6-51f0-4745-98b6-89ecd7255df4-config-data\") pod \"aodh-db-sync-b5jww\" (UID: \"838f87e6-51f0-4745-98b6-89ecd7255df4\") " pod="openstack/aodh-db-sync-b5jww" Nov 27 16:51:03 crc kubenswrapper[4707]: I1127 16:51:03.657443 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9vc2w\" (UniqueName: \"kubernetes.io/projected/838f87e6-51f0-4745-98b6-89ecd7255df4-kube-api-access-9vc2w\") pod \"aodh-db-sync-b5jww\" (UID: \"838f87e6-51f0-4745-98b6-89ecd7255df4\") " pod="openstack/aodh-db-sync-b5jww" Nov 27 16:51:03 crc kubenswrapper[4707]: I1127 16:51:03.823719 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-db-sync-b5jww" Nov 27 16:51:04 crc kubenswrapper[4707]: I1127 16:51:04.297540 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-db-sync-b5jww"] Nov 27 16:51:04 crc kubenswrapper[4707]: I1127 16:51:04.683721 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-sync-b5jww" event={"ID":"838f87e6-51f0-4745-98b6-89ecd7255df4","Type":"ContainerStarted","Data":"8adef4fee116748aa85d331c705e58f7b74b88e92cfa0e7d50ddadc6d377a1f9"} Nov 27 16:51:14 crc kubenswrapper[4707]: I1127 16:51:14.798432 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-sync-b5jww" event={"ID":"838f87e6-51f0-4745-98b6-89ecd7255df4","Type":"ContainerStarted","Data":"1a4d6a4fc61e4ce8c025eb63ad9d6f049ed60531a3fffa24c3753a2eccd8baf7"} Nov 27 16:51:14 crc kubenswrapper[4707]: I1127 16:51:14.843418 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/aodh-db-sync-b5jww" podStartSLOduration=2.336653959 podStartE2EDuration="11.843357027s" podCreationTimestamp="2025-11-27 16:51:03 +0000 UTC" firstStartedPulling="2025-11-27 16:51:04.300477986 +0000 UTC m=+2839.931926754" lastFinishedPulling="2025-11-27 16:51:13.807181014 +0000 UTC m=+2849.438629822" observedRunningTime="2025-11-27 16:51:14.826942734 +0000 UTC m=+2850.458391552" watchObservedRunningTime="2025-11-27 16:51:14.843357027 +0000 UTC m=+2850.474805835" Nov 27 16:51:17 crc kubenswrapper[4707]: I1127 16:51:17.835202 4707 generic.go:334] "Generic (PLEG): container finished" podID="838f87e6-51f0-4745-98b6-89ecd7255df4" containerID="1a4d6a4fc61e4ce8c025eb63ad9d6f049ed60531a3fffa24c3753a2eccd8baf7" exitCode=0 Nov 27 16:51:17 crc kubenswrapper[4707]: I1127 16:51:17.835314 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-sync-b5jww" event={"ID":"838f87e6-51f0-4745-98b6-89ecd7255df4","Type":"ContainerDied","Data":"1a4d6a4fc61e4ce8c025eb63ad9d6f049ed60531a3fffa24c3753a2eccd8baf7"} 
Nov 27 16:51:19 crc kubenswrapper[4707]: I1127 16:51:19.181444 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-db-sync-b5jww" Nov 27 16:51:19 crc kubenswrapper[4707]: I1127 16:51:19.290217 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/838f87e6-51f0-4745-98b6-89ecd7255df4-config-data\") pod \"838f87e6-51f0-4745-98b6-89ecd7255df4\" (UID: \"838f87e6-51f0-4745-98b6-89ecd7255df4\") " Nov 27 16:51:19 crc kubenswrapper[4707]: I1127 16:51:19.290565 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/838f87e6-51f0-4745-98b6-89ecd7255df4-combined-ca-bundle\") pod \"838f87e6-51f0-4745-98b6-89ecd7255df4\" (UID: \"838f87e6-51f0-4745-98b6-89ecd7255df4\") " Nov 27 16:51:19 crc kubenswrapper[4707]: I1127 16:51:19.290680 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/838f87e6-51f0-4745-98b6-89ecd7255df4-scripts\") pod \"838f87e6-51f0-4745-98b6-89ecd7255df4\" (UID: \"838f87e6-51f0-4745-98b6-89ecd7255df4\") " Nov 27 16:51:19 crc kubenswrapper[4707]: I1127 16:51:19.290735 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9vc2w\" (UniqueName: \"kubernetes.io/projected/838f87e6-51f0-4745-98b6-89ecd7255df4-kube-api-access-9vc2w\") pod \"838f87e6-51f0-4745-98b6-89ecd7255df4\" (UID: \"838f87e6-51f0-4745-98b6-89ecd7255df4\") " Nov 27 16:51:19 crc kubenswrapper[4707]: I1127 16:51:19.297486 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/838f87e6-51f0-4745-98b6-89ecd7255df4-scripts" (OuterVolumeSpecName: "scripts") pod "838f87e6-51f0-4745-98b6-89ecd7255df4" (UID: "838f87e6-51f0-4745-98b6-89ecd7255df4"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 16:51:19 crc kubenswrapper[4707]: I1127 16:51:19.297537 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/838f87e6-51f0-4745-98b6-89ecd7255df4-kube-api-access-9vc2w" (OuterVolumeSpecName: "kube-api-access-9vc2w") pod "838f87e6-51f0-4745-98b6-89ecd7255df4" (UID: "838f87e6-51f0-4745-98b6-89ecd7255df4"). InnerVolumeSpecName "kube-api-access-9vc2w". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 16:51:19 crc kubenswrapper[4707]: I1127 16:51:19.319809 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/838f87e6-51f0-4745-98b6-89ecd7255df4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "838f87e6-51f0-4745-98b6-89ecd7255df4" (UID: "838f87e6-51f0-4745-98b6-89ecd7255df4"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 16:51:19 crc kubenswrapper[4707]: I1127 16:51:19.320403 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/838f87e6-51f0-4745-98b6-89ecd7255df4-config-data" (OuterVolumeSpecName: "config-data") pod "838f87e6-51f0-4745-98b6-89ecd7255df4" (UID: "838f87e6-51f0-4745-98b6-89ecd7255df4"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 16:51:19 crc kubenswrapper[4707]: I1127 16:51:19.394228 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/838f87e6-51f0-4745-98b6-89ecd7255df4-scripts\") on node \"crc\" DevicePath \"\"" Nov 27 16:51:19 crc kubenswrapper[4707]: I1127 16:51:19.394255 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9vc2w\" (UniqueName: \"kubernetes.io/projected/838f87e6-51f0-4745-98b6-89ecd7255df4-kube-api-access-9vc2w\") on node \"crc\" DevicePath \"\"" Nov 27 16:51:19 crc kubenswrapper[4707]: I1127 16:51:19.394265 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/838f87e6-51f0-4745-98b6-89ecd7255df4-config-data\") on node \"crc\" DevicePath \"\"" Nov 27 16:51:19 crc kubenswrapper[4707]: I1127 16:51:19.394276 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/838f87e6-51f0-4745-98b6-89ecd7255df4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 27 16:51:19 crc kubenswrapper[4707]: I1127 16:51:19.860453 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-db-sync-b5jww" Nov 27 16:51:19 crc kubenswrapper[4707]: I1127 16:51:19.860360 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-sync-b5jww" event={"ID":"838f87e6-51f0-4745-98b6-89ecd7255df4","Type":"ContainerDied","Data":"8adef4fee116748aa85d331c705e58f7b74b88e92cfa0e7d50ddadc6d377a1f9"} Nov 27 16:51:19 crc kubenswrapper[4707]: I1127 16:51:19.862668 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8adef4fee116748aa85d331c705e58f7b74b88e92cfa0e7d50ddadc6d377a1f9" Nov 27 16:51:23 crc kubenswrapper[4707]: I1127 16:51:23.262875 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/aodh-0"] Nov 27 16:51:23 crc kubenswrapper[4707]: E1127 16:51:23.263767 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="838f87e6-51f0-4745-98b6-89ecd7255df4" containerName="aodh-db-sync" Nov 27 16:51:23 crc kubenswrapper[4707]: I1127 16:51:23.263785 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="838f87e6-51f0-4745-98b6-89ecd7255df4" containerName="aodh-db-sync" Nov 27 16:51:23 crc kubenswrapper[4707]: I1127 16:51:23.264070 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="838f87e6-51f0-4745-98b6-89ecd7255df4" containerName="aodh-db-sync" Nov 27 16:51:23 crc kubenswrapper[4707]: I1127 16:51:23.266196 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-0" Nov 27 16:51:23 crc kubenswrapper[4707]: I1127 16:51:23.268484 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-autoscaling-dockercfg-pfvsr" Nov 27 16:51:23 crc kubenswrapper[4707]: I1127 16:51:23.269344 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-config-data" Nov 27 16:51:23 crc kubenswrapper[4707]: I1127 16:51:23.269366 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-scripts" Nov 27 16:51:23 crc kubenswrapper[4707]: I1127 16:51:23.277466 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-0"] Nov 27 16:51:23 crc kubenswrapper[4707]: I1127 16:51:23.376768 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fnbp5\" (UniqueName: \"kubernetes.io/projected/f5cbab88-2254-4d29-ae0a-d2d571ab8775-kube-api-access-fnbp5\") pod \"aodh-0\" (UID: \"f5cbab88-2254-4d29-ae0a-d2d571ab8775\") " pod="openstack/aodh-0" Nov 27 16:51:23 crc kubenswrapper[4707]: I1127 16:51:23.378770 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f5cbab88-2254-4d29-ae0a-d2d571ab8775-combined-ca-bundle\") pod \"aodh-0\" (UID: \"f5cbab88-2254-4d29-ae0a-d2d571ab8775\") " pod="openstack/aodh-0" Nov 27 16:51:23 crc kubenswrapper[4707]: I1127 16:51:23.379100 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f5cbab88-2254-4d29-ae0a-d2d571ab8775-scripts\") pod \"aodh-0\" (UID: \"f5cbab88-2254-4d29-ae0a-d2d571ab8775\") " pod="openstack/aodh-0" Nov 27 16:51:23 crc kubenswrapper[4707]: I1127 16:51:23.379329 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/f5cbab88-2254-4d29-ae0a-d2d571ab8775-config-data\") pod \"aodh-0\" (UID: \"f5cbab88-2254-4d29-ae0a-d2d571ab8775\") " pod="openstack/aodh-0" Nov 27 16:51:23 crc kubenswrapper[4707]: I1127 16:51:23.488187 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fnbp5\" (UniqueName: \"kubernetes.io/projected/f5cbab88-2254-4d29-ae0a-d2d571ab8775-kube-api-access-fnbp5\") pod \"aodh-0\" (UID: \"f5cbab88-2254-4d29-ae0a-d2d571ab8775\") " pod="openstack/aodh-0" Nov 27 16:51:23 crc kubenswrapper[4707]: I1127 16:51:23.488257 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f5cbab88-2254-4d29-ae0a-d2d571ab8775-combined-ca-bundle\") pod \"aodh-0\" (UID: \"f5cbab88-2254-4d29-ae0a-d2d571ab8775\") " pod="openstack/aodh-0" Nov 27 16:51:23 crc kubenswrapper[4707]: I1127 16:51:23.488331 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f5cbab88-2254-4d29-ae0a-d2d571ab8775-scripts\") pod \"aodh-0\" (UID: \"f5cbab88-2254-4d29-ae0a-d2d571ab8775\") " pod="openstack/aodh-0" Nov 27 16:51:23 crc kubenswrapper[4707]: I1127 16:51:23.488413 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f5cbab88-2254-4d29-ae0a-d2d571ab8775-config-data\") pod \"aodh-0\" (UID: \"f5cbab88-2254-4d29-ae0a-d2d571ab8775\") " pod="openstack/aodh-0" Nov 27 16:51:23 crc kubenswrapper[4707]: I1127 16:51:23.506236 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f5cbab88-2254-4d29-ae0a-d2d571ab8775-combined-ca-bundle\") pod \"aodh-0\" (UID: \"f5cbab88-2254-4d29-ae0a-d2d571ab8775\") " pod="openstack/aodh-0" Nov 27 16:51:23 crc kubenswrapper[4707]: I1127 16:51:23.510221 4707 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f5cbab88-2254-4d29-ae0a-d2d571ab8775-config-data\") pod \"aodh-0\" (UID: \"f5cbab88-2254-4d29-ae0a-d2d571ab8775\") " pod="openstack/aodh-0" Nov 27 16:51:23 crc kubenswrapper[4707]: I1127 16:51:23.516779 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f5cbab88-2254-4d29-ae0a-d2d571ab8775-scripts\") pod \"aodh-0\" (UID: \"f5cbab88-2254-4d29-ae0a-d2d571ab8775\") " pod="openstack/aodh-0" Nov 27 16:51:23 crc kubenswrapper[4707]: I1127 16:51:23.539725 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fnbp5\" (UniqueName: \"kubernetes.io/projected/f5cbab88-2254-4d29-ae0a-d2d571ab8775-kube-api-access-fnbp5\") pod \"aodh-0\" (UID: \"f5cbab88-2254-4d29-ae0a-d2d571ab8775\") " pod="openstack/aodh-0" Nov 27 16:51:23 crc kubenswrapper[4707]: I1127 16:51:23.602893 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-0" Nov 27 16:51:24 crc kubenswrapper[4707]: I1127 16:51:24.064011 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-0"] Nov 27 16:51:24 crc kubenswrapper[4707]: I1127 16:51:24.934345 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"f5cbab88-2254-4d29-ae0a-d2d571ab8775","Type":"ContainerStarted","Data":"0783ce7462a69f16da3b016c363f85366c689fa465dc8176bda2d80c4c4b1efc"} Nov 27 16:51:25 crc kubenswrapper[4707]: I1127 16:51:25.882540 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Nov 27 16:51:25 crc kubenswrapper[4707]: I1127 16:51:25.883774 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="2aeefda1-503b-4869-b318-d9ebfa19337c" containerName="sg-core" containerID="cri-o://e646cb3f50035b3bf7cd4cc72abf9fd43e6a670a8582bc28dcda596f78c004be" gracePeriod=30 Nov 27 16:51:25 crc 
kubenswrapper[4707]: I1127 16:51:25.883768 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="2aeefda1-503b-4869-b318-d9ebfa19337c" containerName="proxy-httpd" containerID="cri-o://9b5acc17c26a2a462eeddcc607f92057f253b00834a450bd007e5b0734d85aaa" gracePeriod=30 Nov 27 16:51:25 crc kubenswrapper[4707]: I1127 16:51:25.884028 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="2aeefda1-503b-4869-b318-d9ebfa19337c" containerName="ceilometer-notification-agent" containerID="cri-o://f2a5ab4e33ddfc181ee61c8113d3f272b54fac6fc72c241dfc474eb8dadd32fd" gracePeriod=30 Nov 27 16:51:25 crc kubenswrapper[4707]: I1127 16:51:25.884493 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="2aeefda1-503b-4869-b318-d9ebfa19337c" containerName="ceilometer-central-agent" containerID="cri-o://77b44c5281ceaa5647f057bece49a5964f448db50aa579d4bc93f2b822ce8154" gracePeriod=30 Nov 27 16:51:25 crc kubenswrapper[4707]: I1127 16:51:25.992022 4707 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="2aeefda1-503b-4869-b318-d9ebfa19337c" containerName="proxy-httpd" probeResult="failure" output="Get \"https://10.217.0.208:3000/\": dial tcp 10.217.0.208:3000: connect: connection refused" Nov 27 16:51:26 crc kubenswrapper[4707]: I1127 16:51:26.133073 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodh-0"] Nov 27 16:51:26 crc kubenswrapper[4707]: I1127 16:51:26.954283 4707 generic.go:334] "Generic (PLEG): container finished" podID="2aeefda1-503b-4869-b318-d9ebfa19337c" containerID="9b5acc17c26a2a462eeddcc607f92057f253b00834a450bd007e5b0734d85aaa" exitCode=0 Nov 27 16:51:26 crc kubenswrapper[4707]: I1127 16:51:26.954600 4707 generic.go:334] "Generic (PLEG): container finished" podID="2aeefda1-503b-4869-b318-d9ebfa19337c" 
containerID="e646cb3f50035b3bf7cd4cc72abf9fd43e6a670a8582bc28dcda596f78c004be" exitCode=2 Nov 27 16:51:26 crc kubenswrapper[4707]: I1127 16:51:26.954613 4707 generic.go:334] "Generic (PLEG): container finished" podID="2aeefda1-503b-4869-b318-d9ebfa19337c" containerID="77b44c5281ceaa5647f057bece49a5964f448db50aa579d4bc93f2b822ce8154" exitCode=0 Nov 27 16:51:26 crc kubenswrapper[4707]: I1127 16:51:26.954336 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2aeefda1-503b-4869-b318-d9ebfa19337c","Type":"ContainerDied","Data":"9b5acc17c26a2a462eeddcc607f92057f253b00834a450bd007e5b0734d85aaa"} Nov 27 16:51:26 crc kubenswrapper[4707]: I1127 16:51:26.954735 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2aeefda1-503b-4869-b318-d9ebfa19337c","Type":"ContainerDied","Data":"e646cb3f50035b3bf7cd4cc72abf9fd43e6a670a8582bc28dcda596f78c004be"} Nov 27 16:51:26 crc kubenswrapper[4707]: I1127 16:51:26.954751 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2aeefda1-503b-4869-b318-d9ebfa19337c","Type":"ContainerDied","Data":"77b44c5281ceaa5647f057bece49a5964f448db50aa579d4bc93f2b822ce8154"} Nov 27 16:51:26 crc kubenswrapper[4707]: I1127 16:51:26.956994 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"f5cbab88-2254-4d29-ae0a-d2d571ab8775","Type":"ContainerStarted","Data":"f3a532bb25490d5ef26ed2e5735d30ae2fb5ff616571fb5be61dbc8d8f25f370"} Nov 27 16:51:29 crc kubenswrapper[4707]: I1127 16:51:29.995418 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"f5cbab88-2254-4d29-ae0a-d2d571ab8775","Type":"ContainerStarted","Data":"d4e8b4f74b5106a5a95ab08fa4749e09c3806c29b650640f4d1907725152311a"} Nov 27 16:51:30 crc kubenswrapper[4707]: I1127 16:51:30.450004 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Nov 27 16:51:30 crc kubenswrapper[4707]: I1127 16:51:30.654321 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2aeefda1-503b-4869-b318-d9ebfa19337c-run-httpd\") pod \"2aeefda1-503b-4869-b318-d9ebfa19337c\" (UID: \"2aeefda1-503b-4869-b318-d9ebfa19337c\") " Nov 27 16:51:30 crc kubenswrapper[4707]: I1127 16:51:30.654362 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2aeefda1-503b-4869-b318-d9ebfa19337c-scripts\") pod \"2aeefda1-503b-4869-b318-d9ebfa19337c\" (UID: \"2aeefda1-503b-4869-b318-d9ebfa19337c\") " Nov 27 16:51:30 crc kubenswrapper[4707]: I1127 16:51:30.654436 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/2aeefda1-503b-4869-b318-d9ebfa19337c-ceilometer-tls-certs\") pod \"2aeefda1-503b-4869-b318-d9ebfa19337c\" (UID: \"2aeefda1-503b-4869-b318-d9ebfa19337c\") " Nov 27 16:51:30 crc kubenswrapper[4707]: I1127 16:51:30.654499 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2aeefda1-503b-4869-b318-d9ebfa19337c-config-data\") pod \"2aeefda1-503b-4869-b318-d9ebfa19337c\" (UID: \"2aeefda1-503b-4869-b318-d9ebfa19337c\") " Nov 27 16:51:30 crc kubenswrapper[4707]: I1127 16:51:30.654522 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2aeefda1-503b-4869-b318-d9ebfa19337c-log-httpd\") pod \"2aeefda1-503b-4869-b318-d9ebfa19337c\" (UID: \"2aeefda1-503b-4869-b318-d9ebfa19337c\") " Nov 27 16:51:30 crc kubenswrapper[4707]: I1127 16:51:30.654673 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/2aeefda1-503b-4869-b318-d9ebfa19337c-combined-ca-bundle\") pod \"2aeefda1-503b-4869-b318-d9ebfa19337c\" (UID: \"2aeefda1-503b-4869-b318-d9ebfa19337c\") " Nov 27 16:51:30 crc kubenswrapper[4707]: I1127 16:51:30.654729 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2aeefda1-503b-4869-b318-d9ebfa19337c-sg-core-conf-yaml\") pod \"2aeefda1-503b-4869-b318-d9ebfa19337c\" (UID: \"2aeefda1-503b-4869-b318-d9ebfa19337c\") " Nov 27 16:51:30 crc kubenswrapper[4707]: I1127 16:51:30.654817 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cbgm5\" (UniqueName: \"kubernetes.io/projected/2aeefda1-503b-4869-b318-d9ebfa19337c-kube-api-access-cbgm5\") pod \"2aeefda1-503b-4869-b318-d9ebfa19337c\" (UID: \"2aeefda1-503b-4869-b318-d9ebfa19337c\") " Nov 27 16:51:30 crc kubenswrapper[4707]: I1127 16:51:30.656028 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2aeefda1-503b-4869-b318-d9ebfa19337c-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "2aeefda1-503b-4869-b318-d9ebfa19337c" (UID: "2aeefda1-503b-4869-b318-d9ebfa19337c"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 27 16:51:30 crc kubenswrapper[4707]: I1127 16:51:30.656646 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2aeefda1-503b-4869-b318-d9ebfa19337c-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "2aeefda1-503b-4869-b318-d9ebfa19337c" (UID: "2aeefda1-503b-4869-b318-d9ebfa19337c"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 27 16:51:30 crc kubenswrapper[4707]: I1127 16:51:30.661641 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2aeefda1-503b-4869-b318-d9ebfa19337c-kube-api-access-cbgm5" (OuterVolumeSpecName: "kube-api-access-cbgm5") pod "2aeefda1-503b-4869-b318-d9ebfa19337c" (UID: "2aeefda1-503b-4869-b318-d9ebfa19337c"). InnerVolumeSpecName "kube-api-access-cbgm5". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 16:51:30 crc kubenswrapper[4707]: I1127 16:51:30.662380 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2aeefda1-503b-4869-b318-d9ebfa19337c-scripts" (OuterVolumeSpecName: "scripts") pod "2aeefda1-503b-4869-b318-d9ebfa19337c" (UID: "2aeefda1-503b-4869-b318-d9ebfa19337c"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 16:51:30 crc kubenswrapper[4707]: I1127 16:51:30.697800 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2aeefda1-503b-4869-b318-d9ebfa19337c-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "2aeefda1-503b-4869-b318-d9ebfa19337c" (UID: "2aeefda1-503b-4869-b318-d9ebfa19337c"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 16:51:30 crc kubenswrapper[4707]: I1127 16:51:30.717623 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2aeefda1-503b-4869-b318-d9ebfa19337c-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "2aeefda1-503b-4869-b318-d9ebfa19337c" (UID: "2aeefda1-503b-4869-b318-d9ebfa19337c"). InnerVolumeSpecName "ceilometer-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 16:51:30 crc kubenswrapper[4707]: I1127 16:51:30.741598 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2aeefda1-503b-4869-b318-d9ebfa19337c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2aeefda1-503b-4869-b318-d9ebfa19337c" (UID: "2aeefda1-503b-4869-b318-d9ebfa19337c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 16:51:30 crc kubenswrapper[4707]: I1127 16:51:30.750570 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2aeefda1-503b-4869-b318-d9ebfa19337c-config-data" (OuterVolumeSpecName: "config-data") pod "2aeefda1-503b-4869-b318-d9ebfa19337c" (UID: "2aeefda1-503b-4869-b318-d9ebfa19337c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 16:51:30 crc kubenswrapper[4707]: I1127 16:51:30.756827 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2aeefda1-503b-4869-b318-d9ebfa19337c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 27 16:51:30 crc kubenswrapper[4707]: I1127 16:51:30.756861 4707 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2aeefda1-503b-4869-b318-d9ebfa19337c-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Nov 27 16:51:30 crc kubenswrapper[4707]: I1127 16:51:30.756872 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cbgm5\" (UniqueName: \"kubernetes.io/projected/2aeefda1-503b-4869-b318-d9ebfa19337c-kube-api-access-cbgm5\") on node \"crc\" DevicePath \"\"" Nov 27 16:51:30 crc kubenswrapper[4707]: I1127 16:51:30.756882 4707 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2aeefda1-503b-4869-b318-d9ebfa19337c-run-httpd\") on node \"crc\" 
DevicePath \"\"" Nov 27 16:51:30 crc kubenswrapper[4707]: I1127 16:51:30.756890 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2aeefda1-503b-4869-b318-d9ebfa19337c-scripts\") on node \"crc\" DevicePath \"\"" Nov 27 16:51:30 crc kubenswrapper[4707]: I1127 16:51:30.756899 4707 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/2aeefda1-503b-4869-b318-d9ebfa19337c-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Nov 27 16:51:30 crc kubenswrapper[4707]: I1127 16:51:30.756906 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2aeefda1-503b-4869-b318-d9ebfa19337c-config-data\") on node \"crc\" DevicePath \"\"" Nov 27 16:51:30 crc kubenswrapper[4707]: I1127 16:51:30.756915 4707 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2aeefda1-503b-4869-b318-d9ebfa19337c-log-httpd\") on node \"crc\" DevicePath \"\"" Nov 27 16:51:31 crc kubenswrapper[4707]: I1127 16:51:31.008704 4707 generic.go:334] "Generic (PLEG): container finished" podID="2aeefda1-503b-4869-b318-d9ebfa19337c" containerID="f2a5ab4e33ddfc181ee61c8113d3f272b54fac6fc72c241dfc474eb8dadd32fd" exitCode=0 Nov 27 16:51:31 crc kubenswrapper[4707]: I1127 16:51:31.008755 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Nov 27 16:51:31 crc kubenswrapper[4707]: I1127 16:51:31.008756 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2aeefda1-503b-4869-b318-d9ebfa19337c","Type":"ContainerDied","Data":"f2a5ab4e33ddfc181ee61c8113d3f272b54fac6fc72c241dfc474eb8dadd32fd"} Nov 27 16:51:31 crc kubenswrapper[4707]: I1127 16:51:31.008997 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2aeefda1-503b-4869-b318-d9ebfa19337c","Type":"ContainerDied","Data":"15c526768386dd29c83ab3149e0a28089add685e19268ec22077d2a4408a93db"} Nov 27 16:51:31 crc kubenswrapper[4707]: I1127 16:51:31.009027 4707 scope.go:117] "RemoveContainer" containerID="9b5acc17c26a2a462eeddcc607f92057f253b00834a450bd007e5b0734d85aaa" Nov 27 16:51:31 crc kubenswrapper[4707]: I1127 16:51:31.047472 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Nov 27 16:51:31 crc kubenswrapper[4707]: I1127 16:51:31.058477 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Nov 27 16:51:31 crc kubenswrapper[4707]: I1127 16:51:31.068999 4707 scope.go:117] "RemoveContainer" containerID="e646cb3f50035b3bf7cd4cc72abf9fd43e6a670a8582bc28dcda596f78c004be" Nov 27 16:51:31 crc kubenswrapper[4707]: I1127 16:51:31.078485 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Nov 27 16:51:31 crc kubenswrapper[4707]: E1127 16:51:31.078849 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2aeefda1-503b-4869-b318-d9ebfa19337c" containerName="proxy-httpd" Nov 27 16:51:31 crc kubenswrapper[4707]: I1127 16:51:31.078861 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="2aeefda1-503b-4869-b318-d9ebfa19337c" containerName="proxy-httpd" Nov 27 16:51:31 crc kubenswrapper[4707]: E1127 16:51:31.078884 4707 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="2aeefda1-503b-4869-b318-d9ebfa19337c" containerName="ceilometer-notification-agent" Nov 27 16:51:31 crc kubenswrapper[4707]: I1127 16:51:31.078889 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="2aeefda1-503b-4869-b318-d9ebfa19337c" containerName="ceilometer-notification-agent" Nov 27 16:51:31 crc kubenswrapper[4707]: E1127 16:51:31.078922 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2aeefda1-503b-4869-b318-d9ebfa19337c" containerName="ceilometer-central-agent" Nov 27 16:51:31 crc kubenswrapper[4707]: I1127 16:51:31.078928 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="2aeefda1-503b-4869-b318-d9ebfa19337c" containerName="ceilometer-central-agent" Nov 27 16:51:31 crc kubenswrapper[4707]: E1127 16:51:31.078939 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2aeefda1-503b-4869-b318-d9ebfa19337c" containerName="sg-core" Nov 27 16:51:31 crc kubenswrapper[4707]: I1127 16:51:31.078945 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="2aeefda1-503b-4869-b318-d9ebfa19337c" containerName="sg-core" Nov 27 16:51:31 crc kubenswrapper[4707]: I1127 16:51:31.079123 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="2aeefda1-503b-4869-b318-d9ebfa19337c" containerName="ceilometer-central-agent" Nov 27 16:51:31 crc kubenswrapper[4707]: I1127 16:51:31.079136 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="2aeefda1-503b-4869-b318-d9ebfa19337c" containerName="ceilometer-notification-agent" Nov 27 16:51:31 crc kubenswrapper[4707]: I1127 16:51:31.079148 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="2aeefda1-503b-4869-b318-d9ebfa19337c" containerName="sg-core" Nov 27 16:51:31 crc kubenswrapper[4707]: I1127 16:51:31.079160 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="2aeefda1-503b-4869-b318-d9ebfa19337c" containerName="proxy-httpd" Nov 27 16:51:31 crc kubenswrapper[4707]: I1127 16:51:31.081480 4707 util.go:30] 
"No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Nov 27 16:51:31 crc kubenswrapper[4707]: I1127 16:51:31.084137 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Nov 27 16:51:31 crc kubenswrapper[4707]: I1127 16:51:31.087199 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Nov 27 16:51:31 crc kubenswrapper[4707]: I1127 16:51:31.087494 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Nov 27 16:51:31 crc kubenswrapper[4707]: I1127 16:51:31.108330 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Nov 27 16:51:31 crc kubenswrapper[4707]: I1127 16:51:31.163771 4707 scope.go:117] "RemoveContainer" containerID="f2a5ab4e33ddfc181ee61c8113d3f272b54fac6fc72c241dfc474eb8dadd32fd" Nov 27 16:51:31 crc kubenswrapper[4707]: I1127 16:51:31.186307 4707 scope.go:117] "RemoveContainer" containerID="77b44c5281ceaa5647f057bece49a5964f448db50aa579d4bc93f2b822ce8154" Nov 27 16:51:31 crc kubenswrapper[4707]: I1127 16:51:31.222160 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2aeefda1-503b-4869-b318-d9ebfa19337c" path="/var/lib/kubelet/pods/2aeefda1-503b-4869-b318-d9ebfa19337c/volumes" Nov 27 16:51:31 crc kubenswrapper[4707]: I1127 16:51:31.237573 4707 scope.go:117] "RemoveContainer" containerID="9b5acc17c26a2a462eeddcc607f92057f253b00834a450bd007e5b0734d85aaa" Nov 27 16:51:31 crc kubenswrapper[4707]: E1127 16:51:31.241603 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9b5acc17c26a2a462eeddcc607f92057f253b00834a450bd007e5b0734d85aaa\": container with ID starting with 9b5acc17c26a2a462eeddcc607f92057f253b00834a450bd007e5b0734d85aaa not found: ID does not exist" containerID="9b5acc17c26a2a462eeddcc607f92057f253b00834a450bd007e5b0734d85aaa" Nov 27 
16:51:31 crc kubenswrapper[4707]: I1127 16:51:31.241650 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9b5acc17c26a2a462eeddcc607f92057f253b00834a450bd007e5b0734d85aaa"} err="failed to get container status \"9b5acc17c26a2a462eeddcc607f92057f253b00834a450bd007e5b0734d85aaa\": rpc error: code = NotFound desc = could not find container \"9b5acc17c26a2a462eeddcc607f92057f253b00834a450bd007e5b0734d85aaa\": container with ID starting with 9b5acc17c26a2a462eeddcc607f92057f253b00834a450bd007e5b0734d85aaa not found: ID does not exist" Nov 27 16:51:31 crc kubenswrapper[4707]: I1127 16:51:31.241686 4707 scope.go:117] "RemoveContainer" containerID="e646cb3f50035b3bf7cd4cc72abf9fd43e6a670a8582bc28dcda596f78c004be" Nov 27 16:51:31 crc kubenswrapper[4707]: E1127 16:51:31.242170 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e646cb3f50035b3bf7cd4cc72abf9fd43e6a670a8582bc28dcda596f78c004be\": container with ID starting with e646cb3f50035b3bf7cd4cc72abf9fd43e6a670a8582bc28dcda596f78c004be not found: ID does not exist" containerID="e646cb3f50035b3bf7cd4cc72abf9fd43e6a670a8582bc28dcda596f78c004be" Nov 27 16:51:31 crc kubenswrapper[4707]: I1127 16:51:31.242204 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e646cb3f50035b3bf7cd4cc72abf9fd43e6a670a8582bc28dcda596f78c004be"} err="failed to get container status \"e646cb3f50035b3bf7cd4cc72abf9fd43e6a670a8582bc28dcda596f78c004be\": rpc error: code = NotFound desc = could not find container \"e646cb3f50035b3bf7cd4cc72abf9fd43e6a670a8582bc28dcda596f78c004be\": container with ID starting with e646cb3f50035b3bf7cd4cc72abf9fd43e6a670a8582bc28dcda596f78c004be not found: ID does not exist" Nov 27 16:51:31 crc kubenswrapper[4707]: I1127 16:51:31.242228 4707 scope.go:117] "RemoveContainer" 
containerID="f2a5ab4e33ddfc181ee61c8113d3f272b54fac6fc72c241dfc474eb8dadd32fd" Nov 27 16:51:31 crc kubenswrapper[4707]: E1127 16:51:31.243637 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f2a5ab4e33ddfc181ee61c8113d3f272b54fac6fc72c241dfc474eb8dadd32fd\": container with ID starting with f2a5ab4e33ddfc181ee61c8113d3f272b54fac6fc72c241dfc474eb8dadd32fd not found: ID does not exist" containerID="f2a5ab4e33ddfc181ee61c8113d3f272b54fac6fc72c241dfc474eb8dadd32fd" Nov 27 16:51:31 crc kubenswrapper[4707]: I1127 16:51:31.243700 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f2a5ab4e33ddfc181ee61c8113d3f272b54fac6fc72c241dfc474eb8dadd32fd"} err="failed to get container status \"f2a5ab4e33ddfc181ee61c8113d3f272b54fac6fc72c241dfc474eb8dadd32fd\": rpc error: code = NotFound desc = could not find container \"f2a5ab4e33ddfc181ee61c8113d3f272b54fac6fc72c241dfc474eb8dadd32fd\": container with ID starting with f2a5ab4e33ddfc181ee61c8113d3f272b54fac6fc72c241dfc474eb8dadd32fd not found: ID does not exist" Nov 27 16:51:31 crc kubenswrapper[4707]: I1127 16:51:31.243733 4707 scope.go:117] "RemoveContainer" containerID="77b44c5281ceaa5647f057bece49a5964f448db50aa579d4bc93f2b822ce8154" Nov 27 16:51:31 crc kubenswrapper[4707]: E1127 16:51:31.247795 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"77b44c5281ceaa5647f057bece49a5964f448db50aa579d4bc93f2b822ce8154\": container with ID starting with 77b44c5281ceaa5647f057bece49a5964f448db50aa579d4bc93f2b822ce8154 not found: ID does not exist" containerID="77b44c5281ceaa5647f057bece49a5964f448db50aa579d4bc93f2b822ce8154" Nov 27 16:51:31 crc kubenswrapper[4707]: I1127 16:51:31.247831 4707 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"77b44c5281ceaa5647f057bece49a5964f448db50aa579d4bc93f2b822ce8154"} err="failed to get container status \"77b44c5281ceaa5647f057bece49a5964f448db50aa579d4bc93f2b822ce8154\": rpc error: code = NotFound desc = could not find container \"77b44c5281ceaa5647f057bece49a5964f448db50aa579d4bc93f2b822ce8154\": container with ID starting with 77b44c5281ceaa5647f057bece49a5964f448db50aa579d4bc93f2b822ce8154 not found: ID does not exist" Nov 27 16:51:31 crc kubenswrapper[4707]: I1127 16:51:31.266843 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/907095f4-0cd3-4e69-8f3f-fa908be6b6d0-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"907095f4-0cd3-4e69-8f3f-fa908be6b6d0\") " pod="openstack/ceilometer-0" Nov 27 16:51:31 crc kubenswrapper[4707]: I1127 16:51:31.268343 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/907095f4-0cd3-4e69-8f3f-fa908be6b6d0-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"907095f4-0cd3-4e69-8f3f-fa908be6b6d0\") " pod="openstack/ceilometer-0" Nov 27 16:51:31 crc kubenswrapper[4707]: I1127 16:51:31.268584 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/907095f4-0cd3-4e69-8f3f-fa908be6b6d0-config-data\") pod \"ceilometer-0\" (UID: \"907095f4-0cd3-4e69-8f3f-fa908be6b6d0\") " pod="openstack/ceilometer-0" Nov 27 16:51:31 crc kubenswrapper[4707]: I1127 16:51:31.269654 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/907095f4-0cd3-4e69-8f3f-fa908be6b6d0-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"907095f4-0cd3-4e69-8f3f-fa908be6b6d0\") " pod="openstack/ceilometer-0" Nov 27 16:51:31 crc 
kubenswrapper[4707]: I1127 16:51:31.270092 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/907095f4-0cd3-4e69-8f3f-fa908be6b6d0-log-httpd\") pod \"ceilometer-0\" (UID: \"907095f4-0cd3-4e69-8f3f-fa908be6b6d0\") " pod="openstack/ceilometer-0" Nov 27 16:51:31 crc kubenswrapper[4707]: I1127 16:51:31.271106 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/907095f4-0cd3-4e69-8f3f-fa908be6b6d0-scripts\") pod \"ceilometer-0\" (UID: \"907095f4-0cd3-4e69-8f3f-fa908be6b6d0\") " pod="openstack/ceilometer-0" Nov 27 16:51:31 crc kubenswrapper[4707]: I1127 16:51:31.271896 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/907095f4-0cd3-4e69-8f3f-fa908be6b6d0-run-httpd\") pod \"ceilometer-0\" (UID: \"907095f4-0cd3-4e69-8f3f-fa908be6b6d0\") " pod="openstack/ceilometer-0" Nov 27 16:51:31 crc kubenswrapper[4707]: I1127 16:51:31.272319 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hzc9k\" (UniqueName: \"kubernetes.io/projected/907095f4-0cd3-4e69-8f3f-fa908be6b6d0-kube-api-access-hzc9k\") pod \"ceilometer-0\" (UID: \"907095f4-0cd3-4e69-8f3f-fa908be6b6d0\") " pod="openstack/ceilometer-0" Nov 27 16:51:31 crc kubenswrapper[4707]: I1127 16:51:31.373708 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/907095f4-0cd3-4e69-8f3f-fa908be6b6d0-scripts\") pod \"ceilometer-0\" (UID: \"907095f4-0cd3-4e69-8f3f-fa908be6b6d0\") " pod="openstack/ceilometer-0" Nov 27 16:51:31 crc kubenswrapper[4707]: I1127 16:51:31.374364 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/907095f4-0cd3-4e69-8f3f-fa908be6b6d0-run-httpd\") pod \"ceilometer-0\" (UID: \"907095f4-0cd3-4e69-8f3f-fa908be6b6d0\") " pod="openstack/ceilometer-0" Nov 27 16:51:31 crc kubenswrapper[4707]: I1127 16:51:31.374529 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hzc9k\" (UniqueName: \"kubernetes.io/projected/907095f4-0cd3-4e69-8f3f-fa908be6b6d0-kube-api-access-hzc9k\") pod \"ceilometer-0\" (UID: \"907095f4-0cd3-4e69-8f3f-fa908be6b6d0\") " pod="openstack/ceilometer-0" Nov 27 16:51:31 crc kubenswrapper[4707]: I1127 16:51:31.374705 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/907095f4-0cd3-4e69-8f3f-fa908be6b6d0-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"907095f4-0cd3-4e69-8f3f-fa908be6b6d0\") " pod="openstack/ceilometer-0" Nov 27 16:51:31 crc kubenswrapper[4707]: I1127 16:51:31.374813 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/907095f4-0cd3-4e69-8f3f-fa908be6b6d0-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"907095f4-0cd3-4e69-8f3f-fa908be6b6d0\") " pod="openstack/ceilometer-0" Nov 27 16:51:31 crc kubenswrapper[4707]: I1127 16:51:31.374927 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/907095f4-0cd3-4e69-8f3f-fa908be6b6d0-config-data\") pod \"ceilometer-0\" (UID: \"907095f4-0cd3-4e69-8f3f-fa908be6b6d0\") " pod="openstack/ceilometer-0" Nov 27 16:51:31 crc kubenswrapper[4707]: I1127 16:51:31.375094 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/907095f4-0cd3-4e69-8f3f-fa908be6b6d0-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"907095f4-0cd3-4e69-8f3f-fa908be6b6d0\") " pod="openstack/ceilometer-0" Nov 27 16:51:31 
crc kubenswrapper[4707]: I1127 16:51:31.375670 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/907095f4-0cd3-4e69-8f3f-fa908be6b6d0-log-httpd\") pod \"ceilometer-0\" (UID: \"907095f4-0cd3-4e69-8f3f-fa908be6b6d0\") " pod="openstack/ceilometer-0" Nov 27 16:51:31 crc kubenswrapper[4707]: I1127 16:51:31.376282 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/907095f4-0cd3-4e69-8f3f-fa908be6b6d0-log-httpd\") pod \"ceilometer-0\" (UID: \"907095f4-0cd3-4e69-8f3f-fa908be6b6d0\") " pod="openstack/ceilometer-0" Nov 27 16:51:31 crc kubenswrapper[4707]: I1127 16:51:31.378275 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/907095f4-0cd3-4e69-8f3f-fa908be6b6d0-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"907095f4-0cd3-4e69-8f3f-fa908be6b6d0\") " pod="openstack/ceilometer-0" Nov 27 16:51:31 crc kubenswrapper[4707]: I1127 16:51:31.378428 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/907095f4-0cd3-4e69-8f3f-fa908be6b6d0-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"907095f4-0cd3-4e69-8f3f-fa908be6b6d0\") " pod="openstack/ceilometer-0" Nov 27 16:51:31 crc kubenswrapper[4707]: I1127 16:51:31.378794 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/907095f4-0cd3-4e69-8f3f-fa908be6b6d0-scripts\") pod \"ceilometer-0\" (UID: \"907095f4-0cd3-4e69-8f3f-fa908be6b6d0\") " pod="openstack/ceilometer-0" Nov 27 16:51:31 crc kubenswrapper[4707]: I1127 16:51:31.380742 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/907095f4-0cd3-4e69-8f3f-fa908be6b6d0-run-httpd\") pod \"ceilometer-0\" (UID: 
\"907095f4-0cd3-4e69-8f3f-fa908be6b6d0\") " pod="openstack/ceilometer-0" Nov 27 16:51:31 crc kubenswrapper[4707]: I1127 16:51:31.381131 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/907095f4-0cd3-4e69-8f3f-fa908be6b6d0-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"907095f4-0cd3-4e69-8f3f-fa908be6b6d0\") " pod="openstack/ceilometer-0" Nov 27 16:51:31 crc kubenswrapper[4707]: I1127 16:51:31.381271 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/907095f4-0cd3-4e69-8f3f-fa908be6b6d0-config-data\") pod \"ceilometer-0\" (UID: \"907095f4-0cd3-4e69-8f3f-fa908be6b6d0\") " pod="openstack/ceilometer-0" Nov 27 16:51:31 crc kubenswrapper[4707]: I1127 16:51:31.394204 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hzc9k\" (UniqueName: \"kubernetes.io/projected/907095f4-0cd3-4e69-8f3f-fa908be6b6d0-kube-api-access-hzc9k\") pod \"ceilometer-0\" (UID: \"907095f4-0cd3-4e69-8f3f-fa908be6b6d0\") " pod="openstack/ceilometer-0" Nov 27 16:51:31 crc kubenswrapper[4707]: I1127 16:51:31.406964 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Nov 27 16:51:31 crc kubenswrapper[4707]: I1127 16:51:31.921313 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Nov 27 16:51:31 crc kubenswrapper[4707]: W1127 16:51:31.925328 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod907095f4_0cd3_4e69_8f3f_fa908be6b6d0.slice/crio-fe86c2cf21f1120b7fe7b495b5b7e30a3a1fb4c372b925d9e950d2264fd5225a WatchSource:0}: Error finding container fe86c2cf21f1120b7fe7b495b5b7e30a3a1fb4c372b925d9e950d2264fd5225a: Status 404 returned error can't find the container with id fe86c2cf21f1120b7fe7b495b5b7e30a3a1fb4c372b925d9e950d2264fd5225a Nov 27 16:51:32 crc kubenswrapper[4707]: I1127 16:51:32.042449 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"907095f4-0cd3-4e69-8f3f-fa908be6b6d0","Type":"ContainerStarted","Data":"fe86c2cf21f1120b7fe7b495b5b7e30a3a1fb4c372b925d9e950d2264fd5225a"} Nov 27 16:51:32 crc kubenswrapper[4707]: I1127 16:51:32.047108 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"f5cbab88-2254-4d29-ae0a-d2d571ab8775","Type":"ContainerStarted","Data":"d4d79985e6118838111d37e1e6feeca5dada1047bb4fa6fc8c107484a90a9261"} Nov 27 16:51:35 crc kubenswrapper[4707]: I1127 16:51:35.093164 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"f5cbab88-2254-4d29-ae0a-d2d571ab8775","Type":"ContainerStarted","Data":"5d46d3186add95945a86001ca2955bbb087b06c49580ff5ec1fb7cedd67d519f"} Nov 27 16:51:35 crc kubenswrapper[4707]: I1127 16:51:35.093443 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/aodh-0" podUID="f5cbab88-2254-4d29-ae0a-d2d571ab8775" containerName="aodh-evaluator" containerID="cri-o://d4e8b4f74b5106a5a95ab08fa4749e09c3806c29b650640f4d1907725152311a" gracePeriod=30 Nov 27 16:51:35 crc 
kubenswrapper[4707]: I1127 16:51:35.093458 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/aodh-0" podUID="f5cbab88-2254-4d29-ae0a-d2d571ab8775" containerName="aodh-notifier" containerID="cri-o://d4d79985e6118838111d37e1e6feeca5dada1047bb4fa6fc8c107484a90a9261" gracePeriod=30 Nov 27 16:51:35 crc kubenswrapper[4707]: I1127 16:51:35.093443 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/aodh-0" podUID="f5cbab88-2254-4d29-ae0a-d2d571ab8775" containerName="aodh-api" containerID="cri-o://f3a532bb25490d5ef26ed2e5735d30ae2fb5ff616571fb5be61dbc8d8f25f370" gracePeriod=30 Nov 27 16:51:35 crc kubenswrapper[4707]: I1127 16:51:35.093544 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/aodh-0" podUID="f5cbab88-2254-4d29-ae0a-d2d571ab8775" containerName="aodh-listener" containerID="cri-o://5d46d3186add95945a86001ca2955bbb087b06c49580ff5ec1fb7cedd67d519f" gracePeriod=30 Nov 27 16:51:35 crc kubenswrapper[4707]: I1127 16:51:35.095449 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"907095f4-0cd3-4e69-8f3f-fa908be6b6d0","Type":"ContainerStarted","Data":"fb50ac02a8b380009c5e3b50ff720051c0c7847ab2309d160c575816222cfc16"} Nov 27 16:51:35 crc kubenswrapper[4707]: I1127 16:51:35.124829 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/aodh-0" podStartSLOduration=2.245649852 podStartE2EDuration="12.124811102s" podCreationTimestamp="2025-11-27 16:51:23 +0000 UTC" firstStartedPulling="2025-11-27 16:51:24.069830266 +0000 UTC m=+2859.701279034" lastFinishedPulling="2025-11-27 16:51:33.948991516 +0000 UTC m=+2869.580440284" observedRunningTime="2025-11-27 16:51:35.121031849 +0000 UTC m=+2870.752480627" watchObservedRunningTime="2025-11-27 16:51:35.124811102 +0000 UTC m=+2870.756259880" Nov 27 16:51:36 crc kubenswrapper[4707]: I1127 16:51:36.106238 4707 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack/ceilometer-0" event={"ID":"907095f4-0cd3-4e69-8f3f-fa908be6b6d0","Type":"ContainerStarted","Data":"70b1c9aa13d04144d6703ec45d5223dcb107354b03124e348850f28b95bfdeb3"} Nov 27 16:51:36 crc kubenswrapper[4707]: I1127 16:51:36.109380 4707 generic.go:334] "Generic (PLEG): container finished" podID="f5cbab88-2254-4d29-ae0a-d2d571ab8775" containerID="d4e8b4f74b5106a5a95ab08fa4749e09c3806c29b650640f4d1907725152311a" exitCode=0 Nov 27 16:51:36 crc kubenswrapper[4707]: I1127 16:51:36.109416 4707 generic.go:334] "Generic (PLEG): container finished" podID="f5cbab88-2254-4d29-ae0a-d2d571ab8775" containerID="f3a532bb25490d5ef26ed2e5735d30ae2fb5ff616571fb5be61dbc8d8f25f370" exitCode=0 Nov 27 16:51:36 crc kubenswrapper[4707]: I1127 16:51:36.109440 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"f5cbab88-2254-4d29-ae0a-d2d571ab8775","Type":"ContainerDied","Data":"d4e8b4f74b5106a5a95ab08fa4749e09c3806c29b650640f4d1907725152311a"} Nov 27 16:51:36 crc kubenswrapper[4707]: I1127 16:51:36.109469 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"f5cbab88-2254-4d29-ae0a-d2d571ab8775","Type":"ContainerDied","Data":"f3a532bb25490d5ef26ed2e5735d30ae2fb5ff616571fb5be61dbc8d8f25f370"} Nov 27 16:51:37 crc kubenswrapper[4707]: I1127 16:51:37.120115 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"907095f4-0cd3-4e69-8f3f-fa908be6b6d0","Type":"ContainerStarted","Data":"cd5bc57cf4e88bdd63f5232b68f90f24d92ab7efd494526822737b4d3abe3440"} Nov 27 16:51:40 crc kubenswrapper[4707]: I1127 16:51:40.163295 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"907095f4-0cd3-4e69-8f3f-fa908be6b6d0","Type":"ContainerStarted","Data":"db96b6241320913f76efcd0808c177810012795af2f56b11baf1dfb6aa5b36b5"} Nov 27 16:51:40 crc kubenswrapper[4707]: I1127 16:51:40.183893 4707 pod_startup_latency_tracker.go:104] "Observed 
pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.042704148 podStartE2EDuration="9.183865325s" podCreationTimestamp="2025-11-27 16:51:31 +0000 UTC" firstStartedPulling="2025-11-27 16:51:31.930341168 +0000 UTC m=+2867.561789936" lastFinishedPulling="2025-11-27 16:51:39.071502335 +0000 UTC m=+2874.702951113" observedRunningTime="2025-11-27 16:51:40.179420146 +0000 UTC m=+2875.810868934" watchObservedRunningTime="2025-11-27 16:51:40.183865325 +0000 UTC m=+2875.815314093" Nov 27 16:51:41 crc kubenswrapper[4707]: I1127 16:51:41.173972 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Nov 27 16:52:01 crc kubenswrapper[4707]: I1127 16:52:01.423065 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Nov 27 16:52:05 crc kubenswrapper[4707]: I1127 16:52:05.448943 4707 generic.go:334] "Generic (PLEG): container finished" podID="f5cbab88-2254-4d29-ae0a-d2d571ab8775" containerID="5d46d3186add95945a86001ca2955bbb087b06c49580ff5ec1fb7cedd67d519f" exitCode=137 Nov 27 16:52:05 crc kubenswrapper[4707]: I1127 16:52:05.449661 4707 generic.go:334] "Generic (PLEG): container finished" podID="f5cbab88-2254-4d29-ae0a-d2d571ab8775" containerID="d4d79985e6118838111d37e1e6feeca5dada1047bb4fa6fc8c107484a90a9261" exitCode=137 Nov 27 16:52:05 crc kubenswrapper[4707]: I1127 16:52:05.449698 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"f5cbab88-2254-4d29-ae0a-d2d571ab8775","Type":"ContainerDied","Data":"5d46d3186add95945a86001ca2955bbb087b06c49580ff5ec1fb7cedd67d519f"} Nov 27 16:52:05 crc kubenswrapper[4707]: I1127 16:52:05.449735 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"f5cbab88-2254-4d29-ae0a-d2d571ab8775","Type":"ContainerDied","Data":"d4d79985e6118838111d37e1e6feeca5dada1047bb4fa6fc8c107484a90a9261"} Nov 27 16:52:06 crc kubenswrapper[4707]: I1127 16:52:06.319905 4707 util.go:48] 
"No ready sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-0" Nov 27 16:52:06 crc kubenswrapper[4707]: I1127 16:52:06.416349 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f5cbab88-2254-4d29-ae0a-d2d571ab8775-scripts\") pod \"f5cbab88-2254-4d29-ae0a-d2d571ab8775\" (UID: \"f5cbab88-2254-4d29-ae0a-d2d571ab8775\") " Nov 27 16:52:06 crc kubenswrapper[4707]: I1127 16:52:06.416534 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fnbp5\" (UniqueName: \"kubernetes.io/projected/f5cbab88-2254-4d29-ae0a-d2d571ab8775-kube-api-access-fnbp5\") pod \"f5cbab88-2254-4d29-ae0a-d2d571ab8775\" (UID: \"f5cbab88-2254-4d29-ae0a-d2d571ab8775\") " Nov 27 16:52:06 crc kubenswrapper[4707]: I1127 16:52:06.416635 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f5cbab88-2254-4d29-ae0a-d2d571ab8775-config-data\") pod \"f5cbab88-2254-4d29-ae0a-d2d571ab8775\" (UID: \"f5cbab88-2254-4d29-ae0a-d2d571ab8775\") " Nov 27 16:52:06 crc kubenswrapper[4707]: I1127 16:52:06.416662 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f5cbab88-2254-4d29-ae0a-d2d571ab8775-combined-ca-bundle\") pod \"f5cbab88-2254-4d29-ae0a-d2d571ab8775\" (UID: \"f5cbab88-2254-4d29-ae0a-d2d571ab8775\") " Nov 27 16:52:06 crc kubenswrapper[4707]: I1127 16:52:06.424554 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f5cbab88-2254-4d29-ae0a-d2d571ab8775-scripts" (OuterVolumeSpecName: "scripts") pod "f5cbab88-2254-4d29-ae0a-d2d571ab8775" (UID: "f5cbab88-2254-4d29-ae0a-d2d571ab8775"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 16:52:06 crc kubenswrapper[4707]: I1127 16:52:06.424626 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f5cbab88-2254-4d29-ae0a-d2d571ab8775-kube-api-access-fnbp5" (OuterVolumeSpecName: "kube-api-access-fnbp5") pod "f5cbab88-2254-4d29-ae0a-d2d571ab8775" (UID: "f5cbab88-2254-4d29-ae0a-d2d571ab8775"). InnerVolumeSpecName "kube-api-access-fnbp5". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 16:52:06 crc kubenswrapper[4707]: I1127 16:52:06.464090 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"f5cbab88-2254-4d29-ae0a-d2d571ab8775","Type":"ContainerDied","Data":"0783ce7462a69f16da3b016c363f85366c689fa465dc8176bda2d80c4c4b1efc"} Nov 27 16:52:06 crc kubenswrapper[4707]: I1127 16:52:06.464137 4707 scope.go:117] "RemoveContainer" containerID="5d46d3186add95945a86001ca2955bbb087b06c49580ff5ec1fb7cedd67d519f" Nov 27 16:52:06 crc kubenswrapper[4707]: I1127 16:52:06.465294 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-0" Nov 27 16:52:06 crc kubenswrapper[4707]: I1127 16:52:06.518571 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f5cbab88-2254-4d29-ae0a-d2d571ab8775-scripts\") on node \"crc\" DevicePath \"\"" Nov 27 16:52:06 crc kubenswrapper[4707]: I1127 16:52:06.518836 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fnbp5\" (UniqueName: \"kubernetes.io/projected/f5cbab88-2254-4d29-ae0a-d2d571ab8775-kube-api-access-fnbp5\") on node \"crc\" DevicePath \"\"" Nov 27 16:52:06 crc kubenswrapper[4707]: I1127 16:52:06.519203 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f5cbab88-2254-4d29-ae0a-d2d571ab8775-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f5cbab88-2254-4d29-ae0a-d2d571ab8775" (UID: "f5cbab88-2254-4d29-ae0a-d2d571ab8775"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 16:52:06 crc kubenswrapper[4707]: I1127 16:52:06.527804 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f5cbab88-2254-4d29-ae0a-d2d571ab8775-config-data" (OuterVolumeSpecName: "config-data") pod "f5cbab88-2254-4d29-ae0a-d2d571ab8775" (UID: "f5cbab88-2254-4d29-ae0a-d2d571ab8775"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 16:52:06 crc kubenswrapper[4707]: I1127 16:52:06.612288 4707 scope.go:117] "RemoveContainer" containerID="d4d79985e6118838111d37e1e6feeca5dada1047bb4fa6fc8c107484a90a9261" Nov 27 16:52:06 crc kubenswrapper[4707]: I1127 16:52:06.620319 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f5cbab88-2254-4d29-ae0a-d2d571ab8775-config-data\") on node \"crc\" DevicePath \"\"" Nov 27 16:52:06 crc kubenswrapper[4707]: I1127 16:52:06.620347 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f5cbab88-2254-4d29-ae0a-d2d571ab8775-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 27 16:52:06 crc kubenswrapper[4707]: I1127 16:52:06.631133 4707 scope.go:117] "RemoveContainer" containerID="d4e8b4f74b5106a5a95ab08fa4749e09c3806c29b650640f4d1907725152311a" Nov 27 16:52:06 crc kubenswrapper[4707]: I1127 16:52:06.657085 4707 scope.go:117] "RemoveContainer" containerID="f3a532bb25490d5ef26ed2e5735d30ae2fb5ff616571fb5be61dbc8d8f25f370" Nov 27 16:52:06 crc kubenswrapper[4707]: I1127 16:52:06.811301 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodh-0"] Nov 27 16:52:06 crc kubenswrapper[4707]: I1127 16:52:06.824687 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/aodh-0"] Nov 27 16:52:06 crc kubenswrapper[4707]: I1127 16:52:06.838909 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/aodh-0"] Nov 27 16:52:06 crc kubenswrapper[4707]: E1127 16:52:06.839591 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f5cbab88-2254-4d29-ae0a-d2d571ab8775" containerName="aodh-listener" Nov 27 16:52:06 crc kubenswrapper[4707]: I1127 16:52:06.839671 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="f5cbab88-2254-4d29-ae0a-d2d571ab8775" containerName="aodh-listener" Nov 27 16:52:06 crc kubenswrapper[4707]: E1127 
16:52:06.839760 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f5cbab88-2254-4d29-ae0a-d2d571ab8775" containerName="aodh-api" Nov 27 16:52:06 crc kubenswrapper[4707]: I1127 16:52:06.839816 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="f5cbab88-2254-4d29-ae0a-d2d571ab8775" containerName="aodh-api" Nov 27 16:52:06 crc kubenswrapper[4707]: E1127 16:52:06.839887 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f5cbab88-2254-4d29-ae0a-d2d571ab8775" containerName="aodh-notifier" Nov 27 16:52:06 crc kubenswrapper[4707]: I1127 16:52:06.839948 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="f5cbab88-2254-4d29-ae0a-d2d571ab8775" containerName="aodh-notifier" Nov 27 16:52:06 crc kubenswrapper[4707]: E1127 16:52:06.840008 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f5cbab88-2254-4d29-ae0a-d2d571ab8775" containerName="aodh-evaluator" Nov 27 16:52:06 crc kubenswrapper[4707]: I1127 16:52:06.840060 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="f5cbab88-2254-4d29-ae0a-d2d571ab8775" containerName="aodh-evaluator" Nov 27 16:52:06 crc kubenswrapper[4707]: I1127 16:52:06.840343 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="f5cbab88-2254-4d29-ae0a-d2d571ab8775" containerName="aodh-api" Nov 27 16:52:06 crc kubenswrapper[4707]: I1127 16:52:06.840468 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="f5cbab88-2254-4d29-ae0a-d2d571ab8775" containerName="aodh-notifier" Nov 27 16:52:06 crc kubenswrapper[4707]: I1127 16:52:06.840539 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="f5cbab88-2254-4d29-ae0a-d2d571ab8775" containerName="aodh-evaluator" Nov 27 16:52:06 crc kubenswrapper[4707]: I1127 16:52:06.840615 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="f5cbab88-2254-4d29-ae0a-d2d571ab8775" containerName="aodh-listener" Nov 27 16:52:06 crc kubenswrapper[4707]: I1127 16:52:06.844171 4707 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-0" Nov 27 16:52:06 crc kubenswrapper[4707]: I1127 16:52:06.847317 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-aodh-internal-svc" Nov 27 16:52:06 crc kubenswrapper[4707]: I1127 16:52:06.847986 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-config-data" Nov 27 16:52:06 crc kubenswrapper[4707]: I1127 16:52:06.848492 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-aodh-public-svc" Nov 27 16:52:06 crc kubenswrapper[4707]: I1127 16:52:06.848784 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-autoscaling-dockercfg-pfvsr" Nov 27 16:52:06 crc kubenswrapper[4707]: I1127 16:52:06.849645 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-scripts" Nov 27 16:52:06 crc kubenswrapper[4707]: I1127 16:52:06.852402 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-0"] Nov 27 16:52:07 crc kubenswrapper[4707]: I1127 16:52:07.029759 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3794dd05-6c5a-43bc-8a93-b8b841b05bb2-scripts\") pod \"aodh-0\" (UID: \"3794dd05-6c5a-43bc-8a93-b8b841b05bb2\") " pod="openstack/aodh-0" Nov 27 16:52:07 crc kubenswrapper[4707]: I1127 16:52:07.029830 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tq25m\" (UniqueName: \"kubernetes.io/projected/3794dd05-6c5a-43bc-8a93-b8b841b05bb2-kube-api-access-tq25m\") pod \"aodh-0\" (UID: \"3794dd05-6c5a-43bc-8a93-b8b841b05bb2\") " pod="openstack/aodh-0" Nov 27 16:52:07 crc kubenswrapper[4707]: I1127 16:52:07.029868 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/3794dd05-6c5a-43bc-8a93-b8b841b05bb2-public-tls-certs\") pod \"aodh-0\" (UID: \"3794dd05-6c5a-43bc-8a93-b8b841b05bb2\") " pod="openstack/aodh-0" Nov 27 16:52:07 crc kubenswrapper[4707]: I1127 16:52:07.030006 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3794dd05-6c5a-43bc-8a93-b8b841b05bb2-config-data\") pod \"aodh-0\" (UID: \"3794dd05-6c5a-43bc-8a93-b8b841b05bb2\") " pod="openstack/aodh-0" Nov 27 16:52:07 crc kubenswrapper[4707]: I1127 16:52:07.030168 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3794dd05-6c5a-43bc-8a93-b8b841b05bb2-internal-tls-certs\") pod \"aodh-0\" (UID: \"3794dd05-6c5a-43bc-8a93-b8b841b05bb2\") " pod="openstack/aodh-0" Nov 27 16:52:07 crc kubenswrapper[4707]: I1127 16:52:07.030386 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3794dd05-6c5a-43bc-8a93-b8b841b05bb2-combined-ca-bundle\") pod \"aodh-0\" (UID: \"3794dd05-6c5a-43bc-8a93-b8b841b05bb2\") " pod="openstack/aodh-0" Nov 27 16:52:07 crc kubenswrapper[4707]: I1127 16:52:07.131674 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3794dd05-6c5a-43bc-8a93-b8b841b05bb2-internal-tls-certs\") pod \"aodh-0\" (UID: \"3794dd05-6c5a-43bc-8a93-b8b841b05bb2\") " pod="openstack/aodh-0" Nov 27 16:52:07 crc kubenswrapper[4707]: I1127 16:52:07.131770 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3794dd05-6c5a-43bc-8a93-b8b841b05bb2-combined-ca-bundle\") pod \"aodh-0\" (UID: \"3794dd05-6c5a-43bc-8a93-b8b841b05bb2\") " pod="openstack/aodh-0" Nov 27 16:52:07 crc 
kubenswrapper[4707]: I1127 16:52:07.131817 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3794dd05-6c5a-43bc-8a93-b8b841b05bb2-scripts\") pod \"aodh-0\" (UID: \"3794dd05-6c5a-43bc-8a93-b8b841b05bb2\") " pod="openstack/aodh-0" Nov 27 16:52:07 crc kubenswrapper[4707]: I1127 16:52:07.131850 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tq25m\" (UniqueName: \"kubernetes.io/projected/3794dd05-6c5a-43bc-8a93-b8b841b05bb2-kube-api-access-tq25m\") pod \"aodh-0\" (UID: \"3794dd05-6c5a-43bc-8a93-b8b841b05bb2\") " pod="openstack/aodh-0" Nov 27 16:52:07 crc kubenswrapper[4707]: I1127 16:52:07.131881 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3794dd05-6c5a-43bc-8a93-b8b841b05bb2-public-tls-certs\") pod \"aodh-0\" (UID: \"3794dd05-6c5a-43bc-8a93-b8b841b05bb2\") " pod="openstack/aodh-0" Nov 27 16:52:07 crc kubenswrapper[4707]: I1127 16:52:07.131986 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3794dd05-6c5a-43bc-8a93-b8b841b05bb2-config-data\") pod \"aodh-0\" (UID: \"3794dd05-6c5a-43bc-8a93-b8b841b05bb2\") " pod="openstack/aodh-0" Nov 27 16:52:07 crc kubenswrapper[4707]: I1127 16:52:07.135684 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3794dd05-6c5a-43bc-8a93-b8b841b05bb2-internal-tls-certs\") pod \"aodh-0\" (UID: \"3794dd05-6c5a-43bc-8a93-b8b841b05bb2\") " pod="openstack/aodh-0" Nov 27 16:52:07 crc kubenswrapper[4707]: I1127 16:52:07.135942 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3794dd05-6c5a-43bc-8a93-b8b841b05bb2-public-tls-certs\") pod \"aodh-0\" (UID: 
\"3794dd05-6c5a-43bc-8a93-b8b841b05bb2\") " pod="openstack/aodh-0" Nov 27 16:52:07 crc kubenswrapper[4707]: I1127 16:52:07.136704 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3794dd05-6c5a-43bc-8a93-b8b841b05bb2-combined-ca-bundle\") pod \"aodh-0\" (UID: \"3794dd05-6c5a-43bc-8a93-b8b841b05bb2\") " pod="openstack/aodh-0" Nov 27 16:52:07 crc kubenswrapper[4707]: I1127 16:52:07.136791 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3794dd05-6c5a-43bc-8a93-b8b841b05bb2-config-data\") pod \"aodh-0\" (UID: \"3794dd05-6c5a-43bc-8a93-b8b841b05bb2\") " pod="openstack/aodh-0" Nov 27 16:52:07 crc kubenswrapper[4707]: I1127 16:52:07.137547 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3794dd05-6c5a-43bc-8a93-b8b841b05bb2-scripts\") pod \"aodh-0\" (UID: \"3794dd05-6c5a-43bc-8a93-b8b841b05bb2\") " pod="openstack/aodh-0" Nov 27 16:52:07 crc kubenswrapper[4707]: I1127 16:52:07.149625 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tq25m\" (UniqueName: \"kubernetes.io/projected/3794dd05-6c5a-43bc-8a93-b8b841b05bb2-kube-api-access-tq25m\") pod \"aodh-0\" (UID: \"3794dd05-6c5a-43bc-8a93-b8b841b05bb2\") " pod="openstack/aodh-0" Nov 27 16:52:07 crc kubenswrapper[4707]: I1127 16:52:07.172819 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-0" Nov 27 16:52:07 crc kubenswrapper[4707]: I1127 16:52:07.206880 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f5cbab88-2254-4d29-ae0a-d2d571ab8775" path="/var/lib/kubelet/pods/f5cbab88-2254-4d29-ae0a-d2d571ab8775/volumes" Nov 27 16:52:07 crc kubenswrapper[4707]: I1127 16:52:07.623446 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-0"] Nov 27 16:52:08 crc kubenswrapper[4707]: I1127 16:52:08.483054 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"3794dd05-6c5a-43bc-8a93-b8b841b05bb2","Type":"ContainerStarted","Data":"60645f07d0c8b4508dae87a372c98e826e694278ce8b3af733a1c724b5c2cfb4"} Nov 27 16:52:09 crc kubenswrapper[4707]: I1127 16:52:09.492341 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"3794dd05-6c5a-43bc-8a93-b8b841b05bb2","Type":"ContainerStarted","Data":"bb35b0cd5c08a1055926597c936749172940028c77bee5451b3745d104ddee9c"} Nov 27 16:52:10 crc kubenswrapper[4707]: I1127 16:52:10.513101 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"3794dd05-6c5a-43bc-8a93-b8b841b05bb2","Type":"ContainerStarted","Data":"a0d426a992cc3358f4ccc3ed596463a566a7ba3b1e2b16bec89b5d273014e130"} Nov 27 16:52:12 crc kubenswrapper[4707]: I1127 16:52:12.074711 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-lw48z"] Nov 27 16:52:12 crc kubenswrapper[4707]: I1127 16:52:12.082071 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-lw48z" Nov 27 16:52:12 crc kubenswrapper[4707]: I1127 16:52:12.145677 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-lw48z"] Nov 27 16:52:12 crc kubenswrapper[4707]: I1127 16:52:12.148285 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4fxgb\" (UniqueName: \"kubernetes.io/projected/c12af6ed-c51f-4083-9abb-abc19aad705c-kube-api-access-4fxgb\") pod \"redhat-marketplace-lw48z\" (UID: \"c12af6ed-c51f-4083-9abb-abc19aad705c\") " pod="openshift-marketplace/redhat-marketplace-lw48z" Nov 27 16:52:12 crc kubenswrapper[4707]: I1127 16:52:12.148438 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c12af6ed-c51f-4083-9abb-abc19aad705c-catalog-content\") pod \"redhat-marketplace-lw48z\" (UID: \"c12af6ed-c51f-4083-9abb-abc19aad705c\") " pod="openshift-marketplace/redhat-marketplace-lw48z" Nov 27 16:52:12 crc kubenswrapper[4707]: I1127 16:52:12.148482 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c12af6ed-c51f-4083-9abb-abc19aad705c-utilities\") pod \"redhat-marketplace-lw48z\" (UID: \"c12af6ed-c51f-4083-9abb-abc19aad705c\") " pod="openshift-marketplace/redhat-marketplace-lw48z" Nov 27 16:52:12 crc kubenswrapper[4707]: I1127 16:52:12.249481 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4fxgb\" (UniqueName: \"kubernetes.io/projected/c12af6ed-c51f-4083-9abb-abc19aad705c-kube-api-access-4fxgb\") pod \"redhat-marketplace-lw48z\" (UID: \"c12af6ed-c51f-4083-9abb-abc19aad705c\") " pod="openshift-marketplace/redhat-marketplace-lw48z" Nov 27 16:52:12 crc kubenswrapper[4707]: I1127 16:52:12.249700 4707 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c12af6ed-c51f-4083-9abb-abc19aad705c-catalog-content\") pod \"redhat-marketplace-lw48z\" (UID: \"c12af6ed-c51f-4083-9abb-abc19aad705c\") " pod="openshift-marketplace/redhat-marketplace-lw48z" Nov 27 16:52:12 crc kubenswrapper[4707]: I1127 16:52:12.249747 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c12af6ed-c51f-4083-9abb-abc19aad705c-utilities\") pod \"redhat-marketplace-lw48z\" (UID: \"c12af6ed-c51f-4083-9abb-abc19aad705c\") " pod="openshift-marketplace/redhat-marketplace-lw48z" Nov 27 16:52:12 crc kubenswrapper[4707]: I1127 16:52:12.250247 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c12af6ed-c51f-4083-9abb-abc19aad705c-utilities\") pod \"redhat-marketplace-lw48z\" (UID: \"c12af6ed-c51f-4083-9abb-abc19aad705c\") " pod="openshift-marketplace/redhat-marketplace-lw48z" Nov 27 16:52:12 crc kubenswrapper[4707]: I1127 16:52:12.250766 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c12af6ed-c51f-4083-9abb-abc19aad705c-catalog-content\") pod \"redhat-marketplace-lw48z\" (UID: \"c12af6ed-c51f-4083-9abb-abc19aad705c\") " pod="openshift-marketplace/redhat-marketplace-lw48z" Nov 27 16:52:12 crc kubenswrapper[4707]: I1127 16:52:12.280493 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4fxgb\" (UniqueName: \"kubernetes.io/projected/c12af6ed-c51f-4083-9abb-abc19aad705c-kube-api-access-4fxgb\") pod \"redhat-marketplace-lw48z\" (UID: \"c12af6ed-c51f-4083-9abb-abc19aad705c\") " pod="openshift-marketplace/redhat-marketplace-lw48z" Nov 27 16:52:12 crc kubenswrapper[4707]: I1127 16:52:12.482414 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-lw48z" Nov 27 16:52:12 crc kubenswrapper[4707]: I1127 16:52:12.537841 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"3794dd05-6c5a-43bc-8a93-b8b841b05bb2","Type":"ContainerStarted","Data":"ebf927ed0f86322c1361753cfb6f6b16260a8bb532d7a757c81ee6aa66be08c4"} Nov 27 16:52:13 crc kubenswrapper[4707]: I1127 16:52:13.012833 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-lw48z"] Nov 27 16:52:13 crc kubenswrapper[4707]: W1127 16:52:13.018196 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc12af6ed_c51f_4083_9abb_abc19aad705c.slice/crio-33554b7559f4e38803c00d73b63b2c7adb05483a3728e28be4b475865792396d WatchSource:0}: Error finding container 33554b7559f4e38803c00d73b63b2c7adb05483a3728e28be4b475865792396d: Status 404 returned error can't find the container with id 33554b7559f4e38803c00d73b63b2c7adb05483a3728e28be4b475865792396d Nov 27 16:52:13 crc kubenswrapper[4707]: I1127 16:52:13.552404 4707 generic.go:334] "Generic (PLEG): container finished" podID="c12af6ed-c51f-4083-9abb-abc19aad705c" containerID="05070fa23da3a79134408939fddcaecbce7d2690108ce187458495f472002305" exitCode=0 Nov 27 16:52:13 crc kubenswrapper[4707]: I1127 16:52:13.552503 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lw48z" event={"ID":"c12af6ed-c51f-4083-9abb-abc19aad705c","Type":"ContainerDied","Data":"05070fa23da3a79134408939fddcaecbce7d2690108ce187458495f472002305"} Nov 27 16:52:13 crc kubenswrapper[4707]: I1127 16:52:13.552828 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lw48z" event={"ID":"c12af6ed-c51f-4083-9abb-abc19aad705c","Type":"ContainerStarted","Data":"33554b7559f4e38803c00d73b63b2c7adb05483a3728e28be4b475865792396d"} Nov 27 16:52:13 
crc kubenswrapper[4707]: I1127 16:52:13.560293 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"3794dd05-6c5a-43bc-8a93-b8b841b05bb2","Type":"ContainerStarted","Data":"05935015aee9ab3d8dd8bfe4476e1c035ac1531e8805f0838ce1c3cbb25f9d4b"} Nov 27 16:52:13 crc kubenswrapper[4707]: I1127 16:52:13.608117 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/aodh-0" podStartSLOduration=2.032947579 podStartE2EDuration="7.608100925s" podCreationTimestamp="2025-11-27 16:52:06 +0000 UTC" firstStartedPulling="2025-11-27 16:52:07.632243145 +0000 UTC m=+2903.263691953" lastFinishedPulling="2025-11-27 16:52:13.207396531 +0000 UTC m=+2908.838845299" observedRunningTime="2025-11-27 16:52:13.601411421 +0000 UTC m=+2909.232860179" watchObservedRunningTime="2025-11-27 16:52:13.608100925 +0000 UTC m=+2909.239549693" Nov 27 16:52:15 crc kubenswrapper[4707]: I1127 16:52:15.581702 4707 generic.go:334] "Generic (PLEG): container finished" podID="c12af6ed-c51f-4083-9abb-abc19aad705c" containerID="86f8219525f0ad19e60bcbcf5a68efbabed23d9cd780708fb17cc8ffa3d6a6cb" exitCode=0 Nov 27 16:52:15 crc kubenswrapper[4707]: I1127 16:52:15.581779 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lw48z" event={"ID":"c12af6ed-c51f-4083-9abb-abc19aad705c","Type":"ContainerDied","Data":"86f8219525f0ad19e60bcbcf5a68efbabed23d9cd780708fb17cc8ffa3d6a6cb"} Nov 27 16:52:17 crc kubenswrapper[4707]: I1127 16:52:17.607114 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lw48z" event={"ID":"c12af6ed-c51f-4083-9abb-abc19aad705c","Type":"ContainerStarted","Data":"00c222553b3c6797dfc8cc9ffaf152e0a1a75caabc1d46516a78d19b3b8de128"} Nov 27 16:52:17 crc kubenswrapper[4707]: I1127 16:52:17.630025 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-lw48z" 
podStartSLOduration=2.6465927430000002 podStartE2EDuration="5.629984781s" podCreationTimestamp="2025-11-27 16:52:12 +0000 UTC" firstStartedPulling="2025-11-27 16:52:13.556465479 +0000 UTC m=+2909.187914247" lastFinishedPulling="2025-11-27 16:52:16.539857487 +0000 UTC m=+2912.171306285" observedRunningTime="2025-11-27 16:52:17.62343784 +0000 UTC m=+2913.254886608" watchObservedRunningTime="2025-11-27 16:52:17.629984781 +0000 UTC m=+2913.261433549" Nov 27 16:52:22 crc kubenswrapper[4707]: I1127 16:52:22.483427 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-lw48z" Nov 27 16:52:22 crc kubenswrapper[4707]: I1127 16:52:22.484131 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-lw48z" Nov 27 16:52:22 crc kubenswrapper[4707]: I1127 16:52:22.564565 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-lw48z" Nov 27 16:52:22 crc kubenswrapper[4707]: I1127 16:52:22.739147 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-lw48z" Nov 27 16:52:22 crc kubenswrapper[4707]: I1127 16:52:22.809146 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-lw48z"] Nov 27 16:52:24 crc kubenswrapper[4707]: I1127 16:52:24.692887 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-lw48z" podUID="c12af6ed-c51f-4083-9abb-abc19aad705c" containerName="registry-server" containerID="cri-o://00c222553b3c6797dfc8cc9ffaf152e0a1a75caabc1d46516a78d19b3b8de128" gracePeriod=2 Nov 27 16:52:25 crc kubenswrapper[4707]: I1127 16:52:25.293796 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-lw48z" Nov 27 16:52:25 crc kubenswrapper[4707]: I1127 16:52:25.354793 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c12af6ed-c51f-4083-9abb-abc19aad705c-utilities\") pod \"c12af6ed-c51f-4083-9abb-abc19aad705c\" (UID: \"c12af6ed-c51f-4083-9abb-abc19aad705c\") " Nov 27 16:52:25 crc kubenswrapper[4707]: I1127 16:52:25.355105 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4fxgb\" (UniqueName: \"kubernetes.io/projected/c12af6ed-c51f-4083-9abb-abc19aad705c-kube-api-access-4fxgb\") pod \"c12af6ed-c51f-4083-9abb-abc19aad705c\" (UID: \"c12af6ed-c51f-4083-9abb-abc19aad705c\") " Nov 27 16:52:25 crc kubenswrapper[4707]: I1127 16:52:25.355276 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c12af6ed-c51f-4083-9abb-abc19aad705c-catalog-content\") pod \"c12af6ed-c51f-4083-9abb-abc19aad705c\" (UID: \"c12af6ed-c51f-4083-9abb-abc19aad705c\") " Nov 27 16:52:25 crc kubenswrapper[4707]: I1127 16:52:25.356412 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c12af6ed-c51f-4083-9abb-abc19aad705c-utilities" (OuterVolumeSpecName: "utilities") pod "c12af6ed-c51f-4083-9abb-abc19aad705c" (UID: "c12af6ed-c51f-4083-9abb-abc19aad705c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 27 16:52:25 crc kubenswrapper[4707]: I1127 16:52:25.364748 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c12af6ed-c51f-4083-9abb-abc19aad705c-kube-api-access-4fxgb" (OuterVolumeSpecName: "kube-api-access-4fxgb") pod "c12af6ed-c51f-4083-9abb-abc19aad705c" (UID: "c12af6ed-c51f-4083-9abb-abc19aad705c"). InnerVolumeSpecName "kube-api-access-4fxgb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 16:52:25 crc kubenswrapper[4707]: I1127 16:52:25.376568 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c12af6ed-c51f-4083-9abb-abc19aad705c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c12af6ed-c51f-4083-9abb-abc19aad705c" (UID: "c12af6ed-c51f-4083-9abb-abc19aad705c"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 27 16:52:25 crc kubenswrapper[4707]: I1127 16:52:25.459019 4707 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c12af6ed-c51f-4083-9abb-abc19aad705c-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 27 16:52:25 crc kubenswrapper[4707]: I1127 16:52:25.459083 4707 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c12af6ed-c51f-4083-9abb-abc19aad705c-utilities\") on node \"crc\" DevicePath \"\"" Nov 27 16:52:25 crc kubenswrapper[4707]: I1127 16:52:25.459116 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4fxgb\" (UniqueName: \"kubernetes.io/projected/c12af6ed-c51f-4083-9abb-abc19aad705c-kube-api-access-4fxgb\") on node \"crc\" DevicePath \"\"" Nov 27 16:52:25 crc kubenswrapper[4707]: I1127 16:52:25.717457 4707 generic.go:334] "Generic (PLEG): container finished" podID="c12af6ed-c51f-4083-9abb-abc19aad705c" containerID="00c222553b3c6797dfc8cc9ffaf152e0a1a75caabc1d46516a78d19b3b8de128" exitCode=0 Nov 27 16:52:25 crc kubenswrapper[4707]: I1127 16:52:25.717545 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lw48z" event={"ID":"c12af6ed-c51f-4083-9abb-abc19aad705c","Type":"ContainerDied","Data":"00c222553b3c6797dfc8cc9ffaf152e0a1a75caabc1d46516a78d19b3b8de128"} Nov 27 16:52:25 crc kubenswrapper[4707]: I1127 16:52:25.717570 4707 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-lw48z" Nov 27 16:52:25 crc kubenswrapper[4707]: I1127 16:52:25.717590 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lw48z" event={"ID":"c12af6ed-c51f-4083-9abb-abc19aad705c","Type":"ContainerDied","Data":"33554b7559f4e38803c00d73b63b2c7adb05483a3728e28be4b475865792396d"} Nov 27 16:52:25 crc kubenswrapper[4707]: I1127 16:52:25.717608 4707 scope.go:117] "RemoveContainer" containerID="00c222553b3c6797dfc8cc9ffaf152e0a1a75caabc1d46516a78d19b3b8de128" Nov 27 16:52:25 crc kubenswrapper[4707]: I1127 16:52:25.764581 4707 scope.go:117] "RemoveContainer" containerID="86f8219525f0ad19e60bcbcf5a68efbabed23d9cd780708fb17cc8ffa3d6a6cb" Nov 27 16:52:25 crc kubenswrapper[4707]: I1127 16:52:25.779476 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-lw48z"] Nov 27 16:52:25 crc kubenswrapper[4707]: I1127 16:52:25.790221 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-lw48z"] Nov 27 16:52:25 crc kubenswrapper[4707]: I1127 16:52:25.819605 4707 scope.go:117] "RemoveContainer" containerID="05070fa23da3a79134408939fddcaecbce7d2690108ce187458495f472002305" Nov 27 16:52:25 crc kubenswrapper[4707]: I1127 16:52:25.851223 4707 scope.go:117] "RemoveContainer" containerID="00c222553b3c6797dfc8cc9ffaf152e0a1a75caabc1d46516a78d19b3b8de128" Nov 27 16:52:25 crc kubenswrapper[4707]: E1127 16:52:25.851846 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"00c222553b3c6797dfc8cc9ffaf152e0a1a75caabc1d46516a78d19b3b8de128\": container with ID starting with 00c222553b3c6797dfc8cc9ffaf152e0a1a75caabc1d46516a78d19b3b8de128 not found: ID does not exist" containerID="00c222553b3c6797dfc8cc9ffaf152e0a1a75caabc1d46516a78d19b3b8de128" Nov 27 16:52:25 crc kubenswrapper[4707]: I1127 16:52:25.851881 4707 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"00c222553b3c6797dfc8cc9ffaf152e0a1a75caabc1d46516a78d19b3b8de128"} err="failed to get container status \"00c222553b3c6797dfc8cc9ffaf152e0a1a75caabc1d46516a78d19b3b8de128\": rpc error: code = NotFound desc = could not find container \"00c222553b3c6797dfc8cc9ffaf152e0a1a75caabc1d46516a78d19b3b8de128\": container with ID starting with 00c222553b3c6797dfc8cc9ffaf152e0a1a75caabc1d46516a78d19b3b8de128 not found: ID does not exist" Nov 27 16:52:25 crc kubenswrapper[4707]: I1127 16:52:25.851906 4707 scope.go:117] "RemoveContainer" containerID="86f8219525f0ad19e60bcbcf5a68efbabed23d9cd780708fb17cc8ffa3d6a6cb" Nov 27 16:52:25 crc kubenswrapper[4707]: E1127 16:52:25.852232 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"86f8219525f0ad19e60bcbcf5a68efbabed23d9cd780708fb17cc8ffa3d6a6cb\": container with ID starting with 86f8219525f0ad19e60bcbcf5a68efbabed23d9cd780708fb17cc8ffa3d6a6cb not found: ID does not exist" containerID="86f8219525f0ad19e60bcbcf5a68efbabed23d9cd780708fb17cc8ffa3d6a6cb" Nov 27 16:52:25 crc kubenswrapper[4707]: I1127 16:52:25.852263 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"86f8219525f0ad19e60bcbcf5a68efbabed23d9cd780708fb17cc8ffa3d6a6cb"} err="failed to get container status \"86f8219525f0ad19e60bcbcf5a68efbabed23d9cd780708fb17cc8ffa3d6a6cb\": rpc error: code = NotFound desc = could not find container \"86f8219525f0ad19e60bcbcf5a68efbabed23d9cd780708fb17cc8ffa3d6a6cb\": container with ID starting with 86f8219525f0ad19e60bcbcf5a68efbabed23d9cd780708fb17cc8ffa3d6a6cb not found: ID does not exist" Nov 27 16:52:25 crc kubenswrapper[4707]: I1127 16:52:25.852283 4707 scope.go:117] "RemoveContainer" containerID="05070fa23da3a79134408939fddcaecbce7d2690108ce187458495f472002305" Nov 27 16:52:25 crc kubenswrapper[4707]: E1127 
16:52:25.852656 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"05070fa23da3a79134408939fddcaecbce7d2690108ce187458495f472002305\": container with ID starting with 05070fa23da3a79134408939fddcaecbce7d2690108ce187458495f472002305 not found: ID does not exist" containerID="05070fa23da3a79134408939fddcaecbce7d2690108ce187458495f472002305" Nov 27 16:52:25 crc kubenswrapper[4707]: I1127 16:52:25.852733 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"05070fa23da3a79134408939fddcaecbce7d2690108ce187458495f472002305"} err="failed to get container status \"05070fa23da3a79134408939fddcaecbce7d2690108ce187458495f472002305\": rpc error: code = NotFound desc = could not find container \"05070fa23da3a79134408939fddcaecbce7d2690108ce187458495f472002305\": container with ID starting with 05070fa23da3a79134408939fddcaecbce7d2690108ce187458495f472002305 not found: ID does not exist" Nov 27 16:52:27 crc kubenswrapper[4707]: I1127 16:52:27.236618 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c12af6ed-c51f-4083-9abb-abc19aad705c" path="/var/lib/kubelet/pods/c12af6ed-c51f-4083-9abb-abc19aad705c/volumes" Nov 27 16:52:51 crc kubenswrapper[4707]: I1127 16:52:51.571434 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-2xdph"] Nov 27 16:52:51 crc kubenswrapper[4707]: E1127 16:52:51.573144 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c12af6ed-c51f-4083-9abb-abc19aad705c" containerName="extract-content" Nov 27 16:52:51 crc kubenswrapper[4707]: I1127 16:52:51.573185 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="c12af6ed-c51f-4083-9abb-abc19aad705c" containerName="extract-content" Nov 27 16:52:51 crc kubenswrapper[4707]: E1127 16:52:51.573250 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c12af6ed-c51f-4083-9abb-abc19aad705c" 
containerName="extract-utilities" Nov 27 16:52:51 crc kubenswrapper[4707]: I1127 16:52:51.573268 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="c12af6ed-c51f-4083-9abb-abc19aad705c" containerName="extract-utilities" Nov 27 16:52:51 crc kubenswrapper[4707]: E1127 16:52:51.573352 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c12af6ed-c51f-4083-9abb-abc19aad705c" containerName="registry-server" Nov 27 16:52:51 crc kubenswrapper[4707]: I1127 16:52:51.573406 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="c12af6ed-c51f-4083-9abb-abc19aad705c" containerName="registry-server" Nov 27 16:52:51 crc kubenswrapper[4707]: I1127 16:52:51.573916 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="c12af6ed-c51f-4083-9abb-abc19aad705c" containerName="registry-server" Nov 27 16:52:51 crc kubenswrapper[4707]: I1127 16:52:51.577222 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-2xdph" Nov 27 16:52:51 crc kubenswrapper[4707]: I1127 16:52:51.580783 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-2xdph"] Nov 27 16:52:51 crc kubenswrapper[4707]: I1127 16:52:51.767960 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e9acc030-1a08-452d-be90-67dc79de3a26-catalog-content\") pod \"redhat-operators-2xdph\" (UID: \"e9acc030-1a08-452d-be90-67dc79de3a26\") " pod="openshift-marketplace/redhat-operators-2xdph" Nov 27 16:52:51 crc kubenswrapper[4707]: I1127 16:52:51.768019 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4dn6j\" (UniqueName: \"kubernetes.io/projected/e9acc030-1a08-452d-be90-67dc79de3a26-kube-api-access-4dn6j\") pod \"redhat-operators-2xdph\" (UID: \"e9acc030-1a08-452d-be90-67dc79de3a26\") " 
pod="openshift-marketplace/redhat-operators-2xdph" Nov 27 16:52:51 crc kubenswrapper[4707]: I1127 16:52:51.768098 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e9acc030-1a08-452d-be90-67dc79de3a26-utilities\") pod \"redhat-operators-2xdph\" (UID: \"e9acc030-1a08-452d-be90-67dc79de3a26\") " pod="openshift-marketplace/redhat-operators-2xdph" Nov 27 16:52:51 crc kubenswrapper[4707]: I1127 16:52:51.870566 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e9acc030-1a08-452d-be90-67dc79de3a26-utilities\") pod \"redhat-operators-2xdph\" (UID: \"e9acc030-1a08-452d-be90-67dc79de3a26\") " pod="openshift-marketplace/redhat-operators-2xdph" Nov 27 16:52:51 crc kubenswrapper[4707]: I1127 16:52:51.870742 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e9acc030-1a08-452d-be90-67dc79de3a26-catalog-content\") pod \"redhat-operators-2xdph\" (UID: \"e9acc030-1a08-452d-be90-67dc79de3a26\") " pod="openshift-marketplace/redhat-operators-2xdph" Nov 27 16:52:51 crc kubenswrapper[4707]: I1127 16:52:51.870784 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4dn6j\" (UniqueName: \"kubernetes.io/projected/e9acc030-1a08-452d-be90-67dc79de3a26-kube-api-access-4dn6j\") pod \"redhat-operators-2xdph\" (UID: \"e9acc030-1a08-452d-be90-67dc79de3a26\") " pod="openshift-marketplace/redhat-operators-2xdph" Nov 27 16:52:51 crc kubenswrapper[4707]: I1127 16:52:51.871450 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e9acc030-1a08-452d-be90-67dc79de3a26-utilities\") pod \"redhat-operators-2xdph\" (UID: \"e9acc030-1a08-452d-be90-67dc79de3a26\") " pod="openshift-marketplace/redhat-operators-2xdph" Nov 27 
16:52:51 crc kubenswrapper[4707]: I1127 16:52:51.871472 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e9acc030-1a08-452d-be90-67dc79de3a26-catalog-content\") pod \"redhat-operators-2xdph\" (UID: \"e9acc030-1a08-452d-be90-67dc79de3a26\") " pod="openshift-marketplace/redhat-operators-2xdph" Nov 27 16:52:51 crc kubenswrapper[4707]: I1127 16:52:51.891364 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4dn6j\" (UniqueName: \"kubernetes.io/projected/e9acc030-1a08-452d-be90-67dc79de3a26-kube-api-access-4dn6j\") pod \"redhat-operators-2xdph\" (UID: \"e9acc030-1a08-452d-be90-67dc79de3a26\") " pod="openshift-marketplace/redhat-operators-2xdph" Nov 27 16:52:51 crc kubenswrapper[4707]: I1127 16:52:51.902240 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-2xdph" Nov 27 16:52:52 crc kubenswrapper[4707]: I1127 16:52:52.387257 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-2xdph"] Nov 27 16:52:52 crc kubenswrapper[4707]: I1127 16:52:52.991475 4707 generic.go:334] "Generic (PLEG): container finished" podID="e9acc030-1a08-452d-be90-67dc79de3a26" containerID="91f9e248a4093df7c0d7bd7d098ed81eb1b7e40b6819ce58103e80567748d2e8" exitCode=0 Nov 27 16:52:52 crc kubenswrapper[4707]: I1127 16:52:52.991549 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2xdph" event={"ID":"e9acc030-1a08-452d-be90-67dc79de3a26","Type":"ContainerDied","Data":"91f9e248a4093df7c0d7bd7d098ed81eb1b7e40b6819ce58103e80567748d2e8"} Nov 27 16:52:52 crc kubenswrapper[4707]: I1127 16:52:52.991779 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2xdph" 
event={"ID":"e9acc030-1a08-452d-be90-67dc79de3a26","Type":"ContainerStarted","Data":"f48eeb479ffc8692e8c86dae635dc6ac91bcb63cfd089b3b462ea32eb10c1b04"} Nov 27 16:52:55 crc kubenswrapper[4707]: I1127 16:52:55.011092 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2xdph" event={"ID":"e9acc030-1a08-452d-be90-67dc79de3a26","Type":"ContainerStarted","Data":"0cb166b187380657a960eeb19bac0fa58f295212097d997afa1f974cbd6678f4"} Nov 27 16:52:56 crc kubenswrapper[4707]: I1127 16:52:56.032019 4707 generic.go:334] "Generic (PLEG): container finished" podID="e9acc030-1a08-452d-be90-67dc79de3a26" containerID="0cb166b187380657a960eeb19bac0fa58f295212097d997afa1f974cbd6678f4" exitCode=0 Nov 27 16:52:56 crc kubenswrapper[4707]: I1127 16:52:56.032459 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2xdph" event={"ID":"e9acc030-1a08-452d-be90-67dc79de3a26","Type":"ContainerDied","Data":"0cb166b187380657a960eeb19bac0fa58f295212097d997afa1f974cbd6678f4"} Nov 27 16:53:01 crc kubenswrapper[4707]: I1127 16:53:01.089644 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2xdph" event={"ID":"e9acc030-1a08-452d-be90-67dc79de3a26","Type":"ContainerStarted","Data":"3014672766d8d64387c0bfbcdeb80865d419f13b26f62c1c520bcc4835a2fd04"} Nov 27 16:53:01 crc kubenswrapper[4707]: I1127 16:53:01.119953 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-2xdph" podStartSLOduration=2.585888546 podStartE2EDuration="10.119932673s" podCreationTimestamp="2025-11-27 16:52:51 +0000 UTC" firstStartedPulling="2025-11-27 16:52:52.994109378 +0000 UTC m=+2948.625558156" lastFinishedPulling="2025-11-27 16:53:00.528153515 +0000 UTC m=+2956.159602283" observedRunningTime="2025-11-27 16:53:01.112104131 +0000 UTC m=+2956.743552899" watchObservedRunningTime="2025-11-27 16:53:01.119932673 +0000 UTC 
m=+2956.751381441" Nov 27 16:53:01 crc kubenswrapper[4707]: I1127 16:53:01.903280 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-2xdph" Nov 27 16:53:01 crc kubenswrapper[4707]: I1127 16:53:01.903322 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-2xdph" Nov 27 16:53:02 crc kubenswrapper[4707]: I1127 16:53:02.960867 4707 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-2xdph" podUID="e9acc030-1a08-452d-be90-67dc79de3a26" containerName="registry-server" probeResult="failure" output=< Nov 27 16:53:02 crc kubenswrapper[4707]: timeout: failed to connect service ":50051" within 1s Nov 27 16:53:02 crc kubenswrapper[4707]: > Nov 27 16:53:03 crc kubenswrapper[4707]: I1127 16:53:03.623757 4707 patch_prober.go:28] interesting pod/machine-config-daemon-c995m container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 27 16:53:03 crc kubenswrapper[4707]: I1127 16:53:03.623818 4707 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-c995m" podUID="a83beb0d-8dd1-434a-ace2-933f98e3956f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 27 16:53:11 crc kubenswrapper[4707]: I1127 16:53:11.964245 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-2xdph" Nov 27 16:53:12 crc kubenswrapper[4707]: I1127 16:53:12.015419 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-2xdph" Nov 27 16:53:12 crc kubenswrapper[4707]: I1127 16:53:12.201362 4707 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-2xdph"] Nov 27 16:53:13 crc kubenswrapper[4707]: I1127 16:53:13.227977 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-2xdph" podUID="e9acc030-1a08-452d-be90-67dc79de3a26" containerName="registry-server" containerID="cri-o://3014672766d8d64387c0bfbcdeb80865d419f13b26f62c1c520bcc4835a2fd04" gracePeriod=2 Nov 27 16:53:14 crc kubenswrapper[4707]: I1127 16:53:14.238840 4707 generic.go:334] "Generic (PLEG): container finished" podID="e9acc030-1a08-452d-be90-67dc79de3a26" containerID="3014672766d8d64387c0bfbcdeb80865d419f13b26f62c1c520bcc4835a2fd04" exitCode=0 Nov 27 16:53:14 crc kubenswrapper[4707]: I1127 16:53:14.238913 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2xdph" event={"ID":"e9acc030-1a08-452d-be90-67dc79de3a26","Type":"ContainerDied","Data":"3014672766d8d64387c0bfbcdeb80865d419f13b26f62c1c520bcc4835a2fd04"} Nov 27 16:53:14 crc kubenswrapper[4707]: I1127 16:53:14.348149 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-2xdph" Nov 27 16:53:14 crc kubenswrapper[4707]: I1127 16:53:14.479997 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4dn6j\" (UniqueName: \"kubernetes.io/projected/e9acc030-1a08-452d-be90-67dc79de3a26-kube-api-access-4dn6j\") pod \"e9acc030-1a08-452d-be90-67dc79de3a26\" (UID: \"e9acc030-1a08-452d-be90-67dc79de3a26\") " Nov 27 16:53:14 crc kubenswrapper[4707]: I1127 16:53:14.480105 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e9acc030-1a08-452d-be90-67dc79de3a26-catalog-content\") pod \"e9acc030-1a08-452d-be90-67dc79de3a26\" (UID: \"e9acc030-1a08-452d-be90-67dc79de3a26\") " Nov 27 16:53:14 crc kubenswrapper[4707]: I1127 16:53:14.480160 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e9acc030-1a08-452d-be90-67dc79de3a26-utilities\") pod \"e9acc030-1a08-452d-be90-67dc79de3a26\" (UID: \"e9acc030-1a08-452d-be90-67dc79de3a26\") " Nov 27 16:53:14 crc kubenswrapper[4707]: I1127 16:53:14.481654 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e9acc030-1a08-452d-be90-67dc79de3a26-utilities" (OuterVolumeSpecName: "utilities") pod "e9acc030-1a08-452d-be90-67dc79de3a26" (UID: "e9acc030-1a08-452d-be90-67dc79de3a26"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 27 16:53:14 crc kubenswrapper[4707]: I1127 16:53:14.486253 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e9acc030-1a08-452d-be90-67dc79de3a26-kube-api-access-4dn6j" (OuterVolumeSpecName: "kube-api-access-4dn6j") pod "e9acc030-1a08-452d-be90-67dc79de3a26" (UID: "e9acc030-1a08-452d-be90-67dc79de3a26"). InnerVolumeSpecName "kube-api-access-4dn6j". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 16:53:14 crc kubenswrapper[4707]: I1127 16:53:14.584558 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4dn6j\" (UniqueName: \"kubernetes.io/projected/e9acc030-1a08-452d-be90-67dc79de3a26-kube-api-access-4dn6j\") on node \"crc\" DevicePath \"\"" Nov 27 16:53:14 crc kubenswrapper[4707]: I1127 16:53:14.584596 4707 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e9acc030-1a08-452d-be90-67dc79de3a26-utilities\") on node \"crc\" DevicePath \"\"" Nov 27 16:53:14 crc kubenswrapper[4707]: I1127 16:53:14.593770 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e9acc030-1a08-452d-be90-67dc79de3a26-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e9acc030-1a08-452d-be90-67dc79de3a26" (UID: "e9acc030-1a08-452d-be90-67dc79de3a26"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 27 16:53:14 crc kubenswrapper[4707]: I1127 16:53:14.687458 4707 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e9acc030-1a08-452d-be90-67dc79de3a26-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 27 16:53:15 crc kubenswrapper[4707]: I1127 16:53:15.253423 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2xdph" event={"ID":"e9acc030-1a08-452d-be90-67dc79de3a26","Type":"ContainerDied","Data":"f48eeb479ffc8692e8c86dae635dc6ac91bcb63cfd089b3b462ea32eb10c1b04"} Nov 27 16:53:15 crc kubenswrapper[4707]: I1127 16:53:15.253475 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-2xdph" Nov 27 16:53:15 crc kubenswrapper[4707]: I1127 16:53:15.253492 4707 scope.go:117] "RemoveContainer" containerID="3014672766d8d64387c0bfbcdeb80865d419f13b26f62c1c520bcc4835a2fd04" Nov 27 16:53:15 crc kubenswrapper[4707]: I1127 16:53:15.287556 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-2xdph"] Nov 27 16:53:15 crc kubenswrapper[4707]: I1127 16:53:15.289174 4707 scope.go:117] "RemoveContainer" containerID="0cb166b187380657a960eeb19bac0fa58f295212097d997afa1f974cbd6678f4" Nov 27 16:53:15 crc kubenswrapper[4707]: I1127 16:53:15.297049 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-2xdph"] Nov 27 16:53:15 crc kubenswrapper[4707]: I1127 16:53:15.331949 4707 scope.go:117] "RemoveContainer" containerID="91f9e248a4093df7c0d7bd7d098ed81eb1b7e40b6819ce58103e80567748d2e8" Nov 27 16:53:17 crc kubenswrapper[4707]: I1127 16:53:17.209818 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e9acc030-1a08-452d-be90-67dc79de3a26" path="/var/lib/kubelet/pods/e9acc030-1a08-452d-be90-67dc79de3a26/volumes" Nov 27 16:53:33 crc kubenswrapper[4707]: I1127 16:53:33.624181 4707 patch_prober.go:28] interesting pod/machine-config-daemon-c995m container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 27 16:53:33 crc kubenswrapper[4707]: I1127 16:53:33.624630 4707 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-c995m" podUID="a83beb0d-8dd1-434a-ace2-933f98e3956f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 27 16:54:03 crc kubenswrapper[4707]: I1127 
16:54:03.624152 4707 patch_prober.go:28] interesting pod/machine-config-daemon-c995m container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 27 16:54:03 crc kubenswrapper[4707]: I1127 16:54:03.625598 4707 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-c995m" podUID="a83beb0d-8dd1-434a-ace2-933f98e3956f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 27 16:54:03 crc kubenswrapper[4707]: I1127 16:54:03.625950 4707 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-c995m" Nov 27 16:54:03 crc kubenswrapper[4707]: I1127 16:54:03.628130 4707 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"922f6f006f59568ee3b2de9c7061fa21de9cccddc36312f2d9ef753b4834d084"} pod="openshift-machine-config-operator/machine-config-daemon-c995m" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Nov 27 16:54:03 crc kubenswrapper[4707]: I1127 16:54:03.628291 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-c995m" podUID="a83beb0d-8dd1-434a-ace2-933f98e3956f" containerName="machine-config-daemon" containerID="cri-o://922f6f006f59568ee3b2de9c7061fa21de9cccddc36312f2d9ef753b4834d084" gracePeriod=600 Nov 27 16:54:03 crc kubenswrapper[4707]: E1127 16:54:03.769781 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-c995m_openshift-machine-config-operator(a83beb0d-8dd1-434a-ace2-933f98e3956f)\"" pod="openshift-machine-config-operator/machine-config-daemon-c995m" podUID="a83beb0d-8dd1-434a-ace2-933f98e3956f" Nov 27 16:54:04 crc kubenswrapper[4707]: I1127 16:54:04.767606 4707 generic.go:334] "Generic (PLEG): container finished" podID="a83beb0d-8dd1-434a-ace2-933f98e3956f" containerID="922f6f006f59568ee3b2de9c7061fa21de9cccddc36312f2d9ef753b4834d084" exitCode=0 Nov 27 16:54:04 crc kubenswrapper[4707]: I1127 16:54:04.767678 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-c995m" event={"ID":"a83beb0d-8dd1-434a-ace2-933f98e3956f","Type":"ContainerDied","Data":"922f6f006f59568ee3b2de9c7061fa21de9cccddc36312f2d9ef753b4834d084"} Nov 27 16:54:04 crc kubenswrapper[4707]: I1127 16:54:04.767752 4707 scope.go:117] "RemoveContainer" containerID="e81264f6a6a3db0f1a2727a11a97e72a9ebad7ddc0f52e59880a37155bb6a287" Nov 27 16:54:04 crc kubenswrapper[4707]: I1127 16:54:04.768655 4707 scope.go:117] "RemoveContainer" containerID="922f6f006f59568ee3b2de9c7061fa21de9cccddc36312f2d9ef753b4834d084" Nov 27 16:54:04 crc kubenswrapper[4707]: E1127 16:54:04.769306 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-c995m_openshift-machine-config-operator(a83beb0d-8dd1-434a-ace2-933f98e3956f)\"" pod="openshift-machine-config-operator/machine-config-daemon-c995m" podUID="a83beb0d-8dd1-434a-ace2-933f98e3956f" Nov 27 16:54:18 crc kubenswrapper[4707]: I1127 16:54:18.194821 4707 scope.go:117] "RemoveContainer" containerID="922f6f006f59568ee3b2de9c7061fa21de9cccddc36312f2d9ef753b4834d084" Nov 27 16:54:18 crc kubenswrapper[4707]: E1127 16:54:18.195556 4707 pod_workers.go:1301] "Error syncing pod, skipping" 
err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-c995m_openshift-machine-config-operator(a83beb0d-8dd1-434a-ace2-933f98e3956f)\"" pod="openshift-machine-config-operator/machine-config-daemon-c995m" podUID="a83beb0d-8dd1-434a-ace2-933f98e3956f" Nov 27 16:54:20 crc kubenswrapper[4707]: I1127 16:54:20.339299 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-dr2rg"] Nov 27 16:54:20 crc kubenswrapper[4707]: E1127 16:54:20.340526 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e9acc030-1a08-452d-be90-67dc79de3a26" containerName="extract-content" Nov 27 16:54:20 crc kubenswrapper[4707]: I1127 16:54:20.340553 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="e9acc030-1a08-452d-be90-67dc79de3a26" containerName="extract-content" Nov 27 16:54:20 crc kubenswrapper[4707]: E1127 16:54:20.340646 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e9acc030-1a08-452d-be90-67dc79de3a26" containerName="extract-utilities" Nov 27 16:54:20 crc kubenswrapper[4707]: I1127 16:54:20.340667 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="e9acc030-1a08-452d-be90-67dc79de3a26" containerName="extract-utilities" Nov 27 16:54:20 crc kubenswrapper[4707]: E1127 16:54:20.340697 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e9acc030-1a08-452d-be90-67dc79de3a26" containerName="registry-server" Nov 27 16:54:20 crc kubenswrapper[4707]: I1127 16:54:20.340716 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="e9acc030-1a08-452d-be90-67dc79de3a26" containerName="registry-server" Nov 27 16:54:20 crc kubenswrapper[4707]: I1127 16:54:20.341073 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="e9acc030-1a08-452d-be90-67dc79de3a26" containerName="registry-server" Nov 27 16:54:20 crc kubenswrapper[4707]: I1127 
16:54:20.343992 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-dr2rg" Nov 27 16:54:20 crc kubenswrapper[4707]: I1127 16:54:20.349938 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-dr2rg"] Nov 27 16:54:20 crc kubenswrapper[4707]: I1127 16:54:20.431932 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e6943c0c-9556-4d59-b668-8c1be1a6a349-catalog-content\") pod \"community-operators-dr2rg\" (UID: \"e6943c0c-9556-4d59-b668-8c1be1a6a349\") " pod="openshift-marketplace/community-operators-dr2rg" Nov 27 16:54:20 crc kubenswrapper[4707]: I1127 16:54:20.432160 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fnvb5\" (UniqueName: \"kubernetes.io/projected/e6943c0c-9556-4d59-b668-8c1be1a6a349-kube-api-access-fnvb5\") pod \"community-operators-dr2rg\" (UID: \"e6943c0c-9556-4d59-b668-8c1be1a6a349\") " pod="openshift-marketplace/community-operators-dr2rg" Nov 27 16:54:20 crc kubenswrapper[4707]: I1127 16:54:20.432181 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e6943c0c-9556-4d59-b668-8c1be1a6a349-utilities\") pod \"community-operators-dr2rg\" (UID: \"e6943c0c-9556-4d59-b668-8c1be1a6a349\") " pod="openshift-marketplace/community-operators-dr2rg" Nov 27 16:54:20 crc kubenswrapper[4707]: I1127 16:54:20.533355 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fnvb5\" (UniqueName: \"kubernetes.io/projected/e6943c0c-9556-4d59-b668-8c1be1a6a349-kube-api-access-fnvb5\") pod \"community-operators-dr2rg\" (UID: \"e6943c0c-9556-4d59-b668-8c1be1a6a349\") " pod="openshift-marketplace/community-operators-dr2rg" Nov 27 16:54:20 crc 
kubenswrapper[4707]: I1127 16:54:20.533417 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e6943c0c-9556-4d59-b668-8c1be1a6a349-utilities\") pod \"community-operators-dr2rg\" (UID: \"e6943c0c-9556-4d59-b668-8c1be1a6a349\") " pod="openshift-marketplace/community-operators-dr2rg" Nov 27 16:54:20 crc kubenswrapper[4707]: I1127 16:54:20.533455 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e6943c0c-9556-4d59-b668-8c1be1a6a349-catalog-content\") pod \"community-operators-dr2rg\" (UID: \"e6943c0c-9556-4d59-b668-8c1be1a6a349\") " pod="openshift-marketplace/community-operators-dr2rg" Nov 27 16:54:20 crc kubenswrapper[4707]: I1127 16:54:20.534040 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e6943c0c-9556-4d59-b668-8c1be1a6a349-catalog-content\") pod \"community-operators-dr2rg\" (UID: \"e6943c0c-9556-4d59-b668-8c1be1a6a349\") " pod="openshift-marketplace/community-operators-dr2rg" Nov 27 16:54:20 crc kubenswrapper[4707]: I1127 16:54:20.534041 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e6943c0c-9556-4d59-b668-8c1be1a6a349-utilities\") pod \"community-operators-dr2rg\" (UID: \"e6943c0c-9556-4d59-b668-8c1be1a6a349\") " pod="openshift-marketplace/community-operators-dr2rg" Nov 27 16:54:20 crc kubenswrapper[4707]: I1127 16:54:20.551226 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fnvb5\" (UniqueName: \"kubernetes.io/projected/e6943c0c-9556-4d59-b668-8c1be1a6a349-kube-api-access-fnvb5\") pod \"community-operators-dr2rg\" (UID: \"e6943c0c-9556-4d59-b668-8c1be1a6a349\") " pod="openshift-marketplace/community-operators-dr2rg" Nov 27 16:54:20 crc kubenswrapper[4707]: I1127 16:54:20.681476 4707 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-dr2rg" Nov 27 16:54:21 crc kubenswrapper[4707]: I1127 16:54:21.189670 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-dr2rg"] Nov 27 16:54:21 crc kubenswrapper[4707]: I1127 16:54:21.980771 4707 generic.go:334] "Generic (PLEG): container finished" podID="e6943c0c-9556-4d59-b668-8c1be1a6a349" containerID="9d0cc3d91e38329230c66de2568d55fdc3ccc72a19aa242119e8db2bc35f0ecf" exitCode=0 Nov 27 16:54:21 crc kubenswrapper[4707]: I1127 16:54:21.980836 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-dr2rg" event={"ID":"e6943c0c-9556-4d59-b668-8c1be1a6a349","Type":"ContainerDied","Data":"9d0cc3d91e38329230c66de2568d55fdc3ccc72a19aa242119e8db2bc35f0ecf"} Nov 27 16:54:21 crc kubenswrapper[4707]: I1127 16:54:21.981149 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-dr2rg" event={"ID":"e6943c0c-9556-4d59-b668-8c1be1a6a349","Type":"ContainerStarted","Data":"845a1256c7032bfa6c33f353ca1415c714112688f7cfdacc3fd2507cdb8c86d3"} Nov 27 16:54:24 crc kubenswrapper[4707]: I1127 16:54:24.007098 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-dr2rg" event={"ID":"e6943c0c-9556-4d59-b668-8c1be1a6a349","Type":"ContainerStarted","Data":"cf33ad7f0ee7a326bf563c68c421a8f67e1788c7502ee7208a7e055ed6402848"} Nov 27 16:54:25 crc kubenswrapper[4707]: I1127 16:54:25.022885 4707 generic.go:334] "Generic (PLEG): container finished" podID="e6943c0c-9556-4d59-b668-8c1be1a6a349" containerID="cf33ad7f0ee7a326bf563c68c421a8f67e1788c7502ee7208a7e055ed6402848" exitCode=0 Nov 27 16:54:25 crc kubenswrapper[4707]: I1127 16:54:25.022960 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-dr2rg" 
event={"ID":"e6943c0c-9556-4d59-b668-8c1be1a6a349","Type":"ContainerDied","Data":"cf33ad7f0ee7a326bf563c68c421a8f67e1788c7502ee7208a7e055ed6402848"} Nov 27 16:54:27 crc kubenswrapper[4707]: I1127 16:54:27.045701 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-dr2rg" event={"ID":"e6943c0c-9556-4d59-b668-8c1be1a6a349","Type":"ContainerStarted","Data":"533a9803d9114d3c2203e8c172904876b04290be5f85fcf8bd99bbf892e60b7d"} Nov 27 16:54:27 crc kubenswrapper[4707]: I1127 16:54:27.065561 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-dr2rg" podStartSLOduration=3.086835484 podStartE2EDuration="7.065540823s" podCreationTimestamp="2025-11-27 16:54:20 +0000 UTC" firstStartedPulling="2025-11-27 16:54:21.983507165 +0000 UTC m=+3037.614955933" lastFinishedPulling="2025-11-27 16:54:25.962212514 +0000 UTC m=+3041.593661272" observedRunningTime="2025-11-27 16:54:27.062244562 +0000 UTC m=+3042.693693340" watchObservedRunningTime="2025-11-27 16:54:27.065540823 +0000 UTC m=+3042.696989601" Nov 27 16:54:30 crc kubenswrapper[4707]: I1127 16:54:30.682545 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-dr2rg" Nov 27 16:54:30 crc kubenswrapper[4707]: I1127 16:54:30.682877 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-dr2rg" Nov 27 16:54:30 crc kubenswrapper[4707]: I1127 16:54:30.760849 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-dr2rg" Nov 27 16:54:31 crc kubenswrapper[4707]: I1127 16:54:31.180117 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-dr2rg" Nov 27 16:54:31 crc kubenswrapper[4707]: I1127 16:54:31.255315 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/community-operators-dr2rg"] Nov 27 16:54:33 crc kubenswrapper[4707]: I1127 16:54:33.116082 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-dr2rg" podUID="e6943c0c-9556-4d59-b668-8c1be1a6a349" containerName="registry-server" containerID="cri-o://533a9803d9114d3c2203e8c172904876b04290be5f85fcf8bd99bbf892e60b7d" gracePeriod=2 Nov 27 16:54:33 crc kubenswrapper[4707]: I1127 16:54:33.195741 4707 scope.go:117] "RemoveContainer" containerID="922f6f006f59568ee3b2de9c7061fa21de9cccddc36312f2d9ef753b4834d084" Nov 27 16:54:33 crc kubenswrapper[4707]: E1127 16:54:33.196356 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-c995m_openshift-machine-config-operator(a83beb0d-8dd1-434a-ace2-933f98e3956f)\"" pod="openshift-machine-config-operator/machine-config-daemon-c995m" podUID="a83beb0d-8dd1-434a-ace2-933f98e3956f" Nov 27 16:54:33 crc kubenswrapper[4707]: I1127 16:54:33.591216 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-dr2rg" Nov 27 16:54:33 crc kubenswrapper[4707]: I1127 16:54:33.723238 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e6943c0c-9556-4d59-b668-8c1be1a6a349-utilities\") pod \"e6943c0c-9556-4d59-b668-8c1be1a6a349\" (UID: \"e6943c0c-9556-4d59-b668-8c1be1a6a349\") " Nov 27 16:54:33 crc kubenswrapper[4707]: I1127 16:54:33.724735 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fnvb5\" (UniqueName: \"kubernetes.io/projected/e6943c0c-9556-4d59-b668-8c1be1a6a349-kube-api-access-fnvb5\") pod \"e6943c0c-9556-4d59-b668-8c1be1a6a349\" (UID: \"e6943c0c-9556-4d59-b668-8c1be1a6a349\") " Nov 27 16:54:33 crc kubenswrapper[4707]: I1127 16:54:33.725074 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e6943c0c-9556-4d59-b668-8c1be1a6a349-utilities" (OuterVolumeSpecName: "utilities") pod "e6943c0c-9556-4d59-b668-8c1be1a6a349" (UID: "e6943c0c-9556-4d59-b668-8c1be1a6a349"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 27 16:54:33 crc kubenswrapper[4707]: I1127 16:54:33.725316 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e6943c0c-9556-4d59-b668-8c1be1a6a349-catalog-content\") pod \"e6943c0c-9556-4d59-b668-8c1be1a6a349\" (UID: \"e6943c0c-9556-4d59-b668-8c1be1a6a349\") " Nov 27 16:54:33 crc kubenswrapper[4707]: I1127 16:54:33.726435 4707 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e6943c0c-9556-4d59-b668-8c1be1a6a349-utilities\") on node \"crc\" DevicePath \"\"" Nov 27 16:54:33 crc kubenswrapper[4707]: I1127 16:54:33.732059 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e6943c0c-9556-4d59-b668-8c1be1a6a349-kube-api-access-fnvb5" (OuterVolumeSpecName: "kube-api-access-fnvb5") pod "e6943c0c-9556-4d59-b668-8c1be1a6a349" (UID: "e6943c0c-9556-4d59-b668-8c1be1a6a349"). InnerVolumeSpecName "kube-api-access-fnvb5". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 16:54:33 crc kubenswrapper[4707]: I1127 16:54:33.795990 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e6943c0c-9556-4d59-b668-8c1be1a6a349-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e6943c0c-9556-4d59-b668-8c1be1a6a349" (UID: "e6943c0c-9556-4d59-b668-8c1be1a6a349"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 27 16:54:33 crc kubenswrapper[4707]: I1127 16:54:33.829377 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fnvb5\" (UniqueName: \"kubernetes.io/projected/e6943c0c-9556-4d59-b668-8c1be1a6a349-kube-api-access-fnvb5\") on node \"crc\" DevicePath \"\"" Nov 27 16:54:33 crc kubenswrapper[4707]: I1127 16:54:33.829411 4707 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e6943c0c-9556-4d59-b668-8c1be1a6a349-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 27 16:54:34 crc kubenswrapper[4707]: I1127 16:54:34.133887 4707 generic.go:334] "Generic (PLEG): container finished" podID="e6943c0c-9556-4d59-b668-8c1be1a6a349" containerID="533a9803d9114d3c2203e8c172904876b04290be5f85fcf8bd99bbf892e60b7d" exitCode=0 Nov 27 16:54:34 crc kubenswrapper[4707]: I1127 16:54:34.134013 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-dr2rg" Nov 27 16:54:34 crc kubenswrapper[4707]: I1127 16:54:34.134036 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-dr2rg" event={"ID":"e6943c0c-9556-4d59-b668-8c1be1a6a349","Type":"ContainerDied","Data":"533a9803d9114d3c2203e8c172904876b04290be5f85fcf8bd99bbf892e60b7d"} Nov 27 16:54:34 crc kubenswrapper[4707]: I1127 16:54:34.135631 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-dr2rg" event={"ID":"e6943c0c-9556-4d59-b668-8c1be1a6a349","Type":"ContainerDied","Data":"845a1256c7032bfa6c33f353ca1415c714112688f7cfdacc3fd2507cdb8c86d3"} Nov 27 16:54:34 crc kubenswrapper[4707]: I1127 16:54:34.135674 4707 scope.go:117] "RemoveContainer" containerID="533a9803d9114d3c2203e8c172904876b04290be5f85fcf8bd99bbf892e60b7d" Nov 27 16:54:34 crc kubenswrapper[4707]: I1127 16:54:34.177876 4707 scope.go:117] "RemoveContainer" 
containerID="cf33ad7f0ee7a326bf563c68c421a8f67e1788c7502ee7208a7e055ed6402848" Nov 27 16:54:34 crc kubenswrapper[4707]: I1127 16:54:34.190315 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-dr2rg"] Nov 27 16:54:34 crc kubenswrapper[4707]: I1127 16:54:34.201893 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-dr2rg"] Nov 27 16:54:34 crc kubenswrapper[4707]: I1127 16:54:34.214087 4707 scope.go:117] "RemoveContainer" containerID="9d0cc3d91e38329230c66de2568d55fdc3ccc72a19aa242119e8db2bc35f0ecf" Nov 27 16:54:34 crc kubenswrapper[4707]: I1127 16:54:34.291037 4707 scope.go:117] "RemoveContainer" containerID="533a9803d9114d3c2203e8c172904876b04290be5f85fcf8bd99bbf892e60b7d" Nov 27 16:54:34 crc kubenswrapper[4707]: E1127 16:54:34.291885 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"533a9803d9114d3c2203e8c172904876b04290be5f85fcf8bd99bbf892e60b7d\": container with ID starting with 533a9803d9114d3c2203e8c172904876b04290be5f85fcf8bd99bbf892e60b7d not found: ID does not exist" containerID="533a9803d9114d3c2203e8c172904876b04290be5f85fcf8bd99bbf892e60b7d" Nov 27 16:54:34 crc kubenswrapper[4707]: I1127 16:54:34.291947 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"533a9803d9114d3c2203e8c172904876b04290be5f85fcf8bd99bbf892e60b7d"} err="failed to get container status \"533a9803d9114d3c2203e8c172904876b04290be5f85fcf8bd99bbf892e60b7d\": rpc error: code = NotFound desc = could not find container \"533a9803d9114d3c2203e8c172904876b04290be5f85fcf8bd99bbf892e60b7d\": container with ID starting with 533a9803d9114d3c2203e8c172904876b04290be5f85fcf8bd99bbf892e60b7d not found: ID does not exist" Nov 27 16:54:34 crc kubenswrapper[4707]: I1127 16:54:34.291982 4707 scope.go:117] "RemoveContainer" 
containerID="cf33ad7f0ee7a326bf563c68c421a8f67e1788c7502ee7208a7e055ed6402848" Nov 27 16:54:34 crc kubenswrapper[4707]: E1127 16:54:34.292594 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cf33ad7f0ee7a326bf563c68c421a8f67e1788c7502ee7208a7e055ed6402848\": container with ID starting with cf33ad7f0ee7a326bf563c68c421a8f67e1788c7502ee7208a7e055ed6402848 not found: ID does not exist" containerID="cf33ad7f0ee7a326bf563c68c421a8f67e1788c7502ee7208a7e055ed6402848" Nov 27 16:54:34 crc kubenswrapper[4707]: I1127 16:54:34.292627 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cf33ad7f0ee7a326bf563c68c421a8f67e1788c7502ee7208a7e055ed6402848"} err="failed to get container status \"cf33ad7f0ee7a326bf563c68c421a8f67e1788c7502ee7208a7e055ed6402848\": rpc error: code = NotFound desc = could not find container \"cf33ad7f0ee7a326bf563c68c421a8f67e1788c7502ee7208a7e055ed6402848\": container with ID starting with cf33ad7f0ee7a326bf563c68c421a8f67e1788c7502ee7208a7e055ed6402848 not found: ID does not exist" Nov 27 16:54:34 crc kubenswrapper[4707]: I1127 16:54:34.292651 4707 scope.go:117] "RemoveContainer" containerID="9d0cc3d91e38329230c66de2568d55fdc3ccc72a19aa242119e8db2bc35f0ecf" Nov 27 16:54:34 crc kubenswrapper[4707]: E1127 16:54:34.293185 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9d0cc3d91e38329230c66de2568d55fdc3ccc72a19aa242119e8db2bc35f0ecf\": container with ID starting with 9d0cc3d91e38329230c66de2568d55fdc3ccc72a19aa242119e8db2bc35f0ecf not found: ID does not exist" containerID="9d0cc3d91e38329230c66de2568d55fdc3ccc72a19aa242119e8db2bc35f0ecf" Nov 27 16:54:34 crc kubenswrapper[4707]: I1127 16:54:34.293228 4707 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"9d0cc3d91e38329230c66de2568d55fdc3ccc72a19aa242119e8db2bc35f0ecf"} err="failed to get container status \"9d0cc3d91e38329230c66de2568d55fdc3ccc72a19aa242119e8db2bc35f0ecf\": rpc error: code = NotFound desc = could not find container \"9d0cc3d91e38329230c66de2568d55fdc3ccc72a19aa242119e8db2bc35f0ecf\": container with ID starting with 9d0cc3d91e38329230c66de2568d55fdc3ccc72a19aa242119e8db2bc35f0ecf not found: ID does not exist" Nov 27 16:54:35 crc kubenswrapper[4707]: I1127 16:54:35.214916 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e6943c0c-9556-4d59-b668-8c1be1a6a349" path="/var/lib/kubelet/pods/e6943c0c-9556-4d59-b668-8c1be1a6a349/volumes" Nov 27 16:54:45 crc kubenswrapper[4707]: I1127 16:54:45.203287 4707 scope.go:117] "RemoveContainer" containerID="922f6f006f59568ee3b2de9c7061fa21de9cccddc36312f2d9ef753b4834d084" Nov 27 16:54:45 crc kubenswrapper[4707]: E1127 16:54:45.204571 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-c995m_openshift-machine-config-operator(a83beb0d-8dd1-434a-ace2-933f98e3956f)\"" pod="openshift-machine-config-operator/machine-config-daemon-c995m" podUID="a83beb0d-8dd1-434a-ace2-933f98e3956f" Nov 27 16:54:58 crc kubenswrapper[4707]: I1127 16:54:58.508774 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-8665cb7d49-w7xqm_94ba089c-d890-4d57-abc0-258a2b54a6f9/manager/0.log" Nov 27 16:55:00 crc kubenswrapper[4707]: I1127 16:55:00.196005 4707 scope.go:117] "RemoveContainer" containerID="922f6f006f59568ee3b2de9c7061fa21de9cccddc36312f2d9ef753b4834d084" Nov 27 16:55:00 crc kubenswrapper[4707]: E1127 16:55:00.196582 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-c995m_openshift-machine-config-operator(a83beb0d-8dd1-434a-ace2-933f98e3956f)\"" pod="openshift-machine-config-operator/machine-config-daemon-c995m" podUID="a83beb0d-8dd1-434a-ace2-933f98e3956f" Nov 27 16:55:11 crc kubenswrapper[4707]: I1127 16:55:11.633308 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210rtnm9"] Nov 27 16:55:11 crc kubenswrapper[4707]: E1127 16:55:11.634441 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e6943c0c-9556-4d59-b668-8c1be1a6a349" containerName="extract-utilities" Nov 27 16:55:11 crc kubenswrapper[4707]: I1127 16:55:11.634466 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="e6943c0c-9556-4d59-b668-8c1be1a6a349" containerName="extract-utilities" Nov 27 16:55:11 crc kubenswrapper[4707]: E1127 16:55:11.634522 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e6943c0c-9556-4d59-b668-8c1be1a6a349" containerName="extract-content" Nov 27 16:55:11 crc kubenswrapper[4707]: I1127 16:55:11.634533 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="e6943c0c-9556-4d59-b668-8c1be1a6a349" containerName="extract-content" Nov 27 16:55:11 crc kubenswrapper[4707]: E1127 16:55:11.634547 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e6943c0c-9556-4d59-b668-8c1be1a6a349" containerName="registry-server" Nov 27 16:55:11 crc kubenswrapper[4707]: I1127 16:55:11.634558 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="e6943c0c-9556-4d59-b668-8c1be1a6a349" containerName="registry-server" Nov 27 16:55:11 crc kubenswrapper[4707]: I1127 16:55:11.634884 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="e6943c0c-9556-4d59-b668-8c1be1a6a349" containerName="registry-server" Nov 27 16:55:11 crc kubenswrapper[4707]: I1127 
16:55:11.636751 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210rtnm9" Nov 27 16:55:11 crc kubenswrapper[4707]: I1127 16:55:11.638832 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Nov 27 16:55:11 crc kubenswrapper[4707]: I1127 16:55:11.644683 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210rtnm9"] Nov 27 16:55:11 crc kubenswrapper[4707]: I1127 16:55:11.769922 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/578725b0-80a1-4da6-93ae-243ae76cd1b6-util\") pod \"6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210rtnm9\" (UID: \"578725b0-80a1-4da6-93ae-243ae76cd1b6\") " pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210rtnm9" Nov 27 16:55:11 crc kubenswrapper[4707]: I1127 16:55:11.770020 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/578725b0-80a1-4da6-93ae-243ae76cd1b6-bundle\") pod \"6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210rtnm9\" (UID: \"578725b0-80a1-4da6-93ae-243ae76cd1b6\") " pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210rtnm9" Nov 27 16:55:11 crc kubenswrapper[4707]: I1127 16:55:11.770135 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sjqjb\" (UniqueName: \"kubernetes.io/projected/578725b0-80a1-4da6-93ae-243ae76cd1b6-kube-api-access-sjqjb\") pod \"6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210rtnm9\" (UID: \"578725b0-80a1-4da6-93ae-243ae76cd1b6\") " 
pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210rtnm9" Nov 27 16:55:11 crc kubenswrapper[4707]: I1127 16:55:11.872769 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sjqjb\" (UniqueName: \"kubernetes.io/projected/578725b0-80a1-4da6-93ae-243ae76cd1b6-kube-api-access-sjqjb\") pod \"6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210rtnm9\" (UID: \"578725b0-80a1-4da6-93ae-243ae76cd1b6\") " pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210rtnm9" Nov 27 16:55:11 crc kubenswrapper[4707]: I1127 16:55:11.873102 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/578725b0-80a1-4da6-93ae-243ae76cd1b6-util\") pod \"6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210rtnm9\" (UID: \"578725b0-80a1-4da6-93ae-243ae76cd1b6\") " pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210rtnm9" Nov 27 16:55:11 crc kubenswrapper[4707]: I1127 16:55:11.873167 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/578725b0-80a1-4da6-93ae-243ae76cd1b6-bundle\") pod \"6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210rtnm9\" (UID: \"578725b0-80a1-4da6-93ae-243ae76cd1b6\") " pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210rtnm9" Nov 27 16:55:11 crc kubenswrapper[4707]: I1127 16:55:11.873613 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/578725b0-80a1-4da6-93ae-243ae76cd1b6-util\") pod \"6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210rtnm9\" (UID: \"578725b0-80a1-4da6-93ae-243ae76cd1b6\") " pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210rtnm9" Nov 27 16:55:11 crc kubenswrapper[4707]: I1127 16:55:11.873932 4707 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/578725b0-80a1-4da6-93ae-243ae76cd1b6-bundle\") pod \"6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210rtnm9\" (UID: \"578725b0-80a1-4da6-93ae-243ae76cd1b6\") " pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210rtnm9" Nov 27 16:55:11 crc kubenswrapper[4707]: I1127 16:55:11.911611 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sjqjb\" (UniqueName: \"kubernetes.io/projected/578725b0-80a1-4da6-93ae-243ae76cd1b6-kube-api-access-sjqjb\") pod \"6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210rtnm9\" (UID: \"578725b0-80a1-4da6-93ae-243ae76cd1b6\") " pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210rtnm9" Nov 27 16:55:11 crc kubenswrapper[4707]: I1127 16:55:11.983003 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210rtnm9" Nov 27 16:55:12 crc kubenswrapper[4707]: I1127 16:55:12.194689 4707 scope.go:117] "RemoveContainer" containerID="922f6f006f59568ee3b2de9c7061fa21de9cccddc36312f2d9ef753b4834d084" Nov 27 16:55:12 crc kubenswrapper[4707]: E1127 16:55:12.195077 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-c995m_openshift-machine-config-operator(a83beb0d-8dd1-434a-ace2-933f98e3956f)\"" pod="openshift-machine-config-operator/machine-config-daemon-c995m" podUID="a83beb0d-8dd1-434a-ace2-933f98e3956f" Nov 27 16:55:12 crc kubenswrapper[4707]: I1127 16:55:12.463037 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210rtnm9"] Nov 27 16:55:12 crc 
kubenswrapper[4707]: I1127 16:55:12.587848 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210rtnm9" event={"ID":"578725b0-80a1-4da6-93ae-243ae76cd1b6","Type":"ContainerStarted","Data":"d187c3f8b658da07a475ae188e0eef19c9cccd2870cbc5e995cfb266867ff7a2"} Nov 27 16:55:13 crc kubenswrapper[4707]: I1127 16:55:13.599642 4707 generic.go:334] "Generic (PLEG): container finished" podID="578725b0-80a1-4da6-93ae-243ae76cd1b6" containerID="43a39e35fb7f49a9efc73dec72d774ddc1071863aae332465f2be7246c24d05f" exitCode=0 Nov 27 16:55:13 crc kubenswrapper[4707]: I1127 16:55:13.599760 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210rtnm9" event={"ID":"578725b0-80a1-4da6-93ae-243ae76cd1b6","Type":"ContainerDied","Data":"43a39e35fb7f49a9efc73dec72d774ddc1071863aae332465f2be7246c24d05f"} Nov 27 16:55:13 crc kubenswrapper[4707]: I1127 16:55:13.602542 4707 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Nov 27 16:55:16 crc kubenswrapper[4707]: I1127 16:55:16.639492 4707 generic.go:334] "Generic (PLEG): container finished" podID="578725b0-80a1-4da6-93ae-243ae76cd1b6" containerID="d89a906c76e9d9a5f89d6fee1c41597ede6afcd725717187f70a170b5eca6b80" exitCode=0 Nov 27 16:55:16 crc kubenswrapper[4707]: I1127 16:55:16.639558 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210rtnm9" event={"ID":"578725b0-80a1-4da6-93ae-243ae76cd1b6","Type":"ContainerDied","Data":"d89a906c76e9d9a5f89d6fee1c41597ede6afcd725717187f70a170b5eca6b80"} Nov 27 16:55:17 crc kubenswrapper[4707]: I1127 16:55:17.656415 4707 generic.go:334] "Generic (PLEG): container finished" podID="578725b0-80a1-4da6-93ae-243ae76cd1b6" containerID="3416ed727e3de5449992de9c00890d5b53e952736fb26941966048fcb31d8a7f" exitCode=0 Nov 27 
16:55:17 crc kubenswrapper[4707]: I1127 16:55:17.656508 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210rtnm9" event={"ID":"578725b0-80a1-4da6-93ae-243ae76cd1b6","Type":"ContainerDied","Data":"3416ed727e3de5449992de9c00890d5b53e952736fb26941966048fcb31d8a7f"} Nov 27 16:55:19 crc kubenswrapper[4707]: I1127 16:55:19.054927 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210rtnm9" Nov 27 16:55:19 crc kubenswrapper[4707]: I1127 16:55:19.123847 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/578725b0-80a1-4da6-93ae-243ae76cd1b6-bundle\") pod \"578725b0-80a1-4da6-93ae-243ae76cd1b6\" (UID: \"578725b0-80a1-4da6-93ae-243ae76cd1b6\") " Nov 27 16:55:19 crc kubenswrapper[4707]: I1127 16:55:19.124235 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/578725b0-80a1-4da6-93ae-243ae76cd1b6-util\") pod \"578725b0-80a1-4da6-93ae-243ae76cd1b6\" (UID: \"578725b0-80a1-4da6-93ae-243ae76cd1b6\") " Nov 27 16:55:19 crc kubenswrapper[4707]: I1127 16:55:19.124474 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sjqjb\" (UniqueName: \"kubernetes.io/projected/578725b0-80a1-4da6-93ae-243ae76cd1b6-kube-api-access-sjqjb\") pod \"578725b0-80a1-4da6-93ae-243ae76cd1b6\" (UID: \"578725b0-80a1-4da6-93ae-243ae76cd1b6\") " Nov 27 16:55:19 crc kubenswrapper[4707]: I1127 16:55:19.128394 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/578725b0-80a1-4da6-93ae-243ae76cd1b6-bundle" (OuterVolumeSpecName: "bundle") pod "578725b0-80a1-4da6-93ae-243ae76cd1b6" (UID: "578725b0-80a1-4da6-93ae-243ae76cd1b6"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 27 16:55:19 crc kubenswrapper[4707]: I1127 16:55:19.131944 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/578725b0-80a1-4da6-93ae-243ae76cd1b6-kube-api-access-sjqjb" (OuterVolumeSpecName: "kube-api-access-sjqjb") pod "578725b0-80a1-4da6-93ae-243ae76cd1b6" (UID: "578725b0-80a1-4da6-93ae-243ae76cd1b6"). InnerVolumeSpecName "kube-api-access-sjqjb". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 16:55:19 crc kubenswrapper[4707]: I1127 16:55:19.138854 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/578725b0-80a1-4da6-93ae-243ae76cd1b6-util" (OuterVolumeSpecName: "util") pod "578725b0-80a1-4da6-93ae-243ae76cd1b6" (UID: "578725b0-80a1-4da6-93ae-243ae76cd1b6"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 27 16:55:19 crc kubenswrapper[4707]: I1127 16:55:19.226835 4707 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/578725b0-80a1-4da6-93ae-243ae76cd1b6-bundle\") on node \"crc\" DevicePath \"\"" Nov 27 16:55:19 crc kubenswrapper[4707]: I1127 16:55:19.226871 4707 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/578725b0-80a1-4da6-93ae-243ae76cd1b6-util\") on node \"crc\" DevicePath \"\"" Nov 27 16:55:19 crc kubenswrapper[4707]: I1127 16:55:19.226920 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sjqjb\" (UniqueName: \"kubernetes.io/projected/578725b0-80a1-4da6-93ae-243ae76cd1b6-kube-api-access-sjqjb\") on node \"crc\" DevicePath \"\"" Nov 27 16:55:19 crc kubenswrapper[4707]: I1127 16:55:19.685227 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210rtnm9" 
event={"ID":"578725b0-80a1-4da6-93ae-243ae76cd1b6","Type":"ContainerDied","Data":"d187c3f8b658da07a475ae188e0eef19c9cccd2870cbc5e995cfb266867ff7a2"} Nov 27 16:55:19 crc kubenswrapper[4707]: I1127 16:55:19.685752 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d187c3f8b658da07a475ae188e0eef19c9cccd2870cbc5e995cfb266867ff7a2" Nov 27 16:55:19 crc kubenswrapper[4707]: I1127 16:55:19.685858 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210rtnm9" Nov 27 16:55:23 crc kubenswrapper[4707]: I1127 16:55:23.194961 4707 scope.go:117] "RemoveContainer" containerID="922f6f006f59568ee3b2de9c7061fa21de9cccddc36312f2d9ef753b4834d084" Nov 27 16:55:23 crc kubenswrapper[4707]: E1127 16:55:23.195614 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-c995m_openshift-machine-config-operator(a83beb0d-8dd1-434a-ace2-933f98e3956f)\"" pod="openshift-machine-config-operator/machine-config-daemon-c995m" podUID="a83beb0d-8dd1-434a-ace2-933f98e3956f" Nov 27 16:55:30 crc kubenswrapper[4707]: I1127 16:55:30.356130 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-668cf9dfbb-zwnr6"] Nov 27 16:55:30 crc kubenswrapper[4707]: E1127 16:55:30.357275 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="578725b0-80a1-4da6-93ae-243ae76cd1b6" containerName="pull" Nov 27 16:55:30 crc kubenswrapper[4707]: I1127 16:55:30.357292 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="578725b0-80a1-4da6-93ae-243ae76cd1b6" containerName="pull" Nov 27 16:55:30 crc kubenswrapper[4707]: E1127 16:55:30.357315 4707 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="578725b0-80a1-4da6-93ae-243ae76cd1b6" containerName="extract" Nov 27 16:55:30 crc kubenswrapper[4707]: I1127 16:55:30.357323 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="578725b0-80a1-4da6-93ae-243ae76cd1b6" containerName="extract" Nov 27 16:55:30 crc kubenswrapper[4707]: E1127 16:55:30.357355 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="578725b0-80a1-4da6-93ae-243ae76cd1b6" containerName="util" Nov 27 16:55:30 crc kubenswrapper[4707]: I1127 16:55:30.357363 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="578725b0-80a1-4da6-93ae-243ae76cd1b6" containerName="util" Nov 27 16:55:30 crc kubenswrapper[4707]: I1127 16:55:30.357782 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="578725b0-80a1-4da6-93ae-243ae76cd1b6" containerName="extract" Nov 27 16:55:30 crc kubenswrapper[4707]: I1127 16:55:30.358635 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-zwnr6" Nov 27 16:55:30 crc kubenswrapper[4707]: I1127 16:55:30.363151 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators"/"openshift-service-ca.crt" Nov 27 16:55:30 crc kubenswrapper[4707]: I1127 16:55:30.365835 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-dockercfg-ggqld" Nov 27 16:55:30 crc kubenswrapper[4707]: I1127 16:55:30.368348 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators"/"kube-root-ca.crt" Nov 27 16:55:30 crc kubenswrapper[4707]: I1127 16:55:30.372914 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-668cf9dfbb-zwnr6"] Nov 27 16:55:30 crc kubenswrapper[4707]: I1127 16:55:30.468586 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vnbkw\" (UniqueName: 
\"kubernetes.io/projected/fb96b00d-b40a-4af9-8bc5-8b4dfbe2a5fa-kube-api-access-vnbkw\") pod \"obo-prometheus-operator-668cf9dfbb-zwnr6\" (UID: \"fb96b00d-b40a-4af9-8bc5-8b4dfbe2a5fa\") " pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-zwnr6" Nov 27 16:55:30 crc kubenswrapper[4707]: I1127 16:55:30.499246 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-7677977bcd-tp2hp"] Nov 27 16:55:30 crc kubenswrapper[4707]: I1127 16:55:30.500459 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-7677977bcd-tp2hp" Nov 27 16:55:30 crc kubenswrapper[4707]: I1127 16:55:30.515099 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-admission-webhook-dockercfg-qwcvh" Nov 27 16:55:30 crc kubenswrapper[4707]: I1127 16:55:30.515940 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-admission-webhook-service-cert" Nov 27 16:55:30 crc kubenswrapper[4707]: I1127 16:55:30.534787 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-7677977bcd-fx2tz"] Nov 27 16:55:30 crc kubenswrapper[4707]: I1127 16:55:30.536333 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-7677977bcd-fx2tz" Nov 27 16:55:30 crc kubenswrapper[4707]: I1127 16:55:30.565392 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-7677977bcd-tp2hp"] Nov 27 16:55:30 crc kubenswrapper[4707]: I1127 16:55:30.571256 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/e1ab71b3-d476-4099-b4d8-c59b437dd4f7-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-7677977bcd-fx2tz\" (UID: \"e1ab71b3-d476-4099-b4d8-c59b437dd4f7\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-7677977bcd-fx2tz" Nov 27 16:55:30 crc kubenswrapper[4707]: I1127 16:55:30.571349 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/e1ab71b3-d476-4099-b4d8-c59b437dd4f7-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-7677977bcd-fx2tz\" (UID: \"e1ab71b3-d476-4099-b4d8-c59b437dd4f7\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-7677977bcd-fx2tz" Nov 27 16:55:30 crc kubenswrapper[4707]: I1127 16:55:30.571426 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/4208373c-9a43-406a-8262-0aff47b60f85-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-7677977bcd-tp2hp\" (UID: \"4208373c-9a43-406a-8262-0aff47b60f85\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-7677977bcd-tp2hp" Nov 27 16:55:30 crc kubenswrapper[4707]: I1127 16:55:30.571480 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: 
\"kubernetes.io/secret/4208373c-9a43-406a-8262-0aff47b60f85-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-7677977bcd-tp2hp\" (UID: \"4208373c-9a43-406a-8262-0aff47b60f85\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-7677977bcd-tp2hp" Nov 27 16:55:30 crc kubenswrapper[4707]: I1127 16:55:30.571518 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vnbkw\" (UniqueName: \"kubernetes.io/projected/fb96b00d-b40a-4af9-8bc5-8b4dfbe2a5fa-kube-api-access-vnbkw\") pod \"obo-prometheus-operator-668cf9dfbb-zwnr6\" (UID: \"fb96b00d-b40a-4af9-8bc5-8b4dfbe2a5fa\") " pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-zwnr6" Nov 27 16:55:30 crc kubenswrapper[4707]: I1127 16:55:30.576571 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-7677977bcd-fx2tz"] Nov 27 16:55:30 crc kubenswrapper[4707]: I1127 16:55:30.617431 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vnbkw\" (UniqueName: \"kubernetes.io/projected/fb96b00d-b40a-4af9-8bc5-8b4dfbe2a5fa-kube-api-access-vnbkw\") pod \"obo-prometheus-operator-668cf9dfbb-zwnr6\" (UID: \"fb96b00d-b40a-4af9-8bc5-8b4dfbe2a5fa\") " pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-zwnr6" Nov 27 16:55:30 crc kubenswrapper[4707]: I1127 16:55:30.673794 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/e1ab71b3-d476-4099-b4d8-c59b437dd4f7-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-7677977bcd-fx2tz\" (UID: \"e1ab71b3-d476-4099-b4d8-c59b437dd4f7\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-7677977bcd-fx2tz" Nov 27 16:55:30 crc kubenswrapper[4707]: I1127 16:55:30.673896 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: 
\"kubernetes.io/secret/4208373c-9a43-406a-8262-0aff47b60f85-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-7677977bcd-tp2hp\" (UID: \"4208373c-9a43-406a-8262-0aff47b60f85\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-7677977bcd-tp2hp" Nov 27 16:55:30 crc kubenswrapper[4707]: I1127 16:55:30.673982 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/4208373c-9a43-406a-8262-0aff47b60f85-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-7677977bcd-tp2hp\" (UID: \"4208373c-9a43-406a-8262-0aff47b60f85\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-7677977bcd-tp2hp" Nov 27 16:55:30 crc kubenswrapper[4707]: I1127 16:55:30.674150 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/e1ab71b3-d476-4099-b4d8-c59b437dd4f7-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-7677977bcd-fx2tz\" (UID: \"e1ab71b3-d476-4099-b4d8-c59b437dd4f7\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-7677977bcd-fx2tz" Nov 27 16:55:30 crc kubenswrapper[4707]: I1127 16:55:30.679142 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/e1ab71b3-d476-4099-b4d8-c59b437dd4f7-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-7677977bcd-fx2tz\" (UID: \"e1ab71b3-d476-4099-b4d8-c59b437dd4f7\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-7677977bcd-fx2tz" Nov 27 16:55:30 crc kubenswrapper[4707]: I1127 16:55:30.679987 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/4208373c-9a43-406a-8262-0aff47b60f85-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-7677977bcd-tp2hp\" (UID: \"4208373c-9a43-406a-8262-0aff47b60f85\") " 
pod="openshift-operators/obo-prometheus-operator-admission-webhook-7677977bcd-tp2hp" Nov 27 16:55:30 crc kubenswrapper[4707]: I1127 16:55:30.679991 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/4208373c-9a43-406a-8262-0aff47b60f85-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-7677977bcd-tp2hp\" (UID: \"4208373c-9a43-406a-8262-0aff47b60f85\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-7677977bcd-tp2hp" Nov 27 16:55:30 crc kubenswrapper[4707]: I1127 16:55:30.680193 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/e1ab71b3-d476-4099-b4d8-c59b437dd4f7-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-7677977bcd-fx2tz\" (UID: \"e1ab71b3-d476-4099-b4d8-c59b437dd4f7\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-7677977bcd-fx2tz" Nov 27 16:55:30 crc kubenswrapper[4707]: I1127 16:55:30.683232 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-zwnr6" Nov 27 16:55:30 crc kubenswrapper[4707]: I1127 16:55:30.702760 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/observability-operator-d8bb48f5d-82wd5"] Nov 27 16:55:30 crc kubenswrapper[4707]: I1127 16:55:30.704591 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/observability-operator-d8bb48f5d-82wd5" Nov 27 16:55:30 crc kubenswrapper[4707]: I1127 16:55:30.708253 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-operator-sa-dockercfg-hbrfg" Nov 27 16:55:30 crc kubenswrapper[4707]: I1127 16:55:30.708832 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-operator-tls" Nov 27 16:55:30 crc kubenswrapper[4707]: I1127 16:55:30.716831 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-operator-d8bb48f5d-82wd5"] Nov 27 16:55:30 crc kubenswrapper[4707]: I1127 16:55:30.777704 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/e96caaf7-dd4f-4734-8601-23d38df9005f-observability-operator-tls\") pod \"observability-operator-d8bb48f5d-82wd5\" (UID: \"e96caaf7-dd4f-4734-8601-23d38df9005f\") " pod="openshift-operators/observability-operator-d8bb48f5d-82wd5" Nov 27 16:55:30 crc kubenswrapper[4707]: I1127 16:55:30.777788 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j662z\" (UniqueName: \"kubernetes.io/projected/e96caaf7-dd4f-4734-8601-23d38df9005f-kube-api-access-j662z\") pod \"observability-operator-d8bb48f5d-82wd5\" (UID: \"e96caaf7-dd4f-4734-8601-23d38df9005f\") " pod="openshift-operators/observability-operator-d8bb48f5d-82wd5" Nov 27 16:55:30 crc kubenswrapper[4707]: I1127 16:55:30.788571 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/perses-operator-5446b9c989-jgbmm"] Nov 27 16:55:30 crc kubenswrapper[4707]: I1127 16:55:30.790045 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/perses-operator-5446b9c989-jgbmm" Nov 27 16:55:30 crc kubenswrapper[4707]: I1127 16:55:30.797990 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"perses-operator-dockercfg-gswdv" Nov 27 16:55:30 crc kubenswrapper[4707]: I1127 16:55:30.822748 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/perses-operator-5446b9c989-jgbmm"] Nov 27 16:55:30 crc kubenswrapper[4707]: I1127 16:55:30.837393 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-7677977bcd-tp2hp" Nov 27 16:55:30 crc kubenswrapper[4707]: I1127 16:55:30.872877 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-7677977bcd-fx2tz" Nov 27 16:55:30 crc kubenswrapper[4707]: I1127 16:55:30.882037 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/e96caaf7-dd4f-4734-8601-23d38df9005f-observability-operator-tls\") pod \"observability-operator-d8bb48f5d-82wd5\" (UID: \"e96caaf7-dd4f-4734-8601-23d38df9005f\") " pod="openshift-operators/observability-operator-d8bb48f5d-82wd5" Nov 27 16:55:30 crc kubenswrapper[4707]: I1127 16:55:30.882099 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j662z\" (UniqueName: \"kubernetes.io/projected/e96caaf7-dd4f-4734-8601-23d38df9005f-kube-api-access-j662z\") pod \"observability-operator-d8bb48f5d-82wd5\" (UID: \"e96caaf7-dd4f-4734-8601-23d38df9005f\") " pod="openshift-operators/observability-operator-d8bb48f5d-82wd5" Nov 27 16:55:30 crc kubenswrapper[4707]: I1127 16:55:30.882179 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kggrv\" (UniqueName: 
\"kubernetes.io/projected/0c162a13-e55c-44f0-9ab7-cc7ce6d87605-kube-api-access-kggrv\") pod \"perses-operator-5446b9c989-jgbmm\" (UID: \"0c162a13-e55c-44f0-9ab7-cc7ce6d87605\") " pod="openshift-operators/perses-operator-5446b9c989-jgbmm" Nov 27 16:55:30 crc kubenswrapper[4707]: I1127 16:55:30.882244 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/0c162a13-e55c-44f0-9ab7-cc7ce6d87605-openshift-service-ca\") pod \"perses-operator-5446b9c989-jgbmm\" (UID: \"0c162a13-e55c-44f0-9ab7-cc7ce6d87605\") " pod="openshift-operators/perses-operator-5446b9c989-jgbmm" Nov 27 16:55:30 crc kubenswrapper[4707]: I1127 16:55:30.890426 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/e96caaf7-dd4f-4734-8601-23d38df9005f-observability-operator-tls\") pod \"observability-operator-d8bb48f5d-82wd5\" (UID: \"e96caaf7-dd4f-4734-8601-23d38df9005f\") " pod="openshift-operators/observability-operator-d8bb48f5d-82wd5" Nov 27 16:55:30 crc kubenswrapper[4707]: I1127 16:55:30.901694 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j662z\" (UniqueName: \"kubernetes.io/projected/e96caaf7-dd4f-4734-8601-23d38df9005f-kube-api-access-j662z\") pod \"observability-operator-d8bb48f5d-82wd5\" (UID: \"e96caaf7-dd4f-4734-8601-23d38df9005f\") " pod="openshift-operators/observability-operator-d8bb48f5d-82wd5" Nov 27 16:55:30 crc kubenswrapper[4707]: I1127 16:55:30.984229 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kggrv\" (UniqueName: \"kubernetes.io/projected/0c162a13-e55c-44f0-9ab7-cc7ce6d87605-kube-api-access-kggrv\") pod \"perses-operator-5446b9c989-jgbmm\" (UID: \"0c162a13-e55c-44f0-9ab7-cc7ce6d87605\") " pod="openshift-operators/perses-operator-5446b9c989-jgbmm" Nov 27 16:55:30 crc kubenswrapper[4707]: 
I1127 16:55:30.984310 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/0c162a13-e55c-44f0-9ab7-cc7ce6d87605-openshift-service-ca\") pod \"perses-operator-5446b9c989-jgbmm\" (UID: \"0c162a13-e55c-44f0-9ab7-cc7ce6d87605\") " pod="openshift-operators/perses-operator-5446b9c989-jgbmm" Nov 27 16:55:30 crc kubenswrapper[4707]: I1127 16:55:30.985179 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/0c162a13-e55c-44f0-9ab7-cc7ce6d87605-openshift-service-ca\") pod \"perses-operator-5446b9c989-jgbmm\" (UID: \"0c162a13-e55c-44f0-9ab7-cc7ce6d87605\") " pod="openshift-operators/perses-operator-5446b9c989-jgbmm" Nov 27 16:55:31 crc kubenswrapper[4707]: I1127 16:55:31.005046 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kggrv\" (UniqueName: \"kubernetes.io/projected/0c162a13-e55c-44f0-9ab7-cc7ce6d87605-kube-api-access-kggrv\") pod \"perses-operator-5446b9c989-jgbmm\" (UID: \"0c162a13-e55c-44f0-9ab7-cc7ce6d87605\") " pod="openshift-operators/perses-operator-5446b9c989-jgbmm" Nov 27 16:55:31 crc kubenswrapper[4707]: I1127 16:55:31.115206 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-d8bb48f5d-82wd5" Nov 27 16:55:31 crc kubenswrapper[4707]: I1127 16:55:31.126966 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/perses-operator-5446b9c989-jgbmm" Nov 27 16:55:31 crc kubenswrapper[4707]: I1127 16:55:31.299114 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-668cf9dfbb-zwnr6"] Nov 27 16:55:31 crc kubenswrapper[4707]: I1127 16:55:31.596678 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-7677977bcd-tp2hp"] Nov 27 16:55:31 crc kubenswrapper[4707]: I1127 16:55:31.752746 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-7677977bcd-fx2tz"] Nov 27 16:55:31 crc kubenswrapper[4707]: I1127 16:55:31.823050 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-zwnr6" event={"ID":"fb96b00d-b40a-4af9-8bc5-8b4dfbe2a5fa","Type":"ContainerStarted","Data":"df20c0586d708f80175e2c2f5d0f1d8cedb750b8c7291fcf7ac1705ef6fd26a7"} Nov 27 16:55:31 crc kubenswrapper[4707]: I1127 16:55:31.831300 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-7677977bcd-tp2hp" event={"ID":"4208373c-9a43-406a-8262-0aff47b60f85","Type":"ContainerStarted","Data":"6b88d64ced072695ed93be6ee0b76e3cc23adfbf9cd65c669248c3d436deabe7"} Nov 27 16:55:31 crc kubenswrapper[4707]: I1127 16:55:31.833414 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-7677977bcd-fx2tz" event={"ID":"e1ab71b3-d476-4099-b4d8-c59b437dd4f7","Type":"ContainerStarted","Data":"aedb75a5e8e0d826a5601472d5915a42cd2cbc313d7b8e2465954e0ba671435d"} Nov 27 16:55:31 crc kubenswrapper[4707]: I1127 16:55:31.882937 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/perses-operator-5446b9c989-jgbmm"] Nov 27 16:55:31 crc kubenswrapper[4707]: I1127 16:55:31.900520 4707 kubelet.go:2428] "SyncLoop 
UPDATE" source="api" pods=["openshift-operators/observability-operator-d8bb48f5d-82wd5"] Nov 27 16:55:31 crc kubenswrapper[4707]: W1127 16:55:31.909494 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode96caaf7_dd4f_4734_8601_23d38df9005f.slice/crio-816fcd4e21b284ebdc494d6aa3662b79715df7ef15a45a04e48b03158c67689f WatchSource:0}: Error finding container 816fcd4e21b284ebdc494d6aa3662b79715df7ef15a45a04e48b03158c67689f: Status 404 returned error can't find the container with id 816fcd4e21b284ebdc494d6aa3662b79715df7ef15a45a04e48b03158c67689f Nov 27 16:55:32 crc kubenswrapper[4707]: I1127 16:55:32.878605 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-operator-d8bb48f5d-82wd5" event={"ID":"e96caaf7-dd4f-4734-8601-23d38df9005f","Type":"ContainerStarted","Data":"816fcd4e21b284ebdc494d6aa3662b79715df7ef15a45a04e48b03158c67689f"} Nov 27 16:55:32 crc kubenswrapper[4707]: I1127 16:55:32.888241 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/perses-operator-5446b9c989-jgbmm" event={"ID":"0c162a13-e55c-44f0-9ab7-cc7ce6d87605","Type":"ContainerStarted","Data":"7293173e1f2210b23b4a66d788cc97cc5ba76932c2d2d20815ccb5c1566b563f"} Nov 27 16:55:36 crc kubenswrapper[4707]: I1127 16:55:36.195706 4707 scope.go:117] "RemoveContainer" containerID="922f6f006f59568ee3b2de9c7061fa21de9cccddc36312f2d9ef753b4834d084" Nov 27 16:55:36 crc kubenswrapper[4707]: E1127 16:55:36.196279 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-c995m_openshift-machine-config-operator(a83beb0d-8dd1-434a-ace2-933f98e3956f)\"" pod="openshift-machine-config-operator/machine-config-daemon-c995m" podUID="a83beb0d-8dd1-434a-ace2-933f98e3956f" Nov 27 16:55:50 crc 
kubenswrapper[4707]: E1127 16:55:50.151618 4707 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/cluster-observability-operator/cluster-observability-rhel9-operator@sha256:ce7d2904f7b238aa37dfe74a0b76bf73629e7a14fa52bf54b0ecf030ca36f1bb" Nov 27 16:55:50 crc kubenswrapper[4707]: E1127 16:55:50.152441 4707 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:operator,Image:registry.redhat.io/cluster-observability-operator/cluster-observability-rhel9-operator@sha256:ce7d2904f7b238aa37dfe74a0b76bf73629e7a14fa52bf54b0ecf030ca36f1bb,Command:[],Args:[--namespace=$(NAMESPACE) --images=perses=$(RELATED_IMAGE_PERSES) --images=alertmanager=$(RELATED_IMAGE_ALERTMANAGER) --images=prometheus=$(RELATED_IMAGE_PROMETHEUS) --images=thanos=$(RELATED_IMAGE_THANOS) --images=ui-dashboards=$(RELATED_IMAGE_CONSOLE_DASHBOARDS_PLUGIN) --images=ui-distributed-tracing=$(RELATED_IMAGE_CONSOLE_DISTRIBUTED_TRACING_PLUGIN) --images=ui-distributed-tracing-pf5=$(RELATED_IMAGE_CONSOLE_DISTRIBUTED_TRACING_PLUGIN_PF5) --images=ui-distributed-tracing-pf4=$(RELATED_IMAGE_CONSOLE_DISTRIBUTED_TRACING_PLUGIN_PF4) --images=ui-logging=$(RELATED_IMAGE_CONSOLE_LOGGING_PLUGIN) --images=ui-logging-pf4=$(RELATED_IMAGE_CONSOLE_LOGGING_PLUGIN_PF4) --images=ui-troubleshooting-panel=$(RELATED_IMAGE_CONSOLE_TROUBLESHOOTING_PANEL_PLUGIN) --images=ui-monitoring=$(RELATED_IMAGE_CONSOLE_MONITORING_PLUGIN) --images=ui-monitoring-pf5=$(RELATED_IMAGE_CONSOLE_MONITORING_PLUGIN_PF5) --images=korrel8r=$(RELATED_IMAGE_KORREL8R) --images=health-analyzer=$(RELATED_IMAGE_CLUSTER_HEALTH_ANALYZER) 
--openshift.enabled=true],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.namespace,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:RELATED_IMAGE_ALERTMANAGER,Value:registry.redhat.io/cluster-observability-operator/alertmanager-rhel9@sha256:e718854a7d6ca8accf0fa72db0eb902e46c44d747ad51dc3f06bba0cefaa3c01,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_PROMETHEUS,Value:registry.redhat.io/cluster-observability-operator/prometheus-rhel9@sha256:17ea20be390a94ab39f5cdd7f0cbc2498046eebcf77fe3dec9aa288d5c2cf46b,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_THANOS,Value:registry.redhat.io/cluster-observability-operator/thanos-rhel9@sha256:d972f4faa5e9c121402d23ed85002f26af48ec36b1b71a7489d677b3913d08b4,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_PERSES,Value:registry.redhat.io/cluster-observability-operator/perses-rhel9@sha256:91531137fc1dcd740e277e0f65e120a0176a16f788c14c27925b61aa0b792ade,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CONSOLE_DASHBOARDS_PLUGIN,Value:registry.redhat.io/cluster-observability-operator/dashboards-console-plugin-rhel9@sha256:a69da8bbca8a28dd2925f864d51cc31cf761b10532c553095ba40b242ef701cb,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CONSOLE_DISTRIBUTED_TRACING_PLUGIN,Value:registry.redhat.io/cluster-observability-operator/distributed-tracing-console-plugin-rhel9@sha256:897e1bfad1187062725b54d87107bd0155972257a50d8335dd29e1999b828a4f,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CONSOLE_DISTRIBUTED_TRACING_PLUGIN_PF5,Value:registry.redhat.io/cluster-observability-operator/distributed-tracing-console-plugin-pf5-rhel9@sha256:95fe5b5746ca8c07ac9217ce2d8ac8e6afad17af210f9d8e0074df1310b209a8,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CONSOLE_DISTRIBUTED_TRACING_PLUGIN_PF4,Value:registry.redhat.io/cluster-observability-operator/distributed-tracing-console-plugin-pf4-rhel9@sha256:e9d9a89e4d8126a62b1852055482258ee528cac6398dd5d43ebad7
5ace0f33c9,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CONSOLE_LOGGING_PLUGIN,Value:registry.redhat.io/cluster-observability-operator/logging-console-plugin-rhel9@sha256:ec684a0645ceb917b019af7ddba68c3533416e356ab0d0320a30e75ca7ebb31b,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CONSOLE_LOGGING_PLUGIN_PF4,Value:registry.redhat.io/cluster-observability-operator/logging-console-plugin-pf4-rhel9@sha256:3b9693fcde9b3a9494fb04735b1f7cfd0426f10be820fdc3f024175c0d3df1c9,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CONSOLE_TROUBLESHOOTING_PANEL_PLUGIN,Value:registry.redhat.io/cluster-observability-operator/troubleshooting-panel-console-plugin-rhel9@sha256:580606f194180accc8abba099e17a26dca7522ec6d233fa2fdd40312771703e3,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CONSOLE_MONITORING_PLUGIN,Value:registry.redhat.io/cluster-observability-operator/monitoring-console-plugin-rhel9@sha256:e03777be39e71701935059cd877603874a13ac94daa73219d4e5e545599d78a9,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CONSOLE_MONITORING_PLUGIN_PF5,Value:registry.redhat.io/cluster-observability-operator/monitoring-console-plugin-pf5-rhel9@sha256:aa47256193cfd2877853878e1ae97d2ab8b8e5deae62b387cbfad02b284d379c,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_KORREL8R,Value:registry.redhat.io/cluster-observability-operator/korrel8r-rhel9@sha256:c595ff56b2cb85514bf4784db6ddb82e4e657e3e708a7fb695fc4997379a94d4,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CLUSTER_HEALTH_ANALYZER,Value:registry.redhat.io/cluster-observability-operator/cluster-health-analyzer-rhel9@sha256:45a4ec2a519bcec99e886aa91596d5356a2414a2bd103baaef9fa7838c672eb2,ValueFrom:nil,},EnvVar{Name:OPERATOR_CONDITION_NAME,Value:cluster-observability-operator.v1.3.0,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{400 -3} {} 400m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{100 -3} {} 100m DecimalSI},memory: {{268435456 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:observability-operator-tls,ReadOnly:true,MountPath:/etc/tls/private,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-j662z,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000350000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod observability-operator-d8bb48f5d-82wd5_openshift-operators(e96caaf7-dd4f-4734-8601-23d38df9005f): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Nov 27 16:55:50 crc kubenswrapper[4707]: E1127 16:55:50.153756 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"operator\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-operators/observability-operator-d8bb48f5d-82wd5" podUID="e96caaf7-dd4f-4734-8601-23d38df9005f" Nov 27 16:55:51 crc kubenswrapper[4707]: E1127 16:55:51.167351 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/cluster-observability-operator/cluster-observability-rhel9-operator@sha256:ce7d2904f7b238aa37dfe74a0b76bf73629e7a14fa52bf54b0ecf030ca36f1bb\\\"\"" pod="openshift-operators/observability-operator-d8bb48f5d-82wd5" podUID="e96caaf7-dd4f-4734-8601-23d38df9005f" Nov 27 16:55:51 crc kubenswrapper[4707]: I1127 16:55:51.197464 4707 scope.go:117] "RemoveContainer" containerID="922f6f006f59568ee3b2de9c7061fa21de9cccddc36312f2d9ef753b4834d084" Nov 27 16:55:51 crc kubenswrapper[4707]: E1127 16:55:51.197731 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-c995m_openshift-machine-config-operator(a83beb0d-8dd1-434a-ace2-933f98e3956f)\"" pod="openshift-machine-config-operator/machine-config-daemon-c995m" podUID="a83beb0d-8dd1-434a-ace2-933f98e3956f" Nov 27 16:55:51 crc kubenswrapper[4707]: E1127 16:55:51.462173 4707 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/cluster-observability-operator/obo-prometheus-operator-admission-webhook-rhel9@sha256:43d33f0125e6b990f4a972ac4e952a065d7e72dc1690c6c836963b7341734aec" Nov 27 16:55:51 crc kubenswrapper[4707]: E1127 16:55:51.462638 4707 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:prometheus-operator-admission-webhook,Image:registry.redhat.io/cluster-observability-operator/obo-prometheus-operator-admission-webhook-rhel9@sha256:43d33f0125e6b990f4a972ac4e952a065d7e72dc1690c6c836963b7341734aec,Command:[],Args:[--web.enable-tls=true --web.cert-file=/tmp/k8s-webhook-server/serving-certs/tls.crt --web.key-file=/tmp/k8s-webhook-server/serving-certs/tls.key],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OPERATOR_CONDITION_NAME,Value:cluster-observability-operator.v1.3.0,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{200 -3} {} 200m DecimalSI},memory: {{209715200 0} {} BinarySI},},Requests:ResourceList{cpu: {{50 -3} {} 50m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:apiservice-cert,ReadOnly:false,MountPath:/apiserver.local.config/certificates,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:webhook-cert,ReadOnly:false,MountPath:/tmp/k8s-webhook-server/serving-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:*true,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod obo-prometheus-operator-admission-webhook-7677977bcd-tp2hp_openshift-operators(4208373c-9a43-406a-8262-0aff47b60f85): 
ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Nov 27 16:55:51 crc kubenswrapper[4707]: E1127 16:55:51.464014 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"prometheus-operator-admission-webhook\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-operators/obo-prometheus-operator-admission-webhook-7677977bcd-tp2hp" podUID="4208373c-9a43-406a-8262-0aff47b60f85" Nov 27 16:55:52 crc kubenswrapper[4707]: E1127 16:55:52.174448 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"prometheus-operator-admission-webhook\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/cluster-observability-operator/obo-prometheus-operator-admission-webhook-rhel9@sha256:43d33f0125e6b990f4a972ac4e952a065d7e72dc1690c6c836963b7341734aec\\\"\"" pod="openshift-operators/obo-prometheus-operator-admission-webhook-7677977bcd-tp2hp" podUID="4208373c-9a43-406a-8262-0aff47b60f85" Nov 27 16:55:52 crc kubenswrapper[4707]: E1127 16:55:52.546989 4707 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/cluster-observability-operator/perses-rhel9-operator@sha256:9aec4c328ec43e40481e06ca5808deead74b75c0aacb90e9e72966c3fa14f385" Nov 27 16:55:52 crc kubenswrapper[4707]: E1127 16:55:52.547200 4707 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:perses-operator,Image:registry.redhat.io/cluster-observability-operator/perses-rhel9-operator@sha256:9aec4c328ec43e40481e06ca5808deead74b75c0aacb90e9e72966c3fa14f385,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:OPERATOR_CONDITION_NAME,Value:cluster-observability-operator.v1.3.0,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{100 -3} {} 100m DecimalSI},memory: {{134217728 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:openshift-service-ca,ReadOnly:true,MountPath:/ca,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-kggrv,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000350000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod perses-operator-5446b9c989-jgbmm_openshift-operators(0c162a13-e55c-44f0-9ab7-cc7ce6d87605): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Nov 27 16:55:52 crc kubenswrapper[4707]: E1127 16:55:52.548416 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"perses-operator\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-operators/perses-operator-5446b9c989-jgbmm" podUID="0c162a13-e55c-44f0-9ab7-cc7ce6d87605" Nov 27 16:55:53 crc kubenswrapper[4707]: E1127 16:55:53.182537 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"perses-operator\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/cluster-observability-operator/perses-rhel9-operator@sha256:9aec4c328ec43e40481e06ca5808deead74b75c0aacb90e9e72966c3fa14f385\\\"\"" pod="openshift-operators/perses-operator-5446b9c989-jgbmm" podUID="0c162a13-e55c-44f0-9ab7-cc7ce6d87605" Nov 27 16:55:53 crc kubenswrapper[4707]: E1127 
16:55:53.632801 4707 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/cluster-observability-operator/obo-prometheus-rhel9-operator@sha256:203cf5b9dc1460f09e75f58d8b5cf7df5e57c18c8c6a41c14b5e8977d83263f3" Nov 27 16:55:53 crc kubenswrapper[4707]: E1127 16:55:53.632998 4707 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:prometheus-operator,Image:registry.redhat.io/cluster-observability-operator/obo-prometheus-rhel9-operator@sha256:203cf5b9dc1460f09e75f58d8b5cf7df5e57c18c8c6a41c14b5e8977d83263f3,Command:[],Args:[--prometheus-config-reloader=$(RELATED_IMAGE_PROMETHEUS_CONFIG_RELOADER) --prometheus-instance-selector=app.kubernetes.io/managed-by=observability-operator --alertmanager-instance-selector=app.kubernetes.io/managed-by=observability-operator --thanos-ruler-instance-selector=app.kubernetes.io/managed-by=observability-operator],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:http,HostPort:0,ContainerPort:8080,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:GOGC,Value:30,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_PROMETHEUS_CONFIG_RELOADER,Value:registry.redhat.io/cluster-observability-operator/obo-prometheus-operator-prometheus-config-reloader-rhel9@sha256:1133c973c7472c665f910a722e19c8e2e27accb34b90fab67f14548627ce9c62,ValueFrom:nil,},EnvVar{Name:OPERATOR_CONDITION_NAME,Value:cluster-observability-operator.v1.3.0,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{100 -3} {} 100m DecimalSI},memory: {{524288000 0} {} 500Mi BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{157286400 0} {} 150Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-vnbkw,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:*true,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod obo-prometheus-operator-668cf9dfbb-zwnr6_openshift-operators(fb96b00d-b40a-4af9-8bc5-8b4dfbe2a5fa): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Nov 27 16:55:53 crc kubenswrapper[4707]: E1127 16:55:53.634259 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"prometheus-operator\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-zwnr6" podUID="fb96b00d-b40a-4af9-8bc5-8b4dfbe2a5fa" Nov 27 16:55:53 crc kubenswrapper[4707]: E1127 16:55:53.694563 4707 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/cluster-observability-operator/obo-prometheus-operator-admission-webhook-rhel9@sha256:43d33f0125e6b990f4a972ac4e952a065d7e72dc1690c6c836963b7341734aec" Nov 27 16:55:53 
crc kubenswrapper[4707]: E1127 16:55:53.694742 4707 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:prometheus-operator-admission-webhook,Image:registry.redhat.io/cluster-observability-operator/obo-prometheus-operator-admission-webhook-rhel9@sha256:43d33f0125e6b990f4a972ac4e952a065d7e72dc1690c6c836963b7341734aec,Command:[],Args:[--web.enable-tls=true --web.cert-file=/tmp/k8s-webhook-server/serving-certs/tls.crt --web.key-file=/tmp/k8s-webhook-server/serving-certs/tls.key],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OPERATOR_CONDITION_NAME,Value:cluster-observability-operator.v1.3.0,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{200 -3} {} 200m DecimalSI},memory: {{209715200 0} {} BinarySI},},Requests:ResourceList{cpu: {{50 -3} {} 50m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:apiservice-cert,ReadOnly:false,MountPath:/apiserver.local.config/certificates,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:webhook-cert,ReadOnly:false,MountPath:/tmp/k8s-webhook-server/serving-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:*true,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
obo-prometheus-operator-admission-webhook-7677977bcd-fx2tz_openshift-operators(e1ab71b3-d476-4099-b4d8-c59b437dd4f7): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Nov 27 16:55:53 crc kubenswrapper[4707]: E1127 16:55:53.695985 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"prometheus-operator-admission-webhook\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-operators/obo-prometheus-operator-admission-webhook-7677977bcd-fx2tz" podUID="e1ab71b3-d476-4099-b4d8-c59b437dd4f7" Nov 27 16:55:54 crc kubenswrapper[4707]: E1127 16:55:54.190804 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"prometheus-operator\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/cluster-observability-operator/obo-prometheus-rhel9-operator@sha256:203cf5b9dc1460f09e75f58d8b5cf7df5e57c18c8c6a41c14b5e8977d83263f3\\\"\"" pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-zwnr6" podUID="fb96b00d-b40a-4af9-8bc5-8b4dfbe2a5fa" Nov 27 16:55:54 crc kubenswrapper[4707]: E1127 16:55:54.191135 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"prometheus-operator-admission-webhook\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/cluster-observability-operator/obo-prometheus-operator-admission-webhook-rhel9@sha256:43d33f0125e6b990f4a972ac4e952a065d7e72dc1690c6c836963b7341734aec\\\"\"" pod="openshift-operators/obo-prometheus-operator-admission-webhook-7677977bcd-fx2tz" podUID="e1ab71b3-d476-4099-b4d8-c59b437dd4f7" Nov 27 16:55:57 crc kubenswrapper[4707]: I1127 16:55:57.250406 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodh-0"] Nov 27 16:55:57 crc kubenswrapper[4707]: I1127 16:55:57.251959 
4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/aodh-0" podUID="3794dd05-6c5a-43bc-8a93-b8b841b05bb2" containerName="aodh-api" containerID="cri-o://bb35b0cd5c08a1055926597c936749172940028c77bee5451b3745d104ddee9c" gracePeriod=30 Nov 27 16:55:57 crc kubenswrapper[4707]: I1127 16:55:57.252011 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/aodh-0" podUID="3794dd05-6c5a-43bc-8a93-b8b841b05bb2" containerName="aodh-listener" containerID="cri-o://05935015aee9ab3d8dd8bfe4476e1c035ac1531e8805f0838ce1c3cbb25f9d4b" gracePeriod=30 Nov 27 16:55:57 crc kubenswrapper[4707]: I1127 16:55:57.252091 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/aodh-0" podUID="3794dd05-6c5a-43bc-8a93-b8b841b05bb2" containerName="aodh-notifier" containerID="cri-o://ebf927ed0f86322c1361753cfb6f6b16260a8bb532d7a757c81ee6aa66be08c4" gracePeriod=30 Nov 27 16:55:57 crc kubenswrapper[4707]: I1127 16:55:57.252130 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/aodh-0" podUID="3794dd05-6c5a-43bc-8a93-b8b841b05bb2" containerName="aodh-evaluator" containerID="cri-o://a0d426a992cc3358f4ccc3ed596463a566a7ba3b1e2b16bec89b5d273014e130" gracePeriod=30 Nov 27 16:55:58 crc kubenswrapper[4707]: I1127 16:55:58.248406 4707 generic.go:334] "Generic (PLEG): container finished" podID="3794dd05-6c5a-43bc-8a93-b8b841b05bb2" containerID="a0d426a992cc3358f4ccc3ed596463a566a7ba3b1e2b16bec89b5d273014e130" exitCode=0 Nov 27 16:55:58 crc kubenswrapper[4707]: I1127 16:55:58.248775 4707 generic.go:334] "Generic (PLEG): container finished" podID="3794dd05-6c5a-43bc-8a93-b8b841b05bb2" containerID="bb35b0cd5c08a1055926597c936749172940028c77bee5451b3745d104ddee9c" exitCode=0 Nov 27 16:55:58 crc kubenswrapper[4707]: I1127 16:55:58.248493 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" 
event={"ID":"3794dd05-6c5a-43bc-8a93-b8b841b05bb2","Type":"ContainerDied","Data":"a0d426a992cc3358f4ccc3ed596463a566a7ba3b1e2b16bec89b5d273014e130"} Nov 27 16:55:58 crc kubenswrapper[4707]: I1127 16:55:58.248830 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"3794dd05-6c5a-43bc-8a93-b8b841b05bb2","Type":"ContainerDied","Data":"bb35b0cd5c08a1055926597c936749172940028c77bee5451b3745d104ddee9c"} Nov 27 16:56:01 crc kubenswrapper[4707]: I1127 16:56:01.277952 4707 generic.go:334] "Generic (PLEG): container finished" podID="3794dd05-6c5a-43bc-8a93-b8b841b05bb2" containerID="05935015aee9ab3d8dd8bfe4476e1c035ac1531e8805f0838ce1c3cbb25f9d4b" exitCode=0 Nov 27 16:56:01 crc kubenswrapper[4707]: I1127 16:56:01.278039 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"3794dd05-6c5a-43bc-8a93-b8b841b05bb2","Type":"ContainerDied","Data":"05935015aee9ab3d8dd8bfe4476e1c035ac1531e8805f0838ce1c3cbb25f9d4b"} Nov 27 16:56:05 crc kubenswrapper[4707]: I1127 16:56:05.324686 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-operator-d8bb48f5d-82wd5" event={"ID":"e96caaf7-dd4f-4734-8601-23d38df9005f","Type":"ContainerStarted","Data":"2213d1787568f2fabd631db03517856b86b8d80ed78a2d102f4ccd00c2ef83c8"} Nov 27 16:56:05 crc kubenswrapper[4707]: I1127 16:56:05.326640 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators/observability-operator-d8bb48f5d-82wd5" Nov 27 16:56:05 crc kubenswrapper[4707]: I1127 16:56:05.351646 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/observability-operator-d8bb48f5d-82wd5" podStartSLOduration=2.728319053 podStartE2EDuration="35.351628266s" podCreationTimestamp="2025-11-27 16:55:30 +0000 UTC" firstStartedPulling="2025-11-27 16:55:31.919772298 +0000 UTC m=+3107.551221056" lastFinishedPulling="2025-11-27 16:56:04.543081501 +0000 UTC m=+3140.174530269" 
observedRunningTime="2025-11-27 16:56:05.349890123 +0000 UTC m=+3140.981338931" watchObservedRunningTime="2025-11-27 16:56:05.351628266 +0000 UTC m=+3140.983077034" Nov 27 16:56:05 crc kubenswrapper[4707]: I1127 16:56:05.357486 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/observability-operator-d8bb48f5d-82wd5" Nov 27 16:56:06 crc kubenswrapper[4707]: I1127 16:56:06.195113 4707 scope.go:117] "RemoveContainer" containerID="922f6f006f59568ee3b2de9c7061fa21de9cccddc36312f2d9ef753b4834d084" Nov 27 16:56:06 crc kubenswrapper[4707]: E1127 16:56:06.195621 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-c995m_openshift-machine-config-operator(a83beb0d-8dd1-434a-ace2-933f98e3956f)\"" pod="openshift-machine-config-operator/machine-config-daemon-c995m" podUID="a83beb0d-8dd1-434a-ace2-933f98e3956f" Nov 27 16:56:07 crc kubenswrapper[4707]: I1127 16:56:07.343237 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-7677977bcd-fx2tz" event={"ID":"e1ab71b3-d476-4099-b4d8-c59b437dd4f7","Type":"ContainerStarted","Data":"ef9f84421396fe38bfe0f07211d4c912f1bc0ca052b1da33e11276068b6c342d"} Nov 27 16:56:07 crc kubenswrapper[4707]: I1127 16:56:07.368877 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-admission-webhook-7677977bcd-fx2tz" podStartSLOduration=2.659034773 podStartE2EDuration="37.368858435s" podCreationTimestamp="2025-11-27 16:55:30 +0000 UTC" firstStartedPulling="2025-11-27 16:55:31.771564752 +0000 UTC m=+3107.403013520" lastFinishedPulling="2025-11-27 16:56:06.481388414 +0000 UTC m=+3142.112837182" observedRunningTime="2025-11-27 16:56:07.361063953 +0000 UTC m=+3142.992512721" 
watchObservedRunningTime="2025-11-27 16:56:07.368858435 +0000 UTC m=+3143.000307203" Nov 27 16:56:08 crc kubenswrapper[4707]: I1127 16:56:08.368341 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-7677977bcd-tp2hp" event={"ID":"4208373c-9a43-406a-8262-0aff47b60f85","Type":"ContainerStarted","Data":"1d2a01dbb89bb6bc4035ed74103ccf20fc787601a891f4cc284e88cd0c1b3b84"} Nov 27 16:56:08 crc kubenswrapper[4707]: I1127 16:56:08.372715 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-zwnr6" event={"ID":"fb96b00d-b40a-4af9-8bc5-8b4dfbe2a5fa","Type":"ContainerStarted","Data":"bf3fee7686abc485e5d37a1a3d0f2e2107707e1c1bde588e78563e5c3a76665a"} Nov 27 16:56:08 crc kubenswrapper[4707]: I1127 16:56:08.402568 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-admission-webhook-7677977bcd-tp2hp" podStartSLOduration=-9223371998.452229 podStartE2EDuration="38.402547163s" podCreationTimestamp="2025-11-27 16:55:30 +0000 UTC" firstStartedPulling="2025-11-27 16:55:31.607080004 +0000 UTC m=+3107.238528772" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-27 16:56:08.392845154 +0000 UTC m=+3144.024293952" watchObservedRunningTime="2025-11-27 16:56:08.402547163 +0000 UTC m=+3144.033995941" Nov 27 16:56:08 crc kubenswrapper[4707]: I1127 16:56:08.444273 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-zwnr6" podStartSLOduration=2.7164275289999997 podStartE2EDuration="38.444251782s" podCreationTimestamp="2025-11-27 16:55:30 +0000 UTC" firstStartedPulling="2025-11-27 16:55:31.290003854 +0000 UTC m=+3106.921452612" lastFinishedPulling="2025-11-27 16:56:07.017828097 +0000 UTC m=+3142.649276865" observedRunningTime="2025-11-27 16:56:08.42025234 +0000 UTC m=+3144.051701128" 
watchObservedRunningTime="2025-11-27 16:56:08.444251782 +0000 UTC m=+3144.075700570" Nov 27 16:56:09 crc kubenswrapper[4707]: I1127 16:56:09.385281 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/perses-operator-5446b9c989-jgbmm" event={"ID":"0c162a13-e55c-44f0-9ab7-cc7ce6d87605","Type":"ContainerStarted","Data":"50f86c375ae2ab571bd1d01deb7b01d7a699eb42b3e89167b3c3b04b54feec85"} Nov 27 16:56:09 crc kubenswrapper[4707]: I1127 16:56:09.386226 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators/perses-operator-5446b9c989-jgbmm" Nov 27 16:56:09 crc kubenswrapper[4707]: I1127 16:56:09.407771 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/perses-operator-5446b9c989-jgbmm" podStartSLOduration=2.987204168 podStartE2EDuration="39.407755949s" podCreationTimestamp="2025-11-27 16:55:30 +0000 UTC" firstStartedPulling="2025-11-27 16:55:31.890141057 +0000 UTC m=+3107.521589815" lastFinishedPulling="2025-11-27 16:56:08.310692828 +0000 UTC m=+3143.942141596" observedRunningTime="2025-11-27 16:56:09.404178081 +0000 UTC m=+3145.035626879" watchObservedRunningTime="2025-11-27 16:56:09.407755949 +0000 UTC m=+3145.039204717" Nov 27 16:56:12 crc kubenswrapper[4707]: I1127 16:56:12.034607 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/alertmanager-metric-storage-0"] Nov 27 16:56:12 crc kubenswrapper[4707]: I1127 16:56:12.040044 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/alertmanager-metric-storage-0" Nov 27 16:56:12 crc kubenswrapper[4707]: I1127 16:56:12.042934 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"metric-storage-alertmanager-dockercfg-72djp" Nov 27 16:56:12 crc kubenswrapper[4707]: I1127 16:56:12.043654 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"alertmanager-metric-storage-generated" Nov 27 16:56:12 crc kubenswrapper[4707]: I1127 16:56:12.044302 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"alertmanager-metric-storage-tls-assets-0" Nov 27 16:56:12 crc kubenswrapper[4707]: I1127 16:56:12.045217 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"alertmanager-metric-storage-web-config" Nov 27 16:56:12 crc kubenswrapper[4707]: I1127 16:56:12.047974 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"alertmanager-metric-storage-cluster-tls-config" Nov 27 16:56:12 crc kubenswrapper[4707]: I1127 16:56:12.058263 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/alertmanager-metric-storage-0"] Nov 27 16:56:12 crc kubenswrapper[4707]: I1127 16:56:12.084321 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/b6ba41a2-cc99-4242-be96-b249bc657b2f-web-config\") pod \"alertmanager-metric-storage-0\" (UID: \"b6ba41a2-cc99-4242-be96-b249bc657b2f\") " pod="openstack/alertmanager-metric-storage-0" Nov 27 16:56:12 crc kubenswrapper[4707]: I1127 16:56:12.084393 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/b6ba41a2-cc99-4242-be96-b249bc657b2f-tls-assets\") pod \"alertmanager-metric-storage-0\" (UID: \"b6ba41a2-cc99-4242-be96-b249bc657b2f\") " pod="openstack/alertmanager-metric-storage-0" Nov 27 16:56:12 crc 
kubenswrapper[4707]: I1127 16:56:12.084440 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/b6ba41a2-cc99-4242-be96-b249bc657b2f-config-volume\") pod \"alertmanager-metric-storage-0\" (UID: \"b6ba41a2-cc99-4242-be96-b249bc657b2f\") " pod="openstack/alertmanager-metric-storage-0" Nov 27 16:56:12 crc kubenswrapper[4707]: I1127 16:56:12.084455 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/b6ba41a2-cc99-4242-be96-b249bc657b2f-config-out\") pod \"alertmanager-metric-storage-0\" (UID: \"b6ba41a2-cc99-4242-be96-b249bc657b2f\") " pod="openstack/alertmanager-metric-storage-0" Nov 27 16:56:12 crc kubenswrapper[4707]: I1127 16:56:12.084472 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/b6ba41a2-cc99-4242-be96-b249bc657b2f-cluster-tls-config\") pod \"alertmanager-metric-storage-0\" (UID: \"b6ba41a2-cc99-4242-be96-b249bc657b2f\") " pod="openstack/alertmanager-metric-storage-0" Nov 27 16:56:12 crc kubenswrapper[4707]: I1127 16:56:12.084526 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-metric-storage-db\" (UniqueName: \"kubernetes.io/empty-dir/b6ba41a2-cc99-4242-be96-b249bc657b2f-alertmanager-metric-storage-db\") pod \"alertmanager-metric-storage-0\" (UID: \"b6ba41a2-cc99-4242-be96-b249bc657b2f\") " pod="openstack/alertmanager-metric-storage-0" Nov 27 16:56:12 crc kubenswrapper[4707]: I1127 16:56:12.084599 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l59bm\" (UniqueName: \"kubernetes.io/projected/b6ba41a2-cc99-4242-be96-b249bc657b2f-kube-api-access-l59bm\") pod \"alertmanager-metric-storage-0\" (UID: 
\"b6ba41a2-cc99-4242-be96-b249bc657b2f\") " pod="openstack/alertmanager-metric-storage-0" Nov 27 16:56:12 crc kubenswrapper[4707]: I1127 16:56:12.186411 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/b6ba41a2-cc99-4242-be96-b249bc657b2f-tls-assets\") pod \"alertmanager-metric-storage-0\" (UID: \"b6ba41a2-cc99-4242-be96-b249bc657b2f\") " pod="openstack/alertmanager-metric-storage-0" Nov 27 16:56:12 crc kubenswrapper[4707]: I1127 16:56:12.186491 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/b6ba41a2-cc99-4242-be96-b249bc657b2f-config-volume\") pod \"alertmanager-metric-storage-0\" (UID: \"b6ba41a2-cc99-4242-be96-b249bc657b2f\") " pod="openstack/alertmanager-metric-storage-0" Nov 27 16:56:12 crc kubenswrapper[4707]: I1127 16:56:12.186511 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/b6ba41a2-cc99-4242-be96-b249bc657b2f-config-out\") pod \"alertmanager-metric-storage-0\" (UID: \"b6ba41a2-cc99-4242-be96-b249bc657b2f\") " pod="openstack/alertmanager-metric-storage-0" Nov 27 16:56:12 crc kubenswrapper[4707]: I1127 16:56:12.186539 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/b6ba41a2-cc99-4242-be96-b249bc657b2f-cluster-tls-config\") pod \"alertmanager-metric-storage-0\" (UID: \"b6ba41a2-cc99-4242-be96-b249bc657b2f\") " pod="openstack/alertmanager-metric-storage-0" Nov 27 16:56:12 crc kubenswrapper[4707]: I1127 16:56:12.186614 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"alertmanager-metric-storage-db\" (UniqueName: \"kubernetes.io/empty-dir/b6ba41a2-cc99-4242-be96-b249bc657b2f-alertmanager-metric-storage-db\") pod \"alertmanager-metric-storage-0\" (UID: 
\"b6ba41a2-cc99-4242-be96-b249bc657b2f\") " pod="openstack/alertmanager-metric-storage-0" Nov 27 16:56:12 crc kubenswrapper[4707]: I1127 16:56:12.186739 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l59bm\" (UniqueName: \"kubernetes.io/projected/b6ba41a2-cc99-4242-be96-b249bc657b2f-kube-api-access-l59bm\") pod \"alertmanager-metric-storage-0\" (UID: \"b6ba41a2-cc99-4242-be96-b249bc657b2f\") " pod="openstack/alertmanager-metric-storage-0" Nov 27 16:56:12 crc kubenswrapper[4707]: I1127 16:56:12.186805 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/b6ba41a2-cc99-4242-be96-b249bc657b2f-web-config\") pod \"alertmanager-metric-storage-0\" (UID: \"b6ba41a2-cc99-4242-be96-b249bc657b2f\") " pod="openstack/alertmanager-metric-storage-0" Nov 27 16:56:12 crc kubenswrapper[4707]: I1127 16:56:12.188873 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"alertmanager-metric-storage-db\" (UniqueName: \"kubernetes.io/empty-dir/b6ba41a2-cc99-4242-be96-b249bc657b2f-alertmanager-metric-storage-db\") pod \"alertmanager-metric-storage-0\" (UID: \"b6ba41a2-cc99-4242-be96-b249bc657b2f\") " pod="openstack/alertmanager-metric-storage-0" Nov 27 16:56:12 crc kubenswrapper[4707]: I1127 16:56:12.195939 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/b6ba41a2-cc99-4242-be96-b249bc657b2f-web-config\") pod \"alertmanager-metric-storage-0\" (UID: \"b6ba41a2-cc99-4242-be96-b249bc657b2f\") " pod="openstack/alertmanager-metric-storage-0" Nov 27 16:56:12 crc kubenswrapper[4707]: I1127 16:56:12.196193 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/b6ba41a2-cc99-4242-be96-b249bc657b2f-config-out\") pod \"alertmanager-metric-storage-0\" (UID: \"b6ba41a2-cc99-4242-be96-b249bc657b2f\") " 
pod="openstack/alertmanager-metric-storage-0" Nov 27 16:56:12 crc kubenswrapper[4707]: I1127 16:56:12.197846 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/b6ba41a2-cc99-4242-be96-b249bc657b2f-cluster-tls-config\") pod \"alertmanager-metric-storage-0\" (UID: \"b6ba41a2-cc99-4242-be96-b249bc657b2f\") " pod="openstack/alertmanager-metric-storage-0" Nov 27 16:56:12 crc kubenswrapper[4707]: I1127 16:56:12.198872 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/b6ba41a2-cc99-4242-be96-b249bc657b2f-tls-assets\") pod \"alertmanager-metric-storage-0\" (UID: \"b6ba41a2-cc99-4242-be96-b249bc657b2f\") " pod="openstack/alertmanager-metric-storage-0" Nov 27 16:56:12 crc kubenswrapper[4707]: I1127 16:56:12.198993 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/b6ba41a2-cc99-4242-be96-b249bc657b2f-config-volume\") pod \"alertmanager-metric-storage-0\" (UID: \"b6ba41a2-cc99-4242-be96-b249bc657b2f\") " pod="openstack/alertmanager-metric-storage-0" Nov 27 16:56:12 crc kubenswrapper[4707]: I1127 16:56:12.205502 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l59bm\" (UniqueName: \"kubernetes.io/projected/b6ba41a2-cc99-4242-be96-b249bc657b2f-kube-api-access-l59bm\") pod \"alertmanager-metric-storage-0\" (UID: \"b6ba41a2-cc99-4242-be96-b249bc657b2f\") " pod="openstack/alertmanager-metric-storage-0" Nov 27 16:56:12 crc kubenswrapper[4707]: I1127 16:56:12.358523 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/alertmanager-metric-storage-0" Nov 27 16:56:12 crc kubenswrapper[4707]: W1127 16:56:12.864097 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb6ba41a2_cc99_4242_be96_b249bc657b2f.slice/crio-d1eb16578ce1934b8557c32cc939e0f6342d905ea6fe58a615952ab86ac81b62 WatchSource:0}: Error finding container d1eb16578ce1934b8557c32cc939e0f6342d905ea6fe58a615952ab86ac81b62: Status 404 returned error can't find the container with id d1eb16578ce1934b8557c32cc939e0f6342d905ea6fe58a615952ab86ac81b62 Nov 27 16:56:12 crc kubenswrapper[4707]: I1127 16:56:12.867060 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/alertmanager-metric-storage-0"] Nov 27 16:56:13 crc kubenswrapper[4707]: I1127 16:56:13.427619 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/prometheus-metric-storage-0"] Nov 27 16:56:13 crc kubenswrapper[4707]: I1127 16:56:13.432199 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/alertmanager-metric-storage-0" event={"ID":"b6ba41a2-cc99-4242-be96-b249bc657b2f","Type":"ContainerStarted","Data":"d1eb16578ce1934b8557c32cc939e0f6342d905ea6fe58a615952ab86ac81b62"} Nov 27 16:56:13 crc kubenswrapper[4707]: I1127 16:56:13.432326 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/prometheus-metric-storage-0" Nov 27 16:56:13 crc kubenswrapper[4707]: I1127 16:56:13.436990 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-0" Nov 27 16:56:13 crc kubenswrapper[4707]: I1127 16:56:13.437061 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"metric-storage-prometheus-dockercfg-fmtlt" Nov 27 16:56:13 crc kubenswrapper[4707]: I1127 16:56:13.437405 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-thanos-prometheus-http-client-file" Nov 27 16:56:13 crc kubenswrapper[4707]: I1127 16:56:13.437674 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage" Nov 27 16:56:13 crc kubenswrapper[4707]: I1127 16:56:13.437894 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-web-config" Nov 27 16:56:13 crc kubenswrapper[4707]: I1127 16:56:13.440876 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Nov 27 16:56:13 crc kubenswrapper[4707]: I1127 16:56:13.446639 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-tls-assets-0" Nov 27 16:56:13 crc kubenswrapper[4707]: I1127 16:56:13.512093 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/b4ac6b43-b646-4f4d-b04c-e911806728a8-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"b4ac6b43-b646-4f4d-b04c-e911806728a8\") " pod="openstack/prometheus-metric-storage-0" Nov 27 16:56:13 crc kubenswrapper[4707]: I1127 16:56:13.512227 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: 
\"kubernetes.io/secret/b4ac6b43-b646-4f4d-b04c-e911806728a8-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"b4ac6b43-b646-4f4d-b04c-e911806728a8\") " pod="openstack/prometheus-metric-storage-0" Nov 27 16:56:13 crc kubenswrapper[4707]: I1127 16:56:13.512285 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/b4ac6b43-b646-4f4d-b04c-e911806728a8-config\") pod \"prometheus-metric-storage-0\" (UID: \"b4ac6b43-b646-4f4d-b04c-e911806728a8\") " pod="openstack/prometheus-metric-storage-0" Nov 27 16:56:13 crc kubenswrapper[4707]: I1127 16:56:13.512345 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s7pf8\" (UniqueName: \"kubernetes.io/projected/b4ac6b43-b646-4f4d-b04c-e911806728a8-kube-api-access-s7pf8\") pod \"prometheus-metric-storage-0\" (UID: \"b4ac6b43-b646-4f4d-b04c-e911806728a8\") " pod="openstack/prometheus-metric-storage-0" Nov 27 16:56:13 crc kubenswrapper[4707]: I1127 16:56:13.512431 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/b4ac6b43-b646-4f4d-b04c-e911806728a8-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"b4ac6b43-b646-4f4d-b04c-e911806728a8\") " pod="openstack/prometheus-metric-storage-0" Nov 27 16:56:13 crc kubenswrapper[4707]: I1127 16:56:13.512508 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/b4ac6b43-b646-4f4d-b04c-e911806728a8-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"b4ac6b43-b646-4f4d-b04c-e911806728a8\") " pod="openstack/prometheus-metric-storage-0" Nov 27 16:56:13 crc kubenswrapper[4707]: I1127 16:56:13.512544 4707 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/b4ac6b43-b646-4f4d-b04c-e911806728a8-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"b4ac6b43-b646-4f4d-b04c-e911806728a8\") " pod="openstack/prometheus-metric-storage-0" Nov 27 16:56:13 crc kubenswrapper[4707]: I1127 16:56:13.512577 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-2e34e9c0-2aa8-41c4-8b3b-646a52e4cb09\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-2e34e9c0-2aa8-41c4-8b3b-646a52e4cb09\") pod \"prometheus-metric-storage-0\" (UID: \"b4ac6b43-b646-4f4d-b04c-e911806728a8\") " pod="openstack/prometheus-metric-storage-0" Nov 27 16:56:13 crc kubenswrapper[4707]: I1127 16:56:13.615080 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/b4ac6b43-b646-4f4d-b04c-e911806728a8-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"b4ac6b43-b646-4f4d-b04c-e911806728a8\") " pod="openstack/prometheus-metric-storage-0" Nov 27 16:56:13 crc kubenswrapper[4707]: I1127 16:56:13.615186 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/b4ac6b43-b646-4f4d-b04c-e911806728a8-config\") pod \"prometheus-metric-storage-0\" (UID: \"b4ac6b43-b646-4f4d-b04c-e911806728a8\") " pod="openstack/prometheus-metric-storage-0" Nov 27 16:56:13 crc kubenswrapper[4707]: I1127 16:56:13.615243 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s7pf8\" (UniqueName: \"kubernetes.io/projected/b4ac6b43-b646-4f4d-b04c-e911806728a8-kube-api-access-s7pf8\") pod \"prometheus-metric-storage-0\" (UID: \"b4ac6b43-b646-4f4d-b04c-e911806728a8\") " pod="openstack/prometheus-metric-storage-0" Nov 27 16:56:13 crc kubenswrapper[4707]: I1127 
16:56:13.615330 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/b4ac6b43-b646-4f4d-b04c-e911806728a8-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"b4ac6b43-b646-4f4d-b04c-e911806728a8\") " pod="openstack/prometheus-metric-storage-0" Nov 27 16:56:13 crc kubenswrapper[4707]: I1127 16:56:13.615415 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/b4ac6b43-b646-4f4d-b04c-e911806728a8-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"b4ac6b43-b646-4f4d-b04c-e911806728a8\") " pod="openstack/prometheus-metric-storage-0" Nov 27 16:56:13 crc kubenswrapper[4707]: I1127 16:56:13.615493 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/b4ac6b43-b646-4f4d-b04c-e911806728a8-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"b4ac6b43-b646-4f4d-b04c-e911806728a8\") " pod="openstack/prometheus-metric-storage-0" Nov 27 16:56:13 crc kubenswrapper[4707]: I1127 16:56:13.615529 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-2e34e9c0-2aa8-41c4-8b3b-646a52e4cb09\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-2e34e9c0-2aa8-41c4-8b3b-646a52e4cb09\") pod \"prometheus-metric-storage-0\" (UID: \"b4ac6b43-b646-4f4d-b04c-e911806728a8\") " pod="openstack/prometheus-metric-storage-0" Nov 27 16:56:13 crc kubenswrapper[4707]: I1127 16:56:13.615578 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/b4ac6b43-b646-4f4d-b04c-e911806728a8-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"b4ac6b43-b646-4f4d-b04c-e911806728a8\") " pod="openstack/prometheus-metric-storage-0" Nov 27 16:56:13 crc kubenswrapper[4707]: I1127 
16:56:13.616789 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/b4ac6b43-b646-4f4d-b04c-e911806728a8-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"b4ac6b43-b646-4f4d-b04c-e911806728a8\") " pod="openstack/prometheus-metric-storage-0" Nov 27 16:56:13 crc kubenswrapper[4707]: I1127 16:56:13.620426 4707 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Nov 27 16:56:13 crc kubenswrapper[4707]: I1127 16:56:13.620626 4707 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-2e34e9c0-2aa8-41c4-8b3b-646a52e4cb09\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-2e34e9c0-2aa8-41c4-8b3b-646a52e4cb09\") pod \"prometheus-metric-storage-0\" (UID: \"b4ac6b43-b646-4f4d-b04c-e911806728a8\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/38f6cd8bb482fd46a18189c3288fcbb9a3e0d2c6dfce600cb079b84af3ec796b/globalmount\"" pod="openstack/prometheus-metric-storage-0" Nov 27 16:56:13 crc kubenswrapper[4707]: I1127 16:56:13.622688 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/b4ac6b43-b646-4f4d-b04c-e911806728a8-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"b4ac6b43-b646-4f4d-b04c-e911806728a8\") " pod="openstack/prometheus-metric-storage-0" Nov 27 16:56:13 crc kubenswrapper[4707]: I1127 16:56:13.622784 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/b4ac6b43-b646-4f4d-b04c-e911806728a8-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"b4ac6b43-b646-4f4d-b04c-e911806728a8\") " pod="openstack/prometheus-metric-storage-0" Nov 27 16:56:13 crc kubenswrapper[4707]: I1127 16:56:13.623471 4707 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/b4ac6b43-b646-4f4d-b04c-e911806728a8-config\") pod \"prometheus-metric-storage-0\" (UID: \"b4ac6b43-b646-4f4d-b04c-e911806728a8\") " pod="openstack/prometheus-metric-storage-0" Nov 27 16:56:13 crc kubenswrapper[4707]: I1127 16:56:13.628279 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/b4ac6b43-b646-4f4d-b04c-e911806728a8-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"b4ac6b43-b646-4f4d-b04c-e911806728a8\") " pod="openstack/prometheus-metric-storage-0" Nov 27 16:56:13 crc kubenswrapper[4707]: I1127 16:56:13.632804 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/b4ac6b43-b646-4f4d-b04c-e911806728a8-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"b4ac6b43-b646-4f4d-b04c-e911806728a8\") " pod="openstack/prometheus-metric-storage-0" Nov 27 16:56:13 crc kubenswrapper[4707]: I1127 16:56:13.641229 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s7pf8\" (UniqueName: \"kubernetes.io/projected/b4ac6b43-b646-4f4d-b04c-e911806728a8-kube-api-access-s7pf8\") pod \"prometheus-metric-storage-0\" (UID: \"b4ac6b43-b646-4f4d-b04c-e911806728a8\") " pod="openstack/prometheus-metric-storage-0" Nov 27 16:56:13 crc kubenswrapper[4707]: I1127 16:56:13.697766 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-2e34e9c0-2aa8-41c4-8b3b-646a52e4cb09\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-2e34e9c0-2aa8-41c4-8b3b-646a52e4cb09\") pod \"prometheus-metric-storage-0\" (UID: \"b4ac6b43-b646-4f4d-b04c-e911806728a8\") " pod="openstack/prometheus-metric-storage-0" Nov 27 16:56:13 crc kubenswrapper[4707]: I1127 16:56:13.778951 4707 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Nov 27 16:56:14 crc kubenswrapper[4707]: I1127 16:56:14.320286 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Nov 27 16:56:14 crc kubenswrapper[4707]: W1127 16:56:14.330076 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb4ac6b43_b646_4f4d_b04c_e911806728a8.slice/crio-cb475767a9bfddb0bd8a767dc77ea48652bfbc1439614a4040d1c80cf3d46f57 WatchSource:0}: Error finding container cb475767a9bfddb0bd8a767dc77ea48652bfbc1439614a4040d1c80cf3d46f57: Status 404 returned error can't find the container with id cb475767a9bfddb0bd8a767dc77ea48652bfbc1439614a4040d1c80cf3d46f57 Nov 27 16:56:14 crc kubenswrapper[4707]: I1127 16:56:14.467678 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"b4ac6b43-b646-4f4d-b04c-e911806728a8","Type":"ContainerStarted","Data":"cb475767a9bfddb0bd8a767dc77ea48652bfbc1439614a4040d1c80cf3d46f57"} Nov 27 16:56:18 crc kubenswrapper[4707]: I1127 16:56:18.195706 4707 scope.go:117] "RemoveContainer" containerID="922f6f006f59568ee3b2de9c7061fa21de9cccddc36312f2d9ef753b4834d084" Nov 27 16:56:18 crc kubenswrapper[4707]: E1127 16:56:18.196817 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-c995m_openshift-machine-config-operator(a83beb0d-8dd1-434a-ace2-933f98e3956f)\"" pod="openshift-machine-config-operator/machine-config-daemon-c995m" podUID="a83beb0d-8dd1-434a-ace2-933f98e3956f" Nov 27 16:56:20 crc kubenswrapper[4707]: I1127 16:56:20.551907 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/alertmanager-metric-storage-0" 
event={"ID":"b6ba41a2-cc99-4242-be96-b249bc657b2f","Type":"ContainerStarted","Data":"053d0443ca79c1225037a4b4b538359448b2858eacac4f0f8faa483410821af1"} Nov 27 16:56:20 crc kubenswrapper[4707]: I1127 16:56:20.553279 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"b4ac6b43-b646-4f4d-b04c-e911806728a8","Type":"ContainerStarted","Data":"a1afff1d55478d3ed92746e8f3ccbccfdab25f9adb3d5128b738e7a70e60b55d"} Nov 27 16:56:21 crc kubenswrapper[4707]: I1127 16:56:21.130354 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/perses-operator-5446b9c989-jgbmm" Nov 27 16:56:27 crc kubenswrapper[4707]: I1127 16:56:27.638491 4707 generic.go:334] "Generic (PLEG): container finished" podID="3794dd05-6c5a-43bc-8a93-b8b841b05bb2" containerID="ebf927ed0f86322c1361753cfb6f6b16260a8bb532d7a757c81ee6aa66be08c4" exitCode=137 Nov 27 16:56:27 crc kubenswrapper[4707]: I1127 16:56:27.639084 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"3794dd05-6c5a-43bc-8a93-b8b841b05bb2","Type":"ContainerDied","Data":"ebf927ed0f86322c1361753cfb6f6b16260a8bb532d7a757c81ee6aa66be08c4"} Nov 27 16:56:27 crc kubenswrapper[4707]: I1127 16:56:27.644143 4707 generic.go:334] "Generic (PLEG): container finished" podID="b6ba41a2-cc99-4242-be96-b249bc657b2f" containerID="053d0443ca79c1225037a4b4b538359448b2858eacac4f0f8faa483410821af1" exitCode=0 Nov 27 16:56:27 crc kubenswrapper[4707]: I1127 16:56:27.644205 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/alertmanager-metric-storage-0" event={"ID":"b6ba41a2-cc99-4242-be96-b249bc657b2f","Type":"ContainerDied","Data":"053d0443ca79c1225037a4b4b538359448b2858eacac4f0f8faa483410821af1"} Nov 27 16:56:27 crc kubenswrapper[4707]: I1127 16:56:27.648393 4707 generic.go:334] "Generic (PLEG): container finished" podID="b4ac6b43-b646-4f4d-b04c-e911806728a8" 
containerID="a1afff1d55478d3ed92746e8f3ccbccfdab25f9adb3d5128b738e7a70e60b55d" exitCode=0 Nov 27 16:56:27 crc kubenswrapper[4707]: I1127 16:56:27.648420 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"b4ac6b43-b646-4f4d-b04c-e911806728a8","Type":"ContainerDied","Data":"a1afff1d55478d3ed92746e8f3ccbccfdab25f9adb3d5128b738e7a70e60b55d"} Nov 27 16:56:27 crc kubenswrapper[4707]: I1127 16:56:27.880200 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-0" Nov 27 16:56:27 crc kubenswrapper[4707]: I1127 16:56:27.995932 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tq25m\" (UniqueName: \"kubernetes.io/projected/3794dd05-6c5a-43bc-8a93-b8b841b05bb2-kube-api-access-tq25m\") pod \"3794dd05-6c5a-43bc-8a93-b8b841b05bb2\" (UID: \"3794dd05-6c5a-43bc-8a93-b8b841b05bb2\") " Nov 27 16:56:27 crc kubenswrapper[4707]: I1127 16:56:27.996001 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3794dd05-6c5a-43bc-8a93-b8b841b05bb2-config-data\") pod \"3794dd05-6c5a-43bc-8a93-b8b841b05bb2\" (UID: \"3794dd05-6c5a-43bc-8a93-b8b841b05bb2\") " Nov 27 16:56:27 crc kubenswrapper[4707]: I1127 16:56:27.996057 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3794dd05-6c5a-43bc-8a93-b8b841b05bb2-public-tls-certs\") pod \"3794dd05-6c5a-43bc-8a93-b8b841b05bb2\" (UID: \"3794dd05-6c5a-43bc-8a93-b8b841b05bb2\") " Nov 27 16:56:27 crc kubenswrapper[4707]: I1127 16:56:27.996096 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3794dd05-6c5a-43bc-8a93-b8b841b05bb2-internal-tls-certs\") pod \"3794dd05-6c5a-43bc-8a93-b8b841b05bb2\" (UID: \"3794dd05-6c5a-43bc-8a93-b8b841b05bb2\") " Nov 
27 16:56:27 crc kubenswrapper[4707]: I1127 16:56:27.996272 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3794dd05-6c5a-43bc-8a93-b8b841b05bb2-scripts\") pod \"3794dd05-6c5a-43bc-8a93-b8b841b05bb2\" (UID: \"3794dd05-6c5a-43bc-8a93-b8b841b05bb2\") " Nov 27 16:56:27 crc kubenswrapper[4707]: I1127 16:56:27.996327 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3794dd05-6c5a-43bc-8a93-b8b841b05bb2-combined-ca-bundle\") pod \"3794dd05-6c5a-43bc-8a93-b8b841b05bb2\" (UID: \"3794dd05-6c5a-43bc-8a93-b8b841b05bb2\") " Nov 27 16:56:28 crc kubenswrapper[4707]: I1127 16:56:28.013706 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3794dd05-6c5a-43bc-8a93-b8b841b05bb2-kube-api-access-tq25m" (OuterVolumeSpecName: "kube-api-access-tq25m") pod "3794dd05-6c5a-43bc-8a93-b8b841b05bb2" (UID: "3794dd05-6c5a-43bc-8a93-b8b841b05bb2"). InnerVolumeSpecName "kube-api-access-tq25m". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 16:56:28 crc kubenswrapper[4707]: I1127 16:56:28.022553 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3794dd05-6c5a-43bc-8a93-b8b841b05bb2-scripts" (OuterVolumeSpecName: "scripts") pod "3794dd05-6c5a-43bc-8a93-b8b841b05bb2" (UID: "3794dd05-6c5a-43bc-8a93-b8b841b05bb2"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 16:56:28 crc kubenswrapper[4707]: I1127 16:56:28.099100 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3794dd05-6c5a-43bc-8a93-b8b841b05bb2-scripts\") on node \"crc\" DevicePath \"\"" Nov 27 16:56:28 crc kubenswrapper[4707]: I1127 16:56:28.099128 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tq25m\" (UniqueName: \"kubernetes.io/projected/3794dd05-6c5a-43bc-8a93-b8b841b05bb2-kube-api-access-tq25m\") on node \"crc\" DevicePath \"\"" Nov 27 16:56:28 crc kubenswrapper[4707]: I1127 16:56:28.160082 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3794dd05-6c5a-43bc-8a93-b8b841b05bb2-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "3794dd05-6c5a-43bc-8a93-b8b841b05bb2" (UID: "3794dd05-6c5a-43bc-8a93-b8b841b05bb2"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 16:56:28 crc kubenswrapper[4707]: I1127 16:56:28.174802 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3794dd05-6c5a-43bc-8a93-b8b841b05bb2-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "3794dd05-6c5a-43bc-8a93-b8b841b05bb2" (UID: "3794dd05-6c5a-43bc-8a93-b8b841b05bb2"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 16:56:28 crc kubenswrapper[4707]: I1127 16:56:28.200522 4707 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3794dd05-6c5a-43bc-8a93-b8b841b05bb2-public-tls-certs\") on node \"crc\" DevicePath \"\"" Nov 27 16:56:28 crc kubenswrapper[4707]: I1127 16:56:28.200767 4707 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3794dd05-6c5a-43bc-8a93-b8b841b05bb2-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Nov 27 16:56:28 crc kubenswrapper[4707]: I1127 16:56:28.237458 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3794dd05-6c5a-43bc-8a93-b8b841b05bb2-config-data" (OuterVolumeSpecName: "config-data") pod "3794dd05-6c5a-43bc-8a93-b8b841b05bb2" (UID: "3794dd05-6c5a-43bc-8a93-b8b841b05bb2"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 16:56:28 crc kubenswrapper[4707]: I1127 16:56:28.272056 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3794dd05-6c5a-43bc-8a93-b8b841b05bb2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3794dd05-6c5a-43bc-8a93-b8b841b05bb2" (UID: "3794dd05-6c5a-43bc-8a93-b8b841b05bb2"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 16:56:28 crc kubenswrapper[4707]: I1127 16:56:28.302802 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3794dd05-6c5a-43bc-8a93-b8b841b05bb2-config-data\") on node \"crc\" DevicePath \"\"" Nov 27 16:56:28 crc kubenswrapper[4707]: I1127 16:56:28.303096 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3794dd05-6c5a-43bc-8a93-b8b841b05bb2-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 27 16:56:28 crc kubenswrapper[4707]: I1127 16:56:28.660442 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"3794dd05-6c5a-43bc-8a93-b8b841b05bb2","Type":"ContainerDied","Data":"60645f07d0c8b4508dae87a372c98e826e694278ce8b3af733a1c724b5c2cfb4"} Nov 27 16:56:28 crc kubenswrapper[4707]: I1127 16:56:28.660506 4707 scope.go:117] "RemoveContainer" containerID="05935015aee9ab3d8dd8bfe4476e1c035ac1531e8805f0838ce1c3cbb25f9d4b" Nov 27 16:56:28 crc kubenswrapper[4707]: I1127 16:56:28.660566 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-0" Nov 27 16:56:28 crc kubenswrapper[4707]: I1127 16:56:28.686150 4707 scope.go:117] "RemoveContainer" containerID="ebf927ed0f86322c1361753cfb6f6b16260a8bb532d7a757c81ee6aa66be08c4" Nov 27 16:56:28 crc kubenswrapper[4707]: I1127 16:56:28.705840 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodh-0"] Nov 27 16:56:28 crc kubenswrapper[4707]: I1127 16:56:28.720774 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/aodh-0"] Nov 27 16:56:28 crc kubenswrapper[4707]: I1127 16:56:28.731337 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/aodh-0"] Nov 27 16:56:28 crc kubenswrapper[4707]: E1127 16:56:28.731745 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3794dd05-6c5a-43bc-8a93-b8b841b05bb2" containerName="aodh-notifier" Nov 27 16:56:28 crc kubenswrapper[4707]: I1127 16:56:28.731761 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="3794dd05-6c5a-43bc-8a93-b8b841b05bb2" containerName="aodh-notifier" Nov 27 16:56:28 crc kubenswrapper[4707]: E1127 16:56:28.731777 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3794dd05-6c5a-43bc-8a93-b8b841b05bb2" containerName="aodh-evaluator" Nov 27 16:56:28 crc kubenswrapper[4707]: I1127 16:56:28.731783 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="3794dd05-6c5a-43bc-8a93-b8b841b05bb2" containerName="aodh-evaluator" Nov 27 16:56:28 crc kubenswrapper[4707]: E1127 16:56:28.731805 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3794dd05-6c5a-43bc-8a93-b8b841b05bb2" containerName="aodh-api" Nov 27 16:56:28 crc kubenswrapper[4707]: I1127 16:56:28.731811 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="3794dd05-6c5a-43bc-8a93-b8b841b05bb2" containerName="aodh-api" Nov 27 16:56:28 crc kubenswrapper[4707]: E1127 16:56:28.731831 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3794dd05-6c5a-43bc-8a93-b8b841b05bb2" 
containerName="aodh-listener" Nov 27 16:56:28 crc kubenswrapper[4707]: I1127 16:56:28.731836 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="3794dd05-6c5a-43bc-8a93-b8b841b05bb2" containerName="aodh-listener" Nov 27 16:56:28 crc kubenswrapper[4707]: I1127 16:56:28.731999 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="3794dd05-6c5a-43bc-8a93-b8b841b05bb2" containerName="aodh-api" Nov 27 16:56:28 crc kubenswrapper[4707]: I1127 16:56:28.732013 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="3794dd05-6c5a-43bc-8a93-b8b841b05bb2" containerName="aodh-notifier" Nov 27 16:56:28 crc kubenswrapper[4707]: I1127 16:56:28.732024 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="3794dd05-6c5a-43bc-8a93-b8b841b05bb2" containerName="aodh-listener" Nov 27 16:56:28 crc kubenswrapper[4707]: I1127 16:56:28.732052 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="3794dd05-6c5a-43bc-8a93-b8b841b05bb2" containerName="aodh-evaluator" Nov 27 16:56:28 crc kubenswrapper[4707]: I1127 16:56:28.738573 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-0" Nov 27 16:56:28 crc kubenswrapper[4707]: I1127 16:56:28.742796 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-aodh-public-svc" Nov 27 16:56:28 crc kubenswrapper[4707]: I1127 16:56:28.742994 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-autoscaling-dockercfg-pfvsr" Nov 27 16:56:28 crc kubenswrapper[4707]: I1127 16:56:28.743100 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-scripts" Nov 27 16:56:28 crc kubenswrapper[4707]: I1127 16:56:28.743907 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-aodh-internal-svc" Nov 27 16:56:28 crc kubenswrapper[4707]: I1127 16:56:28.747303 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-config-data" Nov 27 16:56:28 crc kubenswrapper[4707]: I1127 16:56:28.753017 4707 scope.go:117] "RemoveContainer" containerID="a0d426a992cc3358f4ccc3ed596463a566a7ba3b1e2b16bec89b5d273014e130" Nov 27 16:56:28 crc kubenswrapper[4707]: I1127 16:56:28.760087 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-0"] Nov 27 16:56:28 crc kubenswrapper[4707]: I1127 16:56:28.861046 4707 scope.go:117] "RemoveContainer" containerID="bb35b0cd5c08a1055926597c936749172940028c77bee5451b3745d104ddee9c" Nov 27 16:56:28 crc kubenswrapper[4707]: I1127 16:56:28.914578 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6b244f80-b143-4626-a0ac-9cfe02a60f7f-config-data\") pod \"aodh-0\" (UID: \"6b244f80-b143-4626-a0ac-9cfe02a60f7f\") " pod="openstack/aodh-0" Nov 27 16:56:28 crc kubenswrapper[4707]: I1127 16:56:28.914628 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/6b244f80-b143-4626-a0ac-9cfe02a60f7f-public-tls-certs\") pod \"aodh-0\" (UID: \"6b244f80-b143-4626-a0ac-9cfe02a60f7f\") " pod="openstack/aodh-0" Nov 27 16:56:28 crc kubenswrapper[4707]: I1127 16:56:28.914654 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sj8ms\" (UniqueName: \"kubernetes.io/projected/6b244f80-b143-4626-a0ac-9cfe02a60f7f-kube-api-access-sj8ms\") pod \"aodh-0\" (UID: \"6b244f80-b143-4626-a0ac-9cfe02a60f7f\") " pod="openstack/aodh-0" Nov 27 16:56:28 crc kubenswrapper[4707]: I1127 16:56:28.914922 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6b244f80-b143-4626-a0ac-9cfe02a60f7f-internal-tls-certs\") pod \"aodh-0\" (UID: \"6b244f80-b143-4626-a0ac-9cfe02a60f7f\") " pod="openstack/aodh-0" Nov 27 16:56:28 crc kubenswrapper[4707]: I1127 16:56:28.914975 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6b244f80-b143-4626-a0ac-9cfe02a60f7f-scripts\") pod \"aodh-0\" (UID: \"6b244f80-b143-4626-a0ac-9cfe02a60f7f\") " pod="openstack/aodh-0" Nov 27 16:56:28 crc kubenswrapper[4707]: I1127 16:56:28.915028 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6b244f80-b143-4626-a0ac-9cfe02a60f7f-combined-ca-bundle\") pod \"aodh-0\" (UID: \"6b244f80-b143-4626-a0ac-9cfe02a60f7f\") " pod="openstack/aodh-0" Nov 27 16:56:29 crc kubenswrapper[4707]: I1127 16:56:29.016810 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6b244f80-b143-4626-a0ac-9cfe02a60f7f-internal-tls-certs\") pod \"aodh-0\" (UID: \"6b244f80-b143-4626-a0ac-9cfe02a60f7f\") " pod="openstack/aodh-0" Nov 27 16:56:29 
crc kubenswrapper[4707]: I1127 16:56:29.017094 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6b244f80-b143-4626-a0ac-9cfe02a60f7f-scripts\") pod \"aodh-0\" (UID: \"6b244f80-b143-4626-a0ac-9cfe02a60f7f\") " pod="openstack/aodh-0"
Nov 27 16:56:29 crc kubenswrapper[4707]: I1127 16:56:29.017197 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6b244f80-b143-4626-a0ac-9cfe02a60f7f-combined-ca-bundle\") pod \"aodh-0\" (UID: \"6b244f80-b143-4626-a0ac-9cfe02a60f7f\") " pod="openstack/aodh-0"
Nov 27 16:56:29 crc kubenswrapper[4707]: I1127 16:56:29.017411 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6b244f80-b143-4626-a0ac-9cfe02a60f7f-config-data\") pod \"aodh-0\" (UID: \"6b244f80-b143-4626-a0ac-9cfe02a60f7f\") " pod="openstack/aodh-0"
Nov 27 16:56:29 crc kubenswrapper[4707]: I1127 16:56:29.017520 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6b244f80-b143-4626-a0ac-9cfe02a60f7f-public-tls-certs\") pod \"aodh-0\" (UID: \"6b244f80-b143-4626-a0ac-9cfe02a60f7f\") " pod="openstack/aodh-0"
Nov 27 16:56:29 crc kubenswrapper[4707]: I1127 16:56:29.017634 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sj8ms\" (UniqueName: \"kubernetes.io/projected/6b244f80-b143-4626-a0ac-9cfe02a60f7f-kube-api-access-sj8ms\") pod \"aodh-0\" (UID: \"6b244f80-b143-4626-a0ac-9cfe02a60f7f\") " pod="openstack/aodh-0"
Nov 27 16:56:29 crc kubenswrapper[4707]: I1127 16:56:29.021803 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6b244f80-b143-4626-a0ac-9cfe02a60f7f-internal-tls-certs\") pod \"aodh-0\" (UID: \"6b244f80-b143-4626-a0ac-9cfe02a60f7f\") " pod="openstack/aodh-0"
Nov 27 16:56:29 crc kubenswrapper[4707]: I1127 16:56:29.021998 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6b244f80-b143-4626-a0ac-9cfe02a60f7f-scripts\") pod \"aodh-0\" (UID: \"6b244f80-b143-4626-a0ac-9cfe02a60f7f\") " pod="openstack/aodh-0"
Nov 27 16:56:29 crc kubenswrapper[4707]: I1127 16:56:29.022695 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6b244f80-b143-4626-a0ac-9cfe02a60f7f-combined-ca-bundle\") pod \"aodh-0\" (UID: \"6b244f80-b143-4626-a0ac-9cfe02a60f7f\") " pod="openstack/aodh-0"
Nov 27 16:56:29 crc kubenswrapper[4707]: I1127 16:56:29.022971 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6b244f80-b143-4626-a0ac-9cfe02a60f7f-config-data\") pod \"aodh-0\" (UID: \"6b244f80-b143-4626-a0ac-9cfe02a60f7f\") " pod="openstack/aodh-0"
Nov 27 16:56:29 crc kubenswrapper[4707]: I1127 16:56:29.029756 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6b244f80-b143-4626-a0ac-9cfe02a60f7f-public-tls-certs\") pod \"aodh-0\" (UID: \"6b244f80-b143-4626-a0ac-9cfe02a60f7f\") " pod="openstack/aodh-0"
Nov 27 16:56:29 crc kubenswrapper[4707]: I1127 16:56:29.039493 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sj8ms\" (UniqueName: \"kubernetes.io/projected/6b244f80-b143-4626-a0ac-9cfe02a60f7f-kube-api-access-sj8ms\") pod \"aodh-0\" (UID: \"6b244f80-b143-4626-a0ac-9cfe02a60f7f\") " pod="openstack/aodh-0"
Nov 27 16:56:29 crc kubenswrapper[4707]: I1127 16:56:29.076258 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-0"
Nov 27 16:56:29 crc kubenswrapper[4707]: I1127 16:56:29.206801 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3794dd05-6c5a-43bc-8a93-b8b841b05bb2" path="/var/lib/kubelet/pods/3794dd05-6c5a-43bc-8a93-b8b841b05bb2/volumes"
Nov 27 16:56:29 crc kubenswrapper[4707]: I1127 16:56:29.633890 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-0"]
Nov 27 16:56:29 crc kubenswrapper[4707]: W1127 16:56:29.649042 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6b244f80_b143_4626_a0ac_9cfe02a60f7f.slice/crio-f405e16e80572f93a889892a0182b0ba6eef4fb58c8b4d7282ed27f5e806fd1b WatchSource:0}: Error finding container f405e16e80572f93a889892a0182b0ba6eef4fb58c8b4d7282ed27f5e806fd1b: Status 404 returned error can't find the container with id f405e16e80572f93a889892a0182b0ba6eef4fb58c8b4d7282ed27f5e806fd1b
Nov 27 16:56:29 crc kubenswrapper[4707]: I1127 16:56:29.679901 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"6b244f80-b143-4626-a0ac-9cfe02a60f7f","Type":"ContainerStarted","Data":"f405e16e80572f93a889892a0182b0ba6eef4fb58c8b4d7282ed27f5e806fd1b"}
Nov 27 16:56:31 crc kubenswrapper[4707]: I1127 16:56:31.197350 4707 scope.go:117] "RemoveContainer" containerID="922f6f006f59568ee3b2de9c7061fa21de9cccddc36312f2d9ef753b4834d084"
Nov 27 16:56:31 crc kubenswrapper[4707]: E1127 16:56:31.198065 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-c995m_openshift-machine-config-operator(a83beb0d-8dd1-434a-ace2-933f98e3956f)\"" pod="openshift-machine-config-operator/machine-config-daemon-c995m" podUID="a83beb0d-8dd1-434a-ace2-933f98e3956f"
Nov 27 16:56:32 crc kubenswrapper[4707]: I1127 16:56:32.714954 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"6b244f80-b143-4626-a0ac-9cfe02a60f7f","Type":"ContainerStarted","Data":"19161a77a990c54e1429cdabebed9be773526d73c961348c5123108a9f5c4e64"}
Nov 27 16:56:32 crc kubenswrapper[4707]: I1127 16:56:32.717797 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/alertmanager-metric-storage-0" event={"ID":"b6ba41a2-cc99-4242-be96-b249bc657b2f","Type":"ContainerStarted","Data":"6ecbc8c0a7d21aa46cf0c64746163a8e00dcb8dc37f0fb376149239f973b2b66"}
Nov 27 16:56:34 crc kubenswrapper[4707]: I1127 16:56:34.752199 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"6b244f80-b143-4626-a0ac-9cfe02a60f7f","Type":"ContainerStarted","Data":"912810a4fdbdf459530f3d976da5d91520e9c52fc782fc8fa906be72442c9651"}
Nov 27 16:56:34 crc kubenswrapper[4707]: I1127 16:56:34.756158 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"b4ac6b43-b646-4f4d-b04c-e911806728a8","Type":"ContainerStarted","Data":"edde3022a865a8d4105b2b05c9342e801760cf5c52115db312fbaa3a09f06a02"}
Nov 27 16:56:35 crc kubenswrapper[4707]: I1127 16:56:35.787633 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"6b244f80-b143-4626-a0ac-9cfe02a60f7f","Type":"ContainerStarted","Data":"8d9f0333933b3a3451a35a04f25758a127c22861a46d33944526705f9eb88c53"}
Nov 27 16:56:35 crc kubenswrapper[4707]: I1127 16:56:35.791840 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/alertmanager-metric-storage-0" event={"ID":"b6ba41a2-cc99-4242-be96-b249bc657b2f","Type":"ContainerStarted","Data":"25ff2979e77c3bb952ad370bcddd814c88bcd17b7f4a3b36c7e5604266e835f4"}
Nov 27 16:56:35 crc kubenswrapper[4707]: I1127 16:56:35.796216 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/alertmanager-metric-storage-0"
Nov 27 16:56:35 crc kubenswrapper[4707]: I1127 16:56:35.799177 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/alertmanager-metric-storage-0"
Nov 27 16:56:35 crc kubenswrapper[4707]: I1127 16:56:35.835554 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/alertmanager-metric-storage-0" podStartSLOduration=5.309467892 podStartE2EDuration="23.835534017s" podCreationTimestamp="2025-11-27 16:56:12 +0000 UTC" firstStartedPulling="2025-11-27 16:56:12.866815834 +0000 UTC m=+3148.498264622" lastFinishedPulling="2025-11-27 16:56:31.392881969 +0000 UTC m=+3167.024330747" observedRunningTime="2025-11-27 16:56:35.834267416 +0000 UTC m=+3171.465716194" watchObservedRunningTime="2025-11-27 16:56:35.835534017 +0000 UTC m=+3171.466982825"
Nov 27 16:56:36 crc kubenswrapper[4707]: I1127 16:56:36.802346 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"6b244f80-b143-4626-a0ac-9cfe02a60f7f","Type":"ContainerStarted","Data":"e6c08fd6301952622e3d04cac64e0d57da7e473e25a4d74a46cba8231bd69828"}
Nov 27 16:56:36 crc kubenswrapper[4707]: I1127 16:56:36.826087 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/aodh-0" podStartSLOduration=2.341541946 podStartE2EDuration="8.82606981s" podCreationTimestamp="2025-11-27 16:56:28 +0000 UTC" firstStartedPulling="2025-11-27 16:56:29.652226962 +0000 UTC m=+3165.283675730" lastFinishedPulling="2025-11-27 16:56:36.136754826 +0000 UTC m=+3171.768203594" observedRunningTime="2025-11-27 16:56:36.823744492 +0000 UTC m=+3172.455193270" watchObservedRunningTime="2025-11-27 16:56:36.82606981 +0000 UTC m=+3172.457518578"
Nov 27 16:56:37 crc kubenswrapper[4707]: I1127 16:56:37.817991 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"b4ac6b43-b646-4f4d-b04c-e911806728a8","Type":"ContainerStarted","Data":"3cf075c50b5b17c0e7cd5e4b018e54ab2f1c85f164292e02b46902aaf179ab2a"}
Nov 27 16:56:43 crc kubenswrapper[4707]: I1127 16:56:43.900078 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"b4ac6b43-b646-4f4d-b04c-e911806728a8","Type":"ContainerStarted","Data":"1e446e1af922e8ecdca055718cd94df75eea032305585706247432e3ba5ecb3b"}
Nov 27 16:56:43 crc kubenswrapper[4707]: I1127 16:56:43.937330 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/prometheus-metric-storage-0" podStartSLOduration=3.088917378 podStartE2EDuration="31.937310573s" podCreationTimestamp="2025-11-27 16:56:12 +0000 UTC" firstStartedPulling="2025-11-27 16:56:14.331679419 +0000 UTC m=+3149.963128187" lastFinishedPulling="2025-11-27 16:56:43.180072574 +0000 UTC m=+3178.811521382" observedRunningTime="2025-11-27 16:56:43.931943841 +0000 UTC m=+3179.563392629" watchObservedRunningTime="2025-11-27 16:56:43.937310573 +0000 UTC m=+3179.568759341"
Nov 27 16:56:44 crc kubenswrapper[4707]: I1127 16:56:44.198868 4707 scope.go:117] "RemoveContainer" containerID="922f6f006f59568ee3b2de9c7061fa21de9cccddc36312f2d9ef753b4834d084"
Nov 27 16:56:44 crc kubenswrapper[4707]: E1127 16:56:44.199269 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-c995m_openshift-machine-config-operator(a83beb0d-8dd1-434a-ace2-933f98e3956f)\"" pod="openshift-machine-config-operator/machine-config-daemon-c995m" podUID="a83beb0d-8dd1-434a-ace2-933f98e3956f"
Nov 27 16:56:48 crc kubenswrapper[4707]: I1127 16:56:48.779246 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/prometheus-metric-storage-0"
Nov 27 16:56:55 crc kubenswrapper[4707]: I1127 16:56:55.210271 4707 scope.go:117] "RemoveContainer" containerID="922f6f006f59568ee3b2de9c7061fa21de9cccddc36312f2d9ef753b4834d084"
Nov 27 16:56:55 crc kubenswrapper[4707]: E1127 16:56:55.211549 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-c995m_openshift-machine-config-operator(a83beb0d-8dd1-434a-ace2-933f98e3956f)\"" pod="openshift-machine-config-operator/machine-config-daemon-c995m" podUID="a83beb0d-8dd1-434a-ace2-933f98e3956f"
Nov 27 16:56:58 crc kubenswrapper[4707]: I1127 16:56:58.780214 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/prometheus-metric-storage-0"
Nov 27 16:56:58 crc kubenswrapper[4707]: I1127 16:56:58.782481 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/prometheus-metric-storage-0"
Nov 27 16:56:59 crc kubenswrapper[4707]: I1127 16:56:59.082691 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/prometheus-metric-storage-0"
Nov 27 16:57:00 crc kubenswrapper[4707]: I1127 16:57:00.548800 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstackclient"]
Nov 27 16:57:00 crc kubenswrapper[4707]: I1127 16:57:00.549406 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/openstackclient" podUID="f46a6b5b-2709-4fe9-8db4-cb2df241d728" containerName="openstackclient" containerID="cri-o://a3f01a0e3d9f66af16abea35fc61876f181ebf7ed29257bc884b9eb8280bf954" gracePeriod=2
Nov 27 16:57:00 crc kubenswrapper[4707]: I1127 16:57:00.565053 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/openstackclient"]
Nov 27 16:57:00 crc kubenswrapper[4707]: I1127 16:57:00.592205 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"]
Nov 27 16:57:00 crc kubenswrapper[4707]: E1127 16:57:00.592767 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f46a6b5b-2709-4fe9-8db4-cb2df241d728" containerName="openstackclient"
Nov 27 16:57:00 crc kubenswrapper[4707]: I1127 16:57:00.592786 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="f46a6b5b-2709-4fe9-8db4-cb2df241d728" containerName="openstackclient"
Nov 27 16:57:00 crc kubenswrapper[4707]: I1127 16:57:00.593012 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="f46a6b5b-2709-4fe9-8db4-cb2df241d728" containerName="openstackclient"
Nov 27 16:57:00 crc kubenswrapper[4707]: I1127 16:57:00.593764 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient"
Nov 27 16:57:00 crc kubenswrapper[4707]: I1127 16:57:00.618708 4707 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="f46a6b5b-2709-4fe9-8db4-cb2df241d728" podUID="247a20f8-a665-4383-944a-6fe111045aa1"
Nov 27 16:57:00 crc kubenswrapper[4707]: I1127 16:57:00.620025 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"]
Nov 27 16:57:00 crc kubenswrapper[4707]: I1127 16:57:00.665174 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/247a20f8-a665-4383-944a-6fe111045aa1-openstack-config\") pod \"openstackclient\" (UID: \"247a20f8-a665-4383-944a-6fe111045aa1\") " pod="openstack/openstackclient"
Nov 27 16:57:00 crc kubenswrapper[4707]: I1127 16:57:00.665355 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/247a20f8-a665-4383-944a-6fe111045aa1-openstack-config-secret\") pod \"openstackclient\" (UID: \"247a20f8-a665-4383-944a-6fe111045aa1\") " pod="openstack/openstackclient"
Nov 27 16:57:00 crc kubenswrapper[4707]: I1127 16:57:00.665447 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/247a20f8-a665-4383-944a-6fe111045aa1-combined-ca-bundle\") pod \"openstackclient\" (UID: \"247a20f8-a665-4383-944a-6fe111045aa1\") " pod="openstack/openstackclient"
Nov 27 16:57:00 crc kubenswrapper[4707]: I1127 16:57:00.665659 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t7q5n\" (UniqueName: \"kubernetes.io/projected/247a20f8-a665-4383-944a-6fe111045aa1-kube-api-access-t7q5n\") pod \"openstackclient\" (UID: \"247a20f8-a665-4383-944a-6fe111045aa1\") " pod="openstack/openstackclient"
Nov 27 16:57:00 crc kubenswrapper[4707]: I1127 16:57:00.767892 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/247a20f8-a665-4383-944a-6fe111045aa1-openstack-config-secret\") pod \"openstackclient\" (UID: \"247a20f8-a665-4383-944a-6fe111045aa1\") " pod="openstack/openstackclient"
Nov 27 16:57:00 crc kubenswrapper[4707]: I1127 16:57:00.768217 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/247a20f8-a665-4383-944a-6fe111045aa1-combined-ca-bundle\") pod \"openstackclient\" (UID: \"247a20f8-a665-4383-944a-6fe111045aa1\") " pod="openstack/openstackclient"
Nov 27 16:57:00 crc kubenswrapper[4707]: I1127 16:57:00.768448 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t7q5n\" (UniqueName: \"kubernetes.io/projected/247a20f8-a665-4383-944a-6fe111045aa1-kube-api-access-t7q5n\") pod \"openstackclient\" (UID: \"247a20f8-a665-4383-944a-6fe111045aa1\") " pod="openstack/openstackclient"
Nov 27 16:57:00 crc kubenswrapper[4707]: I1127 16:57:00.768608 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/247a20f8-a665-4383-944a-6fe111045aa1-openstack-config\") pod \"openstackclient\" (UID: \"247a20f8-a665-4383-944a-6fe111045aa1\") " pod="openstack/openstackclient"
Nov 27 16:57:00 crc kubenswrapper[4707]: I1127 16:57:00.769707 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/247a20f8-a665-4383-944a-6fe111045aa1-openstack-config\") pod \"openstackclient\" (UID: \"247a20f8-a665-4383-944a-6fe111045aa1\") " pod="openstack/openstackclient"
Nov 27 16:57:00 crc kubenswrapper[4707]: I1127 16:57:00.775998 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/247a20f8-a665-4383-944a-6fe111045aa1-combined-ca-bundle\") pod \"openstackclient\" (UID: \"247a20f8-a665-4383-944a-6fe111045aa1\") " pod="openstack/openstackclient"
Nov 27 16:57:00 crc kubenswrapper[4707]: I1127 16:57:00.784240 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/247a20f8-a665-4383-944a-6fe111045aa1-openstack-config-secret\") pod \"openstackclient\" (UID: \"247a20f8-a665-4383-944a-6fe111045aa1\") " pod="openstack/openstackclient"
Nov 27 16:57:00 crc kubenswrapper[4707]: I1127 16:57:00.786575 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t7q5n\" (UniqueName: \"kubernetes.io/projected/247a20f8-a665-4383-944a-6fe111045aa1-kube-api-access-t7q5n\") pod \"openstackclient\" (UID: \"247a20f8-a665-4383-944a-6fe111045aa1\") " pod="openstack/openstackclient"
Nov 27 16:57:00 crc kubenswrapper[4707]: I1127 16:57:00.821592 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodh-0"]
Nov 27 16:57:00 crc kubenswrapper[4707]: I1127 16:57:00.821939 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/aodh-0" podUID="6b244f80-b143-4626-a0ac-9cfe02a60f7f" containerName="aodh-api" containerID="cri-o://19161a77a990c54e1429cdabebed9be773526d73c961348c5123108a9f5c4e64" gracePeriod=30
Nov 27 16:57:00 crc kubenswrapper[4707]: I1127 16:57:00.821998 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/aodh-0" podUID="6b244f80-b143-4626-a0ac-9cfe02a60f7f" containerName="aodh-listener" containerID="cri-o://e6c08fd6301952622e3d04cac64e0d57da7e473e25a4d74a46cba8231bd69828" gracePeriod=30
Nov 27 16:57:00 crc kubenswrapper[4707]: I1127 16:57:00.822070 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/aodh-0" podUID="6b244f80-b143-4626-a0ac-9cfe02a60f7f" containerName="aodh-evaluator" containerID="cri-o://912810a4fdbdf459530f3d976da5d91520e9c52fc782fc8fa906be72442c9651" gracePeriod=30
Nov 27 16:57:00 crc kubenswrapper[4707]: I1127 16:57:00.822124 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/aodh-0" podUID="6b244f80-b143-4626-a0ac-9cfe02a60f7f" containerName="aodh-notifier" containerID="cri-o://8d9f0333933b3a3451a35a04f25758a127c22861a46d33944526705f9eb88c53" gracePeriod=30
Nov 27 16:57:00 crc kubenswrapper[4707]: I1127 16:57:00.924119 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient"
Nov 27 16:57:01 crc kubenswrapper[4707]: I1127 16:57:01.123723 4707 generic.go:334] "Generic (PLEG): container finished" podID="6b244f80-b143-4626-a0ac-9cfe02a60f7f" containerID="19161a77a990c54e1429cdabebed9be773526d73c961348c5123108a9f5c4e64" exitCode=0
Nov 27 16:57:01 crc kubenswrapper[4707]: I1127 16:57:01.123877 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"6b244f80-b143-4626-a0ac-9cfe02a60f7f","Type":"ContainerDied","Data":"19161a77a990c54e1429cdabebed9be773526d73c961348c5123108a9f5c4e64"}
Nov 27 16:57:01 crc kubenswrapper[4707]: I1127 16:57:01.447538 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"]
Nov 27 16:57:01 crc kubenswrapper[4707]: I1127 16:57:01.922064 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/prometheus-metric-storage-0"]
Nov 27 16:57:01 crc kubenswrapper[4707]: I1127 16:57:01.922705 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="b4ac6b43-b646-4f4d-b04c-e911806728a8" containerName="prometheus" containerID="cri-o://edde3022a865a8d4105b2b05c9342e801760cf5c52115db312fbaa3a09f06a02" gracePeriod=600
Nov 27 16:57:01 crc kubenswrapper[4707]: I1127 16:57:01.922795 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="b4ac6b43-b646-4f4d-b04c-e911806728a8" containerName="thanos-sidecar" containerID="cri-o://1e446e1af922e8ecdca055718cd94df75eea032305585706247432e3ba5ecb3b" gracePeriod=600
Nov 27 16:57:01 crc kubenswrapper[4707]: I1127 16:57:01.922929 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="b4ac6b43-b646-4f4d-b04c-e911806728a8" containerName="config-reloader" containerID="cri-o://3cf075c50b5b17c0e7cd5e4b018e54ab2f1c85f164292e02b46902aaf179ab2a" gracePeriod=600
Nov 27 16:57:02 crc kubenswrapper[4707]: I1127 16:57:02.136053 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"247a20f8-a665-4383-944a-6fe111045aa1","Type":"ContainerStarted","Data":"32d8a52967861a7ed2bf3aed8cf7afed7f9bd8989612aa5c062d0493624047f8"}
Nov 27 16:57:02 crc kubenswrapper[4707]: I1127 16:57:02.136103 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"247a20f8-a665-4383-944a-6fe111045aa1","Type":"ContainerStarted","Data":"a39466a3659806cb942128ba4c42da8861e833b24bad9eebcb68492c70bce38d"}
Nov 27 16:57:02 crc kubenswrapper[4707]: I1127 16:57:02.143179 4707 generic.go:334] "Generic (PLEG): container finished" podID="6b244f80-b143-4626-a0ac-9cfe02a60f7f" containerID="912810a4fdbdf459530f3d976da5d91520e9c52fc782fc8fa906be72442c9651" exitCode=0
Nov 27 16:57:02 crc kubenswrapper[4707]: I1127 16:57:02.143280 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"6b244f80-b143-4626-a0ac-9cfe02a60f7f","Type":"ContainerDied","Data":"912810a4fdbdf459530f3d976da5d91520e9c52fc782fc8fa906be72442c9651"}
Nov 27 16:57:02 crc kubenswrapper[4707]: I1127 16:57:02.147675 4707 generic.go:334] "Generic (PLEG): container finished" podID="b4ac6b43-b646-4f4d-b04c-e911806728a8" containerID="1e446e1af922e8ecdca055718cd94df75eea032305585706247432e3ba5ecb3b" exitCode=0
Nov 27 16:57:02 crc kubenswrapper[4707]: I1127 16:57:02.147704 4707 generic.go:334] "Generic (PLEG): container finished" podID="b4ac6b43-b646-4f4d-b04c-e911806728a8" containerID="edde3022a865a8d4105b2b05c9342e801760cf5c52115db312fbaa3a09f06a02" exitCode=0
Nov 27 16:57:02 crc kubenswrapper[4707]: I1127 16:57:02.147727 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"b4ac6b43-b646-4f4d-b04c-e911806728a8","Type":"ContainerDied","Data":"1e446e1af922e8ecdca055718cd94df75eea032305585706247432e3ba5ecb3b"}
Nov 27 16:57:02 crc kubenswrapper[4707]: I1127 16:57:02.147752 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"b4ac6b43-b646-4f4d-b04c-e911806728a8","Type":"ContainerDied","Data":"edde3022a865a8d4105b2b05c9342e801760cf5c52115db312fbaa3a09f06a02"}
Nov 27 16:57:02 crc kubenswrapper[4707]: I1127 16:57:02.159050 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=2.15902614 podStartE2EDuration="2.15902614s" podCreationTimestamp="2025-11-27 16:57:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-27 16:57:02.153184786 +0000 UTC m=+3197.784633554" watchObservedRunningTime="2025-11-27 16:57:02.15902614 +0000 UTC m=+3197.790474908"
Nov 27 16:57:02 crc kubenswrapper[4707]: I1127 16:57:02.981148 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient"
Nov 27 16:57:02 crc kubenswrapper[4707]: I1127 16:57:02.989765 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0"
Nov 27 16:57:03 crc kubenswrapper[4707]: I1127 16:57:03.122585 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/b4ac6b43-b646-4f4d-b04c-e911806728a8-thanos-prometheus-http-client-file\") pod \"b4ac6b43-b646-4f4d-b04c-e911806728a8\" (UID: \"b4ac6b43-b646-4f4d-b04c-e911806728a8\") "
Nov 27 16:57:03 crc kubenswrapper[4707]: I1127 16:57:03.122893 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-db\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-2e34e9c0-2aa8-41c4-8b3b-646a52e4cb09\") pod \"b4ac6b43-b646-4f4d-b04c-e911806728a8\" (UID: \"b4ac6b43-b646-4f4d-b04c-e911806728a8\") "
Nov 27 16:57:03 crc kubenswrapper[4707]: I1127 16:57:03.122952 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/b4ac6b43-b646-4f4d-b04c-e911806728a8-web-config\") pod \"b4ac6b43-b646-4f4d-b04c-e911806728a8\" (UID: \"b4ac6b43-b646-4f4d-b04c-e911806728a8\") "
Nov 27 16:57:03 crc kubenswrapper[4707]: I1127 16:57:03.123017 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f46a6b5b-2709-4fe9-8db4-cb2df241d728-combined-ca-bundle\") pod \"f46a6b5b-2709-4fe9-8db4-cb2df241d728\" (UID: \"f46a6b5b-2709-4fe9-8db4-cb2df241d728\") "
Nov 27 16:57:03 crc kubenswrapper[4707]: I1127 16:57:03.123046 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/f46a6b5b-2709-4fe9-8db4-cb2df241d728-openstack-config\") pod \"f46a6b5b-2709-4fe9-8db4-cb2df241d728\" (UID: \"f46a6b5b-2709-4fe9-8db4-cb2df241d728\") "
Nov 27 16:57:03 crc kubenswrapper[4707]: I1127 16:57:03.123124 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/b4ac6b43-b646-4f4d-b04c-e911806728a8-config\") pod \"b4ac6b43-b646-4f4d-b04c-e911806728a8\" (UID: \"b4ac6b43-b646-4f4d-b04c-e911806728a8\") "
Nov 27 16:57:03 crc kubenswrapper[4707]: I1127 16:57:03.123168 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/f46a6b5b-2709-4fe9-8db4-cb2df241d728-openstack-config-secret\") pod \"f46a6b5b-2709-4fe9-8db4-cb2df241d728\" (UID: \"f46a6b5b-2709-4fe9-8db4-cb2df241d728\") "
Nov 27 16:57:03 crc kubenswrapper[4707]: I1127 16:57:03.123230 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/b4ac6b43-b646-4f4d-b04c-e911806728a8-prometheus-metric-storage-rulefiles-0\") pod \"b4ac6b43-b646-4f4d-b04c-e911806728a8\" (UID: \"b4ac6b43-b646-4f4d-b04c-e911806728a8\") "
Nov 27 16:57:03 crc kubenswrapper[4707]: I1127 16:57:03.123280 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/b4ac6b43-b646-4f4d-b04c-e911806728a8-config-out\") pod \"b4ac6b43-b646-4f4d-b04c-e911806728a8\" (UID: \"b4ac6b43-b646-4f4d-b04c-e911806728a8\") "
Nov 27 16:57:03 crc kubenswrapper[4707]: I1127 16:57:03.123307 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s2c7r\" (UniqueName: \"kubernetes.io/projected/f46a6b5b-2709-4fe9-8db4-cb2df241d728-kube-api-access-s2c7r\") pod \"f46a6b5b-2709-4fe9-8db4-cb2df241d728\" (UID: \"f46a6b5b-2709-4fe9-8db4-cb2df241d728\") "
Nov 27 16:57:03 crc kubenswrapper[4707]: I1127 16:57:03.123343 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s7pf8\" (UniqueName: \"kubernetes.io/projected/b4ac6b43-b646-4f4d-b04c-e911806728a8-kube-api-access-s7pf8\") pod \"b4ac6b43-b646-4f4d-b04c-e911806728a8\" (UID: \"b4ac6b43-b646-4f4d-b04c-e911806728a8\") "
Nov 27 16:57:03 crc kubenswrapper[4707]: I1127 16:57:03.123442 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/b4ac6b43-b646-4f4d-b04c-e911806728a8-tls-assets\") pod \"b4ac6b43-b646-4f4d-b04c-e911806728a8\" (UID: \"b4ac6b43-b646-4f4d-b04c-e911806728a8\") "
Nov 27 16:57:03 crc kubenswrapper[4707]: I1127 16:57:03.129980 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f46a6b5b-2709-4fe9-8db4-cb2df241d728-kube-api-access-s2c7r" (OuterVolumeSpecName: "kube-api-access-s2c7r") pod "f46a6b5b-2709-4fe9-8db4-cb2df241d728" (UID: "f46a6b5b-2709-4fe9-8db4-cb2df241d728"). InnerVolumeSpecName "kube-api-access-s2c7r". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 27 16:57:03 crc kubenswrapper[4707]: I1127 16:57:03.130201 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b4ac6b43-b646-4f4d-b04c-e911806728a8-thanos-prometheus-http-client-file" (OuterVolumeSpecName: "thanos-prometheus-http-client-file") pod "b4ac6b43-b646-4f4d-b04c-e911806728a8" (UID: "b4ac6b43-b646-4f4d-b04c-e911806728a8"). InnerVolumeSpecName "thanos-prometheus-http-client-file". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 27 16:57:03 crc kubenswrapper[4707]: I1127 16:57:03.134148 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b4ac6b43-b646-4f4d-b04c-e911806728a8-config-out" (OuterVolumeSpecName: "config-out") pod "b4ac6b43-b646-4f4d-b04c-e911806728a8" (UID: "b4ac6b43-b646-4f4d-b04c-e911806728a8"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Nov 27 16:57:03 crc kubenswrapper[4707]: I1127 16:57:03.134356 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b4ac6b43-b646-4f4d-b04c-e911806728a8-prometheus-metric-storage-rulefiles-0" (OuterVolumeSpecName: "prometheus-metric-storage-rulefiles-0") pod "b4ac6b43-b646-4f4d-b04c-e911806728a8" (UID: "b4ac6b43-b646-4f4d-b04c-e911806728a8"). InnerVolumeSpecName "prometheus-metric-storage-rulefiles-0". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 27 16:57:03 crc kubenswrapper[4707]: I1127 16:57:03.140352 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b4ac6b43-b646-4f4d-b04c-e911806728a8-kube-api-access-s7pf8" (OuterVolumeSpecName: "kube-api-access-s7pf8") pod "b4ac6b43-b646-4f4d-b04c-e911806728a8" (UID: "b4ac6b43-b646-4f4d-b04c-e911806728a8"). InnerVolumeSpecName "kube-api-access-s7pf8". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 27 16:57:03 crc kubenswrapper[4707]: I1127 16:57:03.141772 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b4ac6b43-b646-4f4d-b04c-e911806728a8-config" (OuterVolumeSpecName: "config") pod "b4ac6b43-b646-4f4d-b04c-e911806728a8" (UID: "b4ac6b43-b646-4f4d-b04c-e911806728a8"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 27 16:57:03 crc kubenswrapper[4707]: I1127 16:57:03.155214 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-2e34e9c0-2aa8-41c4-8b3b-646a52e4cb09" (OuterVolumeSpecName: "prometheus-metric-storage-db") pod "b4ac6b43-b646-4f4d-b04c-e911806728a8" (UID: "b4ac6b43-b646-4f4d-b04c-e911806728a8"). InnerVolumeSpecName "pvc-2e34e9c0-2aa8-41c4-8b3b-646a52e4cb09". PluginName "kubernetes.io/csi", VolumeGidValue ""
Nov 27 16:57:03 crc kubenswrapper[4707]: I1127 16:57:03.157767 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f46a6b5b-2709-4fe9-8db4-cb2df241d728-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f46a6b5b-2709-4fe9-8db4-cb2df241d728" (UID: "f46a6b5b-2709-4fe9-8db4-cb2df241d728"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 27 16:57:03 crc kubenswrapper[4707]: I1127 16:57:03.158555 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b4ac6b43-b646-4f4d-b04c-e911806728a8-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "b4ac6b43-b646-4f4d-b04c-e911806728a8" (UID: "b4ac6b43-b646-4f4d-b04c-e911806728a8"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 27 16:57:03 crc kubenswrapper[4707]: I1127 16:57:03.161529 4707 generic.go:334] "Generic (PLEG): container finished" podID="6b244f80-b143-4626-a0ac-9cfe02a60f7f" containerID="e6c08fd6301952622e3d04cac64e0d57da7e473e25a4d74a46cba8231bd69828" exitCode=0
Nov 27 16:57:03 crc kubenswrapper[4707]: I1127 16:57:03.161745 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"6b244f80-b143-4626-a0ac-9cfe02a60f7f","Type":"ContainerDied","Data":"e6c08fd6301952622e3d04cac64e0d57da7e473e25a4d74a46cba8231bd69828"}
Nov 27 16:57:03 crc kubenswrapper[4707]: I1127 16:57:03.165635 4707 generic.go:334] "Generic (PLEG): container finished" podID="f46a6b5b-2709-4fe9-8db4-cb2df241d728" containerID="a3f01a0e3d9f66af16abea35fc61876f181ebf7ed29257bc884b9eb8280bf954" exitCode=137
Nov 27 16:57:03 crc kubenswrapper[4707]: I1127 16:57:03.165713 4707 scope.go:117] "RemoveContainer" containerID="a3f01a0e3d9f66af16abea35fc61876f181ebf7ed29257bc884b9eb8280bf954"
Nov 27 16:57:03 crc kubenswrapper[4707]: I1127 16:57:03.165715 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient"
Nov 27 16:57:03 crc kubenswrapper[4707]: I1127 16:57:03.169899 4707 generic.go:334] "Generic (PLEG): container finished" podID="b4ac6b43-b646-4f4d-b04c-e911806728a8" containerID="3cf075c50b5b17c0e7cd5e4b018e54ab2f1c85f164292e02b46902aaf179ab2a" exitCode=0
Nov 27 16:57:03 crc kubenswrapper[4707]: I1127 16:57:03.171786 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0"
Nov 27 16:57:03 crc kubenswrapper[4707]: I1127 16:57:03.173161 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"b4ac6b43-b646-4f4d-b04c-e911806728a8","Type":"ContainerDied","Data":"3cf075c50b5b17c0e7cd5e4b018e54ab2f1c85f164292e02b46902aaf179ab2a"}
Nov 27 16:57:03 crc kubenswrapper[4707]: I1127 16:57:03.173281 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"b4ac6b43-b646-4f4d-b04c-e911806728a8","Type":"ContainerDied","Data":"cb475767a9bfddb0bd8a767dc77ea48652bfbc1439614a4040d1c80cf3d46f57"}
Nov 27 16:57:03 crc kubenswrapper[4707]: I1127 16:57:03.173717 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f46a6b5b-2709-4fe9-8db4-cb2df241d728-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "f46a6b5b-2709-4fe9-8db4-cb2df241d728" (UID: "f46a6b5b-2709-4fe9-8db4-cb2df241d728"). InnerVolumeSpecName "openstack-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 27 16:57:03 crc kubenswrapper[4707]: I1127 16:57:03.190491 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b4ac6b43-b646-4f4d-b04c-e911806728a8-web-config" (OuterVolumeSpecName: "web-config") pod "b4ac6b43-b646-4f4d-b04c-e911806728a8" (UID: "b4ac6b43-b646-4f4d-b04c-e911806728a8"). InnerVolumeSpecName "web-config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 27 16:57:03 crc kubenswrapper[4707]: I1127 16:57:03.208563 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f46a6b5b-2709-4fe9-8db4-cb2df241d728-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "f46a6b5b-2709-4fe9-8db4-cb2df241d728" (UID: "f46a6b5b-2709-4fe9-8db4-cb2df241d728"). InnerVolumeSpecName "openstack-config-secret". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 27 16:57:03 crc kubenswrapper[4707]: I1127 16:57:03.226576 4707 reconciler_common.go:293] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/b4ac6b43-b646-4f4d-b04c-e911806728a8-config-out\") on node \"crc\" DevicePath \"\""
Nov 27 16:57:03 crc kubenswrapper[4707]: I1127 16:57:03.227096 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s2c7r\" (UniqueName: \"kubernetes.io/projected/f46a6b5b-2709-4fe9-8db4-cb2df241d728-kube-api-access-s2c7r\") on node \"crc\" DevicePath \"\""
Nov 27 16:57:03 crc kubenswrapper[4707]: I1127 16:57:03.227206 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s7pf8\" (UniqueName: \"kubernetes.io/projected/b4ac6b43-b646-4f4d-b04c-e911806728a8-kube-api-access-s7pf8\") on node \"crc\" DevicePath \"\""
Nov 27 16:57:03 crc kubenswrapper[4707]: I1127 16:57:03.227274 4707 reconciler_common.go:293] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/b4ac6b43-b646-4f4d-b04c-e911806728a8-tls-assets\") on node \"crc\" DevicePath \"\""
Nov 27 16:57:03 crc kubenswrapper[4707]: I1127 16:57:03.227339 4707 reconciler_common.go:293] "Volume detached for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/b4ac6b43-b646-4f4d-b04c-e911806728a8-thanos-prometheus-http-client-file\") on node \"crc\" DevicePath \"\""
Nov 27 16:57:03 crc kubenswrapper[4707]: I1127 16:57:03.227444 4707 reconciler_common.go:286]
"operationExecutor.UnmountDevice started for volume \"pvc-2e34e9c0-2aa8-41c4-8b3b-646a52e4cb09\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-2e34e9c0-2aa8-41c4-8b3b-646a52e4cb09\") on node \"crc\" " Nov 27 16:57:03 crc kubenswrapper[4707]: I1127 16:57:03.227515 4707 reconciler_common.go:293] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/b4ac6b43-b646-4f4d-b04c-e911806728a8-web-config\") on node \"crc\" DevicePath \"\"" Nov 27 16:57:03 crc kubenswrapper[4707]: I1127 16:57:03.227596 4707 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/f46a6b5b-2709-4fe9-8db4-cb2df241d728-openstack-config\") on node \"crc\" DevicePath \"\"" Nov 27 16:57:03 crc kubenswrapper[4707]: I1127 16:57:03.227671 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f46a6b5b-2709-4fe9-8db4-cb2df241d728-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 27 16:57:03 crc kubenswrapper[4707]: I1127 16:57:03.227793 4707 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/b4ac6b43-b646-4f4d-b04c-e911806728a8-config\") on node \"crc\" DevicePath \"\"" Nov 27 16:57:03 crc kubenswrapper[4707]: I1127 16:57:03.227871 4707 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/f46a6b5b-2709-4fe9-8db4-cb2df241d728-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Nov 27 16:57:03 crc kubenswrapper[4707]: I1127 16:57:03.228237 4707 reconciler_common.go:293] "Volume detached for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/b4ac6b43-b646-4f4d-b04c-e911806728a8-prometheus-metric-storage-rulefiles-0\") on node \"crc\" DevicePath \"\"" Nov 27 16:57:03 crc kubenswrapper[4707]: I1127 16:57:03.257553 4707 scope.go:117] "RemoveContainer" 
containerID="a3f01a0e3d9f66af16abea35fc61876f181ebf7ed29257bc884b9eb8280bf954" Nov 27 16:57:03 crc kubenswrapper[4707]: E1127 16:57:03.258622 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a3f01a0e3d9f66af16abea35fc61876f181ebf7ed29257bc884b9eb8280bf954\": container with ID starting with a3f01a0e3d9f66af16abea35fc61876f181ebf7ed29257bc884b9eb8280bf954 not found: ID does not exist" containerID="a3f01a0e3d9f66af16abea35fc61876f181ebf7ed29257bc884b9eb8280bf954" Nov 27 16:57:03 crc kubenswrapper[4707]: I1127 16:57:03.258665 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a3f01a0e3d9f66af16abea35fc61876f181ebf7ed29257bc884b9eb8280bf954"} err="failed to get container status \"a3f01a0e3d9f66af16abea35fc61876f181ebf7ed29257bc884b9eb8280bf954\": rpc error: code = NotFound desc = could not find container \"a3f01a0e3d9f66af16abea35fc61876f181ebf7ed29257bc884b9eb8280bf954\": container with ID starting with a3f01a0e3d9f66af16abea35fc61876f181ebf7ed29257bc884b9eb8280bf954 not found: ID does not exist" Nov 27 16:57:03 crc kubenswrapper[4707]: I1127 16:57:03.258692 4707 scope.go:117] "RemoveContainer" containerID="1e446e1af922e8ecdca055718cd94df75eea032305585706247432e3ba5ecb3b" Nov 27 16:57:03 crc kubenswrapper[4707]: I1127 16:57:03.259747 4707 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... 
Nov 27 16:57:03 crc kubenswrapper[4707]: I1127 16:57:03.259925 4707 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-2e34e9c0-2aa8-41c4-8b3b-646a52e4cb09" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-2e34e9c0-2aa8-41c4-8b3b-646a52e4cb09") on node "crc" Nov 27 16:57:03 crc kubenswrapper[4707]: I1127 16:57:03.283088 4707 scope.go:117] "RemoveContainer" containerID="3cf075c50b5b17c0e7cd5e4b018e54ab2f1c85f164292e02b46902aaf179ab2a" Nov 27 16:57:03 crc kubenswrapper[4707]: I1127 16:57:03.317613 4707 scope.go:117] "RemoveContainer" containerID="edde3022a865a8d4105b2b05c9342e801760cf5c52115db312fbaa3a09f06a02" Nov 27 16:57:03 crc kubenswrapper[4707]: I1127 16:57:03.329804 4707 reconciler_common.go:293] "Volume detached for volume \"pvc-2e34e9c0-2aa8-41c4-8b3b-646a52e4cb09\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-2e34e9c0-2aa8-41c4-8b3b-646a52e4cb09\") on node \"crc\" DevicePath \"\"" Nov 27 16:57:03 crc kubenswrapper[4707]: I1127 16:57:03.345825 4707 scope.go:117] "RemoveContainer" containerID="a1afff1d55478d3ed92746e8f3ccbccfdab25f9adb3d5128b738e7a70e60b55d" Nov 27 16:57:03 crc kubenswrapper[4707]: I1127 16:57:03.368745 4707 scope.go:117] "RemoveContainer" containerID="1e446e1af922e8ecdca055718cd94df75eea032305585706247432e3ba5ecb3b" Nov 27 16:57:03 crc kubenswrapper[4707]: E1127 16:57:03.369272 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1e446e1af922e8ecdca055718cd94df75eea032305585706247432e3ba5ecb3b\": container with ID starting with 1e446e1af922e8ecdca055718cd94df75eea032305585706247432e3ba5ecb3b not found: ID does not exist" containerID="1e446e1af922e8ecdca055718cd94df75eea032305585706247432e3ba5ecb3b" Nov 27 16:57:03 crc kubenswrapper[4707]: I1127 16:57:03.369313 4707 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"1e446e1af922e8ecdca055718cd94df75eea032305585706247432e3ba5ecb3b"} err="failed to get container status \"1e446e1af922e8ecdca055718cd94df75eea032305585706247432e3ba5ecb3b\": rpc error: code = NotFound desc = could not find container \"1e446e1af922e8ecdca055718cd94df75eea032305585706247432e3ba5ecb3b\": container with ID starting with 1e446e1af922e8ecdca055718cd94df75eea032305585706247432e3ba5ecb3b not found: ID does not exist" Nov 27 16:57:03 crc kubenswrapper[4707]: I1127 16:57:03.369343 4707 scope.go:117] "RemoveContainer" containerID="3cf075c50b5b17c0e7cd5e4b018e54ab2f1c85f164292e02b46902aaf179ab2a" Nov 27 16:57:03 crc kubenswrapper[4707]: E1127 16:57:03.369587 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3cf075c50b5b17c0e7cd5e4b018e54ab2f1c85f164292e02b46902aaf179ab2a\": container with ID starting with 3cf075c50b5b17c0e7cd5e4b018e54ab2f1c85f164292e02b46902aaf179ab2a not found: ID does not exist" containerID="3cf075c50b5b17c0e7cd5e4b018e54ab2f1c85f164292e02b46902aaf179ab2a" Nov 27 16:57:03 crc kubenswrapper[4707]: I1127 16:57:03.369636 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3cf075c50b5b17c0e7cd5e4b018e54ab2f1c85f164292e02b46902aaf179ab2a"} err="failed to get container status \"3cf075c50b5b17c0e7cd5e4b018e54ab2f1c85f164292e02b46902aaf179ab2a\": rpc error: code = NotFound desc = could not find container \"3cf075c50b5b17c0e7cd5e4b018e54ab2f1c85f164292e02b46902aaf179ab2a\": container with ID starting with 3cf075c50b5b17c0e7cd5e4b018e54ab2f1c85f164292e02b46902aaf179ab2a not found: ID does not exist" Nov 27 16:57:03 crc kubenswrapper[4707]: I1127 16:57:03.369655 4707 scope.go:117] "RemoveContainer" containerID="edde3022a865a8d4105b2b05c9342e801760cf5c52115db312fbaa3a09f06a02" Nov 27 16:57:03 crc kubenswrapper[4707]: E1127 16:57:03.370157 4707 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"edde3022a865a8d4105b2b05c9342e801760cf5c52115db312fbaa3a09f06a02\": container with ID starting with edde3022a865a8d4105b2b05c9342e801760cf5c52115db312fbaa3a09f06a02 not found: ID does not exist" containerID="edde3022a865a8d4105b2b05c9342e801760cf5c52115db312fbaa3a09f06a02" Nov 27 16:57:03 crc kubenswrapper[4707]: I1127 16:57:03.370189 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"edde3022a865a8d4105b2b05c9342e801760cf5c52115db312fbaa3a09f06a02"} err="failed to get container status \"edde3022a865a8d4105b2b05c9342e801760cf5c52115db312fbaa3a09f06a02\": rpc error: code = NotFound desc = could not find container \"edde3022a865a8d4105b2b05c9342e801760cf5c52115db312fbaa3a09f06a02\": container with ID starting with edde3022a865a8d4105b2b05c9342e801760cf5c52115db312fbaa3a09f06a02 not found: ID does not exist" Nov 27 16:57:03 crc kubenswrapper[4707]: I1127 16:57:03.370206 4707 scope.go:117] "RemoveContainer" containerID="a1afff1d55478d3ed92746e8f3ccbccfdab25f9adb3d5128b738e7a70e60b55d" Nov 27 16:57:03 crc kubenswrapper[4707]: E1127 16:57:03.370452 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a1afff1d55478d3ed92746e8f3ccbccfdab25f9adb3d5128b738e7a70e60b55d\": container with ID starting with a1afff1d55478d3ed92746e8f3ccbccfdab25f9adb3d5128b738e7a70e60b55d not found: ID does not exist" containerID="a1afff1d55478d3ed92746e8f3ccbccfdab25f9adb3d5128b738e7a70e60b55d" Nov 27 16:57:03 crc kubenswrapper[4707]: I1127 16:57:03.370500 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a1afff1d55478d3ed92746e8f3ccbccfdab25f9adb3d5128b738e7a70e60b55d"} err="failed to get container status \"a1afff1d55478d3ed92746e8f3ccbccfdab25f9adb3d5128b738e7a70e60b55d\": rpc error: code = NotFound desc = could not find container 
\"a1afff1d55478d3ed92746e8f3ccbccfdab25f9adb3d5128b738e7a70e60b55d\": container with ID starting with a1afff1d55478d3ed92746e8f3ccbccfdab25f9adb3d5128b738e7a70e60b55d not found: ID does not exist" Nov 27 16:57:03 crc kubenswrapper[4707]: I1127 16:57:03.500165 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/prometheus-metric-storage-0"] Nov 27 16:57:03 crc kubenswrapper[4707]: I1127 16:57:03.510276 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/prometheus-metric-storage-0"] Nov 27 16:57:03 crc kubenswrapper[4707]: I1127 16:57:03.525480 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/prometheus-metric-storage-0"] Nov 27 16:57:03 crc kubenswrapper[4707]: E1127 16:57:03.525962 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b4ac6b43-b646-4f4d-b04c-e911806728a8" containerName="thanos-sidecar" Nov 27 16:57:03 crc kubenswrapper[4707]: I1127 16:57:03.525981 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="b4ac6b43-b646-4f4d-b04c-e911806728a8" containerName="thanos-sidecar" Nov 27 16:57:03 crc kubenswrapper[4707]: E1127 16:57:03.525996 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b4ac6b43-b646-4f4d-b04c-e911806728a8" containerName="init-config-reloader" Nov 27 16:57:03 crc kubenswrapper[4707]: I1127 16:57:03.526004 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="b4ac6b43-b646-4f4d-b04c-e911806728a8" containerName="init-config-reloader" Nov 27 16:57:03 crc kubenswrapper[4707]: E1127 16:57:03.526032 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b4ac6b43-b646-4f4d-b04c-e911806728a8" containerName="prometheus" Nov 27 16:57:03 crc kubenswrapper[4707]: I1127 16:57:03.526040 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="b4ac6b43-b646-4f4d-b04c-e911806728a8" containerName="prometheus" Nov 27 16:57:03 crc kubenswrapper[4707]: E1127 16:57:03.526055 4707 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="b4ac6b43-b646-4f4d-b04c-e911806728a8" containerName="config-reloader" Nov 27 16:57:03 crc kubenswrapper[4707]: I1127 16:57:03.526065 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="b4ac6b43-b646-4f4d-b04c-e911806728a8" containerName="config-reloader" Nov 27 16:57:03 crc kubenswrapper[4707]: I1127 16:57:03.526270 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="b4ac6b43-b646-4f4d-b04c-e911806728a8" containerName="thanos-sidecar" Nov 27 16:57:03 crc kubenswrapper[4707]: I1127 16:57:03.526315 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="b4ac6b43-b646-4f4d-b04c-e911806728a8" containerName="config-reloader" Nov 27 16:57:03 crc kubenswrapper[4707]: I1127 16:57:03.526335 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="b4ac6b43-b646-4f4d-b04c-e911806728a8" containerName="prometheus" Nov 27 16:57:03 crc kubenswrapper[4707]: I1127 16:57:03.528567 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Nov 27 16:57:03 crc kubenswrapper[4707]: I1127 16:57:03.530589 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-metric-storage-prometheus-svc" Nov 27 16:57:03 crc kubenswrapper[4707]: I1127 16:57:03.530890 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-thanos-prometheus-http-client-file" Nov 27 16:57:03 crc kubenswrapper[4707]: I1127 16:57:03.531221 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"metric-storage-prometheus-dockercfg-fmtlt" Nov 27 16:57:03 crc kubenswrapper[4707]: I1127 16:57:03.531410 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage" Nov 27 16:57:03 crc kubenswrapper[4707]: I1127 16:57:03.531533 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-web-config" Nov 27 16:57:03 crc 
kubenswrapper[4707]: I1127 16:57:03.534663 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-0" Nov 27 16:57:03 crc kubenswrapper[4707]: I1127 16:57:03.537563 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-tls-assets-0" Nov 27 16:57:03 crc kubenswrapper[4707]: I1127 16:57:03.542258 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Nov 27 16:57:03 crc kubenswrapper[4707]: I1127 16:57:03.635704 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/8635e90e-4a8c-4332-ad1c-1d46f8a15834-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"8635e90e-4a8c-4332-ad1c-1d46f8a15834\") " pod="openstack/prometheus-metric-storage-0" Nov 27 16:57:03 crc kubenswrapper[4707]: I1127 16:57:03.635828 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v5g8g\" (UniqueName: \"kubernetes.io/projected/8635e90e-4a8c-4332-ad1c-1d46f8a15834-kube-api-access-v5g8g\") pod \"prometheus-metric-storage-0\" (UID: \"8635e90e-4a8c-4332-ad1c-1d46f8a15834\") " pod="openstack/prometheus-metric-storage-0" Nov 27 16:57:03 crc kubenswrapper[4707]: I1127 16:57:03.635864 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-2e34e9c0-2aa8-41c4-8b3b-646a52e4cb09\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-2e34e9c0-2aa8-41c4-8b3b-646a52e4cb09\") pod \"prometheus-metric-storage-0\" (UID: \"8635e90e-4a8c-4332-ad1c-1d46f8a15834\") " pod="openstack/prometheus-metric-storage-0" Nov 27 16:57:03 crc kubenswrapper[4707]: I1127 16:57:03.635924 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: 
\"kubernetes.io/projected/8635e90e-4a8c-4332-ad1c-1d46f8a15834-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"8635e90e-4a8c-4332-ad1c-1d46f8a15834\") " pod="openstack/prometheus-metric-storage-0" Nov 27 16:57:03 crc kubenswrapper[4707]: I1127 16:57:03.635965 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/8635e90e-4a8c-4332-ad1c-1d46f8a15834-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"8635e90e-4a8c-4332-ad1c-1d46f8a15834\") " pod="openstack/prometheus-metric-storage-0" Nov 27 16:57:03 crc kubenswrapper[4707]: I1127 16:57:03.635990 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/8635e90e-4a8c-4332-ad1c-1d46f8a15834-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"8635e90e-4a8c-4332-ad1c-1d46f8a15834\") " pod="openstack/prometheus-metric-storage-0" Nov 27 16:57:03 crc kubenswrapper[4707]: I1127 16:57:03.636048 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/8635e90e-4a8c-4332-ad1c-1d46f8a15834-config\") pod \"prometheus-metric-storage-0\" (UID: \"8635e90e-4a8c-4332-ad1c-1d46f8a15834\") " pod="openstack/prometheus-metric-storage-0" Nov 27 16:57:03 crc kubenswrapper[4707]: I1127 16:57:03.636089 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8635e90e-4a8c-4332-ad1c-1d46f8a15834-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"8635e90e-4a8c-4332-ad1c-1d46f8a15834\") " 
pod="openstack/prometheus-metric-storage-0" Nov 27 16:57:03 crc kubenswrapper[4707]: I1127 16:57:03.636119 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/8635e90e-4a8c-4332-ad1c-1d46f8a15834-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"8635e90e-4a8c-4332-ad1c-1d46f8a15834\") " pod="openstack/prometheus-metric-storage-0" Nov 27 16:57:03 crc kubenswrapper[4707]: I1127 16:57:03.636162 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/8635e90e-4a8c-4332-ad1c-1d46f8a15834-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"8635e90e-4a8c-4332-ad1c-1d46f8a15834\") " pod="openstack/prometheus-metric-storage-0" Nov 27 16:57:03 crc kubenswrapper[4707]: I1127 16:57:03.636192 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/8635e90e-4a8c-4332-ad1c-1d46f8a15834-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"8635e90e-4a8c-4332-ad1c-1d46f8a15834\") " pod="openstack/prometheus-metric-storage-0" Nov 27 16:57:03 crc kubenswrapper[4707]: I1127 16:57:03.738069 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v5g8g\" (UniqueName: \"kubernetes.io/projected/8635e90e-4a8c-4332-ad1c-1d46f8a15834-kube-api-access-v5g8g\") pod \"prometheus-metric-storage-0\" (UID: \"8635e90e-4a8c-4332-ad1c-1d46f8a15834\") " pod="openstack/prometheus-metric-storage-0" Nov 27 16:57:03 crc kubenswrapper[4707]: I1127 16:57:03.738108 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-2e34e9c0-2aa8-41c4-8b3b-646a52e4cb09\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-2e34e9c0-2aa8-41c4-8b3b-646a52e4cb09\") pod \"prometheus-metric-storage-0\" (UID: \"8635e90e-4a8c-4332-ad1c-1d46f8a15834\") " pod="openstack/prometheus-metric-storage-0" Nov 27 16:57:03 crc kubenswrapper[4707]: I1127 16:57:03.738158 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/8635e90e-4a8c-4332-ad1c-1d46f8a15834-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"8635e90e-4a8c-4332-ad1c-1d46f8a15834\") " pod="openstack/prometheus-metric-storage-0" Nov 27 16:57:03 crc kubenswrapper[4707]: I1127 16:57:03.738188 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/8635e90e-4a8c-4332-ad1c-1d46f8a15834-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"8635e90e-4a8c-4332-ad1c-1d46f8a15834\") " pod="openstack/prometheus-metric-storage-0" Nov 27 16:57:03 crc kubenswrapper[4707]: I1127 16:57:03.738205 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/8635e90e-4a8c-4332-ad1c-1d46f8a15834-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"8635e90e-4a8c-4332-ad1c-1d46f8a15834\") " pod="openstack/prometheus-metric-storage-0" Nov 27 16:57:03 crc kubenswrapper[4707]: I1127 16:57:03.738885 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/8635e90e-4a8c-4332-ad1c-1d46f8a15834-config\") pod \"prometheus-metric-storage-0\" (UID: \"8635e90e-4a8c-4332-ad1c-1d46f8a15834\") " pod="openstack/prometheus-metric-storage-0" Nov 27 16:57:03 crc kubenswrapper[4707]: I1127 
16:57:03.738939 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8635e90e-4a8c-4332-ad1c-1d46f8a15834-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"8635e90e-4a8c-4332-ad1c-1d46f8a15834\") " pod="openstack/prometheus-metric-storage-0" Nov 27 16:57:03 crc kubenswrapper[4707]: I1127 16:57:03.738976 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/8635e90e-4a8c-4332-ad1c-1d46f8a15834-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"8635e90e-4a8c-4332-ad1c-1d46f8a15834\") " pod="openstack/prometheus-metric-storage-0" Nov 27 16:57:03 crc kubenswrapper[4707]: I1127 16:57:03.739016 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/8635e90e-4a8c-4332-ad1c-1d46f8a15834-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"8635e90e-4a8c-4332-ad1c-1d46f8a15834\") " pod="openstack/prometheus-metric-storage-0" Nov 27 16:57:03 crc kubenswrapper[4707]: I1127 16:57:03.739044 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/8635e90e-4a8c-4332-ad1c-1d46f8a15834-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"8635e90e-4a8c-4332-ad1c-1d46f8a15834\") " pod="openstack/prometheus-metric-storage-0" Nov 27 16:57:03 crc kubenswrapper[4707]: I1127 16:57:03.739091 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/8635e90e-4a8c-4332-ad1c-1d46f8a15834-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"8635e90e-4a8c-4332-ad1c-1d46f8a15834\") " pod="openstack/prometheus-metric-storage-0" Nov 27 16:57:03 crc 
kubenswrapper[4707]: I1127 16:57:03.739927 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/8635e90e-4a8c-4332-ad1c-1d46f8a15834-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"8635e90e-4a8c-4332-ad1c-1d46f8a15834\") " pod="openstack/prometheus-metric-storage-0" Nov 27 16:57:03 crc kubenswrapper[4707]: I1127 16:57:03.742489 4707 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Nov 27 16:57:03 crc kubenswrapper[4707]: I1127 16:57:03.742524 4707 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-2e34e9c0-2aa8-41c4-8b3b-646a52e4cb09\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-2e34e9c0-2aa8-41c4-8b3b-646a52e4cb09\") pod \"prometheus-metric-storage-0\" (UID: \"8635e90e-4a8c-4332-ad1c-1d46f8a15834\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/38f6cd8bb482fd46a18189c3288fcbb9a3e0d2c6dfce600cb079b84af3ec796b/globalmount\"" pod="openstack/prometheus-metric-storage-0" Nov 27 16:57:03 crc kubenswrapper[4707]: I1127 16:57:03.742825 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/8635e90e-4a8c-4332-ad1c-1d46f8a15834-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"8635e90e-4a8c-4332-ad1c-1d46f8a15834\") " pod="openstack/prometheus-metric-storage-0" Nov 27 16:57:03 crc kubenswrapper[4707]: I1127 16:57:03.742888 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: 
\"kubernetes.io/secret/8635e90e-4a8c-4332-ad1c-1d46f8a15834-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"8635e90e-4a8c-4332-ad1c-1d46f8a15834\") " pod="openstack/prometheus-metric-storage-0" Nov 27 16:57:03 crc kubenswrapper[4707]: I1127 16:57:03.744012 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/8635e90e-4a8c-4332-ad1c-1d46f8a15834-config\") pod \"prometheus-metric-storage-0\" (UID: \"8635e90e-4a8c-4332-ad1c-1d46f8a15834\") " pod="openstack/prometheus-metric-storage-0" Nov 27 16:57:03 crc kubenswrapper[4707]: I1127 16:57:03.745176 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/8635e90e-4a8c-4332-ad1c-1d46f8a15834-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"8635e90e-4a8c-4332-ad1c-1d46f8a15834\") " pod="openstack/prometheus-metric-storage-0" Nov 27 16:57:03 crc kubenswrapper[4707]: I1127 16:57:03.745209 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/8635e90e-4a8c-4332-ad1c-1d46f8a15834-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"8635e90e-4a8c-4332-ad1c-1d46f8a15834\") " pod="openstack/prometheus-metric-storage-0" Nov 27 16:57:03 crc kubenswrapper[4707]: I1127 16:57:03.746475 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8635e90e-4a8c-4332-ad1c-1d46f8a15834-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"8635e90e-4a8c-4332-ad1c-1d46f8a15834\") " pod="openstack/prometheus-metric-storage-0" Nov 27 16:57:03 crc kubenswrapper[4707]: I1127 16:57:03.746789 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: 
\"kubernetes.io/secret/8635e90e-4a8c-4332-ad1c-1d46f8a15834-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"8635e90e-4a8c-4332-ad1c-1d46f8a15834\") " pod="openstack/prometheus-metric-storage-0" Nov 27 16:57:03 crc kubenswrapper[4707]: I1127 16:57:03.748652 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/8635e90e-4a8c-4332-ad1c-1d46f8a15834-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"8635e90e-4a8c-4332-ad1c-1d46f8a15834\") " pod="openstack/prometheus-metric-storage-0" Nov 27 16:57:03 crc kubenswrapper[4707]: I1127 16:57:03.758357 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v5g8g\" (UniqueName: \"kubernetes.io/projected/8635e90e-4a8c-4332-ad1c-1d46f8a15834-kube-api-access-v5g8g\") pod \"prometheus-metric-storage-0\" (UID: \"8635e90e-4a8c-4332-ad1c-1d46f8a15834\") " pod="openstack/prometheus-metric-storage-0" Nov 27 16:57:03 crc kubenswrapper[4707]: I1127 16:57:03.794343 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-2e34e9c0-2aa8-41c4-8b3b-646a52e4cb09\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-2e34e9c0-2aa8-41c4-8b3b-646a52e4cb09\") pod \"prometheus-metric-storage-0\" (UID: \"8635e90e-4a8c-4332-ad1c-1d46f8a15834\") " pod="openstack/prometheus-metric-storage-0" Nov 27 16:57:03 crc kubenswrapper[4707]: I1127 16:57:03.870103 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/prometheus-metric-storage-0" Nov 27 16:57:04 crc kubenswrapper[4707]: I1127 16:57:04.334991 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Nov 27 16:57:05 crc kubenswrapper[4707]: I1127 16:57:05.206979 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b4ac6b43-b646-4f4d-b04c-e911806728a8" path="/var/lib/kubelet/pods/b4ac6b43-b646-4f4d-b04c-e911806728a8/volumes" Nov 27 16:57:05 crc kubenswrapper[4707]: I1127 16:57:05.208422 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f46a6b5b-2709-4fe9-8db4-cb2df241d728" path="/var/lib/kubelet/pods/f46a6b5b-2709-4fe9-8db4-cb2df241d728/volumes" Nov 27 16:57:05 crc kubenswrapper[4707]: I1127 16:57:05.209690 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"8635e90e-4a8c-4332-ad1c-1d46f8a15834","Type":"ContainerStarted","Data":"7f5448391d60bb775b2682bf8bbba49e98ac8af80f01cbc6fcd654b0737ac5ea"} Nov 27 16:57:08 crc kubenswrapper[4707]: I1127 16:57:08.226514 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"8635e90e-4a8c-4332-ad1c-1d46f8a15834","Type":"ContainerStarted","Data":"a578aa3565790a0799de06a734dcd9d99071d9b5e1f1006addbd6b20994966fc"} Nov 27 16:57:10 crc kubenswrapper[4707]: I1127 16:57:10.195478 4707 scope.go:117] "RemoveContainer" containerID="922f6f006f59568ee3b2de9c7061fa21de9cccddc36312f2d9ef753b4834d084" Nov 27 16:57:10 crc kubenswrapper[4707]: E1127 16:57:10.196340 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-c995m_openshift-machine-config-operator(a83beb0d-8dd1-434a-ace2-933f98e3956f)\"" pod="openshift-machine-config-operator/machine-config-daemon-c995m" 
podUID="a83beb0d-8dd1-434a-ace2-933f98e3956f" Nov 27 16:57:15 crc kubenswrapper[4707]: I1127 16:57:15.303218 4707 generic.go:334] "Generic (PLEG): container finished" podID="8635e90e-4a8c-4332-ad1c-1d46f8a15834" containerID="a578aa3565790a0799de06a734dcd9d99071d9b5e1f1006addbd6b20994966fc" exitCode=0 Nov 27 16:57:15 crc kubenswrapper[4707]: I1127 16:57:15.303478 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"8635e90e-4a8c-4332-ad1c-1d46f8a15834","Type":"ContainerDied","Data":"a578aa3565790a0799de06a734dcd9d99071d9b5e1f1006addbd6b20994966fc"} Nov 27 16:57:16 crc kubenswrapper[4707]: I1127 16:57:16.318313 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"8635e90e-4a8c-4332-ad1c-1d46f8a15834","Type":"ContainerStarted","Data":"fbf62ab5cbe7e4b868486610d78d9a511e46167fd1a60ccb688d1e4be54f79ba"} Nov 27 16:57:20 crc kubenswrapper[4707]: I1127 16:57:20.380446 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"8635e90e-4a8c-4332-ad1c-1d46f8a15834","Type":"ContainerStarted","Data":"8cb294eb708386b32b85e50ab0c3bf0011f9edfb03d1d38e5d3e1cc2e614cd76"} Nov 27 16:57:20 crc kubenswrapper[4707]: I1127 16:57:20.380862 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"8635e90e-4a8c-4332-ad1c-1d46f8a15834","Type":"ContainerStarted","Data":"f64d1db92398691d7a8708ca36a72654489b8a39e1bf795d4684d045bc67c7e5"} Nov 27 16:57:20 crc kubenswrapper[4707]: I1127 16:57:20.433260 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/prometheus-metric-storage-0" podStartSLOduration=17.433180731 podStartE2EDuration="17.433180731s" podCreationTimestamp="2025-11-27 16:57:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-27 
16:57:20.425011469 +0000 UTC m=+3216.056460317" watchObservedRunningTime="2025-11-27 16:57:20.433180731 +0000 UTC m=+3216.064629509" Nov 27 16:57:23 crc kubenswrapper[4707]: I1127 16:57:23.870703 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/prometheus-metric-storage-0" Nov 27 16:57:24 crc kubenswrapper[4707]: I1127 16:57:24.195288 4707 scope.go:117] "RemoveContainer" containerID="922f6f006f59568ee3b2de9c7061fa21de9cccddc36312f2d9ef753b4834d084" Nov 27 16:57:24 crc kubenswrapper[4707]: E1127 16:57:24.195660 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-c995m_openshift-machine-config-operator(a83beb0d-8dd1-434a-ace2-933f98e3956f)\"" pod="openshift-machine-config-operator/machine-config-daemon-c995m" podUID="a83beb0d-8dd1-434a-ace2-933f98e3956f" Nov 27 16:57:31 crc kubenswrapper[4707]: I1127 16:57:31.270514 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-0" Nov 27 16:57:31 crc kubenswrapper[4707]: I1127 16:57:31.452856 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6b244f80-b143-4626-a0ac-9cfe02a60f7f-config-data\") pod \"6b244f80-b143-4626-a0ac-9cfe02a60f7f\" (UID: \"6b244f80-b143-4626-a0ac-9cfe02a60f7f\") " Nov 27 16:57:31 crc kubenswrapper[4707]: I1127 16:57:31.453252 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6b244f80-b143-4626-a0ac-9cfe02a60f7f-combined-ca-bundle\") pod \"6b244f80-b143-4626-a0ac-9cfe02a60f7f\" (UID: \"6b244f80-b143-4626-a0ac-9cfe02a60f7f\") " Nov 27 16:57:31 crc kubenswrapper[4707]: I1127 16:57:31.453291 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6b244f80-b143-4626-a0ac-9cfe02a60f7f-scripts\") pod \"6b244f80-b143-4626-a0ac-9cfe02a60f7f\" (UID: \"6b244f80-b143-4626-a0ac-9cfe02a60f7f\") " Nov 27 16:57:31 crc kubenswrapper[4707]: I1127 16:57:31.453427 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6b244f80-b143-4626-a0ac-9cfe02a60f7f-internal-tls-certs\") pod \"6b244f80-b143-4626-a0ac-9cfe02a60f7f\" (UID: \"6b244f80-b143-4626-a0ac-9cfe02a60f7f\") " Nov 27 16:57:31 crc kubenswrapper[4707]: I1127 16:57:31.453473 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sj8ms\" (UniqueName: \"kubernetes.io/projected/6b244f80-b143-4626-a0ac-9cfe02a60f7f-kube-api-access-sj8ms\") pod \"6b244f80-b143-4626-a0ac-9cfe02a60f7f\" (UID: \"6b244f80-b143-4626-a0ac-9cfe02a60f7f\") " Nov 27 16:57:31 crc kubenswrapper[4707]: I1127 16:57:31.453581 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/6b244f80-b143-4626-a0ac-9cfe02a60f7f-public-tls-certs\") pod \"6b244f80-b143-4626-a0ac-9cfe02a60f7f\" (UID: \"6b244f80-b143-4626-a0ac-9cfe02a60f7f\") " Nov 27 16:57:31 crc kubenswrapper[4707]: I1127 16:57:31.458976 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6b244f80-b143-4626-a0ac-9cfe02a60f7f-kube-api-access-sj8ms" (OuterVolumeSpecName: "kube-api-access-sj8ms") pod "6b244f80-b143-4626-a0ac-9cfe02a60f7f" (UID: "6b244f80-b143-4626-a0ac-9cfe02a60f7f"). InnerVolumeSpecName "kube-api-access-sj8ms". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 16:57:31 crc kubenswrapper[4707]: I1127 16:57:31.471693 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6b244f80-b143-4626-a0ac-9cfe02a60f7f-scripts" (OuterVolumeSpecName: "scripts") pod "6b244f80-b143-4626-a0ac-9cfe02a60f7f" (UID: "6b244f80-b143-4626-a0ac-9cfe02a60f7f"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 16:57:31 crc kubenswrapper[4707]: I1127 16:57:31.515417 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6b244f80-b143-4626-a0ac-9cfe02a60f7f-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "6b244f80-b143-4626-a0ac-9cfe02a60f7f" (UID: "6b244f80-b143-4626-a0ac-9cfe02a60f7f"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 16:57:31 crc kubenswrapper[4707]: I1127 16:57:31.517238 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6b244f80-b143-4626-a0ac-9cfe02a60f7f-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "6b244f80-b143-4626-a0ac-9cfe02a60f7f" (UID: "6b244f80-b143-4626-a0ac-9cfe02a60f7f"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 16:57:31 crc kubenswrapper[4707]: I1127 16:57:31.528125 4707 generic.go:334] "Generic (PLEG): container finished" podID="6b244f80-b143-4626-a0ac-9cfe02a60f7f" containerID="8d9f0333933b3a3451a35a04f25758a127c22861a46d33944526705f9eb88c53" exitCode=137 Nov 27 16:57:31 crc kubenswrapper[4707]: I1127 16:57:31.528190 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"6b244f80-b143-4626-a0ac-9cfe02a60f7f","Type":"ContainerDied","Data":"8d9f0333933b3a3451a35a04f25758a127c22861a46d33944526705f9eb88c53"} Nov 27 16:57:31 crc kubenswrapper[4707]: I1127 16:57:31.528222 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"6b244f80-b143-4626-a0ac-9cfe02a60f7f","Type":"ContainerDied","Data":"f405e16e80572f93a889892a0182b0ba6eef4fb58c8b4d7282ed27f5e806fd1b"} Nov 27 16:57:31 crc kubenswrapper[4707]: I1127 16:57:31.528242 4707 scope.go:117] "RemoveContainer" containerID="e6c08fd6301952622e3d04cac64e0d57da7e473e25a4d74a46cba8231bd69828" Nov 27 16:57:31 crc kubenswrapper[4707]: I1127 16:57:31.528437 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-0" Nov 27 16:57:31 crc kubenswrapper[4707]: I1127 16:57:31.556595 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6b244f80-b143-4626-a0ac-9cfe02a60f7f-scripts\") on node \"crc\" DevicePath \"\"" Nov 27 16:57:31 crc kubenswrapper[4707]: I1127 16:57:31.556844 4707 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6b244f80-b143-4626-a0ac-9cfe02a60f7f-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Nov 27 16:57:31 crc kubenswrapper[4707]: I1127 16:57:31.556907 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sj8ms\" (UniqueName: \"kubernetes.io/projected/6b244f80-b143-4626-a0ac-9cfe02a60f7f-kube-api-access-sj8ms\") on node \"crc\" DevicePath \"\"" Nov 27 16:57:31 crc kubenswrapper[4707]: I1127 16:57:31.556988 4707 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6b244f80-b143-4626-a0ac-9cfe02a60f7f-public-tls-certs\") on node \"crc\" DevicePath \"\"" Nov 27 16:57:31 crc kubenswrapper[4707]: I1127 16:57:31.570360 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6b244f80-b143-4626-a0ac-9cfe02a60f7f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6b244f80-b143-4626-a0ac-9cfe02a60f7f" (UID: "6b244f80-b143-4626-a0ac-9cfe02a60f7f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 16:57:31 crc kubenswrapper[4707]: I1127 16:57:31.598926 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6b244f80-b143-4626-a0ac-9cfe02a60f7f-config-data" (OuterVolumeSpecName: "config-data") pod "6b244f80-b143-4626-a0ac-9cfe02a60f7f" (UID: "6b244f80-b143-4626-a0ac-9cfe02a60f7f"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 16:57:31 crc kubenswrapper[4707]: I1127 16:57:31.599339 4707 scope.go:117] "RemoveContainer" containerID="8d9f0333933b3a3451a35a04f25758a127c22861a46d33944526705f9eb88c53" Nov 27 16:57:31 crc kubenswrapper[4707]: I1127 16:57:31.617943 4707 scope.go:117] "RemoveContainer" containerID="912810a4fdbdf459530f3d976da5d91520e9c52fc782fc8fa906be72442c9651" Nov 27 16:57:31 crc kubenswrapper[4707]: I1127 16:57:31.637144 4707 scope.go:117] "RemoveContainer" containerID="19161a77a990c54e1429cdabebed9be773526d73c961348c5123108a9f5c4e64" Nov 27 16:57:31 crc kubenswrapper[4707]: I1127 16:57:31.659356 4707 scope.go:117] "RemoveContainer" containerID="e6c08fd6301952622e3d04cac64e0d57da7e473e25a4d74a46cba8231bd69828" Nov 27 16:57:31 crc kubenswrapper[4707]: I1127 16:57:31.660049 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6b244f80-b143-4626-a0ac-9cfe02a60f7f-config-data\") on node \"crc\" DevicePath \"\"" Nov 27 16:57:31 crc kubenswrapper[4707]: I1127 16:57:31.660078 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6b244f80-b143-4626-a0ac-9cfe02a60f7f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 27 16:57:31 crc kubenswrapper[4707]: E1127 16:57:31.660070 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e6c08fd6301952622e3d04cac64e0d57da7e473e25a4d74a46cba8231bd69828\": container with ID starting with e6c08fd6301952622e3d04cac64e0d57da7e473e25a4d74a46cba8231bd69828 not found: ID does not exist" containerID="e6c08fd6301952622e3d04cac64e0d57da7e473e25a4d74a46cba8231bd69828" Nov 27 16:57:31 crc kubenswrapper[4707]: I1127 16:57:31.660135 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e6c08fd6301952622e3d04cac64e0d57da7e473e25a4d74a46cba8231bd69828"} 
err="failed to get container status \"e6c08fd6301952622e3d04cac64e0d57da7e473e25a4d74a46cba8231bd69828\": rpc error: code = NotFound desc = could not find container \"e6c08fd6301952622e3d04cac64e0d57da7e473e25a4d74a46cba8231bd69828\": container with ID starting with e6c08fd6301952622e3d04cac64e0d57da7e473e25a4d74a46cba8231bd69828 not found: ID does not exist" Nov 27 16:57:31 crc kubenswrapper[4707]: I1127 16:57:31.660177 4707 scope.go:117] "RemoveContainer" containerID="8d9f0333933b3a3451a35a04f25758a127c22861a46d33944526705f9eb88c53" Nov 27 16:57:31 crc kubenswrapper[4707]: E1127 16:57:31.660698 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8d9f0333933b3a3451a35a04f25758a127c22861a46d33944526705f9eb88c53\": container with ID starting with 8d9f0333933b3a3451a35a04f25758a127c22861a46d33944526705f9eb88c53 not found: ID does not exist" containerID="8d9f0333933b3a3451a35a04f25758a127c22861a46d33944526705f9eb88c53" Nov 27 16:57:31 crc kubenswrapper[4707]: I1127 16:57:31.660753 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8d9f0333933b3a3451a35a04f25758a127c22861a46d33944526705f9eb88c53"} err="failed to get container status \"8d9f0333933b3a3451a35a04f25758a127c22861a46d33944526705f9eb88c53\": rpc error: code = NotFound desc = could not find container \"8d9f0333933b3a3451a35a04f25758a127c22861a46d33944526705f9eb88c53\": container with ID starting with 8d9f0333933b3a3451a35a04f25758a127c22861a46d33944526705f9eb88c53 not found: ID does not exist" Nov 27 16:57:31 crc kubenswrapper[4707]: I1127 16:57:31.660795 4707 scope.go:117] "RemoveContainer" containerID="912810a4fdbdf459530f3d976da5d91520e9c52fc782fc8fa906be72442c9651" Nov 27 16:57:31 crc kubenswrapper[4707]: E1127 16:57:31.661110 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"912810a4fdbdf459530f3d976da5d91520e9c52fc782fc8fa906be72442c9651\": container with ID starting with 912810a4fdbdf459530f3d976da5d91520e9c52fc782fc8fa906be72442c9651 not found: ID does not exist" containerID="912810a4fdbdf459530f3d976da5d91520e9c52fc782fc8fa906be72442c9651" Nov 27 16:57:31 crc kubenswrapper[4707]: I1127 16:57:31.661141 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"912810a4fdbdf459530f3d976da5d91520e9c52fc782fc8fa906be72442c9651"} err="failed to get container status \"912810a4fdbdf459530f3d976da5d91520e9c52fc782fc8fa906be72442c9651\": rpc error: code = NotFound desc = could not find container \"912810a4fdbdf459530f3d976da5d91520e9c52fc782fc8fa906be72442c9651\": container with ID starting with 912810a4fdbdf459530f3d976da5d91520e9c52fc782fc8fa906be72442c9651 not found: ID does not exist" Nov 27 16:57:31 crc kubenswrapper[4707]: I1127 16:57:31.661163 4707 scope.go:117] "RemoveContainer" containerID="19161a77a990c54e1429cdabebed9be773526d73c961348c5123108a9f5c4e64" Nov 27 16:57:31 crc kubenswrapper[4707]: E1127 16:57:31.661568 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"19161a77a990c54e1429cdabebed9be773526d73c961348c5123108a9f5c4e64\": container with ID starting with 19161a77a990c54e1429cdabebed9be773526d73c961348c5123108a9f5c4e64 not found: ID does not exist" containerID="19161a77a990c54e1429cdabebed9be773526d73c961348c5123108a9f5c4e64" Nov 27 16:57:31 crc kubenswrapper[4707]: I1127 16:57:31.661600 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"19161a77a990c54e1429cdabebed9be773526d73c961348c5123108a9f5c4e64"} err="failed to get container status \"19161a77a990c54e1429cdabebed9be773526d73c961348c5123108a9f5c4e64\": rpc error: code = NotFound desc = could not find container \"19161a77a990c54e1429cdabebed9be773526d73c961348c5123108a9f5c4e64\": container with ID 
starting with 19161a77a990c54e1429cdabebed9be773526d73c961348c5123108a9f5c4e64 not found: ID does not exist" Nov 27 16:57:31 crc kubenswrapper[4707]: I1127 16:57:31.883786 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodh-0"] Nov 27 16:57:31 crc kubenswrapper[4707]: I1127 16:57:31.898127 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/aodh-0"] Nov 27 16:57:31 crc kubenswrapper[4707]: I1127 16:57:31.912727 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/aodh-0"] Nov 27 16:57:31 crc kubenswrapper[4707]: E1127 16:57:31.913230 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6b244f80-b143-4626-a0ac-9cfe02a60f7f" containerName="aodh-listener" Nov 27 16:57:31 crc kubenswrapper[4707]: I1127 16:57:31.913242 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="6b244f80-b143-4626-a0ac-9cfe02a60f7f" containerName="aodh-listener" Nov 27 16:57:31 crc kubenswrapper[4707]: E1127 16:57:31.913263 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6b244f80-b143-4626-a0ac-9cfe02a60f7f" containerName="aodh-api" Nov 27 16:57:31 crc kubenswrapper[4707]: I1127 16:57:31.913269 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="6b244f80-b143-4626-a0ac-9cfe02a60f7f" containerName="aodh-api" Nov 27 16:57:31 crc kubenswrapper[4707]: E1127 16:57:31.913289 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6b244f80-b143-4626-a0ac-9cfe02a60f7f" containerName="aodh-notifier" Nov 27 16:57:31 crc kubenswrapper[4707]: I1127 16:57:31.913295 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="6b244f80-b143-4626-a0ac-9cfe02a60f7f" containerName="aodh-notifier" Nov 27 16:57:31 crc kubenswrapper[4707]: E1127 16:57:31.913311 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6b244f80-b143-4626-a0ac-9cfe02a60f7f" containerName="aodh-evaluator" Nov 27 16:57:31 crc kubenswrapper[4707]: I1127 16:57:31.913316 4707 state_mem.go:107] "Deleted 
CPUSet assignment" podUID="6b244f80-b143-4626-a0ac-9cfe02a60f7f" containerName="aodh-evaluator" Nov 27 16:57:31 crc kubenswrapper[4707]: I1127 16:57:31.913500 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="6b244f80-b143-4626-a0ac-9cfe02a60f7f" containerName="aodh-api" Nov 27 16:57:31 crc kubenswrapper[4707]: I1127 16:57:31.913515 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="6b244f80-b143-4626-a0ac-9cfe02a60f7f" containerName="aodh-notifier" Nov 27 16:57:31 crc kubenswrapper[4707]: I1127 16:57:31.913538 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="6b244f80-b143-4626-a0ac-9cfe02a60f7f" containerName="aodh-listener" Nov 27 16:57:31 crc kubenswrapper[4707]: I1127 16:57:31.913559 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="6b244f80-b143-4626-a0ac-9cfe02a60f7f" containerName="aodh-evaluator" Nov 27 16:57:31 crc kubenswrapper[4707]: I1127 16:57:31.915666 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-0" Nov 27 16:57:31 crc kubenswrapper[4707]: I1127 16:57:31.922565 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-0"] Nov 27 16:57:31 crc kubenswrapper[4707]: I1127 16:57:31.925452 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-autoscaling-dockercfg-pfvsr" Nov 27 16:57:31 crc kubenswrapper[4707]: I1127 16:57:31.925503 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-config-data" Nov 27 16:57:31 crc kubenswrapper[4707]: I1127 16:57:31.925633 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-scripts" Nov 27 16:57:31 crc kubenswrapper[4707]: I1127 16:57:31.925735 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-aodh-public-svc" Nov 27 16:57:31 crc kubenswrapper[4707]: I1127 16:57:31.925934 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-aodh-internal-svc" Nov 27 16:57:31 crc kubenswrapper[4707]: I1127 16:57:31.966493 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6e2538ea-8394-4e99-8f0c-74895d703440-config-data\") pod \"aodh-0\" (UID: \"6e2538ea-8394-4e99-8f0c-74895d703440\") " pod="openstack/aodh-0" Nov 27 16:57:31 crc kubenswrapper[4707]: I1127 16:57:31.966572 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6e2538ea-8394-4e99-8f0c-74895d703440-public-tls-certs\") pod \"aodh-0\" (UID: \"6e2538ea-8394-4e99-8f0c-74895d703440\") " pod="openstack/aodh-0" Nov 27 16:57:31 crc kubenswrapper[4707]: I1127 16:57:31.966753 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/6e2538ea-8394-4e99-8f0c-74895d703440-scripts\") pod \"aodh-0\" (UID: \"6e2538ea-8394-4e99-8f0c-74895d703440\") " pod="openstack/aodh-0" Nov 27 16:57:31 crc kubenswrapper[4707]: I1127 16:57:31.966890 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6e2538ea-8394-4e99-8f0c-74895d703440-internal-tls-certs\") pod \"aodh-0\" (UID: \"6e2538ea-8394-4e99-8f0c-74895d703440\") " pod="openstack/aodh-0" Nov 27 16:57:31 crc kubenswrapper[4707]: I1127 16:57:31.966928 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6e2538ea-8394-4e99-8f0c-74895d703440-combined-ca-bundle\") pod \"aodh-0\" (UID: \"6e2538ea-8394-4e99-8f0c-74895d703440\") " pod="openstack/aodh-0" Nov 27 16:57:31 crc kubenswrapper[4707]: I1127 16:57:31.966944 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fsvrm\" (UniqueName: \"kubernetes.io/projected/6e2538ea-8394-4e99-8f0c-74895d703440-kube-api-access-fsvrm\") pod \"aodh-0\" (UID: \"6e2538ea-8394-4e99-8f0c-74895d703440\") " pod="openstack/aodh-0" Nov 27 16:57:32 crc kubenswrapper[4707]: I1127 16:57:32.067874 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6e2538ea-8394-4e99-8f0c-74895d703440-config-data\") pod \"aodh-0\" (UID: \"6e2538ea-8394-4e99-8f0c-74895d703440\") " pod="openstack/aodh-0" Nov 27 16:57:32 crc kubenswrapper[4707]: I1127 16:57:32.068345 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6e2538ea-8394-4e99-8f0c-74895d703440-public-tls-certs\") pod \"aodh-0\" (UID: \"6e2538ea-8394-4e99-8f0c-74895d703440\") " pod="openstack/aodh-0" Nov 27 16:57:32 crc kubenswrapper[4707]: 
I1127 16:57:32.068499 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6e2538ea-8394-4e99-8f0c-74895d703440-scripts\") pod \"aodh-0\" (UID: \"6e2538ea-8394-4e99-8f0c-74895d703440\") " pod="openstack/aodh-0" Nov 27 16:57:32 crc kubenswrapper[4707]: I1127 16:57:32.068641 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6e2538ea-8394-4e99-8f0c-74895d703440-internal-tls-certs\") pod \"aodh-0\" (UID: \"6e2538ea-8394-4e99-8f0c-74895d703440\") " pod="openstack/aodh-0" Nov 27 16:57:32 crc kubenswrapper[4707]: I1127 16:57:32.068730 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6e2538ea-8394-4e99-8f0c-74895d703440-combined-ca-bundle\") pod \"aodh-0\" (UID: \"6e2538ea-8394-4e99-8f0c-74895d703440\") " pod="openstack/aodh-0" Nov 27 16:57:32 crc kubenswrapper[4707]: I1127 16:57:32.068798 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fsvrm\" (UniqueName: \"kubernetes.io/projected/6e2538ea-8394-4e99-8f0c-74895d703440-kube-api-access-fsvrm\") pod \"aodh-0\" (UID: \"6e2538ea-8394-4e99-8f0c-74895d703440\") " pod="openstack/aodh-0" Nov 27 16:57:32 crc kubenswrapper[4707]: I1127 16:57:32.071901 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6e2538ea-8394-4e99-8f0c-74895d703440-scripts\") pod \"aodh-0\" (UID: \"6e2538ea-8394-4e99-8f0c-74895d703440\") " pod="openstack/aodh-0" Nov 27 16:57:32 crc kubenswrapper[4707]: I1127 16:57:32.072008 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6e2538ea-8394-4e99-8f0c-74895d703440-public-tls-certs\") pod \"aodh-0\" (UID: \"6e2538ea-8394-4e99-8f0c-74895d703440\") " 
pod="openstack/aodh-0" Nov 27 16:57:32 crc kubenswrapper[4707]: I1127 16:57:32.072576 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6e2538ea-8394-4e99-8f0c-74895d703440-config-data\") pod \"aodh-0\" (UID: \"6e2538ea-8394-4e99-8f0c-74895d703440\") " pod="openstack/aodh-0" Nov 27 16:57:32 crc kubenswrapper[4707]: I1127 16:57:32.072747 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6e2538ea-8394-4e99-8f0c-74895d703440-combined-ca-bundle\") pod \"aodh-0\" (UID: \"6e2538ea-8394-4e99-8f0c-74895d703440\") " pod="openstack/aodh-0" Nov 27 16:57:32 crc kubenswrapper[4707]: I1127 16:57:32.073070 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6e2538ea-8394-4e99-8f0c-74895d703440-internal-tls-certs\") pod \"aodh-0\" (UID: \"6e2538ea-8394-4e99-8f0c-74895d703440\") " pod="openstack/aodh-0" Nov 27 16:57:32 crc kubenswrapper[4707]: I1127 16:57:32.085406 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fsvrm\" (UniqueName: \"kubernetes.io/projected/6e2538ea-8394-4e99-8f0c-74895d703440-kube-api-access-fsvrm\") pod \"aodh-0\" (UID: \"6e2538ea-8394-4e99-8f0c-74895d703440\") " pod="openstack/aodh-0" Nov 27 16:57:32 crc kubenswrapper[4707]: I1127 16:57:32.240932 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-0" Nov 27 16:57:32 crc kubenswrapper[4707]: W1127 16:57:32.692766 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6e2538ea_8394_4e99_8f0c_74895d703440.slice/crio-2ad324d6c714e839d6623c90aa798ae2c38c69620d5e877815813f20e80770cf WatchSource:0}: Error finding container 2ad324d6c714e839d6623c90aa798ae2c38c69620d5e877815813f20e80770cf: Status 404 returned error can't find the container with id 2ad324d6c714e839d6623c90aa798ae2c38c69620d5e877815813f20e80770cf Nov 27 16:57:32 crc kubenswrapper[4707]: I1127 16:57:32.700057 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-0"] Nov 27 16:57:33 crc kubenswrapper[4707]: I1127 16:57:33.217593 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6b244f80-b143-4626-a0ac-9cfe02a60f7f" path="/var/lib/kubelet/pods/6b244f80-b143-4626-a0ac-9cfe02a60f7f/volumes" Nov 27 16:57:33 crc kubenswrapper[4707]: I1127 16:57:33.568524 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"6e2538ea-8394-4e99-8f0c-74895d703440","Type":"ContainerStarted","Data":"2ad324d6c714e839d6623c90aa798ae2c38c69620d5e877815813f20e80770cf"} Nov 27 16:57:33 crc kubenswrapper[4707]: I1127 16:57:33.871612 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/prometheus-metric-storage-0" Nov 27 16:57:33 crc kubenswrapper[4707]: I1127 16:57:33.879359 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/prometheus-metric-storage-0" Nov 27 16:57:34 crc kubenswrapper[4707]: I1127 16:57:34.586731 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"6e2538ea-8394-4e99-8f0c-74895d703440","Type":"ContainerStarted","Data":"bbe473e6bba4d6205f391e056ba7ee318c81167278b2c8acc0321def0d04bf7b"} Nov 27 16:57:34 crc kubenswrapper[4707]: I1127 16:57:34.594525 4707 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/prometheus-metric-storage-0" Nov 27 16:57:35 crc kubenswrapper[4707]: I1127 16:57:35.598251 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"6e2538ea-8394-4e99-8f0c-74895d703440","Type":"ContainerStarted","Data":"c4b9fbb6df7a37225d4ce0b93d0472ccdf6d679a6531262b76a90178df52c94c"} Nov 27 16:57:36 crc kubenswrapper[4707]: I1127 16:57:36.611493 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"6e2538ea-8394-4e99-8f0c-74895d703440","Type":"ContainerStarted","Data":"1096705202ffc4c613d9c9b4692b600d4e9a239a5637512a86d73bdf96685686"} Nov 27 16:57:36 crc kubenswrapper[4707]: I1127 16:57:36.611779 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"6e2538ea-8394-4e99-8f0c-74895d703440","Type":"ContainerStarted","Data":"fcd7c3d1a0e8a381369617382222e771bb4e8d57f1514f242e0a663fa352e82e"} Nov 27 16:57:38 crc kubenswrapper[4707]: I1127 16:57:38.197021 4707 scope.go:117] "RemoveContainer" containerID="922f6f006f59568ee3b2de9c7061fa21de9cccddc36312f2d9ef753b4834d084" Nov 27 16:57:38 crc kubenswrapper[4707]: E1127 16:57:38.197606 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-c995m_openshift-machine-config-operator(a83beb0d-8dd1-434a-ace2-933f98e3956f)\"" pod="openshift-machine-config-operator/machine-config-daemon-c995m" podUID="a83beb0d-8dd1-434a-ace2-933f98e3956f" Nov 27 16:57:51 crc kubenswrapper[4707]: I1127 16:57:51.195677 4707 scope.go:117] "RemoveContainer" containerID="922f6f006f59568ee3b2de9c7061fa21de9cccddc36312f2d9ef753b4834d084" Nov 27 16:57:51 crc kubenswrapper[4707]: E1127 16:57:51.196525 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-c995m_openshift-machine-config-operator(a83beb0d-8dd1-434a-ace2-933f98e3956f)\"" pod="openshift-machine-config-operator/machine-config-daemon-c995m" podUID="a83beb0d-8dd1-434a-ace2-933f98e3956f" Nov 27 16:58:04 crc kubenswrapper[4707]: I1127 16:58:04.196975 4707 scope.go:117] "RemoveContainer" containerID="922f6f006f59568ee3b2de9c7061fa21de9cccddc36312f2d9ef753b4834d084" Nov 27 16:58:04 crc kubenswrapper[4707]: E1127 16:58:04.197811 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-c995m_openshift-machine-config-operator(a83beb0d-8dd1-434a-ace2-933f98e3956f)\"" pod="openshift-machine-config-operator/machine-config-daemon-c995m" podUID="a83beb0d-8dd1-434a-ace2-933f98e3956f" Nov 27 16:58:19 crc kubenswrapper[4707]: I1127 16:58:19.195343 4707 scope.go:117] "RemoveContainer" containerID="922f6f006f59568ee3b2de9c7061fa21de9cccddc36312f2d9ef753b4834d084" Nov 27 16:58:19 crc kubenswrapper[4707]: E1127 16:58:19.197174 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-c995m_openshift-machine-config-operator(a83beb0d-8dd1-434a-ace2-933f98e3956f)\"" pod="openshift-machine-config-operator/machine-config-daemon-c995m" podUID="a83beb0d-8dd1-434a-ace2-933f98e3956f" Nov 27 16:58:31 crc kubenswrapper[4707]: I1127 16:58:31.197532 4707 scope.go:117] "RemoveContainer" containerID="922f6f006f59568ee3b2de9c7061fa21de9cccddc36312f2d9ef753b4834d084" Nov 27 16:58:31 crc kubenswrapper[4707]: E1127 16:58:31.198692 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-c995m_openshift-machine-config-operator(a83beb0d-8dd1-434a-ace2-933f98e3956f)\"" pod="openshift-machine-config-operator/machine-config-daemon-c995m" podUID="a83beb0d-8dd1-434a-ace2-933f98e3956f" Nov 27 16:58:46 crc kubenswrapper[4707]: I1127 16:58:46.196165 4707 scope.go:117] "RemoveContainer" containerID="922f6f006f59568ee3b2de9c7061fa21de9cccddc36312f2d9ef753b4834d084" Nov 27 16:58:46 crc kubenswrapper[4707]: E1127 16:58:46.197288 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-c995m_openshift-machine-config-operator(a83beb0d-8dd1-434a-ace2-933f98e3956f)\"" pod="openshift-machine-config-operator/machine-config-daemon-c995m" podUID="a83beb0d-8dd1-434a-ace2-933f98e3956f" Nov 27 16:58:59 crc kubenswrapper[4707]: I1127 16:58:59.195811 4707 scope.go:117] "RemoveContainer" containerID="922f6f006f59568ee3b2de9c7061fa21de9cccddc36312f2d9ef753b4834d084" Nov 27 16:58:59 crc kubenswrapper[4707]: E1127 16:58:59.197070 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-c995m_openshift-machine-config-operator(a83beb0d-8dd1-434a-ace2-933f98e3956f)\"" pod="openshift-machine-config-operator/machine-config-daemon-c995m" podUID="a83beb0d-8dd1-434a-ace2-933f98e3956f" Nov 27 16:59:10 crc kubenswrapper[4707]: I1127 16:59:10.195945 4707 scope.go:117] "RemoveContainer" containerID="922f6f006f59568ee3b2de9c7061fa21de9cccddc36312f2d9ef753b4834d084" Nov 27 16:59:10 crc kubenswrapper[4707]: I1127 16:59:10.683791 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-c995m" event={"ID":"a83beb0d-8dd1-434a-ace2-933f98e3956f","Type":"ContainerStarted","Data":"abc6bd7ef76faec628dfd8360ab3a8cc28efd5faac28531e1a0d9c9576ea3f86"} Nov 27 16:59:10 crc kubenswrapper[4707]: I1127 16:59:10.703777 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/aodh-0" podStartSLOduration=96.104367069 podStartE2EDuration="1m39.703758706s" podCreationTimestamp="2025-11-27 16:57:31 +0000 UTC" firstStartedPulling="2025-11-27 16:57:32.695357914 +0000 UTC m=+3228.326806692" lastFinishedPulling="2025-11-27 16:57:36.294749551 +0000 UTC m=+3231.926198329" observedRunningTime="2025-11-27 16:57:36.640348036 +0000 UTC m=+3232.271796814" watchObservedRunningTime="2025-11-27 16:59:10.703758706 +0000 UTC m=+3326.335207474" Nov 27 16:59:37 crc kubenswrapper[4707]: I1127 16:59:37.293632 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-8665cb7d49-w7xqm_94ba089c-d890-4d57-abc0-258a2b54a6f9/manager/0.log" Nov 27 16:59:39 crc kubenswrapper[4707]: I1127 16:59:39.339083 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/prometheus-metric-storage-0"] Nov 27 16:59:39 crc kubenswrapper[4707]: I1127 16:59:39.339699 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="8635e90e-4a8c-4332-ad1c-1d46f8a15834" containerName="prometheus" containerID="cri-o://fbf62ab5cbe7e4b868486610d78d9a511e46167fd1a60ccb688d1e4be54f79ba" gracePeriod=600 Nov 27 16:59:39 crc kubenswrapper[4707]: I1127 16:59:39.340182 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="8635e90e-4a8c-4332-ad1c-1d46f8a15834" containerName="config-reloader" containerID="cri-o://f64d1db92398691d7a8708ca36a72654489b8a39e1bf795d4684d045bc67c7e5" gracePeriod=600 Nov 27 16:59:39 crc 
kubenswrapper[4707]: I1127 16:59:39.340188 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="8635e90e-4a8c-4332-ad1c-1d46f8a15834" containerName="thanos-sidecar" containerID="cri-o://8cb294eb708386b32b85e50ab0c3bf0011f9edfb03d1d38e5d3e1cc2e614cd76" gracePeriod=600 Nov 27 16:59:40 crc kubenswrapper[4707]: I1127 16:59:40.020028 4707 generic.go:334] "Generic (PLEG): container finished" podID="8635e90e-4a8c-4332-ad1c-1d46f8a15834" containerID="8cb294eb708386b32b85e50ab0c3bf0011f9edfb03d1d38e5d3e1cc2e614cd76" exitCode=0 Nov 27 16:59:40 crc kubenswrapper[4707]: I1127 16:59:40.020401 4707 generic.go:334] "Generic (PLEG): container finished" podID="8635e90e-4a8c-4332-ad1c-1d46f8a15834" containerID="f64d1db92398691d7a8708ca36a72654489b8a39e1bf795d4684d045bc67c7e5" exitCode=0 Nov 27 16:59:40 crc kubenswrapper[4707]: I1127 16:59:40.020415 4707 generic.go:334] "Generic (PLEG): container finished" podID="8635e90e-4a8c-4332-ad1c-1d46f8a15834" containerID="fbf62ab5cbe7e4b868486610d78d9a511e46167fd1a60ccb688d1e4be54f79ba" exitCode=0 Nov 27 16:59:40 crc kubenswrapper[4707]: I1127 16:59:40.020194 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"8635e90e-4a8c-4332-ad1c-1d46f8a15834","Type":"ContainerDied","Data":"8cb294eb708386b32b85e50ab0c3bf0011f9edfb03d1d38e5d3e1cc2e614cd76"} Nov 27 16:59:40 crc kubenswrapper[4707]: I1127 16:59:40.020459 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"8635e90e-4a8c-4332-ad1c-1d46f8a15834","Type":"ContainerDied","Data":"f64d1db92398691d7a8708ca36a72654489b8a39e1bf795d4684d045bc67c7e5"} Nov 27 16:59:40 crc kubenswrapper[4707]: I1127 16:59:40.020477 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" 
event={"ID":"8635e90e-4a8c-4332-ad1c-1d46f8a15834","Type":"ContainerDied","Data":"fbf62ab5cbe7e4b868486610d78d9a511e46167fd1a60ccb688d1e4be54f79ba"} Nov 27 16:59:40 crc kubenswrapper[4707]: I1127 16:59:40.463573 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Nov 27 16:59:40 crc kubenswrapper[4707]: I1127 16:59:40.633391 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v5g8g\" (UniqueName: \"kubernetes.io/projected/8635e90e-4a8c-4332-ad1c-1d46f8a15834-kube-api-access-v5g8g\") pod \"8635e90e-4a8c-4332-ad1c-1d46f8a15834\" (UID: \"8635e90e-4a8c-4332-ad1c-1d46f8a15834\") " Nov 27 16:59:40 crc kubenswrapper[4707]: I1127 16:59:40.633436 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/8635e90e-4a8c-4332-ad1c-1d46f8a15834-web-config\") pod \"8635e90e-4a8c-4332-ad1c-1d46f8a15834\" (UID: \"8635e90e-4a8c-4332-ad1c-1d46f8a15834\") " Nov 27 16:59:40 crc kubenswrapper[4707]: I1127 16:59:40.633458 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/8635e90e-4a8c-4332-ad1c-1d46f8a15834-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"8635e90e-4a8c-4332-ad1c-1d46f8a15834\" (UID: \"8635e90e-4a8c-4332-ad1c-1d46f8a15834\") " Nov 27 16:59:40 crc kubenswrapper[4707]: I1127 16:59:40.633551 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/8635e90e-4a8c-4332-ad1c-1d46f8a15834-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"8635e90e-4a8c-4332-ad1c-1d46f8a15834\" (UID: \"8635e90e-4a8c-4332-ad1c-1d46f8a15834\") " Nov 27 16:59:40 crc kubenswrapper[4707]: I1127 
16:59:40.634187 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-db\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-2e34e9c0-2aa8-41c4-8b3b-646a52e4cb09\") pod \"8635e90e-4a8c-4332-ad1c-1d46f8a15834\" (UID: \"8635e90e-4a8c-4332-ad1c-1d46f8a15834\") " Nov 27 16:59:40 crc kubenswrapper[4707]: I1127 16:59:40.634238 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/8635e90e-4a8c-4332-ad1c-1d46f8a15834-config-out\") pod \"8635e90e-4a8c-4332-ad1c-1d46f8a15834\" (UID: \"8635e90e-4a8c-4332-ad1c-1d46f8a15834\") " Nov 27 16:59:40 crc kubenswrapper[4707]: I1127 16:59:40.634285 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/8635e90e-4a8c-4332-ad1c-1d46f8a15834-thanos-prometheus-http-client-file\") pod \"8635e90e-4a8c-4332-ad1c-1d46f8a15834\" (UID: \"8635e90e-4a8c-4332-ad1c-1d46f8a15834\") " Nov 27 16:59:40 crc kubenswrapper[4707]: I1127 16:59:40.634341 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/8635e90e-4a8c-4332-ad1c-1d46f8a15834-tls-assets\") pod \"8635e90e-4a8c-4332-ad1c-1d46f8a15834\" (UID: \"8635e90e-4a8c-4332-ad1c-1d46f8a15834\") " Nov 27 16:59:40 crc kubenswrapper[4707]: I1127 16:59:40.634393 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8635e90e-4a8c-4332-ad1c-1d46f8a15834-secret-combined-ca-bundle\") pod \"8635e90e-4a8c-4332-ad1c-1d46f8a15834\" (UID: \"8635e90e-4a8c-4332-ad1c-1d46f8a15834\") " Nov 27 16:59:40 crc kubenswrapper[4707]: I1127 16:59:40.634428 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: 
\"kubernetes.io/configmap/8635e90e-4a8c-4332-ad1c-1d46f8a15834-prometheus-metric-storage-rulefiles-0\") pod \"8635e90e-4a8c-4332-ad1c-1d46f8a15834\" (UID: \"8635e90e-4a8c-4332-ad1c-1d46f8a15834\") " Nov 27 16:59:40 crc kubenswrapper[4707]: I1127 16:59:40.634497 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/8635e90e-4a8c-4332-ad1c-1d46f8a15834-config\") pod \"8635e90e-4a8c-4332-ad1c-1d46f8a15834\" (UID: \"8635e90e-4a8c-4332-ad1c-1d46f8a15834\") " Nov 27 16:59:40 crc kubenswrapper[4707]: I1127 16:59:40.635354 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8635e90e-4a8c-4332-ad1c-1d46f8a15834-prometheus-metric-storage-rulefiles-0" (OuterVolumeSpecName: "prometheus-metric-storage-rulefiles-0") pod "8635e90e-4a8c-4332-ad1c-1d46f8a15834" (UID: "8635e90e-4a8c-4332-ad1c-1d46f8a15834"). InnerVolumeSpecName "prometheus-metric-storage-rulefiles-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 16:59:40 crc kubenswrapper[4707]: I1127 16:59:40.639780 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8635e90e-4a8c-4332-ad1c-1d46f8a15834-thanos-prometheus-http-client-file" (OuterVolumeSpecName: "thanos-prometheus-http-client-file") pod "8635e90e-4a8c-4332-ad1c-1d46f8a15834" (UID: "8635e90e-4a8c-4332-ad1c-1d46f8a15834"). InnerVolumeSpecName "thanos-prometheus-http-client-file". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 16:59:40 crc kubenswrapper[4707]: I1127 16:59:40.640549 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8635e90e-4a8c-4332-ad1c-1d46f8a15834-config" (OuterVolumeSpecName: "config") pod "8635e90e-4a8c-4332-ad1c-1d46f8a15834" (UID: "8635e90e-4a8c-4332-ad1c-1d46f8a15834"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 16:59:40 crc kubenswrapper[4707]: I1127 16:59:40.642290 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8635e90e-4a8c-4332-ad1c-1d46f8a15834-config-out" (OuterVolumeSpecName: "config-out") pod "8635e90e-4a8c-4332-ad1c-1d46f8a15834" (UID: "8635e90e-4a8c-4332-ad1c-1d46f8a15834"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 27 16:59:40 crc kubenswrapper[4707]: I1127 16:59:40.644939 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8635e90e-4a8c-4332-ad1c-1d46f8a15834-secret-combined-ca-bundle" (OuterVolumeSpecName: "secret-combined-ca-bundle") pod "8635e90e-4a8c-4332-ad1c-1d46f8a15834" (UID: "8635e90e-4a8c-4332-ad1c-1d46f8a15834"). InnerVolumeSpecName "secret-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 16:59:40 crc kubenswrapper[4707]: I1127 16:59:40.644948 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8635e90e-4a8c-4332-ad1c-1d46f8a15834-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "8635e90e-4a8c-4332-ad1c-1d46f8a15834" (UID: "8635e90e-4a8c-4332-ad1c-1d46f8a15834"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 16:59:40 crc kubenswrapper[4707]: I1127 16:59:40.646248 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8635e90e-4a8c-4332-ad1c-1d46f8a15834-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d" (OuterVolumeSpecName: "web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d") pod "8635e90e-4a8c-4332-ad1c-1d46f8a15834" (UID: "8635e90e-4a8c-4332-ad1c-1d46f8a15834"). InnerVolumeSpecName "web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 16:59:40 crc kubenswrapper[4707]: I1127 16:59:40.654482 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8635e90e-4a8c-4332-ad1c-1d46f8a15834-kube-api-access-v5g8g" (OuterVolumeSpecName: "kube-api-access-v5g8g") pod "8635e90e-4a8c-4332-ad1c-1d46f8a15834" (UID: "8635e90e-4a8c-4332-ad1c-1d46f8a15834"). InnerVolumeSpecName "kube-api-access-v5g8g". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 16:59:40 crc kubenswrapper[4707]: I1127 16:59:40.662709 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8635e90e-4a8c-4332-ad1c-1d46f8a15834-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d" (OuterVolumeSpecName: "web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d") pod "8635e90e-4a8c-4332-ad1c-1d46f8a15834" (UID: "8635e90e-4a8c-4332-ad1c-1d46f8a15834"). InnerVolumeSpecName "web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 16:59:40 crc kubenswrapper[4707]: I1127 16:59:40.667995 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-2e34e9c0-2aa8-41c4-8b3b-646a52e4cb09" (OuterVolumeSpecName: "prometheus-metric-storage-db") pod "8635e90e-4a8c-4332-ad1c-1d46f8a15834" (UID: "8635e90e-4a8c-4332-ad1c-1d46f8a15834"). InnerVolumeSpecName "pvc-2e34e9c0-2aa8-41c4-8b3b-646a52e4cb09". PluginName "kubernetes.io/csi", VolumeGidValue "" Nov 27 16:59:40 crc kubenswrapper[4707]: I1127 16:59:40.725676 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8635e90e-4a8c-4332-ad1c-1d46f8a15834-web-config" (OuterVolumeSpecName: "web-config") pod "8635e90e-4a8c-4332-ad1c-1d46f8a15834" (UID: "8635e90e-4a8c-4332-ad1c-1d46f8a15834"). InnerVolumeSpecName "web-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 16:59:40 crc kubenswrapper[4707]: I1127 16:59:40.737440 4707 reconciler_common.go:293] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/8635e90e-4a8c-4332-ad1c-1d46f8a15834-tls-assets\") on node \"crc\" DevicePath \"\"" Nov 27 16:59:40 crc kubenswrapper[4707]: I1127 16:59:40.737472 4707 reconciler_common.go:293] "Volume detached for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8635e90e-4a8c-4332-ad1c-1d46f8a15834-secret-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 27 16:59:40 crc kubenswrapper[4707]: I1127 16:59:40.737484 4707 reconciler_common.go:293] "Volume detached for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/8635e90e-4a8c-4332-ad1c-1d46f8a15834-prometheus-metric-storage-rulefiles-0\") on node \"crc\" DevicePath \"\"" Nov 27 16:59:40 crc kubenswrapper[4707]: I1127 16:59:40.737493 4707 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/8635e90e-4a8c-4332-ad1c-1d46f8a15834-config\") on node \"crc\" DevicePath \"\"" Nov 27 16:59:40 crc kubenswrapper[4707]: I1127 16:59:40.737504 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v5g8g\" (UniqueName: \"kubernetes.io/projected/8635e90e-4a8c-4332-ad1c-1d46f8a15834-kube-api-access-v5g8g\") on node \"crc\" DevicePath \"\"" Nov 27 16:59:40 crc kubenswrapper[4707]: I1127 16:59:40.737512 4707 reconciler_common.go:293] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/8635e90e-4a8c-4332-ad1c-1d46f8a15834-web-config\") on node \"crc\" DevicePath \"\"" Nov 27 16:59:40 crc kubenswrapper[4707]: I1127 16:59:40.737524 4707 reconciler_common.go:293] "Volume detached for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: 
\"kubernetes.io/secret/8635e90e-4a8c-4332-ad1c-1d46f8a15834-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") on node \"crc\" DevicePath \"\"" Nov 27 16:59:40 crc kubenswrapper[4707]: I1127 16:59:40.737536 4707 reconciler_common.go:293] "Volume detached for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/8635e90e-4a8c-4332-ad1c-1d46f8a15834-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") on node \"crc\" DevicePath \"\"" Nov 27 16:59:40 crc kubenswrapper[4707]: I1127 16:59:40.737566 4707 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-2e34e9c0-2aa8-41c4-8b3b-646a52e4cb09\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-2e34e9c0-2aa8-41c4-8b3b-646a52e4cb09\") on node \"crc\" " Nov 27 16:59:40 crc kubenswrapper[4707]: I1127 16:59:40.737578 4707 reconciler_common.go:293] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/8635e90e-4a8c-4332-ad1c-1d46f8a15834-config-out\") on node \"crc\" DevicePath \"\"" Nov 27 16:59:40 crc kubenswrapper[4707]: I1127 16:59:40.737587 4707 reconciler_common.go:293] "Volume detached for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/8635e90e-4a8c-4332-ad1c-1d46f8a15834-thanos-prometheus-http-client-file\") on node \"crc\" DevicePath \"\"" Nov 27 16:59:40 crc kubenswrapper[4707]: I1127 16:59:40.766634 4707 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... 
Nov 27 16:59:40 crc kubenswrapper[4707]: I1127 16:59:40.766916 4707 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-2e34e9c0-2aa8-41c4-8b3b-646a52e4cb09" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-2e34e9c0-2aa8-41c4-8b3b-646a52e4cb09") on node "crc" Nov 27 16:59:40 crc kubenswrapper[4707]: I1127 16:59:40.858009 4707 reconciler_common.go:293] "Volume detached for volume \"pvc-2e34e9c0-2aa8-41c4-8b3b-646a52e4cb09\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-2e34e9c0-2aa8-41c4-8b3b-646a52e4cb09\") on node \"crc\" DevicePath \"\"" Nov 27 16:59:41 crc kubenswrapper[4707]: I1127 16:59:41.045713 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"8635e90e-4a8c-4332-ad1c-1d46f8a15834","Type":"ContainerDied","Data":"7f5448391d60bb775b2682bf8bbba49e98ac8af80f01cbc6fcd654b0737ac5ea"} Nov 27 16:59:41 crc kubenswrapper[4707]: I1127 16:59:41.045830 4707 scope.go:117] "RemoveContainer" containerID="8cb294eb708386b32b85e50ab0c3bf0011f9edfb03d1d38e5d3e1cc2e614cd76" Nov 27 16:59:41 crc kubenswrapper[4707]: I1127 16:59:41.045762 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/prometheus-metric-storage-0" Nov 27 16:59:41 crc kubenswrapper[4707]: I1127 16:59:41.076457 4707 scope.go:117] "RemoveContainer" containerID="f64d1db92398691d7a8708ca36a72654489b8a39e1bf795d4684d045bc67c7e5" Nov 27 16:59:41 crc kubenswrapper[4707]: I1127 16:59:41.108134 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/prometheus-metric-storage-0"] Nov 27 16:59:41 crc kubenswrapper[4707]: I1127 16:59:41.123531 4707 scope.go:117] "RemoveContainer" containerID="fbf62ab5cbe7e4b868486610d78d9a511e46167fd1a60ccb688d1e4be54f79ba" Nov 27 16:59:41 crc kubenswrapper[4707]: I1127 16:59:41.123657 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/prometheus-metric-storage-0"] Nov 27 16:59:41 crc kubenswrapper[4707]: I1127 16:59:41.163829 4707 scope.go:117] "RemoveContainer" containerID="a578aa3565790a0799de06a734dcd9d99071d9b5e1f1006addbd6b20994966fc" Nov 27 16:59:41 crc kubenswrapper[4707]: I1127 16:59:41.211269 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8635e90e-4a8c-4332-ad1c-1d46f8a15834" path="/var/lib/kubelet/pods/8635e90e-4a8c-4332-ad1c-1d46f8a15834/volumes" Nov 27 16:59:41 crc kubenswrapper[4707]: I1127 16:59:41.944193 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/prometheus-metric-storage-0"] Nov 27 16:59:41 crc kubenswrapper[4707]: E1127 16:59:41.944818 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8635e90e-4a8c-4332-ad1c-1d46f8a15834" containerName="init-config-reloader" Nov 27 16:59:41 crc kubenswrapper[4707]: I1127 16:59:41.944832 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="8635e90e-4a8c-4332-ad1c-1d46f8a15834" containerName="init-config-reloader" Nov 27 16:59:41 crc kubenswrapper[4707]: E1127 16:59:41.944842 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8635e90e-4a8c-4332-ad1c-1d46f8a15834" containerName="prometheus" Nov 27 16:59:41 crc kubenswrapper[4707]: I1127 
16:59:41.944848 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="8635e90e-4a8c-4332-ad1c-1d46f8a15834" containerName="prometheus" Nov 27 16:59:41 crc kubenswrapper[4707]: E1127 16:59:41.944878 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8635e90e-4a8c-4332-ad1c-1d46f8a15834" containerName="config-reloader" Nov 27 16:59:41 crc kubenswrapper[4707]: I1127 16:59:41.944884 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="8635e90e-4a8c-4332-ad1c-1d46f8a15834" containerName="config-reloader" Nov 27 16:59:41 crc kubenswrapper[4707]: E1127 16:59:41.944897 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8635e90e-4a8c-4332-ad1c-1d46f8a15834" containerName="thanos-sidecar" Nov 27 16:59:41 crc kubenswrapper[4707]: I1127 16:59:41.944903 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="8635e90e-4a8c-4332-ad1c-1d46f8a15834" containerName="thanos-sidecar" Nov 27 16:59:41 crc kubenswrapper[4707]: I1127 16:59:41.945093 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="8635e90e-4a8c-4332-ad1c-1d46f8a15834" containerName="prometheus" Nov 27 16:59:41 crc kubenswrapper[4707]: I1127 16:59:41.945109 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="8635e90e-4a8c-4332-ad1c-1d46f8a15834" containerName="thanos-sidecar" Nov 27 16:59:41 crc kubenswrapper[4707]: I1127 16:59:41.945121 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="8635e90e-4a8c-4332-ad1c-1d46f8a15834" containerName="config-reloader" Nov 27 16:59:41 crc kubenswrapper[4707]: I1127 16:59:41.946820 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/prometheus-metric-storage-0" Nov 27 16:59:41 crc kubenswrapper[4707]: I1127 16:59:41.953443 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-0" Nov 27 16:59:41 crc kubenswrapper[4707]: I1127 16:59:41.953474 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-web-config" Nov 27 16:59:41 crc kubenswrapper[4707]: I1127 16:59:41.954666 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage" Nov 27 16:59:41 crc kubenswrapper[4707]: I1127 16:59:41.954753 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-metric-storage-prometheus-svc" Nov 27 16:59:41 crc kubenswrapper[4707]: I1127 16:59:41.961520 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-thanos-prometheus-http-client-file" Nov 27 16:59:41 crc kubenswrapper[4707]: I1127 16:59:41.961647 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"metric-storage-prometheus-dockercfg-fmtlt" Nov 27 16:59:41 crc kubenswrapper[4707]: I1127 16:59:41.973349 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-tls-assets-0" Nov 27 16:59:41 crc kubenswrapper[4707]: I1127 16:59:41.982476 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Nov 27 16:59:42 crc kubenswrapper[4707]: I1127 16:59:42.081913 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/a4fa8e27-de6a-4475-b376-c0e8e8d82ad4-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"a4fa8e27-de6a-4475-b376-c0e8e8d82ad4\") " pod="openstack/prometheus-metric-storage-0" Nov 27 16:59:42 crc kubenswrapper[4707]: I1127 16:59:42.081993 4707 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-db\" (UniqueName: \"kubernetes.io/empty-dir/a4fa8e27-de6a-4475-b376-c0e8e8d82ad4-prometheus-metric-storage-db\") pod \"prometheus-metric-storage-0\" (UID: \"a4fa8e27-de6a-4475-b376-c0e8e8d82ad4\") " pod="openstack/prometheus-metric-storage-0" Nov 27 16:59:42 crc kubenswrapper[4707]: I1127 16:59:42.082078 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/a4fa8e27-de6a-4475-b376-c0e8e8d82ad4-config\") pod \"prometheus-metric-storage-0\" (UID: \"a4fa8e27-de6a-4475-b376-c0e8e8d82ad4\") " pod="openstack/prometheus-metric-storage-0" Nov 27 16:59:42 crc kubenswrapper[4707]: I1127 16:59:42.082144 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/a4fa8e27-de6a-4475-b376-c0e8e8d82ad4-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"a4fa8e27-de6a-4475-b376-c0e8e8d82ad4\") " pod="openstack/prometheus-metric-storage-0" Nov 27 16:59:42 crc kubenswrapper[4707]: I1127 16:59:42.082225 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4fa8e27-de6a-4475-b376-c0e8e8d82ad4-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"a4fa8e27-de6a-4475-b376-c0e8e8d82ad4\") " pod="openstack/prometheus-metric-storage-0" Nov 27 16:59:42 crc kubenswrapper[4707]: I1127 16:59:42.082285 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/a4fa8e27-de6a-4475-b376-c0e8e8d82ad4-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"a4fa8e27-de6a-4475-b376-c0e8e8d82ad4\") " 
pod="openstack/prometheus-metric-storage-0" Nov 27 16:59:42 crc kubenswrapper[4707]: I1127 16:59:42.082313 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/a4fa8e27-de6a-4475-b376-c0e8e8d82ad4-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"a4fa8e27-de6a-4475-b376-c0e8e8d82ad4\") " pod="openstack/prometheus-metric-storage-0" Nov 27 16:59:42 crc kubenswrapper[4707]: I1127 16:59:42.082361 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/a4fa8e27-de6a-4475-b376-c0e8e8d82ad4-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"a4fa8e27-de6a-4475-b376-c0e8e8d82ad4\") " pod="openstack/prometheus-metric-storage-0" Nov 27 16:59:42 crc kubenswrapper[4707]: I1127 16:59:42.082400 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/a4fa8e27-de6a-4475-b376-c0e8e8d82ad4-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"a4fa8e27-de6a-4475-b376-c0e8e8d82ad4\") " pod="openstack/prometheus-metric-storage-0" Nov 27 16:59:42 crc kubenswrapper[4707]: I1127 16:59:42.082461 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/a4fa8e27-de6a-4475-b376-c0e8e8d82ad4-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"a4fa8e27-de6a-4475-b376-c0e8e8d82ad4\") " pod="openstack/prometheus-metric-storage-0" Nov 27 16:59:42 crc kubenswrapper[4707]: I1127 16:59:42.082501 4707 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-529bp\" (UniqueName: \"kubernetes.io/projected/a4fa8e27-de6a-4475-b376-c0e8e8d82ad4-kube-api-access-529bp\") pod \"prometheus-metric-storage-0\" (UID: \"a4fa8e27-de6a-4475-b376-c0e8e8d82ad4\") " pod="openstack/prometheus-metric-storage-0" Nov 27 16:59:42 crc kubenswrapper[4707]: I1127 16:59:42.184661 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/a4fa8e27-de6a-4475-b376-c0e8e8d82ad4-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"a4fa8e27-de6a-4475-b376-c0e8e8d82ad4\") " pod="openstack/prometheus-metric-storage-0" Nov 27 16:59:42 crc kubenswrapper[4707]: I1127 16:59:42.184744 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4fa8e27-de6a-4475-b376-c0e8e8d82ad4-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"a4fa8e27-de6a-4475-b376-c0e8e8d82ad4\") " pod="openstack/prometheus-metric-storage-0" Nov 27 16:59:42 crc kubenswrapper[4707]: I1127 16:59:42.184790 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/a4fa8e27-de6a-4475-b376-c0e8e8d82ad4-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"a4fa8e27-de6a-4475-b376-c0e8e8d82ad4\") " pod="openstack/prometheus-metric-storage-0" Nov 27 16:59:42 crc kubenswrapper[4707]: I1127 16:59:42.184823 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/a4fa8e27-de6a-4475-b376-c0e8e8d82ad4-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"a4fa8e27-de6a-4475-b376-c0e8e8d82ad4\") " pod="openstack/prometheus-metric-storage-0" Nov 27 16:59:42 crc 
kubenswrapper[4707]: I1127 16:59:42.185599 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/a4fa8e27-de6a-4475-b376-c0e8e8d82ad4-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"a4fa8e27-de6a-4475-b376-c0e8e8d82ad4\") " pod="openstack/prometheus-metric-storage-0" Nov 27 16:59:42 crc kubenswrapper[4707]: I1127 16:59:42.185632 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/a4fa8e27-de6a-4475-b376-c0e8e8d82ad4-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"a4fa8e27-de6a-4475-b376-c0e8e8d82ad4\") " pod="openstack/prometheus-metric-storage-0" Nov 27 16:59:42 crc kubenswrapper[4707]: I1127 16:59:42.185685 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/a4fa8e27-de6a-4475-b376-c0e8e8d82ad4-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"a4fa8e27-de6a-4475-b376-c0e8e8d82ad4\") " pod="openstack/prometheus-metric-storage-0" Nov 27 16:59:42 crc kubenswrapper[4707]: I1127 16:59:42.185768 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-529bp\" (UniqueName: \"kubernetes.io/projected/a4fa8e27-de6a-4475-b376-c0e8e8d82ad4-kube-api-access-529bp\") pod \"prometheus-metric-storage-0\" (UID: \"a4fa8e27-de6a-4475-b376-c0e8e8d82ad4\") " pod="openstack/prometheus-metric-storage-0" Nov 27 16:59:42 crc kubenswrapper[4707]: I1127 16:59:42.185792 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/a4fa8e27-de6a-4475-b376-c0e8e8d82ad4-config-out\") pod 
\"prometheus-metric-storage-0\" (UID: \"a4fa8e27-de6a-4475-b376-c0e8e8d82ad4\") " pod="openstack/prometheus-metric-storage-0" Nov 27 16:59:42 crc kubenswrapper[4707]: I1127 16:59:42.185812 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-db\" (UniqueName: \"kubernetes.io/empty-dir/a4fa8e27-de6a-4475-b376-c0e8e8d82ad4-prometheus-metric-storage-db\") pod \"prometheus-metric-storage-0\" (UID: \"a4fa8e27-de6a-4475-b376-c0e8e8d82ad4\") " pod="openstack/prometheus-metric-storage-0" Nov 27 16:59:42 crc kubenswrapper[4707]: I1127 16:59:42.185874 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/a4fa8e27-de6a-4475-b376-c0e8e8d82ad4-config\") pod \"prometheus-metric-storage-0\" (UID: \"a4fa8e27-de6a-4475-b376-c0e8e8d82ad4\") " pod="openstack/prometheus-metric-storage-0" Nov 27 16:59:42 crc kubenswrapper[4707]: I1127 16:59:42.185917 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/a4fa8e27-de6a-4475-b376-c0e8e8d82ad4-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"a4fa8e27-de6a-4475-b376-c0e8e8d82ad4\") " pod="openstack/prometheus-metric-storage-0" Nov 27 16:59:42 crc kubenswrapper[4707]: I1127 16:59:42.186243 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-db\" (UniqueName: \"kubernetes.io/empty-dir/a4fa8e27-de6a-4475-b376-c0e8e8d82ad4-prometheus-metric-storage-db\") pod \"prometheus-metric-storage-0\" (UID: \"a4fa8e27-de6a-4475-b376-c0e8e8d82ad4\") " pod="openstack/prometheus-metric-storage-0" Nov 27 16:59:42 crc kubenswrapper[4707]: I1127 16:59:42.189135 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: 
\"kubernetes.io/secret/a4fa8e27-de6a-4475-b376-c0e8e8d82ad4-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"a4fa8e27-de6a-4475-b376-c0e8e8d82ad4\") " pod="openstack/prometheus-metric-storage-0" Nov 27 16:59:42 crc kubenswrapper[4707]: I1127 16:59:42.189577 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/a4fa8e27-de6a-4475-b376-c0e8e8d82ad4-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"a4fa8e27-de6a-4475-b376-c0e8e8d82ad4\") " pod="openstack/prometheus-metric-storage-0" Nov 27 16:59:42 crc kubenswrapper[4707]: I1127 16:59:42.189962 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/a4fa8e27-de6a-4475-b376-c0e8e8d82ad4-config\") pod \"prometheus-metric-storage-0\" (UID: \"a4fa8e27-de6a-4475-b376-c0e8e8d82ad4\") " pod="openstack/prometheus-metric-storage-0" Nov 27 16:59:42 crc kubenswrapper[4707]: I1127 16:59:42.190061 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/a4fa8e27-de6a-4475-b376-c0e8e8d82ad4-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"a4fa8e27-de6a-4475-b376-c0e8e8d82ad4\") " pod="openstack/prometheus-metric-storage-0" Nov 27 16:59:42 crc kubenswrapper[4707]: I1127 16:59:42.191822 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/a4fa8e27-de6a-4475-b376-c0e8e8d82ad4-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"a4fa8e27-de6a-4475-b376-c0e8e8d82ad4\") " pod="openstack/prometheus-metric-storage-0" Nov 27 16:59:42 crc kubenswrapper[4707]: I1127 16:59:42.192415 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/a4fa8e27-de6a-4475-b376-c0e8e8d82ad4-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"a4fa8e27-de6a-4475-b376-c0e8e8d82ad4\") " pod="openstack/prometheus-metric-storage-0" Nov 27 16:59:42 crc kubenswrapper[4707]: I1127 16:59:42.201879 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/a4fa8e27-de6a-4475-b376-c0e8e8d82ad4-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"a4fa8e27-de6a-4475-b376-c0e8e8d82ad4\") " pod="openstack/prometheus-metric-storage-0" Nov 27 16:59:42 crc kubenswrapper[4707]: I1127 16:59:42.202445 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4fa8e27-de6a-4475-b376-c0e8e8d82ad4-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"a4fa8e27-de6a-4475-b376-c0e8e8d82ad4\") " pod="openstack/prometheus-metric-storage-0" Nov 27 16:59:42 crc kubenswrapper[4707]: I1127 16:59:42.203029 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-529bp\" (UniqueName: \"kubernetes.io/projected/a4fa8e27-de6a-4475-b376-c0e8e8d82ad4-kube-api-access-529bp\") pod \"prometheus-metric-storage-0\" (UID: \"a4fa8e27-de6a-4475-b376-c0e8e8d82ad4\") " pod="openstack/prometheus-metric-storage-0" Nov 27 16:59:42 crc kubenswrapper[4707]: I1127 16:59:42.266935 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/prometheus-metric-storage-0" Nov 27 16:59:42 crc kubenswrapper[4707]: I1127 16:59:42.797577 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Nov 27 16:59:43 crc kubenswrapper[4707]: I1127 16:59:43.074900 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"a4fa8e27-de6a-4475-b376-c0e8e8d82ad4","Type":"ContainerStarted","Data":"9c4d3a8633f2903987e14b7bfb74e4fe48b0382a1c5583ed3edb50916efd9bdb"} Nov 27 16:59:47 crc kubenswrapper[4707]: I1127 16:59:47.111495 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"a4fa8e27-de6a-4475-b376-c0e8e8d82ad4","Type":"ContainerStarted","Data":"7aabd4fd4205c7083902d2bdf70c45a7e25d886d6ae0383d1ec414e6f4d6ea07"} Nov 27 16:59:54 crc kubenswrapper[4707]: I1127 16:59:54.184434 4707 generic.go:334] "Generic (PLEG): container finished" podID="a4fa8e27-de6a-4475-b376-c0e8e8d82ad4" containerID="7aabd4fd4205c7083902d2bdf70c45a7e25d886d6ae0383d1ec414e6f4d6ea07" exitCode=0 Nov 27 16:59:54 crc kubenswrapper[4707]: I1127 16:59:54.184688 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"a4fa8e27-de6a-4475-b376-c0e8e8d82ad4","Type":"ContainerDied","Data":"7aabd4fd4205c7083902d2bdf70c45a7e25d886d6ae0383d1ec414e6f4d6ea07"} Nov 27 16:59:55 crc kubenswrapper[4707]: I1127 16:59:55.193557 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"a4fa8e27-de6a-4475-b376-c0e8e8d82ad4","Type":"ContainerStarted","Data":"ae59ec07927f28e7a40b11d5fff90f71cfb0e05903ac5927ea79312ebddd67b9"} Nov 27 16:59:59 crc kubenswrapper[4707]: I1127 16:59:59.254841 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" 
event={"ID":"a4fa8e27-de6a-4475-b376-c0e8e8d82ad4","Type":"ContainerStarted","Data":"1aee67f5cd4117fcdf0b27d0fa8a145b51466f56406e4bf42cba527ea1f87e2d"} Nov 27 16:59:59 crc kubenswrapper[4707]: I1127 16:59:59.255517 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"a4fa8e27-de6a-4475-b376-c0e8e8d82ad4","Type":"ContainerStarted","Data":"ea67e6957ec36272d33a3773f89596ac9880bf42fe7044403eee87d61c828a95"} Nov 27 16:59:59 crc kubenswrapper[4707]: I1127 16:59:59.294797 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/prometheus-metric-storage-0" podStartSLOduration=18.294776029 podStartE2EDuration="18.294776029s" podCreationTimestamp="2025-11-27 16:59:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-27 16:59:59.292068221 +0000 UTC m=+3374.923516999" watchObservedRunningTime="2025-11-27 16:59:59.294776029 +0000 UTC m=+3374.926224817" Nov 27 17:00:00 crc kubenswrapper[4707]: I1127 17:00:00.161045 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29404380-h64sh"] Nov 27 17:00:00 crc kubenswrapper[4707]: I1127 17:00:00.164434 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29404380-h64sh" Nov 27 17:00:00 crc kubenswrapper[4707]: I1127 17:00:00.166229 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Nov 27 17:00:00 crc kubenswrapper[4707]: I1127 17:00:00.166626 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Nov 27 17:00:00 crc kubenswrapper[4707]: I1127 17:00:00.170204 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29404380-h64sh"] Nov 27 17:00:00 crc kubenswrapper[4707]: I1127 17:00:00.265308 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f92ce3f4-9584-4055-9c5c-3d305c031891-secret-volume\") pod \"collect-profiles-29404380-h64sh\" (UID: \"f92ce3f4-9584-4055-9c5c-3d305c031891\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29404380-h64sh" Nov 27 17:00:00 crc kubenswrapper[4707]: I1127 17:00:00.265419 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j9s2s\" (UniqueName: \"kubernetes.io/projected/f92ce3f4-9584-4055-9c5c-3d305c031891-kube-api-access-j9s2s\") pod \"collect-profiles-29404380-h64sh\" (UID: \"f92ce3f4-9584-4055-9c5c-3d305c031891\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29404380-h64sh" Nov 27 17:00:00 crc kubenswrapper[4707]: I1127 17:00:00.265541 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f92ce3f4-9584-4055-9c5c-3d305c031891-config-volume\") pod \"collect-profiles-29404380-h64sh\" (UID: \"f92ce3f4-9584-4055-9c5c-3d305c031891\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29404380-h64sh" Nov 27 17:00:00 crc kubenswrapper[4707]: I1127 17:00:00.367915 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f92ce3f4-9584-4055-9c5c-3d305c031891-config-volume\") pod \"collect-profiles-29404380-h64sh\" (UID: \"f92ce3f4-9584-4055-9c5c-3d305c031891\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29404380-h64sh" Nov 27 17:00:00 crc kubenswrapper[4707]: I1127 17:00:00.368040 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f92ce3f4-9584-4055-9c5c-3d305c031891-secret-volume\") pod \"collect-profiles-29404380-h64sh\" (UID: \"f92ce3f4-9584-4055-9c5c-3d305c031891\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29404380-h64sh" Nov 27 17:00:00 crc kubenswrapper[4707]: I1127 17:00:00.368123 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j9s2s\" (UniqueName: \"kubernetes.io/projected/f92ce3f4-9584-4055-9c5c-3d305c031891-kube-api-access-j9s2s\") pod \"collect-profiles-29404380-h64sh\" (UID: \"f92ce3f4-9584-4055-9c5c-3d305c031891\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29404380-h64sh" Nov 27 17:00:00 crc kubenswrapper[4707]: I1127 17:00:00.368939 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f92ce3f4-9584-4055-9c5c-3d305c031891-config-volume\") pod \"collect-profiles-29404380-h64sh\" (UID: \"f92ce3f4-9584-4055-9c5c-3d305c031891\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29404380-h64sh" Nov 27 17:00:00 crc kubenswrapper[4707]: I1127 17:00:00.379783 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/f92ce3f4-9584-4055-9c5c-3d305c031891-secret-volume\") pod \"collect-profiles-29404380-h64sh\" (UID: \"f92ce3f4-9584-4055-9c5c-3d305c031891\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29404380-h64sh" Nov 27 17:00:00 crc kubenswrapper[4707]: I1127 17:00:00.386904 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j9s2s\" (UniqueName: \"kubernetes.io/projected/f92ce3f4-9584-4055-9c5c-3d305c031891-kube-api-access-j9s2s\") pod \"collect-profiles-29404380-h64sh\" (UID: \"f92ce3f4-9584-4055-9c5c-3d305c031891\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29404380-h64sh" Nov 27 17:00:00 crc kubenswrapper[4707]: I1127 17:00:00.483994 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29404380-h64sh" Nov 27 17:00:00 crc kubenswrapper[4707]: I1127 17:00:00.949121 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29404380-h64sh"] Nov 27 17:00:00 crc kubenswrapper[4707]: W1127 17:00:00.950562 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf92ce3f4_9584_4055_9c5c_3d305c031891.slice/crio-4ad37c1ae832feacb1ad3b439c2ad4960be1abf90c2c80fd17805b196ac53ee6 WatchSource:0}: Error finding container 4ad37c1ae832feacb1ad3b439c2ad4960be1abf90c2c80fd17805b196ac53ee6: Status 404 returned error can't find the container with id 4ad37c1ae832feacb1ad3b439c2ad4960be1abf90c2c80fd17805b196ac53ee6 Nov 27 17:00:01 crc kubenswrapper[4707]: I1127 17:00:01.274629 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29404380-h64sh" event={"ID":"f92ce3f4-9584-4055-9c5c-3d305c031891","Type":"ContainerStarted","Data":"629355473b5c123e4a34bd288a871ca318adc1548ad2edcfff9874fc1e140fcc"} Nov 27 17:00:01 crc 
kubenswrapper[4707]: I1127 17:00:01.274672 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29404380-h64sh" event={"ID":"f92ce3f4-9584-4055-9c5c-3d305c031891","Type":"ContainerStarted","Data":"4ad37c1ae832feacb1ad3b439c2ad4960be1abf90c2c80fd17805b196ac53ee6"} Nov 27 17:00:01 crc kubenswrapper[4707]: I1127 17:00:01.295030 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29404380-h64sh" podStartSLOduration=1.2950088819999999 podStartE2EDuration="1.295008882s" podCreationTimestamp="2025-11-27 17:00:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-27 17:00:01.289205988 +0000 UTC m=+3376.920654776" watchObservedRunningTime="2025-11-27 17:00:01.295008882 +0000 UTC m=+3376.926457660" Nov 27 17:00:02 crc kubenswrapper[4707]: I1127 17:00:02.267858 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/prometheus-metric-storage-0" Nov 27 17:00:02 crc kubenswrapper[4707]: I1127 17:00:02.291356 4707 generic.go:334] "Generic (PLEG): container finished" podID="f92ce3f4-9584-4055-9c5c-3d305c031891" containerID="629355473b5c123e4a34bd288a871ca318adc1548ad2edcfff9874fc1e140fcc" exitCode=0 Nov 27 17:00:02 crc kubenswrapper[4707]: I1127 17:00:02.291440 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29404380-h64sh" event={"ID":"f92ce3f4-9584-4055-9c5c-3d305c031891","Type":"ContainerDied","Data":"629355473b5c123e4a34bd288a871ca318adc1548ad2edcfff9874fc1e140fcc"} Nov 27 17:00:03 crc kubenswrapper[4707]: I1127 17:00:03.745304 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29404380-h64sh" Nov 27 17:00:03 crc kubenswrapper[4707]: I1127 17:00:03.851553 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j9s2s\" (UniqueName: \"kubernetes.io/projected/f92ce3f4-9584-4055-9c5c-3d305c031891-kube-api-access-j9s2s\") pod \"f92ce3f4-9584-4055-9c5c-3d305c031891\" (UID: \"f92ce3f4-9584-4055-9c5c-3d305c031891\") " Nov 27 17:00:03 crc kubenswrapper[4707]: I1127 17:00:03.851669 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f92ce3f4-9584-4055-9c5c-3d305c031891-config-volume\") pod \"f92ce3f4-9584-4055-9c5c-3d305c031891\" (UID: \"f92ce3f4-9584-4055-9c5c-3d305c031891\") " Nov 27 17:00:03 crc kubenswrapper[4707]: I1127 17:00:03.851849 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f92ce3f4-9584-4055-9c5c-3d305c031891-secret-volume\") pod \"f92ce3f4-9584-4055-9c5c-3d305c031891\" (UID: \"f92ce3f4-9584-4055-9c5c-3d305c031891\") " Nov 27 17:00:03 crc kubenswrapper[4707]: I1127 17:00:03.852444 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f92ce3f4-9584-4055-9c5c-3d305c031891-config-volume" (OuterVolumeSpecName: "config-volume") pod "f92ce3f4-9584-4055-9c5c-3d305c031891" (UID: "f92ce3f4-9584-4055-9c5c-3d305c031891"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 17:00:03 crc kubenswrapper[4707]: I1127 17:00:03.853009 4707 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f92ce3f4-9584-4055-9c5c-3d305c031891-config-volume\") on node \"crc\" DevicePath \"\"" Nov 27 17:00:03 crc kubenswrapper[4707]: I1127 17:00:03.856930 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f92ce3f4-9584-4055-9c5c-3d305c031891-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "f92ce3f4-9584-4055-9c5c-3d305c031891" (UID: "f92ce3f4-9584-4055-9c5c-3d305c031891"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 17:00:03 crc kubenswrapper[4707]: I1127 17:00:03.861582 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f92ce3f4-9584-4055-9c5c-3d305c031891-kube-api-access-j9s2s" (OuterVolumeSpecName: "kube-api-access-j9s2s") pod "f92ce3f4-9584-4055-9c5c-3d305c031891" (UID: "f92ce3f4-9584-4055-9c5c-3d305c031891"). InnerVolumeSpecName "kube-api-access-j9s2s". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 17:00:03 crc kubenswrapper[4707]: I1127 17:00:03.955095 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j9s2s\" (UniqueName: \"kubernetes.io/projected/f92ce3f4-9584-4055-9c5c-3d305c031891-kube-api-access-j9s2s\") on node \"crc\" DevicePath \"\"" Nov 27 17:00:03 crc kubenswrapper[4707]: I1127 17:00:03.955136 4707 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f92ce3f4-9584-4055-9c5c-3d305c031891-secret-volume\") on node \"crc\" DevicePath \"\"" Nov 27 17:00:04 crc kubenswrapper[4707]: I1127 17:00:04.312279 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29404380-h64sh" event={"ID":"f92ce3f4-9584-4055-9c5c-3d305c031891","Type":"ContainerDied","Data":"4ad37c1ae832feacb1ad3b439c2ad4960be1abf90c2c80fd17805b196ac53ee6"} Nov 27 17:00:04 crc kubenswrapper[4707]: I1127 17:00:04.312701 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4ad37c1ae832feacb1ad3b439c2ad4960be1abf90c2c80fd17805b196ac53ee6" Nov 27 17:00:04 crc kubenswrapper[4707]: I1127 17:00:04.312351 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29404380-h64sh" Nov 27 17:00:04 crc kubenswrapper[4707]: I1127 17:00:04.355019 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29404335-sgbd8"] Nov 27 17:00:04 crc kubenswrapper[4707]: I1127 17:00:04.365091 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29404335-sgbd8"] Nov 27 17:00:05 crc kubenswrapper[4707]: I1127 17:00:05.220876 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4de6811b-581b-4c53-a730-bc307193878c" path="/var/lib/kubelet/pods/4de6811b-581b-4c53-a730-bc307193878c/volumes" Nov 27 17:00:12 crc kubenswrapper[4707]: I1127 17:00:12.267200 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/prometheus-metric-storage-0" Nov 27 17:00:12 crc kubenswrapper[4707]: I1127 17:00:12.276048 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/prometheus-metric-storage-0" Nov 27 17:00:12 crc kubenswrapper[4707]: I1127 17:00:12.413096 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/prometheus-metric-storage-0" Nov 27 17:00:14 crc kubenswrapper[4707]: I1127 17:00:14.669675 4707 scope.go:117] "RemoveContainer" containerID="0450cc571faa30b7a6197a322deeea816f75a547912e9c35dab835a6a2e2632e" Nov 27 17:01:00 crc kubenswrapper[4707]: I1127 17:01:00.160689 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-cron-29404381-kd2qt"] Nov 27 17:01:00 crc kubenswrapper[4707]: E1127 17:01:00.162465 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f92ce3f4-9584-4055-9c5c-3d305c031891" containerName="collect-profiles" Nov 27 17:01:00 crc kubenswrapper[4707]: I1127 17:01:00.162486 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="f92ce3f4-9584-4055-9c5c-3d305c031891" 
containerName="collect-profiles" Nov 27 17:01:00 crc kubenswrapper[4707]: I1127 17:01:00.162731 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="f92ce3f4-9584-4055-9c5c-3d305c031891" containerName="collect-profiles" Nov 27 17:01:00 crc kubenswrapper[4707]: I1127 17:01:00.163584 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29404381-kd2qt" Nov 27 17:01:00 crc kubenswrapper[4707]: I1127 17:01:00.181581 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29404381-kd2qt"] Nov 27 17:01:00 crc kubenswrapper[4707]: I1127 17:01:00.242202 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/87152575-e530-4043-87fb-c7e50bfa9f00-combined-ca-bundle\") pod \"keystone-cron-29404381-kd2qt\" (UID: \"87152575-e530-4043-87fb-c7e50bfa9f00\") " pod="openstack/keystone-cron-29404381-kd2qt" Nov 27 17:01:00 crc kubenswrapper[4707]: I1127 17:01:00.242357 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rg8b9\" (UniqueName: \"kubernetes.io/projected/87152575-e530-4043-87fb-c7e50bfa9f00-kube-api-access-rg8b9\") pod \"keystone-cron-29404381-kd2qt\" (UID: \"87152575-e530-4043-87fb-c7e50bfa9f00\") " pod="openstack/keystone-cron-29404381-kd2qt" Nov 27 17:01:00 crc kubenswrapper[4707]: I1127 17:01:00.242437 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/87152575-e530-4043-87fb-c7e50bfa9f00-config-data\") pod \"keystone-cron-29404381-kd2qt\" (UID: \"87152575-e530-4043-87fb-c7e50bfa9f00\") " pod="openstack/keystone-cron-29404381-kd2qt" Nov 27 17:01:00 crc kubenswrapper[4707]: I1127 17:01:00.242828 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" 
(UniqueName: \"kubernetes.io/secret/87152575-e530-4043-87fb-c7e50bfa9f00-fernet-keys\") pod \"keystone-cron-29404381-kd2qt\" (UID: \"87152575-e530-4043-87fb-c7e50bfa9f00\") " pod="openstack/keystone-cron-29404381-kd2qt" Nov 27 17:01:00 crc kubenswrapper[4707]: I1127 17:01:00.344873 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/87152575-e530-4043-87fb-c7e50bfa9f00-combined-ca-bundle\") pod \"keystone-cron-29404381-kd2qt\" (UID: \"87152575-e530-4043-87fb-c7e50bfa9f00\") " pod="openstack/keystone-cron-29404381-kd2qt" Nov 27 17:01:00 crc kubenswrapper[4707]: I1127 17:01:00.345291 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rg8b9\" (UniqueName: \"kubernetes.io/projected/87152575-e530-4043-87fb-c7e50bfa9f00-kube-api-access-rg8b9\") pod \"keystone-cron-29404381-kd2qt\" (UID: \"87152575-e530-4043-87fb-c7e50bfa9f00\") " pod="openstack/keystone-cron-29404381-kd2qt" Nov 27 17:01:00 crc kubenswrapper[4707]: I1127 17:01:00.345334 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/87152575-e530-4043-87fb-c7e50bfa9f00-config-data\") pod \"keystone-cron-29404381-kd2qt\" (UID: \"87152575-e530-4043-87fb-c7e50bfa9f00\") " pod="openstack/keystone-cron-29404381-kd2qt" Nov 27 17:01:00 crc kubenswrapper[4707]: I1127 17:01:00.345460 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/87152575-e530-4043-87fb-c7e50bfa9f00-fernet-keys\") pod \"keystone-cron-29404381-kd2qt\" (UID: \"87152575-e530-4043-87fb-c7e50bfa9f00\") " pod="openstack/keystone-cron-29404381-kd2qt" Nov 27 17:01:00 crc kubenswrapper[4707]: I1127 17:01:00.351425 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/87152575-e530-4043-87fb-c7e50bfa9f00-config-data\") pod \"keystone-cron-29404381-kd2qt\" (UID: \"87152575-e530-4043-87fb-c7e50bfa9f00\") " pod="openstack/keystone-cron-29404381-kd2qt" Nov 27 17:01:00 crc kubenswrapper[4707]: I1127 17:01:00.351753 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/87152575-e530-4043-87fb-c7e50bfa9f00-combined-ca-bundle\") pod \"keystone-cron-29404381-kd2qt\" (UID: \"87152575-e530-4043-87fb-c7e50bfa9f00\") " pod="openstack/keystone-cron-29404381-kd2qt" Nov 27 17:01:00 crc kubenswrapper[4707]: I1127 17:01:00.352528 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/87152575-e530-4043-87fb-c7e50bfa9f00-fernet-keys\") pod \"keystone-cron-29404381-kd2qt\" (UID: \"87152575-e530-4043-87fb-c7e50bfa9f00\") " pod="openstack/keystone-cron-29404381-kd2qt" Nov 27 17:01:00 crc kubenswrapper[4707]: I1127 17:01:00.367787 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rg8b9\" (UniqueName: \"kubernetes.io/projected/87152575-e530-4043-87fb-c7e50bfa9f00-kube-api-access-rg8b9\") pod \"keystone-cron-29404381-kd2qt\" (UID: \"87152575-e530-4043-87fb-c7e50bfa9f00\") " pod="openstack/keystone-cron-29404381-kd2qt" Nov 27 17:01:00 crc kubenswrapper[4707]: I1127 17:01:00.486483 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29404381-kd2qt" Nov 27 17:01:00 crc kubenswrapper[4707]: I1127 17:01:00.962010 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29404381-kd2qt"] Nov 27 17:01:01 crc kubenswrapper[4707]: I1127 17:01:01.126585 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29404381-kd2qt" event={"ID":"87152575-e530-4043-87fb-c7e50bfa9f00","Type":"ContainerStarted","Data":"7c6e039a4544d0f98df679c85f4eba8c4339fa0d22cda188ade85677bee4bedb"} Nov 27 17:01:01 crc kubenswrapper[4707]: I1127 17:01:01.263795 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-ssnjx"] Nov 27 17:01:01 crc kubenswrapper[4707]: I1127 17:01:01.265728 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-ssnjx" Nov 27 17:01:01 crc kubenswrapper[4707]: I1127 17:01:01.280250 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-ssnjx"] Nov 27 17:01:01 crc kubenswrapper[4707]: I1127 17:01:01.464695 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mt2kh\" (UniqueName: \"kubernetes.io/projected/c7921aea-6581-427f-8c05-e3162d9957d4-kube-api-access-mt2kh\") pod \"certified-operators-ssnjx\" (UID: \"c7921aea-6581-427f-8c05-e3162d9957d4\") " pod="openshift-marketplace/certified-operators-ssnjx" Nov 27 17:01:01 crc kubenswrapper[4707]: I1127 17:01:01.465142 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c7921aea-6581-427f-8c05-e3162d9957d4-utilities\") pod \"certified-operators-ssnjx\" (UID: \"c7921aea-6581-427f-8c05-e3162d9957d4\") " pod="openshift-marketplace/certified-operators-ssnjx" Nov 27 17:01:01 crc kubenswrapper[4707]: I1127 17:01:01.465253 4707 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c7921aea-6581-427f-8c05-e3162d9957d4-catalog-content\") pod \"certified-operators-ssnjx\" (UID: \"c7921aea-6581-427f-8c05-e3162d9957d4\") " pod="openshift-marketplace/certified-operators-ssnjx" Nov 27 17:01:01 crc kubenswrapper[4707]: I1127 17:01:01.566757 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c7921aea-6581-427f-8c05-e3162d9957d4-catalog-content\") pod \"certified-operators-ssnjx\" (UID: \"c7921aea-6581-427f-8c05-e3162d9957d4\") " pod="openshift-marketplace/certified-operators-ssnjx" Nov 27 17:01:01 crc kubenswrapper[4707]: I1127 17:01:01.566846 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mt2kh\" (UniqueName: \"kubernetes.io/projected/c7921aea-6581-427f-8c05-e3162d9957d4-kube-api-access-mt2kh\") pod \"certified-operators-ssnjx\" (UID: \"c7921aea-6581-427f-8c05-e3162d9957d4\") " pod="openshift-marketplace/certified-operators-ssnjx" Nov 27 17:01:01 crc kubenswrapper[4707]: I1127 17:01:01.566914 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c7921aea-6581-427f-8c05-e3162d9957d4-utilities\") pod \"certified-operators-ssnjx\" (UID: \"c7921aea-6581-427f-8c05-e3162d9957d4\") " pod="openshift-marketplace/certified-operators-ssnjx" Nov 27 17:01:01 crc kubenswrapper[4707]: I1127 17:01:01.567348 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c7921aea-6581-427f-8c05-e3162d9957d4-catalog-content\") pod \"certified-operators-ssnjx\" (UID: \"c7921aea-6581-427f-8c05-e3162d9957d4\") " pod="openshift-marketplace/certified-operators-ssnjx" Nov 27 17:01:01 crc kubenswrapper[4707]: I1127 17:01:01.567367 4707 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c7921aea-6581-427f-8c05-e3162d9957d4-utilities\") pod \"certified-operators-ssnjx\" (UID: \"c7921aea-6581-427f-8c05-e3162d9957d4\") " pod="openshift-marketplace/certified-operators-ssnjx" Nov 27 17:01:01 crc kubenswrapper[4707]: I1127 17:01:01.597451 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mt2kh\" (UniqueName: \"kubernetes.io/projected/c7921aea-6581-427f-8c05-e3162d9957d4-kube-api-access-mt2kh\") pod \"certified-operators-ssnjx\" (UID: \"c7921aea-6581-427f-8c05-e3162d9957d4\") " pod="openshift-marketplace/certified-operators-ssnjx" Nov 27 17:01:01 crc kubenswrapper[4707]: I1127 17:01:01.895585 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-ssnjx" Nov 27 17:01:02 crc kubenswrapper[4707]: I1127 17:01:02.100871 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodh-db-create-wdw97"] Nov 27 17:01:02 crc kubenswrapper[4707]: I1127 17:01:02.111932 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodh-e24b-account-create-update-8kjqj"] Nov 27 17:01:02 crc kubenswrapper[4707]: I1127 17:01:02.121410 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/aodh-db-create-wdw97"] Nov 27 17:01:02 crc kubenswrapper[4707]: I1127 17:01:02.130801 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/aodh-e24b-account-create-update-8kjqj"] Nov 27 17:01:02 crc kubenswrapper[4707]: I1127 17:01:02.167116 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29404381-kd2qt" event={"ID":"87152575-e530-4043-87fb-c7e50bfa9f00","Type":"ContainerStarted","Data":"1e8091579291705595b675ff045e409d0517fbfb764985744278b996d60a6bb2"} Nov 27 17:01:02 crc kubenswrapper[4707]: I1127 17:01:02.193627 4707 pod_startup_latency_tracker.go:104] "Observed pod 
startup duration" pod="openstack/keystone-cron-29404381-kd2qt" podStartSLOduration=2.193607776 podStartE2EDuration="2.193607776s" podCreationTimestamp="2025-11-27 17:01:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-27 17:01:02.18417694 +0000 UTC m=+3437.815625708" watchObservedRunningTime="2025-11-27 17:01:02.193607776 +0000 UTC m=+3437.825056544" Nov 27 17:01:02 crc kubenswrapper[4707]: W1127 17:01:02.467664 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc7921aea_6581_427f_8c05_e3162d9957d4.slice/crio-9f24681a1ef65f0886b76e669f904ad20ed9fa4f1a6d4f4671e20075f2a54334 WatchSource:0}: Error finding container 9f24681a1ef65f0886b76e669f904ad20ed9fa4f1a6d4f4671e20075f2a54334: Status 404 returned error can't find the container with id 9f24681a1ef65f0886b76e669f904ad20ed9fa4f1a6d4f4671e20075f2a54334 Nov 27 17:01:02 crc kubenswrapper[4707]: I1127 17:01:02.476642 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-ssnjx"] Nov 27 17:01:03 crc kubenswrapper[4707]: I1127 17:01:03.205604 4707 generic.go:334] "Generic (PLEG): container finished" podID="c7921aea-6581-427f-8c05-e3162d9957d4" containerID="d098a3b86ca452c0d701d3aa0c1cf558b5b86f5db8069c8e143df4a89bc4f900" exitCode=0 Nov 27 17:01:03 crc kubenswrapper[4707]: I1127 17:01:03.209074 4707 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Nov 27 17:01:03 crc kubenswrapper[4707]: I1127 17:01:03.231752 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4c7c7c12-1162-4e24-a569-46f941361752" path="/var/lib/kubelet/pods/4c7c7c12-1162-4e24-a569-46f941361752/volumes" Nov 27 17:01:03 crc kubenswrapper[4707]: I1127 17:01:03.232857 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="4ff61a68-9ac0-4fe7-8290-5346e39a3e3c" path="/var/lib/kubelet/pods/4ff61a68-9ac0-4fe7-8290-5346e39a3e3c/volumes" Nov 27 17:01:03 crc kubenswrapper[4707]: I1127 17:01:03.233712 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ssnjx" event={"ID":"c7921aea-6581-427f-8c05-e3162d9957d4","Type":"ContainerDied","Data":"d098a3b86ca452c0d701d3aa0c1cf558b5b86f5db8069c8e143df4a89bc4f900"} Nov 27 17:01:03 crc kubenswrapper[4707]: I1127 17:01:03.233757 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ssnjx" event={"ID":"c7921aea-6581-427f-8c05-e3162d9957d4","Type":"ContainerStarted","Data":"9f24681a1ef65f0886b76e669f904ad20ed9fa4f1a6d4f4671e20075f2a54334"} Nov 27 17:01:05 crc kubenswrapper[4707]: I1127 17:01:05.229603 4707 generic.go:334] "Generic (PLEG): container finished" podID="87152575-e530-4043-87fb-c7e50bfa9f00" containerID="1e8091579291705595b675ff045e409d0517fbfb764985744278b996d60a6bb2" exitCode=0 Nov 27 17:01:05 crc kubenswrapper[4707]: I1127 17:01:05.229726 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29404381-kd2qt" event={"ID":"87152575-e530-4043-87fb-c7e50bfa9f00","Type":"ContainerDied","Data":"1e8091579291705595b675ff045e409d0517fbfb764985744278b996d60a6bb2"} Nov 27 17:01:05 crc kubenswrapper[4707]: I1127 17:01:05.233237 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ssnjx" event={"ID":"c7921aea-6581-427f-8c05-e3162d9957d4","Type":"ContainerStarted","Data":"543a1427fe855e7cc6bec8c55da10d245ea81efad9b5489e2ec685abd979ed8a"} Nov 27 17:01:06 crc kubenswrapper[4707]: I1127 17:01:06.623228 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29404381-kd2qt" Nov 27 17:01:06 crc kubenswrapper[4707]: I1127 17:01:06.776422 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/87152575-e530-4043-87fb-c7e50bfa9f00-config-data\") pod \"87152575-e530-4043-87fb-c7e50bfa9f00\" (UID: \"87152575-e530-4043-87fb-c7e50bfa9f00\") " Nov 27 17:01:06 crc kubenswrapper[4707]: I1127 17:01:06.776631 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rg8b9\" (UniqueName: \"kubernetes.io/projected/87152575-e530-4043-87fb-c7e50bfa9f00-kube-api-access-rg8b9\") pod \"87152575-e530-4043-87fb-c7e50bfa9f00\" (UID: \"87152575-e530-4043-87fb-c7e50bfa9f00\") " Nov 27 17:01:06 crc kubenswrapper[4707]: I1127 17:01:06.776808 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/87152575-e530-4043-87fb-c7e50bfa9f00-combined-ca-bundle\") pod \"87152575-e530-4043-87fb-c7e50bfa9f00\" (UID: \"87152575-e530-4043-87fb-c7e50bfa9f00\") " Nov 27 17:01:06 crc kubenswrapper[4707]: I1127 17:01:06.777692 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/87152575-e530-4043-87fb-c7e50bfa9f00-fernet-keys\") pod \"87152575-e530-4043-87fb-c7e50bfa9f00\" (UID: \"87152575-e530-4043-87fb-c7e50bfa9f00\") " Nov 27 17:01:06 crc kubenswrapper[4707]: I1127 17:01:06.782944 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87152575-e530-4043-87fb-c7e50bfa9f00-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "87152575-e530-4043-87fb-c7e50bfa9f00" (UID: "87152575-e530-4043-87fb-c7e50bfa9f00"). InnerVolumeSpecName "fernet-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 17:01:06 crc kubenswrapper[4707]: I1127 17:01:06.799304 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87152575-e530-4043-87fb-c7e50bfa9f00-kube-api-access-rg8b9" (OuterVolumeSpecName: "kube-api-access-rg8b9") pod "87152575-e530-4043-87fb-c7e50bfa9f00" (UID: "87152575-e530-4043-87fb-c7e50bfa9f00"). InnerVolumeSpecName "kube-api-access-rg8b9". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 17:01:06 crc kubenswrapper[4707]: I1127 17:01:06.811340 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87152575-e530-4043-87fb-c7e50bfa9f00-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "87152575-e530-4043-87fb-c7e50bfa9f00" (UID: "87152575-e530-4043-87fb-c7e50bfa9f00"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 17:01:06 crc kubenswrapper[4707]: I1127 17:01:06.870336 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87152575-e530-4043-87fb-c7e50bfa9f00-config-data" (OuterVolumeSpecName: "config-data") pod "87152575-e530-4043-87fb-c7e50bfa9f00" (UID: "87152575-e530-4043-87fb-c7e50bfa9f00"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 17:01:06 crc kubenswrapper[4707]: I1127 17:01:06.879800 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/87152575-e530-4043-87fb-c7e50bfa9f00-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 27 17:01:06 crc kubenswrapper[4707]: I1127 17:01:06.879847 4707 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/87152575-e530-4043-87fb-c7e50bfa9f00-fernet-keys\") on node \"crc\" DevicePath \"\"" Nov 27 17:01:06 crc kubenswrapper[4707]: I1127 17:01:06.879858 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/87152575-e530-4043-87fb-c7e50bfa9f00-config-data\") on node \"crc\" DevicePath \"\"" Nov 27 17:01:06 crc kubenswrapper[4707]: I1127 17:01:06.879869 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rg8b9\" (UniqueName: \"kubernetes.io/projected/87152575-e530-4043-87fb-c7e50bfa9f00-kube-api-access-rg8b9\") on node \"crc\" DevicePath \"\"" Nov 27 17:01:07 crc kubenswrapper[4707]: I1127 17:01:07.263524 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29404381-kd2qt" event={"ID":"87152575-e530-4043-87fb-c7e50bfa9f00","Type":"ContainerDied","Data":"7c6e039a4544d0f98df679c85f4eba8c4339fa0d22cda188ade85677bee4bedb"} Nov 27 17:01:07 crc kubenswrapper[4707]: I1127 17:01:07.263550 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29404381-kd2qt" Nov 27 17:01:07 crc kubenswrapper[4707]: I1127 17:01:07.263565 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7c6e039a4544d0f98df679c85f4eba8c4339fa0d22cda188ade85677bee4bedb" Nov 27 17:01:07 crc kubenswrapper[4707]: I1127 17:01:07.265992 4707 generic.go:334] "Generic (PLEG): container finished" podID="c7921aea-6581-427f-8c05-e3162d9957d4" containerID="543a1427fe855e7cc6bec8c55da10d245ea81efad9b5489e2ec685abd979ed8a" exitCode=0 Nov 27 17:01:07 crc kubenswrapper[4707]: I1127 17:01:07.265996 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ssnjx" event={"ID":"c7921aea-6581-427f-8c05-e3162d9957d4","Type":"ContainerDied","Data":"543a1427fe855e7cc6bec8c55da10d245ea81efad9b5489e2ec685abd979ed8a"} Nov 27 17:01:09 crc kubenswrapper[4707]: I1127 17:01:09.287526 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ssnjx" event={"ID":"c7921aea-6581-427f-8c05-e3162d9957d4","Type":"ContainerStarted","Data":"708c50410652e1fa8a29c9f5f20cde7cda0461f90c19d2fd6c3c24a9ac1c015b"} Nov 27 17:01:09 crc kubenswrapper[4707]: I1127 17:01:09.308195 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-ssnjx" podStartSLOduration=3.402011781 podStartE2EDuration="8.308151406s" podCreationTimestamp="2025-11-27 17:01:01 +0000 UTC" firstStartedPulling="2025-11-27 17:01:03.208743702 +0000 UTC m=+3438.840192470" lastFinishedPulling="2025-11-27 17:01:08.114883317 +0000 UTC m=+3443.746332095" observedRunningTime="2025-11-27 17:01:09.30550482 +0000 UTC m=+3444.936953588" watchObservedRunningTime="2025-11-27 17:01:09.308151406 +0000 UTC m=+3444.939600174" Nov 27 17:01:11 crc kubenswrapper[4707]: I1127 17:01:11.896554 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-marketplace/certified-operators-ssnjx" Nov 27 17:01:11 crc kubenswrapper[4707]: I1127 17:01:11.897137 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-ssnjx" Nov 27 17:01:11 crc kubenswrapper[4707]: I1127 17:01:11.996447 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-ssnjx" Nov 27 17:01:14 crc kubenswrapper[4707]: I1127 17:01:14.761056 4707 scope.go:117] "RemoveContainer" containerID="11a7cbd23cb81f2810b908b1e1022edd78a9e3202bb1c260d61c6ad8248edc16" Nov 27 17:01:14 crc kubenswrapper[4707]: I1127 17:01:14.788815 4707 scope.go:117] "RemoveContainer" containerID="0109580f59af85211a67e2385a53d07d7f03e717125345c1a3e72843603fe0ac" Nov 27 17:01:19 crc kubenswrapper[4707]: I1127 17:01:19.064526 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodh-db-sync-b5jww"] Nov 27 17:01:19 crc kubenswrapper[4707]: I1127 17:01:19.078061 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/aodh-db-sync-b5jww"] Nov 27 17:01:19 crc kubenswrapper[4707]: I1127 17:01:19.208525 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="838f87e6-51f0-4745-98b6-89ecd7255df4" path="/var/lib/kubelet/pods/838f87e6-51f0-4745-98b6-89ecd7255df4/volumes" Nov 27 17:01:21 crc kubenswrapper[4707]: I1127 17:01:21.983563 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-ssnjx" Nov 27 17:01:22 crc kubenswrapper[4707]: I1127 17:01:22.053976 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-ssnjx"] Nov 27 17:01:22 crc kubenswrapper[4707]: I1127 17:01:22.427714 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-ssnjx" podUID="c7921aea-6581-427f-8c05-e3162d9957d4" containerName="registry-server" 
containerID="cri-o://708c50410652e1fa8a29c9f5f20cde7cda0461f90c19d2fd6c3c24a9ac1c015b" gracePeriod=2 Nov 27 17:01:23 crc kubenswrapper[4707]: I1127 17:01:23.145135 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-ssnjx" Nov 27 17:01:23 crc kubenswrapper[4707]: I1127 17:01:23.340432 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mt2kh\" (UniqueName: \"kubernetes.io/projected/c7921aea-6581-427f-8c05-e3162d9957d4-kube-api-access-mt2kh\") pod \"c7921aea-6581-427f-8c05-e3162d9957d4\" (UID: \"c7921aea-6581-427f-8c05-e3162d9957d4\") " Nov 27 17:01:23 crc kubenswrapper[4707]: I1127 17:01:23.340552 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c7921aea-6581-427f-8c05-e3162d9957d4-utilities\") pod \"c7921aea-6581-427f-8c05-e3162d9957d4\" (UID: \"c7921aea-6581-427f-8c05-e3162d9957d4\") " Nov 27 17:01:23 crc kubenswrapper[4707]: I1127 17:01:23.340667 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c7921aea-6581-427f-8c05-e3162d9957d4-catalog-content\") pod \"c7921aea-6581-427f-8c05-e3162d9957d4\" (UID: \"c7921aea-6581-427f-8c05-e3162d9957d4\") " Nov 27 17:01:23 crc kubenswrapper[4707]: I1127 17:01:23.341560 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c7921aea-6581-427f-8c05-e3162d9957d4-utilities" (OuterVolumeSpecName: "utilities") pod "c7921aea-6581-427f-8c05-e3162d9957d4" (UID: "c7921aea-6581-427f-8c05-e3162d9957d4"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 27 17:01:23 crc kubenswrapper[4707]: I1127 17:01:23.342054 4707 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c7921aea-6581-427f-8c05-e3162d9957d4-utilities\") on node \"crc\" DevicePath \"\"" Nov 27 17:01:23 crc kubenswrapper[4707]: I1127 17:01:23.353038 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c7921aea-6581-427f-8c05-e3162d9957d4-kube-api-access-mt2kh" (OuterVolumeSpecName: "kube-api-access-mt2kh") pod "c7921aea-6581-427f-8c05-e3162d9957d4" (UID: "c7921aea-6581-427f-8c05-e3162d9957d4"). InnerVolumeSpecName "kube-api-access-mt2kh". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 17:01:23 crc kubenswrapper[4707]: I1127 17:01:23.407264 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c7921aea-6581-427f-8c05-e3162d9957d4-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c7921aea-6581-427f-8c05-e3162d9957d4" (UID: "c7921aea-6581-427f-8c05-e3162d9957d4"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 27 17:01:23 crc kubenswrapper[4707]: I1127 17:01:23.442961 4707 generic.go:334] "Generic (PLEG): container finished" podID="c7921aea-6581-427f-8c05-e3162d9957d4" containerID="708c50410652e1fa8a29c9f5f20cde7cda0461f90c19d2fd6c3c24a9ac1c015b" exitCode=0 Nov 27 17:01:23 crc kubenswrapper[4707]: I1127 17:01:23.443012 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ssnjx" event={"ID":"c7921aea-6581-427f-8c05-e3162d9957d4","Type":"ContainerDied","Data":"708c50410652e1fa8a29c9f5f20cde7cda0461f90c19d2fd6c3c24a9ac1c015b"} Nov 27 17:01:23 crc kubenswrapper[4707]: I1127 17:01:23.443072 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ssnjx" event={"ID":"c7921aea-6581-427f-8c05-e3162d9957d4","Type":"ContainerDied","Data":"9f24681a1ef65f0886b76e669f904ad20ed9fa4f1a6d4f4671e20075f2a54334"} Nov 27 17:01:23 crc kubenswrapper[4707]: I1127 17:01:23.443097 4707 scope.go:117] "RemoveContainer" containerID="708c50410652e1fa8a29c9f5f20cde7cda0461f90c19d2fd6c3c24a9ac1c015b" Nov 27 17:01:23 crc kubenswrapper[4707]: I1127 17:01:23.443278 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mt2kh\" (UniqueName: \"kubernetes.io/projected/c7921aea-6581-427f-8c05-e3162d9957d4-kube-api-access-mt2kh\") on node \"crc\" DevicePath \"\"" Nov 27 17:01:23 crc kubenswrapper[4707]: I1127 17:01:23.443310 4707 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c7921aea-6581-427f-8c05-e3162d9957d4-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 27 17:01:23 crc kubenswrapper[4707]: I1127 17:01:23.443627 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-ssnjx" Nov 27 17:01:23 crc kubenswrapper[4707]: I1127 17:01:23.474426 4707 scope.go:117] "RemoveContainer" containerID="543a1427fe855e7cc6bec8c55da10d245ea81efad9b5489e2ec685abd979ed8a" Nov 27 17:01:23 crc kubenswrapper[4707]: I1127 17:01:23.500389 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-ssnjx"] Nov 27 17:01:23 crc kubenswrapper[4707]: I1127 17:01:23.504066 4707 scope.go:117] "RemoveContainer" containerID="d098a3b86ca452c0d701d3aa0c1cf558b5b86f5db8069c8e143df4a89bc4f900" Nov 27 17:01:23 crc kubenswrapper[4707]: I1127 17:01:23.512672 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-ssnjx"] Nov 27 17:01:23 crc kubenswrapper[4707]: I1127 17:01:23.557967 4707 scope.go:117] "RemoveContainer" containerID="708c50410652e1fa8a29c9f5f20cde7cda0461f90c19d2fd6c3c24a9ac1c015b" Nov 27 17:01:23 crc kubenswrapper[4707]: E1127 17:01:23.558630 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"708c50410652e1fa8a29c9f5f20cde7cda0461f90c19d2fd6c3c24a9ac1c015b\": container with ID starting with 708c50410652e1fa8a29c9f5f20cde7cda0461f90c19d2fd6c3c24a9ac1c015b not found: ID does not exist" containerID="708c50410652e1fa8a29c9f5f20cde7cda0461f90c19d2fd6c3c24a9ac1c015b" Nov 27 17:01:23 crc kubenswrapper[4707]: I1127 17:01:23.558675 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"708c50410652e1fa8a29c9f5f20cde7cda0461f90c19d2fd6c3c24a9ac1c015b"} err="failed to get container status \"708c50410652e1fa8a29c9f5f20cde7cda0461f90c19d2fd6c3c24a9ac1c015b\": rpc error: code = NotFound desc = could not find container \"708c50410652e1fa8a29c9f5f20cde7cda0461f90c19d2fd6c3c24a9ac1c015b\": container with ID starting with 708c50410652e1fa8a29c9f5f20cde7cda0461f90c19d2fd6c3c24a9ac1c015b not 
found: ID does not exist" Nov 27 17:01:23 crc kubenswrapper[4707]: I1127 17:01:23.558707 4707 scope.go:117] "RemoveContainer" containerID="543a1427fe855e7cc6bec8c55da10d245ea81efad9b5489e2ec685abd979ed8a" Nov 27 17:01:23 crc kubenswrapper[4707]: E1127 17:01:23.559160 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"543a1427fe855e7cc6bec8c55da10d245ea81efad9b5489e2ec685abd979ed8a\": container with ID starting with 543a1427fe855e7cc6bec8c55da10d245ea81efad9b5489e2ec685abd979ed8a not found: ID does not exist" containerID="543a1427fe855e7cc6bec8c55da10d245ea81efad9b5489e2ec685abd979ed8a" Nov 27 17:01:23 crc kubenswrapper[4707]: I1127 17:01:23.559188 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"543a1427fe855e7cc6bec8c55da10d245ea81efad9b5489e2ec685abd979ed8a"} err="failed to get container status \"543a1427fe855e7cc6bec8c55da10d245ea81efad9b5489e2ec685abd979ed8a\": rpc error: code = NotFound desc = could not find container \"543a1427fe855e7cc6bec8c55da10d245ea81efad9b5489e2ec685abd979ed8a\": container with ID starting with 543a1427fe855e7cc6bec8c55da10d245ea81efad9b5489e2ec685abd979ed8a not found: ID does not exist" Nov 27 17:01:23 crc kubenswrapper[4707]: I1127 17:01:23.559207 4707 scope.go:117] "RemoveContainer" containerID="d098a3b86ca452c0d701d3aa0c1cf558b5b86f5db8069c8e143df4a89bc4f900" Nov 27 17:01:23 crc kubenswrapper[4707]: E1127 17:01:23.559907 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d098a3b86ca452c0d701d3aa0c1cf558b5b86f5db8069c8e143df4a89bc4f900\": container with ID starting with d098a3b86ca452c0d701d3aa0c1cf558b5b86f5db8069c8e143df4a89bc4f900 not found: ID does not exist" containerID="d098a3b86ca452c0d701d3aa0c1cf558b5b86f5db8069c8e143df4a89bc4f900" Nov 27 17:01:23 crc kubenswrapper[4707]: I1127 17:01:23.559930 4707 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d098a3b86ca452c0d701d3aa0c1cf558b5b86f5db8069c8e143df4a89bc4f900"} err="failed to get container status \"d098a3b86ca452c0d701d3aa0c1cf558b5b86f5db8069c8e143df4a89bc4f900\": rpc error: code = NotFound desc = could not find container \"d098a3b86ca452c0d701d3aa0c1cf558b5b86f5db8069c8e143df4a89bc4f900\": container with ID starting with d098a3b86ca452c0d701d3aa0c1cf558b5b86f5db8069c8e143df4a89bc4f900 not found: ID does not exist" Nov 27 17:01:25 crc kubenswrapper[4707]: I1127 17:01:25.212736 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c7921aea-6581-427f-8c05-e3162d9957d4" path="/var/lib/kubelet/pods/c7921aea-6581-427f-8c05-e3162d9957d4/volumes" Nov 27 17:01:33 crc kubenswrapper[4707]: I1127 17:01:33.623485 4707 patch_prober.go:28] interesting pod/machine-config-daemon-c995m container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 27 17:01:33 crc kubenswrapper[4707]: I1127 17:01:33.624055 4707 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-c995m" podUID="a83beb0d-8dd1-434a-ace2-933f98e3956f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 27 17:01:39 crc kubenswrapper[4707]: I1127 17:01:39.135113 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-8665cb7d49-w7xqm_94ba089c-d890-4d57-abc0-258a2b54a6f9/manager/0.log" Nov 27 17:01:40 crc kubenswrapper[4707]: I1127 17:01:40.433765 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodh-0"] Nov 27 17:01:40 crc kubenswrapper[4707]: I1127 17:01:40.434413 4707 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/aodh-0" podUID="6e2538ea-8394-4e99-8f0c-74895d703440" containerName="aodh-api" containerID="cri-o://bbe473e6bba4d6205f391e056ba7ee318c81167278b2c8acc0321def0d04bf7b" gracePeriod=30 Nov 27 17:01:40 crc kubenswrapper[4707]: I1127 17:01:40.434517 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/aodh-0" podUID="6e2538ea-8394-4e99-8f0c-74895d703440" containerName="aodh-listener" containerID="cri-o://1096705202ffc4c613d9c9b4692b600d4e9a239a5637512a86d73bdf96685686" gracePeriod=30 Nov 27 17:01:40 crc kubenswrapper[4707]: I1127 17:01:40.434573 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/aodh-0" podUID="6e2538ea-8394-4e99-8f0c-74895d703440" containerName="aodh-evaluator" containerID="cri-o://c4b9fbb6df7a37225d4ce0b93d0472ccdf6d679a6531262b76a90178df52c94c" gracePeriod=30 Nov 27 17:01:40 crc kubenswrapper[4707]: I1127 17:01:40.434686 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/aodh-0" podUID="6e2538ea-8394-4e99-8f0c-74895d703440" containerName="aodh-notifier" containerID="cri-o://fcd7c3d1a0e8a381369617382222e771bb4e8d57f1514f242e0a663fa352e82e" gracePeriod=30 Nov 27 17:01:40 crc kubenswrapper[4707]: I1127 17:01:40.910215 4707 generic.go:334] "Generic (PLEG): container finished" podID="6e2538ea-8394-4e99-8f0c-74895d703440" containerID="c4b9fbb6df7a37225d4ce0b93d0472ccdf6d679a6531262b76a90178df52c94c" exitCode=0 Nov 27 17:01:40 crc kubenswrapper[4707]: I1127 17:01:40.910246 4707 generic.go:334] "Generic (PLEG): container finished" podID="6e2538ea-8394-4e99-8f0c-74895d703440" containerID="bbe473e6bba4d6205f391e056ba7ee318c81167278b2c8acc0321def0d04bf7b" exitCode=0 Nov 27 17:01:40 crc kubenswrapper[4707]: I1127 17:01:40.910265 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" 
event={"ID":"6e2538ea-8394-4e99-8f0c-74895d703440","Type":"ContainerDied","Data":"c4b9fbb6df7a37225d4ce0b93d0472ccdf6d679a6531262b76a90178df52c94c"} Nov 27 17:01:40 crc kubenswrapper[4707]: I1127 17:01:40.910289 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"6e2538ea-8394-4e99-8f0c-74895d703440","Type":"ContainerDied","Data":"bbe473e6bba4d6205f391e056ba7ee318c81167278b2c8acc0321def0d04bf7b"} Nov 27 17:01:43 crc kubenswrapper[4707]: I1127 17:01:43.944742 4707 generic.go:334] "Generic (PLEG): container finished" podID="6e2538ea-8394-4e99-8f0c-74895d703440" containerID="1096705202ffc4c613d9c9b4692b600d4e9a239a5637512a86d73bdf96685686" exitCode=0 Nov 27 17:01:43 crc kubenswrapper[4707]: I1127 17:01:43.944814 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"6e2538ea-8394-4e99-8f0c-74895d703440","Type":"ContainerDied","Data":"1096705202ffc4c613d9c9b4692b600d4e9a239a5637512a86d73bdf96685686"} Nov 27 17:01:50 crc kubenswrapper[4707]: I1127 17:01:50.017581 4707 generic.go:334] "Generic (PLEG): container finished" podID="6e2538ea-8394-4e99-8f0c-74895d703440" containerID="fcd7c3d1a0e8a381369617382222e771bb4e8d57f1514f242e0a663fa352e82e" exitCode=0 Nov 27 17:01:50 crc kubenswrapper[4707]: I1127 17:01:50.017683 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"6e2538ea-8394-4e99-8f0c-74895d703440","Type":"ContainerDied","Data":"fcd7c3d1a0e8a381369617382222e771bb4e8d57f1514f242e0a663fa352e82e"} Nov 27 17:01:50 crc kubenswrapper[4707]: I1127 17:01:50.322743 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-0" Nov 27 17:01:50 crc kubenswrapper[4707]: I1127 17:01:50.333055 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6e2538ea-8394-4e99-8f0c-74895d703440-public-tls-certs\") pod \"6e2538ea-8394-4e99-8f0c-74895d703440\" (UID: \"6e2538ea-8394-4e99-8f0c-74895d703440\") " Nov 27 17:01:50 crc kubenswrapper[4707]: I1127 17:01:50.333094 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fsvrm\" (UniqueName: \"kubernetes.io/projected/6e2538ea-8394-4e99-8f0c-74895d703440-kube-api-access-fsvrm\") pod \"6e2538ea-8394-4e99-8f0c-74895d703440\" (UID: \"6e2538ea-8394-4e99-8f0c-74895d703440\") " Nov 27 17:01:50 crc kubenswrapper[4707]: I1127 17:01:50.333229 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6e2538ea-8394-4e99-8f0c-74895d703440-internal-tls-certs\") pod \"6e2538ea-8394-4e99-8f0c-74895d703440\" (UID: \"6e2538ea-8394-4e99-8f0c-74895d703440\") " Nov 27 17:01:50 crc kubenswrapper[4707]: I1127 17:01:50.333249 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6e2538ea-8394-4e99-8f0c-74895d703440-scripts\") pod \"6e2538ea-8394-4e99-8f0c-74895d703440\" (UID: \"6e2538ea-8394-4e99-8f0c-74895d703440\") " Nov 27 17:01:50 crc kubenswrapper[4707]: I1127 17:01:50.333279 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6e2538ea-8394-4e99-8f0c-74895d703440-combined-ca-bundle\") pod \"6e2538ea-8394-4e99-8f0c-74895d703440\" (UID: \"6e2538ea-8394-4e99-8f0c-74895d703440\") " Nov 27 17:01:50 crc kubenswrapper[4707]: I1127 17:01:50.333298 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/6e2538ea-8394-4e99-8f0c-74895d703440-config-data\") pod \"6e2538ea-8394-4e99-8f0c-74895d703440\" (UID: \"6e2538ea-8394-4e99-8f0c-74895d703440\") " Nov 27 17:01:50 crc kubenswrapper[4707]: I1127 17:01:50.340611 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6e2538ea-8394-4e99-8f0c-74895d703440-kube-api-access-fsvrm" (OuterVolumeSpecName: "kube-api-access-fsvrm") pod "6e2538ea-8394-4e99-8f0c-74895d703440" (UID: "6e2538ea-8394-4e99-8f0c-74895d703440"). InnerVolumeSpecName "kube-api-access-fsvrm". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 17:01:50 crc kubenswrapper[4707]: I1127 17:01:50.349614 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6e2538ea-8394-4e99-8f0c-74895d703440-scripts" (OuterVolumeSpecName: "scripts") pod "6e2538ea-8394-4e99-8f0c-74895d703440" (UID: "6e2538ea-8394-4e99-8f0c-74895d703440"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 17:01:50 crc kubenswrapper[4707]: I1127 17:01:50.432263 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6e2538ea-8394-4e99-8f0c-74895d703440-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "6e2538ea-8394-4e99-8f0c-74895d703440" (UID: "6e2538ea-8394-4e99-8f0c-74895d703440"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 17:01:50 crc kubenswrapper[4707]: I1127 17:01:50.435985 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fsvrm\" (UniqueName: \"kubernetes.io/projected/6e2538ea-8394-4e99-8f0c-74895d703440-kube-api-access-fsvrm\") on node \"crc\" DevicePath \"\"" Nov 27 17:01:50 crc kubenswrapper[4707]: I1127 17:01:50.436008 4707 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6e2538ea-8394-4e99-8f0c-74895d703440-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Nov 27 17:01:50 crc kubenswrapper[4707]: I1127 17:01:50.436016 4707 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6e2538ea-8394-4e99-8f0c-74895d703440-scripts\") on node \"crc\" DevicePath \"\"" Nov 27 17:01:50 crc kubenswrapper[4707]: I1127 17:01:50.459112 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6e2538ea-8394-4e99-8f0c-74895d703440-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "6e2538ea-8394-4e99-8f0c-74895d703440" (UID: "6e2538ea-8394-4e99-8f0c-74895d703440"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 17:01:50 crc kubenswrapper[4707]: I1127 17:01:50.486566 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6e2538ea-8394-4e99-8f0c-74895d703440-config-data" (OuterVolumeSpecName: "config-data") pod "6e2538ea-8394-4e99-8f0c-74895d703440" (UID: "6e2538ea-8394-4e99-8f0c-74895d703440"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 17:01:50 crc kubenswrapper[4707]: I1127 17:01:50.512687 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6e2538ea-8394-4e99-8f0c-74895d703440-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6e2538ea-8394-4e99-8f0c-74895d703440" (UID: "6e2538ea-8394-4e99-8f0c-74895d703440"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 17:01:50 crc kubenswrapper[4707]: I1127 17:01:50.537760 4707 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6e2538ea-8394-4e99-8f0c-74895d703440-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 27 17:01:50 crc kubenswrapper[4707]: I1127 17:01:50.537786 4707 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6e2538ea-8394-4e99-8f0c-74895d703440-config-data\") on node \"crc\" DevicePath \"\"" Nov 27 17:01:50 crc kubenswrapper[4707]: I1127 17:01:50.537795 4707 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6e2538ea-8394-4e99-8f0c-74895d703440-public-tls-certs\") on node \"crc\" DevicePath \"\"" Nov 27 17:01:51 crc kubenswrapper[4707]: I1127 17:01:51.042044 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"6e2538ea-8394-4e99-8f0c-74895d703440","Type":"ContainerDied","Data":"2ad324d6c714e839d6623c90aa798ae2c38c69620d5e877815813f20e80770cf"} Nov 27 17:01:51 crc kubenswrapper[4707]: I1127 17:01:51.042122 4707 scope.go:117] "RemoveContainer" containerID="1096705202ffc4c613d9c9b4692b600d4e9a239a5637512a86d73bdf96685686" Nov 27 17:01:51 crc kubenswrapper[4707]: I1127 17:01:51.042286 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-0" Nov 27 17:01:51 crc kubenswrapper[4707]: I1127 17:01:51.085015 4707 scope.go:117] "RemoveContainer" containerID="fcd7c3d1a0e8a381369617382222e771bb4e8d57f1514f242e0a663fa352e82e" Nov 27 17:01:51 crc kubenswrapper[4707]: I1127 17:01:51.098458 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodh-0"] Nov 27 17:01:51 crc kubenswrapper[4707]: I1127 17:01:51.110525 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/aodh-0"] Nov 27 17:01:51 crc kubenswrapper[4707]: I1127 17:01:51.132359 4707 scope.go:117] "RemoveContainer" containerID="c4b9fbb6df7a37225d4ce0b93d0472ccdf6d679a6531262b76a90178df52c94c" Nov 27 17:01:51 crc kubenswrapper[4707]: I1127 17:01:51.132551 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/aodh-0"] Nov 27 17:01:51 crc kubenswrapper[4707]: E1127 17:01:51.133339 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="87152575-e530-4043-87fb-c7e50bfa9f00" containerName="keystone-cron" Nov 27 17:01:51 crc kubenswrapper[4707]: I1127 17:01:51.133357 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="87152575-e530-4043-87fb-c7e50bfa9f00" containerName="keystone-cron" Nov 27 17:01:51 crc kubenswrapper[4707]: E1127 17:01:51.133406 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6e2538ea-8394-4e99-8f0c-74895d703440" containerName="aodh-listener" Nov 27 17:01:51 crc kubenswrapper[4707]: I1127 17:01:51.133413 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="6e2538ea-8394-4e99-8f0c-74895d703440" containerName="aodh-listener" Nov 27 17:01:51 crc kubenswrapper[4707]: E1127 17:01:51.133430 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c7921aea-6581-427f-8c05-e3162d9957d4" containerName="extract-content" Nov 27 17:01:51 crc kubenswrapper[4707]: I1127 17:01:51.133436 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="c7921aea-6581-427f-8c05-e3162d9957d4" 
containerName="extract-content" Nov 27 17:01:51 crc kubenswrapper[4707]: E1127 17:01:51.133443 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6e2538ea-8394-4e99-8f0c-74895d703440" containerName="aodh-api" Nov 27 17:01:51 crc kubenswrapper[4707]: I1127 17:01:51.133449 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="6e2538ea-8394-4e99-8f0c-74895d703440" containerName="aodh-api" Nov 27 17:01:51 crc kubenswrapper[4707]: E1127 17:01:51.133462 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c7921aea-6581-427f-8c05-e3162d9957d4" containerName="registry-server" Nov 27 17:01:51 crc kubenswrapper[4707]: I1127 17:01:51.133468 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="c7921aea-6581-427f-8c05-e3162d9957d4" containerName="registry-server" Nov 27 17:01:51 crc kubenswrapper[4707]: E1127 17:01:51.133483 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6e2538ea-8394-4e99-8f0c-74895d703440" containerName="aodh-notifier" Nov 27 17:01:51 crc kubenswrapper[4707]: I1127 17:01:51.133489 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="6e2538ea-8394-4e99-8f0c-74895d703440" containerName="aodh-notifier" Nov 27 17:01:51 crc kubenswrapper[4707]: E1127 17:01:51.133507 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c7921aea-6581-427f-8c05-e3162d9957d4" containerName="extract-utilities" Nov 27 17:01:51 crc kubenswrapper[4707]: I1127 17:01:51.133513 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="c7921aea-6581-427f-8c05-e3162d9957d4" containerName="extract-utilities" Nov 27 17:01:51 crc kubenswrapper[4707]: E1127 17:01:51.133534 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6e2538ea-8394-4e99-8f0c-74895d703440" containerName="aodh-evaluator" Nov 27 17:01:51 crc kubenswrapper[4707]: I1127 17:01:51.133542 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="6e2538ea-8394-4e99-8f0c-74895d703440" 
containerName="aodh-evaluator" Nov 27 17:01:51 crc kubenswrapper[4707]: I1127 17:01:51.133812 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="6e2538ea-8394-4e99-8f0c-74895d703440" containerName="aodh-listener" Nov 27 17:01:51 crc kubenswrapper[4707]: I1127 17:01:51.133846 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="6e2538ea-8394-4e99-8f0c-74895d703440" containerName="aodh-api" Nov 27 17:01:51 crc kubenswrapper[4707]: I1127 17:01:51.133855 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="6e2538ea-8394-4e99-8f0c-74895d703440" containerName="aodh-evaluator" Nov 27 17:01:51 crc kubenswrapper[4707]: I1127 17:01:51.133865 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="c7921aea-6581-427f-8c05-e3162d9957d4" containerName="registry-server" Nov 27 17:01:51 crc kubenswrapper[4707]: I1127 17:01:51.133885 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="87152575-e530-4043-87fb-c7e50bfa9f00" containerName="keystone-cron" Nov 27 17:01:51 crc kubenswrapper[4707]: I1127 17:01:51.133899 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="6e2538ea-8394-4e99-8f0c-74895d703440" containerName="aodh-notifier" Nov 27 17:01:51 crc kubenswrapper[4707]: I1127 17:01:51.137460 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-0" Nov 27 17:01:51 crc kubenswrapper[4707]: I1127 17:01:51.140603 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-scripts" Nov 27 17:01:51 crc kubenswrapper[4707]: I1127 17:01:51.140792 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-aodh-public-svc" Nov 27 17:01:51 crc kubenswrapper[4707]: I1127 17:01:51.141053 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-aodh-internal-svc" Nov 27 17:01:51 crc kubenswrapper[4707]: I1127 17:01:51.141249 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-autoscaling-dockercfg-pfvsr" Nov 27 17:01:51 crc kubenswrapper[4707]: I1127 17:01:51.149006 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6282f04c-ddbf-46d2-a5ac-ba7550ff2559-public-tls-certs\") pod \"aodh-0\" (UID: \"6282f04c-ddbf-46d2-a5ac-ba7550ff2559\") " pod="openstack/aodh-0" Nov 27 17:01:51 crc kubenswrapper[4707]: I1127 17:01:51.149057 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6282f04c-ddbf-46d2-a5ac-ba7550ff2559-combined-ca-bundle\") pod \"aodh-0\" (UID: \"6282f04c-ddbf-46d2-a5ac-ba7550ff2559\") " pod="openstack/aodh-0" Nov 27 17:01:51 crc kubenswrapper[4707]: I1127 17:01:51.149108 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6282f04c-ddbf-46d2-a5ac-ba7550ff2559-internal-tls-certs\") pod \"aodh-0\" (UID: \"6282f04c-ddbf-46d2-a5ac-ba7550ff2559\") " pod="openstack/aodh-0" Nov 27 17:01:51 crc kubenswrapper[4707]: I1127 17:01:51.149136 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-s542s\" (UniqueName: \"kubernetes.io/projected/6282f04c-ddbf-46d2-a5ac-ba7550ff2559-kube-api-access-s542s\") pod \"aodh-0\" (UID: \"6282f04c-ddbf-46d2-a5ac-ba7550ff2559\") " pod="openstack/aodh-0" Nov 27 17:01:51 crc kubenswrapper[4707]: I1127 17:01:51.149227 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6282f04c-ddbf-46d2-a5ac-ba7550ff2559-config-data\") pod \"aodh-0\" (UID: \"6282f04c-ddbf-46d2-a5ac-ba7550ff2559\") " pod="openstack/aodh-0" Nov 27 17:01:51 crc kubenswrapper[4707]: I1127 17:01:51.149245 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6282f04c-ddbf-46d2-a5ac-ba7550ff2559-scripts\") pod \"aodh-0\" (UID: \"6282f04c-ddbf-46d2-a5ac-ba7550ff2559\") " pod="openstack/aodh-0" Nov 27 17:01:51 crc kubenswrapper[4707]: I1127 17:01:51.152750 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-config-data" Nov 27 17:01:51 crc kubenswrapper[4707]: I1127 17:01:51.156318 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-0"] Nov 27 17:01:51 crc kubenswrapper[4707]: I1127 17:01:51.198868 4707 scope.go:117] "RemoveContainer" containerID="bbe473e6bba4d6205f391e056ba7ee318c81167278b2c8acc0321def0d04bf7b" Nov 27 17:01:51 crc kubenswrapper[4707]: I1127 17:01:51.205614 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6e2538ea-8394-4e99-8f0c-74895d703440" path="/var/lib/kubelet/pods/6e2538ea-8394-4e99-8f0c-74895d703440/volumes" Nov 27 17:01:51 crc kubenswrapper[4707]: I1127 17:01:51.250996 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6282f04c-ddbf-46d2-a5ac-ba7550ff2559-config-data\") pod \"aodh-0\" (UID: \"6282f04c-ddbf-46d2-a5ac-ba7550ff2559\") " pod="openstack/aodh-0" Nov 27 
17:01:51 crc kubenswrapper[4707]: I1127 17:01:51.251044 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6282f04c-ddbf-46d2-a5ac-ba7550ff2559-scripts\") pod \"aodh-0\" (UID: \"6282f04c-ddbf-46d2-a5ac-ba7550ff2559\") " pod="openstack/aodh-0" Nov 27 17:01:51 crc kubenswrapper[4707]: I1127 17:01:51.251108 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6282f04c-ddbf-46d2-a5ac-ba7550ff2559-public-tls-certs\") pod \"aodh-0\" (UID: \"6282f04c-ddbf-46d2-a5ac-ba7550ff2559\") " pod="openstack/aodh-0" Nov 27 17:01:51 crc kubenswrapper[4707]: I1127 17:01:51.251132 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6282f04c-ddbf-46d2-a5ac-ba7550ff2559-combined-ca-bundle\") pod \"aodh-0\" (UID: \"6282f04c-ddbf-46d2-a5ac-ba7550ff2559\") " pod="openstack/aodh-0" Nov 27 17:01:51 crc kubenswrapper[4707]: I1127 17:01:51.251171 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6282f04c-ddbf-46d2-a5ac-ba7550ff2559-internal-tls-certs\") pod \"aodh-0\" (UID: \"6282f04c-ddbf-46d2-a5ac-ba7550ff2559\") " pod="openstack/aodh-0" Nov 27 17:01:51 crc kubenswrapper[4707]: I1127 17:01:51.251192 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s542s\" (UniqueName: \"kubernetes.io/projected/6282f04c-ddbf-46d2-a5ac-ba7550ff2559-kube-api-access-s542s\") pod \"aodh-0\" (UID: \"6282f04c-ddbf-46d2-a5ac-ba7550ff2559\") " pod="openstack/aodh-0" Nov 27 17:01:51 crc kubenswrapper[4707]: I1127 17:01:51.256934 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6282f04c-ddbf-46d2-a5ac-ba7550ff2559-combined-ca-bundle\") pod \"aodh-0\" 
(UID: \"6282f04c-ddbf-46d2-a5ac-ba7550ff2559\") " pod="openstack/aodh-0" Nov 27 17:01:51 crc kubenswrapper[4707]: I1127 17:01:51.257529 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6282f04c-ddbf-46d2-a5ac-ba7550ff2559-public-tls-certs\") pod \"aodh-0\" (UID: \"6282f04c-ddbf-46d2-a5ac-ba7550ff2559\") " pod="openstack/aodh-0" Nov 27 17:01:51 crc kubenswrapper[4707]: I1127 17:01:51.257548 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6282f04c-ddbf-46d2-a5ac-ba7550ff2559-config-data\") pod \"aodh-0\" (UID: \"6282f04c-ddbf-46d2-a5ac-ba7550ff2559\") " pod="openstack/aodh-0" Nov 27 17:01:51 crc kubenswrapper[4707]: I1127 17:01:51.258616 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6282f04c-ddbf-46d2-a5ac-ba7550ff2559-internal-tls-certs\") pod \"aodh-0\" (UID: \"6282f04c-ddbf-46d2-a5ac-ba7550ff2559\") " pod="openstack/aodh-0" Nov 27 17:01:51 crc kubenswrapper[4707]: I1127 17:01:51.264487 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6282f04c-ddbf-46d2-a5ac-ba7550ff2559-scripts\") pod \"aodh-0\" (UID: \"6282f04c-ddbf-46d2-a5ac-ba7550ff2559\") " pod="openstack/aodh-0" Nov 27 17:01:51 crc kubenswrapper[4707]: I1127 17:01:51.270097 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s542s\" (UniqueName: \"kubernetes.io/projected/6282f04c-ddbf-46d2-a5ac-ba7550ff2559-kube-api-access-s542s\") pod \"aodh-0\" (UID: \"6282f04c-ddbf-46d2-a5ac-ba7550ff2559\") " pod="openstack/aodh-0" Nov 27 17:01:51 crc kubenswrapper[4707]: I1127 17:01:51.463429 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-0" Nov 27 17:01:51 crc kubenswrapper[4707]: I1127 17:01:51.923106 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-0"] Nov 27 17:01:52 crc kubenswrapper[4707]: I1127 17:01:52.053053 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"6282f04c-ddbf-46d2-a5ac-ba7550ff2559","Type":"ContainerStarted","Data":"1f16bfcf1604aa6c4c5a1e87a62536c71d673189c5277300d5fc4ab01687fa9b"} Nov 27 17:01:54 crc kubenswrapper[4707]: I1127 17:01:54.077926 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"6282f04c-ddbf-46d2-a5ac-ba7550ff2559","Type":"ContainerStarted","Data":"26cbf491f57f7dbcf09470499d6d77c3f19bf3e5cb6231b901affe43010f88b0"} Nov 27 17:01:55 crc kubenswrapper[4707]: I1127 17:01:55.094325 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"6282f04c-ddbf-46d2-a5ac-ba7550ff2559","Type":"ContainerStarted","Data":"59e0870820ced003ea42cb24e525be116b4847f8933948acddab76db04bc83f9"} Nov 27 17:01:56 crc kubenswrapper[4707]: I1127 17:01:56.130595 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"6282f04c-ddbf-46d2-a5ac-ba7550ff2559","Type":"ContainerStarted","Data":"e4b469ae22407189989cc6ac53f40cf31743604f8ac751ac7cbf8b596585aa6a"} Nov 27 17:01:57 crc kubenswrapper[4707]: I1127 17:01:57.147667 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"6282f04c-ddbf-46d2-a5ac-ba7550ff2559","Type":"ContainerStarted","Data":"458eaf2390d5bb0370b1b929b64a343b79e4abe0920d4612cb353857d721dc29"} Nov 27 17:01:57 crc kubenswrapper[4707]: I1127 17:01:57.204782 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/aodh-0" podStartSLOduration=2.293411105 podStartE2EDuration="6.204766371s" podCreationTimestamp="2025-11-27 17:01:51 +0000 UTC" firstStartedPulling="2025-11-27 17:01:51.922040483 +0000 UTC 
m=+3487.553489251" lastFinishedPulling="2025-11-27 17:01:55.833395749 +0000 UTC m=+3491.464844517" observedRunningTime="2025-11-27 17:01:57.174626089 +0000 UTC m=+3492.806074877" watchObservedRunningTime="2025-11-27 17:01:57.204766371 +0000 UTC m=+3492.836215139" Nov 27 17:02:03 crc kubenswrapper[4707]: I1127 17:02:03.623652 4707 patch_prober.go:28] interesting pod/machine-config-daemon-c995m container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 27 17:02:03 crc kubenswrapper[4707]: I1127 17:02:03.624346 4707 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-c995m" podUID="a83beb0d-8dd1-434a-ace2-933f98e3956f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 27 17:02:14 crc kubenswrapper[4707]: I1127 17:02:14.944316 4707 scope.go:117] "RemoveContainer" containerID="1a4d6a4fc61e4ce8c025eb63ad9d6f049ed60531a3fffa24c3753a2eccd8baf7" Nov 27 17:02:28 crc kubenswrapper[4707]: I1127 17:02:28.273660 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-qfb5f"] Nov 27 17:02:28 crc kubenswrapper[4707]: I1127 17:02:28.276349 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-qfb5f" Nov 27 17:02:28 crc kubenswrapper[4707]: I1127 17:02:28.285163 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-qfb5f"] Nov 27 17:02:28 crc kubenswrapper[4707]: I1127 17:02:28.365422 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/67f76dc7-e34a-4399-8463-91303c391dcb-catalog-content\") pod \"redhat-marketplace-qfb5f\" (UID: \"67f76dc7-e34a-4399-8463-91303c391dcb\") " pod="openshift-marketplace/redhat-marketplace-qfb5f" Nov 27 17:02:28 crc kubenswrapper[4707]: I1127 17:02:28.365632 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/67f76dc7-e34a-4399-8463-91303c391dcb-utilities\") pod \"redhat-marketplace-qfb5f\" (UID: \"67f76dc7-e34a-4399-8463-91303c391dcb\") " pod="openshift-marketplace/redhat-marketplace-qfb5f" Nov 27 17:02:28 crc kubenswrapper[4707]: I1127 17:02:28.365812 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lntvp\" (UniqueName: \"kubernetes.io/projected/67f76dc7-e34a-4399-8463-91303c391dcb-kube-api-access-lntvp\") pod \"redhat-marketplace-qfb5f\" (UID: \"67f76dc7-e34a-4399-8463-91303c391dcb\") " pod="openshift-marketplace/redhat-marketplace-qfb5f" Nov 27 17:02:28 crc kubenswrapper[4707]: I1127 17:02:28.468408 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/67f76dc7-e34a-4399-8463-91303c391dcb-catalog-content\") pod \"redhat-marketplace-qfb5f\" (UID: \"67f76dc7-e34a-4399-8463-91303c391dcb\") " pod="openshift-marketplace/redhat-marketplace-qfb5f" Nov 27 17:02:28 crc kubenswrapper[4707]: I1127 17:02:28.468573 4707 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/67f76dc7-e34a-4399-8463-91303c391dcb-utilities\") pod \"redhat-marketplace-qfb5f\" (UID: \"67f76dc7-e34a-4399-8463-91303c391dcb\") " pod="openshift-marketplace/redhat-marketplace-qfb5f" Nov 27 17:02:28 crc kubenswrapper[4707]: I1127 17:02:28.468754 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lntvp\" (UniqueName: \"kubernetes.io/projected/67f76dc7-e34a-4399-8463-91303c391dcb-kube-api-access-lntvp\") pod \"redhat-marketplace-qfb5f\" (UID: \"67f76dc7-e34a-4399-8463-91303c391dcb\") " pod="openshift-marketplace/redhat-marketplace-qfb5f" Nov 27 17:02:28 crc kubenswrapper[4707]: I1127 17:02:28.468966 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/67f76dc7-e34a-4399-8463-91303c391dcb-catalog-content\") pod \"redhat-marketplace-qfb5f\" (UID: \"67f76dc7-e34a-4399-8463-91303c391dcb\") " pod="openshift-marketplace/redhat-marketplace-qfb5f" Nov 27 17:02:28 crc kubenswrapper[4707]: I1127 17:02:28.469012 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/67f76dc7-e34a-4399-8463-91303c391dcb-utilities\") pod \"redhat-marketplace-qfb5f\" (UID: \"67f76dc7-e34a-4399-8463-91303c391dcb\") " pod="openshift-marketplace/redhat-marketplace-qfb5f" Nov 27 17:02:28 crc kubenswrapper[4707]: I1127 17:02:28.492083 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lntvp\" (UniqueName: \"kubernetes.io/projected/67f76dc7-e34a-4399-8463-91303c391dcb-kube-api-access-lntvp\") pod \"redhat-marketplace-qfb5f\" (UID: \"67f76dc7-e34a-4399-8463-91303c391dcb\") " pod="openshift-marketplace/redhat-marketplace-qfb5f" Nov 27 17:02:28 crc kubenswrapper[4707]: I1127 17:02:28.623121 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-qfb5f" Nov 27 17:02:29 crc kubenswrapper[4707]: I1127 17:02:29.157521 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-qfb5f"] Nov 27 17:02:29 crc kubenswrapper[4707]: I1127 17:02:29.547524 4707 generic.go:334] "Generic (PLEG): container finished" podID="67f76dc7-e34a-4399-8463-91303c391dcb" containerID="6f3bb9030772d0ca05db4c56e8894a75f1ee970ad064b174ed0c03c24090342f" exitCode=0 Nov 27 17:02:29 crc kubenswrapper[4707]: I1127 17:02:29.547579 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qfb5f" event={"ID":"67f76dc7-e34a-4399-8463-91303c391dcb","Type":"ContainerDied","Data":"6f3bb9030772d0ca05db4c56e8894a75f1ee970ad064b174ed0c03c24090342f"} Nov 27 17:02:29 crc kubenswrapper[4707]: I1127 17:02:29.547972 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qfb5f" event={"ID":"67f76dc7-e34a-4399-8463-91303c391dcb","Type":"ContainerStarted","Data":"a010e1ff025c4fa1a9896c6398ff60223c054adfa348c28452080bcb41b52013"} Nov 27 17:02:31 crc kubenswrapper[4707]: I1127 17:02:31.570839 4707 generic.go:334] "Generic (PLEG): container finished" podID="67f76dc7-e34a-4399-8463-91303c391dcb" containerID="0e91de95fc4591c98ddff33778c50f54c495c2da49c303318faed61f8185fe1c" exitCode=0 Nov 27 17:02:31 crc kubenswrapper[4707]: I1127 17:02:31.570921 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qfb5f" event={"ID":"67f76dc7-e34a-4399-8463-91303c391dcb","Type":"ContainerDied","Data":"0e91de95fc4591c98ddff33778c50f54c495c2da49c303318faed61f8185fe1c"} Nov 27 17:02:32 crc kubenswrapper[4707]: I1127 17:02:32.581812 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qfb5f" 
event={"ID":"67f76dc7-e34a-4399-8463-91303c391dcb","Type":"ContainerStarted","Data":"fb486d854f2ccc03430c40f6aa2745582a311b2c11f65d6ecf3c59a05ec57f8e"} Nov 27 17:02:33 crc kubenswrapper[4707]: I1127 17:02:33.623851 4707 patch_prober.go:28] interesting pod/machine-config-daemon-c995m container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 27 17:02:33 crc kubenswrapper[4707]: I1127 17:02:33.624162 4707 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-c995m" podUID="a83beb0d-8dd1-434a-ace2-933f98e3956f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 27 17:02:33 crc kubenswrapper[4707]: I1127 17:02:33.624205 4707 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-c995m" Nov 27 17:02:33 crc kubenswrapper[4707]: I1127 17:02:33.624942 4707 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"abc6bd7ef76faec628dfd8360ab3a8cc28efd5faac28531e1a0d9c9576ea3f86"} pod="openshift-machine-config-operator/machine-config-daemon-c995m" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Nov 27 17:02:33 crc kubenswrapper[4707]: I1127 17:02:33.624990 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-c995m" podUID="a83beb0d-8dd1-434a-ace2-933f98e3956f" containerName="machine-config-daemon" containerID="cri-o://abc6bd7ef76faec628dfd8360ab3a8cc28efd5faac28531e1a0d9c9576ea3f86" gracePeriod=600 Nov 27 17:02:34 crc kubenswrapper[4707]: I1127 17:02:34.605920 
4707 generic.go:334] "Generic (PLEG): container finished" podID="a83beb0d-8dd1-434a-ace2-933f98e3956f" containerID="abc6bd7ef76faec628dfd8360ab3a8cc28efd5faac28531e1a0d9c9576ea3f86" exitCode=0 Nov 27 17:02:34 crc kubenswrapper[4707]: I1127 17:02:34.606466 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-c995m" event={"ID":"a83beb0d-8dd1-434a-ace2-933f98e3956f","Type":"ContainerDied","Data":"abc6bd7ef76faec628dfd8360ab3a8cc28efd5faac28531e1a0d9c9576ea3f86"} Nov 27 17:02:34 crc kubenswrapper[4707]: I1127 17:02:34.606494 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-c995m" event={"ID":"a83beb0d-8dd1-434a-ace2-933f98e3956f","Type":"ContainerStarted","Data":"786824b12a8a2896c46704bdf83468e2fc0bd47b51cb67ff29cb9d7963ea6d9c"} Nov 27 17:02:34 crc kubenswrapper[4707]: I1127 17:02:34.606565 4707 scope.go:117] "RemoveContainer" containerID="922f6f006f59568ee3b2de9c7061fa21de9cccddc36312f2d9ef753b4834d084" Nov 27 17:02:34 crc kubenswrapper[4707]: I1127 17:02:34.638660 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-qfb5f" podStartSLOduration=3.848062937 podStartE2EDuration="6.638638111s" podCreationTimestamp="2025-11-27 17:02:28 +0000 UTC" firstStartedPulling="2025-11-27 17:02:29.549213036 +0000 UTC m=+3525.180661804" lastFinishedPulling="2025-11-27 17:02:32.33978821 +0000 UTC m=+3527.971236978" observedRunningTime="2025-11-27 17:02:32.602984684 +0000 UTC m=+3528.234433452" watchObservedRunningTime="2025-11-27 17:02:34.638638111 +0000 UTC m=+3530.270086899" Nov 27 17:02:38 crc kubenswrapper[4707]: I1127 17:02:38.623652 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-qfb5f" Nov 27 17:02:38 crc kubenswrapper[4707]: I1127 17:02:38.624231 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="unhealthy" pod="openshift-marketplace/redhat-marketplace-qfb5f" Nov 27 17:02:38 crc kubenswrapper[4707]: I1127 17:02:38.673670 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-qfb5f" Nov 27 17:02:38 crc kubenswrapper[4707]: I1127 17:02:38.733768 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-qfb5f" Nov 27 17:02:38 crc kubenswrapper[4707]: I1127 17:02:38.920626 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-qfb5f"] Nov 27 17:02:40 crc kubenswrapper[4707]: I1127 17:02:40.670454 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-qfb5f" podUID="67f76dc7-e34a-4399-8463-91303c391dcb" containerName="registry-server" containerID="cri-o://fb486d854f2ccc03430c40f6aa2745582a311b2c11f65d6ecf3c59a05ec57f8e" gracePeriod=2 Nov 27 17:02:41 crc kubenswrapper[4707]: I1127 17:02:41.279514 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-qfb5f" Nov 27 17:02:41 crc kubenswrapper[4707]: I1127 17:02:41.419206 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lntvp\" (UniqueName: \"kubernetes.io/projected/67f76dc7-e34a-4399-8463-91303c391dcb-kube-api-access-lntvp\") pod \"67f76dc7-e34a-4399-8463-91303c391dcb\" (UID: \"67f76dc7-e34a-4399-8463-91303c391dcb\") " Nov 27 17:02:41 crc kubenswrapper[4707]: I1127 17:02:41.419560 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/67f76dc7-e34a-4399-8463-91303c391dcb-utilities\") pod \"67f76dc7-e34a-4399-8463-91303c391dcb\" (UID: \"67f76dc7-e34a-4399-8463-91303c391dcb\") " Nov 27 17:02:41 crc kubenswrapper[4707]: I1127 17:02:41.419775 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/67f76dc7-e34a-4399-8463-91303c391dcb-catalog-content\") pod \"67f76dc7-e34a-4399-8463-91303c391dcb\" (UID: \"67f76dc7-e34a-4399-8463-91303c391dcb\") " Nov 27 17:02:41 crc kubenswrapper[4707]: I1127 17:02:41.420615 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/67f76dc7-e34a-4399-8463-91303c391dcb-utilities" (OuterVolumeSpecName: "utilities") pod "67f76dc7-e34a-4399-8463-91303c391dcb" (UID: "67f76dc7-e34a-4399-8463-91303c391dcb"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 27 17:02:41 crc kubenswrapper[4707]: I1127 17:02:41.433866 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/67f76dc7-e34a-4399-8463-91303c391dcb-kube-api-access-lntvp" (OuterVolumeSpecName: "kube-api-access-lntvp") pod "67f76dc7-e34a-4399-8463-91303c391dcb" (UID: "67f76dc7-e34a-4399-8463-91303c391dcb"). InnerVolumeSpecName "kube-api-access-lntvp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 17:02:41 crc kubenswrapper[4707]: I1127 17:02:41.439337 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/67f76dc7-e34a-4399-8463-91303c391dcb-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "67f76dc7-e34a-4399-8463-91303c391dcb" (UID: "67f76dc7-e34a-4399-8463-91303c391dcb"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 27 17:02:41 crc kubenswrapper[4707]: I1127 17:02:41.522959 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lntvp\" (UniqueName: \"kubernetes.io/projected/67f76dc7-e34a-4399-8463-91303c391dcb-kube-api-access-lntvp\") on node \"crc\" DevicePath \"\"" Nov 27 17:02:41 crc kubenswrapper[4707]: I1127 17:02:41.523005 4707 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/67f76dc7-e34a-4399-8463-91303c391dcb-utilities\") on node \"crc\" DevicePath \"\"" Nov 27 17:02:41 crc kubenswrapper[4707]: I1127 17:02:41.523020 4707 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/67f76dc7-e34a-4399-8463-91303c391dcb-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 27 17:02:41 crc kubenswrapper[4707]: I1127 17:02:41.683820 4707 generic.go:334] "Generic (PLEG): container finished" podID="67f76dc7-e34a-4399-8463-91303c391dcb" containerID="fb486d854f2ccc03430c40f6aa2745582a311b2c11f65d6ecf3c59a05ec57f8e" exitCode=0 Nov 27 17:02:41 crc kubenswrapper[4707]: I1127 17:02:41.683878 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qfb5f" event={"ID":"67f76dc7-e34a-4399-8463-91303c391dcb","Type":"ContainerDied","Data":"fb486d854f2ccc03430c40f6aa2745582a311b2c11f65d6ecf3c59a05ec57f8e"} Nov 27 17:02:41 crc kubenswrapper[4707]: I1127 17:02:41.683919 4707 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-marketplace-qfb5f" event={"ID":"67f76dc7-e34a-4399-8463-91303c391dcb","Type":"ContainerDied","Data":"a010e1ff025c4fa1a9896c6398ff60223c054adfa348c28452080bcb41b52013"} Nov 27 17:02:41 crc kubenswrapper[4707]: I1127 17:02:41.683962 4707 scope.go:117] "RemoveContainer" containerID="fb486d854f2ccc03430c40f6aa2745582a311b2c11f65d6ecf3c59a05ec57f8e" Nov 27 17:02:41 crc kubenswrapper[4707]: I1127 17:02:41.684054 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-qfb5f" Nov 27 17:02:41 crc kubenswrapper[4707]: I1127 17:02:41.704484 4707 scope.go:117] "RemoveContainer" containerID="0e91de95fc4591c98ddff33778c50f54c495c2da49c303318faed61f8185fe1c" Nov 27 17:02:41 crc kubenswrapper[4707]: I1127 17:02:41.721148 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-qfb5f"] Nov 27 17:02:41 crc kubenswrapper[4707]: I1127 17:02:41.730507 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-qfb5f"] Nov 27 17:02:41 crc kubenswrapper[4707]: I1127 17:02:41.741883 4707 scope.go:117] "RemoveContainer" containerID="6f3bb9030772d0ca05db4c56e8894a75f1ee970ad064b174ed0c03c24090342f" Nov 27 17:02:41 crc kubenswrapper[4707]: I1127 17:02:41.791851 4707 scope.go:117] "RemoveContainer" containerID="fb486d854f2ccc03430c40f6aa2745582a311b2c11f65d6ecf3c59a05ec57f8e" Nov 27 17:02:41 crc kubenswrapper[4707]: E1127 17:02:41.792345 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fb486d854f2ccc03430c40f6aa2745582a311b2c11f65d6ecf3c59a05ec57f8e\": container with ID starting with fb486d854f2ccc03430c40f6aa2745582a311b2c11f65d6ecf3c59a05ec57f8e not found: ID does not exist" containerID="fb486d854f2ccc03430c40f6aa2745582a311b2c11f65d6ecf3c59a05ec57f8e" Nov 27 17:02:41 crc kubenswrapper[4707]: I1127 17:02:41.792419 4707 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fb486d854f2ccc03430c40f6aa2745582a311b2c11f65d6ecf3c59a05ec57f8e"} err="failed to get container status \"fb486d854f2ccc03430c40f6aa2745582a311b2c11f65d6ecf3c59a05ec57f8e\": rpc error: code = NotFound desc = could not find container \"fb486d854f2ccc03430c40f6aa2745582a311b2c11f65d6ecf3c59a05ec57f8e\": container with ID starting with fb486d854f2ccc03430c40f6aa2745582a311b2c11f65d6ecf3c59a05ec57f8e not found: ID does not exist" Nov 27 17:02:41 crc kubenswrapper[4707]: I1127 17:02:41.792452 4707 scope.go:117] "RemoveContainer" containerID="0e91de95fc4591c98ddff33778c50f54c495c2da49c303318faed61f8185fe1c" Nov 27 17:02:41 crc kubenswrapper[4707]: E1127 17:02:41.792842 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0e91de95fc4591c98ddff33778c50f54c495c2da49c303318faed61f8185fe1c\": container with ID starting with 0e91de95fc4591c98ddff33778c50f54c495c2da49c303318faed61f8185fe1c not found: ID does not exist" containerID="0e91de95fc4591c98ddff33778c50f54c495c2da49c303318faed61f8185fe1c" Nov 27 17:02:41 crc kubenswrapper[4707]: I1127 17:02:41.792899 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0e91de95fc4591c98ddff33778c50f54c495c2da49c303318faed61f8185fe1c"} err="failed to get container status \"0e91de95fc4591c98ddff33778c50f54c495c2da49c303318faed61f8185fe1c\": rpc error: code = NotFound desc = could not find container \"0e91de95fc4591c98ddff33778c50f54c495c2da49c303318faed61f8185fe1c\": container with ID starting with 0e91de95fc4591c98ddff33778c50f54c495c2da49c303318faed61f8185fe1c not found: ID does not exist" Nov 27 17:02:41 crc kubenswrapper[4707]: I1127 17:02:41.792936 4707 scope.go:117] "RemoveContainer" containerID="6f3bb9030772d0ca05db4c56e8894a75f1ee970ad064b174ed0c03c24090342f" Nov 27 17:02:41 crc kubenswrapper[4707]: E1127 
17:02:41.793269 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6f3bb9030772d0ca05db4c56e8894a75f1ee970ad064b174ed0c03c24090342f\": container with ID starting with 6f3bb9030772d0ca05db4c56e8894a75f1ee970ad064b174ed0c03c24090342f not found: ID does not exist" containerID="6f3bb9030772d0ca05db4c56e8894a75f1ee970ad064b174ed0c03c24090342f" Nov 27 17:02:41 crc kubenswrapper[4707]: I1127 17:02:41.793304 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6f3bb9030772d0ca05db4c56e8894a75f1ee970ad064b174ed0c03c24090342f"} err="failed to get container status \"6f3bb9030772d0ca05db4c56e8894a75f1ee970ad064b174ed0c03c24090342f\": rpc error: code = NotFound desc = could not find container \"6f3bb9030772d0ca05db4c56e8894a75f1ee970ad064b174ed0c03c24090342f\": container with ID starting with 6f3bb9030772d0ca05db4c56e8894a75f1ee970ad064b174ed0c03c24090342f not found: ID does not exist" Nov 27 17:02:43 crc kubenswrapper[4707]: I1127 17:02:43.211857 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="67f76dc7-e34a-4399-8463-91303c391dcb" path="/var/lib/kubelet/pods/67f76dc7-e34a-4399-8463-91303c391dcb/volumes" Nov 27 17:03:20 crc kubenswrapper[4707]: I1127 17:03:20.967802 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-8hk9m"] Nov 27 17:03:20 crc kubenswrapper[4707]: E1127 17:03:20.969019 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="67f76dc7-e34a-4399-8463-91303c391dcb" containerName="extract-content" Nov 27 17:03:20 crc kubenswrapper[4707]: I1127 17:03:20.969041 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="67f76dc7-e34a-4399-8463-91303c391dcb" containerName="extract-content" Nov 27 17:03:20 crc kubenswrapper[4707]: E1127 17:03:20.969082 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="67f76dc7-e34a-4399-8463-91303c391dcb" 
containerName="extract-utilities" Nov 27 17:03:20 crc kubenswrapper[4707]: I1127 17:03:20.969094 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="67f76dc7-e34a-4399-8463-91303c391dcb" containerName="extract-utilities" Nov 27 17:03:20 crc kubenswrapper[4707]: E1127 17:03:20.969134 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="67f76dc7-e34a-4399-8463-91303c391dcb" containerName="registry-server" Nov 27 17:03:20 crc kubenswrapper[4707]: I1127 17:03:20.969146 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="67f76dc7-e34a-4399-8463-91303c391dcb" containerName="registry-server" Nov 27 17:03:20 crc kubenswrapper[4707]: I1127 17:03:20.969509 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="67f76dc7-e34a-4399-8463-91303c391dcb" containerName="registry-server" Nov 27 17:03:20 crc kubenswrapper[4707]: I1127 17:03:20.971875 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-8hk9m" Nov 27 17:03:20 crc kubenswrapper[4707]: I1127 17:03:20.996925 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-8hk9m"] Nov 27 17:03:21 crc kubenswrapper[4707]: I1127 17:03:21.085187 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5d6d9704-65c5-489c-ac3e-a8296fdc2f3a-utilities\") pod \"redhat-operators-8hk9m\" (UID: \"5d6d9704-65c5-489c-ac3e-a8296fdc2f3a\") " pod="openshift-marketplace/redhat-operators-8hk9m" Nov 27 17:03:21 crc kubenswrapper[4707]: I1127 17:03:21.085245 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-crghg\" (UniqueName: \"kubernetes.io/projected/5d6d9704-65c5-489c-ac3e-a8296fdc2f3a-kube-api-access-crghg\") pod \"redhat-operators-8hk9m\" (UID: \"5d6d9704-65c5-489c-ac3e-a8296fdc2f3a\") " 
pod="openshift-marketplace/redhat-operators-8hk9m" Nov 27 17:03:21 crc kubenswrapper[4707]: I1127 17:03:21.085613 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5d6d9704-65c5-489c-ac3e-a8296fdc2f3a-catalog-content\") pod \"redhat-operators-8hk9m\" (UID: \"5d6d9704-65c5-489c-ac3e-a8296fdc2f3a\") " pod="openshift-marketplace/redhat-operators-8hk9m" Nov 27 17:03:21 crc kubenswrapper[4707]: I1127 17:03:21.186925 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5d6d9704-65c5-489c-ac3e-a8296fdc2f3a-catalog-content\") pod \"redhat-operators-8hk9m\" (UID: \"5d6d9704-65c5-489c-ac3e-a8296fdc2f3a\") " pod="openshift-marketplace/redhat-operators-8hk9m" Nov 27 17:03:21 crc kubenswrapper[4707]: I1127 17:03:21.187063 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5d6d9704-65c5-489c-ac3e-a8296fdc2f3a-utilities\") pod \"redhat-operators-8hk9m\" (UID: \"5d6d9704-65c5-489c-ac3e-a8296fdc2f3a\") " pod="openshift-marketplace/redhat-operators-8hk9m" Nov 27 17:03:21 crc kubenswrapper[4707]: I1127 17:03:21.187079 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-crghg\" (UniqueName: \"kubernetes.io/projected/5d6d9704-65c5-489c-ac3e-a8296fdc2f3a-kube-api-access-crghg\") pod \"redhat-operators-8hk9m\" (UID: \"5d6d9704-65c5-489c-ac3e-a8296fdc2f3a\") " pod="openshift-marketplace/redhat-operators-8hk9m" Nov 27 17:03:21 crc kubenswrapper[4707]: I1127 17:03:21.187503 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5d6d9704-65c5-489c-ac3e-a8296fdc2f3a-catalog-content\") pod \"redhat-operators-8hk9m\" (UID: \"5d6d9704-65c5-489c-ac3e-a8296fdc2f3a\") " 
pod="openshift-marketplace/redhat-operators-8hk9m" Nov 27 17:03:21 crc kubenswrapper[4707]: I1127 17:03:21.187574 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5d6d9704-65c5-489c-ac3e-a8296fdc2f3a-utilities\") pod \"redhat-operators-8hk9m\" (UID: \"5d6d9704-65c5-489c-ac3e-a8296fdc2f3a\") " pod="openshift-marketplace/redhat-operators-8hk9m" Nov 27 17:03:21 crc kubenswrapper[4707]: I1127 17:03:21.207156 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-crghg\" (UniqueName: \"kubernetes.io/projected/5d6d9704-65c5-489c-ac3e-a8296fdc2f3a-kube-api-access-crghg\") pod \"redhat-operators-8hk9m\" (UID: \"5d6d9704-65c5-489c-ac3e-a8296fdc2f3a\") " pod="openshift-marketplace/redhat-operators-8hk9m" Nov 27 17:03:21 crc kubenswrapper[4707]: I1127 17:03:21.327416 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-8hk9m" Nov 27 17:03:21 crc kubenswrapper[4707]: I1127 17:03:21.827875 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-8hk9m"] Nov 27 17:03:22 crc kubenswrapper[4707]: I1127 17:03:22.130000 4707 generic.go:334] "Generic (PLEG): container finished" podID="5d6d9704-65c5-489c-ac3e-a8296fdc2f3a" containerID="a07fb1a977815a331cd35a1e52195b8db9f600b3c0d47bb339fb1fe7db3c7413" exitCode=0 Nov 27 17:03:22 crc kubenswrapper[4707]: I1127 17:03:22.131546 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8hk9m" event={"ID":"5d6d9704-65c5-489c-ac3e-a8296fdc2f3a","Type":"ContainerDied","Data":"a07fb1a977815a331cd35a1e52195b8db9f600b3c0d47bb339fb1fe7db3c7413"} Nov 27 17:03:22 crc kubenswrapper[4707]: I1127 17:03:22.131652 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8hk9m" 
event={"ID":"5d6d9704-65c5-489c-ac3e-a8296fdc2f3a","Type":"ContainerStarted","Data":"547b500fe5fe79a28f1c5a40343199979a4038c7ebfbf9b77c9a68a26b3c766d"} Nov 27 17:03:24 crc kubenswrapper[4707]: I1127 17:03:24.161795 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8hk9m" event={"ID":"5d6d9704-65c5-489c-ac3e-a8296fdc2f3a","Type":"ContainerStarted","Data":"0eae15b003fb6fc0a3b10d14906039690871f73f721784c378739def579eecfe"} Nov 27 17:03:34 crc kubenswrapper[4707]: I1127 17:03:34.274967 4707 generic.go:334] "Generic (PLEG): container finished" podID="5d6d9704-65c5-489c-ac3e-a8296fdc2f3a" containerID="0eae15b003fb6fc0a3b10d14906039690871f73f721784c378739def579eecfe" exitCode=0 Nov 27 17:03:34 crc kubenswrapper[4707]: I1127 17:03:34.275051 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8hk9m" event={"ID":"5d6d9704-65c5-489c-ac3e-a8296fdc2f3a","Type":"ContainerDied","Data":"0eae15b003fb6fc0a3b10d14906039690871f73f721784c378739def579eecfe"} Nov 27 17:03:37 crc kubenswrapper[4707]: I1127 17:03:37.313089 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8hk9m" event={"ID":"5d6d9704-65c5-489c-ac3e-a8296fdc2f3a","Type":"ContainerStarted","Data":"176cdddf1e211c69102e52596911d694df01e55e42454f3bcc8fe303c4824946"} Nov 27 17:03:37 crc kubenswrapper[4707]: I1127 17:03:37.331487 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-8hk9m" podStartSLOduration=3.1770457 podStartE2EDuration="17.331469609s" podCreationTimestamp="2025-11-27 17:03:20 +0000 UTC" firstStartedPulling="2025-11-27 17:03:22.133405692 +0000 UTC m=+3577.764854460" lastFinishedPulling="2025-11-27 17:03:36.287829591 +0000 UTC m=+3591.919278369" observedRunningTime="2025-11-27 17:03:37.327422128 +0000 UTC m=+3592.958870926" watchObservedRunningTime="2025-11-27 17:03:37.331469609 +0000 UTC m=+3592.962918377" 
Nov 27 17:03:40 crc kubenswrapper[4707]: I1127 17:03:40.851276 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-8665cb7d49-w7xqm_94ba089c-d890-4d57-abc0-258a2b54a6f9/manager/0.log" Nov 27 17:03:41 crc kubenswrapper[4707]: I1127 17:03:41.327921 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-8hk9m" Nov 27 17:03:41 crc kubenswrapper[4707]: I1127 17:03:41.328159 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-8hk9m" Nov 27 17:03:42 crc kubenswrapper[4707]: I1127 17:03:42.384702 4707 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-8hk9m" podUID="5d6d9704-65c5-489c-ac3e-a8296fdc2f3a" containerName="registry-server" probeResult="failure" output=< Nov 27 17:03:42 crc kubenswrapper[4707]: timeout: failed to connect service ":50051" within 1s Nov 27 17:03:42 crc kubenswrapper[4707]: > Nov 27 17:03:44 crc kubenswrapper[4707]: I1127 17:03:44.414732 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/prometheus-metric-storage-0"] Nov 27 17:03:44 crc kubenswrapper[4707]: I1127 17:03:44.415384 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="a4fa8e27-de6a-4475-b376-c0e8e8d82ad4" containerName="prometheus" containerID="cri-o://ae59ec07927f28e7a40b11d5fff90f71cfb0e05903ac5927ea79312ebddd67b9" gracePeriod=600 Nov 27 17:03:44 crc kubenswrapper[4707]: I1127 17:03:44.416116 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="a4fa8e27-de6a-4475-b376-c0e8e8d82ad4" containerName="thanos-sidecar" containerID="cri-o://1aee67f5cd4117fcdf0b27d0fa8a145b51466f56406e4bf42cba527ea1f87e2d" gracePeriod=600 Nov 27 17:03:44 crc kubenswrapper[4707]: I1127 17:03:44.416163 4707 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="a4fa8e27-de6a-4475-b376-c0e8e8d82ad4" containerName="config-reloader" containerID="cri-o://ea67e6957ec36272d33a3773f89596ac9880bf42fe7044403eee87d61c828a95" gracePeriod=600 Nov 27 17:03:45 crc kubenswrapper[4707]: I1127 17:03:45.400465 4707 generic.go:334] "Generic (PLEG): container finished" podID="a4fa8e27-de6a-4475-b376-c0e8e8d82ad4" containerID="1aee67f5cd4117fcdf0b27d0fa8a145b51466f56406e4bf42cba527ea1f87e2d" exitCode=0 Nov 27 17:03:45 crc kubenswrapper[4707]: I1127 17:03:45.400879 4707 generic.go:334] "Generic (PLEG): container finished" podID="a4fa8e27-de6a-4475-b376-c0e8e8d82ad4" containerID="ea67e6957ec36272d33a3773f89596ac9880bf42fe7044403eee87d61c828a95" exitCode=0 Nov 27 17:03:45 crc kubenswrapper[4707]: I1127 17:03:45.400892 4707 generic.go:334] "Generic (PLEG): container finished" podID="a4fa8e27-de6a-4475-b376-c0e8e8d82ad4" containerID="ae59ec07927f28e7a40b11d5fff90f71cfb0e05903ac5927ea79312ebddd67b9" exitCode=0 Nov 27 17:03:45 crc kubenswrapper[4707]: I1127 17:03:45.400545 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"a4fa8e27-de6a-4475-b376-c0e8e8d82ad4","Type":"ContainerDied","Data":"1aee67f5cd4117fcdf0b27d0fa8a145b51466f56406e4bf42cba527ea1f87e2d"} Nov 27 17:03:45 crc kubenswrapper[4707]: I1127 17:03:45.400940 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"a4fa8e27-de6a-4475-b376-c0e8e8d82ad4","Type":"ContainerDied","Data":"ea67e6957ec36272d33a3773f89596ac9880bf42fe7044403eee87d61c828a95"} Nov 27 17:03:45 crc kubenswrapper[4707]: I1127 17:03:45.400955 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"a4fa8e27-de6a-4475-b376-c0e8e8d82ad4","Type":"ContainerDied","Data":"ae59ec07927f28e7a40b11d5fff90f71cfb0e05903ac5927ea79312ebddd67b9"} Nov 
27 17:03:45 crc kubenswrapper[4707]: I1127 17:03:45.400967 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"a4fa8e27-de6a-4475-b376-c0e8e8d82ad4","Type":"ContainerDied","Data":"9c4d3a8633f2903987e14b7bfb74e4fe48b0382a1c5583ed3edb50916efd9bdb"} Nov 27 17:03:45 crc kubenswrapper[4707]: I1127 17:03:45.400979 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9c4d3a8633f2903987e14b7bfb74e4fe48b0382a1c5583ed3edb50916efd9bdb" Nov 27 17:03:45 crc kubenswrapper[4707]: I1127 17:03:45.459438 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Nov 27 17:03:45 crc kubenswrapper[4707]: I1127 17:03:45.520948 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/a4fa8e27-de6a-4475-b376-c0e8e8d82ad4-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"a4fa8e27-de6a-4475-b376-c0e8e8d82ad4\" (UID: \"a4fa8e27-de6a-4475-b376-c0e8e8d82ad4\") " Nov 27 17:03:45 crc kubenswrapper[4707]: I1127 17:03:45.521034 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-529bp\" (UniqueName: \"kubernetes.io/projected/a4fa8e27-de6a-4475-b376-c0e8e8d82ad4-kube-api-access-529bp\") pod \"a4fa8e27-de6a-4475-b376-c0e8e8d82ad4\" (UID: \"a4fa8e27-de6a-4475-b376-c0e8e8d82ad4\") " Nov 27 17:03:45 crc kubenswrapper[4707]: I1127 17:03:45.521068 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-db\" (UniqueName: \"kubernetes.io/empty-dir/a4fa8e27-de6a-4475-b376-c0e8e8d82ad4-prometheus-metric-storage-db\") pod \"a4fa8e27-de6a-4475-b376-c0e8e8d82ad4\" (UID: \"a4fa8e27-de6a-4475-b376-c0e8e8d82ad4\") " Nov 27 17:03:45 crc kubenswrapper[4707]: I1127 17:03:45.521123 4707 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/a4fa8e27-de6a-4475-b376-c0e8e8d82ad4-web-config\") pod \"a4fa8e27-de6a-4475-b376-c0e8e8d82ad4\" (UID: \"a4fa8e27-de6a-4475-b376-c0e8e8d82ad4\") " Nov 27 17:03:45 crc kubenswrapper[4707]: I1127 17:03:45.521164 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/a4fa8e27-de6a-4475-b376-c0e8e8d82ad4-config-out\") pod \"a4fa8e27-de6a-4475-b376-c0e8e8d82ad4\" (UID: \"a4fa8e27-de6a-4475-b376-c0e8e8d82ad4\") " Nov 27 17:03:45 crc kubenswrapper[4707]: I1127 17:03:45.521204 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/a4fa8e27-de6a-4475-b376-c0e8e8d82ad4-tls-assets\") pod \"a4fa8e27-de6a-4475-b376-c0e8e8d82ad4\" (UID: \"a4fa8e27-de6a-4475-b376-c0e8e8d82ad4\") " Nov 27 17:03:45 crc kubenswrapper[4707]: I1127 17:03:45.521225 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/a4fa8e27-de6a-4475-b376-c0e8e8d82ad4-config\") pod \"a4fa8e27-de6a-4475-b376-c0e8e8d82ad4\" (UID: \"a4fa8e27-de6a-4475-b376-c0e8e8d82ad4\") " Nov 27 17:03:45 crc kubenswrapper[4707]: I1127 17:03:45.521263 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/a4fa8e27-de6a-4475-b376-c0e8e8d82ad4-thanos-prometheus-http-client-file\") pod \"a4fa8e27-de6a-4475-b376-c0e8e8d82ad4\" (UID: \"a4fa8e27-de6a-4475-b376-c0e8e8d82ad4\") " Nov 27 17:03:45 crc kubenswrapper[4707]: I1127 17:03:45.521283 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/a4fa8e27-de6a-4475-b376-c0e8e8d82ad4-prometheus-metric-storage-rulefiles-0\") 
pod \"a4fa8e27-de6a-4475-b376-c0e8e8d82ad4\" (UID: \"a4fa8e27-de6a-4475-b376-c0e8e8d82ad4\") " Nov 27 17:03:45 crc kubenswrapper[4707]: I1127 17:03:45.521323 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4fa8e27-de6a-4475-b376-c0e8e8d82ad4-secret-combined-ca-bundle\") pod \"a4fa8e27-de6a-4475-b376-c0e8e8d82ad4\" (UID: \"a4fa8e27-de6a-4475-b376-c0e8e8d82ad4\") " Nov 27 17:03:45 crc kubenswrapper[4707]: I1127 17:03:45.521411 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/a4fa8e27-de6a-4475-b376-c0e8e8d82ad4-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"a4fa8e27-de6a-4475-b376-c0e8e8d82ad4\" (UID: \"a4fa8e27-de6a-4475-b376-c0e8e8d82ad4\") " Nov 27 17:03:45 crc kubenswrapper[4707]: I1127 17:03:45.524136 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a4fa8e27-de6a-4475-b376-c0e8e8d82ad4-prometheus-metric-storage-db" (OuterVolumeSpecName: "prometheus-metric-storage-db") pod "a4fa8e27-de6a-4475-b376-c0e8e8d82ad4" (UID: "a4fa8e27-de6a-4475-b376-c0e8e8d82ad4"). InnerVolumeSpecName "prometheus-metric-storage-db". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 27 17:03:45 crc kubenswrapper[4707]: I1127 17:03:45.526962 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a4fa8e27-de6a-4475-b376-c0e8e8d82ad4-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d" (OuterVolumeSpecName: "web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d") pod "a4fa8e27-de6a-4475-b376-c0e8e8d82ad4" (UID: "a4fa8e27-de6a-4475-b376-c0e8e8d82ad4"). InnerVolumeSpecName "web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 17:03:45 crc kubenswrapper[4707]: I1127 17:03:45.527029 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a4fa8e27-de6a-4475-b376-c0e8e8d82ad4-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d" (OuterVolumeSpecName: "web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d") pod "a4fa8e27-de6a-4475-b376-c0e8e8d82ad4" (UID: "a4fa8e27-de6a-4475-b376-c0e8e8d82ad4"). InnerVolumeSpecName "web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 17:03:45 crc kubenswrapper[4707]: I1127 17:03:45.527424 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a4fa8e27-de6a-4475-b376-c0e8e8d82ad4-prometheus-metric-storage-rulefiles-0" (OuterVolumeSpecName: "prometheus-metric-storage-rulefiles-0") pod "a4fa8e27-de6a-4475-b376-c0e8e8d82ad4" (UID: "a4fa8e27-de6a-4475-b376-c0e8e8d82ad4"). InnerVolumeSpecName "prometheus-metric-storage-rulefiles-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 17:03:45 crc kubenswrapper[4707]: I1127 17:03:45.528679 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a4fa8e27-de6a-4475-b376-c0e8e8d82ad4-thanos-prometheus-http-client-file" (OuterVolumeSpecName: "thanos-prometheus-http-client-file") pod "a4fa8e27-de6a-4475-b376-c0e8e8d82ad4" (UID: "a4fa8e27-de6a-4475-b376-c0e8e8d82ad4"). InnerVolumeSpecName "thanos-prometheus-http-client-file". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 17:03:45 crc kubenswrapper[4707]: I1127 17:03:45.533692 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a4fa8e27-de6a-4475-b376-c0e8e8d82ad4-kube-api-access-529bp" (OuterVolumeSpecName: "kube-api-access-529bp") pod "a4fa8e27-de6a-4475-b376-c0e8e8d82ad4" (UID: "a4fa8e27-de6a-4475-b376-c0e8e8d82ad4"). InnerVolumeSpecName "kube-api-access-529bp". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 17:03:45 crc kubenswrapper[4707]: I1127 17:03:45.535642 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a4fa8e27-de6a-4475-b376-c0e8e8d82ad4-config" (OuterVolumeSpecName: "config") pod "a4fa8e27-de6a-4475-b376-c0e8e8d82ad4" (UID: "a4fa8e27-de6a-4475-b376-c0e8e8d82ad4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 17:03:45 crc kubenswrapper[4707]: I1127 17:03:45.539584 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a4fa8e27-de6a-4475-b376-c0e8e8d82ad4-config-out" (OuterVolumeSpecName: "config-out") pod "a4fa8e27-de6a-4475-b376-c0e8e8d82ad4" (UID: "a4fa8e27-de6a-4475-b376-c0e8e8d82ad4"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 27 17:03:45 crc kubenswrapper[4707]: I1127 17:03:45.542258 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a4fa8e27-de6a-4475-b376-c0e8e8d82ad4-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "a4fa8e27-de6a-4475-b376-c0e8e8d82ad4" (UID: "a4fa8e27-de6a-4475-b376-c0e8e8d82ad4"). InnerVolumeSpecName "tls-assets". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 17:03:45 crc kubenswrapper[4707]: I1127 17:03:45.549144 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a4fa8e27-de6a-4475-b376-c0e8e8d82ad4-secret-combined-ca-bundle" (OuterVolumeSpecName: "secret-combined-ca-bundle") pod "a4fa8e27-de6a-4475-b376-c0e8e8d82ad4" (UID: "a4fa8e27-de6a-4475-b376-c0e8e8d82ad4"). InnerVolumeSpecName "secret-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 17:03:45 crc kubenswrapper[4707]: I1127 17:03:45.623533 4707 reconciler_common.go:293] "Volume detached for volume \"prometheus-metric-storage-db\" (UniqueName: \"kubernetes.io/empty-dir/a4fa8e27-de6a-4475-b376-c0e8e8d82ad4-prometheus-metric-storage-db\") on node \"crc\" DevicePath \"\"" Nov 27 17:03:45 crc kubenswrapper[4707]: I1127 17:03:45.623578 4707 reconciler_common.go:293] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/a4fa8e27-de6a-4475-b376-c0e8e8d82ad4-config-out\") on node \"crc\" DevicePath \"\"" Nov 27 17:03:45 crc kubenswrapper[4707]: I1127 17:03:45.623593 4707 reconciler_common.go:293] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/a4fa8e27-de6a-4475-b376-c0e8e8d82ad4-tls-assets\") on node \"crc\" DevicePath \"\"" Nov 27 17:03:45 crc kubenswrapper[4707]: I1127 17:03:45.623604 4707 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/a4fa8e27-de6a-4475-b376-c0e8e8d82ad4-config\") on node \"crc\" DevicePath \"\"" Nov 27 17:03:45 crc kubenswrapper[4707]: I1127 17:03:45.623618 4707 reconciler_common.go:293] "Volume detached for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/a4fa8e27-de6a-4475-b376-c0e8e8d82ad4-thanos-prometheus-http-client-file\") on node \"crc\" DevicePath \"\"" Nov 27 17:03:45 crc kubenswrapper[4707]: I1127 17:03:45.623632 4707 reconciler_common.go:293] "Volume 
detached for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/a4fa8e27-de6a-4475-b376-c0e8e8d82ad4-prometheus-metric-storage-rulefiles-0\") on node \"crc\" DevicePath \"\"" Nov 27 17:03:45 crc kubenswrapper[4707]: I1127 17:03:45.623645 4707 reconciler_common.go:293] "Volume detached for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4fa8e27-de6a-4475-b376-c0e8e8d82ad4-secret-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 27 17:03:45 crc kubenswrapper[4707]: I1127 17:03:45.623660 4707 reconciler_common.go:293] "Volume detached for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/a4fa8e27-de6a-4475-b376-c0e8e8d82ad4-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") on node \"crc\" DevicePath \"\"" Nov 27 17:03:45 crc kubenswrapper[4707]: I1127 17:03:45.623680 4707 reconciler_common.go:293] "Volume detached for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/a4fa8e27-de6a-4475-b376-c0e8e8d82ad4-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") on node \"crc\" DevicePath \"\"" Nov 27 17:03:45 crc kubenswrapper[4707]: I1127 17:03:45.623693 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-529bp\" (UniqueName: \"kubernetes.io/projected/a4fa8e27-de6a-4475-b376-c0e8e8d82ad4-kube-api-access-529bp\") on node \"crc\" DevicePath \"\"" Nov 27 17:03:45 crc kubenswrapper[4707]: I1127 17:03:45.639188 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a4fa8e27-de6a-4475-b376-c0e8e8d82ad4-web-config" (OuterVolumeSpecName: "web-config") pod "a4fa8e27-de6a-4475-b376-c0e8e8d82ad4" (UID: "a4fa8e27-de6a-4475-b376-c0e8e8d82ad4"). InnerVolumeSpecName "web-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 17:03:45 crc kubenswrapper[4707]: I1127 17:03:45.725247 4707 reconciler_common.go:293] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/a4fa8e27-de6a-4475-b376-c0e8e8d82ad4-web-config\") on node \"crc\" DevicePath \"\"" Nov 27 17:03:46 crc kubenswrapper[4707]: I1127 17:03:46.439698 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Nov 27 17:03:46 crc kubenswrapper[4707]: I1127 17:03:46.494815 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/prometheus-metric-storage-0"] Nov 27 17:03:46 crc kubenswrapper[4707]: I1127 17:03:46.518880 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/prometheus-metric-storage-0"] Nov 27 17:03:46 crc kubenswrapper[4707]: I1127 17:03:46.531572 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/prometheus-metric-storage-0"] Nov 27 17:03:46 crc kubenswrapper[4707]: E1127 17:03:46.532075 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a4fa8e27-de6a-4475-b376-c0e8e8d82ad4" containerName="prometheus" Nov 27 17:03:46 crc kubenswrapper[4707]: I1127 17:03:46.532094 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="a4fa8e27-de6a-4475-b376-c0e8e8d82ad4" containerName="prometheus" Nov 27 17:03:46 crc kubenswrapper[4707]: E1127 17:03:46.532109 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a4fa8e27-de6a-4475-b376-c0e8e8d82ad4" containerName="thanos-sidecar" Nov 27 17:03:46 crc kubenswrapper[4707]: I1127 17:03:46.532117 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="a4fa8e27-de6a-4475-b376-c0e8e8d82ad4" containerName="thanos-sidecar" Nov 27 17:03:46 crc kubenswrapper[4707]: E1127 17:03:46.532140 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a4fa8e27-de6a-4475-b376-c0e8e8d82ad4" containerName="config-reloader" Nov 27 17:03:46 crc 
kubenswrapper[4707]: I1127 17:03:46.532148 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="a4fa8e27-de6a-4475-b376-c0e8e8d82ad4" containerName="config-reloader" Nov 27 17:03:46 crc kubenswrapper[4707]: E1127 17:03:46.532166 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a4fa8e27-de6a-4475-b376-c0e8e8d82ad4" containerName="init-config-reloader" Nov 27 17:03:46 crc kubenswrapper[4707]: I1127 17:03:46.532174 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="a4fa8e27-de6a-4475-b376-c0e8e8d82ad4" containerName="init-config-reloader" Nov 27 17:03:46 crc kubenswrapper[4707]: I1127 17:03:46.532431 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="a4fa8e27-de6a-4475-b376-c0e8e8d82ad4" containerName="config-reloader" Nov 27 17:03:46 crc kubenswrapper[4707]: I1127 17:03:46.532475 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="a4fa8e27-de6a-4475-b376-c0e8e8d82ad4" containerName="prometheus" Nov 27 17:03:46 crc kubenswrapper[4707]: I1127 17:03:46.532493 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="a4fa8e27-de6a-4475-b376-c0e8e8d82ad4" containerName="thanos-sidecar" Nov 27 17:03:46 crc kubenswrapper[4707]: I1127 17:03:46.551919 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Nov 27 17:03:46 crc kubenswrapper[4707]: I1127 17:03:46.552049 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/prometheus-metric-storage-0" Nov 27 17:03:46 crc kubenswrapper[4707]: I1127 17:03:46.555456 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-0" Nov 27 17:03:46 crc kubenswrapper[4707]: I1127 17:03:46.555681 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage" Nov 27 17:03:46 crc kubenswrapper[4707]: I1127 17:03:46.555801 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-web-config" Nov 27 17:03:46 crc kubenswrapper[4707]: I1127 17:03:46.556036 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"metric-storage-prometheus-dockercfg-fmtlt" Nov 27 17:03:46 crc kubenswrapper[4707]: I1127 17:03:46.556188 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-metric-storage-prometheus-svc" Nov 27 17:03:46 crc kubenswrapper[4707]: I1127 17:03:46.556300 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-thanos-prometheus-http-client-file" Nov 27 17:03:46 crc kubenswrapper[4707]: I1127 17:03:46.565767 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-tls-assets-0" Nov 27 17:03:46 crc kubenswrapper[4707]: I1127 17:03:46.649670 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/eab7447a-6fd9-49e4-8db9-34e357f0c419-config\") pod \"prometheus-metric-storage-0\" (UID: \"eab7447a-6fd9-49e4-8db9-34e357f0c419\") " pod="openstack/prometheus-metric-storage-0" Nov 27 17:03:46 crc kubenswrapper[4707]: I1127 17:03:46.649734 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/eab7447a-6fd9-49e4-8db9-34e357f0c419-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"eab7447a-6fd9-49e4-8db9-34e357f0c419\") " pod="openstack/prometheus-metric-storage-0" Nov 27 17:03:46 crc kubenswrapper[4707]: I1127 17:03:46.649758 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-db\" (UniqueName: \"kubernetes.io/empty-dir/eab7447a-6fd9-49e4-8db9-34e357f0c419-prometheus-metric-storage-db\") pod \"prometheus-metric-storage-0\" (UID: \"eab7447a-6fd9-49e4-8db9-34e357f0c419\") " pod="openstack/prometheus-metric-storage-0" Nov 27 17:03:46 crc kubenswrapper[4707]: I1127 17:03:46.649799 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/eab7447a-6fd9-49e4-8db9-34e357f0c419-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"eab7447a-6fd9-49e4-8db9-34e357f0c419\") " pod="openstack/prometheus-metric-storage-0" Nov 27 17:03:46 crc kubenswrapper[4707]: I1127 17:03:46.649842 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/eab7447a-6fd9-49e4-8db9-34e357f0c419-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"eab7447a-6fd9-49e4-8db9-34e357f0c419\") " pod="openstack/prometheus-metric-storage-0" Nov 27 17:03:46 crc kubenswrapper[4707]: I1127 17:03:46.649868 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lbn87\" (UniqueName: \"kubernetes.io/projected/eab7447a-6fd9-49e4-8db9-34e357f0c419-kube-api-access-lbn87\") pod \"prometheus-metric-storage-0\" (UID: \"eab7447a-6fd9-49e4-8db9-34e357f0c419\") " pod="openstack/prometheus-metric-storage-0" Nov 27 17:03:46 crc kubenswrapper[4707]: I1127 17:03:46.649917 4707 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/eab7447a-6fd9-49e4-8db9-34e357f0c419-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"eab7447a-6fd9-49e4-8db9-34e357f0c419\") " pod="openstack/prometheus-metric-storage-0" Nov 27 17:03:46 crc kubenswrapper[4707]: I1127 17:03:46.649935 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/eab7447a-6fd9-49e4-8db9-34e357f0c419-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"eab7447a-6fd9-49e4-8db9-34e357f0c419\") " pod="openstack/prometheus-metric-storage-0" Nov 27 17:03:46 crc kubenswrapper[4707]: I1127 17:03:46.649959 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/eab7447a-6fd9-49e4-8db9-34e357f0c419-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"eab7447a-6fd9-49e4-8db9-34e357f0c419\") " pod="openstack/prometheus-metric-storage-0" Nov 27 17:03:46 crc kubenswrapper[4707]: I1127 17:03:46.649979 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/eab7447a-6fd9-49e4-8db9-34e357f0c419-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"eab7447a-6fd9-49e4-8db9-34e357f0c419\") " pod="openstack/prometheus-metric-storage-0" Nov 27 17:03:46 crc kubenswrapper[4707]: I1127 17:03:46.649998 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: 
\"kubernetes.io/projected/eab7447a-6fd9-49e4-8db9-34e357f0c419-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"eab7447a-6fd9-49e4-8db9-34e357f0c419\") " pod="openstack/prometheus-metric-storage-0" Nov 27 17:03:46 crc kubenswrapper[4707]: I1127 17:03:46.751380 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/eab7447a-6fd9-49e4-8db9-34e357f0c419-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"eab7447a-6fd9-49e4-8db9-34e357f0c419\") " pod="openstack/prometheus-metric-storage-0" Nov 27 17:03:46 crc kubenswrapper[4707]: I1127 17:03:46.751450 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/eab7447a-6fd9-49e4-8db9-34e357f0c419-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"eab7447a-6fd9-49e4-8db9-34e357f0c419\") " pod="openstack/prometheus-metric-storage-0" Nov 27 17:03:46 crc kubenswrapper[4707]: I1127 17:03:46.751486 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lbn87\" (UniqueName: \"kubernetes.io/projected/eab7447a-6fd9-49e4-8db9-34e357f0c419-kube-api-access-lbn87\") pod \"prometheus-metric-storage-0\" (UID: \"eab7447a-6fd9-49e4-8db9-34e357f0c419\") " pod="openstack/prometheus-metric-storage-0" Nov 27 17:03:46 crc kubenswrapper[4707]: I1127 17:03:46.751548 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/eab7447a-6fd9-49e4-8db9-34e357f0c419-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"eab7447a-6fd9-49e4-8db9-34e357f0c419\") " pod="openstack/prometheus-metric-storage-0" Nov 27 17:03:46 crc kubenswrapper[4707]: I1127 17:03:46.751568 4707 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/eab7447a-6fd9-49e4-8db9-34e357f0c419-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"eab7447a-6fd9-49e4-8db9-34e357f0c419\") " pod="openstack/prometheus-metric-storage-0" Nov 27 17:03:46 crc kubenswrapper[4707]: I1127 17:03:46.751586 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/eab7447a-6fd9-49e4-8db9-34e357f0c419-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"eab7447a-6fd9-49e4-8db9-34e357f0c419\") " pod="openstack/prometheus-metric-storage-0" Nov 27 17:03:46 crc kubenswrapper[4707]: I1127 17:03:46.751604 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/eab7447a-6fd9-49e4-8db9-34e357f0c419-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"eab7447a-6fd9-49e4-8db9-34e357f0c419\") " pod="openstack/prometheus-metric-storage-0" Nov 27 17:03:46 crc kubenswrapper[4707]: I1127 17:03:46.751625 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/eab7447a-6fd9-49e4-8db9-34e357f0c419-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"eab7447a-6fd9-49e4-8db9-34e357f0c419\") " pod="openstack/prometheus-metric-storage-0" Nov 27 17:03:46 crc kubenswrapper[4707]: I1127 17:03:46.751655 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/eab7447a-6fd9-49e4-8db9-34e357f0c419-config\") pod \"prometheus-metric-storage-0\" (UID: \"eab7447a-6fd9-49e4-8db9-34e357f0c419\") " pod="openstack/prometheus-metric-storage-0" Nov 27 17:03:46 crc kubenswrapper[4707]: I1127 
17:03:46.751710 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eab7447a-6fd9-49e4-8db9-34e357f0c419-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"eab7447a-6fd9-49e4-8db9-34e357f0c419\") " pod="openstack/prometheus-metric-storage-0" Nov 27 17:03:46 crc kubenswrapper[4707]: I1127 17:03:46.751734 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-db\" (UniqueName: \"kubernetes.io/empty-dir/eab7447a-6fd9-49e4-8db9-34e357f0c419-prometheus-metric-storage-db\") pod \"prometheus-metric-storage-0\" (UID: \"eab7447a-6fd9-49e4-8db9-34e357f0c419\") " pod="openstack/prometheus-metric-storage-0" Nov 27 17:03:46 crc kubenswrapper[4707]: I1127 17:03:46.752102 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-db\" (UniqueName: \"kubernetes.io/empty-dir/eab7447a-6fd9-49e4-8db9-34e357f0c419-prometheus-metric-storage-db\") pod \"prometheus-metric-storage-0\" (UID: \"eab7447a-6fd9-49e4-8db9-34e357f0c419\") " pod="openstack/prometheus-metric-storage-0" Nov 27 17:03:46 crc kubenswrapper[4707]: I1127 17:03:46.752546 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/eab7447a-6fd9-49e4-8db9-34e357f0c419-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"eab7447a-6fd9-49e4-8db9-34e357f0c419\") " pod="openstack/prometheus-metric-storage-0" Nov 27 17:03:46 crc kubenswrapper[4707]: I1127 17:03:46.755593 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/eab7447a-6fd9-49e4-8db9-34e357f0c419-config\") pod \"prometheus-metric-storage-0\" (UID: \"eab7447a-6fd9-49e4-8db9-34e357f0c419\") " pod="openstack/prometheus-metric-storage-0" Nov 27 17:03:46 crc 
kubenswrapper[4707]: I1127 17:03:46.756648 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/eab7447a-6fd9-49e4-8db9-34e357f0c419-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"eab7447a-6fd9-49e4-8db9-34e357f0c419\") " pod="openstack/prometheus-metric-storage-0" Nov 27 17:03:46 crc kubenswrapper[4707]: I1127 17:03:46.756762 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eab7447a-6fd9-49e4-8db9-34e357f0c419-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"eab7447a-6fd9-49e4-8db9-34e357f0c419\") " pod="openstack/prometheus-metric-storage-0" Nov 27 17:03:46 crc kubenswrapper[4707]: I1127 17:03:46.757701 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/eab7447a-6fd9-49e4-8db9-34e357f0c419-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"eab7447a-6fd9-49e4-8db9-34e357f0c419\") " pod="openstack/prometheus-metric-storage-0" Nov 27 17:03:46 crc kubenswrapper[4707]: I1127 17:03:46.758149 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/eab7447a-6fd9-49e4-8db9-34e357f0c419-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"eab7447a-6fd9-49e4-8db9-34e357f0c419\") " pod="openstack/prometheus-metric-storage-0" Nov 27 17:03:46 crc kubenswrapper[4707]: I1127 17:03:46.758687 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/eab7447a-6fd9-49e4-8db9-34e357f0c419-thanos-prometheus-http-client-file\") pod 
\"prometheus-metric-storage-0\" (UID: \"eab7447a-6fd9-49e4-8db9-34e357f0c419\") " pod="openstack/prometheus-metric-storage-0" Nov 27 17:03:46 crc kubenswrapper[4707]: I1127 17:03:46.760053 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/eab7447a-6fd9-49e4-8db9-34e357f0c419-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"eab7447a-6fd9-49e4-8db9-34e357f0c419\") " pod="openstack/prometheus-metric-storage-0" Nov 27 17:03:46 crc kubenswrapper[4707]: I1127 17:03:46.760972 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/eab7447a-6fd9-49e4-8db9-34e357f0c419-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"eab7447a-6fd9-49e4-8db9-34e357f0c419\") " pod="openstack/prometheus-metric-storage-0" Nov 27 17:03:46 crc kubenswrapper[4707]: I1127 17:03:46.770899 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lbn87\" (UniqueName: \"kubernetes.io/projected/eab7447a-6fd9-49e4-8db9-34e357f0c419-kube-api-access-lbn87\") pod \"prometheus-metric-storage-0\" (UID: \"eab7447a-6fd9-49e4-8db9-34e357f0c419\") " pod="openstack/prometheus-metric-storage-0" Nov 27 17:03:46 crc kubenswrapper[4707]: I1127 17:03:46.875391 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/prometheus-metric-storage-0" Nov 27 17:03:47 crc kubenswrapper[4707]: I1127 17:03:47.211036 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a4fa8e27-de6a-4475-b376-c0e8e8d82ad4" path="/var/lib/kubelet/pods/a4fa8e27-de6a-4475-b376-c0e8e8d82ad4/volumes" Nov 27 17:03:47 crc kubenswrapper[4707]: I1127 17:03:47.657945 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Nov 27 17:03:48 crc kubenswrapper[4707]: I1127 17:03:48.470988 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"eab7447a-6fd9-49e4-8db9-34e357f0c419","Type":"ContainerStarted","Data":"57583b5a8eb8993b702132bc72d809ffc1ec8b7aab0205489f28e5d748a175c0"} Nov 27 17:03:51 crc kubenswrapper[4707]: I1127 17:03:51.417818 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-8hk9m" Nov 27 17:03:51 crc kubenswrapper[4707]: I1127 17:03:51.478857 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-8hk9m" Nov 27 17:03:51 crc kubenswrapper[4707]: I1127 17:03:51.504956 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"eab7447a-6fd9-49e4-8db9-34e357f0c419","Type":"ContainerStarted","Data":"d8e356d5239849fbc1859188e0061fa9df98d53d15776f1e53c39e1568719c5e"} Nov 27 17:03:52 crc kubenswrapper[4707]: I1127 17:03:52.178970 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-8hk9m"] Nov 27 17:03:52 crc kubenswrapper[4707]: I1127 17:03:52.514195 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-8hk9m" podUID="5d6d9704-65c5-489c-ac3e-a8296fdc2f3a" containerName="registry-server" 
containerID="cri-o://176cdddf1e211c69102e52596911d694df01e55e42454f3bcc8fe303c4824946" gracePeriod=2 Nov 27 17:03:53 crc kubenswrapper[4707]: I1127 17:03:53.070612 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-8hk9m" Nov 27 17:03:53 crc kubenswrapper[4707]: I1127 17:03:53.191152 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-crghg\" (UniqueName: \"kubernetes.io/projected/5d6d9704-65c5-489c-ac3e-a8296fdc2f3a-kube-api-access-crghg\") pod \"5d6d9704-65c5-489c-ac3e-a8296fdc2f3a\" (UID: \"5d6d9704-65c5-489c-ac3e-a8296fdc2f3a\") " Nov 27 17:03:53 crc kubenswrapper[4707]: I1127 17:03:53.191407 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5d6d9704-65c5-489c-ac3e-a8296fdc2f3a-utilities\") pod \"5d6d9704-65c5-489c-ac3e-a8296fdc2f3a\" (UID: \"5d6d9704-65c5-489c-ac3e-a8296fdc2f3a\") " Nov 27 17:03:53 crc kubenswrapper[4707]: I1127 17:03:53.191521 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5d6d9704-65c5-489c-ac3e-a8296fdc2f3a-catalog-content\") pod \"5d6d9704-65c5-489c-ac3e-a8296fdc2f3a\" (UID: \"5d6d9704-65c5-489c-ac3e-a8296fdc2f3a\") " Nov 27 17:03:53 crc kubenswrapper[4707]: I1127 17:03:53.192221 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5d6d9704-65c5-489c-ac3e-a8296fdc2f3a-utilities" (OuterVolumeSpecName: "utilities") pod "5d6d9704-65c5-489c-ac3e-a8296fdc2f3a" (UID: "5d6d9704-65c5-489c-ac3e-a8296fdc2f3a"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 27 17:03:53 crc kubenswrapper[4707]: I1127 17:03:53.200833 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5d6d9704-65c5-489c-ac3e-a8296fdc2f3a-kube-api-access-crghg" (OuterVolumeSpecName: "kube-api-access-crghg") pod "5d6d9704-65c5-489c-ac3e-a8296fdc2f3a" (UID: "5d6d9704-65c5-489c-ac3e-a8296fdc2f3a"). InnerVolumeSpecName "kube-api-access-crghg". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 17:03:53 crc kubenswrapper[4707]: I1127 17:03:53.293227 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5d6d9704-65c5-489c-ac3e-a8296fdc2f3a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5d6d9704-65c5-489c-ac3e-a8296fdc2f3a" (UID: "5d6d9704-65c5-489c-ac3e-a8296fdc2f3a"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 27 17:03:53 crc kubenswrapper[4707]: I1127 17:03:53.293873 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5d6d9704-65c5-489c-ac3e-a8296fdc2f3a-catalog-content\") pod \"5d6d9704-65c5-489c-ac3e-a8296fdc2f3a\" (UID: \"5d6d9704-65c5-489c-ac3e-a8296fdc2f3a\") " Nov 27 17:03:53 crc kubenswrapper[4707]: W1127 17:03:53.293987 4707 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/5d6d9704-65c5-489c-ac3e-a8296fdc2f3a/volumes/kubernetes.io~empty-dir/catalog-content Nov 27 17:03:53 crc kubenswrapper[4707]: I1127 17:03:53.293999 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5d6d9704-65c5-489c-ac3e-a8296fdc2f3a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5d6d9704-65c5-489c-ac3e-a8296fdc2f3a" (UID: "5d6d9704-65c5-489c-ac3e-a8296fdc2f3a"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 27 17:03:53 crc kubenswrapper[4707]: I1127 17:03:53.295052 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-crghg\" (UniqueName: \"kubernetes.io/projected/5d6d9704-65c5-489c-ac3e-a8296fdc2f3a-kube-api-access-crghg\") on node \"crc\" DevicePath \"\"" Nov 27 17:03:53 crc kubenswrapper[4707]: I1127 17:03:53.295070 4707 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5d6d9704-65c5-489c-ac3e-a8296fdc2f3a-utilities\") on node \"crc\" DevicePath \"\"" Nov 27 17:03:53 crc kubenswrapper[4707]: I1127 17:03:53.295081 4707 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5d6d9704-65c5-489c-ac3e-a8296fdc2f3a-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 27 17:03:53 crc kubenswrapper[4707]: I1127 17:03:53.539306 4707 generic.go:334] "Generic (PLEG): container finished" podID="5d6d9704-65c5-489c-ac3e-a8296fdc2f3a" containerID="176cdddf1e211c69102e52596911d694df01e55e42454f3bcc8fe303c4824946" exitCode=0 Nov 27 17:03:53 crc kubenswrapper[4707]: I1127 17:03:53.539488 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-8hk9m" Nov 27 17:03:53 crc kubenswrapper[4707]: I1127 17:03:53.539474 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8hk9m" event={"ID":"5d6d9704-65c5-489c-ac3e-a8296fdc2f3a","Type":"ContainerDied","Data":"176cdddf1e211c69102e52596911d694df01e55e42454f3bcc8fe303c4824946"} Nov 27 17:03:53 crc kubenswrapper[4707]: I1127 17:03:53.539954 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8hk9m" event={"ID":"5d6d9704-65c5-489c-ac3e-a8296fdc2f3a","Type":"ContainerDied","Data":"547b500fe5fe79a28f1c5a40343199979a4038c7ebfbf9b77c9a68a26b3c766d"} Nov 27 17:03:53 crc kubenswrapper[4707]: I1127 17:03:53.539999 4707 scope.go:117] "RemoveContainer" containerID="176cdddf1e211c69102e52596911d694df01e55e42454f3bcc8fe303c4824946" Nov 27 17:03:53 crc kubenswrapper[4707]: I1127 17:03:53.584297 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-8hk9m"] Nov 27 17:03:53 crc kubenswrapper[4707]: I1127 17:03:53.593078 4707 scope.go:117] "RemoveContainer" containerID="0eae15b003fb6fc0a3b10d14906039690871f73f721784c378739def579eecfe" Nov 27 17:03:53 crc kubenswrapper[4707]: I1127 17:03:53.595554 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-8hk9m"] Nov 27 17:03:53 crc kubenswrapper[4707]: I1127 17:03:53.635791 4707 scope.go:117] "RemoveContainer" containerID="a07fb1a977815a331cd35a1e52195b8db9f600b3c0d47bb339fb1fe7db3c7413" Nov 27 17:03:53 crc kubenswrapper[4707]: I1127 17:03:53.692387 4707 scope.go:117] "RemoveContainer" containerID="176cdddf1e211c69102e52596911d694df01e55e42454f3bcc8fe303c4824946" Nov 27 17:03:53 crc kubenswrapper[4707]: E1127 17:03:53.692922 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"176cdddf1e211c69102e52596911d694df01e55e42454f3bcc8fe303c4824946\": container with ID starting with 176cdddf1e211c69102e52596911d694df01e55e42454f3bcc8fe303c4824946 not found: ID does not exist" containerID="176cdddf1e211c69102e52596911d694df01e55e42454f3bcc8fe303c4824946" Nov 27 17:03:53 crc kubenswrapper[4707]: I1127 17:03:53.692981 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"176cdddf1e211c69102e52596911d694df01e55e42454f3bcc8fe303c4824946"} err="failed to get container status \"176cdddf1e211c69102e52596911d694df01e55e42454f3bcc8fe303c4824946\": rpc error: code = NotFound desc = could not find container \"176cdddf1e211c69102e52596911d694df01e55e42454f3bcc8fe303c4824946\": container with ID starting with 176cdddf1e211c69102e52596911d694df01e55e42454f3bcc8fe303c4824946 not found: ID does not exist" Nov 27 17:03:53 crc kubenswrapper[4707]: I1127 17:03:53.693009 4707 scope.go:117] "RemoveContainer" containerID="0eae15b003fb6fc0a3b10d14906039690871f73f721784c378739def579eecfe" Nov 27 17:03:53 crc kubenswrapper[4707]: E1127 17:03:53.693628 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0eae15b003fb6fc0a3b10d14906039690871f73f721784c378739def579eecfe\": container with ID starting with 0eae15b003fb6fc0a3b10d14906039690871f73f721784c378739def579eecfe not found: ID does not exist" containerID="0eae15b003fb6fc0a3b10d14906039690871f73f721784c378739def579eecfe" Nov 27 17:03:53 crc kubenswrapper[4707]: I1127 17:03:53.693662 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0eae15b003fb6fc0a3b10d14906039690871f73f721784c378739def579eecfe"} err="failed to get container status \"0eae15b003fb6fc0a3b10d14906039690871f73f721784c378739def579eecfe\": rpc error: code = NotFound desc = could not find container \"0eae15b003fb6fc0a3b10d14906039690871f73f721784c378739def579eecfe\": container with ID 
starting with 0eae15b003fb6fc0a3b10d14906039690871f73f721784c378739def579eecfe not found: ID does not exist" Nov 27 17:03:53 crc kubenswrapper[4707]: I1127 17:03:53.693682 4707 scope.go:117] "RemoveContainer" containerID="a07fb1a977815a331cd35a1e52195b8db9f600b3c0d47bb339fb1fe7db3c7413" Nov 27 17:03:53 crc kubenswrapper[4707]: E1127 17:03:53.694109 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a07fb1a977815a331cd35a1e52195b8db9f600b3c0d47bb339fb1fe7db3c7413\": container with ID starting with a07fb1a977815a331cd35a1e52195b8db9f600b3c0d47bb339fb1fe7db3c7413 not found: ID does not exist" containerID="a07fb1a977815a331cd35a1e52195b8db9f600b3c0d47bb339fb1fe7db3c7413" Nov 27 17:03:53 crc kubenswrapper[4707]: I1127 17:03:53.694243 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a07fb1a977815a331cd35a1e52195b8db9f600b3c0d47bb339fb1fe7db3c7413"} err="failed to get container status \"a07fb1a977815a331cd35a1e52195b8db9f600b3c0d47bb339fb1fe7db3c7413\": rpc error: code = NotFound desc = could not find container \"a07fb1a977815a331cd35a1e52195b8db9f600b3c0d47bb339fb1fe7db3c7413\": container with ID starting with a07fb1a977815a331cd35a1e52195b8db9f600b3c0d47bb339fb1fe7db3c7413 not found: ID does not exist" Nov 27 17:03:55 crc kubenswrapper[4707]: I1127 17:03:55.215121 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5d6d9704-65c5-489c-ac3e-a8296fdc2f3a" path="/var/lib/kubelet/pods/5d6d9704-65c5-489c-ac3e-a8296fdc2f3a/volumes" Nov 27 17:04:01 crc kubenswrapper[4707]: I1127 17:04:01.655556 4707 generic.go:334] "Generic (PLEG): container finished" podID="eab7447a-6fd9-49e4-8db9-34e357f0c419" containerID="d8e356d5239849fbc1859188e0061fa9df98d53d15776f1e53c39e1568719c5e" exitCode=0 Nov 27 17:04:01 crc kubenswrapper[4707]: I1127 17:04:01.655632 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/prometheus-metric-storage-0" event={"ID":"eab7447a-6fd9-49e4-8db9-34e357f0c419","Type":"ContainerDied","Data":"d8e356d5239849fbc1859188e0061fa9df98d53d15776f1e53c39e1568719c5e"} Nov 27 17:04:02 crc kubenswrapper[4707]: I1127 17:04:02.668822 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"eab7447a-6fd9-49e4-8db9-34e357f0c419","Type":"ContainerStarted","Data":"771267ddf921a2372142b22533dca371a5a2cede645ecfea2ef3ed1526214bc3"} Nov 27 17:04:06 crc kubenswrapper[4707]: I1127 17:04:06.711757 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"eab7447a-6fd9-49e4-8db9-34e357f0c419","Type":"ContainerStarted","Data":"6061dc01b19b33d4dced613800b258f2c1f0b3a52a15b91c00d7827202a5eac6"} Nov 27 17:04:06 crc kubenswrapper[4707]: I1127 17:04:06.712355 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"eab7447a-6fd9-49e4-8db9-34e357f0c419","Type":"ContainerStarted","Data":"540bdf624cab198721ba1374b23cb9cfcbd4c4b97b57ee49b0cfcbd21962e107"} Nov 27 17:04:06 crc kubenswrapper[4707]: I1127 17:04:06.748942 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/prometheus-metric-storage-0" podStartSLOduration=20.748905127 podStartE2EDuration="20.748905127s" podCreationTimestamp="2025-11-27 17:03:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-27 17:04:06.740088017 +0000 UTC m=+3622.371536795" watchObservedRunningTime="2025-11-27 17:04:06.748905127 +0000 UTC m=+3622.380353945" Nov 27 17:04:06 crc kubenswrapper[4707]: I1127 17:04:06.876316 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/prometheus-metric-storage-0" Nov 27 17:04:16 crc kubenswrapper[4707]: I1127 17:04:16.876652 4707 kubelet.go:2542] "SyncLoop (probe)" 
probe="startup" status="unhealthy" pod="openstack/prometheus-metric-storage-0" Nov 27 17:04:16 crc kubenswrapper[4707]: I1127 17:04:16.882566 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/prometheus-metric-storage-0" Nov 27 17:04:17 crc kubenswrapper[4707]: I1127 17:04:17.837957 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/prometheus-metric-storage-0" Nov 27 17:04:33 crc kubenswrapper[4707]: I1127 17:04:33.623837 4707 patch_prober.go:28] interesting pod/machine-config-daemon-c995m container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 27 17:04:33 crc kubenswrapper[4707]: I1127 17:04:33.624434 4707 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-c995m" podUID="a83beb0d-8dd1-434a-ace2-933f98e3956f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 27 17:05:03 crc kubenswrapper[4707]: I1127 17:05:03.624331 4707 patch_prober.go:28] interesting pod/machine-config-daemon-c995m container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 27 17:05:03 crc kubenswrapper[4707]: I1127 17:05:03.625088 4707 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-c995m" podUID="a83beb0d-8dd1-434a-ace2-933f98e3956f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 27 17:05:13 crc kubenswrapper[4707]: I1127 17:05:13.099657 
4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-d7wf8"] Nov 27 17:05:13 crc kubenswrapper[4707]: E1127 17:05:13.100595 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5d6d9704-65c5-489c-ac3e-a8296fdc2f3a" containerName="extract-content" Nov 27 17:05:13 crc kubenswrapper[4707]: I1127 17:05:13.100610 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="5d6d9704-65c5-489c-ac3e-a8296fdc2f3a" containerName="extract-content" Nov 27 17:05:13 crc kubenswrapper[4707]: E1127 17:05:13.100650 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5d6d9704-65c5-489c-ac3e-a8296fdc2f3a" containerName="extract-utilities" Nov 27 17:05:13 crc kubenswrapper[4707]: I1127 17:05:13.100660 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="5d6d9704-65c5-489c-ac3e-a8296fdc2f3a" containerName="extract-utilities" Nov 27 17:05:13 crc kubenswrapper[4707]: E1127 17:05:13.100677 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5d6d9704-65c5-489c-ac3e-a8296fdc2f3a" containerName="registry-server" Nov 27 17:05:13 crc kubenswrapper[4707]: I1127 17:05:13.100687 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="5d6d9704-65c5-489c-ac3e-a8296fdc2f3a" containerName="registry-server" Nov 27 17:05:13 crc kubenswrapper[4707]: I1127 17:05:13.100915 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="5d6d9704-65c5-489c-ac3e-a8296fdc2f3a" containerName="registry-server" Nov 27 17:05:13 crc kubenswrapper[4707]: I1127 17:05:13.102670 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-d7wf8" Nov 27 17:05:13 crc kubenswrapper[4707]: I1127 17:05:13.120182 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-d7wf8"] Nov 27 17:05:13 crc kubenswrapper[4707]: I1127 17:05:13.216130 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d7e25e91-45af-4090-81ed-c43b14689af3-utilities\") pod \"community-operators-d7wf8\" (UID: \"d7e25e91-45af-4090-81ed-c43b14689af3\") " pod="openshift-marketplace/community-operators-d7wf8" Nov 27 17:05:13 crc kubenswrapper[4707]: I1127 17:05:13.216336 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d7e25e91-45af-4090-81ed-c43b14689af3-catalog-content\") pod \"community-operators-d7wf8\" (UID: \"d7e25e91-45af-4090-81ed-c43b14689af3\") " pod="openshift-marketplace/community-operators-d7wf8" Nov 27 17:05:13 crc kubenswrapper[4707]: I1127 17:05:13.216518 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n245s\" (UniqueName: \"kubernetes.io/projected/d7e25e91-45af-4090-81ed-c43b14689af3-kube-api-access-n245s\") pod \"community-operators-d7wf8\" (UID: \"d7e25e91-45af-4090-81ed-c43b14689af3\") " pod="openshift-marketplace/community-operators-d7wf8" Nov 27 17:05:13 crc kubenswrapper[4707]: I1127 17:05:13.318767 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n245s\" (UniqueName: \"kubernetes.io/projected/d7e25e91-45af-4090-81ed-c43b14689af3-kube-api-access-n245s\") pod \"community-operators-d7wf8\" (UID: \"d7e25e91-45af-4090-81ed-c43b14689af3\") " pod="openshift-marketplace/community-operators-d7wf8" Nov 27 17:05:13 crc kubenswrapper[4707]: I1127 17:05:13.318836 4707 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d7e25e91-45af-4090-81ed-c43b14689af3-utilities\") pod \"community-operators-d7wf8\" (UID: \"d7e25e91-45af-4090-81ed-c43b14689af3\") " pod="openshift-marketplace/community-operators-d7wf8" Nov 27 17:05:13 crc kubenswrapper[4707]: I1127 17:05:13.318928 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d7e25e91-45af-4090-81ed-c43b14689af3-catalog-content\") pod \"community-operators-d7wf8\" (UID: \"d7e25e91-45af-4090-81ed-c43b14689af3\") " pod="openshift-marketplace/community-operators-d7wf8" Nov 27 17:05:13 crc kubenswrapper[4707]: I1127 17:05:13.319400 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d7e25e91-45af-4090-81ed-c43b14689af3-catalog-content\") pod \"community-operators-d7wf8\" (UID: \"d7e25e91-45af-4090-81ed-c43b14689af3\") " pod="openshift-marketplace/community-operators-d7wf8" Nov 27 17:05:13 crc kubenswrapper[4707]: I1127 17:05:13.319426 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d7e25e91-45af-4090-81ed-c43b14689af3-utilities\") pod \"community-operators-d7wf8\" (UID: \"d7e25e91-45af-4090-81ed-c43b14689af3\") " pod="openshift-marketplace/community-operators-d7wf8" Nov 27 17:05:13 crc kubenswrapper[4707]: I1127 17:05:13.337600 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n245s\" (UniqueName: \"kubernetes.io/projected/d7e25e91-45af-4090-81ed-c43b14689af3-kube-api-access-n245s\") pod \"community-operators-d7wf8\" (UID: \"d7e25e91-45af-4090-81ed-c43b14689af3\") " pod="openshift-marketplace/community-operators-d7wf8" Nov 27 17:05:13 crc kubenswrapper[4707]: I1127 17:05:13.438822 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-d7wf8" Nov 27 17:05:13 crc kubenswrapper[4707]: I1127 17:05:13.988561 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-d7wf8"] Nov 27 17:05:14 crc kubenswrapper[4707]: E1127 17:05:14.386615 4707 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd7e25e91_45af_4090_81ed_c43b14689af3.slice/crio-conmon-651483cfffee59085cc0e708c502fb082c7f4085aebcb6fa5af8f8ed20ca0351.scope\": RecentStats: unable to find data in memory cache]" Nov 27 17:05:14 crc kubenswrapper[4707]: I1127 17:05:14.470060 4707 generic.go:334] "Generic (PLEG): container finished" podID="d7e25e91-45af-4090-81ed-c43b14689af3" containerID="651483cfffee59085cc0e708c502fb082c7f4085aebcb6fa5af8f8ed20ca0351" exitCode=0 Nov 27 17:05:14 crc kubenswrapper[4707]: I1127 17:05:14.470158 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-d7wf8" event={"ID":"d7e25e91-45af-4090-81ed-c43b14689af3","Type":"ContainerDied","Data":"651483cfffee59085cc0e708c502fb082c7f4085aebcb6fa5af8f8ed20ca0351"} Nov 27 17:05:14 crc kubenswrapper[4707]: I1127 17:05:14.470394 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-d7wf8" event={"ID":"d7e25e91-45af-4090-81ed-c43b14689af3","Type":"ContainerStarted","Data":"b616233ce69ccf8a9c1c9d78b0b41fe204aab908291ca65e31053ceb4a93e3d6"} Nov 27 17:05:16 crc kubenswrapper[4707]: I1127 17:05:16.493352 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-d7wf8" event={"ID":"d7e25e91-45af-4090-81ed-c43b14689af3","Type":"ContainerStarted","Data":"2bfecebff5bd55b7e1bb6e8d41840033f182c068259fb3448df3c86c059d9286"} Nov 27 17:05:17 crc kubenswrapper[4707]: I1127 17:05:17.505514 4707 generic.go:334] "Generic (PLEG): 
container finished" podID="d7e25e91-45af-4090-81ed-c43b14689af3" containerID="2bfecebff5bd55b7e1bb6e8d41840033f182c068259fb3448df3c86c059d9286" exitCode=0 Nov 27 17:05:17 crc kubenswrapper[4707]: I1127 17:05:17.505608 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-d7wf8" event={"ID":"d7e25e91-45af-4090-81ed-c43b14689af3","Type":"ContainerDied","Data":"2bfecebff5bd55b7e1bb6e8d41840033f182c068259fb3448df3c86c059d9286"} Nov 27 17:05:19 crc kubenswrapper[4707]: I1127 17:05:19.529554 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-d7wf8" event={"ID":"d7e25e91-45af-4090-81ed-c43b14689af3","Type":"ContainerStarted","Data":"17f29f549398d61c57db2877db498415ed51af1d59d2c60b18c33f493a70c552"} Nov 27 17:05:19 crc kubenswrapper[4707]: I1127 17:05:19.548570 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-d7wf8" podStartSLOduration=2.614479772 podStartE2EDuration="6.548543437s" podCreationTimestamp="2025-11-27 17:05:13 +0000 UTC" firstStartedPulling="2025-11-27 17:05:14.472563463 +0000 UTC m=+3690.104012231" lastFinishedPulling="2025-11-27 17:05:18.406627078 +0000 UTC m=+3694.038075896" observedRunningTime="2025-11-27 17:05:19.543961234 +0000 UTC m=+3695.175410002" watchObservedRunningTime="2025-11-27 17:05:19.548543437 +0000 UTC m=+3695.179992215" Nov 27 17:05:23 crc kubenswrapper[4707]: I1127 17:05:23.439982 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-d7wf8" Nov 27 17:05:23 crc kubenswrapper[4707]: I1127 17:05:23.440062 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-d7wf8" Nov 27 17:05:23 crc kubenswrapper[4707]: I1127 17:05:23.517040 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-marketplace/community-operators-d7wf8" Nov 27 17:05:23 crc kubenswrapper[4707]: I1127 17:05:23.641284 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-d7wf8" Nov 27 17:05:23 crc kubenswrapper[4707]: I1127 17:05:23.767033 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-d7wf8"] Nov 27 17:05:25 crc kubenswrapper[4707]: I1127 17:05:25.602271 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-d7wf8" podUID="d7e25e91-45af-4090-81ed-c43b14689af3" containerName="registry-server" containerID="cri-o://17f29f549398d61c57db2877db498415ed51af1d59d2c60b18c33f493a70c552" gracePeriod=2 Nov 27 17:05:26 crc kubenswrapper[4707]: I1127 17:05:26.136262 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-d7wf8" Nov 27 17:05:26 crc kubenswrapper[4707]: I1127 17:05:26.317696 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d7e25e91-45af-4090-81ed-c43b14689af3-utilities\") pod \"d7e25e91-45af-4090-81ed-c43b14689af3\" (UID: \"d7e25e91-45af-4090-81ed-c43b14689af3\") " Nov 27 17:05:26 crc kubenswrapper[4707]: I1127 17:05:26.317775 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d7e25e91-45af-4090-81ed-c43b14689af3-catalog-content\") pod \"d7e25e91-45af-4090-81ed-c43b14689af3\" (UID: \"d7e25e91-45af-4090-81ed-c43b14689af3\") " Nov 27 17:05:26 crc kubenswrapper[4707]: I1127 17:05:26.317919 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n245s\" (UniqueName: \"kubernetes.io/projected/d7e25e91-45af-4090-81ed-c43b14689af3-kube-api-access-n245s\") pod 
\"d7e25e91-45af-4090-81ed-c43b14689af3\" (UID: \"d7e25e91-45af-4090-81ed-c43b14689af3\") " Nov 27 17:05:26 crc kubenswrapper[4707]: I1127 17:05:26.319713 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d7e25e91-45af-4090-81ed-c43b14689af3-utilities" (OuterVolumeSpecName: "utilities") pod "d7e25e91-45af-4090-81ed-c43b14689af3" (UID: "d7e25e91-45af-4090-81ed-c43b14689af3"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 27 17:05:26 crc kubenswrapper[4707]: I1127 17:05:26.326280 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d7e25e91-45af-4090-81ed-c43b14689af3-kube-api-access-n245s" (OuterVolumeSpecName: "kube-api-access-n245s") pod "d7e25e91-45af-4090-81ed-c43b14689af3" (UID: "d7e25e91-45af-4090-81ed-c43b14689af3"). InnerVolumeSpecName "kube-api-access-n245s". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 17:05:26 crc kubenswrapper[4707]: I1127 17:05:26.385239 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d7e25e91-45af-4090-81ed-c43b14689af3-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d7e25e91-45af-4090-81ed-c43b14689af3" (UID: "d7e25e91-45af-4090-81ed-c43b14689af3"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 27 17:05:26 crc kubenswrapper[4707]: I1127 17:05:26.425103 4707 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d7e25e91-45af-4090-81ed-c43b14689af3-utilities\") on node \"crc\" DevicePath \"\"" Nov 27 17:05:26 crc kubenswrapper[4707]: I1127 17:05:26.425234 4707 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d7e25e91-45af-4090-81ed-c43b14689af3-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 27 17:05:26 crc kubenswrapper[4707]: I1127 17:05:26.425310 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n245s\" (UniqueName: \"kubernetes.io/projected/d7e25e91-45af-4090-81ed-c43b14689af3-kube-api-access-n245s\") on node \"crc\" DevicePath \"\"" Nov 27 17:05:26 crc kubenswrapper[4707]: I1127 17:05:26.616643 4707 generic.go:334] "Generic (PLEG): container finished" podID="d7e25e91-45af-4090-81ed-c43b14689af3" containerID="17f29f549398d61c57db2877db498415ed51af1d59d2c60b18c33f493a70c552" exitCode=0 Nov 27 17:05:26 crc kubenswrapper[4707]: I1127 17:05:26.616699 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-d7wf8" event={"ID":"d7e25e91-45af-4090-81ed-c43b14689af3","Type":"ContainerDied","Data":"17f29f549398d61c57db2877db498415ed51af1d59d2c60b18c33f493a70c552"} Nov 27 17:05:26 crc kubenswrapper[4707]: I1127 17:05:26.616737 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-d7wf8" event={"ID":"d7e25e91-45af-4090-81ed-c43b14689af3","Type":"ContainerDied","Data":"b616233ce69ccf8a9c1c9d78b0b41fe204aab908291ca65e31053ceb4a93e3d6"} Nov 27 17:05:26 crc kubenswrapper[4707]: I1127 17:05:26.616786 4707 scope.go:117] "RemoveContainer" containerID="17f29f549398d61c57db2877db498415ed51af1d59d2c60b18c33f493a70c552" Nov 27 17:05:26 crc kubenswrapper[4707]: I1127 
17:05:26.616807 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-d7wf8" Nov 27 17:05:26 crc kubenswrapper[4707]: I1127 17:05:26.653180 4707 scope.go:117] "RemoveContainer" containerID="2bfecebff5bd55b7e1bb6e8d41840033f182c068259fb3448df3c86c059d9286" Nov 27 17:05:26 crc kubenswrapper[4707]: I1127 17:05:26.661380 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-d7wf8"] Nov 27 17:05:26 crc kubenswrapper[4707]: I1127 17:05:26.675897 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-d7wf8"] Nov 27 17:05:26 crc kubenswrapper[4707]: I1127 17:05:26.682972 4707 scope.go:117] "RemoveContainer" containerID="651483cfffee59085cc0e708c502fb082c7f4085aebcb6fa5af8f8ed20ca0351" Nov 27 17:05:26 crc kubenswrapper[4707]: I1127 17:05:26.728773 4707 scope.go:117] "RemoveContainer" containerID="17f29f549398d61c57db2877db498415ed51af1d59d2c60b18c33f493a70c552" Nov 27 17:05:26 crc kubenswrapper[4707]: E1127 17:05:26.730530 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"17f29f549398d61c57db2877db498415ed51af1d59d2c60b18c33f493a70c552\": container with ID starting with 17f29f549398d61c57db2877db498415ed51af1d59d2c60b18c33f493a70c552 not found: ID does not exist" containerID="17f29f549398d61c57db2877db498415ed51af1d59d2c60b18c33f493a70c552" Nov 27 17:05:26 crc kubenswrapper[4707]: I1127 17:05:26.730599 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"17f29f549398d61c57db2877db498415ed51af1d59d2c60b18c33f493a70c552"} err="failed to get container status \"17f29f549398d61c57db2877db498415ed51af1d59d2c60b18c33f493a70c552\": rpc error: code = NotFound desc = could not find container \"17f29f549398d61c57db2877db498415ed51af1d59d2c60b18c33f493a70c552\": container with ID starting with 
17f29f549398d61c57db2877db498415ed51af1d59d2c60b18c33f493a70c552 not found: ID does not exist" Nov 27 17:05:26 crc kubenswrapper[4707]: I1127 17:05:26.730649 4707 scope.go:117] "RemoveContainer" containerID="2bfecebff5bd55b7e1bb6e8d41840033f182c068259fb3448df3c86c059d9286" Nov 27 17:05:26 crc kubenswrapper[4707]: E1127 17:05:26.731086 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2bfecebff5bd55b7e1bb6e8d41840033f182c068259fb3448df3c86c059d9286\": container with ID starting with 2bfecebff5bd55b7e1bb6e8d41840033f182c068259fb3448df3c86c059d9286 not found: ID does not exist" containerID="2bfecebff5bd55b7e1bb6e8d41840033f182c068259fb3448df3c86c059d9286" Nov 27 17:05:26 crc kubenswrapper[4707]: I1127 17:05:26.731113 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2bfecebff5bd55b7e1bb6e8d41840033f182c068259fb3448df3c86c059d9286"} err="failed to get container status \"2bfecebff5bd55b7e1bb6e8d41840033f182c068259fb3448df3c86c059d9286\": rpc error: code = NotFound desc = could not find container \"2bfecebff5bd55b7e1bb6e8d41840033f182c068259fb3448df3c86c059d9286\": container with ID starting with 2bfecebff5bd55b7e1bb6e8d41840033f182c068259fb3448df3c86c059d9286 not found: ID does not exist" Nov 27 17:05:26 crc kubenswrapper[4707]: I1127 17:05:26.731132 4707 scope.go:117] "RemoveContainer" containerID="651483cfffee59085cc0e708c502fb082c7f4085aebcb6fa5af8f8ed20ca0351" Nov 27 17:05:26 crc kubenswrapper[4707]: E1127 17:05:26.731420 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"651483cfffee59085cc0e708c502fb082c7f4085aebcb6fa5af8f8ed20ca0351\": container with ID starting with 651483cfffee59085cc0e708c502fb082c7f4085aebcb6fa5af8f8ed20ca0351 not found: ID does not exist" containerID="651483cfffee59085cc0e708c502fb082c7f4085aebcb6fa5af8f8ed20ca0351" Nov 27 17:05:26 crc 
kubenswrapper[4707]: I1127 17:05:26.731447 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"651483cfffee59085cc0e708c502fb082c7f4085aebcb6fa5af8f8ed20ca0351"} err="failed to get container status \"651483cfffee59085cc0e708c502fb082c7f4085aebcb6fa5af8f8ed20ca0351\": rpc error: code = NotFound desc = could not find container \"651483cfffee59085cc0e708c502fb082c7f4085aebcb6fa5af8f8ed20ca0351\": container with ID starting with 651483cfffee59085cc0e708c502fb082c7f4085aebcb6fa5af8f8ed20ca0351 not found: ID does not exist" Nov 27 17:05:27 crc kubenswrapper[4707]: I1127 17:05:27.211164 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d7e25e91-45af-4090-81ed-c43b14689af3" path="/var/lib/kubelet/pods/d7e25e91-45af-4090-81ed-c43b14689af3/volumes" Nov 27 17:05:33 crc kubenswrapper[4707]: I1127 17:05:33.623803 4707 patch_prober.go:28] interesting pod/machine-config-daemon-c995m container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 27 17:05:33 crc kubenswrapper[4707]: I1127 17:05:33.624827 4707 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-c995m" podUID="a83beb0d-8dd1-434a-ace2-933f98e3956f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 27 17:05:33 crc kubenswrapper[4707]: I1127 17:05:33.624898 4707 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-c995m" Nov 27 17:05:33 crc kubenswrapper[4707]: I1127 17:05:33.626211 4707 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" 
containerStatusID={"Type":"cri-o","ID":"786824b12a8a2896c46704bdf83468e2fc0bd47b51cb67ff29cb9d7963ea6d9c"} pod="openshift-machine-config-operator/machine-config-daemon-c995m" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Nov 27 17:05:33 crc kubenswrapper[4707]: I1127 17:05:33.626319 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-c995m" podUID="a83beb0d-8dd1-434a-ace2-933f98e3956f" containerName="machine-config-daemon" containerID="cri-o://786824b12a8a2896c46704bdf83468e2fc0bd47b51cb67ff29cb9d7963ea6d9c" gracePeriod=600 Nov 27 17:05:33 crc kubenswrapper[4707]: E1127 17:05:33.752925 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-c995m_openshift-machine-config-operator(a83beb0d-8dd1-434a-ace2-933f98e3956f)\"" pod="openshift-machine-config-operator/machine-config-daemon-c995m" podUID="a83beb0d-8dd1-434a-ace2-933f98e3956f" Nov 27 17:05:34 crc kubenswrapper[4707]: I1127 17:05:34.708989 4707 generic.go:334] "Generic (PLEG): container finished" podID="a83beb0d-8dd1-434a-ace2-933f98e3956f" containerID="786824b12a8a2896c46704bdf83468e2fc0bd47b51cb67ff29cb9d7963ea6d9c" exitCode=0 Nov 27 17:05:34 crc kubenswrapper[4707]: I1127 17:05:34.709030 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-c995m" event={"ID":"a83beb0d-8dd1-434a-ace2-933f98e3956f","Type":"ContainerDied","Data":"786824b12a8a2896c46704bdf83468e2fc0bd47b51cb67ff29cb9d7963ea6d9c"} Nov 27 17:05:34 crc kubenswrapper[4707]: I1127 17:05:34.709086 4707 scope.go:117] "RemoveContainer" containerID="abc6bd7ef76faec628dfd8360ab3a8cc28efd5faac28531e1a0d9c9576ea3f86" Nov 27 17:05:34 crc kubenswrapper[4707]: I1127 17:05:34.709946 4707 
scope.go:117] "RemoveContainer" containerID="786824b12a8a2896c46704bdf83468e2fc0bd47b51cb67ff29cb9d7963ea6d9c" Nov 27 17:05:34 crc kubenswrapper[4707]: E1127 17:05:34.710439 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-c995m_openshift-machine-config-operator(a83beb0d-8dd1-434a-ace2-933f98e3956f)\"" pod="openshift-machine-config-operator/machine-config-daemon-c995m" podUID="a83beb0d-8dd1-434a-ace2-933f98e3956f" Nov 27 17:05:44 crc kubenswrapper[4707]: I1127 17:05:44.261182 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-8665cb7d49-w7xqm_94ba089c-d890-4d57-abc0-258a2b54a6f9/manager/0.log" Nov 27 17:05:46 crc kubenswrapper[4707]: I1127 17:05:46.195651 4707 scope.go:117] "RemoveContainer" containerID="786824b12a8a2896c46704bdf83468e2fc0bd47b51cb67ff29cb9d7963ea6d9c" Nov 27 17:05:46 crc kubenswrapper[4707]: E1127 17:05:46.196424 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-c995m_openshift-machine-config-operator(a83beb0d-8dd1-434a-ace2-933f98e3956f)\"" pod="openshift-machine-config-operator/machine-config-daemon-c995m" podUID="a83beb0d-8dd1-434a-ace2-933f98e3956f" Nov 27 17:05:59 crc kubenswrapper[4707]: I1127 17:05:59.195183 4707 scope.go:117] "RemoveContainer" containerID="786824b12a8a2896c46704bdf83468e2fc0bd47b51cb67ff29cb9d7963ea6d9c" Nov 27 17:05:59 crc kubenswrapper[4707]: E1127 17:05:59.195854 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-c995m_openshift-machine-config-operator(a83beb0d-8dd1-434a-ace2-933f98e3956f)\"" pod="openshift-machine-config-operator/machine-config-daemon-c995m" podUID="a83beb0d-8dd1-434a-ace2-933f98e3956f" Nov 27 17:06:03 crc kubenswrapper[4707]: I1127 17:06:03.207563 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-bhqp9/must-gather-t2zcf"] Nov 27 17:06:03 crc kubenswrapper[4707]: E1127 17:06:03.208379 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d7e25e91-45af-4090-81ed-c43b14689af3" containerName="extract-content" Nov 27 17:06:03 crc kubenswrapper[4707]: I1127 17:06:03.208392 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="d7e25e91-45af-4090-81ed-c43b14689af3" containerName="extract-content" Nov 27 17:06:03 crc kubenswrapper[4707]: E1127 17:06:03.208403 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d7e25e91-45af-4090-81ed-c43b14689af3" containerName="registry-server" Nov 27 17:06:03 crc kubenswrapper[4707]: I1127 17:06:03.208409 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="d7e25e91-45af-4090-81ed-c43b14689af3" containerName="registry-server" Nov 27 17:06:03 crc kubenswrapper[4707]: E1127 17:06:03.208425 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d7e25e91-45af-4090-81ed-c43b14689af3" containerName="extract-utilities" Nov 27 17:06:03 crc kubenswrapper[4707]: I1127 17:06:03.208432 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="d7e25e91-45af-4090-81ed-c43b14689af3" containerName="extract-utilities" Nov 27 17:06:03 crc kubenswrapper[4707]: I1127 17:06:03.208642 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="d7e25e91-45af-4090-81ed-c43b14689af3" containerName="registry-server" Nov 27 17:06:03 crc kubenswrapper[4707]: I1127 17:06:03.210070 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-bhqp9/must-gather-t2zcf"] Nov 27 17:06:03 crc kubenswrapper[4707]: 
I1127 17:06:03.210180 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-bhqp9/must-gather-t2zcf" Nov 27 17:06:03 crc kubenswrapper[4707]: I1127 17:06:03.214135 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-bhqp9"/"openshift-service-ca.crt" Nov 27 17:06:03 crc kubenswrapper[4707]: I1127 17:06:03.214344 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-bhqp9"/"kube-root-ca.crt" Nov 27 17:06:03 crc kubenswrapper[4707]: I1127 17:06:03.214713 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-bhqp9"/"default-dockercfg-9n4lf" Nov 27 17:06:03 crc kubenswrapper[4707]: I1127 17:06:03.326010 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/01e74cc9-d6b3-4a7d-b1e3-a477fc1cf8cd-must-gather-output\") pod \"must-gather-t2zcf\" (UID: \"01e74cc9-d6b3-4a7d-b1e3-a477fc1cf8cd\") " pod="openshift-must-gather-bhqp9/must-gather-t2zcf" Nov 27 17:06:03 crc kubenswrapper[4707]: I1127 17:06:03.327204 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pmmjp\" (UniqueName: \"kubernetes.io/projected/01e74cc9-d6b3-4a7d-b1e3-a477fc1cf8cd-kube-api-access-pmmjp\") pod \"must-gather-t2zcf\" (UID: \"01e74cc9-d6b3-4a7d-b1e3-a477fc1cf8cd\") " pod="openshift-must-gather-bhqp9/must-gather-t2zcf" Nov 27 17:06:03 crc kubenswrapper[4707]: I1127 17:06:03.429092 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pmmjp\" (UniqueName: \"kubernetes.io/projected/01e74cc9-d6b3-4a7d-b1e3-a477fc1cf8cd-kube-api-access-pmmjp\") pod \"must-gather-t2zcf\" (UID: \"01e74cc9-d6b3-4a7d-b1e3-a477fc1cf8cd\") " pod="openshift-must-gather-bhqp9/must-gather-t2zcf" Nov 27 17:06:03 crc kubenswrapper[4707]: I1127 17:06:03.429250 
4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/01e74cc9-d6b3-4a7d-b1e3-a477fc1cf8cd-must-gather-output\") pod \"must-gather-t2zcf\" (UID: \"01e74cc9-d6b3-4a7d-b1e3-a477fc1cf8cd\") " pod="openshift-must-gather-bhqp9/must-gather-t2zcf" Nov 27 17:06:03 crc kubenswrapper[4707]: I1127 17:06:03.430148 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/01e74cc9-d6b3-4a7d-b1e3-a477fc1cf8cd-must-gather-output\") pod \"must-gather-t2zcf\" (UID: \"01e74cc9-d6b3-4a7d-b1e3-a477fc1cf8cd\") " pod="openshift-must-gather-bhqp9/must-gather-t2zcf" Nov 27 17:06:03 crc kubenswrapper[4707]: I1127 17:06:03.450183 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pmmjp\" (UniqueName: \"kubernetes.io/projected/01e74cc9-d6b3-4a7d-b1e3-a477fc1cf8cd-kube-api-access-pmmjp\") pod \"must-gather-t2zcf\" (UID: \"01e74cc9-d6b3-4a7d-b1e3-a477fc1cf8cd\") " pod="openshift-must-gather-bhqp9/must-gather-t2zcf" Nov 27 17:06:03 crc kubenswrapper[4707]: I1127 17:06:03.530715 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-bhqp9/must-gather-t2zcf" Nov 27 17:06:04 crc kubenswrapper[4707]: I1127 17:06:04.112089 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-bhqp9/must-gather-t2zcf"] Nov 27 17:06:04 crc kubenswrapper[4707]: I1127 17:06:04.131720 4707 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Nov 27 17:06:05 crc kubenswrapper[4707]: I1127 17:06:05.089756 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-bhqp9/must-gather-t2zcf" event={"ID":"01e74cc9-d6b3-4a7d-b1e3-a477fc1cf8cd","Type":"ContainerStarted","Data":"420ede41af57ca865b2608fdb20fe333b2649a3a084a0aeccfadac35fb7c2472"} Nov 27 17:06:11 crc kubenswrapper[4707]: I1127 17:06:11.166424 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-bhqp9/must-gather-t2zcf" event={"ID":"01e74cc9-d6b3-4a7d-b1e3-a477fc1cf8cd","Type":"ContainerStarted","Data":"566c365908703a7c822f755f595c62588dcca51048a9c498e97e1e1a7156589f"} Nov 27 17:06:11 crc kubenswrapper[4707]: I1127 17:06:11.168213 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-bhqp9/must-gather-t2zcf" event={"ID":"01e74cc9-d6b3-4a7d-b1e3-a477fc1cf8cd","Type":"ContainerStarted","Data":"de86bcc8a66e042585b3daed35e4bb8cb35d43512fec07bcd4fd3556c4a71f74"} Nov 27 17:06:11 crc kubenswrapper[4707]: I1127 17:06:11.194291 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-bhqp9/must-gather-t2zcf" podStartSLOduration=1.892982955 podStartE2EDuration="8.194270116s" podCreationTimestamp="2025-11-27 17:06:03 +0000 UTC" firstStartedPulling="2025-11-27 17:06:04.131432612 +0000 UTC m=+3739.762881380" lastFinishedPulling="2025-11-27 17:06:10.432719773 +0000 UTC m=+3746.064168541" observedRunningTime="2025-11-27 17:06:11.187221523 +0000 UTC m=+3746.818670291" watchObservedRunningTime="2025-11-27 17:06:11.194270116 +0000 UTC 
m=+3746.825718884" Nov 27 17:06:13 crc kubenswrapper[4707]: I1127 17:06:13.195098 4707 scope.go:117] "RemoveContainer" containerID="786824b12a8a2896c46704bdf83468e2fc0bd47b51cb67ff29cb9d7963ea6d9c" Nov 27 17:06:13 crc kubenswrapper[4707]: E1127 17:06:13.195768 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-c995m_openshift-machine-config-operator(a83beb0d-8dd1-434a-ace2-933f98e3956f)\"" pod="openshift-machine-config-operator/machine-config-daemon-c995m" podUID="a83beb0d-8dd1-434a-ace2-933f98e3956f" Nov 27 17:06:15 crc kubenswrapper[4707]: I1127 17:06:15.208839 4707 scope.go:117] "RemoveContainer" containerID="ae59ec07927f28e7a40b11d5fff90f71cfb0e05903ac5927ea79312ebddd67b9" Nov 27 17:06:15 crc kubenswrapper[4707]: I1127 17:06:15.238479 4707 scope.go:117] "RemoveContainer" containerID="ea67e6957ec36272d33a3773f89596ac9880bf42fe7044403eee87d61c828a95" Nov 27 17:06:15 crc kubenswrapper[4707]: I1127 17:06:15.260209 4707 scope.go:117] "RemoveContainer" containerID="1aee67f5cd4117fcdf0b27d0fa8a145b51466f56406e4bf42cba527ea1f87e2d" Nov 27 17:06:15 crc kubenswrapper[4707]: I1127 17:06:15.291741 4707 scope.go:117] "RemoveContainer" containerID="7aabd4fd4205c7083902d2bdf70c45a7e25d886d6ae0383d1ec414e6f4d6ea07" Nov 27 17:06:19 crc kubenswrapper[4707]: I1127 17:06:19.382396 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-bhqp9/crc-debug-m77jv"] Nov 27 17:06:19 crc kubenswrapper[4707]: I1127 17:06:19.384541 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-bhqp9/crc-debug-m77jv" Nov 27 17:06:19 crc kubenswrapper[4707]: I1127 17:06:19.493361 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/fcd3addd-6736-466b-8b22-e71424af1fb5-host\") pod \"crc-debug-m77jv\" (UID: \"fcd3addd-6736-466b-8b22-e71424af1fb5\") " pod="openshift-must-gather-bhqp9/crc-debug-m77jv" Nov 27 17:06:19 crc kubenswrapper[4707]: I1127 17:06:19.493635 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t2j8z\" (UniqueName: \"kubernetes.io/projected/fcd3addd-6736-466b-8b22-e71424af1fb5-kube-api-access-t2j8z\") pod \"crc-debug-m77jv\" (UID: \"fcd3addd-6736-466b-8b22-e71424af1fb5\") " pod="openshift-must-gather-bhqp9/crc-debug-m77jv" Nov 27 17:06:19 crc kubenswrapper[4707]: I1127 17:06:19.595739 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/fcd3addd-6736-466b-8b22-e71424af1fb5-host\") pod \"crc-debug-m77jv\" (UID: \"fcd3addd-6736-466b-8b22-e71424af1fb5\") " pod="openshift-must-gather-bhqp9/crc-debug-m77jv" Nov 27 17:06:19 crc kubenswrapper[4707]: I1127 17:06:19.595800 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t2j8z\" (UniqueName: \"kubernetes.io/projected/fcd3addd-6736-466b-8b22-e71424af1fb5-kube-api-access-t2j8z\") pod \"crc-debug-m77jv\" (UID: \"fcd3addd-6736-466b-8b22-e71424af1fb5\") " pod="openshift-must-gather-bhqp9/crc-debug-m77jv" Nov 27 17:06:19 crc kubenswrapper[4707]: I1127 17:06:19.596279 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/fcd3addd-6736-466b-8b22-e71424af1fb5-host\") pod \"crc-debug-m77jv\" (UID: \"fcd3addd-6736-466b-8b22-e71424af1fb5\") " pod="openshift-must-gather-bhqp9/crc-debug-m77jv" Nov 27 17:06:19 crc 
kubenswrapper[4707]: I1127 17:06:19.614746 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t2j8z\" (UniqueName: \"kubernetes.io/projected/fcd3addd-6736-466b-8b22-e71424af1fb5-kube-api-access-t2j8z\") pod \"crc-debug-m77jv\" (UID: \"fcd3addd-6736-466b-8b22-e71424af1fb5\") " pod="openshift-must-gather-bhqp9/crc-debug-m77jv" Nov 27 17:06:19 crc kubenswrapper[4707]: I1127 17:06:19.705686 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-bhqp9/crc-debug-m77jv" Nov 27 17:06:19 crc kubenswrapper[4707]: W1127 17:06:19.749071 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfcd3addd_6736_466b_8b22_e71424af1fb5.slice/crio-83b001e56b7e4e2e7b9c364e1bb7ba41cdb839a0bbb74e2d11ff8c656ccbcaf0 WatchSource:0}: Error finding container 83b001e56b7e4e2e7b9c364e1bb7ba41cdb839a0bbb74e2d11ff8c656ccbcaf0: Status 404 returned error can't find the container with id 83b001e56b7e4e2e7b9c364e1bb7ba41cdb839a0bbb74e2d11ff8c656ccbcaf0 Nov 27 17:06:20 crc kubenswrapper[4707]: I1127 17:06:20.276250 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-bhqp9/crc-debug-m77jv" event={"ID":"fcd3addd-6736-466b-8b22-e71424af1fb5","Type":"ContainerStarted","Data":"83b001e56b7e4e2e7b9c364e1bb7ba41cdb839a0bbb74e2d11ff8c656ccbcaf0"} Nov 27 17:06:25 crc kubenswrapper[4707]: I1127 17:06:25.203514 4707 scope.go:117] "RemoveContainer" containerID="786824b12a8a2896c46704bdf83468e2fc0bd47b51cb67ff29cb9d7963ea6d9c" Nov 27 17:06:25 crc kubenswrapper[4707]: E1127 17:06:25.204462 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-c995m_openshift-machine-config-operator(a83beb0d-8dd1-434a-ace2-933f98e3956f)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-c995m" podUID="a83beb0d-8dd1-434a-ace2-933f98e3956f" Nov 27 17:06:31 crc kubenswrapper[4707]: I1127 17:06:31.382266 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-bhqp9/crc-debug-m77jv" event={"ID":"fcd3addd-6736-466b-8b22-e71424af1fb5","Type":"ContainerStarted","Data":"9287b1eb3cdc0b2ca6c4589583eb166c5dba968e8d16bbaf0c59252b8a764009"} Nov 27 17:06:31 crc kubenswrapper[4707]: I1127 17:06:31.400894 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-bhqp9/crc-debug-m77jv" podStartSLOduration=1.441199409 podStartE2EDuration="12.400869874s" podCreationTimestamp="2025-11-27 17:06:19 +0000 UTC" firstStartedPulling="2025-11-27 17:06:19.750956937 +0000 UTC m=+3755.382405705" lastFinishedPulling="2025-11-27 17:06:30.710627402 +0000 UTC m=+3766.342076170" observedRunningTime="2025-11-27 17:06:31.393735409 +0000 UTC m=+3767.025184177" watchObservedRunningTime="2025-11-27 17:06:31.400869874 +0000 UTC m=+3767.032318642" Nov 27 17:06:37 crc kubenswrapper[4707]: I1127 17:06:37.195846 4707 scope.go:117] "RemoveContainer" containerID="786824b12a8a2896c46704bdf83468e2fc0bd47b51cb67ff29cb9d7963ea6d9c" Nov 27 17:06:37 crc kubenswrapper[4707]: E1127 17:06:37.196695 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-c995m_openshift-machine-config-operator(a83beb0d-8dd1-434a-ace2-933f98e3956f)\"" pod="openshift-machine-config-operator/machine-config-daemon-c995m" podUID="a83beb0d-8dd1-434a-ace2-933f98e3956f" Nov 27 17:06:48 crc kubenswrapper[4707]: I1127 17:06:48.569991 4707 generic.go:334] "Generic (PLEG): container finished" podID="fcd3addd-6736-466b-8b22-e71424af1fb5" containerID="9287b1eb3cdc0b2ca6c4589583eb166c5dba968e8d16bbaf0c59252b8a764009" exitCode=0 Nov 
27 17:06:48 crc kubenswrapper[4707]: I1127 17:06:48.570089 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-bhqp9/crc-debug-m77jv" event={"ID":"fcd3addd-6736-466b-8b22-e71424af1fb5","Type":"ContainerDied","Data":"9287b1eb3cdc0b2ca6c4589583eb166c5dba968e8d16bbaf0c59252b8a764009"} Nov 27 17:06:49 crc kubenswrapper[4707]: I1127 17:06:49.754094 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-bhqp9/crc-debug-m77jv" Nov 27 17:06:49 crc kubenswrapper[4707]: I1127 17:06:49.797763 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-bhqp9/crc-debug-m77jv"] Nov 27 17:06:49 crc kubenswrapper[4707]: I1127 17:06:49.807965 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-bhqp9/crc-debug-m77jv"] Nov 27 17:06:49 crc kubenswrapper[4707]: I1127 17:06:49.829006 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t2j8z\" (UniqueName: \"kubernetes.io/projected/fcd3addd-6736-466b-8b22-e71424af1fb5-kube-api-access-t2j8z\") pod \"fcd3addd-6736-466b-8b22-e71424af1fb5\" (UID: \"fcd3addd-6736-466b-8b22-e71424af1fb5\") " Nov 27 17:06:49 crc kubenswrapper[4707]: I1127 17:06:49.829159 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/fcd3addd-6736-466b-8b22-e71424af1fb5-host\") pod \"fcd3addd-6736-466b-8b22-e71424af1fb5\" (UID: \"fcd3addd-6736-466b-8b22-e71424af1fb5\") " Nov 27 17:06:49 crc kubenswrapper[4707]: I1127 17:06:49.829249 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fcd3addd-6736-466b-8b22-e71424af1fb5-host" (OuterVolumeSpecName: "host") pod "fcd3addd-6736-466b-8b22-e71424af1fb5" (UID: "fcd3addd-6736-466b-8b22-e71424af1fb5"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 27 17:06:49 crc kubenswrapper[4707]: I1127 17:06:49.829789 4707 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/fcd3addd-6736-466b-8b22-e71424af1fb5-host\") on node \"crc\" DevicePath \"\"" Nov 27 17:06:49 crc kubenswrapper[4707]: I1127 17:06:49.835141 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fcd3addd-6736-466b-8b22-e71424af1fb5-kube-api-access-t2j8z" (OuterVolumeSpecName: "kube-api-access-t2j8z") pod "fcd3addd-6736-466b-8b22-e71424af1fb5" (UID: "fcd3addd-6736-466b-8b22-e71424af1fb5"). InnerVolumeSpecName "kube-api-access-t2j8z". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 17:06:49 crc kubenswrapper[4707]: I1127 17:06:49.931139 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t2j8z\" (UniqueName: \"kubernetes.io/projected/fcd3addd-6736-466b-8b22-e71424af1fb5-kube-api-access-t2j8z\") on node \"crc\" DevicePath \"\"" Nov 27 17:06:50 crc kubenswrapper[4707]: I1127 17:06:50.594109 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="83b001e56b7e4e2e7b9c364e1bb7ba41cdb839a0bbb74e2d11ff8c656ccbcaf0" Nov 27 17:06:50 crc kubenswrapper[4707]: I1127 17:06:50.594647 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-bhqp9/crc-debug-m77jv" Nov 27 17:06:50 crc kubenswrapper[4707]: I1127 17:06:50.989251 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-bhqp9/crc-debug-g74lj"] Nov 27 17:06:50 crc kubenswrapper[4707]: E1127 17:06:50.989773 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fcd3addd-6736-466b-8b22-e71424af1fb5" containerName="container-00" Nov 27 17:06:50 crc kubenswrapper[4707]: I1127 17:06:50.989790 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="fcd3addd-6736-466b-8b22-e71424af1fb5" containerName="container-00" Nov 27 17:06:50 crc kubenswrapper[4707]: I1127 17:06:50.989978 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="fcd3addd-6736-466b-8b22-e71424af1fb5" containerName="container-00" Nov 27 17:06:50 crc kubenswrapper[4707]: I1127 17:06:50.990813 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-bhqp9/crc-debug-g74lj" Nov 27 17:06:51 crc kubenswrapper[4707]: I1127 17:06:51.054432 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/52c212b6-24e6-48a9-b435-43dad584eac8-host\") pod \"crc-debug-g74lj\" (UID: \"52c212b6-24e6-48a9-b435-43dad584eac8\") " pod="openshift-must-gather-bhqp9/crc-debug-g74lj" Nov 27 17:06:51 crc kubenswrapper[4707]: I1127 17:06:51.054600 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vj49c\" (UniqueName: \"kubernetes.io/projected/52c212b6-24e6-48a9-b435-43dad584eac8-kube-api-access-vj49c\") pod \"crc-debug-g74lj\" (UID: \"52c212b6-24e6-48a9-b435-43dad584eac8\") " pod="openshift-must-gather-bhqp9/crc-debug-g74lj" Nov 27 17:06:51 crc kubenswrapper[4707]: I1127 17:06:51.156518 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vj49c\" (UniqueName: 
\"kubernetes.io/projected/52c212b6-24e6-48a9-b435-43dad584eac8-kube-api-access-vj49c\") pod \"crc-debug-g74lj\" (UID: \"52c212b6-24e6-48a9-b435-43dad584eac8\") " pod="openshift-must-gather-bhqp9/crc-debug-g74lj" Nov 27 17:06:51 crc kubenswrapper[4707]: I1127 17:06:51.157221 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/52c212b6-24e6-48a9-b435-43dad584eac8-host\") pod \"crc-debug-g74lj\" (UID: \"52c212b6-24e6-48a9-b435-43dad584eac8\") " pod="openshift-must-gather-bhqp9/crc-debug-g74lj" Nov 27 17:06:51 crc kubenswrapper[4707]: I1127 17:06:51.157337 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/52c212b6-24e6-48a9-b435-43dad584eac8-host\") pod \"crc-debug-g74lj\" (UID: \"52c212b6-24e6-48a9-b435-43dad584eac8\") " pod="openshift-must-gather-bhqp9/crc-debug-g74lj" Nov 27 17:06:51 crc kubenswrapper[4707]: I1127 17:06:51.194972 4707 scope.go:117] "RemoveContainer" containerID="786824b12a8a2896c46704bdf83468e2fc0bd47b51cb67ff29cb9d7963ea6d9c" Nov 27 17:06:51 crc kubenswrapper[4707]: E1127 17:06:51.195353 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-c995m_openshift-machine-config-operator(a83beb0d-8dd1-434a-ace2-933f98e3956f)\"" pod="openshift-machine-config-operator/machine-config-daemon-c995m" podUID="a83beb0d-8dd1-434a-ace2-933f98e3956f" Nov 27 17:06:51 crc kubenswrapper[4707]: I1127 17:06:51.208895 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fcd3addd-6736-466b-8b22-e71424af1fb5" path="/var/lib/kubelet/pods/fcd3addd-6736-466b-8b22-e71424af1fb5/volumes" Nov 27 17:06:51 crc kubenswrapper[4707]: I1127 17:06:51.259882 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-vj49c\" (UniqueName: \"kubernetes.io/projected/52c212b6-24e6-48a9-b435-43dad584eac8-kube-api-access-vj49c\") pod \"crc-debug-g74lj\" (UID: \"52c212b6-24e6-48a9-b435-43dad584eac8\") " pod="openshift-must-gather-bhqp9/crc-debug-g74lj" Nov 27 17:06:51 crc kubenswrapper[4707]: I1127 17:06:51.305589 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-bhqp9/crc-debug-g74lj" Nov 27 17:06:51 crc kubenswrapper[4707]: W1127 17:06:51.347203 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod52c212b6_24e6_48a9_b435_43dad584eac8.slice/crio-27e56c34e9bdf078c43b452ffd7e12fc54d9bd72192a8161042027741790ec56 WatchSource:0}: Error finding container 27e56c34e9bdf078c43b452ffd7e12fc54d9bd72192a8161042027741790ec56: Status 404 returned error can't find the container with id 27e56c34e9bdf078c43b452ffd7e12fc54d9bd72192a8161042027741790ec56 Nov 27 17:06:51 crc kubenswrapper[4707]: I1127 17:06:51.603314 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-bhqp9/crc-debug-g74lj" event={"ID":"52c212b6-24e6-48a9-b435-43dad584eac8","Type":"ContainerStarted","Data":"0c6b488dcd151c1e1a8fd61a27f1eed56a0d49c1357fa60c84e497854fe9f0a0"} Nov 27 17:06:51 crc kubenswrapper[4707]: I1127 17:06:51.603685 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-bhqp9/crc-debug-g74lj" event={"ID":"52c212b6-24e6-48a9-b435-43dad584eac8","Type":"ContainerStarted","Data":"27e56c34e9bdf078c43b452ffd7e12fc54d9bd72192a8161042027741790ec56"} Nov 27 17:06:51 crc kubenswrapper[4707]: I1127 17:06:51.644168 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-bhqp9/crc-debug-g74lj"] Nov 27 17:06:51 crc kubenswrapper[4707]: I1127 17:06:51.654904 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-bhqp9/crc-debug-g74lj"] Nov 27 17:06:52 crc kubenswrapper[4707]: 
I1127 17:06:52.614286 4707 generic.go:334] "Generic (PLEG): container finished" podID="52c212b6-24e6-48a9-b435-43dad584eac8" containerID="0c6b488dcd151c1e1a8fd61a27f1eed56a0d49c1357fa60c84e497854fe9f0a0" exitCode=1 Nov 27 17:06:52 crc kubenswrapper[4707]: I1127 17:06:52.709372 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-bhqp9/crc-debug-g74lj" Nov 27 17:06:52 crc kubenswrapper[4707]: I1127 17:06:52.786716 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vj49c\" (UniqueName: \"kubernetes.io/projected/52c212b6-24e6-48a9-b435-43dad584eac8-kube-api-access-vj49c\") pod \"52c212b6-24e6-48a9-b435-43dad584eac8\" (UID: \"52c212b6-24e6-48a9-b435-43dad584eac8\") " Nov 27 17:06:52 crc kubenswrapper[4707]: I1127 17:06:52.787026 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/52c212b6-24e6-48a9-b435-43dad584eac8-host\") pod \"52c212b6-24e6-48a9-b435-43dad584eac8\" (UID: \"52c212b6-24e6-48a9-b435-43dad584eac8\") " Nov 27 17:06:52 crc kubenswrapper[4707]: I1127 17:06:52.787118 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/52c212b6-24e6-48a9-b435-43dad584eac8-host" (OuterVolumeSpecName: "host") pod "52c212b6-24e6-48a9-b435-43dad584eac8" (UID: "52c212b6-24e6-48a9-b435-43dad584eac8"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 27 17:06:52 crc kubenswrapper[4707]: I1127 17:06:52.787546 4707 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/52c212b6-24e6-48a9-b435-43dad584eac8-host\") on node \"crc\" DevicePath \"\"" Nov 27 17:06:52 crc kubenswrapper[4707]: I1127 17:06:52.793602 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/52c212b6-24e6-48a9-b435-43dad584eac8-kube-api-access-vj49c" (OuterVolumeSpecName: "kube-api-access-vj49c") pod "52c212b6-24e6-48a9-b435-43dad584eac8" (UID: "52c212b6-24e6-48a9-b435-43dad584eac8"). InnerVolumeSpecName "kube-api-access-vj49c". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 17:06:52 crc kubenswrapper[4707]: I1127 17:06:52.889833 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vj49c\" (UniqueName: \"kubernetes.io/projected/52c212b6-24e6-48a9-b435-43dad584eac8-kube-api-access-vj49c\") on node \"crc\" DevicePath \"\"" Nov 27 17:06:53 crc kubenswrapper[4707]: I1127 17:06:53.207774 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="52c212b6-24e6-48a9-b435-43dad584eac8" path="/var/lib/kubelet/pods/52c212b6-24e6-48a9-b435-43dad584eac8/volumes" Nov 27 17:06:53 crc kubenswrapper[4707]: I1127 17:06:53.625642 4707 scope.go:117] "RemoveContainer" containerID="0c6b488dcd151c1e1a8fd61a27f1eed56a0d49c1357fa60c84e497854fe9f0a0" Nov 27 17:06:53 crc kubenswrapper[4707]: I1127 17:06:53.625671 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-bhqp9/crc-debug-g74lj" Nov 27 17:07:04 crc kubenswrapper[4707]: I1127 17:07:04.195259 4707 scope.go:117] "RemoveContainer" containerID="786824b12a8a2896c46704bdf83468e2fc0bd47b51cb67ff29cb9d7963ea6d9c" Nov 27 17:07:04 crc kubenswrapper[4707]: E1127 17:07:04.196827 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-c995m_openshift-machine-config-operator(a83beb0d-8dd1-434a-ace2-933f98e3956f)\"" pod="openshift-machine-config-operator/machine-config-daemon-c995m" podUID="a83beb0d-8dd1-434a-ace2-933f98e3956f" Nov 27 17:07:18 crc kubenswrapper[4707]: I1127 17:07:18.196513 4707 scope.go:117] "RemoveContainer" containerID="786824b12a8a2896c46704bdf83468e2fc0bd47b51cb67ff29cb9d7963ea6d9c" Nov 27 17:07:18 crc kubenswrapper[4707]: E1127 17:07:18.197284 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-c995m_openshift-machine-config-operator(a83beb0d-8dd1-434a-ace2-933f98e3956f)\"" pod="openshift-machine-config-operator/machine-config-daemon-c995m" podUID="a83beb0d-8dd1-434a-ace2-933f98e3956f" Nov 27 17:07:33 crc kubenswrapper[4707]: I1127 17:07:33.195981 4707 scope.go:117] "RemoveContainer" containerID="786824b12a8a2896c46704bdf83468e2fc0bd47b51cb67ff29cb9d7963ea6d9c" Nov 27 17:07:33 crc kubenswrapper[4707]: E1127 17:07:33.204228 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-c995m_openshift-machine-config-operator(a83beb0d-8dd1-434a-ace2-933f98e3956f)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-c995m" podUID="a83beb0d-8dd1-434a-ace2-933f98e3956f" Nov 27 17:07:42 crc kubenswrapper[4707]: I1127 17:07:42.787841 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_alertmanager-metric-storage-0_b6ba41a2-cc99-4242-be96-b249bc657b2f/init-config-reloader/0.log" Nov 27 17:07:42 crc kubenswrapper[4707]: I1127 17:07:42.957204 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_alertmanager-metric-storage-0_b6ba41a2-cc99-4242-be96-b249bc657b2f/init-config-reloader/0.log" Nov 27 17:07:42 crc kubenswrapper[4707]: I1127 17:07:42.985405 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_alertmanager-metric-storage-0_b6ba41a2-cc99-4242-be96-b249bc657b2f/alertmanager/0.log" Nov 27 17:07:43 crc kubenswrapper[4707]: I1127 17:07:43.019891 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_alertmanager-metric-storage-0_b6ba41a2-cc99-4242-be96-b249bc657b2f/config-reloader/0.log" Nov 27 17:07:43 crc kubenswrapper[4707]: I1127 17:07:43.167290 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_aodh-0_6282f04c-ddbf-46d2-a5ac-ba7550ff2559/aodh-api/0.log" Nov 27 17:07:43 crc kubenswrapper[4707]: I1127 17:07:43.201225 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_aodh-0_6282f04c-ddbf-46d2-a5ac-ba7550ff2559/aodh-evaluator/0.log" Nov 27 17:07:43 crc kubenswrapper[4707]: I1127 17:07:43.232941 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_aodh-0_6282f04c-ddbf-46d2-a5ac-ba7550ff2559/aodh-listener/0.log" Nov 27 17:07:43 crc kubenswrapper[4707]: I1127 17:07:43.314716 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_aodh-0_6282f04c-ddbf-46d2-a5ac-ba7550ff2559/aodh-notifier/0.log" Nov 27 17:07:43 crc kubenswrapper[4707]: I1127 17:07:43.421721 4707 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_barbican-api-c695745c6-j5ntf_1050304d-e51e-4b02-9cec-828bb7d406bf/barbican-api/0.log" Nov 27 17:07:43 crc kubenswrapper[4707]: I1127 17:07:43.437763 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-c695745c6-j5ntf_1050304d-e51e-4b02-9cec-828bb7d406bf/barbican-api-log/0.log" Nov 27 17:07:43 crc kubenswrapper[4707]: I1127 17:07:43.639522 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-6c948b485d-2wcq8_d1df02f7-1e71-4fae-a6df-cb3c83460a7e/barbican-keystone-listener/0.log" Nov 27 17:07:43 crc kubenswrapper[4707]: I1127 17:07:43.691215 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-6c948b485d-2wcq8_d1df02f7-1e71-4fae-a6df-cb3c83460a7e/barbican-keystone-listener-log/0.log" Nov 27 17:07:43 crc kubenswrapper[4707]: I1127 17:07:43.800430 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-567875797-q64rz_e1955b85-6ed8-492c-8001-c4fc20da8270/barbican-worker/0.log" Nov 27 17:07:43 crc kubenswrapper[4707]: I1127 17:07:43.877253 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-567875797-q64rz_e1955b85-6ed8-492c-8001-c4fc20da8270/barbican-worker-log/0.log" Nov 27 17:07:43 crc kubenswrapper[4707]: I1127 17:07:43.950843 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_bootstrap-edpm-deployment-openstack-edpm-ipam-fmszh_a0d367b5-ffe7-4a0d-9216-ddeb28aa8c8f/bootstrap-edpm-deployment-openstack-edpm-ipam/0.log" Nov 27 17:07:44 crc kubenswrapper[4707]: I1127 17:07:44.096282 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_907095f4-0cd3-4e69-8f3f-fa908be6b6d0/ceilometer-central-agent/0.log" Nov 27 17:07:44 crc kubenswrapper[4707]: I1127 17:07:44.139405 4707 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_ceilometer-0_907095f4-0cd3-4e69-8f3f-fa908be6b6d0/ceilometer-notification-agent/0.log" Nov 27 17:07:44 crc kubenswrapper[4707]: I1127 17:07:44.168538 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_907095f4-0cd3-4e69-8f3f-fa908be6b6d0/proxy-httpd/0.log" Nov 27 17:07:44 crc kubenswrapper[4707]: I1127 17:07:44.272023 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_907095f4-0cd3-4e69-8f3f-fa908be6b6d0/sg-core/0.log" Nov 27 17:07:44 crc kubenswrapper[4707]: I1127 17:07:44.581302 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_98b666a0-7de5-45af-b604-c6fa48371681/cinder-api/0.log" Nov 27 17:07:44 crc kubenswrapper[4707]: I1127 17:07:44.657941 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_98b666a0-7de5-45af-b604-c6fa48371681/cinder-api-log/0.log" Nov 27 17:07:44 crc kubenswrapper[4707]: I1127 17:07:44.773067 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_7198c845-6481-4a99-b508-b3da40447ba6/cinder-scheduler/0.log" Nov 27 17:07:44 crc kubenswrapper[4707]: I1127 17:07:44.876969 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_7198c845-6481-4a99-b508-b3da40447ba6/probe/0.log" Nov 27 17:07:44 crc kubenswrapper[4707]: I1127 17:07:44.958975 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-network-edpm-deployment-openstack-edpm-ipam-4b6vs_86c589a6-19e5-48cc-8db8-42af5ae0f078/configure-network-edpm-deployment-openstack-edpm-ipam/0.log" Nov 27 17:07:45 crc kubenswrapper[4707]: I1127 17:07:45.084148 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-4kq59_26ad523c-9a7e-437b-a8e5-1b72a0a90d19/configure-os-edpm-deployment-openstack-edpm-ipam/0.log" Nov 27 17:07:45 crc kubenswrapper[4707]: I1127 
17:07:45.206754 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-6f6df4f56c-pxkhr_3f07f078-fb9b-425d-9575-520a406e4178/init/0.log" Nov 27 17:07:45 crc kubenswrapper[4707]: I1127 17:07:45.375277 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-6f6df4f56c-pxkhr_3f07f078-fb9b-425d-9575-520a406e4178/init/0.log" Nov 27 17:07:45 crc kubenswrapper[4707]: I1127 17:07:45.398724 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-6f6df4f56c-pxkhr_3f07f078-fb9b-425d-9575-520a406e4178/dnsmasq-dns/0.log" Nov 27 17:07:45 crc kubenswrapper[4707]: I1127 17:07:45.513872 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_download-cache-edpm-deployment-openstack-edpm-ipam-mwkpl_540a037b-eddb-4f11-8ed0-209cebfc0ee1/download-cache-edpm-deployment-openstack-edpm-ipam/0.log" Nov 27 17:07:45 crc kubenswrapper[4707]: I1127 17:07:45.618954 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_f4ece211-479a-4f06-bc88-b9e50c0671f4/glance-httpd/0.log" Nov 27 17:07:45 crc kubenswrapper[4707]: I1127 17:07:45.686931 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_f4ece211-479a-4f06-bc88-b9e50c0671f4/glance-log/0.log" Nov 27 17:07:45 crc kubenswrapper[4707]: I1127 17:07:45.830259 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_24849992-202f-4439-ac0d-241724235be4/glance-httpd/0.log" Nov 27 17:07:45 crc kubenswrapper[4707]: I1127 17:07:45.852738 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_24849992-202f-4439-ac0d-241724235be4/glance-log/0.log" Nov 27 17:07:46 crc kubenswrapper[4707]: I1127 17:07:46.255041 4707 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_heat-api-57bc8fcfc9-trdbf_c001365b-7c18-4d58-b516-a038ef2d6c8c/heat-api/0.log" Nov 27 17:07:46 crc kubenswrapper[4707]: I1127 17:07:46.416199 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_heat-engine-7597fbc9fb-5l66n_b92f69c5-3b78-463b-bb1a-7728d2cdb6ff/heat-engine/0.log" Nov 27 17:07:46 crc kubenswrapper[4707]: I1127 17:07:46.538055 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_heat-cfnapi-5c4f76f9fb-ghh99_7646872e-82fb-4df8-b7ce-176b3ba7fe8a/heat-cfnapi/0.log" Nov 27 17:07:46 crc kubenswrapper[4707]: I1127 17:07:46.973260 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-os-edpm-deployment-openstack-edpm-ipam-wgj2d_f3e34a79-7842-4f97-91f4-040a1b4e5b2b/install-os-edpm-deployment-openstack-edpm-ipam/0.log" Nov 27 17:07:46 crc kubenswrapper[4707]: I1127 17:07:46.991795 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-certs-edpm-deployment-openstack-edpm-ipam-bf29c_7382c94e-e799-4343-8548-7efd92ed66e8/install-certs-edpm-deployment-openstack-edpm-ipam/0.log" Nov 27 17:07:47 crc kubenswrapper[4707]: I1127 17:07:47.238633 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-cron-29404381-kd2qt_87152575-e530-4043-87fb-c7e50bfa9f00/keystone-cron/0.log" Nov 27 17:07:47 crc kubenswrapper[4707]: I1127 17:07:47.243342 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-85c75486d4-pmkdv_af1b5c91-e184-49fe-9ad9-83f047d5123d/keystone-api/0.log" Nov 27 17:07:47 crc kubenswrapper[4707]: I1127 17:07:47.284628 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_kube-state-metrics-0_fb6d94a9-d059-4cba-a6f3-8590d2491bb2/kube-state-metrics/0.log" Nov 27 17:07:47 crc kubenswrapper[4707]: I1127 17:07:47.477221 4707 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_libvirt-edpm-deployment-openstack-edpm-ipam-zjmfq_18fd2519-f36c-4817-85da-7615979c3340/libvirt-edpm-deployment-openstack-edpm-ipam/0.log" Nov 27 17:07:47 crc kubenswrapper[4707]: I1127 17:07:47.763798 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-69588f8b9-6plc2_5de06db5-fa17-40e5-a08d-9b7f139b08ed/neutron-httpd/0.log" Nov 27 17:07:47 crc kubenswrapper[4707]: I1127 17:07:47.776327 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-69588f8b9-6plc2_5de06db5-fa17-40e5-a08d-9b7f139b08ed/neutron-api/0.log" Nov 27 17:07:47 crc kubenswrapper[4707]: I1127 17:07:47.853328 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-metadata-edpm-deployment-openstack-edpm-ipam-jd52w_9f5df211-3c1b-45f3-9b61-a7fde58d8a39/neutron-metadata-edpm-deployment-openstack-edpm-ipam/0.log" Nov 27 17:07:48 crc kubenswrapper[4707]: I1127 17:07:48.195192 4707 scope.go:117] "RemoveContainer" containerID="786824b12a8a2896c46704bdf83468e2fc0bd47b51cb67ff29cb9d7963ea6d9c" Nov 27 17:07:48 crc kubenswrapper[4707]: E1127 17:07:48.195411 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-c995m_openshift-machine-config-operator(a83beb0d-8dd1-434a-ace2-933f98e3956f)\"" pod="openshift-machine-config-operator/machine-config-daemon-c995m" podUID="a83beb0d-8dd1-434a-ace2-933f98e3956f" Nov 27 17:07:48 crc kubenswrapper[4707]: I1127 17:07:48.422417 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_0e31f3ae-2d72-4ba4-bf44-660b172c5066/nova-api-log/0.log" Nov 27 17:07:48 crc kubenswrapper[4707]: I1127 17:07:48.588499 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_0e31f3ae-2d72-4ba4-bf44-660b172c5066/nova-api-api/0.log" Nov 27 17:07:48 crc 
kubenswrapper[4707]: I1127 17:07:48.969168 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-0_2d07ee1e-4a93-4fe6-909d-2cc2a11993c7/nova-cell0-conductor-conductor/0.log" Nov 27 17:07:49 crc kubenswrapper[4707]: I1127 17:07:49.018699 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_af3bfed8-f098-4557-a577-0a10317ee805/nova-cell1-conductor-conductor/0.log" Nov 27 17:07:49 crc kubenswrapper[4707]: I1127 17:07:49.225670 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-novncproxy-0_bd58d9e8-77d0-412d-b866-c10f989dc824/nova-cell1-novncproxy-novncproxy/0.log" Nov 27 17:07:49 crc kubenswrapper[4707]: I1127 17:07:49.357484 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-edpm-deployment-openstack-edpm-ipam-lxzx6_c0d7830e-74a5-4ea0-b396-0095a96496be/nova-edpm-deployment-openstack-edpm-ipam/0.log" Nov 27 17:07:49 crc kubenswrapper[4707]: I1127 17:07:49.509494 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_3ee5c40c-527c-45b7-af82-93d55d4709c9/nova-metadata-log/0.log" Nov 27 17:07:49 crc kubenswrapper[4707]: I1127 17:07:49.692043 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-scheduler-0_b2d8aab4-7e14-47dd-83ad-80e0272c12cc/nova-scheduler-scheduler/0.log" Nov 27 17:07:49 crc kubenswrapper[4707]: I1127 17:07:49.778815 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_e4b42d58-27cb-455f-9994-ae15f433e008/mysql-bootstrap/0.log" Nov 27 17:07:50 crc kubenswrapper[4707]: I1127 17:07:50.032734 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_e4b42d58-27cb-455f-9994-ae15f433e008/galera/0.log" Nov 27 17:07:50 crc kubenswrapper[4707]: I1127 17:07:50.043512 4707 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_openstack-cell1-galera-0_e4b42d58-27cb-455f-9994-ae15f433e008/mysql-bootstrap/0.log" Nov 27 17:07:50 crc kubenswrapper[4707]: I1127 17:07:50.224161 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_84e88e9e-3edb-45cd-9973-1447587f7adc/mysql-bootstrap/0.log" Nov 27 17:07:50 crc kubenswrapper[4707]: I1127 17:07:50.486617 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_84e88e9e-3edb-45cd-9973-1447587f7adc/mysql-bootstrap/0.log" Nov 27 17:07:50 crc kubenswrapper[4707]: I1127 17:07:50.524253 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_84e88e9e-3edb-45cd-9973-1447587f7adc/galera/0.log" Nov 27 17:07:50 crc kubenswrapper[4707]: I1127 17:07:50.711762 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstackclient_247a20f8-a665-4383-944a-6fe111045aa1/openstackclient/0.log" Nov 27 17:07:50 crc kubenswrapper[4707]: I1127 17:07:50.789613 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_3ee5c40c-527c-45b7-af82-93d55d4709c9/nova-metadata-metadata/0.log" Nov 27 17:07:50 crc kubenswrapper[4707]: I1127 17:07:50.829659 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-9br7v_d951ce68-e4f2-4ead-aaef-b264f721d7a3/openstack-network-exporter/0.log" Nov 27 17:07:51 crc kubenswrapper[4707]: I1127 17:07:51.018260 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-7n9c2_720e549c-1f41-4fb6-b29f-465ac7e174e3/ovsdb-server-init/0.log" Nov 27 17:07:51 crc kubenswrapper[4707]: I1127 17:07:51.183797 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-7n9c2_720e549c-1f41-4fb6-b29f-465ac7e174e3/ovsdb-server-init/0.log" Nov 27 17:07:51 crc kubenswrapper[4707]: I1127 17:07:51.242049 4707 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_ovn-controller-ovs-7n9c2_720e549c-1f41-4fb6-b29f-465ac7e174e3/ovs-vswitchd/0.log" Nov 27 17:07:51 crc kubenswrapper[4707]: I1127 17:07:51.281726 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-7n9c2_720e549c-1f41-4fb6-b29f-465ac7e174e3/ovsdb-server/0.log" Nov 27 17:07:51 crc kubenswrapper[4707]: I1127 17:07:51.482034 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-vvkr6_9639769b-4439-4ffc-b88b-cba953013bff/ovn-controller/0.log" Nov 27 17:07:51 crc kubenswrapper[4707]: I1127 17:07:51.511794 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-edpm-deployment-openstack-edpm-ipam-wffqs_6001ceb1-ba83-4942-a49c-d7a6116f57f5/ovn-edpm-deployment-openstack-edpm-ipam/0.log" Nov 27 17:07:51 crc kubenswrapper[4707]: I1127 17:07:51.654002 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_bf62ccfb-9c5b-4c2c-a4d7-5cb9add147f2/openstack-network-exporter/0.log" Nov 27 17:07:51 crc kubenswrapper[4707]: I1127 17:07:51.711142 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_bf62ccfb-9c5b-4c2c-a4d7-5cb9add147f2/ovn-northd/0.log" Nov 27 17:07:52 crc kubenswrapper[4707]: I1127 17:07:52.089432 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_379e0975-7a52-4f96-b931-4c02377d6537/ovsdbserver-nb/0.log" Nov 27 17:07:52 crc kubenswrapper[4707]: I1127 17:07:52.130633 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_379e0975-7a52-4f96-b931-4c02377d6537/openstack-network-exporter/0.log" Nov 27 17:07:52 crc kubenswrapper[4707]: I1127 17:07:52.208780 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_d6f4624a-1407-4ff8-bd7f-90f2a0fd6718/openstack-network-exporter/0.log" Nov 27 17:07:52 crc kubenswrapper[4707]: I1127 17:07:52.310074 4707 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_d6f4624a-1407-4ff8-bd7f-90f2a0fd6718/ovsdbserver-sb/0.log" Nov 27 17:07:52 crc kubenswrapper[4707]: I1127 17:07:52.481930 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-69b8fb6b88-w6pxv_9246eafb-e806-45a4-bc87-9a7724b7467c/placement-api/0.log" Nov 27 17:07:52 crc kubenswrapper[4707]: I1127 17:07:52.552904 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-69b8fb6b88-w6pxv_9246eafb-e806-45a4-bc87-9a7724b7467c/placement-log/0.log" Nov 27 17:07:52 crc kubenswrapper[4707]: I1127 17:07:52.655714 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_eab7447a-6fd9-49e4-8db9-34e357f0c419/init-config-reloader/0.log" Nov 27 17:07:52 crc kubenswrapper[4707]: I1127 17:07:52.873022 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_eab7447a-6fd9-49e4-8db9-34e357f0c419/config-reloader/0.log" Nov 27 17:07:52 crc kubenswrapper[4707]: I1127 17:07:52.883294 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_eab7447a-6fd9-49e4-8db9-34e357f0c419/init-config-reloader/0.log" Nov 27 17:07:52 crc kubenswrapper[4707]: I1127 17:07:52.904043 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_eab7447a-6fd9-49e4-8db9-34e357f0c419/thanos-sidecar/0.log" Nov 27 17:07:52 crc kubenswrapper[4707]: I1127 17:07:52.904246 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_eab7447a-6fd9-49e4-8db9-34e357f0c419/prometheus/0.log" Nov 27 17:07:53 crc kubenswrapper[4707]: I1127 17:07:53.079060 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_4990dbfc-6c12-4964-9d50-b8fd331cc123/setup-container/0.log" Nov 27 17:07:53 crc kubenswrapper[4707]: I1127 17:07:53.296228 4707 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_4990dbfc-6c12-4964-9d50-b8fd331cc123/rabbitmq/0.log" Nov 27 17:07:53 crc kubenswrapper[4707]: I1127 17:07:53.337618 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_82ba4b51-2b4f-4ed6-8ef9-453386ff71da/setup-container/0.log" Nov 27 17:07:53 crc kubenswrapper[4707]: I1127 17:07:53.341219 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_4990dbfc-6c12-4964-9d50-b8fd331cc123/setup-container/0.log" Nov 27 17:07:53 crc kubenswrapper[4707]: I1127 17:07:53.542738 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_82ba4b51-2b4f-4ed6-8ef9-453386ff71da/setup-container/0.log" Nov 27 17:07:53 crc kubenswrapper[4707]: I1127 17:07:53.632709 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_reboot-os-edpm-deployment-openstack-edpm-ipam-xdhhm_d89ddbb0-c0d3-46a8-a81e-ce2809f1352a/reboot-os-edpm-deployment-openstack-edpm-ipam/0.log" Nov 27 17:07:53 crc kubenswrapper[4707]: I1127 17:07:53.776286 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_redhat-edpm-deployment-openstack-edpm-ipam-p954g_6f4ca349-556f-4de4-b23d-b00a59768241/redhat-edpm-deployment-openstack-edpm-ipam/0.log" Nov 27 17:07:54 crc kubenswrapper[4707]: I1127 17:07:53.999912 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_repo-setup-edpm-deployment-openstack-edpm-ipam-np9sx_92f7fdf9-f4c8-443c-9ff6-b6a45719b9a7/repo-setup-edpm-deployment-openstack-edpm-ipam/0.log" Nov 27 17:07:54 crc kubenswrapper[4707]: I1127 17:07:54.138870 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_run-os-edpm-deployment-openstack-edpm-ipam-bf2x7_a03595a4-c76f-4642-b492-17f393096888/run-os-edpm-deployment-openstack-edpm-ipam/0.log" Nov 27 17:07:54 crc kubenswrapper[4707]: I1127 17:07:54.283615 4707 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_ssh-known-hosts-edpm-deployment-9kfcd_29ac3bd9-0e37-4e00-aa44-a09c01019b96/ssh-known-hosts-edpm-deployment/0.log" Nov 27 17:07:54 crc kubenswrapper[4707]: I1127 17:07:54.570673 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-5475fd4f89-8stjv_7362e7e3-1145-4e89-84db-343739624472/proxy-server/0.log" Nov 27 17:07:54 crc kubenswrapper[4707]: I1127 17:07:54.690739 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-5475fd4f89-8stjv_7362e7e3-1145-4e89-84db-343739624472/proxy-httpd/0.log" Nov 27 17:07:54 crc kubenswrapper[4707]: I1127 17:07:54.743728 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-ring-rebalance-vbngp_26d4145c-3144-4e1f-99ce-08d64f8b20be/swift-ring-rebalance/0.log" Nov 27 17:07:54 crc kubenswrapper[4707]: I1127 17:07:54.913719 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_4d67e130-f1e2-4fe8-9647-8725402a1cdd/account-auditor/0.log" Nov 27 17:07:55 crc kubenswrapper[4707]: I1127 17:07:55.009867 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_4d67e130-f1e2-4fe8-9647-8725402a1cdd/account-reaper/0.log" Nov 27 17:07:55 crc kubenswrapper[4707]: I1127 17:07:55.151935 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_4d67e130-f1e2-4fe8-9647-8725402a1cdd/account-replicator/0.log" Nov 27 17:07:55 crc kubenswrapper[4707]: I1127 17:07:55.233409 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_4d67e130-f1e2-4fe8-9647-8725402a1cdd/container-auditor/0.log" Nov 27 17:07:55 crc kubenswrapper[4707]: I1127 17:07:55.235022 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_4d67e130-f1e2-4fe8-9647-8725402a1cdd/account-server/0.log" Nov 27 17:07:55 crc kubenswrapper[4707]: I1127 17:07:55.370117 4707 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_swift-storage-0_4d67e130-f1e2-4fe8-9647-8725402a1cdd/container-replicator/0.log" Nov 27 17:07:55 crc kubenswrapper[4707]: I1127 17:07:55.450265 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_4d67e130-f1e2-4fe8-9647-8725402a1cdd/container-server/0.log" Nov 27 17:07:55 crc kubenswrapper[4707]: I1127 17:07:55.511074 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_4d67e130-f1e2-4fe8-9647-8725402a1cdd/container-updater/0.log" Nov 27 17:07:55 crc kubenswrapper[4707]: I1127 17:07:55.651729 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_4d67e130-f1e2-4fe8-9647-8725402a1cdd/object-expirer/0.log" Nov 27 17:07:55 crc kubenswrapper[4707]: I1127 17:07:55.679025 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_4d67e130-f1e2-4fe8-9647-8725402a1cdd/object-auditor/0.log" Nov 27 17:07:55 crc kubenswrapper[4707]: I1127 17:07:55.733399 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_4d67e130-f1e2-4fe8-9647-8725402a1cdd/object-replicator/0.log" Nov 27 17:07:56 crc kubenswrapper[4707]: I1127 17:07:56.094521 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_4d67e130-f1e2-4fe8-9647-8725402a1cdd/object-server/0.log" Nov 27 17:07:56 crc kubenswrapper[4707]: I1127 17:07:56.154036 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_4d67e130-f1e2-4fe8-9647-8725402a1cdd/rsync/0.log" Nov 27 17:07:56 crc kubenswrapper[4707]: I1127 17:07:56.161357 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_4d67e130-f1e2-4fe8-9647-8725402a1cdd/object-updater/0.log" Nov 27 17:07:56 crc kubenswrapper[4707]: I1127 17:07:56.329152 4707 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_swift-storage-0_4d67e130-f1e2-4fe8-9647-8725402a1cdd/swift-recon-cron/0.log" Nov 27 17:07:56 crc kubenswrapper[4707]: I1127 17:07:56.459925 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_telemetry-edpm-deployment-openstack-edpm-ipam-pk25z_03d3491e-8e8f-49a2-8552-f939d87bbb59/telemetry-edpm-deployment-openstack-edpm-ipam/0.log" Nov 27 17:07:56 crc kubenswrapper[4707]: I1127 17:07:56.613573 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_82ba4b51-2b4f-4ed6-8ef9-453386ff71da/rabbitmq/0.log" Nov 27 17:07:56 crc kubenswrapper[4707]: I1127 17:07:56.887554 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_validate-network-edpm-deployment-openstack-edpm-ipam-wnhjm_4e28fa93-baff-4fad-91cc-7ef262dcd775/validate-network-edpm-deployment-openstack-edpm-ipam/0.log" Nov 27 17:08:00 crc kubenswrapper[4707]: I1127 17:08:00.194961 4707 scope.go:117] "RemoveContainer" containerID="786824b12a8a2896c46704bdf83468e2fc0bd47b51cb67ff29cb9d7963ea6d9c" Nov 27 17:08:00 crc kubenswrapper[4707]: E1127 17:08:00.195875 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-c995m_openshift-machine-config-operator(a83beb0d-8dd1-434a-ace2-933f98e3956f)\"" pod="openshift-machine-config-operator/machine-config-daemon-c995m" podUID="a83beb0d-8dd1-434a-ace2-933f98e3956f" Nov 27 17:08:04 crc kubenswrapper[4707]: I1127 17:08:04.329725 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_c5402fa0-f1b7-4561-95f0-cb690caf9b58/memcached/0.log" Nov 27 17:08:12 crc kubenswrapper[4707]: I1127 17:08:12.195814 4707 scope.go:117] "RemoveContainer" containerID="786824b12a8a2896c46704bdf83468e2fc0bd47b51cb67ff29cb9d7963ea6d9c" Nov 27 17:08:12 crc kubenswrapper[4707]: E1127 
17:08:12.196547 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-c995m_openshift-machine-config-operator(a83beb0d-8dd1-434a-ace2-933f98e3956f)\"" pod="openshift-machine-config-operator/machine-config-daemon-c995m" podUID="a83beb0d-8dd1-434a-ace2-933f98e3956f" Nov 27 17:08:25 crc kubenswrapper[4707]: I1127 17:08:25.205955 4707 scope.go:117] "RemoveContainer" containerID="786824b12a8a2896c46704bdf83468e2fc0bd47b51cb67ff29cb9d7963ea6d9c" Nov 27 17:08:25 crc kubenswrapper[4707]: E1127 17:08:25.207927 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-c995m_openshift-machine-config-operator(a83beb0d-8dd1-434a-ace2-933f98e3956f)\"" pod="openshift-machine-config-operator/machine-config-daemon-c995m" podUID="a83beb0d-8dd1-434a-ace2-933f98e3956f" Nov 27 17:08:26 crc kubenswrapper[4707]: I1127 17:08:26.868167 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_9b525f43158688e23ee73385ddac5626340eda066df53bb1a8cbdb78832m5nh_82db51a9-76e7-4066-9dbc-83b27ff84cc8/util/0.log" Nov 27 17:08:27 crc kubenswrapper[4707]: I1127 17:08:27.712716 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_9b525f43158688e23ee73385ddac5626340eda066df53bb1a8cbdb78832m5nh_82db51a9-76e7-4066-9dbc-83b27ff84cc8/util/0.log" Nov 27 17:08:27 crc kubenswrapper[4707]: I1127 17:08:27.724163 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_9b525f43158688e23ee73385ddac5626340eda066df53bb1a8cbdb78832m5nh_82db51a9-76e7-4066-9dbc-83b27ff84cc8/pull/0.log" Nov 27 17:08:27 crc kubenswrapper[4707]: I1127 17:08:27.753102 4707 log.go:25] "Finished parsing log 
file" path="/var/log/pods/openstack-operators_9b525f43158688e23ee73385ddac5626340eda066df53bb1a8cbdb78832m5nh_82db51a9-76e7-4066-9dbc-83b27ff84cc8/pull/0.log" Nov 27 17:08:27 crc kubenswrapper[4707]: I1127 17:08:27.932959 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_9b525f43158688e23ee73385ddac5626340eda066df53bb1a8cbdb78832m5nh_82db51a9-76e7-4066-9dbc-83b27ff84cc8/util/0.log" Nov 27 17:08:27 crc kubenswrapper[4707]: I1127 17:08:27.933802 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_9b525f43158688e23ee73385ddac5626340eda066df53bb1a8cbdb78832m5nh_82db51a9-76e7-4066-9dbc-83b27ff84cc8/pull/0.log" Nov 27 17:08:27 crc kubenswrapper[4707]: I1127 17:08:27.938115 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_9b525f43158688e23ee73385ddac5626340eda066df53bb1a8cbdb78832m5nh_82db51a9-76e7-4066-9dbc-83b27ff84cc8/extract/0.log" Nov 27 17:08:28 crc kubenswrapper[4707]: I1127 17:08:28.099403 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-7b64f4fb85-nhkfx_0267ac3a-4bee-42b9-a506-e2b1e1e3726e/kube-rbac-proxy/0.log" Nov 27 17:08:28 crc kubenswrapper[4707]: I1127 17:08:28.153724 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-7b64f4fb85-nhkfx_0267ac3a-4bee-42b9-a506-e2b1e1e3726e/manager/0.log" Nov 27 17:08:28 crc kubenswrapper[4707]: I1127 17:08:28.171345 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-6b7f75547b-sgwpt_6d78cbb1-56e6-428d-bba4-5d1edbbda363/kube-rbac-proxy/0.log" Nov 27 17:08:28 crc kubenswrapper[4707]: I1127 17:08:28.334640 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-6b7f75547b-sgwpt_6d78cbb1-56e6-428d-bba4-5d1edbbda363/manager/0.log" Nov 27 17:08:28 crc 
kubenswrapper[4707]: I1127 17:08:28.373942 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-955677c94-f5g2d_5e9f9859-1f28-4183-b71c-e9459e2746b7/kube-rbac-proxy/0.log" Nov 27 17:08:28 crc kubenswrapper[4707]: I1127 17:08:28.430702 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-955677c94-f5g2d_5e9f9859-1f28-4183-b71c-e9459e2746b7/manager/0.log" Nov 27 17:08:28 crc kubenswrapper[4707]: I1127 17:08:28.563857 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-589cbd6b5b-cd4lh_99785491-bcbd-4946-b1a6-a3e08a4394b5/kube-rbac-proxy/0.log" Nov 27 17:08:28 crc kubenswrapper[4707]: I1127 17:08:28.699784 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-589cbd6b5b-cd4lh_99785491-bcbd-4946-b1a6-a3e08a4394b5/manager/0.log" Nov 27 17:08:28 crc kubenswrapper[4707]: I1127 17:08:28.715114 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-5b77f656f-8qhwz_a2c3c27e-3e0c-4d4e-a5f7-3deaaae0f660/kube-rbac-proxy/0.log" Nov 27 17:08:28 crc kubenswrapper[4707]: I1127 17:08:28.876944 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-5b77f656f-8qhwz_a2c3c27e-3e0c-4d4e-a5f7-3deaaae0f660/manager/0.log" Nov 27 17:08:28 crc kubenswrapper[4707]: I1127 17:08:28.923114 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-5d494799bf-hg8j4_ac946592-ee39-443e-b64a-980caaace080/manager/0.log" Nov 27 17:08:28 crc kubenswrapper[4707]: I1127 17:08:28.926465 4707 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-5d494799bf-hg8j4_ac946592-ee39-443e-b64a-980caaace080/kube-rbac-proxy/0.log" Nov 27 17:08:29 crc kubenswrapper[4707]: I1127 17:08:29.582811 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-57548d458d-kwlbb_88f24787-fedc-4d08-9a8e-16a24f242d02/kube-rbac-proxy/0.log" Nov 27 17:08:29 crc kubenswrapper[4707]: I1127 17:08:29.816621 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-57548d458d-kwlbb_88f24787-fedc-4d08-9a8e-16a24f242d02/manager/0.log" Nov 27 17:08:29 crc kubenswrapper[4707]: I1127 17:08:29.916385 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-67cb4dc6d4-5bj45_8fea437e-0a8c-4836-b23c-56db9c7ea0fc/kube-rbac-proxy/0.log" Nov 27 17:08:29 crc kubenswrapper[4707]: I1127 17:08:29.940751 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-67cb4dc6d4-5bj45_8fea437e-0a8c-4836-b23c-56db9c7ea0fc/manager/0.log" Nov 27 17:08:30 crc kubenswrapper[4707]: I1127 17:08:30.225708 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-7b4567c7cf-4cd96_dc2542ce-f2fd-454b-b47f-92d3bbc93d91/kube-rbac-proxy/0.log" Nov 27 17:08:30 crc kubenswrapper[4707]: I1127 17:08:30.314664 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-5d499bf58b-x5k9j_126e10c6-2740-47fc-8331-a8e4bb6549b8/kube-rbac-proxy/0.log" Nov 27 17:08:30 crc kubenswrapper[4707]: I1127 17:08:30.426735 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-5d499bf58b-x5k9j_126e10c6-2740-47fc-8331-a8e4bb6549b8/manager/0.log" Nov 27 17:08:30 crc kubenswrapper[4707]: I1127 
17:08:30.430861 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-7b4567c7cf-4cd96_dc2542ce-f2fd-454b-b47f-92d3bbc93d91/manager/0.log" Nov 27 17:08:30 crc kubenswrapper[4707]: I1127 17:08:30.559152 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-66f4dd4bc7-dfw47_50771ff9-4409-4f41-ad3c-98f730dbff77/kube-rbac-proxy/0.log" Nov 27 17:08:30 crc kubenswrapper[4707]: I1127 17:08:30.685496 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-66f4dd4bc7-dfw47_50771ff9-4409-4f41-ad3c-98f730dbff77/manager/0.log" Nov 27 17:08:30 crc kubenswrapper[4707]: I1127 17:08:30.796783 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-6fdcddb789-4bqw7_9377949b-5979-44ff-bd3f-ea1389b4ef6f/kube-rbac-proxy/0.log" Nov 27 17:08:30 crc kubenswrapper[4707]: I1127 17:08:30.847394 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-6fdcddb789-4bqw7_9377949b-5979-44ff-bd3f-ea1389b4ef6f/manager/0.log" Nov 27 17:08:30 crc kubenswrapper[4707]: I1127 17:08:30.898163 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-79556f57fc-q6vcr_20e41446-8d89-481e-bd9f-48dc14efb82e/kube-rbac-proxy/0.log" Nov 27 17:08:31 crc kubenswrapper[4707]: I1127 17:08:31.068421 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-79556f57fc-q6vcr_20e41446-8d89-481e-bd9f-48dc14efb82e/manager/0.log" Nov 27 17:08:31 crc kubenswrapper[4707]: I1127 17:08:31.082809 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-64cdc6ff96-blnpn_7c88f676-b4d3-46b2-aedd-eff62f8f1bfb/kube-rbac-proxy/0.log" Nov 
27 17:08:31 crc kubenswrapper[4707]: I1127 17:08:31.110776 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-64cdc6ff96-blnpn_7c88f676-b4d3-46b2-aedd-eff62f8f1bfb/manager/0.log" Nov 27 17:08:31 crc kubenswrapper[4707]: I1127 17:08:31.259998 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-5fcdb54b6brxbww_7761d2b0-8cc7-4dc8-a956-df20e2efc081/manager/0.log" Nov 27 17:08:31 crc kubenswrapper[4707]: I1127 17:08:31.330096 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-5fcdb54b6brxbww_7761d2b0-8cc7-4dc8-a956-df20e2efc081/kube-rbac-proxy/0.log" Nov 27 17:08:31 crc kubenswrapper[4707]: I1127 17:08:31.582215 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-qdn2b_4b76b0a5-e84e-427d-9cb4-4fac9969a278/registry-server/0.log" Nov 27 17:08:31 crc kubenswrapper[4707]: I1127 17:08:31.686280 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-operator-7d7f8454cc-87c8c_b517a267-0265-4e24-b102-b19b8d9eee18/operator/0.log" Nov 27 17:08:31 crc kubenswrapper[4707]: I1127 17:08:31.773347 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-56897c768d-nbgp6_6aa8bbf9-4d33-4b2e-b53a-d6a341b3d841/kube-rbac-proxy/0.log" Nov 27 17:08:31 crc kubenswrapper[4707]: I1127 17:08:31.906067 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-56897c768d-nbgp6_6aa8bbf9-4d33-4b2e-b53a-d6a341b3d841/manager/0.log" Nov 27 17:08:31 crc kubenswrapper[4707]: I1127 17:08:31.964700 4707 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_placement-operator-controller-manager-57988cc5b5-v48r5_3c3bf501-b545-45a2-b186-2df94990295d/kube-rbac-proxy/0.log" Nov 27 17:08:32 crc kubenswrapper[4707]: I1127 17:08:32.015445 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-57988cc5b5-v48r5_3c3bf501-b545-45a2-b186-2df94990295d/manager/0.log" Nov 27 17:08:32 crc kubenswrapper[4707]: I1127 17:08:32.214068 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-668c99d594-v96x7_6647f986-9d62-4939-907b-fde960b30a37/operator/0.log" Nov 27 17:08:32 crc kubenswrapper[4707]: I1127 17:08:32.272699 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-d77b94747-qdh2s_efa00950-13dd-4e9e-a215-6ebb89006545/kube-rbac-proxy/0.log" Nov 27 17:08:32 crc kubenswrapper[4707]: I1127 17:08:32.423064 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-d77b94747-qdh2s_efa00950-13dd-4e9e-a215-6ebb89006545/manager/0.log" Nov 27 17:08:32 crc kubenswrapper[4707]: I1127 17:08:32.557865 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-8665cb7d49-w7xqm_94ba089c-d890-4d57-abc0-258a2b54a6f9/kube-rbac-proxy/0.log" Nov 27 17:08:32 crc kubenswrapper[4707]: I1127 17:08:32.729454 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-5cd6c7f4c8-qwb4t_97bb6c80-6996-4e91-bcdf-0f1c20e72fa3/manager/0.log" Nov 27 17:08:32 crc kubenswrapper[4707]: I1127 17:08:32.760877 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-8665cb7d49-w7xqm_94ba089c-d890-4d57-abc0-258a2b54a6f9/manager/0.log" Nov 27 17:08:32 crc kubenswrapper[4707]: I1127 17:08:32.763064 4707 
log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-5cd6c7f4c8-qwb4t_97bb6c80-6996-4e91-bcdf-0f1c20e72fa3/kube-rbac-proxy/0.log" Nov 27 17:08:32 crc kubenswrapper[4707]: I1127 17:08:32.946575 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-656dcb59d4-hgbtk_00e99b4b-2bbd-445a-b075-74c47fe30f79/manager/0.log" Nov 27 17:08:32 crc kubenswrapper[4707]: I1127 17:08:32.995122 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-54fcbb7454-dqgm9_894c1749-fccf-4178-b7a8-6c63e18266f6/manager/0.log" Nov 27 17:08:32 crc kubenswrapper[4707]: I1127 17:08:32.999831 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-656dcb59d4-hgbtk_00e99b4b-2bbd-445a-b075-74c47fe30f79/kube-rbac-proxy/0.log" Nov 27 17:08:37 crc kubenswrapper[4707]: I1127 17:08:37.196100 4707 scope.go:117] "RemoveContainer" containerID="786824b12a8a2896c46704bdf83468e2fc0bd47b51cb67ff29cb9d7963ea6d9c" Nov 27 17:08:37 crc kubenswrapper[4707]: E1127 17:08:37.196827 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-c995m_openshift-machine-config-operator(a83beb0d-8dd1-434a-ace2-933f98e3956f)\"" pod="openshift-machine-config-operator/machine-config-daemon-c995m" podUID="a83beb0d-8dd1-434a-ace2-933f98e3956f" Nov 27 17:08:48 crc kubenswrapper[4707]: I1127 17:08:48.203903 4707 scope.go:117] "RemoveContainer" containerID="786824b12a8a2896c46704bdf83468e2fc0bd47b51cb67ff29cb9d7963ea6d9c" Nov 27 17:08:48 crc kubenswrapper[4707]: E1127 17:08:48.204978 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with 
CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-c995m_openshift-machine-config-operator(a83beb0d-8dd1-434a-ace2-933f98e3956f)\"" pod="openshift-machine-config-operator/machine-config-daemon-c995m" podUID="a83beb0d-8dd1-434a-ace2-933f98e3956f" Nov 27 17:08:53 crc kubenswrapper[4707]: I1127 17:08:53.786543 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-h746w_e8027394-2524-45df-8cdc-967024215d25/control-plane-machine-set-operator/0.log" Nov 27 17:08:53 crc kubenswrapper[4707]: I1127 17:08:53.972860 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-thf5c_78b8999d-9535-4584-baa0-5fd38838ac29/machine-api-operator/0.log" Nov 27 17:08:53 crc kubenswrapper[4707]: I1127 17:08:53.977741 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-thf5c_78b8999d-9535-4584-baa0-5fd38838ac29/kube-rbac-proxy/0.log" Nov 27 17:09:00 crc kubenswrapper[4707]: I1127 17:09:00.195920 4707 scope.go:117] "RemoveContainer" containerID="786824b12a8a2896c46704bdf83468e2fc0bd47b51cb67ff29cb9d7963ea6d9c" Nov 27 17:09:00 crc kubenswrapper[4707]: E1127 17:09:00.199315 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-c995m_openshift-machine-config-operator(a83beb0d-8dd1-434a-ace2-933f98e3956f)\"" pod="openshift-machine-config-operator/machine-config-daemon-c995m" podUID="a83beb0d-8dd1-434a-ace2-933f98e3956f" Nov 27 17:09:06 crc kubenswrapper[4707]: I1127 17:09:06.369312 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-5b446d88c5-nc56g_4a4bd04a-ce38-46fe-a197-17214e851643/cert-manager-controller/0.log" Nov 
27 17:09:06 crc kubenswrapper[4707]: I1127 17:09:06.544257 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-7f985d654d-7j6cd_5f4b6258-1be2-489a-9e59-86df4534d663/cert-manager-cainjector/0.log" Nov 27 17:09:06 crc kubenswrapper[4707]: I1127 17:09:06.586398 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-5655c58dd6-5476f_9efe65d7-bb46-43bd-a343-c9a28fbad2ea/cert-manager-webhook/0.log" Nov 27 17:09:11 crc kubenswrapper[4707]: I1127 17:09:11.195712 4707 scope.go:117] "RemoveContainer" containerID="786824b12a8a2896c46704bdf83468e2fc0bd47b51cb67ff29cb9d7963ea6d9c" Nov 27 17:09:11 crc kubenswrapper[4707]: E1127 17:09:11.196487 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-c995m_openshift-machine-config-operator(a83beb0d-8dd1-434a-ace2-933f98e3956f)\"" pod="openshift-machine-config-operator/machine-config-daemon-c995m" podUID="a83beb0d-8dd1-434a-ace2-933f98e3956f" Nov 27 17:09:18 crc kubenswrapper[4707]: I1127 17:09:18.586184 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-7fbb5f6569-cfrns_2de77ffe-0aaf-4a49-86a3-3bb9a0123497/nmstate-console-plugin/0.log" Nov 27 17:09:18 crc kubenswrapper[4707]: I1127 17:09:18.757053 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-svvpq_d5b862a0-5504-43b2-9f8f-fa953310a52d/nmstate-handler/0.log" Nov 27 17:09:18 crc kubenswrapper[4707]: I1127 17:09:18.823683 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-7f946cbc9-zmcdf_444e03f1-9114-4150-8f28-3db614bb32e0/kube-rbac-proxy/0.log" Nov 27 17:09:18 crc kubenswrapper[4707]: I1127 17:09:18.897278 4707 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-nmstate_nmstate-metrics-7f946cbc9-zmcdf_444e03f1-9114-4150-8f28-3db614bb32e0/nmstate-metrics/0.log" Nov 27 17:09:18 crc kubenswrapper[4707]: I1127 17:09:18.971151 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-5b5b58f5c8-v69r4_0e1c66da-d3ca-4b17-83b7-62518c83721c/nmstate-operator/0.log" Nov 27 17:09:19 crc kubenswrapper[4707]: I1127 17:09:19.096604 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-5f6d4c5ccb-h6v87_d051acac-59f4-434a-85bb-2cf7ec7e7107/nmstate-webhook/0.log" Nov 27 17:09:22 crc kubenswrapper[4707]: I1127 17:09:22.195772 4707 scope.go:117] "RemoveContainer" containerID="786824b12a8a2896c46704bdf83468e2fc0bd47b51cb67ff29cb9d7963ea6d9c" Nov 27 17:09:22 crc kubenswrapper[4707]: E1127 17:09:22.196699 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-c995m_openshift-machine-config-operator(a83beb0d-8dd1-434a-ace2-933f98e3956f)\"" pod="openshift-machine-config-operator/machine-config-daemon-c995m" podUID="a83beb0d-8dd1-434a-ace2-933f98e3956f" Nov 27 17:09:33 crc kubenswrapper[4707]: I1127 17:09:33.057081 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-f8648f98b-fcnd7_0b63951d-0fad-479e-9a1d-e3978d75f5db/kube-rbac-proxy/0.log" Nov 27 17:09:33 crc kubenswrapper[4707]: I1127 17:09:33.177837 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-f8648f98b-fcnd7_0b63951d-0fad-479e-9a1d-e3978d75f5db/controller/0.log" Nov 27 17:09:33 crc kubenswrapper[4707]: I1127 17:09:33.294020 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-l5lzc_f892e6ab-ac16-4945-b724-ddca8efed111/cp-frr-files/0.log" Nov 27 17:09:33 crc kubenswrapper[4707]: I1127 
17:09:33.485850 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-l5lzc_f892e6ab-ac16-4945-b724-ddca8efed111/cp-reloader/0.log" Nov 27 17:09:33 crc kubenswrapper[4707]: I1127 17:09:33.494034 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-l5lzc_f892e6ab-ac16-4945-b724-ddca8efed111/cp-frr-files/0.log" Nov 27 17:09:33 crc kubenswrapper[4707]: I1127 17:09:33.547477 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-l5lzc_f892e6ab-ac16-4945-b724-ddca8efed111/cp-metrics/0.log" Nov 27 17:09:33 crc kubenswrapper[4707]: I1127 17:09:33.596656 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-l5lzc_f892e6ab-ac16-4945-b724-ddca8efed111/cp-reloader/0.log" Nov 27 17:09:33 crc kubenswrapper[4707]: I1127 17:09:33.709960 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-l5lzc_f892e6ab-ac16-4945-b724-ddca8efed111/cp-reloader/0.log" Nov 27 17:09:33 crc kubenswrapper[4707]: I1127 17:09:33.711433 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-l5lzc_f892e6ab-ac16-4945-b724-ddca8efed111/cp-frr-files/0.log" Nov 27 17:09:33 crc kubenswrapper[4707]: I1127 17:09:33.788690 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-l5lzc_f892e6ab-ac16-4945-b724-ddca8efed111/cp-metrics/0.log" Nov 27 17:09:33 crc kubenswrapper[4707]: I1127 17:09:33.821665 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-l5lzc_f892e6ab-ac16-4945-b724-ddca8efed111/cp-metrics/0.log" Nov 27 17:09:34 crc kubenswrapper[4707]: I1127 17:09:34.014842 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-l5lzc_f892e6ab-ac16-4945-b724-ddca8efed111/cp-metrics/0.log" Nov 27 17:09:34 crc kubenswrapper[4707]: I1127 17:09:34.040392 4707 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-l5lzc_f892e6ab-ac16-4945-b724-ddca8efed111/cp-frr-files/0.log" Nov 27 17:09:34 crc kubenswrapper[4707]: I1127 17:09:34.040454 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-l5lzc_f892e6ab-ac16-4945-b724-ddca8efed111/cp-reloader/0.log" Nov 27 17:09:34 crc kubenswrapper[4707]: I1127 17:09:34.064927 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-l5lzc_f892e6ab-ac16-4945-b724-ddca8efed111/controller/0.log" Nov 27 17:09:34 crc kubenswrapper[4707]: I1127 17:09:34.195733 4707 scope.go:117] "RemoveContainer" containerID="786824b12a8a2896c46704bdf83468e2fc0bd47b51cb67ff29cb9d7963ea6d9c" Nov 27 17:09:34 crc kubenswrapper[4707]: E1127 17:09:34.196045 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-c995m_openshift-machine-config-operator(a83beb0d-8dd1-434a-ace2-933f98e3956f)\"" pod="openshift-machine-config-operator/machine-config-daemon-c995m" podUID="a83beb0d-8dd1-434a-ace2-933f98e3956f" Nov 27 17:09:34 crc kubenswrapper[4707]: I1127 17:09:34.217024 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-l5lzc_f892e6ab-ac16-4945-b724-ddca8efed111/kube-rbac-proxy/0.log" Nov 27 17:09:34 crc kubenswrapper[4707]: I1127 17:09:34.225034 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-l5lzc_f892e6ab-ac16-4945-b724-ddca8efed111/frr-metrics/0.log" Nov 27 17:09:34 crc kubenswrapper[4707]: I1127 17:09:34.285704 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-l5lzc_f892e6ab-ac16-4945-b724-ddca8efed111/kube-rbac-proxy-frr/0.log" Nov 27 17:09:34 crc kubenswrapper[4707]: I1127 17:09:34.468165 4707 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-webhook-server-7fcb986d4-krwxc_0dfd8369-d85e-41e1-8990-123e0de5e7d4/frr-k8s-webhook-server/0.log" Nov 27 17:09:34 crc kubenswrapper[4707]: I1127 17:09:34.566180 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-l5lzc_f892e6ab-ac16-4945-b724-ddca8efed111/reloader/0.log" Nov 27 17:09:34 crc kubenswrapper[4707]: I1127 17:09:34.772001 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-669b86894d-bdzj6_f3f84476-2107-4226-be6c-cdcc6380a697/manager/0.log" Nov 27 17:09:34 crc kubenswrapper[4707]: I1127 17:09:34.926005 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-7bddc5f445-p9w4t_3226579b-b878-4494-83ca-cc7288089a7a/webhook-server/0.log" Nov 27 17:09:35 crc kubenswrapper[4707]: I1127 17:09:35.038498 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-m8h9q_8c251e95-c920-4c32-b7be-95367e79b151/kube-rbac-proxy/0.log" Nov 27 17:09:35 crc kubenswrapper[4707]: I1127 17:09:35.713656 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-m8h9q_8c251e95-c920-4c32-b7be-95367e79b151/speaker/0.log" Nov 27 17:09:35 crc kubenswrapper[4707]: I1127 17:09:35.985471 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-l5lzc_f892e6ab-ac16-4945-b724-ddca8efed111/frr/0.log" Nov 27 17:09:45 crc kubenswrapper[4707]: I1127 17:09:45.209610 4707 scope.go:117] "RemoveContainer" containerID="786824b12a8a2896c46704bdf83468e2fc0bd47b51cb67ff29cb9d7963ea6d9c" Nov 27 17:09:45 crc kubenswrapper[4707]: E1127 17:09:45.210389 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-c995m_openshift-machine-config-operator(a83beb0d-8dd1-434a-ace2-933f98e3956f)\"" pod="openshift-machine-config-operator/machine-config-daemon-c995m" podUID="a83beb0d-8dd1-434a-ace2-933f98e3956f" Nov 27 17:09:48 crc kubenswrapper[4707]: I1127 17:09:48.130678 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fxmdd5_367eee2c-7dc3-4a7e-a943-131037e46ca1/util/0.log" Nov 27 17:09:48 crc kubenswrapper[4707]: I1127 17:09:48.309768 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fxmdd5_367eee2c-7dc3-4a7e-a943-131037e46ca1/pull/0.log" Nov 27 17:09:48 crc kubenswrapper[4707]: I1127 17:09:48.321034 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fxmdd5_367eee2c-7dc3-4a7e-a943-131037e46ca1/util/0.log" Nov 27 17:09:48 crc kubenswrapper[4707]: I1127 17:09:48.336388 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fxmdd5_367eee2c-7dc3-4a7e-a943-131037e46ca1/pull/0.log" Nov 27 17:09:48 crc kubenswrapper[4707]: I1127 17:09:48.537988 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fxmdd5_367eee2c-7dc3-4a7e-a943-131037e46ca1/extract/0.log" Nov 27 17:09:48 crc kubenswrapper[4707]: I1127 17:09:48.538539 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fxmdd5_367eee2c-7dc3-4a7e-a943-131037e46ca1/pull/0.log" Nov 27 17:09:48 crc kubenswrapper[4707]: I1127 17:09:48.608474 4707 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fxmdd5_367eee2c-7dc3-4a7e-a943-131037e46ca1/util/0.log" Nov 27 17:09:48 crc kubenswrapper[4707]: I1127 17:09:48.685577 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210rtnm9_578725b0-80a1-4da6-93ae-243ae76cd1b6/util/0.log" Nov 27 17:09:48 crc kubenswrapper[4707]: I1127 17:09:48.853044 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210rtnm9_578725b0-80a1-4da6-93ae-243ae76cd1b6/util/0.log" Nov 27 17:09:48 crc kubenswrapper[4707]: I1127 17:09:48.871173 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210rtnm9_578725b0-80a1-4da6-93ae-243ae76cd1b6/pull/0.log" Nov 27 17:09:48 crc kubenswrapper[4707]: I1127 17:09:48.902177 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210rtnm9_578725b0-80a1-4da6-93ae-243ae76cd1b6/pull/0.log" Nov 27 17:09:49 crc kubenswrapper[4707]: I1127 17:09:49.065138 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210rtnm9_578725b0-80a1-4da6-93ae-243ae76cd1b6/extract/0.log" Nov 27 17:09:49 crc kubenswrapper[4707]: I1127 17:09:49.071701 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210rtnm9_578725b0-80a1-4da6-93ae-243ae76cd1b6/util/0.log" Nov 27 17:09:49 crc kubenswrapper[4707]: I1127 17:09:49.109352 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210rtnm9_578725b0-80a1-4da6-93ae-243ae76cd1b6/pull/0.log" Nov 27 
17:09:49 crc kubenswrapper[4707]: I1127 17:09:49.256114 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f838sdgh_b9fee02a-26ff-4676-843d-0159e9b2fe91/util/0.log" Nov 27 17:09:49 crc kubenswrapper[4707]: I1127 17:09:49.450632 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f838sdgh_b9fee02a-26ff-4676-843d-0159e9b2fe91/util/0.log" Nov 27 17:09:49 crc kubenswrapper[4707]: I1127 17:09:49.468124 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f838sdgh_b9fee02a-26ff-4676-843d-0159e9b2fe91/pull/0.log" Nov 27 17:09:49 crc kubenswrapper[4707]: I1127 17:09:49.538530 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f838sdgh_b9fee02a-26ff-4676-843d-0159e9b2fe91/pull/0.log" Nov 27 17:09:49 crc kubenswrapper[4707]: I1127 17:09:49.645723 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f838sdgh_b9fee02a-26ff-4676-843d-0159e9b2fe91/util/0.log" Nov 27 17:09:49 crc kubenswrapper[4707]: I1127 17:09:49.660315 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f838sdgh_b9fee02a-26ff-4676-843d-0159e9b2fe91/extract/0.log" Nov 27 17:09:49 crc kubenswrapper[4707]: I1127 17:09:49.687227 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f838sdgh_b9fee02a-26ff-4676-843d-0159e9b2fe91/pull/0.log" Nov 27 17:09:49 crc kubenswrapper[4707]: I1127 17:09:49.870872 4707 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_certified-operators-hmrsj_c38b0f06-ab2f-48e1-9181-d9410e8896be/extract-utilities/0.log" Nov 27 17:09:50 crc kubenswrapper[4707]: I1127 17:09:50.107636 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-hmrsj_c38b0f06-ab2f-48e1-9181-d9410e8896be/extract-utilities/0.log" Nov 27 17:09:50 crc kubenswrapper[4707]: I1127 17:09:50.128353 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-hmrsj_c38b0f06-ab2f-48e1-9181-d9410e8896be/extract-content/0.log" Nov 27 17:09:50 crc kubenswrapper[4707]: I1127 17:09:50.169442 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-hmrsj_c38b0f06-ab2f-48e1-9181-d9410e8896be/extract-content/0.log" Nov 27 17:09:50 crc kubenswrapper[4707]: I1127 17:09:50.286439 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-hmrsj_c38b0f06-ab2f-48e1-9181-d9410e8896be/extract-utilities/0.log" Nov 27 17:09:50 crc kubenswrapper[4707]: I1127 17:09:50.337471 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-hmrsj_c38b0f06-ab2f-48e1-9181-d9410e8896be/extract-content/0.log" Nov 27 17:09:50 crc kubenswrapper[4707]: I1127 17:09:50.561384 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-b989k_d6b92c04-fcc0-4d96-8200-3edd228dd326/extract-utilities/0.log" Nov 27 17:09:50 crc kubenswrapper[4707]: I1127 17:09:50.874191 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-hmrsj_c38b0f06-ab2f-48e1-9181-d9410e8896be/registry-server/0.log" Nov 27 17:09:50 crc kubenswrapper[4707]: I1127 17:09:50.899237 4707 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_community-operators-b989k_d6b92c04-fcc0-4d96-8200-3edd228dd326/extract-content/0.log" Nov 27 17:09:50 crc kubenswrapper[4707]: I1127 17:09:50.899456 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-b989k_d6b92c04-fcc0-4d96-8200-3edd228dd326/extract-content/0.log" Nov 27 17:09:50 crc kubenswrapper[4707]: I1127 17:09:50.950360 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-b989k_d6b92c04-fcc0-4d96-8200-3edd228dd326/extract-utilities/0.log" Nov 27 17:09:51 crc kubenswrapper[4707]: I1127 17:09:51.114078 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-b989k_d6b92c04-fcc0-4d96-8200-3edd228dd326/extract-content/0.log" Nov 27 17:09:51 crc kubenswrapper[4707]: I1127 17:09:51.118441 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-b989k_d6b92c04-fcc0-4d96-8200-3edd228dd326/extract-utilities/0.log" Nov 27 17:09:51 crc kubenswrapper[4707]: I1127 17:09:51.323102 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-k5pzx_d8fb3604-08bd-4ad8-9838-8275101534c7/marketplace-operator/0.log" Nov 27 17:09:51 crc kubenswrapper[4707]: I1127 17:09:51.476767 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-zn2pc_363ac0a9-1890-4420-9afd-5a2b2ead2c51/extract-utilities/0.log" Nov 27 17:09:51 crc kubenswrapper[4707]: I1127 17:09:51.693757 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-b989k_d6b92c04-fcc0-4d96-8200-3edd228dd326/registry-server/0.log" Nov 27 17:09:51 crc kubenswrapper[4707]: I1127 17:09:51.695798 4707 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-marketplace-zn2pc_363ac0a9-1890-4420-9afd-5a2b2ead2c51/extract-utilities/0.log" Nov 27 17:09:51 crc kubenswrapper[4707]: I1127 17:09:51.764395 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-zn2pc_363ac0a9-1890-4420-9afd-5a2b2ead2c51/extract-content/0.log" Nov 27 17:09:51 crc kubenswrapper[4707]: I1127 17:09:51.777662 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-zn2pc_363ac0a9-1890-4420-9afd-5a2b2ead2c51/extract-content/0.log" Nov 27 17:09:51 crc kubenswrapper[4707]: I1127 17:09:51.959849 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-zn2pc_363ac0a9-1890-4420-9afd-5a2b2ead2c51/extract-utilities/0.log" Nov 27 17:09:51 crc kubenswrapper[4707]: I1127 17:09:51.972196 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-zn2pc_363ac0a9-1890-4420-9afd-5a2b2ead2c51/extract-content/0.log" Nov 27 17:09:52 crc kubenswrapper[4707]: I1127 17:09:52.090659 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-zthlf_61f2d660-6553-4b21-b002-7bbc56c701ea/extract-utilities/0.log" Nov 27 17:09:52 crc kubenswrapper[4707]: I1127 17:09:52.180232 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-zn2pc_363ac0a9-1890-4420-9afd-5a2b2ead2c51/registry-server/0.log" Nov 27 17:09:52 crc kubenswrapper[4707]: I1127 17:09:52.295023 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-zthlf_61f2d660-6553-4b21-b002-7bbc56c701ea/extract-utilities/0.log" Nov 27 17:09:52 crc kubenswrapper[4707]: I1127 17:09:52.321474 4707 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-operators-zthlf_61f2d660-6553-4b21-b002-7bbc56c701ea/extract-content/0.log" Nov 27 17:09:52 crc kubenswrapper[4707]: I1127 17:09:52.322317 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-zthlf_61f2d660-6553-4b21-b002-7bbc56c701ea/extract-content/0.log" Nov 27 17:09:52 crc kubenswrapper[4707]: I1127 17:09:52.526954 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-zthlf_61f2d660-6553-4b21-b002-7bbc56c701ea/extract-content/0.log" Nov 27 17:09:52 crc kubenswrapper[4707]: I1127 17:09:52.545456 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-zthlf_61f2d660-6553-4b21-b002-7bbc56c701ea/extract-utilities/0.log" Nov 27 17:09:52 crc kubenswrapper[4707]: I1127 17:09:52.962028 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-zthlf_61f2d660-6553-4b21-b002-7bbc56c701ea/registry-server/0.log" Nov 27 17:10:00 crc kubenswrapper[4707]: I1127 17:10:00.195492 4707 scope.go:117] "RemoveContainer" containerID="786824b12a8a2896c46704bdf83468e2fc0bd47b51cb67ff29cb9d7963ea6d9c" Nov 27 17:10:00 crc kubenswrapper[4707]: E1127 17:10:00.196326 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-c995m_openshift-machine-config-operator(a83beb0d-8dd1-434a-ace2-933f98e3956f)\"" pod="openshift-machine-config-operator/machine-config-daemon-c995m" podUID="a83beb0d-8dd1-434a-ace2-933f98e3956f" Nov 27 17:10:05 crc kubenswrapper[4707]: I1127 17:10:05.005133 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-668cf9dfbb-zwnr6_fb96b00d-b40a-4af9-8bc5-8b4dfbe2a5fa/prometheus-operator/0.log" Nov 27 17:10:05 crc 
kubenswrapper[4707]: I1127 17:10:05.176413 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-7677977bcd-fx2tz_e1ab71b3-d476-4099-b4d8-c59b437dd4f7/prometheus-operator-admission-webhook/0.log" Nov 27 17:10:05 crc kubenswrapper[4707]: I1127 17:10:05.187166 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-7677977bcd-tp2hp_4208373c-9a43-406a-8262-0aff47b60f85/prometheus-operator-admission-webhook/0.log" Nov 27 17:10:05 crc kubenswrapper[4707]: I1127 17:10:05.372440 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_observability-operator-d8bb48f5d-82wd5_e96caaf7-dd4f-4734-8601-23d38df9005f/operator/0.log" Nov 27 17:10:05 crc kubenswrapper[4707]: I1127 17:10:05.387890 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_perses-operator-5446b9c989-jgbmm_0c162a13-e55c-44f0-9ab7-cc7ce6d87605/perses-operator/0.log" Nov 27 17:10:14 crc kubenswrapper[4707]: I1127 17:10:14.195870 4707 scope.go:117] "RemoveContainer" containerID="786824b12a8a2896c46704bdf83468e2fc0bd47b51cb67ff29cb9d7963ea6d9c" Nov 27 17:10:14 crc kubenswrapper[4707]: E1127 17:10:14.196701 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-c995m_openshift-machine-config-operator(a83beb0d-8dd1-434a-ace2-933f98e3956f)\"" pod="openshift-machine-config-operator/machine-config-daemon-c995m" podUID="a83beb0d-8dd1-434a-ace2-933f98e3956f" Nov 27 17:10:26 crc kubenswrapper[4707]: I1127 17:10:26.195661 4707 scope.go:117] "RemoveContainer" containerID="786824b12a8a2896c46704bdf83468e2fc0bd47b51cb67ff29cb9d7963ea6d9c" Nov 27 17:10:26 crc kubenswrapper[4707]: E1127 17:10:26.196484 4707 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-c995m_openshift-machine-config-operator(a83beb0d-8dd1-434a-ace2-933f98e3956f)\"" pod="openshift-machine-config-operator/machine-config-daemon-c995m" podUID="a83beb0d-8dd1-434a-ace2-933f98e3956f" Nov 27 17:10:38 crc kubenswrapper[4707]: I1127 17:10:38.196180 4707 scope.go:117] "RemoveContainer" containerID="786824b12a8a2896c46704bdf83468e2fc0bd47b51cb67ff29cb9d7963ea6d9c" Nov 27 17:10:38 crc kubenswrapper[4707]: I1127 17:10:38.940942 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-c995m" event={"ID":"a83beb0d-8dd1-434a-ace2-933f98e3956f","Type":"ContainerStarted","Data":"7df475a96b4c84af5dc879adfdcbbe0e67a53f0090543ba36dc812a5f72b0be9"} Nov 27 17:11:35 crc kubenswrapper[4707]: I1127 17:11:35.547445 4707 generic.go:334] "Generic (PLEG): container finished" podID="01e74cc9-d6b3-4a7d-b1e3-a477fc1cf8cd" containerID="de86bcc8a66e042585b3daed35e4bb8cb35d43512fec07bcd4fd3556c4a71f74" exitCode=0 Nov 27 17:11:35 crc kubenswrapper[4707]: I1127 17:11:35.547560 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-bhqp9/must-gather-t2zcf" event={"ID":"01e74cc9-d6b3-4a7d-b1e3-a477fc1cf8cd","Type":"ContainerDied","Data":"de86bcc8a66e042585b3daed35e4bb8cb35d43512fec07bcd4fd3556c4a71f74"} Nov 27 17:11:35 crc kubenswrapper[4707]: I1127 17:11:35.548823 4707 scope.go:117] "RemoveContainer" containerID="de86bcc8a66e042585b3daed35e4bb8cb35d43512fec07bcd4fd3556c4a71f74" Nov 27 17:11:36 crc kubenswrapper[4707]: I1127 17:11:36.184620 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-bhqp9_must-gather-t2zcf_01e74cc9-d6b3-4a7d-b1e3-a477fc1cf8cd/gather/0.log" Nov 27 17:11:44 crc kubenswrapper[4707]: I1127 17:11:44.403152 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-must-gather-bhqp9/must-gather-t2zcf"] Nov 27 17:11:44 crc kubenswrapper[4707]: I1127 17:11:44.404035 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-bhqp9/must-gather-t2zcf" podUID="01e74cc9-d6b3-4a7d-b1e3-a477fc1cf8cd" containerName="copy" containerID="cri-o://566c365908703a7c822f755f595c62588dcca51048a9c498e97e1e1a7156589f" gracePeriod=2 Nov 27 17:11:44 crc kubenswrapper[4707]: I1127 17:11:44.412033 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-bhqp9/must-gather-t2zcf"] Nov 27 17:11:44 crc kubenswrapper[4707]: E1127 17:11:44.842095 4707 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod01e74cc9_d6b3_4a7d_b1e3_a477fc1cf8cd.slice/crio-conmon-566c365908703a7c822f755f595c62588dcca51048a9c498e97e1e1a7156589f.scope\": RecentStats: unable to find data in memory cache]" Nov 27 17:11:45 crc kubenswrapper[4707]: I1127 17:11:45.185896 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-bhqp9_must-gather-t2zcf_01e74cc9-d6b3-4a7d-b1e3-a477fc1cf8cd/copy/0.log" Nov 27 17:11:45 crc kubenswrapper[4707]: I1127 17:11:45.186487 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-bhqp9/must-gather-t2zcf" Nov 27 17:11:45 crc kubenswrapper[4707]: I1127 17:11:45.316736 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/01e74cc9-d6b3-4a7d-b1e3-a477fc1cf8cd-must-gather-output\") pod \"01e74cc9-d6b3-4a7d-b1e3-a477fc1cf8cd\" (UID: \"01e74cc9-d6b3-4a7d-b1e3-a477fc1cf8cd\") " Nov 27 17:11:45 crc kubenswrapper[4707]: I1127 17:11:45.316932 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pmmjp\" (UniqueName: \"kubernetes.io/projected/01e74cc9-d6b3-4a7d-b1e3-a477fc1cf8cd-kube-api-access-pmmjp\") pod \"01e74cc9-d6b3-4a7d-b1e3-a477fc1cf8cd\" (UID: \"01e74cc9-d6b3-4a7d-b1e3-a477fc1cf8cd\") " Nov 27 17:11:45 crc kubenswrapper[4707]: I1127 17:11:45.324797 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01e74cc9-d6b3-4a7d-b1e3-a477fc1cf8cd-kube-api-access-pmmjp" (OuterVolumeSpecName: "kube-api-access-pmmjp") pod "01e74cc9-d6b3-4a7d-b1e3-a477fc1cf8cd" (UID: "01e74cc9-d6b3-4a7d-b1e3-a477fc1cf8cd"). InnerVolumeSpecName "kube-api-access-pmmjp". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 17:11:45 crc kubenswrapper[4707]: I1127 17:11:45.419379 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pmmjp\" (UniqueName: \"kubernetes.io/projected/01e74cc9-d6b3-4a7d-b1e3-a477fc1cf8cd-kube-api-access-pmmjp\") on node \"crc\" DevicePath \"\"" Nov 27 17:11:45 crc kubenswrapper[4707]: I1127 17:11:45.468854 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/01e74cc9-d6b3-4a7d-b1e3-a477fc1cf8cd-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "01e74cc9-d6b3-4a7d-b1e3-a477fc1cf8cd" (UID: "01e74cc9-d6b3-4a7d-b1e3-a477fc1cf8cd"). InnerVolumeSpecName "must-gather-output". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 27 17:11:45 crc kubenswrapper[4707]: I1127 17:11:45.521600 4707 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/01e74cc9-d6b3-4a7d-b1e3-a477fc1cf8cd-must-gather-output\") on node \"crc\" DevicePath \"\"" Nov 27 17:11:45 crc kubenswrapper[4707]: I1127 17:11:45.645420 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-bhqp9_must-gather-t2zcf_01e74cc9-d6b3-4a7d-b1e3-a477fc1cf8cd/copy/0.log" Nov 27 17:11:45 crc kubenswrapper[4707]: I1127 17:11:45.645820 4707 generic.go:334] "Generic (PLEG): container finished" podID="01e74cc9-d6b3-4a7d-b1e3-a477fc1cf8cd" containerID="566c365908703a7c822f755f595c62588dcca51048a9c498e97e1e1a7156589f" exitCode=143 Nov 27 17:11:45 crc kubenswrapper[4707]: I1127 17:11:45.645893 4707 scope.go:117] "RemoveContainer" containerID="566c365908703a7c822f755f595c62588dcca51048a9c498e97e1e1a7156589f" Nov 27 17:11:45 crc kubenswrapper[4707]: I1127 17:11:45.646055 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-bhqp9/must-gather-t2zcf" Nov 27 17:11:45 crc kubenswrapper[4707]: I1127 17:11:45.666579 4707 scope.go:117] "RemoveContainer" containerID="de86bcc8a66e042585b3daed35e4bb8cb35d43512fec07bcd4fd3556c4a71f74" Nov 27 17:11:45 crc kubenswrapper[4707]: I1127 17:11:45.735073 4707 scope.go:117] "RemoveContainer" containerID="566c365908703a7c822f755f595c62588dcca51048a9c498e97e1e1a7156589f" Nov 27 17:11:45 crc kubenswrapper[4707]: E1127 17:11:45.735521 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"566c365908703a7c822f755f595c62588dcca51048a9c498e97e1e1a7156589f\": container with ID starting with 566c365908703a7c822f755f595c62588dcca51048a9c498e97e1e1a7156589f not found: ID does not exist" containerID="566c365908703a7c822f755f595c62588dcca51048a9c498e97e1e1a7156589f" Nov 27 17:11:45 crc kubenswrapper[4707]: I1127 17:11:45.735575 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"566c365908703a7c822f755f595c62588dcca51048a9c498e97e1e1a7156589f"} err="failed to get container status \"566c365908703a7c822f755f595c62588dcca51048a9c498e97e1e1a7156589f\": rpc error: code = NotFound desc = could not find container \"566c365908703a7c822f755f595c62588dcca51048a9c498e97e1e1a7156589f\": container with ID starting with 566c365908703a7c822f755f595c62588dcca51048a9c498e97e1e1a7156589f not found: ID does not exist" Nov 27 17:11:45 crc kubenswrapper[4707]: I1127 17:11:45.735608 4707 scope.go:117] "RemoveContainer" containerID="de86bcc8a66e042585b3daed35e4bb8cb35d43512fec07bcd4fd3556c4a71f74" Nov 27 17:11:45 crc kubenswrapper[4707]: E1127 17:11:45.735883 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"de86bcc8a66e042585b3daed35e4bb8cb35d43512fec07bcd4fd3556c4a71f74\": container with ID starting with 
de86bcc8a66e042585b3daed35e4bb8cb35d43512fec07bcd4fd3556c4a71f74 not found: ID does not exist" containerID="de86bcc8a66e042585b3daed35e4bb8cb35d43512fec07bcd4fd3556c4a71f74" Nov 27 17:11:45 crc kubenswrapper[4707]: I1127 17:11:45.735919 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"de86bcc8a66e042585b3daed35e4bb8cb35d43512fec07bcd4fd3556c4a71f74"} err="failed to get container status \"de86bcc8a66e042585b3daed35e4bb8cb35d43512fec07bcd4fd3556c4a71f74\": rpc error: code = NotFound desc = could not find container \"de86bcc8a66e042585b3daed35e4bb8cb35d43512fec07bcd4fd3556c4a71f74\": container with ID starting with de86bcc8a66e042585b3daed35e4bb8cb35d43512fec07bcd4fd3556c4a71f74 not found: ID does not exist" Nov 27 17:11:47 crc kubenswrapper[4707]: I1127 17:11:47.215053 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01e74cc9-d6b3-4a7d-b1e3-a477fc1cf8cd" path="/var/lib/kubelet/pods/01e74cc9-d6b3-4a7d-b1e3-a477fc1cf8cd/volumes" Nov 27 17:12:10 crc kubenswrapper[4707]: I1127 17:12:10.182908 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-nhdwb"] Nov 27 17:12:10 crc kubenswrapper[4707]: E1127 17:12:10.183988 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="52c212b6-24e6-48a9-b435-43dad584eac8" containerName="container-00" Nov 27 17:12:10 crc kubenswrapper[4707]: I1127 17:12:10.184019 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="52c212b6-24e6-48a9-b435-43dad584eac8" containerName="container-00" Nov 27 17:12:10 crc kubenswrapper[4707]: E1127 17:12:10.184048 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="01e74cc9-d6b3-4a7d-b1e3-a477fc1cf8cd" containerName="gather" Nov 27 17:12:10 crc kubenswrapper[4707]: I1127 17:12:10.184057 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="01e74cc9-d6b3-4a7d-b1e3-a477fc1cf8cd" containerName="gather" Nov 27 17:12:10 crc kubenswrapper[4707]: 
E1127 17:12:10.184090 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="01e74cc9-d6b3-4a7d-b1e3-a477fc1cf8cd" containerName="copy" Nov 27 17:12:10 crc kubenswrapper[4707]: I1127 17:12:10.184097 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="01e74cc9-d6b3-4a7d-b1e3-a477fc1cf8cd" containerName="copy" Nov 27 17:12:10 crc kubenswrapper[4707]: I1127 17:12:10.184324 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="01e74cc9-d6b3-4a7d-b1e3-a477fc1cf8cd" containerName="gather" Nov 27 17:12:10 crc kubenswrapper[4707]: I1127 17:12:10.184335 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="52c212b6-24e6-48a9-b435-43dad584eac8" containerName="container-00" Nov 27 17:12:10 crc kubenswrapper[4707]: I1127 17:12:10.184354 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="01e74cc9-d6b3-4a7d-b1e3-a477fc1cf8cd" containerName="copy" Nov 27 17:12:10 crc kubenswrapper[4707]: I1127 17:12:10.186395 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-nhdwb" Nov 27 17:12:10 crc kubenswrapper[4707]: I1127 17:12:10.196018 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-nhdwb"] Nov 27 17:12:10 crc kubenswrapper[4707]: I1127 17:12:10.262916 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/91627ea6-1e3e-470a-b2cf-dccfd8631366-catalog-content\") pod \"certified-operators-nhdwb\" (UID: \"91627ea6-1e3e-470a-b2cf-dccfd8631366\") " pod="openshift-marketplace/certified-operators-nhdwb" Nov 27 17:12:10 crc kubenswrapper[4707]: I1127 17:12:10.262983 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h4sks\" (UniqueName: \"kubernetes.io/projected/91627ea6-1e3e-470a-b2cf-dccfd8631366-kube-api-access-h4sks\") pod \"certified-operators-nhdwb\" (UID: \"91627ea6-1e3e-470a-b2cf-dccfd8631366\") " pod="openshift-marketplace/certified-operators-nhdwb" Nov 27 17:12:10 crc kubenswrapper[4707]: I1127 17:12:10.263168 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/91627ea6-1e3e-470a-b2cf-dccfd8631366-utilities\") pod \"certified-operators-nhdwb\" (UID: \"91627ea6-1e3e-470a-b2cf-dccfd8631366\") " pod="openshift-marketplace/certified-operators-nhdwb" Nov 27 17:12:10 crc kubenswrapper[4707]: I1127 17:12:10.364603 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/91627ea6-1e3e-470a-b2cf-dccfd8631366-catalog-content\") pod \"certified-operators-nhdwb\" (UID: \"91627ea6-1e3e-470a-b2cf-dccfd8631366\") " pod="openshift-marketplace/certified-operators-nhdwb" Nov 27 17:12:10 crc kubenswrapper[4707]: I1127 17:12:10.364689 4707 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-h4sks\" (UniqueName: \"kubernetes.io/projected/91627ea6-1e3e-470a-b2cf-dccfd8631366-kube-api-access-h4sks\") pod \"certified-operators-nhdwb\" (UID: \"91627ea6-1e3e-470a-b2cf-dccfd8631366\") " pod="openshift-marketplace/certified-operators-nhdwb" Nov 27 17:12:10 crc kubenswrapper[4707]: I1127 17:12:10.364757 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/91627ea6-1e3e-470a-b2cf-dccfd8631366-utilities\") pod \"certified-operators-nhdwb\" (UID: \"91627ea6-1e3e-470a-b2cf-dccfd8631366\") " pod="openshift-marketplace/certified-operators-nhdwb" Nov 27 17:12:10 crc kubenswrapper[4707]: I1127 17:12:10.365215 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/91627ea6-1e3e-470a-b2cf-dccfd8631366-catalog-content\") pod \"certified-operators-nhdwb\" (UID: \"91627ea6-1e3e-470a-b2cf-dccfd8631366\") " pod="openshift-marketplace/certified-operators-nhdwb" Nov 27 17:12:10 crc kubenswrapper[4707]: I1127 17:12:10.365255 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/91627ea6-1e3e-470a-b2cf-dccfd8631366-utilities\") pod \"certified-operators-nhdwb\" (UID: \"91627ea6-1e3e-470a-b2cf-dccfd8631366\") " pod="openshift-marketplace/certified-operators-nhdwb" Nov 27 17:12:10 crc kubenswrapper[4707]: I1127 17:12:10.383554 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h4sks\" (UniqueName: \"kubernetes.io/projected/91627ea6-1e3e-470a-b2cf-dccfd8631366-kube-api-access-h4sks\") pod \"certified-operators-nhdwb\" (UID: \"91627ea6-1e3e-470a-b2cf-dccfd8631366\") " pod="openshift-marketplace/certified-operators-nhdwb" Nov 27 17:12:10 crc kubenswrapper[4707]: I1127 17:12:10.508558 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-nhdwb" Nov 27 17:12:11 crc kubenswrapper[4707]: I1127 17:12:11.099605 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-nhdwb"] Nov 27 17:12:11 crc kubenswrapper[4707]: I1127 17:12:11.939758 4707 generic.go:334] "Generic (PLEG): container finished" podID="91627ea6-1e3e-470a-b2cf-dccfd8631366" containerID="295763f1ced2d6dc1c920399af1fb6c0ce6f805e8c0b497b7bc10b6e612681ef" exitCode=0 Nov 27 17:12:11 crc kubenswrapper[4707]: I1127 17:12:11.939811 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nhdwb" event={"ID":"91627ea6-1e3e-470a-b2cf-dccfd8631366","Type":"ContainerDied","Data":"295763f1ced2d6dc1c920399af1fb6c0ce6f805e8c0b497b7bc10b6e612681ef"} Nov 27 17:12:11 crc kubenswrapper[4707]: I1127 17:12:11.940086 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nhdwb" event={"ID":"91627ea6-1e3e-470a-b2cf-dccfd8631366","Type":"ContainerStarted","Data":"2944af612fdac5ee9140c51c21fbbfc81124eb5aa0030b31189f68570cb7a1c7"} Nov 27 17:12:11 crc kubenswrapper[4707]: I1127 17:12:11.943883 4707 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Nov 27 17:12:14 crc kubenswrapper[4707]: I1127 17:12:14.969794 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nhdwb" event={"ID":"91627ea6-1e3e-470a-b2cf-dccfd8631366","Type":"ContainerStarted","Data":"77af82bbbd24addf11580a03255a2f90ac737d149f507b97b5f5bbe3bfed4128"} Nov 27 17:12:16 crc kubenswrapper[4707]: I1127 17:12:16.991197 4707 generic.go:334] "Generic (PLEG): container finished" podID="91627ea6-1e3e-470a-b2cf-dccfd8631366" containerID="77af82bbbd24addf11580a03255a2f90ac737d149f507b97b5f5bbe3bfed4128" exitCode=0 Nov 27 17:12:16 crc kubenswrapper[4707]: I1127 17:12:16.991259 4707 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/certified-operators-nhdwb" event={"ID":"91627ea6-1e3e-470a-b2cf-dccfd8631366","Type":"ContainerDied","Data":"77af82bbbd24addf11580a03255a2f90ac737d149f507b97b5f5bbe3bfed4128"} Nov 27 17:12:19 crc kubenswrapper[4707]: I1127 17:12:19.017364 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nhdwb" event={"ID":"91627ea6-1e3e-470a-b2cf-dccfd8631366","Type":"ContainerStarted","Data":"e7a4a154f1776692c7616a1ee1faab24216cc8b3cea0b86d17ae3a742f8eed24"} Nov 27 17:12:19 crc kubenswrapper[4707]: I1127 17:12:19.043981 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-nhdwb" podStartSLOduration=2.912166954 podStartE2EDuration="9.043958149s" podCreationTimestamp="2025-11-27 17:12:10 +0000 UTC" firstStartedPulling="2025-11-27 17:12:11.943616488 +0000 UTC m=+4107.575065256" lastFinishedPulling="2025-11-27 17:12:18.075407683 +0000 UTC m=+4113.706856451" observedRunningTime="2025-11-27 17:12:19.037384878 +0000 UTC m=+4114.668833646" watchObservedRunningTime="2025-11-27 17:12:19.043958149 +0000 UTC m=+4114.675406917" Nov 27 17:12:20 crc kubenswrapper[4707]: I1127 17:12:20.509302 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-nhdwb" Nov 27 17:12:20 crc kubenswrapper[4707]: I1127 17:12:20.509738 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-nhdwb" Nov 27 17:12:20 crc kubenswrapper[4707]: I1127 17:12:20.582490 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-nhdwb" Nov 27 17:12:30 crc kubenswrapper[4707]: I1127 17:12:30.568631 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-nhdwb" Nov 27 17:12:30 crc kubenswrapper[4707]: I1127 
17:12:30.622198 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-nhdwb"] Nov 27 17:12:31 crc kubenswrapper[4707]: I1127 17:12:31.149813 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-nhdwb" podUID="91627ea6-1e3e-470a-b2cf-dccfd8631366" containerName="registry-server" containerID="cri-o://e7a4a154f1776692c7616a1ee1faab24216cc8b3cea0b86d17ae3a742f8eed24" gracePeriod=2 Nov 27 17:12:32 crc kubenswrapper[4707]: I1127 17:12:32.170764 4707 generic.go:334] "Generic (PLEG): container finished" podID="91627ea6-1e3e-470a-b2cf-dccfd8631366" containerID="e7a4a154f1776692c7616a1ee1faab24216cc8b3cea0b86d17ae3a742f8eed24" exitCode=0 Nov 27 17:12:32 crc kubenswrapper[4707]: I1127 17:12:32.170855 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nhdwb" event={"ID":"91627ea6-1e3e-470a-b2cf-dccfd8631366","Type":"ContainerDied","Data":"e7a4a154f1776692c7616a1ee1faab24216cc8b3cea0b86d17ae3a742f8eed24"} Nov 27 17:12:32 crc kubenswrapper[4707]: I1127 17:12:32.382815 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-nhdwb" Nov 27 17:12:32 crc kubenswrapper[4707]: I1127 17:12:32.487744 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/91627ea6-1e3e-470a-b2cf-dccfd8631366-utilities\") pod \"91627ea6-1e3e-470a-b2cf-dccfd8631366\" (UID: \"91627ea6-1e3e-470a-b2cf-dccfd8631366\") " Nov 27 17:12:32 crc kubenswrapper[4707]: I1127 17:12:32.487806 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/91627ea6-1e3e-470a-b2cf-dccfd8631366-catalog-content\") pod \"91627ea6-1e3e-470a-b2cf-dccfd8631366\" (UID: \"91627ea6-1e3e-470a-b2cf-dccfd8631366\") " Nov 27 17:12:32 crc kubenswrapper[4707]: I1127 17:12:32.487947 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h4sks\" (UniqueName: \"kubernetes.io/projected/91627ea6-1e3e-470a-b2cf-dccfd8631366-kube-api-access-h4sks\") pod \"91627ea6-1e3e-470a-b2cf-dccfd8631366\" (UID: \"91627ea6-1e3e-470a-b2cf-dccfd8631366\") " Nov 27 17:12:32 crc kubenswrapper[4707]: I1127 17:12:32.489338 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/91627ea6-1e3e-470a-b2cf-dccfd8631366-utilities" (OuterVolumeSpecName: "utilities") pod "91627ea6-1e3e-470a-b2cf-dccfd8631366" (UID: "91627ea6-1e3e-470a-b2cf-dccfd8631366"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 27 17:12:32 crc kubenswrapper[4707]: I1127 17:12:32.552751 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/91627ea6-1e3e-470a-b2cf-dccfd8631366-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "91627ea6-1e3e-470a-b2cf-dccfd8631366" (UID: "91627ea6-1e3e-470a-b2cf-dccfd8631366"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 27 17:12:32 crc kubenswrapper[4707]: I1127 17:12:32.558351 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/91627ea6-1e3e-470a-b2cf-dccfd8631366-kube-api-access-h4sks" (OuterVolumeSpecName: "kube-api-access-h4sks") pod "91627ea6-1e3e-470a-b2cf-dccfd8631366" (UID: "91627ea6-1e3e-470a-b2cf-dccfd8631366"). InnerVolumeSpecName "kube-api-access-h4sks". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 17:12:32 crc kubenswrapper[4707]: I1127 17:12:32.589728 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h4sks\" (UniqueName: \"kubernetes.io/projected/91627ea6-1e3e-470a-b2cf-dccfd8631366-kube-api-access-h4sks\") on node \"crc\" DevicePath \"\"" Nov 27 17:12:32 crc kubenswrapper[4707]: I1127 17:12:32.589768 4707 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/91627ea6-1e3e-470a-b2cf-dccfd8631366-utilities\") on node \"crc\" DevicePath \"\"" Nov 27 17:12:32 crc kubenswrapper[4707]: I1127 17:12:32.589778 4707 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/91627ea6-1e3e-470a-b2cf-dccfd8631366-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 27 17:12:33 crc kubenswrapper[4707]: I1127 17:12:33.182761 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nhdwb" event={"ID":"91627ea6-1e3e-470a-b2cf-dccfd8631366","Type":"ContainerDied","Data":"2944af612fdac5ee9140c51c21fbbfc81124eb5aa0030b31189f68570cb7a1c7"} Nov 27 17:12:33 crc kubenswrapper[4707]: I1127 17:12:33.183057 4707 scope.go:117] "RemoveContainer" containerID="e7a4a154f1776692c7616a1ee1faab24216cc8b3cea0b86d17ae3a742f8eed24" Nov 27 17:12:33 crc kubenswrapper[4707]: I1127 17:12:33.182887 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-nhdwb" Nov 27 17:12:33 crc kubenswrapper[4707]: I1127 17:12:33.207541 4707 scope.go:117] "RemoveContainer" containerID="77af82bbbd24addf11580a03255a2f90ac737d149f507b97b5f5bbe3bfed4128" Nov 27 17:12:33 crc kubenswrapper[4707]: I1127 17:12:33.233960 4707 scope.go:117] "RemoveContainer" containerID="295763f1ced2d6dc1c920399af1fb6c0ce6f805e8c0b497b7bc10b6e612681ef" Nov 27 17:12:33 crc kubenswrapper[4707]: I1127 17:12:33.234880 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-nhdwb"] Nov 27 17:12:33 crc kubenswrapper[4707]: I1127 17:12:33.245389 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-nhdwb"] Nov 27 17:12:35 crc kubenswrapper[4707]: I1127 17:12:35.207681 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="91627ea6-1e3e-470a-b2cf-dccfd8631366" path="/var/lib/kubelet/pods/91627ea6-1e3e-470a-b2cf-dccfd8631366/volumes" Nov 27 17:12:44 crc kubenswrapper[4707]: I1127 17:12:44.302226 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-c94q9"] Nov 27 17:12:44 crc kubenswrapper[4707]: E1127 17:12:44.310174 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="91627ea6-1e3e-470a-b2cf-dccfd8631366" containerName="extract-content" Nov 27 17:12:44 crc kubenswrapper[4707]: I1127 17:12:44.310197 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="91627ea6-1e3e-470a-b2cf-dccfd8631366" containerName="extract-content" Nov 27 17:12:44 crc kubenswrapper[4707]: E1127 17:12:44.310246 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="91627ea6-1e3e-470a-b2cf-dccfd8631366" containerName="registry-server" Nov 27 17:12:44 crc kubenswrapper[4707]: I1127 17:12:44.310253 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="91627ea6-1e3e-470a-b2cf-dccfd8631366" containerName="registry-server" Nov 
27 17:12:44 crc kubenswrapper[4707]: E1127 17:12:44.310271 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="91627ea6-1e3e-470a-b2cf-dccfd8631366" containerName="extract-utilities" Nov 27 17:12:44 crc kubenswrapper[4707]: I1127 17:12:44.310279 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="91627ea6-1e3e-470a-b2cf-dccfd8631366" containerName="extract-utilities" Nov 27 17:12:44 crc kubenswrapper[4707]: I1127 17:12:44.310572 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="91627ea6-1e3e-470a-b2cf-dccfd8631366" containerName="registry-server" Nov 27 17:12:44 crc kubenswrapper[4707]: I1127 17:12:44.312575 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-c94q9" Nov 27 17:12:44 crc kubenswrapper[4707]: I1127 17:12:44.320036 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-c94q9"] Nov 27 17:12:44 crc kubenswrapper[4707]: I1127 17:12:44.380445 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d6a50b5d-3786-4afe-8d1d-aaf00b05cb20-catalog-content\") pod \"redhat-marketplace-c94q9\" (UID: \"d6a50b5d-3786-4afe-8d1d-aaf00b05cb20\") " pod="openshift-marketplace/redhat-marketplace-c94q9" Nov 27 17:12:44 crc kubenswrapper[4707]: I1127 17:12:44.380497 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t8qfz\" (UniqueName: \"kubernetes.io/projected/d6a50b5d-3786-4afe-8d1d-aaf00b05cb20-kube-api-access-t8qfz\") pod \"redhat-marketplace-c94q9\" (UID: \"d6a50b5d-3786-4afe-8d1d-aaf00b05cb20\") " pod="openshift-marketplace/redhat-marketplace-c94q9" Nov 27 17:12:44 crc kubenswrapper[4707]: I1127 17:12:44.380596 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/d6a50b5d-3786-4afe-8d1d-aaf00b05cb20-utilities\") pod \"redhat-marketplace-c94q9\" (UID: \"d6a50b5d-3786-4afe-8d1d-aaf00b05cb20\") " pod="openshift-marketplace/redhat-marketplace-c94q9" Nov 27 17:12:44 crc kubenswrapper[4707]: I1127 17:12:44.481872 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d6a50b5d-3786-4afe-8d1d-aaf00b05cb20-utilities\") pod \"redhat-marketplace-c94q9\" (UID: \"d6a50b5d-3786-4afe-8d1d-aaf00b05cb20\") " pod="openshift-marketplace/redhat-marketplace-c94q9" Nov 27 17:12:44 crc kubenswrapper[4707]: I1127 17:12:44.481976 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d6a50b5d-3786-4afe-8d1d-aaf00b05cb20-catalog-content\") pod \"redhat-marketplace-c94q9\" (UID: \"d6a50b5d-3786-4afe-8d1d-aaf00b05cb20\") " pod="openshift-marketplace/redhat-marketplace-c94q9" Nov 27 17:12:44 crc kubenswrapper[4707]: I1127 17:12:44.482004 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t8qfz\" (UniqueName: \"kubernetes.io/projected/d6a50b5d-3786-4afe-8d1d-aaf00b05cb20-kube-api-access-t8qfz\") pod \"redhat-marketplace-c94q9\" (UID: \"d6a50b5d-3786-4afe-8d1d-aaf00b05cb20\") " pod="openshift-marketplace/redhat-marketplace-c94q9" Nov 27 17:12:44 crc kubenswrapper[4707]: I1127 17:12:44.482602 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d6a50b5d-3786-4afe-8d1d-aaf00b05cb20-catalog-content\") pod \"redhat-marketplace-c94q9\" (UID: \"d6a50b5d-3786-4afe-8d1d-aaf00b05cb20\") " pod="openshift-marketplace/redhat-marketplace-c94q9" Nov 27 17:12:44 crc kubenswrapper[4707]: I1127 17:12:44.482696 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/d6a50b5d-3786-4afe-8d1d-aaf00b05cb20-utilities\") pod \"redhat-marketplace-c94q9\" (UID: \"d6a50b5d-3786-4afe-8d1d-aaf00b05cb20\") " pod="openshift-marketplace/redhat-marketplace-c94q9" Nov 27 17:12:44 crc kubenswrapper[4707]: I1127 17:12:44.501428 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t8qfz\" (UniqueName: \"kubernetes.io/projected/d6a50b5d-3786-4afe-8d1d-aaf00b05cb20-kube-api-access-t8qfz\") pod \"redhat-marketplace-c94q9\" (UID: \"d6a50b5d-3786-4afe-8d1d-aaf00b05cb20\") " pod="openshift-marketplace/redhat-marketplace-c94q9" Nov 27 17:12:44 crc kubenswrapper[4707]: I1127 17:12:44.633503 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-c94q9" Nov 27 17:12:45 crc kubenswrapper[4707]: I1127 17:12:45.132415 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-c94q9"] Nov 27 17:12:45 crc kubenswrapper[4707]: I1127 17:12:45.318040 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-c94q9" event={"ID":"d6a50b5d-3786-4afe-8d1d-aaf00b05cb20","Type":"ContainerStarted","Data":"da212e112c0cec81d15b6cc25bd7726e5cef2996208c9e5bdcbff76af2d68e6b"} Nov 27 17:12:46 crc kubenswrapper[4707]: I1127 17:12:46.332055 4707 generic.go:334] "Generic (PLEG): container finished" podID="d6a50b5d-3786-4afe-8d1d-aaf00b05cb20" containerID="9314950d6dc6acfb2abc5f4be6904fcc0516e33264be4b24d526b252a1270d0a" exitCode=0 Nov 27 17:12:46 crc kubenswrapper[4707]: I1127 17:12:46.332161 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-c94q9" event={"ID":"d6a50b5d-3786-4afe-8d1d-aaf00b05cb20","Type":"ContainerDied","Data":"9314950d6dc6acfb2abc5f4be6904fcc0516e33264be4b24d526b252a1270d0a"} Nov 27 17:12:47 crc kubenswrapper[4707]: I1127 17:12:47.343942 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-marketplace-c94q9" event={"ID":"d6a50b5d-3786-4afe-8d1d-aaf00b05cb20","Type":"ContainerStarted","Data":"e46bbe1332e5f7efddc38bc249ddcccdbdeb079d6a1ef139e26dfcdce3549f86"} Nov 27 17:12:48 crc kubenswrapper[4707]: I1127 17:12:48.353279 4707 generic.go:334] "Generic (PLEG): container finished" podID="d6a50b5d-3786-4afe-8d1d-aaf00b05cb20" containerID="e46bbe1332e5f7efddc38bc249ddcccdbdeb079d6a1ef139e26dfcdce3549f86" exitCode=0 Nov 27 17:12:48 crc kubenswrapper[4707]: I1127 17:12:48.353391 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-c94q9" event={"ID":"d6a50b5d-3786-4afe-8d1d-aaf00b05cb20","Type":"ContainerDied","Data":"e46bbe1332e5f7efddc38bc249ddcccdbdeb079d6a1ef139e26dfcdce3549f86"} Nov 27 17:12:49 crc kubenswrapper[4707]: I1127 17:12:49.368324 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-c94q9" event={"ID":"d6a50b5d-3786-4afe-8d1d-aaf00b05cb20","Type":"ContainerStarted","Data":"9d5b1084c4113700123ed466fe5b800e42980237ecf12d6dd881ad3848ac965d"} Nov 27 17:12:49 crc kubenswrapper[4707]: I1127 17:12:49.393653 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-c94q9" podStartSLOduration=2.687485363 podStartE2EDuration="5.393630181s" podCreationTimestamp="2025-11-27 17:12:44 +0000 UTC" firstStartedPulling="2025-11-27 17:12:46.335450396 +0000 UTC m=+4141.966899164" lastFinishedPulling="2025-11-27 17:12:49.041595184 +0000 UTC m=+4144.673043982" observedRunningTime="2025-11-27 17:12:49.390095694 +0000 UTC m=+4145.021544462" watchObservedRunningTime="2025-11-27 17:12:49.393630181 +0000 UTC m=+4145.025078949" Nov 27 17:12:54 crc kubenswrapper[4707]: I1127 17:12:54.634604 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-c94q9" Nov 27 17:12:54 crc kubenswrapper[4707]: I1127 17:12:54.635158 4707 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-c94q9" Nov 27 17:12:54 crc kubenswrapper[4707]: I1127 17:12:54.676177 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-c94q9" Nov 27 17:12:56 crc kubenswrapper[4707]: I1127 17:12:56.109328 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-c94q9" Nov 27 17:12:56 crc kubenswrapper[4707]: I1127 17:12:56.164559 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-c94q9"] Nov 27 17:12:57 crc kubenswrapper[4707]: I1127 17:12:57.446328 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-c94q9" podUID="d6a50b5d-3786-4afe-8d1d-aaf00b05cb20" containerName="registry-server" containerID="cri-o://9d5b1084c4113700123ed466fe5b800e42980237ecf12d6dd881ad3848ac965d" gracePeriod=2 Nov 27 17:12:58 crc kubenswrapper[4707]: I1127 17:12:58.034238 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-c94q9" Nov 27 17:12:58 crc kubenswrapper[4707]: I1127 17:12:58.201932 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t8qfz\" (UniqueName: \"kubernetes.io/projected/d6a50b5d-3786-4afe-8d1d-aaf00b05cb20-kube-api-access-t8qfz\") pod \"d6a50b5d-3786-4afe-8d1d-aaf00b05cb20\" (UID: \"d6a50b5d-3786-4afe-8d1d-aaf00b05cb20\") " Nov 27 17:12:58 crc kubenswrapper[4707]: I1127 17:12:58.202432 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d6a50b5d-3786-4afe-8d1d-aaf00b05cb20-utilities\") pod \"d6a50b5d-3786-4afe-8d1d-aaf00b05cb20\" (UID: \"d6a50b5d-3786-4afe-8d1d-aaf00b05cb20\") " Nov 27 17:12:58 crc kubenswrapper[4707]: I1127 17:12:58.202506 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d6a50b5d-3786-4afe-8d1d-aaf00b05cb20-catalog-content\") pod \"d6a50b5d-3786-4afe-8d1d-aaf00b05cb20\" (UID: \"d6a50b5d-3786-4afe-8d1d-aaf00b05cb20\") " Nov 27 17:12:58 crc kubenswrapper[4707]: I1127 17:12:58.203265 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d6a50b5d-3786-4afe-8d1d-aaf00b05cb20-utilities" (OuterVolumeSpecName: "utilities") pod "d6a50b5d-3786-4afe-8d1d-aaf00b05cb20" (UID: "d6a50b5d-3786-4afe-8d1d-aaf00b05cb20"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 27 17:12:58 crc kubenswrapper[4707]: I1127 17:12:58.215778 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d6a50b5d-3786-4afe-8d1d-aaf00b05cb20-kube-api-access-t8qfz" (OuterVolumeSpecName: "kube-api-access-t8qfz") pod "d6a50b5d-3786-4afe-8d1d-aaf00b05cb20" (UID: "d6a50b5d-3786-4afe-8d1d-aaf00b05cb20"). InnerVolumeSpecName "kube-api-access-t8qfz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 17:12:58 crc kubenswrapper[4707]: I1127 17:12:58.230798 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d6a50b5d-3786-4afe-8d1d-aaf00b05cb20-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d6a50b5d-3786-4afe-8d1d-aaf00b05cb20" (UID: "d6a50b5d-3786-4afe-8d1d-aaf00b05cb20"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 27 17:12:58 crc kubenswrapper[4707]: I1127 17:12:58.304901 4707 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d6a50b5d-3786-4afe-8d1d-aaf00b05cb20-utilities\") on node \"crc\" DevicePath \"\"" Nov 27 17:12:58 crc kubenswrapper[4707]: I1127 17:12:58.304931 4707 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d6a50b5d-3786-4afe-8d1d-aaf00b05cb20-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 27 17:12:58 crc kubenswrapper[4707]: I1127 17:12:58.304943 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t8qfz\" (UniqueName: \"kubernetes.io/projected/d6a50b5d-3786-4afe-8d1d-aaf00b05cb20-kube-api-access-t8qfz\") on node \"crc\" DevicePath \"\"" Nov 27 17:12:58 crc kubenswrapper[4707]: I1127 17:12:58.457496 4707 generic.go:334] "Generic (PLEG): container finished" podID="d6a50b5d-3786-4afe-8d1d-aaf00b05cb20" containerID="9d5b1084c4113700123ed466fe5b800e42980237ecf12d6dd881ad3848ac965d" exitCode=0 Nov 27 17:12:58 crc kubenswrapper[4707]: I1127 17:12:58.457539 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-c94q9" event={"ID":"d6a50b5d-3786-4afe-8d1d-aaf00b05cb20","Type":"ContainerDied","Data":"9d5b1084c4113700123ed466fe5b800e42980237ecf12d6dd881ad3848ac965d"} Nov 27 17:12:58 crc kubenswrapper[4707]: I1127 17:12:58.457550 4707 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-c94q9" Nov 27 17:12:58 crc kubenswrapper[4707]: I1127 17:12:58.457568 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-c94q9" event={"ID":"d6a50b5d-3786-4afe-8d1d-aaf00b05cb20","Type":"ContainerDied","Data":"da212e112c0cec81d15b6cc25bd7726e5cef2996208c9e5bdcbff76af2d68e6b"} Nov 27 17:12:58 crc kubenswrapper[4707]: I1127 17:12:58.457587 4707 scope.go:117] "RemoveContainer" containerID="9d5b1084c4113700123ed466fe5b800e42980237ecf12d6dd881ad3848ac965d" Nov 27 17:12:58 crc kubenswrapper[4707]: I1127 17:12:58.477623 4707 scope.go:117] "RemoveContainer" containerID="e46bbe1332e5f7efddc38bc249ddcccdbdeb079d6a1ef139e26dfcdce3549f86" Nov 27 17:12:58 crc kubenswrapper[4707]: I1127 17:12:58.496506 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-c94q9"] Nov 27 17:12:58 crc kubenswrapper[4707]: I1127 17:12:58.508127 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-c94q9"] Nov 27 17:12:58 crc kubenswrapper[4707]: I1127 17:12:58.522382 4707 scope.go:117] "RemoveContainer" containerID="9314950d6dc6acfb2abc5f4be6904fcc0516e33264be4b24d526b252a1270d0a" Nov 27 17:12:58 crc kubenswrapper[4707]: I1127 17:12:58.553976 4707 scope.go:117] "RemoveContainer" containerID="9d5b1084c4113700123ed466fe5b800e42980237ecf12d6dd881ad3848ac965d" Nov 27 17:12:58 crc kubenswrapper[4707]: E1127 17:12:58.554358 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9d5b1084c4113700123ed466fe5b800e42980237ecf12d6dd881ad3848ac965d\": container with ID starting with 9d5b1084c4113700123ed466fe5b800e42980237ecf12d6dd881ad3848ac965d not found: ID does not exist" containerID="9d5b1084c4113700123ed466fe5b800e42980237ecf12d6dd881ad3848ac965d" Nov 27 17:12:58 crc kubenswrapper[4707]: I1127 17:12:58.554516 4707 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9d5b1084c4113700123ed466fe5b800e42980237ecf12d6dd881ad3848ac965d"} err="failed to get container status \"9d5b1084c4113700123ed466fe5b800e42980237ecf12d6dd881ad3848ac965d\": rpc error: code = NotFound desc = could not find container \"9d5b1084c4113700123ed466fe5b800e42980237ecf12d6dd881ad3848ac965d\": container with ID starting with 9d5b1084c4113700123ed466fe5b800e42980237ecf12d6dd881ad3848ac965d not found: ID does not exist" Nov 27 17:12:58 crc kubenswrapper[4707]: I1127 17:12:58.554545 4707 scope.go:117] "RemoveContainer" containerID="e46bbe1332e5f7efddc38bc249ddcccdbdeb079d6a1ef139e26dfcdce3549f86" Nov 27 17:12:58 crc kubenswrapper[4707]: E1127 17:12:58.554918 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e46bbe1332e5f7efddc38bc249ddcccdbdeb079d6a1ef139e26dfcdce3549f86\": container with ID starting with e46bbe1332e5f7efddc38bc249ddcccdbdeb079d6a1ef139e26dfcdce3549f86 not found: ID does not exist" containerID="e46bbe1332e5f7efddc38bc249ddcccdbdeb079d6a1ef139e26dfcdce3549f86" Nov 27 17:12:58 crc kubenswrapper[4707]: I1127 17:12:58.554960 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e46bbe1332e5f7efddc38bc249ddcccdbdeb079d6a1ef139e26dfcdce3549f86"} err="failed to get container status \"e46bbe1332e5f7efddc38bc249ddcccdbdeb079d6a1ef139e26dfcdce3549f86\": rpc error: code = NotFound desc = could not find container \"e46bbe1332e5f7efddc38bc249ddcccdbdeb079d6a1ef139e26dfcdce3549f86\": container with ID starting with e46bbe1332e5f7efddc38bc249ddcccdbdeb079d6a1ef139e26dfcdce3549f86 not found: ID does not exist" Nov 27 17:12:58 crc kubenswrapper[4707]: I1127 17:12:58.554986 4707 scope.go:117] "RemoveContainer" containerID="9314950d6dc6acfb2abc5f4be6904fcc0516e33264be4b24d526b252a1270d0a" Nov 27 17:12:58 crc kubenswrapper[4707]: E1127 
17:12:58.555361 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9314950d6dc6acfb2abc5f4be6904fcc0516e33264be4b24d526b252a1270d0a\": container with ID starting with 9314950d6dc6acfb2abc5f4be6904fcc0516e33264be4b24d526b252a1270d0a not found: ID does not exist" containerID="9314950d6dc6acfb2abc5f4be6904fcc0516e33264be4b24d526b252a1270d0a" Nov 27 17:12:58 crc kubenswrapper[4707]: I1127 17:12:58.555452 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9314950d6dc6acfb2abc5f4be6904fcc0516e33264be4b24d526b252a1270d0a"} err="failed to get container status \"9314950d6dc6acfb2abc5f4be6904fcc0516e33264be4b24d526b252a1270d0a\": rpc error: code = NotFound desc = could not find container \"9314950d6dc6acfb2abc5f4be6904fcc0516e33264be4b24d526b252a1270d0a\": container with ID starting with 9314950d6dc6acfb2abc5f4be6904fcc0516e33264be4b24d526b252a1270d0a not found: ID does not exist" Nov 27 17:12:59 crc kubenswrapper[4707]: I1127 17:12:59.207110 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d6a50b5d-3786-4afe-8d1d-aaf00b05cb20" path="/var/lib/kubelet/pods/d6a50b5d-3786-4afe-8d1d-aaf00b05cb20/volumes" Nov 27 17:13:03 crc kubenswrapper[4707]: I1127 17:13:03.623730 4707 patch_prober.go:28] interesting pod/machine-config-daemon-c995m container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 27 17:13:03 crc kubenswrapper[4707]: I1127 17:13:03.624456 4707 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-c995m" podUID="a83beb0d-8dd1-434a-ace2-933f98e3956f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection 
refused" Nov 27 17:13:15 crc kubenswrapper[4707]: I1127 17:13:15.501824 4707 scope.go:117] "RemoveContainer" containerID="9287b1eb3cdc0b2ca6c4589583eb166c5dba968e8d16bbaf0c59252b8a764009" Nov 27 17:13:33 crc kubenswrapper[4707]: I1127 17:13:33.623725 4707 patch_prober.go:28] interesting pod/machine-config-daemon-c995m container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 27 17:13:33 crc kubenswrapper[4707]: I1127 17:13:33.624271 4707 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-c995m" podUID="a83beb0d-8dd1-434a-ace2-933f98e3956f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 27 17:13:56 crc kubenswrapper[4707]: I1127 17:13:56.347554 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-ngmsd"] Nov 27 17:13:56 crc kubenswrapper[4707]: E1127 17:13:56.348648 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d6a50b5d-3786-4afe-8d1d-aaf00b05cb20" containerName="extract-content" Nov 27 17:13:56 crc kubenswrapper[4707]: I1127 17:13:56.348668 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="d6a50b5d-3786-4afe-8d1d-aaf00b05cb20" containerName="extract-content" Nov 27 17:13:56 crc kubenswrapper[4707]: E1127 17:13:56.348696 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d6a50b5d-3786-4afe-8d1d-aaf00b05cb20" containerName="extract-utilities" Nov 27 17:13:56 crc kubenswrapper[4707]: I1127 17:13:56.348705 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="d6a50b5d-3786-4afe-8d1d-aaf00b05cb20" containerName="extract-utilities" Nov 27 17:13:56 crc kubenswrapper[4707]: E1127 17:13:56.348746 4707 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="d6a50b5d-3786-4afe-8d1d-aaf00b05cb20" containerName="registry-server" Nov 27 17:13:56 crc kubenswrapper[4707]: I1127 17:13:56.348757 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="d6a50b5d-3786-4afe-8d1d-aaf00b05cb20" containerName="registry-server" Nov 27 17:13:56 crc kubenswrapper[4707]: I1127 17:13:56.348999 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="d6a50b5d-3786-4afe-8d1d-aaf00b05cb20" containerName="registry-server" Nov 27 17:13:56 crc kubenswrapper[4707]: I1127 17:13:56.350922 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-ngmsd" Nov 27 17:13:56 crc kubenswrapper[4707]: I1127 17:13:56.361499 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-ngmsd"] Nov 27 17:13:56 crc kubenswrapper[4707]: I1127 17:13:56.491498 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a60493f4-8e86-4c87-a4ad-353c16ffcc3c-utilities\") pod \"redhat-operators-ngmsd\" (UID: \"a60493f4-8e86-4c87-a4ad-353c16ffcc3c\") " pod="openshift-marketplace/redhat-operators-ngmsd" Nov 27 17:13:56 crc kubenswrapper[4707]: I1127 17:13:56.491925 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a60493f4-8e86-4c87-a4ad-353c16ffcc3c-catalog-content\") pod \"redhat-operators-ngmsd\" (UID: \"a60493f4-8e86-4c87-a4ad-353c16ffcc3c\") " pod="openshift-marketplace/redhat-operators-ngmsd" Nov 27 17:13:56 crc kubenswrapper[4707]: I1127 17:13:56.492037 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sd8hl\" (UniqueName: \"kubernetes.io/projected/a60493f4-8e86-4c87-a4ad-353c16ffcc3c-kube-api-access-sd8hl\") pod 
\"redhat-operators-ngmsd\" (UID: \"a60493f4-8e86-4c87-a4ad-353c16ffcc3c\") " pod="openshift-marketplace/redhat-operators-ngmsd" Nov 27 17:13:56 crc kubenswrapper[4707]: I1127 17:13:56.593960 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a60493f4-8e86-4c87-a4ad-353c16ffcc3c-catalog-content\") pod \"redhat-operators-ngmsd\" (UID: \"a60493f4-8e86-4c87-a4ad-353c16ffcc3c\") " pod="openshift-marketplace/redhat-operators-ngmsd" Nov 27 17:13:56 crc kubenswrapper[4707]: I1127 17:13:56.594035 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sd8hl\" (UniqueName: \"kubernetes.io/projected/a60493f4-8e86-4c87-a4ad-353c16ffcc3c-kube-api-access-sd8hl\") pod \"redhat-operators-ngmsd\" (UID: \"a60493f4-8e86-4c87-a4ad-353c16ffcc3c\") " pod="openshift-marketplace/redhat-operators-ngmsd" Nov 27 17:13:56 crc kubenswrapper[4707]: I1127 17:13:56.594151 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a60493f4-8e86-4c87-a4ad-353c16ffcc3c-utilities\") pod \"redhat-operators-ngmsd\" (UID: \"a60493f4-8e86-4c87-a4ad-353c16ffcc3c\") " pod="openshift-marketplace/redhat-operators-ngmsd" Nov 27 17:13:56 crc kubenswrapper[4707]: I1127 17:13:56.594584 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a60493f4-8e86-4c87-a4ad-353c16ffcc3c-catalog-content\") pod \"redhat-operators-ngmsd\" (UID: \"a60493f4-8e86-4c87-a4ad-353c16ffcc3c\") " pod="openshift-marketplace/redhat-operators-ngmsd" Nov 27 17:13:56 crc kubenswrapper[4707]: I1127 17:13:56.594671 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a60493f4-8e86-4c87-a4ad-353c16ffcc3c-utilities\") pod \"redhat-operators-ngmsd\" (UID: 
\"a60493f4-8e86-4c87-a4ad-353c16ffcc3c\") " pod="openshift-marketplace/redhat-operators-ngmsd" Nov 27 17:13:56 crc kubenswrapper[4707]: I1127 17:13:56.614712 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sd8hl\" (UniqueName: \"kubernetes.io/projected/a60493f4-8e86-4c87-a4ad-353c16ffcc3c-kube-api-access-sd8hl\") pod \"redhat-operators-ngmsd\" (UID: \"a60493f4-8e86-4c87-a4ad-353c16ffcc3c\") " pod="openshift-marketplace/redhat-operators-ngmsd" Nov 27 17:13:56 crc kubenswrapper[4707]: I1127 17:13:56.689510 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-ngmsd" Nov 27 17:13:57 crc kubenswrapper[4707]: I1127 17:13:57.211534 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-ngmsd"] Nov 27 17:13:58 crc kubenswrapper[4707]: I1127 17:13:58.138527 4707 generic.go:334] "Generic (PLEG): container finished" podID="a60493f4-8e86-4c87-a4ad-353c16ffcc3c" containerID="cb9cf1562717956b8d63fce67caaed45eade74f0af4a595e97f728318c6ff614" exitCode=0 Nov 27 17:13:58 crc kubenswrapper[4707]: I1127 17:13:58.138635 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ngmsd" event={"ID":"a60493f4-8e86-4c87-a4ad-353c16ffcc3c","Type":"ContainerDied","Data":"cb9cf1562717956b8d63fce67caaed45eade74f0af4a595e97f728318c6ff614"} Nov 27 17:13:58 crc kubenswrapper[4707]: I1127 17:13:58.138948 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ngmsd" event={"ID":"a60493f4-8e86-4c87-a4ad-353c16ffcc3c","Type":"ContainerStarted","Data":"ebc81b5c0f1c0528e405bb604a08030803e8fa2d91f1459524f609c807473992"} Nov 27 17:14:01 crc kubenswrapper[4707]: I1127 17:14:01.171765 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ngmsd" 
event={"ID":"a60493f4-8e86-4c87-a4ad-353c16ffcc3c","Type":"ContainerStarted","Data":"896291cdfc1dea09e68bcad8099f9148249da5611370e8599353fba6c3f295a1"} Nov 27 17:14:03 crc kubenswrapper[4707]: I1127 17:14:03.623578 4707 patch_prober.go:28] interesting pod/machine-config-daemon-c995m container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 27 17:14:03 crc kubenswrapper[4707]: I1127 17:14:03.623977 4707 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-c995m" podUID="a83beb0d-8dd1-434a-ace2-933f98e3956f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 27 17:14:03 crc kubenswrapper[4707]: I1127 17:14:03.624043 4707 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-c995m" Nov 27 17:14:03 crc kubenswrapper[4707]: I1127 17:14:03.625126 4707 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"7df475a96b4c84af5dc879adfdcbbe0e67a53f0090543ba36dc812a5f72b0be9"} pod="openshift-machine-config-operator/machine-config-daemon-c995m" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Nov 27 17:14:03 crc kubenswrapper[4707]: I1127 17:14:03.625229 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-c995m" podUID="a83beb0d-8dd1-434a-ace2-933f98e3956f" containerName="machine-config-daemon" containerID="cri-o://7df475a96b4c84af5dc879adfdcbbe0e67a53f0090543ba36dc812a5f72b0be9" gracePeriod=600 Nov 27 17:14:04 crc kubenswrapper[4707]: I1127 17:14:04.207534 
4707 generic.go:334] "Generic (PLEG): container finished" podID="a83beb0d-8dd1-434a-ace2-933f98e3956f" containerID="7df475a96b4c84af5dc879adfdcbbe0e67a53f0090543ba36dc812a5f72b0be9" exitCode=0 Nov 27 17:14:04 crc kubenswrapper[4707]: I1127 17:14:04.207709 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-c995m" event={"ID":"a83beb0d-8dd1-434a-ace2-933f98e3956f","Type":"ContainerDied","Data":"7df475a96b4c84af5dc879adfdcbbe0e67a53f0090543ba36dc812a5f72b0be9"} Nov 27 17:14:04 crc kubenswrapper[4707]: I1127 17:14:04.207798 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-c995m" event={"ID":"a83beb0d-8dd1-434a-ace2-933f98e3956f","Type":"ContainerStarted","Data":"2e1b0c0f7b1da7c68aecd1e64e838a92d9529dbeb83303ba51235175d9145fd2"} Nov 27 17:14:04 crc kubenswrapper[4707]: I1127 17:14:04.207827 4707 scope.go:117] "RemoveContainer" containerID="786824b12a8a2896c46704bdf83468e2fc0bd47b51cb67ff29cb9d7963ea6d9c" Nov 27 17:14:05 crc kubenswrapper[4707]: I1127 17:14:05.223818 4707 generic.go:334] "Generic (PLEG): container finished" podID="a60493f4-8e86-4c87-a4ad-353c16ffcc3c" containerID="896291cdfc1dea09e68bcad8099f9148249da5611370e8599353fba6c3f295a1" exitCode=0 Nov 27 17:14:05 crc kubenswrapper[4707]: I1127 17:14:05.224138 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ngmsd" event={"ID":"a60493f4-8e86-4c87-a4ad-353c16ffcc3c","Type":"ContainerDied","Data":"896291cdfc1dea09e68bcad8099f9148249da5611370e8599353fba6c3f295a1"} Nov 27 17:14:07 crc kubenswrapper[4707]: I1127 17:14:07.246091 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ngmsd" event={"ID":"a60493f4-8e86-4c87-a4ad-353c16ffcc3c","Type":"ContainerStarted","Data":"ad36d7c1db492bde35acb25b55ed8d10d1c9d45661e2fa710c75cb84a7e7408c"} Nov 27 17:14:07 crc kubenswrapper[4707]: I1127 
17:14:07.274439 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-ngmsd" podStartSLOduration=3.72733939 podStartE2EDuration="11.274405818s" podCreationTimestamp="2025-11-27 17:13:56 +0000 UTC" firstStartedPulling="2025-11-27 17:13:58.141014255 +0000 UTC m=+4213.772463053" lastFinishedPulling="2025-11-27 17:14:05.688080713 +0000 UTC m=+4221.319529481" observedRunningTime="2025-11-27 17:14:07.270333708 +0000 UTC m=+4222.901782476" watchObservedRunningTime="2025-11-27 17:14:07.274405818 +0000 UTC m=+4222.905854626" Nov 27 17:14:16 crc kubenswrapper[4707]: I1127 17:14:16.690413 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-ngmsd" Nov 27 17:14:16 crc kubenswrapper[4707]: I1127 17:14:16.691191 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-ngmsd" Nov 27 17:14:17 crc kubenswrapper[4707]: I1127 17:14:17.306437 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-ngmsd" Nov 27 17:14:17 crc kubenswrapper[4707]: I1127 17:14:17.420573 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-ngmsd" Nov 27 17:14:17 crc kubenswrapper[4707]: I1127 17:14:17.542861 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-ngmsd"] Nov 27 17:14:19 crc kubenswrapper[4707]: I1127 17:14:19.378904 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-ngmsd" podUID="a60493f4-8e86-4c87-a4ad-353c16ffcc3c" containerName="registry-server" containerID="cri-o://ad36d7c1db492bde35acb25b55ed8d10d1c9d45661e2fa710c75cb84a7e7408c" gracePeriod=2 Nov 27 17:14:19 crc kubenswrapper[4707]: I1127 17:14:19.921097 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-ngmsd" Nov 27 17:14:19 crc kubenswrapper[4707]: I1127 17:14:19.997170 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sd8hl\" (UniqueName: \"kubernetes.io/projected/a60493f4-8e86-4c87-a4ad-353c16ffcc3c-kube-api-access-sd8hl\") pod \"a60493f4-8e86-4c87-a4ad-353c16ffcc3c\" (UID: \"a60493f4-8e86-4c87-a4ad-353c16ffcc3c\") " Nov 27 17:14:19 crc kubenswrapper[4707]: I1127 17:14:19.997357 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a60493f4-8e86-4c87-a4ad-353c16ffcc3c-utilities\") pod \"a60493f4-8e86-4c87-a4ad-353c16ffcc3c\" (UID: \"a60493f4-8e86-4c87-a4ad-353c16ffcc3c\") " Nov 27 17:14:19 crc kubenswrapper[4707]: I1127 17:14:19.997510 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a60493f4-8e86-4c87-a4ad-353c16ffcc3c-catalog-content\") pod \"a60493f4-8e86-4c87-a4ad-353c16ffcc3c\" (UID: \"a60493f4-8e86-4c87-a4ad-353c16ffcc3c\") " Nov 27 17:14:19 crc kubenswrapper[4707]: I1127 17:14:19.998416 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a60493f4-8e86-4c87-a4ad-353c16ffcc3c-utilities" (OuterVolumeSpecName: "utilities") pod "a60493f4-8e86-4c87-a4ad-353c16ffcc3c" (UID: "a60493f4-8e86-4c87-a4ad-353c16ffcc3c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 27 17:14:20 crc kubenswrapper[4707]: I1127 17:14:20.002771 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a60493f4-8e86-4c87-a4ad-353c16ffcc3c-kube-api-access-sd8hl" (OuterVolumeSpecName: "kube-api-access-sd8hl") pod "a60493f4-8e86-4c87-a4ad-353c16ffcc3c" (UID: "a60493f4-8e86-4c87-a4ad-353c16ffcc3c"). InnerVolumeSpecName "kube-api-access-sd8hl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 17:14:20 crc kubenswrapper[4707]: I1127 17:14:20.099313 4707 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a60493f4-8e86-4c87-a4ad-353c16ffcc3c-utilities\") on node \"crc\" DevicePath \"\"" Nov 27 17:14:20 crc kubenswrapper[4707]: I1127 17:14:20.099554 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sd8hl\" (UniqueName: \"kubernetes.io/projected/a60493f4-8e86-4c87-a4ad-353c16ffcc3c-kube-api-access-sd8hl\") on node \"crc\" DevicePath \"\"" Nov 27 17:14:20 crc kubenswrapper[4707]: I1127 17:14:20.108879 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a60493f4-8e86-4c87-a4ad-353c16ffcc3c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a60493f4-8e86-4c87-a4ad-353c16ffcc3c" (UID: "a60493f4-8e86-4c87-a4ad-353c16ffcc3c"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 27 17:14:20 crc kubenswrapper[4707]: I1127 17:14:20.201893 4707 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a60493f4-8e86-4c87-a4ad-353c16ffcc3c-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 27 17:14:20 crc kubenswrapper[4707]: I1127 17:14:20.390115 4707 generic.go:334] "Generic (PLEG): container finished" podID="a60493f4-8e86-4c87-a4ad-353c16ffcc3c" containerID="ad36d7c1db492bde35acb25b55ed8d10d1c9d45661e2fa710c75cb84a7e7408c" exitCode=0 Nov 27 17:14:20 crc kubenswrapper[4707]: I1127 17:14:20.390167 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-ngmsd" Nov 27 17:14:20 crc kubenswrapper[4707]: I1127 17:14:20.390172 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ngmsd" event={"ID":"a60493f4-8e86-4c87-a4ad-353c16ffcc3c","Type":"ContainerDied","Data":"ad36d7c1db492bde35acb25b55ed8d10d1c9d45661e2fa710c75cb84a7e7408c"} Nov 27 17:14:20 crc kubenswrapper[4707]: I1127 17:14:20.390230 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ngmsd" event={"ID":"a60493f4-8e86-4c87-a4ad-353c16ffcc3c","Type":"ContainerDied","Data":"ebc81b5c0f1c0528e405bb604a08030803e8fa2d91f1459524f609c807473992"} Nov 27 17:14:20 crc kubenswrapper[4707]: I1127 17:14:20.390250 4707 scope.go:117] "RemoveContainer" containerID="ad36d7c1db492bde35acb25b55ed8d10d1c9d45661e2fa710c75cb84a7e7408c" Nov 27 17:14:20 crc kubenswrapper[4707]: I1127 17:14:20.412557 4707 scope.go:117] "RemoveContainer" containerID="896291cdfc1dea09e68bcad8099f9148249da5611370e8599353fba6c3f295a1" Nov 27 17:14:20 crc kubenswrapper[4707]: I1127 17:14:20.427011 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-ngmsd"] Nov 27 17:14:20 crc kubenswrapper[4707]: I1127 17:14:20.436510 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-ngmsd"] Nov 27 17:14:20 crc kubenswrapper[4707]: I1127 17:14:20.449956 4707 scope.go:117] "RemoveContainer" containerID="cb9cf1562717956b8d63fce67caaed45eade74f0af4a595e97f728318c6ff614" Nov 27 17:14:20 crc kubenswrapper[4707]: I1127 17:14:20.493161 4707 scope.go:117] "RemoveContainer" containerID="ad36d7c1db492bde35acb25b55ed8d10d1c9d45661e2fa710c75cb84a7e7408c" Nov 27 17:14:20 crc kubenswrapper[4707]: E1127 17:14:20.495018 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"ad36d7c1db492bde35acb25b55ed8d10d1c9d45661e2fa710c75cb84a7e7408c\": container with ID starting with ad36d7c1db492bde35acb25b55ed8d10d1c9d45661e2fa710c75cb84a7e7408c not found: ID does not exist" containerID="ad36d7c1db492bde35acb25b55ed8d10d1c9d45661e2fa710c75cb84a7e7408c" Nov 27 17:14:20 crc kubenswrapper[4707]: I1127 17:14:20.495087 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ad36d7c1db492bde35acb25b55ed8d10d1c9d45661e2fa710c75cb84a7e7408c"} err="failed to get container status \"ad36d7c1db492bde35acb25b55ed8d10d1c9d45661e2fa710c75cb84a7e7408c\": rpc error: code = NotFound desc = could not find container \"ad36d7c1db492bde35acb25b55ed8d10d1c9d45661e2fa710c75cb84a7e7408c\": container with ID starting with ad36d7c1db492bde35acb25b55ed8d10d1c9d45661e2fa710c75cb84a7e7408c not found: ID does not exist" Nov 27 17:14:20 crc kubenswrapper[4707]: I1127 17:14:20.495128 4707 scope.go:117] "RemoveContainer" containerID="896291cdfc1dea09e68bcad8099f9148249da5611370e8599353fba6c3f295a1" Nov 27 17:14:20 crc kubenswrapper[4707]: E1127 17:14:20.495491 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"896291cdfc1dea09e68bcad8099f9148249da5611370e8599353fba6c3f295a1\": container with ID starting with 896291cdfc1dea09e68bcad8099f9148249da5611370e8599353fba6c3f295a1 not found: ID does not exist" containerID="896291cdfc1dea09e68bcad8099f9148249da5611370e8599353fba6c3f295a1" Nov 27 17:14:20 crc kubenswrapper[4707]: I1127 17:14:20.495528 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"896291cdfc1dea09e68bcad8099f9148249da5611370e8599353fba6c3f295a1"} err="failed to get container status \"896291cdfc1dea09e68bcad8099f9148249da5611370e8599353fba6c3f295a1\": rpc error: code = NotFound desc = could not find container \"896291cdfc1dea09e68bcad8099f9148249da5611370e8599353fba6c3f295a1\": container with ID 
starting with 896291cdfc1dea09e68bcad8099f9148249da5611370e8599353fba6c3f295a1 not found: ID does not exist" Nov 27 17:14:20 crc kubenswrapper[4707]: I1127 17:14:20.495549 4707 scope.go:117] "RemoveContainer" containerID="cb9cf1562717956b8d63fce67caaed45eade74f0af4a595e97f728318c6ff614" Nov 27 17:14:20 crc kubenswrapper[4707]: E1127 17:14:20.495771 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cb9cf1562717956b8d63fce67caaed45eade74f0af4a595e97f728318c6ff614\": container with ID starting with cb9cf1562717956b8d63fce67caaed45eade74f0af4a595e97f728318c6ff614 not found: ID does not exist" containerID="cb9cf1562717956b8d63fce67caaed45eade74f0af4a595e97f728318c6ff614" Nov 27 17:14:20 crc kubenswrapper[4707]: I1127 17:14:20.495794 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cb9cf1562717956b8d63fce67caaed45eade74f0af4a595e97f728318c6ff614"} err="failed to get container status \"cb9cf1562717956b8d63fce67caaed45eade74f0af4a595e97f728318c6ff614\": rpc error: code = NotFound desc = could not find container \"cb9cf1562717956b8d63fce67caaed45eade74f0af4a595e97f728318c6ff614\": container with ID starting with cb9cf1562717956b8d63fce67caaed45eade74f0af4a595e97f728318c6ff614 not found: ID does not exist" Nov 27 17:14:21 crc kubenswrapper[4707]: I1127 17:14:21.206177 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a60493f4-8e86-4c87-a4ad-353c16ffcc3c" path="/var/lib/kubelet/pods/a60493f4-8e86-4c87-a4ad-353c16ffcc3c/volumes" Nov 27 17:14:28 crc kubenswrapper[4707]: I1127 17:14:28.002803 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-fblqb/must-gather-l6msz"] Nov 27 17:14:28 crc kubenswrapper[4707]: E1127 17:14:28.003831 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a60493f4-8e86-4c87-a4ad-353c16ffcc3c" containerName="extract-utilities" Nov 27 17:14:28 crc 
kubenswrapper[4707]: I1127 17:14:28.003849 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="a60493f4-8e86-4c87-a4ad-353c16ffcc3c" containerName="extract-utilities" Nov 27 17:14:28 crc kubenswrapper[4707]: E1127 17:14:28.003869 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a60493f4-8e86-4c87-a4ad-353c16ffcc3c" containerName="registry-server" Nov 27 17:14:28 crc kubenswrapper[4707]: I1127 17:14:28.003878 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="a60493f4-8e86-4c87-a4ad-353c16ffcc3c" containerName="registry-server" Nov 27 17:14:28 crc kubenswrapper[4707]: E1127 17:14:28.003896 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a60493f4-8e86-4c87-a4ad-353c16ffcc3c" containerName="extract-content" Nov 27 17:14:28 crc kubenswrapper[4707]: I1127 17:14:28.003905 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="a60493f4-8e86-4c87-a4ad-353c16ffcc3c" containerName="extract-content" Nov 27 17:14:28 crc kubenswrapper[4707]: I1127 17:14:28.004160 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="a60493f4-8e86-4c87-a4ad-353c16ffcc3c" containerName="registry-server" Nov 27 17:14:28 crc kubenswrapper[4707]: I1127 17:14:28.007514 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-fblqb/must-gather-l6msz" Nov 27 17:14:28 crc kubenswrapper[4707]: I1127 17:14:28.010748 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-fblqb"/"default-dockercfg-nbwpq" Nov 27 17:14:28 crc kubenswrapper[4707]: I1127 17:14:28.010860 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-fblqb"/"kube-root-ca.crt" Nov 27 17:14:28 crc kubenswrapper[4707]: I1127 17:14:28.013799 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-fblqb"/"openshift-service-ca.crt" Nov 27 17:14:28 crc kubenswrapper[4707]: I1127 17:14:28.038584 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-fblqb/must-gather-l6msz"] Nov 27 17:14:28 crc kubenswrapper[4707]: I1127 17:14:28.062430 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/e2bda7b9-df89-4f33-b48b-4930fe74f710-must-gather-output\") pod \"must-gather-l6msz\" (UID: \"e2bda7b9-df89-4f33-b48b-4930fe74f710\") " pod="openshift-must-gather-fblqb/must-gather-l6msz" Nov 27 17:14:28 crc kubenswrapper[4707]: I1127 17:14:28.062599 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mssnj\" (UniqueName: \"kubernetes.io/projected/e2bda7b9-df89-4f33-b48b-4930fe74f710-kube-api-access-mssnj\") pod \"must-gather-l6msz\" (UID: \"e2bda7b9-df89-4f33-b48b-4930fe74f710\") " pod="openshift-must-gather-fblqb/must-gather-l6msz" Nov 27 17:14:28 crc kubenswrapper[4707]: I1127 17:14:28.164795 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mssnj\" (UniqueName: \"kubernetes.io/projected/e2bda7b9-df89-4f33-b48b-4930fe74f710-kube-api-access-mssnj\") pod \"must-gather-l6msz\" (UID: \"e2bda7b9-df89-4f33-b48b-4930fe74f710\") " 
pod="openshift-must-gather-fblqb/must-gather-l6msz" Nov 27 17:14:28 crc kubenswrapper[4707]: I1127 17:14:28.164885 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/e2bda7b9-df89-4f33-b48b-4930fe74f710-must-gather-output\") pod \"must-gather-l6msz\" (UID: \"e2bda7b9-df89-4f33-b48b-4930fe74f710\") " pod="openshift-must-gather-fblqb/must-gather-l6msz" Nov 27 17:14:28 crc kubenswrapper[4707]: I1127 17:14:28.165631 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/e2bda7b9-df89-4f33-b48b-4930fe74f710-must-gather-output\") pod \"must-gather-l6msz\" (UID: \"e2bda7b9-df89-4f33-b48b-4930fe74f710\") " pod="openshift-must-gather-fblqb/must-gather-l6msz" Nov 27 17:14:28 crc kubenswrapper[4707]: I1127 17:14:28.185284 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mssnj\" (UniqueName: \"kubernetes.io/projected/e2bda7b9-df89-4f33-b48b-4930fe74f710-kube-api-access-mssnj\") pod \"must-gather-l6msz\" (UID: \"e2bda7b9-df89-4f33-b48b-4930fe74f710\") " pod="openshift-must-gather-fblqb/must-gather-l6msz" Nov 27 17:14:28 crc kubenswrapper[4707]: I1127 17:14:28.357798 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-fblqb/must-gather-l6msz" Nov 27 17:14:28 crc kubenswrapper[4707]: I1127 17:14:28.726798 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-fblqb/must-gather-l6msz"] Nov 27 17:14:29 crc kubenswrapper[4707]: I1127 17:14:29.523630 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-fblqb/must-gather-l6msz" event={"ID":"e2bda7b9-df89-4f33-b48b-4930fe74f710","Type":"ContainerStarted","Data":"fc3fdeff6fbf6dca7f846ede8e935eef09347beb40d58dda0d5bfda94123d52b"} Nov 27 17:14:29 crc kubenswrapper[4707]: I1127 17:14:29.524670 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-fblqb/must-gather-l6msz" event={"ID":"e2bda7b9-df89-4f33-b48b-4930fe74f710","Type":"ContainerStarted","Data":"0e18c78d42e67e1e7e8a796c7f58c30a5dfb9489f1009b406336410c5ee5a6db"} Nov 27 17:14:30 crc kubenswrapper[4707]: I1127 17:14:30.541610 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-fblqb/must-gather-l6msz" event={"ID":"e2bda7b9-df89-4f33-b48b-4930fe74f710","Type":"ContainerStarted","Data":"da68116461e63a7db76a1a74bf56bfc2556fc1c263007c9268ee4dc92bff32bd"} Nov 27 17:14:30 crc kubenswrapper[4707]: I1127 17:14:30.568250 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-fblqb/must-gather-l6msz" podStartSLOduration=3.568226416 podStartE2EDuration="3.568226416s" podCreationTimestamp="2025-11-27 17:14:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-27 17:14:30.561722796 +0000 UTC m=+4246.193171584" watchObservedRunningTime="2025-11-27 17:14:30.568226416 +0000 UTC m=+4246.199675184" Nov 27 17:14:32 crc kubenswrapper[4707]: I1127 17:14:32.835491 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-fblqb/crc-debug-mbrt4"] Nov 27 17:14:32 crc kubenswrapper[4707]: 
I1127 17:14:32.837243 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-fblqb/crc-debug-mbrt4" Nov 27 17:14:32 crc kubenswrapper[4707]: I1127 17:14:32.965481 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4dqcn\" (UniqueName: \"kubernetes.io/projected/e86a53b6-30a5-4c0b-9869-6980678e859f-kube-api-access-4dqcn\") pod \"crc-debug-mbrt4\" (UID: \"e86a53b6-30a5-4c0b-9869-6980678e859f\") " pod="openshift-must-gather-fblqb/crc-debug-mbrt4" Nov 27 17:14:32 crc kubenswrapper[4707]: I1127 17:14:32.965611 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e86a53b6-30a5-4c0b-9869-6980678e859f-host\") pod \"crc-debug-mbrt4\" (UID: \"e86a53b6-30a5-4c0b-9869-6980678e859f\") " pod="openshift-must-gather-fblqb/crc-debug-mbrt4" Nov 27 17:14:33 crc kubenswrapper[4707]: I1127 17:14:33.067645 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4dqcn\" (UniqueName: \"kubernetes.io/projected/e86a53b6-30a5-4c0b-9869-6980678e859f-kube-api-access-4dqcn\") pod \"crc-debug-mbrt4\" (UID: \"e86a53b6-30a5-4c0b-9869-6980678e859f\") " pod="openshift-must-gather-fblqb/crc-debug-mbrt4" Nov 27 17:14:33 crc kubenswrapper[4707]: I1127 17:14:33.067928 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e86a53b6-30a5-4c0b-9869-6980678e859f-host\") pod \"crc-debug-mbrt4\" (UID: \"e86a53b6-30a5-4c0b-9869-6980678e859f\") " pod="openshift-must-gather-fblqb/crc-debug-mbrt4" Nov 27 17:14:33 crc kubenswrapper[4707]: I1127 17:14:33.068056 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e86a53b6-30a5-4c0b-9869-6980678e859f-host\") pod \"crc-debug-mbrt4\" (UID: \"e86a53b6-30a5-4c0b-9869-6980678e859f\") 
" pod="openshift-must-gather-fblqb/crc-debug-mbrt4" Nov 27 17:14:33 crc kubenswrapper[4707]: I1127 17:14:33.094338 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4dqcn\" (UniqueName: \"kubernetes.io/projected/e86a53b6-30a5-4c0b-9869-6980678e859f-kube-api-access-4dqcn\") pod \"crc-debug-mbrt4\" (UID: \"e86a53b6-30a5-4c0b-9869-6980678e859f\") " pod="openshift-must-gather-fblqb/crc-debug-mbrt4" Nov 27 17:14:33 crc kubenswrapper[4707]: I1127 17:14:33.155970 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-fblqb/crc-debug-mbrt4" Nov 27 17:14:33 crc kubenswrapper[4707]: W1127 17:14:33.207273 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode86a53b6_30a5_4c0b_9869_6980678e859f.slice/crio-bbf9bb60ee340a300225c6d343eba7915d3d918848c4ba2de58e03c483b853f0 WatchSource:0}: Error finding container bbf9bb60ee340a300225c6d343eba7915d3d918848c4ba2de58e03c483b853f0: Status 404 returned error can't find the container with id bbf9bb60ee340a300225c6d343eba7915d3d918848c4ba2de58e03c483b853f0 Nov 27 17:14:33 crc kubenswrapper[4707]: I1127 17:14:33.578806 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-fblqb/crc-debug-mbrt4" event={"ID":"e86a53b6-30a5-4c0b-9869-6980678e859f","Type":"ContainerStarted","Data":"b30254c1609eef26a948d2f34cb1366253a69aeab377f83148c619b9783415ca"} Nov 27 17:14:33 crc kubenswrapper[4707]: I1127 17:14:33.579131 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-fblqb/crc-debug-mbrt4" event={"ID":"e86a53b6-30a5-4c0b-9869-6980678e859f","Type":"ContainerStarted","Data":"bbf9bb60ee340a300225c6d343eba7915d3d918848c4ba2de58e03c483b853f0"} Nov 27 17:14:33 crc kubenswrapper[4707]: I1127 17:14:33.613112 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-fblqb/crc-debug-mbrt4" 
podStartSLOduration=1.6130913630000001 podStartE2EDuration="1.613091363s" podCreationTimestamp="2025-11-27 17:14:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-27 17:14:33.598651179 +0000 UTC m=+4249.230099947" watchObservedRunningTime="2025-11-27 17:14:33.613091363 +0000 UTC m=+4249.244540131" Nov 27 17:14:47 crc kubenswrapper[4707]: I1127 17:14:47.706211 4707 generic.go:334] "Generic (PLEG): container finished" podID="e86a53b6-30a5-4c0b-9869-6980678e859f" containerID="b30254c1609eef26a948d2f34cb1366253a69aeab377f83148c619b9783415ca" exitCode=0 Nov 27 17:14:47 crc kubenswrapper[4707]: I1127 17:14:47.706294 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-fblqb/crc-debug-mbrt4" event={"ID":"e86a53b6-30a5-4c0b-9869-6980678e859f","Type":"ContainerDied","Data":"b30254c1609eef26a948d2f34cb1366253a69aeab377f83148c619b9783415ca"} Nov 27 17:14:48 crc kubenswrapper[4707]: I1127 17:14:48.841737 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-fblqb/crc-debug-mbrt4" Nov 27 17:14:48 crc kubenswrapper[4707]: I1127 17:14:48.877036 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-fblqb/crc-debug-mbrt4"] Nov 27 17:14:48 crc kubenswrapper[4707]: I1127 17:14:48.891202 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-fblqb/crc-debug-mbrt4"] Nov 27 17:14:48 crc kubenswrapper[4707]: I1127 17:14:48.903437 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e86a53b6-30a5-4c0b-9869-6980678e859f-host\") pod \"e86a53b6-30a5-4c0b-9869-6980678e859f\" (UID: \"e86a53b6-30a5-4c0b-9869-6980678e859f\") " Nov 27 17:14:48 crc kubenswrapper[4707]: I1127 17:14:48.903539 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4dqcn\" (UniqueName: \"kubernetes.io/projected/e86a53b6-30a5-4c0b-9869-6980678e859f-kube-api-access-4dqcn\") pod \"e86a53b6-30a5-4c0b-9869-6980678e859f\" (UID: \"e86a53b6-30a5-4c0b-9869-6980678e859f\") " Nov 27 17:14:48 crc kubenswrapper[4707]: I1127 17:14:48.904438 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e86a53b6-30a5-4c0b-9869-6980678e859f-host" (OuterVolumeSpecName: "host") pod "e86a53b6-30a5-4c0b-9869-6980678e859f" (UID: "e86a53b6-30a5-4c0b-9869-6980678e859f"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 27 17:14:48 crc kubenswrapper[4707]: I1127 17:14:48.912748 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e86a53b6-30a5-4c0b-9869-6980678e859f-kube-api-access-4dqcn" (OuterVolumeSpecName: "kube-api-access-4dqcn") pod "e86a53b6-30a5-4c0b-9869-6980678e859f" (UID: "e86a53b6-30a5-4c0b-9869-6980678e859f"). InnerVolumeSpecName "kube-api-access-4dqcn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 17:14:49 crc kubenswrapper[4707]: I1127 17:14:49.005795 4707 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e86a53b6-30a5-4c0b-9869-6980678e859f-host\") on node \"crc\" DevicePath \"\"" Nov 27 17:14:49 crc kubenswrapper[4707]: I1127 17:14:49.005830 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4dqcn\" (UniqueName: \"kubernetes.io/projected/e86a53b6-30a5-4c0b-9869-6980678e859f-kube-api-access-4dqcn\") on node \"crc\" DevicePath \"\"" Nov 27 17:14:49 crc kubenswrapper[4707]: I1127 17:14:49.207463 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e86a53b6-30a5-4c0b-9869-6980678e859f" path="/var/lib/kubelet/pods/e86a53b6-30a5-4c0b-9869-6980678e859f/volumes" Nov 27 17:14:49 crc kubenswrapper[4707]: I1127 17:14:49.725575 4707 scope.go:117] "RemoveContainer" containerID="b30254c1609eef26a948d2f34cb1366253a69aeab377f83148c619b9783415ca" Nov 27 17:14:49 crc kubenswrapper[4707]: I1127 17:14:49.725630 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-fblqb/crc-debug-mbrt4" Nov 27 17:14:50 crc kubenswrapper[4707]: I1127 17:14:50.057747 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-fblqb/crc-debug-h7np4"] Nov 27 17:14:50 crc kubenswrapper[4707]: E1127 17:14:50.058485 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e86a53b6-30a5-4c0b-9869-6980678e859f" containerName="container-00" Nov 27 17:14:50 crc kubenswrapper[4707]: I1127 17:14:50.058499 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="e86a53b6-30a5-4c0b-9869-6980678e859f" containerName="container-00" Nov 27 17:14:50 crc kubenswrapper[4707]: I1127 17:14:50.058717 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="e86a53b6-30a5-4c0b-9869-6980678e859f" containerName="container-00" Nov 27 17:14:50 crc kubenswrapper[4707]: I1127 17:14:50.059385 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-fblqb/crc-debug-h7np4" Nov 27 17:14:50 crc kubenswrapper[4707]: I1127 17:14:50.134129 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ftrgx\" (UniqueName: \"kubernetes.io/projected/a8b3d570-1583-42c4-b770-41f6d9ed04ec-kube-api-access-ftrgx\") pod \"crc-debug-h7np4\" (UID: \"a8b3d570-1583-42c4-b770-41f6d9ed04ec\") " pod="openshift-must-gather-fblqb/crc-debug-h7np4" Nov 27 17:14:50 crc kubenswrapper[4707]: I1127 17:14:50.134202 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a8b3d570-1583-42c4-b770-41f6d9ed04ec-host\") pod \"crc-debug-h7np4\" (UID: \"a8b3d570-1583-42c4-b770-41f6d9ed04ec\") " pod="openshift-must-gather-fblqb/crc-debug-h7np4" Nov 27 17:14:50 crc kubenswrapper[4707]: I1127 17:14:50.236778 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ftrgx\" (UniqueName: 
\"kubernetes.io/projected/a8b3d570-1583-42c4-b770-41f6d9ed04ec-kube-api-access-ftrgx\") pod \"crc-debug-h7np4\" (UID: \"a8b3d570-1583-42c4-b770-41f6d9ed04ec\") " pod="openshift-must-gather-fblqb/crc-debug-h7np4" Nov 27 17:14:50 crc kubenswrapper[4707]: I1127 17:14:50.236916 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a8b3d570-1583-42c4-b770-41f6d9ed04ec-host\") pod \"crc-debug-h7np4\" (UID: \"a8b3d570-1583-42c4-b770-41f6d9ed04ec\") " pod="openshift-must-gather-fblqb/crc-debug-h7np4" Nov 27 17:14:50 crc kubenswrapper[4707]: I1127 17:14:50.237010 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a8b3d570-1583-42c4-b770-41f6d9ed04ec-host\") pod \"crc-debug-h7np4\" (UID: \"a8b3d570-1583-42c4-b770-41f6d9ed04ec\") " pod="openshift-must-gather-fblqb/crc-debug-h7np4" Nov 27 17:14:50 crc kubenswrapper[4707]: I1127 17:14:50.459522 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ftrgx\" (UniqueName: \"kubernetes.io/projected/a8b3d570-1583-42c4-b770-41f6d9ed04ec-kube-api-access-ftrgx\") pod \"crc-debug-h7np4\" (UID: \"a8b3d570-1583-42c4-b770-41f6d9ed04ec\") " pod="openshift-must-gather-fblqb/crc-debug-h7np4" Nov 27 17:14:50 crc kubenswrapper[4707]: I1127 17:14:50.675614 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-fblqb/crc-debug-h7np4" Nov 27 17:14:50 crc kubenswrapper[4707]: I1127 17:14:50.735283 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-fblqb/crc-debug-h7np4" event={"ID":"a8b3d570-1583-42c4-b770-41f6d9ed04ec","Type":"ContainerStarted","Data":"9c14bf4b8605d1e0b5bcb2998ba184c7989413b4750e30518338c6d038f44a44"} Nov 27 17:14:51 crc kubenswrapper[4707]: I1127 17:14:51.748460 4707 generic.go:334] "Generic (PLEG): container finished" podID="a8b3d570-1583-42c4-b770-41f6d9ed04ec" containerID="45b67491e07eca8c202257a22afc8faf05eb844caa168f15ee87bd204232ed49" exitCode=1 Nov 27 17:14:51 crc kubenswrapper[4707]: I1127 17:14:51.748526 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-fblqb/crc-debug-h7np4" event={"ID":"a8b3d570-1583-42c4-b770-41f6d9ed04ec","Type":"ContainerDied","Data":"45b67491e07eca8c202257a22afc8faf05eb844caa168f15ee87bd204232ed49"} Nov 27 17:14:51 crc kubenswrapper[4707]: I1127 17:14:51.796092 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-fblqb/crc-debug-h7np4"] Nov 27 17:14:51 crc kubenswrapper[4707]: I1127 17:14:51.809629 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-fblqb/crc-debug-h7np4"] Nov 27 17:14:52 crc kubenswrapper[4707]: I1127 17:14:52.865335 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-fblqb/crc-debug-h7np4" Nov 27 17:14:52 crc kubenswrapper[4707]: I1127 17:14:52.988845 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a8b3d570-1583-42c4-b770-41f6d9ed04ec-host\") pod \"a8b3d570-1583-42c4-b770-41f6d9ed04ec\" (UID: \"a8b3d570-1583-42c4-b770-41f6d9ed04ec\") " Nov 27 17:14:52 crc kubenswrapper[4707]: I1127 17:14:52.988985 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a8b3d570-1583-42c4-b770-41f6d9ed04ec-host" (OuterVolumeSpecName: "host") pod "a8b3d570-1583-42c4-b770-41f6d9ed04ec" (UID: "a8b3d570-1583-42c4-b770-41f6d9ed04ec"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 27 17:14:52 crc kubenswrapper[4707]: I1127 17:14:52.989270 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ftrgx\" (UniqueName: \"kubernetes.io/projected/a8b3d570-1583-42c4-b770-41f6d9ed04ec-kube-api-access-ftrgx\") pod \"a8b3d570-1583-42c4-b770-41f6d9ed04ec\" (UID: \"a8b3d570-1583-42c4-b770-41f6d9ed04ec\") " Nov 27 17:14:52 crc kubenswrapper[4707]: I1127 17:14:52.989875 4707 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a8b3d570-1583-42c4-b770-41f6d9ed04ec-host\") on node \"crc\" DevicePath \"\"" Nov 27 17:14:52 crc kubenswrapper[4707]: I1127 17:14:52.996269 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a8b3d570-1583-42c4-b770-41f6d9ed04ec-kube-api-access-ftrgx" (OuterVolumeSpecName: "kube-api-access-ftrgx") pod "a8b3d570-1583-42c4-b770-41f6d9ed04ec" (UID: "a8b3d570-1583-42c4-b770-41f6d9ed04ec"). InnerVolumeSpecName "kube-api-access-ftrgx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 17:14:53 crc kubenswrapper[4707]: I1127 17:14:53.092361 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ftrgx\" (UniqueName: \"kubernetes.io/projected/a8b3d570-1583-42c4-b770-41f6d9ed04ec-kube-api-access-ftrgx\") on node \"crc\" DevicePath \"\"" Nov 27 17:14:53 crc kubenswrapper[4707]: I1127 17:14:53.208644 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a8b3d570-1583-42c4-b770-41f6d9ed04ec" path="/var/lib/kubelet/pods/a8b3d570-1583-42c4-b770-41f6d9ed04ec/volumes" Nov 27 17:14:53 crc kubenswrapper[4707]: I1127 17:14:53.765568 4707 scope.go:117] "RemoveContainer" containerID="45b67491e07eca8c202257a22afc8faf05eb844caa168f15ee87bd204232ed49" Nov 27 17:14:53 crc kubenswrapper[4707]: I1127 17:14:53.765610 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-fblqb/crc-debug-h7np4" Nov 27 17:15:00 crc kubenswrapper[4707]: I1127 17:15:00.176421 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29404395-t5ts4"] Nov 27 17:15:00 crc kubenswrapper[4707]: E1127 17:15:00.177407 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a8b3d570-1583-42c4-b770-41f6d9ed04ec" containerName="container-00" Nov 27 17:15:00 crc kubenswrapper[4707]: I1127 17:15:00.177423 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="a8b3d570-1583-42c4-b770-41f6d9ed04ec" containerName="container-00" Nov 27 17:15:00 crc kubenswrapper[4707]: I1127 17:15:00.177657 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="a8b3d570-1583-42c4-b770-41f6d9ed04ec" containerName="container-00" Nov 27 17:15:00 crc kubenswrapper[4707]: I1127 17:15:00.178337 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29404395-t5ts4" Nov 27 17:15:00 crc kubenswrapper[4707]: I1127 17:15:00.180345 4707 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Nov 27 17:15:00 crc kubenswrapper[4707]: I1127 17:15:00.180461 4707 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Nov 27 17:15:00 crc kubenswrapper[4707]: I1127 17:15:00.189342 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29404395-t5ts4"] Nov 27 17:15:00 crc kubenswrapper[4707]: I1127 17:15:00.280421 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c9dgz\" (UniqueName: \"kubernetes.io/projected/03bd1e9c-7e5f-4534-98c4-6cb0f058e15a-kube-api-access-c9dgz\") pod \"collect-profiles-29404395-t5ts4\" (UID: \"03bd1e9c-7e5f-4534-98c4-6cb0f058e15a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29404395-t5ts4" Nov 27 17:15:00 crc kubenswrapper[4707]: I1127 17:15:00.280481 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/03bd1e9c-7e5f-4534-98c4-6cb0f058e15a-config-volume\") pod \"collect-profiles-29404395-t5ts4\" (UID: \"03bd1e9c-7e5f-4534-98c4-6cb0f058e15a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29404395-t5ts4" Nov 27 17:15:00 crc kubenswrapper[4707]: I1127 17:15:00.280613 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/03bd1e9c-7e5f-4534-98c4-6cb0f058e15a-secret-volume\") pod \"collect-profiles-29404395-t5ts4\" (UID: \"03bd1e9c-7e5f-4534-98c4-6cb0f058e15a\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29404395-t5ts4" Nov 27 17:15:00 crc kubenswrapper[4707]: I1127 17:15:00.383237 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c9dgz\" (UniqueName: \"kubernetes.io/projected/03bd1e9c-7e5f-4534-98c4-6cb0f058e15a-kube-api-access-c9dgz\") pod \"collect-profiles-29404395-t5ts4\" (UID: \"03bd1e9c-7e5f-4534-98c4-6cb0f058e15a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29404395-t5ts4" Nov 27 17:15:00 crc kubenswrapper[4707]: I1127 17:15:00.383325 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/03bd1e9c-7e5f-4534-98c4-6cb0f058e15a-config-volume\") pod \"collect-profiles-29404395-t5ts4\" (UID: \"03bd1e9c-7e5f-4534-98c4-6cb0f058e15a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29404395-t5ts4" Nov 27 17:15:00 crc kubenswrapper[4707]: I1127 17:15:00.383553 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/03bd1e9c-7e5f-4534-98c4-6cb0f058e15a-secret-volume\") pod \"collect-profiles-29404395-t5ts4\" (UID: \"03bd1e9c-7e5f-4534-98c4-6cb0f058e15a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29404395-t5ts4" Nov 27 17:15:00 crc kubenswrapper[4707]: I1127 17:15:00.384517 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/03bd1e9c-7e5f-4534-98c4-6cb0f058e15a-config-volume\") pod \"collect-profiles-29404395-t5ts4\" (UID: \"03bd1e9c-7e5f-4534-98c4-6cb0f058e15a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29404395-t5ts4" Nov 27 17:15:00 crc kubenswrapper[4707]: I1127 17:15:00.391891 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/03bd1e9c-7e5f-4534-98c4-6cb0f058e15a-secret-volume\") pod \"collect-profiles-29404395-t5ts4\" (UID: \"03bd1e9c-7e5f-4534-98c4-6cb0f058e15a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29404395-t5ts4" Nov 27 17:15:00 crc kubenswrapper[4707]: I1127 17:15:00.416163 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c9dgz\" (UniqueName: \"kubernetes.io/projected/03bd1e9c-7e5f-4534-98c4-6cb0f058e15a-kube-api-access-c9dgz\") pod \"collect-profiles-29404395-t5ts4\" (UID: \"03bd1e9c-7e5f-4534-98c4-6cb0f058e15a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29404395-t5ts4" Nov 27 17:15:00 crc kubenswrapper[4707]: I1127 17:15:00.512641 4707 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29404395-t5ts4" Nov 27 17:15:01 crc kubenswrapper[4707]: I1127 17:15:01.030164 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29404395-t5ts4"] Nov 27 17:15:01 crc kubenswrapper[4707]: I1127 17:15:01.839886 4707 generic.go:334] "Generic (PLEG): container finished" podID="03bd1e9c-7e5f-4534-98c4-6cb0f058e15a" containerID="2c1db5e9270aa7ec45ded7b9e5edf2a63a19e394cf343396e25cb76ef00dc8fa" exitCode=0 Nov 27 17:15:01 crc kubenswrapper[4707]: I1127 17:15:01.839977 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29404395-t5ts4" event={"ID":"03bd1e9c-7e5f-4534-98c4-6cb0f058e15a","Type":"ContainerDied","Data":"2c1db5e9270aa7ec45ded7b9e5edf2a63a19e394cf343396e25cb76ef00dc8fa"} Nov 27 17:15:01 crc kubenswrapper[4707]: I1127 17:15:01.841119 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29404395-t5ts4" 
event={"ID":"03bd1e9c-7e5f-4534-98c4-6cb0f058e15a","Type":"ContainerStarted","Data":"57b1c72539517cd2628a4b8fe56b833cbd2f62ee2c3af30b911516f0cecdde8e"} Nov 27 17:15:03 crc kubenswrapper[4707]: I1127 17:15:03.265224 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29404395-t5ts4" Nov 27 17:15:03 crc kubenswrapper[4707]: I1127 17:15:03.438390 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/03bd1e9c-7e5f-4534-98c4-6cb0f058e15a-secret-volume\") pod \"03bd1e9c-7e5f-4534-98c4-6cb0f058e15a\" (UID: \"03bd1e9c-7e5f-4534-98c4-6cb0f058e15a\") " Nov 27 17:15:03 crc kubenswrapper[4707]: I1127 17:15:03.438604 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c9dgz\" (UniqueName: \"kubernetes.io/projected/03bd1e9c-7e5f-4534-98c4-6cb0f058e15a-kube-api-access-c9dgz\") pod \"03bd1e9c-7e5f-4534-98c4-6cb0f058e15a\" (UID: \"03bd1e9c-7e5f-4534-98c4-6cb0f058e15a\") " Nov 27 17:15:03 crc kubenswrapper[4707]: I1127 17:15:03.438746 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/03bd1e9c-7e5f-4534-98c4-6cb0f058e15a-config-volume\") pod \"03bd1e9c-7e5f-4534-98c4-6cb0f058e15a\" (UID: \"03bd1e9c-7e5f-4534-98c4-6cb0f058e15a\") " Nov 27 17:15:03 crc kubenswrapper[4707]: I1127 17:15:03.439361 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/03bd1e9c-7e5f-4534-98c4-6cb0f058e15a-config-volume" (OuterVolumeSpecName: "config-volume") pod "03bd1e9c-7e5f-4534-98c4-6cb0f058e15a" (UID: "03bd1e9c-7e5f-4534-98c4-6cb0f058e15a"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 17:15:03 crc kubenswrapper[4707]: I1127 17:15:03.443987 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/03bd1e9c-7e5f-4534-98c4-6cb0f058e15a-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "03bd1e9c-7e5f-4534-98c4-6cb0f058e15a" (UID: "03bd1e9c-7e5f-4534-98c4-6cb0f058e15a"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 17:15:03 crc kubenswrapper[4707]: I1127 17:15:03.444169 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/03bd1e9c-7e5f-4534-98c4-6cb0f058e15a-kube-api-access-c9dgz" (OuterVolumeSpecName: "kube-api-access-c9dgz") pod "03bd1e9c-7e5f-4534-98c4-6cb0f058e15a" (UID: "03bd1e9c-7e5f-4534-98c4-6cb0f058e15a"). InnerVolumeSpecName "kube-api-access-c9dgz". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 17:15:03 crc kubenswrapper[4707]: I1127 17:15:03.541625 4707 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/03bd1e9c-7e5f-4534-98c4-6cb0f058e15a-secret-volume\") on node \"crc\" DevicePath \"\"" Nov 27 17:15:03 crc kubenswrapper[4707]: I1127 17:15:03.541671 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c9dgz\" (UniqueName: \"kubernetes.io/projected/03bd1e9c-7e5f-4534-98c4-6cb0f058e15a-kube-api-access-c9dgz\") on node \"crc\" DevicePath \"\"" Nov 27 17:15:03 crc kubenswrapper[4707]: I1127 17:15:03.541688 4707 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/03bd1e9c-7e5f-4534-98c4-6cb0f058e15a-config-volume\") on node \"crc\" DevicePath \"\"" Nov 27 17:15:03 crc kubenswrapper[4707]: I1127 17:15:03.868824 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29404395-t5ts4" 
event={"ID":"03bd1e9c-7e5f-4534-98c4-6cb0f058e15a","Type":"ContainerDied","Data":"57b1c72539517cd2628a4b8fe56b833cbd2f62ee2c3af30b911516f0cecdde8e"} Nov 27 17:15:03 crc kubenswrapper[4707]: I1127 17:15:03.868862 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29404395-t5ts4" Nov 27 17:15:03 crc kubenswrapper[4707]: I1127 17:15:03.868871 4707 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="57b1c72539517cd2628a4b8fe56b833cbd2f62ee2c3af30b911516f0cecdde8e" Nov 27 17:15:04 crc kubenswrapper[4707]: I1127 17:15:04.333236 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29404350-t6ghp"] Nov 27 17:15:04 crc kubenswrapper[4707]: I1127 17:15:04.341495 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29404350-t6ghp"] Nov 27 17:15:05 crc kubenswrapper[4707]: I1127 17:15:05.205643 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="834bfdd5-e6b0-4f86-b805-867827ea250e" path="/var/lib/kubelet/pods/834bfdd5-e6b0-4f86-b805-867827ea250e/volumes" Nov 27 17:15:15 crc kubenswrapper[4707]: I1127 17:15:15.853349 4707 scope.go:117] "RemoveContainer" containerID="0120f09f12a9a10173c2fabe5c119a649d640afbb06cd8614a7ce6fd9e15315b" Nov 27 17:15:56 crc kubenswrapper[4707]: I1127 17:15:56.206838 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_alertmanager-metric-storage-0_b6ba41a2-cc99-4242-be96-b249bc657b2f/init-config-reloader/0.log" Nov 27 17:15:56 crc kubenswrapper[4707]: I1127 17:15:56.576078 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_alertmanager-metric-storage-0_b6ba41a2-cc99-4242-be96-b249bc657b2f/alertmanager/0.log" Nov 27 17:15:56 crc kubenswrapper[4707]: I1127 17:15:56.584322 4707 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_alertmanager-metric-storage-0_b6ba41a2-cc99-4242-be96-b249bc657b2f/init-config-reloader/0.log" Nov 27 17:15:56 crc kubenswrapper[4707]: I1127 17:15:56.613557 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_alertmanager-metric-storage-0_b6ba41a2-cc99-4242-be96-b249bc657b2f/config-reloader/0.log" Nov 27 17:15:56 crc kubenswrapper[4707]: I1127 17:15:56.756335 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_aodh-0_6282f04c-ddbf-46d2-a5ac-ba7550ff2559/aodh-api/0.log" Nov 27 17:15:56 crc kubenswrapper[4707]: I1127 17:15:56.820811 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_aodh-0_6282f04c-ddbf-46d2-a5ac-ba7550ff2559/aodh-listener/0.log" Nov 27 17:15:56 crc kubenswrapper[4707]: I1127 17:15:56.825084 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_aodh-0_6282f04c-ddbf-46d2-a5ac-ba7550ff2559/aodh-evaluator/0.log" Nov 27 17:15:56 crc kubenswrapper[4707]: I1127 17:15:56.932974 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_aodh-0_6282f04c-ddbf-46d2-a5ac-ba7550ff2559/aodh-notifier/0.log" Nov 27 17:15:57 crc kubenswrapper[4707]: I1127 17:15:57.039748 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-c695745c6-j5ntf_1050304d-e51e-4b02-9cec-828bb7d406bf/barbican-api-log/0.log" Nov 27 17:15:57 crc kubenswrapper[4707]: I1127 17:15:57.045320 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-c695745c6-j5ntf_1050304d-e51e-4b02-9cec-828bb7d406bf/barbican-api/0.log" Nov 27 17:15:57 crc kubenswrapper[4707]: I1127 17:15:57.227661 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-6c948b485d-2wcq8_d1df02f7-1e71-4fae-a6df-cb3c83460a7e/barbican-keystone-listener/0.log" Nov 27 17:15:57 crc kubenswrapper[4707]: I1127 17:15:57.281187 4707 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_barbican-keystone-listener-6c948b485d-2wcq8_d1df02f7-1e71-4fae-a6df-cb3c83460a7e/barbican-keystone-listener-log/0.log" Nov 27 17:15:57 crc kubenswrapper[4707]: I1127 17:15:57.476800 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-567875797-q64rz_e1955b85-6ed8-492c-8001-c4fc20da8270/barbican-worker-log/0.log" Nov 27 17:15:57 crc kubenswrapper[4707]: I1127 17:15:57.481480 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-567875797-q64rz_e1955b85-6ed8-492c-8001-c4fc20da8270/barbican-worker/0.log" Nov 27 17:15:57 crc kubenswrapper[4707]: I1127 17:15:57.593310 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_bootstrap-edpm-deployment-openstack-edpm-ipam-fmszh_a0d367b5-ffe7-4a0d-9216-ddeb28aa8c8f/bootstrap-edpm-deployment-openstack-edpm-ipam/0.log" Nov 27 17:15:57 crc kubenswrapper[4707]: I1127 17:15:57.768458 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_907095f4-0cd3-4e69-8f3f-fa908be6b6d0/ceilometer-central-agent/0.log" Nov 27 17:15:57 crc kubenswrapper[4707]: I1127 17:15:57.789639 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_907095f4-0cd3-4e69-8f3f-fa908be6b6d0/ceilometer-notification-agent/0.log" Nov 27 17:15:57 crc kubenswrapper[4707]: I1127 17:15:57.830052 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_907095f4-0cd3-4e69-8f3f-fa908be6b6d0/proxy-httpd/0.log" Nov 27 17:15:57 crc kubenswrapper[4707]: I1127 17:15:57.916691 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_907095f4-0cd3-4e69-8f3f-fa908be6b6d0/sg-core/0.log" Nov 27 17:15:58 crc kubenswrapper[4707]: I1127 17:15:58.028856 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_98b666a0-7de5-45af-b604-c6fa48371681/cinder-api/0.log" Nov 27 17:15:58 crc kubenswrapper[4707]: I1127 
17:15:58.063410 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_98b666a0-7de5-45af-b604-c6fa48371681/cinder-api-log/0.log" Nov 27 17:15:58 crc kubenswrapper[4707]: I1127 17:15:58.341690 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_7198c845-6481-4a99-b508-b3da40447ba6/probe/0.log" Nov 27 17:15:58 crc kubenswrapper[4707]: I1127 17:15:58.365063 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_7198c845-6481-4a99-b508-b3da40447ba6/cinder-scheduler/0.log" Nov 27 17:15:58 crc kubenswrapper[4707]: I1127 17:15:58.515564 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-network-edpm-deployment-openstack-edpm-ipam-4b6vs_86c589a6-19e5-48cc-8db8-42af5ae0f078/configure-network-edpm-deployment-openstack-edpm-ipam/0.log" Nov 27 17:15:58 crc kubenswrapper[4707]: I1127 17:15:58.613091 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-4kq59_26ad523c-9a7e-437b-a8e5-1b72a0a90d19/configure-os-edpm-deployment-openstack-edpm-ipam/0.log" Nov 27 17:15:58 crc kubenswrapper[4707]: I1127 17:15:58.756206 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-6f6df4f56c-pxkhr_3f07f078-fb9b-425d-9575-520a406e4178/init/0.log" Nov 27 17:15:58 crc kubenswrapper[4707]: I1127 17:15:58.923746 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-6f6df4f56c-pxkhr_3f07f078-fb9b-425d-9575-520a406e4178/init/0.log" Nov 27 17:15:58 crc kubenswrapper[4707]: I1127 17:15:58.964628 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_download-cache-edpm-deployment-openstack-edpm-ipam-mwkpl_540a037b-eddb-4f11-8ed0-209cebfc0ee1/download-cache-edpm-deployment-openstack-edpm-ipam/0.log" Nov 27 17:15:58 crc kubenswrapper[4707]: I1127 17:15:58.978636 4707 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_dnsmasq-dns-6f6df4f56c-pxkhr_3f07f078-fb9b-425d-9575-520a406e4178/dnsmasq-dns/0.log" Nov 27 17:15:59 crc kubenswrapper[4707]: I1127 17:15:59.526932 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_f4ece211-479a-4f06-bc88-b9e50c0671f4/glance-log/0.log" Nov 27 17:15:59 crc kubenswrapper[4707]: I1127 17:15:59.537393 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_f4ece211-479a-4f06-bc88-b9e50c0671f4/glance-httpd/0.log" Nov 27 17:15:59 crc kubenswrapper[4707]: I1127 17:15:59.716141 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_24849992-202f-4439-ac0d-241724235be4/glance-log/0.log" Nov 27 17:15:59 crc kubenswrapper[4707]: I1127 17:15:59.730670 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_24849992-202f-4439-ac0d-241724235be4/glance-httpd/0.log" Nov 27 17:16:00 crc kubenswrapper[4707]: I1127 17:16:00.148347 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_heat-api-57bc8fcfc9-trdbf_c001365b-7c18-4d58-b516-a038ef2d6c8c/heat-api/0.log" Nov 27 17:16:00 crc kubenswrapper[4707]: I1127 17:16:00.286187 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_heat-engine-7597fbc9fb-5l66n_b92f69c5-3b78-463b-bb1a-7728d2cdb6ff/heat-engine/0.log" Nov 27 17:16:00 crc kubenswrapper[4707]: I1127 17:16:00.356643 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_heat-cfnapi-5c4f76f9fb-ghh99_7646872e-82fb-4df8-b7ce-176b3ba7fe8a/heat-cfnapi/0.log" Nov 27 17:16:00 crc kubenswrapper[4707]: I1127 17:16:00.473207 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-certs-edpm-deployment-openstack-edpm-ipam-bf29c_7382c94e-e799-4343-8548-7efd92ed66e8/install-certs-edpm-deployment-openstack-edpm-ipam/0.log" Nov 27 17:16:00 crc kubenswrapper[4707]: 
I1127 17:16:00.588188 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-os-edpm-deployment-openstack-edpm-ipam-wgj2d_f3e34a79-7842-4f97-91f4-040a1b4e5b2b/install-os-edpm-deployment-openstack-edpm-ipam/0.log" Nov 27 17:16:00 crc kubenswrapper[4707]: I1127 17:16:00.780862 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-85c75486d4-pmkdv_af1b5c91-e184-49fe-9ad9-83f047d5123d/keystone-api/0.log" Nov 27 17:16:01 crc kubenswrapper[4707]: I1127 17:16:01.311358 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-cron-29404381-kd2qt_87152575-e530-4043-87fb-c7e50bfa9f00/keystone-cron/0.log" Nov 27 17:16:01 crc kubenswrapper[4707]: I1127 17:16:01.461655 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_kube-state-metrics-0_fb6d94a9-d059-4cba-a6f3-8590d2491bb2/kube-state-metrics/0.log" Nov 27 17:16:01 crc kubenswrapper[4707]: I1127 17:16:01.568781 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_libvirt-edpm-deployment-openstack-edpm-ipam-zjmfq_18fd2519-f36c-4817-85da-7615979c3340/libvirt-edpm-deployment-openstack-edpm-ipam/0.log" Nov 27 17:16:01 crc kubenswrapper[4707]: I1127 17:16:01.762689 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-69588f8b9-6plc2_5de06db5-fa17-40e5-a08d-9b7f139b08ed/neutron-api/0.log" Nov 27 17:16:01 crc kubenswrapper[4707]: I1127 17:16:01.856030 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-69588f8b9-6plc2_5de06db5-fa17-40e5-a08d-9b7f139b08ed/neutron-httpd/0.log" Nov 27 17:16:02 crc kubenswrapper[4707]: I1127 17:16:02.038283 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-metadata-edpm-deployment-openstack-edpm-ipam-jd52w_9f5df211-3c1b-45f3-9b61-a7fde58d8a39/neutron-metadata-edpm-deployment-openstack-edpm-ipam/0.log" Nov 27 17:16:02 crc kubenswrapper[4707]: I1127 17:16:02.311860 4707 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openstack_nova-api-0_0e31f3ae-2d72-4ba4-bf44-660b172c5066/nova-api-log/0.log" Nov 27 17:16:02 crc kubenswrapper[4707]: I1127 17:16:02.532631 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-0_2d07ee1e-4a93-4fe6-909d-2cc2a11993c7/nova-cell0-conductor-conductor/0.log" Nov 27 17:16:02 crc kubenswrapper[4707]: I1127 17:16:02.739075 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_0e31f3ae-2d72-4ba4-bf44-660b172c5066/nova-api-api/0.log" Nov 27 17:16:02 crc kubenswrapper[4707]: I1127 17:16:02.768313 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_af3bfed8-f098-4557-a577-0a10317ee805/nova-cell1-conductor-conductor/0.log" Nov 27 17:16:02 crc kubenswrapper[4707]: I1127 17:16:02.915304 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-novncproxy-0_bd58d9e8-77d0-412d-b866-c10f989dc824/nova-cell1-novncproxy-novncproxy/0.log" Nov 27 17:16:03 crc kubenswrapper[4707]: I1127 17:16:03.021718 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-edpm-deployment-openstack-edpm-ipam-lxzx6_c0d7830e-74a5-4ea0-b396-0095a96496be/nova-edpm-deployment-openstack-edpm-ipam/0.log" Nov 27 17:16:03 crc kubenswrapper[4707]: I1127 17:16:03.209582 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_3ee5c40c-527c-45b7-af82-93d55d4709c9/nova-metadata-log/0.log" Nov 27 17:16:03 crc kubenswrapper[4707]: I1127 17:16:03.627159 4707 patch_prober.go:28] interesting pod/machine-config-daemon-c995m container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 27 17:16:03 crc kubenswrapper[4707]: I1127 17:16:03.627220 4707 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-c995m" podUID="a83beb0d-8dd1-434a-ace2-933f98e3956f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 27 17:16:03 crc kubenswrapper[4707]: I1127 17:16:03.804202 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-scheduler-0_b2d8aab4-7e14-47dd-83ad-80e0272c12cc/nova-scheduler-scheduler/0.log" Nov 27 17:16:03 crc kubenswrapper[4707]: I1127 17:16:03.848540 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_e4b42d58-27cb-455f-9994-ae15f433e008/mysql-bootstrap/0.log" Nov 27 17:16:03 crc kubenswrapper[4707]: I1127 17:16:03.965899 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_e4b42d58-27cb-455f-9994-ae15f433e008/mysql-bootstrap/0.log" Nov 27 17:16:04 crc kubenswrapper[4707]: I1127 17:16:04.065578 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_e4b42d58-27cb-455f-9994-ae15f433e008/galera/0.log" Nov 27 17:16:04 crc kubenswrapper[4707]: I1127 17:16:04.242251 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_84e88e9e-3edb-45cd-9973-1447587f7adc/mysql-bootstrap/0.log" Nov 27 17:16:04 crc kubenswrapper[4707]: I1127 17:16:04.401078 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_84e88e9e-3edb-45cd-9973-1447587f7adc/galera/0.log" Nov 27 17:16:04 crc kubenswrapper[4707]: I1127 17:16:04.471329 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_84e88e9e-3edb-45cd-9973-1447587f7adc/mysql-bootstrap/0.log" Nov 27 17:16:04 crc kubenswrapper[4707]: I1127 17:16:04.631110 4707 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_openstackclient_247a20f8-a665-4383-944a-6fe111045aa1/openstackclient/0.log" Nov 27 17:16:04 crc kubenswrapper[4707]: I1127 17:16:04.782427 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-9br7v_d951ce68-e4f2-4ead-aaef-b264f721d7a3/openstack-network-exporter/0.log" Nov 27 17:16:04 crc kubenswrapper[4707]: I1127 17:16:04.877425 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_3ee5c40c-527c-45b7-af82-93d55d4709c9/nova-metadata-metadata/0.log" Nov 27 17:16:04 crc kubenswrapper[4707]: I1127 17:16:04.888211 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-7n9c2_720e549c-1f41-4fb6-b29f-465ac7e174e3/ovsdb-server-init/0.log" Nov 27 17:16:05 crc kubenswrapper[4707]: I1127 17:16:05.104977 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-7n9c2_720e549c-1f41-4fb6-b29f-465ac7e174e3/ovs-vswitchd/0.log" Nov 27 17:16:05 crc kubenswrapper[4707]: I1127 17:16:05.114596 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-7n9c2_720e549c-1f41-4fb6-b29f-465ac7e174e3/ovsdb-server-init/0.log" Nov 27 17:16:05 crc kubenswrapper[4707]: I1127 17:16:05.161436 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-7n9c2_720e549c-1f41-4fb6-b29f-465ac7e174e3/ovsdb-server/0.log" Nov 27 17:16:05 crc kubenswrapper[4707]: I1127 17:16:05.373580 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-vvkr6_9639769b-4439-4ffc-b88b-cba953013bff/ovn-controller/0.log" Nov 27 17:16:05 crc kubenswrapper[4707]: I1127 17:16:05.397519 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-edpm-deployment-openstack-edpm-ipam-wffqs_6001ceb1-ba83-4942-a49c-d7a6116f57f5/ovn-edpm-deployment-openstack-edpm-ipam/0.log" Nov 27 17:16:05 crc kubenswrapper[4707]: I1127 17:16:05.630848 
4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_bf62ccfb-9c5b-4c2c-a4d7-5cb9add147f2/openstack-network-exporter/0.log" Nov 27 17:16:05 crc kubenswrapper[4707]: I1127 17:16:05.674480 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_bf62ccfb-9c5b-4c2c-a4d7-5cb9add147f2/ovn-northd/0.log" Nov 27 17:16:05 crc kubenswrapper[4707]: I1127 17:16:05.704528 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_379e0975-7a52-4f96-b931-4c02377d6537/openstack-network-exporter/0.log" Nov 27 17:16:05 crc kubenswrapper[4707]: I1127 17:16:05.876647 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_379e0975-7a52-4f96-b931-4c02377d6537/ovsdbserver-nb/0.log" Nov 27 17:16:05 crc kubenswrapper[4707]: I1127 17:16:05.910329 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_d6f4624a-1407-4ff8-bd7f-90f2a0fd6718/ovsdbserver-sb/0.log" Nov 27 17:16:06 crc kubenswrapper[4707]: I1127 17:16:06.027817 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_d6f4624a-1407-4ff8-bd7f-90f2a0fd6718/openstack-network-exporter/0.log" Nov 27 17:16:06 crc kubenswrapper[4707]: I1127 17:16:06.252309 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-69b8fb6b88-w6pxv_9246eafb-e806-45a4-bc87-9a7724b7467c/placement-api/0.log" Nov 27 17:16:06 crc kubenswrapper[4707]: I1127 17:16:06.299508 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-69b8fb6b88-w6pxv_9246eafb-e806-45a4-bc87-9a7724b7467c/placement-log/0.log" Nov 27 17:16:06 crc kubenswrapper[4707]: I1127 17:16:06.420762 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_eab7447a-6fd9-49e4-8db9-34e357f0c419/init-config-reloader/0.log" Nov 27 17:16:06 crc kubenswrapper[4707]: I1127 17:16:06.627448 4707 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_eab7447a-6fd9-49e4-8db9-34e357f0c419/init-config-reloader/0.log" Nov 27 17:16:06 crc kubenswrapper[4707]: I1127 17:16:06.638834 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_eab7447a-6fd9-49e4-8db9-34e357f0c419/prometheus/0.log" Nov 27 17:16:06 crc kubenswrapper[4707]: I1127 17:16:06.695389 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_eab7447a-6fd9-49e4-8db9-34e357f0c419/thanos-sidecar/0.log" Nov 27 17:16:06 crc kubenswrapper[4707]: I1127 17:16:06.718626 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_eab7447a-6fd9-49e4-8db9-34e357f0c419/config-reloader/0.log" Nov 27 17:16:06 crc kubenswrapper[4707]: I1127 17:16:06.859221 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_4990dbfc-6c12-4964-9d50-b8fd331cc123/setup-container/0.log" Nov 27 17:16:07 crc kubenswrapper[4707]: I1127 17:16:07.091937 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_4990dbfc-6c12-4964-9d50-b8fd331cc123/rabbitmq/0.log" Nov 27 17:16:07 crc kubenswrapper[4707]: I1127 17:16:07.295075 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_4990dbfc-6c12-4964-9d50-b8fd331cc123/setup-container/0.log" Nov 27 17:16:07 crc kubenswrapper[4707]: I1127 17:16:07.454679 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_82ba4b51-2b4f-4ed6-8ef9-453386ff71da/setup-container/0.log" Nov 27 17:16:07 crc kubenswrapper[4707]: I1127 17:16:07.648759 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_82ba4b51-2b4f-4ed6-8ef9-453386ff71da/setup-container/0.log" Nov 27 17:16:07 crc kubenswrapper[4707]: I1127 17:16:07.738160 4707 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openstack_reboot-os-edpm-deployment-openstack-edpm-ipam-xdhhm_d89ddbb0-c0d3-46a8-a81e-ce2809f1352a/reboot-os-edpm-deployment-openstack-edpm-ipam/0.log" Nov 27 17:16:07 crc kubenswrapper[4707]: I1127 17:16:07.973276 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_redhat-edpm-deployment-openstack-edpm-ipam-p954g_6f4ca349-556f-4de4-b23d-b00a59768241/redhat-edpm-deployment-openstack-edpm-ipam/0.log" Nov 27 17:16:08 crc kubenswrapper[4707]: I1127 17:16:08.109975 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_repo-setup-edpm-deployment-openstack-edpm-ipam-np9sx_92f7fdf9-f4c8-443c-9ff6-b6a45719b9a7/repo-setup-edpm-deployment-openstack-edpm-ipam/0.log" Nov 27 17:16:08 crc kubenswrapper[4707]: I1127 17:16:08.302017 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_run-os-edpm-deployment-openstack-edpm-ipam-bf2x7_a03595a4-c76f-4642-b492-17f393096888/run-os-edpm-deployment-openstack-edpm-ipam/0.log" Nov 27 17:16:08 crc kubenswrapper[4707]: I1127 17:16:08.500487 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ssh-known-hosts-edpm-deployment-9kfcd_29ac3bd9-0e37-4e00-aa44-a09c01019b96/ssh-known-hosts-edpm-deployment/0.log" Nov 27 17:16:08 crc kubenswrapper[4707]: I1127 17:16:08.736386 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-5475fd4f89-8stjv_7362e7e3-1145-4e89-84db-343739624472/proxy-server/0.log" Nov 27 17:16:08 crc kubenswrapper[4707]: I1127 17:16:08.876464 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-5475fd4f89-8stjv_7362e7e3-1145-4e89-84db-343739624472/proxy-httpd/0.log" Nov 27 17:16:09 crc kubenswrapper[4707]: I1127 17:16:09.767641 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-ring-rebalance-vbngp_26d4145c-3144-4e1f-99ce-08d64f8b20be/swift-ring-rebalance/0.log" Nov 27 17:16:09 crc kubenswrapper[4707]: I1127 
17:16:09.788822 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_4d67e130-f1e2-4fe8-9647-8725402a1cdd/account-auditor/0.log" Nov 27 17:16:10 crc kubenswrapper[4707]: I1127 17:16:10.023789 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_4d67e130-f1e2-4fe8-9647-8725402a1cdd/account-reaper/0.log" Nov 27 17:16:10 crc kubenswrapper[4707]: I1127 17:16:10.053889 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_4d67e130-f1e2-4fe8-9647-8725402a1cdd/account-replicator/0.log" Nov 27 17:16:10 crc kubenswrapper[4707]: I1127 17:16:10.110983 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_4d67e130-f1e2-4fe8-9647-8725402a1cdd/account-server/0.log" Nov 27 17:16:10 crc kubenswrapper[4707]: I1127 17:16:10.279930 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_4d67e130-f1e2-4fe8-9647-8725402a1cdd/container-auditor/0.log" Nov 27 17:16:10 crc kubenswrapper[4707]: I1127 17:16:10.324935 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_4d67e130-f1e2-4fe8-9647-8725402a1cdd/container-server/0.log" Nov 27 17:16:10 crc kubenswrapper[4707]: I1127 17:16:10.388226 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_4d67e130-f1e2-4fe8-9647-8725402a1cdd/container-replicator/0.log" Nov 27 17:16:10 crc kubenswrapper[4707]: I1127 17:16:10.501121 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_4d67e130-f1e2-4fe8-9647-8725402a1cdd/container-updater/0.log" Nov 27 17:16:10 crc kubenswrapper[4707]: I1127 17:16:10.520156 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_82ba4b51-2b4f-4ed6-8ef9-453386ff71da/rabbitmq/0.log" Nov 27 17:16:10 crc kubenswrapper[4707]: I1127 17:16:10.598483 4707 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_swift-storage-0_4d67e130-f1e2-4fe8-9647-8725402a1cdd/object-auditor/0.log" Nov 27 17:16:10 crc kubenswrapper[4707]: I1127 17:16:10.610868 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_4d67e130-f1e2-4fe8-9647-8725402a1cdd/object-expirer/0.log" Nov 27 17:16:10 crc kubenswrapper[4707]: I1127 17:16:10.754110 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_4d67e130-f1e2-4fe8-9647-8725402a1cdd/object-replicator/0.log" Nov 27 17:16:10 crc kubenswrapper[4707]: I1127 17:16:10.782128 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_4d67e130-f1e2-4fe8-9647-8725402a1cdd/object-server/0.log" Nov 27 17:16:10 crc kubenswrapper[4707]: I1127 17:16:10.813706 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_4d67e130-f1e2-4fe8-9647-8725402a1cdd/rsync/0.log" Nov 27 17:16:10 crc kubenswrapper[4707]: I1127 17:16:10.835707 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_4d67e130-f1e2-4fe8-9647-8725402a1cdd/object-updater/0.log" Nov 27 17:16:11 crc kubenswrapper[4707]: I1127 17:16:11.458955 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_4d67e130-f1e2-4fe8-9647-8725402a1cdd/swift-recon-cron/0.log" Nov 27 17:16:11 crc kubenswrapper[4707]: I1127 17:16:11.535580 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_telemetry-edpm-deployment-openstack-edpm-ipam-pk25z_03d3491e-8e8f-49a2-8552-f939d87bbb59/telemetry-edpm-deployment-openstack-edpm-ipam/0.log" Nov 27 17:16:11 crc kubenswrapper[4707]: I1127 17:16:11.696380 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_validate-network-edpm-deployment-openstack-edpm-ipam-wnhjm_4e28fa93-baff-4fad-91cc-7ef262dcd775/validate-network-edpm-deployment-openstack-edpm-ipam/0.log" Nov 27 17:16:19 crc kubenswrapper[4707]: I1127 
17:16:19.766887 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_c5402fa0-f1b7-4561-95f0-cb690caf9b58/memcached/0.log" Nov 27 17:16:28 crc kubenswrapper[4707]: I1127 17:16:28.468885 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-nks2m"] Nov 27 17:16:28 crc kubenswrapper[4707]: E1127 17:16:28.469827 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="03bd1e9c-7e5f-4534-98c4-6cb0f058e15a" containerName="collect-profiles" Nov 27 17:16:28 crc kubenswrapper[4707]: I1127 17:16:28.469842 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="03bd1e9c-7e5f-4534-98c4-6cb0f058e15a" containerName="collect-profiles" Nov 27 17:16:28 crc kubenswrapper[4707]: I1127 17:16:28.470148 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="03bd1e9c-7e5f-4534-98c4-6cb0f058e15a" containerName="collect-profiles" Nov 27 17:16:28 crc kubenswrapper[4707]: I1127 17:16:28.471947 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-nks2m" Nov 27 17:16:28 crc kubenswrapper[4707]: I1127 17:16:28.488449 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-nks2m"] Nov 27 17:16:28 crc kubenswrapper[4707]: I1127 17:16:28.567145 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tw6hb\" (UniqueName: \"kubernetes.io/projected/86a2c4de-6496-46a9-b73d-26c769e34f3f-kube-api-access-tw6hb\") pod \"community-operators-nks2m\" (UID: \"86a2c4de-6496-46a9-b73d-26c769e34f3f\") " pod="openshift-marketplace/community-operators-nks2m" Nov 27 17:16:28 crc kubenswrapper[4707]: I1127 17:16:28.567280 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/86a2c4de-6496-46a9-b73d-26c769e34f3f-catalog-content\") pod \"community-operators-nks2m\" (UID: \"86a2c4de-6496-46a9-b73d-26c769e34f3f\") " pod="openshift-marketplace/community-operators-nks2m" Nov 27 17:16:28 crc kubenswrapper[4707]: I1127 17:16:28.567698 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/86a2c4de-6496-46a9-b73d-26c769e34f3f-utilities\") pod \"community-operators-nks2m\" (UID: \"86a2c4de-6496-46a9-b73d-26c769e34f3f\") " pod="openshift-marketplace/community-operators-nks2m" Nov 27 17:16:28 crc kubenswrapper[4707]: I1127 17:16:28.671216 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tw6hb\" (UniqueName: \"kubernetes.io/projected/86a2c4de-6496-46a9-b73d-26c769e34f3f-kube-api-access-tw6hb\") pod \"community-operators-nks2m\" (UID: \"86a2c4de-6496-46a9-b73d-26c769e34f3f\") " pod="openshift-marketplace/community-operators-nks2m" Nov 27 17:16:28 crc kubenswrapper[4707]: I1127 17:16:28.671280 4707 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/86a2c4de-6496-46a9-b73d-26c769e34f3f-catalog-content\") pod \"community-operators-nks2m\" (UID: \"86a2c4de-6496-46a9-b73d-26c769e34f3f\") " pod="openshift-marketplace/community-operators-nks2m" Nov 27 17:16:28 crc kubenswrapper[4707]: I1127 17:16:28.671424 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/86a2c4de-6496-46a9-b73d-26c769e34f3f-utilities\") pod \"community-operators-nks2m\" (UID: \"86a2c4de-6496-46a9-b73d-26c769e34f3f\") " pod="openshift-marketplace/community-operators-nks2m" Nov 27 17:16:28 crc kubenswrapper[4707]: I1127 17:16:28.671695 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/86a2c4de-6496-46a9-b73d-26c769e34f3f-catalog-content\") pod \"community-operators-nks2m\" (UID: \"86a2c4de-6496-46a9-b73d-26c769e34f3f\") " pod="openshift-marketplace/community-operators-nks2m" Nov 27 17:16:28 crc kubenswrapper[4707]: I1127 17:16:28.672021 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/86a2c4de-6496-46a9-b73d-26c769e34f3f-utilities\") pod \"community-operators-nks2m\" (UID: \"86a2c4de-6496-46a9-b73d-26c769e34f3f\") " pod="openshift-marketplace/community-operators-nks2m" Nov 27 17:16:28 crc kubenswrapper[4707]: I1127 17:16:28.713592 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tw6hb\" (UniqueName: \"kubernetes.io/projected/86a2c4de-6496-46a9-b73d-26c769e34f3f-kube-api-access-tw6hb\") pod \"community-operators-nks2m\" (UID: \"86a2c4de-6496-46a9-b73d-26c769e34f3f\") " pod="openshift-marketplace/community-operators-nks2m" Nov 27 17:16:28 crc kubenswrapper[4707]: I1127 17:16:28.789582 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-nks2m" Nov 27 17:16:29 crc kubenswrapper[4707]: I1127 17:16:29.277244 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-nks2m"] Nov 27 17:16:29 crc kubenswrapper[4707]: I1127 17:16:29.401282 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nks2m" event={"ID":"86a2c4de-6496-46a9-b73d-26c769e34f3f","Type":"ContainerStarted","Data":"70e20c4220d046fad3f5b474b880ca4efb51c19ee73a84ab824a1ed3927ed1b3"} Nov 27 17:16:30 crc kubenswrapper[4707]: I1127 17:16:30.412470 4707 generic.go:334] "Generic (PLEG): container finished" podID="86a2c4de-6496-46a9-b73d-26c769e34f3f" containerID="424066b0a5fc0cc51bd2ce99d39646ae17c79433979a6ed2b4d28ca02f11769d" exitCode=0 Nov 27 17:16:30 crc kubenswrapper[4707]: I1127 17:16:30.412698 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nks2m" event={"ID":"86a2c4de-6496-46a9-b73d-26c769e34f3f","Type":"ContainerDied","Data":"424066b0a5fc0cc51bd2ce99d39646ae17c79433979a6ed2b4d28ca02f11769d"} Nov 27 17:16:33 crc kubenswrapper[4707]: I1127 17:16:33.444132 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nks2m" event={"ID":"86a2c4de-6496-46a9-b73d-26c769e34f3f","Type":"ContainerStarted","Data":"8ea91d0da73d74574ea1a80dfca47ab8ef020fb1f5130e98d8894366fd9a4773"} Nov 27 17:16:33 crc kubenswrapper[4707]: I1127 17:16:33.623219 4707 patch_prober.go:28] interesting pod/machine-config-daemon-c995m container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 27 17:16:33 crc kubenswrapper[4707]: I1127 17:16:33.623304 4707 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-c995m" podUID="a83beb0d-8dd1-434a-ace2-933f98e3956f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 27 17:16:37 crc kubenswrapper[4707]: I1127 17:16:37.477886 4707 generic.go:334] "Generic (PLEG): container finished" podID="86a2c4de-6496-46a9-b73d-26c769e34f3f" containerID="8ea91d0da73d74574ea1a80dfca47ab8ef020fb1f5130e98d8894366fd9a4773" exitCode=0 Nov 27 17:16:37 crc kubenswrapper[4707]: I1127 17:16:37.477956 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nks2m" event={"ID":"86a2c4de-6496-46a9-b73d-26c769e34f3f","Type":"ContainerDied","Data":"8ea91d0da73d74574ea1a80dfca47ab8ef020fb1f5130e98d8894366fd9a4773"} Nov 27 17:16:38 crc kubenswrapper[4707]: I1127 17:16:38.489462 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nks2m" event={"ID":"86a2c4de-6496-46a9-b73d-26c769e34f3f","Type":"ContainerStarted","Data":"857fb2f4f3cf5b54aa17dd3596ba1c12ec24a8098512286231e7d6143aba5219"} Nov 27 17:16:38 crc kubenswrapper[4707]: I1127 17:16:38.516046 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-nks2m" podStartSLOduration=2.8622413829999998 podStartE2EDuration="10.516025626s" podCreationTimestamp="2025-11-27 17:16:28 +0000 UTC" firstStartedPulling="2025-11-27 17:16:30.414274535 +0000 UTC m=+4366.045723313" lastFinishedPulling="2025-11-27 17:16:38.068058778 +0000 UTC m=+4373.699507556" observedRunningTime="2025-11-27 17:16:38.509050915 +0000 UTC m=+4374.140499703" watchObservedRunningTime="2025-11-27 17:16:38.516025626 +0000 UTC m=+4374.147474394" Nov 27 17:16:38 crc kubenswrapper[4707]: I1127 17:16:38.790802 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-marketplace/community-operators-nks2m" Nov 27 17:16:38 crc kubenswrapper[4707]: I1127 17:16:38.791126 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-nks2m" Nov 27 17:16:39 crc kubenswrapper[4707]: I1127 17:16:39.839675 4707 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-nks2m" podUID="86a2c4de-6496-46a9-b73d-26c769e34f3f" containerName="registry-server" probeResult="failure" output=< Nov 27 17:16:39 crc kubenswrapper[4707]: timeout: failed to connect service ":50051" within 1s Nov 27 17:16:39 crc kubenswrapper[4707]: > Nov 27 17:16:41 crc kubenswrapper[4707]: I1127 17:16:41.519375 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_9b525f43158688e23ee73385ddac5626340eda066df53bb1a8cbdb78832m5nh_82db51a9-76e7-4066-9dbc-83b27ff84cc8/util/0.log" Nov 27 17:16:41 crc kubenswrapper[4707]: I1127 17:16:41.685194 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_9b525f43158688e23ee73385ddac5626340eda066df53bb1a8cbdb78832m5nh_82db51a9-76e7-4066-9dbc-83b27ff84cc8/pull/0.log" Nov 27 17:16:41 crc kubenswrapper[4707]: I1127 17:16:41.733733 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_9b525f43158688e23ee73385ddac5626340eda066df53bb1a8cbdb78832m5nh_82db51a9-76e7-4066-9dbc-83b27ff84cc8/pull/0.log" Nov 27 17:16:41 crc kubenswrapper[4707]: I1127 17:16:41.737153 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_9b525f43158688e23ee73385ddac5626340eda066df53bb1a8cbdb78832m5nh_82db51a9-76e7-4066-9dbc-83b27ff84cc8/util/0.log" Nov 27 17:16:41 crc kubenswrapper[4707]: I1127 17:16:41.876072 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_9b525f43158688e23ee73385ddac5626340eda066df53bb1a8cbdb78832m5nh_82db51a9-76e7-4066-9dbc-83b27ff84cc8/pull/0.log" Nov 27 17:16:41 crc 
kubenswrapper[4707]: I1127 17:16:41.889840 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_9b525f43158688e23ee73385ddac5626340eda066df53bb1a8cbdb78832m5nh_82db51a9-76e7-4066-9dbc-83b27ff84cc8/util/0.log" Nov 27 17:16:41 crc kubenswrapper[4707]: I1127 17:16:41.925699 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_9b525f43158688e23ee73385ddac5626340eda066df53bb1a8cbdb78832m5nh_82db51a9-76e7-4066-9dbc-83b27ff84cc8/extract/0.log" Nov 27 17:16:42 crc kubenswrapper[4707]: I1127 17:16:42.080610 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-7b64f4fb85-nhkfx_0267ac3a-4bee-42b9-a506-e2b1e1e3726e/kube-rbac-proxy/0.log" Nov 27 17:16:42 crc kubenswrapper[4707]: I1127 17:16:42.203043 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-7b64f4fb85-nhkfx_0267ac3a-4bee-42b9-a506-e2b1e1e3726e/manager/0.log" Nov 27 17:16:42 crc kubenswrapper[4707]: I1127 17:16:42.257354 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-6b7f75547b-sgwpt_6d78cbb1-56e6-428d-bba4-5d1edbbda363/kube-rbac-proxy/0.log" Nov 27 17:16:42 crc kubenswrapper[4707]: I1127 17:16:42.320102 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-6b7f75547b-sgwpt_6d78cbb1-56e6-428d-bba4-5d1edbbda363/manager/0.log" Nov 27 17:16:42 crc kubenswrapper[4707]: I1127 17:16:42.407908 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-955677c94-f5g2d_5e9f9859-1f28-4183-b71c-e9459e2746b7/kube-rbac-proxy/0.log" Nov 27 17:16:42 crc kubenswrapper[4707]: I1127 17:16:42.468452 4707 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_designate-operator-controller-manager-955677c94-f5g2d_5e9f9859-1f28-4183-b71c-e9459e2746b7/manager/0.log" Nov 27 17:16:42 crc kubenswrapper[4707]: I1127 17:16:42.595301 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-589cbd6b5b-cd4lh_99785491-bcbd-4946-b1a6-a3e08a4394b5/kube-rbac-proxy/0.log" Nov 27 17:16:42 crc kubenswrapper[4707]: I1127 17:16:42.721488 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-589cbd6b5b-cd4lh_99785491-bcbd-4946-b1a6-a3e08a4394b5/manager/0.log" Nov 27 17:16:42 crc kubenswrapper[4707]: I1127 17:16:42.785398 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-5b77f656f-8qhwz_a2c3c27e-3e0c-4d4e-a5f7-3deaaae0f660/kube-rbac-proxy/0.log" Nov 27 17:16:42 crc kubenswrapper[4707]: I1127 17:16:42.872411 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-5b77f656f-8qhwz_a2c3c27e-3e0c-4d4e-a5f7-3deaaae0f660/manager/0.log" Nov 27 17:16:43 crc kubenswrapper[4707]: I1127 17:16:43.162879 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-5d494799bf-hg8j4_ac946592-ee39-443e-b64a-980caaace080/kube-rbac-proxy/0.log" Nov 27 17:16:43 crc kubenswrapper[4707]: I1127 17:16:43.204604 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-5d494799bf-hg8j4_ac946592-ee39-443e-b64a-980caaace080/manager/0.log" Nov 27 17:16:43 crc kubenswrapper[4707]: I1127 17:16:43.368000 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-57548d458d-kwlbb_88f24787-fedc-4d08-9a8e-16a24f242d02/kube-rbac-proxy/0.log" Nov 27 17:16:43 crc kubenswrapper[4707]: I1127 17:16:43.547073 4707 
log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-67cb4dc6d4-5bj45_8fea437e-0a8c-4836-b23c-56db9c7ea0fc/kube-rbac-proxy/0.log" Nov 27 17:16:43 crc kubenswrapper[4707]: I1127 17:16:43.552404 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-57548d458d-kwlbb_88f24787-fedc-4d08-9a8e-16a24f242d02/manager/0.log" Nov 27 17:16:43 crc kubenswrapper[4707]: I1127 17:16:43.610173 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-67cb4dc6d4-5bj45_8fea437e-0a8c-4836-b23c-56db9c7ea0fc/manager/0.log" Nov 27 17:16:43 crc kubenswrapper[4707]: I1127 17:16:43.797160 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-7b4567c7cf-4cd96_dc2542ce-f2fd-454b-b47f-92d3bbc93d91/manager/0.log" Nov 27 17:16:43 crc kubenswrapper[4707]: I1127 17:16:43.798622 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-7b4567c7cf-4cd96_dc2542ce-f2fd-454b-b47f-92d3bbc93d91/kube-rbac-proxy/0.log" Nov 27 17:16:43 crc kubenswrapper[4707]: I1127 17:16:43.882981 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-5d499bf58b-x5k9j_126e10c6-2740-47fc-8331-a8e4bb6549b8/kube-rbac-proxy/0.log" Nov 27 17:16:43 crc kubenswrapper[4707]: I1127 17:16:43.972743 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-5d499bf58b-x5k9j_126e10c6-2740-47fc-8331-a8e4bb6549b8/manager/0.log" Nov 27 17:16:44 crc kubenswrapper[4707]: I1127 17:16:44.016657 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-66f4dd4bc7-dfw47_50771ff9-4409-4f41-ad3c-98f730dbff77/kube-rbac-proxy/0.log" Nov 27 17:16:44 crc 
kubenswrapper[4707]: I1127 17:16:44.174239 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-66f4dd4bc7-dfw47_50771ff9-4409-4f41-ad3c-98f730dbff77/manager/0.log" Nov 27 17:16:44 crc kubenswrapper[4707]: I1127 17:16:44.220767 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-6fdcddb789-4bqw7_9377949b-5979-44ff-bd3f-ea1389b4ef6f/kube-rbac-proxy/0.log" Nov 27 17:16:44 crc kubenswrapper[4707]: I1127 17:16:44.273846 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-6fdcddb789-4bqw7_9377949b-5979-44ff-bd3f-ea1389b4ef6f/manager/0.log" Nov 27 17:16:44 crc kubenswrapper[4707]: I1127 17:16:44.400077 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-79556f57fc-q6vcr_20e41446-8d89-481e-bd9f-48dc14efb82e/kube-rbac-proxy/0.log" Nov 27 17:16:44 crc kubenswrapper[4707]: I1127 17:16:44.523494 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-79556f57fc-q6vcr_20e41446-8d89-481e-bd9f-48dc14efb82e/manager/0.log" Nov 27 17:16:44 crc kubenswrapper[4707]: I1127 17:16:44.642668 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-64cdc6ff96-blnpn_7c88f676-b4d3-46b2-aedd-eff62f8f1bfb/manager/0.log" Nov 27 17:16:44 crc kubenswrapper[4707]: I1127 17:16:44.659111 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-64cdc6ff96-blnpn_7c88f676-b4d3-46b2-aedd-eff62f8f1bfb/kube-rbac-proxy/0.log" Nov 27 17:16:44 crc kubenswrapper[4707]: I1127 17:16:44.692039 4707 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-5fcdb54b6brxbww_7761d2b0-8cc7-4dc8-a956-df20e2efc081/kube-rbac-proxy/0.log" Nov 27 17:16:45 crc kubenswrapper[4707]: I1127 17:16:45.360273 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-5fcdb54b6brxbww_7761d2b0-8cc7-4dc8-a956-df20e2efc081/manager/0.log" Nov 27 17:16:45 crc kubenswrapper[4707]: I1127 17:16:45.550485 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-operator-7d7f8454cc-87c8c_b517a267-0265-4e24-b102-b19b8d9eee18/operator/0.log" Nov 27 17:16:45 crc kubenswrapper[4707]: I1127 17:16:45.638952 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-qdn2b_4b76b0a5-e84e-427d-9cb4-4fac9969a278/registry-server/0.log" Nov 27 17:16:45 crc kubenswrapper[4707]: I1127 17:16:45.870201 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-56897c768d-nbgp6_6aa8bbf9-4d33-4b2e-b53a-d6a341b3d841/kube-rbac-proxy/0.log" Nov 27 17:16:45 crc kubenswrapper[4707]: I1127 17:16:45.913559 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-57988cc5b5-v48r5_3c3bf501-b545-45a2-b186-2df94990295d/kube-rbac-proxy/0.log" Nov 27 17:16:45 crc kubenswrapper[4707]: I1127 17:16:45.968481 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-56897c768d-nbgp6_6aa8bbf9-4d33-4b2e-b53a-d6a341b3d841/manager/0.log" Nov 27 17:16:46 crc kubenswrapper[4707]: I1127 17:16:46.106783 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-57988cc5b5-v48r5_3c3bf501-b545-45a2-b186-2df94990295d/manager/0.log" Nov 27 17:16:46 crc kubenswrapper[4707]: I1127 
17:16:46.118112 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-668c99d594-v96x7_6647f986-9d62-4939-907b-fde960b30a37/operator/0.log" Nov 27 17:16:46 crc kubenswrapper[4707]: I1127 17:16:46.355526 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-d77b94747-qdh2s_efa00950-13dd-4e9e-a215-6ebb89006545/kube-rbac-proxy/0.log" Nov 27 17:16:46 crc kubenswrapper[4707]: I1127 17:16:46.365777 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-d77b94747-qdh2s_efa00950-13dd-4e9e-a215-6ebb89006545/manager/0.log" Nov 27 17:16:46 crc kubenswrapper[4707]: I1127 17:16:46.450809 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-8665cb7d49-w7xqm_94ba089c-d890-4d57-abc0-258a2b54a6f9/kube-rbac-proxy/0.log" Nov 27 17:16:46 crc kubenswrapper[4707]: I1127 17:16:46.898994 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-54fcbb7454-dqgm9_894c1749-fccf-4178-b7a8-6c63e18266f6/manager/0.log" Nov 27 17:16:46 crc kubenswrapper[4707]: I1127 17:16:46.997736 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-5cd6c7f4c8-qwb4t_97bb6c80-6996-4e91-bcdf-0f1c20e72fa3/kube-rbac-proxy/0.log" Nov 27 17:16:46 crc kubenswrapper[4707]: I1127 17:16:46.997850 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-5cd6c7f4c8-qwb4t_97bb6c80-6996-4e91-bcdf-0f1c20e72fa3/manager/0.log" Nov 27 17:16:47 crc kubenswrapper[4707]: I1127 17:16:47.097021 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-8665cb7d49-w7xqm_94ba089c-d890-4d57-abc0-258a2b54a6f9/manager/0.log" Nov 27 
17:16:47 crc kubenswrapper[4707]: I1127 17:16:47.192084 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-656dcb59d4-hgbtk_00e99b4b-2bbd-445a-b075-74c47fe30f79/manager/0.log" Nov 27 17:16:47 crc kubenswrapper[4707]: I1127 17:16:47.195571 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-656dcb59d4-hgbtk_00e99b4b-2bbd-445a-b075-74c47fe30f79/kube-rbac-proxy/0.log" Nov 27 17:16:48 crc kubenswrapper[4707]: I1127 17:16:48.839947 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-nks2m" Nov 27 17:16:48 crc kubenswrapper[4707]: I1127 17:16:48.890576 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-nks2m" Nov 27 17:16:49 crc kubenswrapper[4707]: I1127 17:16:49.077004 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-nks2m"] Nov 27 17:16:50 crc kubenswrapper[4707]: I1127 17:16:50.588986 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-nks2m" podUID="86a2c4de-6496-46a9-b73d-26c769e34f3f" containerName="registry-server" containerID="cri-o://857fb2f4f3cf5b54aa17dd3596ba1c12ec24a8098512286231e7d6143aba5219" gracePeriod=2 Nov 27 17:16:53 crc kubenswrapper[4707]: I1127 17:16:53.616283 4707 generic.go:334] "Generic (PLEG): container finished" podID="86a2c4de-6496-46a9-b73d-26c769e34f3f" containerID="857fb2f4f3cf5b54aa17dd3596ba1c12ec24a8098512286231e7d6143aba5219" exitCode=0 Nov 27 17:16:53 crc kubenswrapper[4707]: I1127 17:16:53.616361 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nks2m" 
event={"ID":"86a2c4de-6496-46a9-b73d-26c769e34f3f","Type":"ContainerDied","Data":"857fb2f4f3cf5b54aa17dd3596ba1c12ec24a8098512286231e7d6143aba5219"} Nov 27 17:16:54 crc kubenswrapper[4707]: I1127 17:16:54.763859 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-nks2m" Nov 27 17:16:54 crc kubenswrapper[4707]: I1127 17:16:54.776142 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/86a2c4de-6496-46a9-b73d-26c769e34f3f-catalog-content\") pod \"86a2c4de-6496-46a9-b73d-26c769e34f3f\" (UID: \"86a2c4de-6496-46a9-b73d-26c769e34f3f\") " Nov 27 17:16:54 crc kubenswrapper[4707]: I1127 17:16:54.776358 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/86a2c4de-6496-46a9-b73d-26c769e34f3f-utilities\") pod \"86a2c4de-6496-46a9-b73d-26c769e34f3f\" (UID: \"86a2c4de-6496-46a9-b73d-26c769e34f3f\") " Nov 27 17:16:54 crc kubenswrapper[4707]: I1127 17:16:54.776405 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tw6hb\" (UniqueName: \"kubernetes.io/projected/86a2c4de-6496-46a9-b73d-26c769e34f3f-kube-api-access-tw6hb\") pod \"86a2c4de-6496-46a9-b73d-26c769e34f3f\" (UID: \"86a2c4de-6496-46a9-b73d-26c769e34f3f\") " Nov 27 17:16:54 crc kubenswrapper[4707]: I1127 17:16:54.776863 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/86a2c4de-6496-46a9-b73d-26c769e34f3f-utilities" (OuterVolumeSpecName: "utilities") pod "86a2c4de-6496-46a9-b73d-26c769e34f3f" (UID: "86a2c4de-6496-46a9-b73d-26c769e34f3f"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 27 17:16:54 crc kubenswrapper[4707]: I1127 17:16:54.786149 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/86a2c4de-6496-46a9-b73d-26c769e34f3f-kube-api-access-tw6hb" (OuterVolumeSpecName: "kube-api-access-tw6hb") pod "86a2c4de-6496-46a9-b73d-26c769e34f3f" (UID: "86a2c4de-6496-46a9-b73d-26c769e34f3f"). InnerVolumeSpecName "kube-api-access-tw6hb". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 17:16:54 crc kubenswrapper[4707]: I1127 17:16:54.844202 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/86a2c4de-6496-46a9-b73d-26c769e34f3f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "86a2c4de-6496-46a9-b73d-26c769e34f3f" (UID: "86a2c4de-6496-46a9-b73d-26c769e34f3f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 27 17:16:54 crc kubenswrapper[4707]: I1127 17:16:54.878533 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tw6hb\" (UniqueName: \"kubernetes.io/projected/86a2c4de-6496-46a9-b73d-26c769e34f3f-kube-api-access-tw6hb\") on node \"crc\" DevicePath \"\"" Nov 27 17:16:54 crc kubenswrapper[4707]: I1127 17:16:54.878565 4707 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/86a2c4de-6496-46a9-b73d-26c769e34f3f-utilities\") on node \"crc\" DevicePath \"\"" Nov 27 17:16:54 crc kubenswrapper[4707]: I1127 17:16:54.878575 4707 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/86a2c4de-6496-46a9-b73d-26c769e34f3f-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 27 17:16:55 crc kubenswrapper[4707]: I1127 17:16:55.636230 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nks2m" 
event={"ID":"86a2c4de-6496-46a9-b73d-26c769e34f3f","Type":"ContainerDied","Data":"70e20c4220d046fad3f5b474b880ca4efb51c19ee73a84ab824a1ed3927ed1b3"} Nov 27 17:16:55 crc kubenswrapper[4707]: I1127 17:16:55.636571 4707 scope.go:117] "RemoveContainer" containerID="857fb2f4f3cf5b54aa17dd3596ba1c12ec24a8098512286231e7d6143aba5219" Nov 27 17:16:55 crc kubenswrapper[4707]: I1127 17:16:55.636395 4707 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-nks2m" Nov 27 17:16:55 crc kubenswrapper[4707]: I1127 17:16:55.662788 4707 scope.go:117] "RemoveContainer" containerID="8ea91d0da73d74574ea1a80dfca47ab8ef020fb1f5130e98d8894366fd9a4773" Nov 27 17:16:55 crc kubenswrapper[4707]: I1127 17:16:55.666825 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-nks2m"] Nov 27 17:16:55 crc kubenswrapper[4707]: I1127 17:16:55.678342 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-nks2m"] Nov 27 17:16:55 crc kubenswrapper[4707]: I1127 17:16:55.685011 4707 scope.go:117] "RemoveContainer" containerID="424066b0a5fc0cc51bd2ce99d39646ae17c79433979a6ed2b4d28ca02f11769d" Nov 27 17:16:57 crc kubenswrapper[4707]: I1127 17:16:57.214733 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="86a2c4de-6496-46a9-b73d-26c769e34f3f" path="/var/lib/kubelet/pods/86a2c4de-6496-46a9-b73d-26c769e34f3f/volumes" Nov 27 17:17:03 crc kubenswrapper[4707]: I1127 17:17:03.623235 4707 patch_prober.go:28] interesting pod/machine-config-daemon-c995m container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 27 17:17:03 crc kubenswrapper[4707]: I1127 17:17:03.623688 4707 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-c995m" podUID="a83beb0d-8dd1-434a-ace2-933f98e3956f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 27 17:17:03 crc kubenswrapper[4707]: I1127 17:17:03.623733 4707 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-c995m" Nov 27 17:17:03 crc kubenswrapper[4707]: I1127 17:17:03.624459 4707 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"2e1b0c0f7b1da7c68aecd1e64e838a92d9529dbeb83303ba51235175d9145fd2"} pod="openshift-machine-config-operator/machine-config-daemon-c995m" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Nov 27 17:17:03 crc kubenswrapper[4707]: I1127 17:17:03.624506 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-c995m" podUID="a83beb0d-8dd1-434a-ace2-933f98e3956f" containerName="machine-config-daemon" containerID="cri-o://2e1b0c0f7b1da7c68aecd1e64e838a92d9529dbeb83303ba51235175d9145fd2" gracePeriod=600 Nov 27 17:17:03 crc kubenswrapper[4707]: E1127 17:17:03.774598 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-c995m_openshift-machine-config-operator(a83beb0d-8dd1-434a-ace2-933f98e3956f)\"" pod="openshift-machine-config-operator/machine-config-daemon-c995m" podUID="a83beb0d-8dd1-434a-ace2-933f98e3956f" Nov 27 17:17:04 crc kubenswrapper[4707]: I1127 17:17:04.745958 4707 generic.go:334] "Generic (PLEG): container finished" podID="a83beb0d-8dd1-434a-ace2-933f98e3956f" 
containerID="2e1b0c0f7b1da7c68aecd1e64e838a92d9529dbeb83303ba51235175d9145fd2" exitCode=0 Nov 27 17:17:04 crc kubenswrapper[4707]: I1127 17:17:04.746000 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-c995m" event={"ID":"a83beb0d-8dd1-434a-ace2-933f98e3956f","Type":"ContainerDied","Data":"2e1b0c0f7b1da7c68aecd1e64e838a92d9529dbeb83303ba51235175d9145fd2"} Nov 27 17:17:04 crc kubenswrapper[4707]: I1127 17:17:04.746341 4707 scope.go:117] "RemoveContainer" containerID="7df475a96b4c84af5dc879adfdcbbe0e67a53f0090543ba36dc812a5f72b0be9" Nov 27 17:17:04 crc kubenswrapper[4707]: I1127 17:17:04.747005 4707 scope.go:117] "RemoveContainer" containerID="2e1b0c0f7b1da7c68aecd1e64e838a92d9529dbeb83303ba51235175d9145fd2" Nov 27 17:17:04 crc kubenswrapper[4707]: E1127 17:17:04.747260 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-c995m_openshift-machine-config-operator(a83beb0d-8dd1-434a-ace2-933f98e3956f)\"" pod="openshift-machine-config-operator/machine-config-daemon-c995m" podUID="a83beb0d-8dd1-434a-ace2-933f98e3956f" Nov 27 17:17:07 crc kubenswrapper[4707]: I1127 17:17:07.653687 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-h746w_e8027394-2524-45df-8cdc-967024215d25/control-plane-machine-set-operator/0.log" Nov 27 17:17:07 crc kubenswrapper[4707]: I1127 17:17:07.866859 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-thf5c_78b8999d-9535-4584-baa0-5fd38838ac29/kube-rbac-proxy/0.log" Nov 27 17:17:07 crc kubenswrapper[4707]: I1127 17:17:07.880843 4707 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-thf5c_78b8999d-9535-4584-baa0-5fd38838ac29/machine-api-operator/0.log" Nov 27 17:17:20 crc kubenswrapper[4707]: I1127 17:17:20.196061 4707 scope.go:117] "RemoveContainer" containerID="2e1b0c0f7b1da7c68aecd1e64e838a92d9529dbeb83303ba51235175d9145fd2" Nov 27 17:17:20 crc kubenswrapper[4707]: E1127 17:17:20.196972 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-c995m_openshift-machine-config-operator(a83beb0d-8dd1-434a-ace2-933f98e3956f)\"" pod="openshift-machine-config-operator/machine-config-daemon-c995m" podUID="a83beb0d-8dd1-434a-ace2-933f98e3956f" Nov 27 17:17:20 crc kubenswrapper[4707]: I1127 17:17:20.341959 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-5b446d88c5-nc56g_4a4bd04a-ce38-46fe-a197-17214e851643/cert-manager-controller/0.log" Nov 27 17:17:20 crc kubenswrapper[4707]: I1127 17:17:20.485714 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-7f985d654d-7j6cd_5f4b6258-1be2-489a-9e59-86df4534d663/cert-manager-cainjector/0.log" Nov 27 17:17:20 crc kubenswrapper[4707]: I1127 17:17:20.571489 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-5655c58dd6-5476f_9efe65d7-bb46-43bd-a343-c9a28fbad2ea/cert-manager-webhook/0.log" Nov 27 17:17:34 crc kubenswrapper[4707]: I1127 17:17:34.195258 4707 scope.go:117] "RemoveContainer" containerID="2e1b0c0f7b1da7c68aecd1e64e838a92d9529dbeb83303ba51235175d9145fd2" Nov 27 17:17:34 crc kubenswrapper[4707]: E1127 17:17:34.196009 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-c995m_openshift-machine-config-operator(a83beb0d-8dd1-434a-ace2-933f98e3956f)\"" pod="openshift-machine-config-operator/machine-config-daemon-c995m" podUID="a83beb0d-8dd1-434a-ace2-933f98e3956f" Nov 27 17:17:34 crc kubenswrapper[4707]: I1127 17:17:34.767030 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-7fbb5f6569-cfrns_2de77ffe-0aaf-4a49-86a3-3bb9a0123497/nmstate-console-plugin/0.log" Nov 27 17:17:34 crc kubenswrapper[4707]: I1127 17:17:34.956306 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-svvpq_d5b862a0-5504-43b2-9f8f-fa953310a52d/nmstate-handler/0.log" Nov 27 17:17:35 crc kubenswrapper[4707]: I1127 17:17:35.034934 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-7f946cbc9-zmcdf_444e03f1-9114-4150-8f28-3db614bb32e0/kube-rbac-proxy/0.log" Nov 27 17:17:35 crc kubenswrapper[4707]: I1127 17:17:35.628943 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-5b5b58f5c8-v69r4_0e1c66da-d3ca-4b17-83b7-62518c83721c/nmstate-operator/0.log" Nov 27 17:17:35 crc kubenswrapper[4707]: I1127 17:17:35.643715 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-7f946cbc9-zmcdf_444e03f1-9114-4150-8f28-3db614bb32e0/nmstate-metrics/0.log" Nov 27 17:17:35 crc kubenswrapper[4707]: I1127 17:17:35.818342 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-5f6d4c5ccb-h6v87_d051acac-59f4-434a-85bb-2cf7ec7e7107/nmstate-webhook/0.log" Nov 27 17:17:45 crc kubenswrapper[4707]: I1127 17:17:45.194856 4707 scope.go:117] "RemoveContainer" containerID="2e1b0c0f7b1da7c68aecd1e64e838a92d9529dbeb83303ba51235175d9145fd2" Nov 27 17:17:45 crc kubenswrapper[4707]: E1127 17:17:45.195593 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-c995m_openshift-machine-config-operator(a83beb0d-8dd1-434a-ace2-933f98e3956f)\"" pod="openshift-machine-config-operator/machine-config-daemon-c995m" podUID="a83beb0d-8dd1-434a-ace2-933f98e3956f" Nov 27 17:17:50 crc kubenswrapper[4707]: I1127 17:17:50.359962 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-f8648f98b-fcnd7_0b63951d-0fad-479e-9a1d-e3978d75f5db/kube-rbac-proxy/0.log" Nov 27 17:17:50 crc kubenswrapper[4707]: I1127 17:17:50.556180 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-f8648f98b-fcnd7_0b63951d-0fad-479e-9a1d-e3978d75f5db/controller/0.log" Nov 27 17:17:50 crc kubenswrapper[4707]: I1127 17:17:50.651303 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-l5lzc_f892e6ab-ac16-4945-b724-ddca8efed111/cp-frr-files/0.log" Nov 27 17:17:50 crc kubenswrapper[4707]: I1127 17:17:50.824517 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-l5lzc_f892e6ab-ac16-4945-b724-ddca8efed111/cp-reloader/0.log" Nov 27 17:17:50 crc kubenswrapper[4707]: I1127 17:17:50.833703 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-l5lzc_f892e6ab-ac16-4945-b724-ddca8efed111/cp-frr-files/0.log" Nov 27 17:17:50 crc kubenswrapper[4707]: I1127 17:17:50.892678 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-l5lzc_f892e6ab-ac16-4945-b724-ddca8efed111/cp-metrics/0.log" Nov 27 17:17:50 crc kubenswrapper[4707]: I1127 17:17:50.928651 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-l5lzc_f892e6ab-ac16-4945-b724-ddca8efed111/cp-reloader/0.log" Nov 27 17:17:51 crc kubenswrapper[4707]: I1127 17:17:51.059491 4707 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-l5lzc_f892e6ab-ac16-4945-b724-ddca8efed111/cp-frr-files/0.log" Nov 27 17:17:51 crc kubenswrapper[4707]: I1127 17:17:51.089763 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-l5lzc_f892e6ab-ac16-4945-b724-ddca8efed111/cp-metrics/0.log" Nov 27 17:17:51 crc kubenswrapper[4707]: I1127 17:17:51.106516 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-l5lzc_f892e6ab-ac16-4945-b724-ddca8efed111/cp-reloader/0.log" Nov 27 17:17:51 crc kubenswrapper[4707]: I1127 17:17:51.139110 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-l5lzc_f892e6ab-ac16-4945-b724-ddca8efed111/cp-metrics/0.log" Nov 27 17:17:51 crc kubenswrapper[4707]: I1127 17:17:51.351680 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-l5lzc_f892e6ab-ac16-4945-b724-ddca8efed111/cp-reloader/0.log" Nov 27 17:17:51 crc kubenswrapper[4707]: I1127 17:17:51.353709 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-l5lzc_f892e6ab-ac16-4945-b724-ddca8efed111/cp-frr-files/0.log" Nov 27 17:17:51 crc kubenswrapper[4707]: I1127 17:17:51.371432 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-l5lzc_f892e6ab-ac16-4945-b724-ddca8efed111/cp-metrics/0.log" Nov 27 17:17:51 crc kubenswrapper[4707]: I1127 17:17:51.386196 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-l5lzc_f892e6ab-ac16-4945-b724-ddca8efed111/controller/0.log" Nov 27 17:17:51 crc kubenswrapper[4707]: I1127 17:17:51.727012 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-l5lzc_f892e6ab-ac16-4945-b724-ddca8efed111/kube-rbac-proxy/0.log" Nov 27 17:17:51 crc kubenswrapper[4707]: I1127 17:17:51.802009 4707 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-l5lzc_f892e6ab-ac16-4945-b724-ddca8efed111/frr-metrics/0.log" Nov 27 17:17:51 crc kubenswrapper[4707]: I1127 17:17:51.802773 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-l5lzc_f892e6ab-ac16-4945-b724-ddca8efed111/kube-rbac-proxy-frr/0.log" Nov 27 17:17:51 crc kubenswrapper[4707]: I1127 17:17:51.965554 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-l5lzc_f892e6ab-ac16-4945-b724-ddca8efed111/reloader/0.log" Nov 27 17:17:52 crc kubenswrapper[4707]: I1127 17:17:52.053071 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-7fcb986d4-krwxc_0dfd8369-d85e-41e1-8990-123e0de5e7d4/frr-k8s-webhook-server/0.log" Nov 27 17:17:52 crc kubenswrapper[4707]: I1127 17:17:52.314475 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-669b86894d-bdzj6_f3f84476-2107-4226-be6c-cdcc6380a697/manager/0.log" Nov 27 17:17:52 crc kubenswrapper[4707]: I1127 17:17:52.492268 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-7bddc5f445-p9w4t_3226579b-b878-4494-83ca-cc7288089a7a/webhook-server/0.log" Nov 27 17:17:52 crc kubenswrapper[4707]: I1127 17:17:52.550858 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-m8h9q_8c251e95-c920-4c32-b7be-95367e79b151/kube-rbac-proxy/0.log" Nov 27 17:17:53 crc kubenswrapper[4707]: I1127 17:17:53.169537 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-m8h9q_8c251e95-c920-4c32-b7be-95367e79b151/speaker/0.log" Nov 27 17:17:53 crc kubenswrapper[4707]: I1127 17:17:53.461175 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-l5lzc_f892e6ab-ac16-4945-b724-ddca8efed111/frr/0.log" Nov 27 17:17:59 crc kubenswrapper[4707]: I1127 17:17:59.195697 4707 scope.go:117] 
"RemoveContainer" containerID="2e1b0c0f7b1da7c68aecd1e64e838a92d9529dbeb83303ba51235175d9145fd2" Nov 27 17:17:59 crc kubenswrapper[4707]: E1127 17:17:59.196317 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-c995m_openshift-machine-config-operator(a83beb0d-8dd1-434a-ace2-933f98e3956f)\"" pod="openshift-machine-config-operator/machine-config-daemon-c995m" podUID="a83beb0d-8dd1-434a-ace2-933f98e3956f" Nov 27 17:18:06 crc kubenswrapper[4707]: I1127 17:18:06.284981 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fxmdd5_367eee2c-7dc3-4a7e-a943-131037e46ca1/util/0.log" Nov 27 17:18:06 crc kubenswrapper[4707]: I1127 17:18:06.406912 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fxmdd5_367eee2c-7dc3-4a7e-a943-131037e46ca1/util/0.log" Nov 27 17:18:06 crc kubenswrapper[4707]: I1127 17:18:06.429235 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fxmdd5_367eee2c-7dc3-4a7e-a943-131037e46ca1/pull/0.log" Nov 27 17:18:06 crc kubenswrapper[4707]: I1127 17:18:06.466628 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fxmdd5_367eee2c-7dc3-4a7e-a943-131037e46ca1/pull/0.log" Nov 27 17:18:07 crc kubenswrapper[4707]: I1127 17:18:07.239635 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fxmdd5_367eee2c-7dc3-4a7e-a943-131037e46ca1/extract/0.log" Nov 27 17:18:07 crc kubenswrapper[4707]: I1127 17:18:07.250479 4707 log.go:25] "Finished parsing 
log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fxmdd5_367eee2c-7dc3-4a7e-a943-131037e46ca1/util/0.log" Nov 27 17:18:07 crc kubenswrapper[4707]: I1127 17:18:07.262831 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fxmdd5_367eee2c-7dc3-4a7e-a943-131037e46ca1/pull/0.log" Nov 27 17:18:07 crc kubenswrapper[4707]: I1127 17:18:07.402865 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210rtnm9_578725b0-80a1-4da6-93ae-243ae76cd1b6/util/0.log" Nov 27 17:18:07 crc kubenswrapper[4707]: I1127 17:18:07.562615 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210rtnm9_578725b0-80a1-4da6-93ae-243ae76cd1b6/util/0.log" Nov 27 17:18:07 crc kubenswrapper[4707]: I1127 17:18:07.596551 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210rtnm9_578725b0-80a1-4da6-93ae-243ae76cd1b6/pull/0.log" Nov 27 17:18:07 crc kubenswrapper[4707]: I1127 17:18:07.597160 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210rtnm9_578725b0-80a1-4da6-93ae-243ae76cd1b6/pull/0.log" Nov 27 17:18:07 crc kubenswrapper[4707]: I1127 17:18:07.774318 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210rtnm9_578725b0-80a1-4da6-93ae-243ae76cd1b6/util/0.log" Nov 27 17:18:07 crc kubenswrapper[4707]: I1127 17:18:07.781163 4707 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210rtnm9_578725b0-80a1-4da6-93ae-243ae76cd1b6/extract/0.log" Nov 27 17:18:07 crc kubenswrapper[4707]: I1127 17:18:07.788126 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210rtnm9_578725b0-80a1-4da6-93ae-243ae76cd1b6/pull/0.log" Nov 27 17:18:07 crc kubenswrapper[4707]: I1127 17:18:07.940666 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f838sdgh_b9fee02a-26ff-4676-843d-0159e9b2fe91/util/0.log" Nov 27 17:18:08 crc kubenswrapper[4707]: I1127 17:18:08.089471 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f838sdgh_b9fee02a-26ff-4676-843d-0159e9b2fe91/pull/0.log" Nov 27 17:18:08 crc kubenswrapper[4707]: I1127 17:18:08.092177 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f838sdgh_b9fee02a-26ff-4676-843d-0159e9b2fe91/util/0.log" Nov 27 17:18:08 crc kubenswrapper[4707]: I1127 17:18:08.122420 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f838sdgh_b9fee02a-26ff-4676-843d-0159e9b2fe91/pull/0.log" Nov 27 17:18:08 crc kubenswrapper[4707]: I1127 17:18:08.250301 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f838sdgh_b9fee02a-26ff-4676-843d-0159e9b2fe91/util/0.log" Nov 27 17:18:08 crc kubenswrapper[4707]: I1127 17:18:08.290681 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f838sdgh_b9fee02a-26ff-4676-843d-0159e9b2fe91/pull/0.log" Nov 27 
17:18:08 crc kubenswrapper[4707]: I1127 17:18:08.310373 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f838sdgh_b9fee02a-26ff-4676-843d-0159e9b2fe91/extract/0.log" Nov 27 17:18:09 crc kubenswrapper[4707]: I1127 17:18:09.028516 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-hmrsj_c38b0f06-ab2f-48e1-9181-d9410e8896be/extract-utilities/0.log" Nov 27 17:18:09 crc kubenswrapper[4707]: I1127 17:18:09.193661 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-hmrsj_c38b0f06-ab2f-48e1-9181-d9410e8896be/extract-content/0.log" Nov 27 17:18:09 crc kubenswrapper[4707]: I1127 17:18:09.221531 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-hmrsj_c38b0f06-ab2f-48e1-9181-d9410e8896be/extract-content/0.log" Nov 27 17:18:09 crc kubenswrapper[4707]: I1127 17:18:09.222328 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-hmrsj_c38b0f06-ab2f-48e1-9181-d9410e8896be/extract-utilities/0.log" Nov 27 17:18:09 crc kubenswrapper[4707]: I1127 17:18:09.430104 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-hmrsj_c38b0f06-ab2f-48e1-9181-d9410e8896be/extract-utilities/0.log" Nov 27 17:18:09 crc kubenswrapper[4707]: I1127 17:18:09.442312 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-hmrsj_c38b0f06-ab2f-48e1-9181-d9410e8896be/extract-content/0.log" Nov 27 17:18:09 crc kubenswrapper[4707]: I1127 17:18:09.656257 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-b989k_d6b92c04-fcc0-4d96-8200-3edd228dd326/extract-utilities/0.log" Nov 27 17:18:09 crc kubenswrapper[4707]: I1127 17:18:09.825035 4707 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-b989k_d6b92c04-fcc0-4d96-8200-3edd228dd326/extract-content/0.log" Nov 27 17:18:09 crc kubenswrapper[4707]: I1127 17:18:09.858209 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-b989k_d6b92c04-fcc0-4d96-8200-3edd228dd326/extract-utilities/0.log" Nov 27 17:18:09 crc kubenswrapper[4707]: I1127 17:18:09.889817 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-b989k_d6b92c04-fcc0-4d96-8200-3edd228dd326/extract-content/0.log" Nov 27 17:18:10 crc kubenswrapper[4707]: I1127 17:18:10.048071 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-b989k_d6b92c04-fcc0-4d96-8200-3edd228dd326/extract-utilities/0.log" Nov 27 17:18:10 crc kubenswrapper[4707]: I1127 17:18:10.072335 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-hmrsj_c38b0f06-ab2f-48e1-9181-d9410e8896be/registry-server/0.log" Nov 27 17:18:10 crc kubenswrapper[4707]: I1127 17:18:10.143321 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-b989k_d6b92c04-fcc0-4d96-8200-3edd228dd326/extract-content/0.log" Nov 27 17:18:10 crc kubenswrapper[4707]: I1127 17:18:10.325633 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-k5pzx_d8fb3604-08bd-4ad8-9838-8275101534c7/marketplace-operator/0.log" Nov 27 17:18:10 crc kubenswrapper[4707]: I1127 17:18:10.409635 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-zn2pc_363ac0a9-1890-4420-9afd-5a2b2ead2c51/extract-utilities/0.log" Nov 27 17:18:10 crc kubenswrapper[4707]: I1127 17:18:10.628978 4707 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-marketplace-zn2pc_363ac0a9-1890-4420-9afd-5a2b2ead2c51/extract-utilities/0.log" Nov 27 17:18:10 crc kubenswrapper[4707]: I1127 17:18:10.631914 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-zn2pc_363ac0a9-1890-4420-9afd-5a2b2ead2c51/extract-content/0.log" Nov 27 17:18:10 crc kubenswrapper[4707]: I1127 17:18:10.692508 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-zn2pc_363ac0a9-1890-4420-9afd-5a2b2ead2c51/extract-content/0.log" Nov 27 17:18:10 crc kubenswrapper[4707]: I1127 17:18:10.773475 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-b989k_d6b92c04-fcc0-4d96-8200-3edd228dd326/registry-server/0.log" Nov 27 17:18:10 crc kubenswrapper[4707]: I1127 17:18:10.878209 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-zn2pc_363ac0a9-1890-4420-9afd-5a2b2ead2c51/extract-utilities/0.log" Nov 27 17:18:10 crc kubenswrapper[4707]: I1127 17:18:10.887873 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-zn2pc_363ac0a9-1890-4420-9afd-5a2b2ead2c51/extract-content/0.log" Nov 27 17:18:11 crc kubenswrapper[4707]: I1127 17:18:11.012102 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-zthlf_61f2d660-6553-4b21-b002-7bbc56c701ea/extract-utilities/0.log" Nov 27 17:18:11 crc kubenswrapper[4707]: I1127 17:18:11.057830 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-zn2pc_363ac0a9-1890-4420-9afd-5a2b2ead2c51/registry-server/0.log" Nov 27 17:18:11 crc kubenswrapper[4707]: I1127 17:18:11.204979 4707 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-operators-zthlf_61f2d660-6553-4b21-b002-7bbc56c701ea/extract-utilities/0.log" Nov 27 17:18:11 crc kubenswrapper[4707]: I1127 17:18:11.226659 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-zthlf_61f2d660-6553-4b21-b002-7bbc56c701ea/extract-content/0.log" Nov 27 17:18:11 crc kubenswrapper[4707]: I1127 17:18:11.227064 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-zthlf_61f2d660-6553-4b21-b002-7bbc56c701ea/extract-content/0.log" Nov 27 17:18:11 crc kubenswrapper[4707]: I1127 17:18:11.380925 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-zthlf_61f2d660-6553-4b21-b002-7bbc56c701ea/extract-utilities/0.log" Nov 27 17:18:11 crc kubenswrapper[4707]: I1127 17:18:11.383614 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-zthlf_61f2d660-6553-4b21-b002-7bbc56c701ea/extract-content/0.log" Nov 27 17:18:11 crc kubenswrapper[4707]: I1127 17:18:11.980085 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-zthlf_61f2d660-6553-4b21-b002-7bbc56c701ea/registry-server/0.log" Nov 27 17:18:13 crc kubenswrapper[4707]: I1127 17:18:13.195267 4707 scope.go:117] "RemoveContainer" containerID="2e1b0c0f7b1da7c68aecd1e64e838a92d9529dbeb83303ba51235175d9145fd2" Nov 27 17:18:13 crc kubenswrapper[4707]: E1127 17:18:13.196722 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-c995m_openshift-machine-config-operator(a83beb0d-8dd1-434a-ace2-933f98e3956f)\"" pod="openshift-machine-config-operator/machine-config-daemon-c995m" podUID="a83beb0d-8dd1-434a-ace2-933f98e3956f" Nov 27 17:18:23 crc 
kubenswrapper[4707]: I1127 17:18:23.802534 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-668cf9dfbb-zwnr6_fb96b00d-b40a-4af9-8bc5-8b4dfbe2a5fa/prometheus-operator/0.log" Nov 27 17:18:24 crc kubenswrapper[4707]: I1127 17:18:24.010900 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-7677977bcd-fx2tz_e1ab71b3-d476-4099-b4d8-c59b437dd4f7/prometheus-operator-admission-webhook/0.log" Nov 27 17:18:24 crc kubenswrapper[4707]: I1127 17:18:24.071647 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-7677977bcd-tp2hp_4208373c-9a43-406a-8262-0aff47b60f85/prometheus-operator-admission-webhook/0.log" Nov 27 17:18:24 crc kubenswrapper[4707]: I1127 17:18:24.162417 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_observability-operator-d8bb48f5d-82wd5_e96caaf7-dd4f-4734-8601-23d38df9005f/operator/0.log" Nov 27 17:18:24 crc kubenswrapper[4707]: I1127 17:18:24.272881 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_perses-operator-5446b9c989-jgbmm_0c162a13-e55c-44f0-9ab7-cc7ce6d87605/perses-operator/0.log" Nov 27 17:18:26 crc kubenswrapper[4707]: I1127 17:18:26.195738 4707 scope.go:117] "RemoveContainer" containerID="2e1b0c0f7b1da7c68aecd1e64e838a92d9529dbeb83303ba51235175d9145fd2" Nov 27 17:18:26 crc kubenswrapper[4707]: E1127 17:18:26.196253 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-c995m_openshift-machine-config-operator(a83beb0d-8dd1-434a-ace2-933f98e3956f)\"" pod="openshift-machine-config-operator/machine-config-daemon-c995m" podUID="a83beb0d-8dd1-434a-ace2-933f98e3956f" Nov 27 17:18:38 crc kubenswrapper[4707]: 
I1127 17:18:38.195532 4707 scope.go:117] "RemoveContainer" containerID="2e1b0c0f7b1da7c68aecd1e64e838a92d9529dbeb83303ba51235175d9145fd2" Nov 27 17:18:38 crc kubenswrapper[4707]: E1127 17:18:38.196423 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-c995m_openshift-machine-config-operator(a83beb0d-8dd1-434a-ace2-933f98e3956f)\"" pod="openshift-machine-config-operator/machine-config-daemon-c995m" podUID="a83beb0d-8dd1-434a-ace2-933f98e3956f" Nov 27 17:18:51 crc kubenswrapper[4707]: I1127 17:18:51.194747 4707 scope.go:117] "RemoveContainer" containerID="2e1b0c0f7b1da7c68aecd1e64e838a92d9529dbeb83303ba51235175d9145fd2" Nov 27 17:18:51 crc kubenswrapper[4707]: E1127 17:18:51.195604 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-c995m_openshift-machine-config-operator(a83beb0d-8dd1-434a-ace2-933f98e3956f)\"" pod="openshift-machine-config-operator/machine-config-daemon-c995m" podUID="a83beb0d-8dd1-434a-ace2-933f98e3956f" Nov 27 17:19:02 crc kubenswrapper[4707]: I1127 17:19:02.196234 4707 scope.go:117] "RemoveContainer" containerID="2e1b0c0f7b1da7c68aecd1e64e838a92d9529dbeb83303ba51235175d9145fd2" Nov 27 17:19:02 crc kubenswrapper[4707]: E1127 17:19:02.197275 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-c995m_openshift-machine-config-operator(a83beb0d-8dd1-434a-ace2-933f98e3956f)\"" pod="openshift-machine-config-operator/machine-config-daemon-c995m" podUID="a83beb0d-8dd1-434a-ace2-933f98e3956f" Nov 27 17:19:13 crc 
kubenswrapper[4707]: I1127 17:19:13.195809 4707 scope.go:117] "RemoveContainer" containerID="2e1b0c0f7b1da7c68aecd1e64e838a92d9529dbeb83303ba51235175d9145fd2" Nov 27 17:19:13 crc kubenswrapper[4707]: E1127 17:19:13.197730 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-c995m_openshift-machine-config-operator(a83beb0d-8dd1-434a-ace2-933f98e3956f)\"" pod="openshift-machine-config-operator/machine-config-daemon-c995m" podUID="a83beb0d-8dd1-434a-ace2-933f98e3956f" Nov 27 17:19:25 crc kubenswrapper[4707]: I1127 17:19:25.218550 4707 scope.go:117] "RemoveContainer" containerID="2e1b0c0f7b1da7c68aecd1e64e838a92d9529dbeb83303ba51235175d9145fd2" Nov 27 17:19:25 crc kubenswrapper[4707]: E1127 17:19:25.219512 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-c995m_openshift-machine-config-operator(a83beb0d-8dd1-434a-ace2-933f98e3956f)\"" pod="openshift-machine-config-operator/machine-config-daemon-c995m" podUID="a83beb0d-8dd1-434a-ace2-933f98e3956f" Nov 27 17:19:36 crc kubenswrapper[4707]: I1127 17:19:36.195719 4707 scope.go:117] "RemoveContainer" containerID="2e1b0c0f7b1da7c68aecd1e64e838a92d9529dbeb83303ba51235175d9145fd2" Nov 27 17:19:36 crc kubenswrapper[4707]: E1127 17:19:36.196903 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-c995m_openshift-machine-config-operator(a83beb0d-8dd1-434a-ace2-933f98e3956f)\"" pod="openshift-machine-config-operator/machine-config-daemon-c995m" podUID="a83beb0d-8dd1-434a-ace2-933f98e3956f" Nov 
27 17:19:50 crc kubenswrapper[4707]: I1127 17:19:50.196162 4707 scope.go:117] "RemoveContainer" containerID="2e1b0c0f7b1da7c68aecd1e64e838a92d9529dbeb83303ba51235175d9145fd2" Nov 27 17:19:50 crc kubenswrapper[4707]: E1127 17:19:50.196929 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-c995m_openshift-machine-config-operator(a83beb0d-8dd1-434a-ace2-933f98e3956f)\"" pod="openshift-machine-config-operator/machine-config-daemon-c995m" podUID="a83beb0d-8dd1-434a-ace2-933f98e3956f" Nov 27 17:19:59 crc kubenswrapper[4707]: I1127 17:19:59.515627 4707 generic.go:334] "Generic (PLEG): container finished" podID="e2bda7b9-df89-4f33-b48b-4930fe74f710" containerID="fc3fdeff6fbf6dca7f846ede8e935eef09347beb40d58dda0d5bfda94123d52b" exitCode=0 Nov 27 17:19:59 crc kubenswrapper[4707]: I1127 17:19:59.515684 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-fblqb/must-gather-l6msz" event={"ID":"e2bda7b9-df89-4f33-b48b-4930fe74f710","Type":"ContainerDied","Data":"fc3fdeff6fbf6dca7f846ede8e935eef09347beb40d58dda0d5bfda94123d52b"} Nov 27 17:19:59 crc kubenswrapper[4707]: I1127 17:19:59.516693 4707 scope.go:117] "RemoveContainer" containerID="fc3fdeff6fbf6dca7f846ede8e935eef09347beb40d58dda0d5bfda94123d52b" Nov 27 17:20:00 crc kubenswrapper[4707]: I1127 17:20:00.480119 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-fblqb_must-gather-l6msz_e2bda7b9-df89-4f33-b48b-4930fe74f710/gather/0.log" Nov 27 17:20:03 crc kubenswrapper[4707]: I1127 17:20:03.196306 4707 scope.go:117] "RemoveContainer" containerID="2e1b0c0f7b1da7c68aecd1e64e838a92d9529dbeb83303ba51235175d9145fd2" Nov 27 17:20:03 crc kubenswrapper[4707]: E1127 17:20:03.196828 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-c995m_openshift-machine-config-operator(a83beb0d-8dd1-434a-ace2-933f98e3956f)\"" pod="openshift-machine-config-operator/machine-config-daemon-c995m" podUID="a83beb0d-8dd1-434a-ace2-933f98e3956f" Nov 27 17:20:11 crc kubenswrapper[4707]: I1127 17:20:11.859453 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-fblqb/must-gather-l6msz"] Nov 27 17:20:11 crc kubenswrapper[4707]: I1127 17:20:11.860234 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-fblqb/must-gather-l6msz" podUID="e2bda7b9-df89-4f33-b48b-4930fe74f710" containerName="copy" containerID="cri-o://da68116461e63a7db76a1a74bf56bfc2556fc1c263007c9268ee4dc92bff32bd" gracePeriod=2 Nov 27 17:20:11 crc kubenswrapper[4707]: I1127 17:20:11.867992 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-fblqb/must-gather-l6msz"] Nov 27 17:20:12 crc kubenswrapper[4707]: I1127 17:20:12.344516 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-fblqb_must-gather-l6msz_e2bda7b9-df89-4f33-b48b-4930fe74f710/copy/0.log" Nov 27 17:20:12 crc kubenswrapper[4707]: I1127 17:20:12.345276 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-fblqb/must-gather-l6msz" Nov 27 17:20:12 crc kubenswrapper[4707]: I1127 17:20:12.419156 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/e2bda7b9-df89-4f33-b48b-4930fe74f710-must-gather-output\") pod \"e2bda7b9-df89-4f33-b48b-4930fe74f710\" (UID: \"e2bda7b9-df89-4f33-b48b-4930fe74f710\") " Nov 27 17:20:12 crc kubenswrapper[4707]: I1127 17:20:12.419431 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mssnj\" (UniqueName: \"kubernetes.io/projected/e2bda7b9-df89-4f33-b48b-4930fe74f710-kube-api-access-mssnj\") pod \"e2bda7b9-df89-4f33-b48b-4930fe74f710\" (UID: \"e2bda7b9-df89-4f33-b48b-4930fe74f710\") " Nov 27 17:20:12 crc kubenswrapper[4707]: I1127 17:20:12.426802 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e2bda7b9-df89-4f33-b48b-4930fe74f710-kube-api-access-mssnj" (OuterVolumeSpecName: "kube-api-access-mssnj") pod "e2bda7b9-df89-4f33-b48b-4930fe74f710" (UID: "e2bda7b9-df89-4f33-b48b-4930fe74f710"). InnerVolumeSpecName "kube-api-access-mssnj". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 17:20:12 crc kubenswrapper[4707]: I1127 17:20:12.521700 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mssnj\" (UniqueName: \"kubernetes.io/projected/e2bda7b9-df89-4f33-b48b-4930fe74f710-kube-api-access-mssnj\") on node \"crc\" DevicePath \"\"" Nov 27 17:20:12 crc kubenswrapper[4707]: I1127 17:20:12.576709 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e2bda7b9-df89-4f33-b48b-4930fe74f710-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "e2bda7b9-df89-4f33-b48b-4930fe74f710" (UID: "e2bda7b9-df89-4f33-b48b-4930fe74f710"). InnerVolumeSpecName "must-gather-output". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 27 17:20:12 crc kubenswrapper[4707]: I1127 17:20:12.623892 4707 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/e2bda7b9-df89-4f33-b48b-4930fe74f710-must-gather-output\") on node \"crc\" DevicePath \"\"" Nov 27 17:20:12 crc kubenswrapper[4707]: I1127 17:20:12.666531 4707 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-fblqb_must-gather-l6msz_e2bda7b9-df89-4f33-b48b-4930fe74f710/copy/0.log" Nov 27 17:20:12 crc kubenswrapper[4707]: I1127 17:20:12.666844 4707 generic.go:334] "Generic (PLEG): container finished" podID="e2bda7b9-df89-4f33-b48b-4930fe74f710" containerID="da68116461e63a7db76a1a74bf56bfc2556fc1c263007c9268ee4dc92bff32bd" exitCode=143 Nov 27 17:20:12 crc kubenswrapper[4707]: I1127 17:20:12.666890 4707 scope.go:117] "RemoveContainer" containerID="da68116461e63a7db76a1a74bf56bfc2556fc1c263007c9268ee4dc92bff32bd" Nov 27 17:20:12 crc kubenswrapper[4707]: I1127 17:20:12.666936 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-fblqb/must-gather-l6msz" Nov 27 17:20:12 crc kubenswrapper[4707]: I1127 17:20:12.687317 4707 scope.go:117] "RemoveContainer" containerID="fc3fdeff6fbf6dca7f846ede8e935eef09347beb40d58dda0d5bfda94123d52b" Nov 27 17:20:12 crc kubenswrapper[4707]: I1127 17:20:12.760332 4707 scope.go:117] "RemoveContainer" containerID="da68116461e63a7db76a1a74bf56bfc2556fc1c263007c9268ee4dc92bff32bd" Nov 27 17:20:12 crc kubenswrapper[4707]: E1127 17:20:12.760758 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"da68116461e63a7db76a1a74bf56bfc2556fc1c263007c9268ee4dc92bff32bd\": container with ID starting with da68116461e63a7db76a1a74bf56bfc2556fc1c263007c9268ee4dc92bff32bd not found: ID does not exist" containerID="da68116461e63a7db76a1a74bf56bfc2556fc1c263007c9268ee4dc92bff32bd" Nov 27 17:20:12 crc kubenswrapper[4707]: I1127 17:20:12.760803 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"da68116461e63a7db76a1a74bf56bfc2556fc1c263007c9268ee4dc92bff32bd"} err="failed to get container status \"da68116461e63a7db76a1a74bf56bfc2556fc1c263007c9268ee4dc92bff32bd\": rpc error: code = NotFound desc = could not find container \"da68116461e63a7db76a1a74bf56bfc2556fc1c263007c9268ee4dc92bff32bd\": container with ID starting with da68116461e63a7db76a1a74bf56bfc2556fc1c263007c9268ee4dc92bff32bd not found: ID does not exist" Nov 27 17:20:12 crc kubenswrapper[4707]: I1127 17:20:12.760831 4707 scope.go:117] "RemoveContainer" containerID="fc3fdeff6fbf6dca7f846ede8e935eef09347beb40d58dda0d5bfda94123d52b" Nov 27 17:20:12 crc kubenswrapper[4707]: E1127 17:20:12.761271 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fc3fdeff6fbf6dca7f846ede8e935eef09347beb40d58dda0d5bfda94123d52b\": container with ID starting with 
fc3fdeff6fbf6dca7f846ede8e935eef09347beb40d58dda0d5bfda94123d52b not found: ID does not exist" containerID="fc3fdeff6fbf6dca7f846ede8e935eef09347beb40d58dda0d5bfda94123d52b" Nov 27 17:20:12 crc kubenswrapper[4707]: I1127 17:20:12.761306 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fc3fdeff6fbf6dca7f846ede8e935eef09347beb40d58dda0d5bfda94123d52b"} err="failed to get container status \"fc3fdeff6fbf6dca7f846ede8e935eef09347beb40d58dda0d5bfda94123d52b\": rpc error: code = NotFound desc = could not find container \"fc3fdeff6fbf6dca7f846ede8e935eef09347beb40d58dda0d5bfda94123d52b\": container with ID starting with fc3fdeff6fbf6dca7f846ede8e935eef09347beb40d58dda0d5bfda94123d52b not found: ID does not exist" Nov 27 17:20:13 crc kubenswrapper[4707]: I1127 17:20:13.205610 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e2bda7b9-df89-4f33-b48b-4930fe74f710" path="/var/lib/kubelet/pods/e2bda7b9-df89-4f33-b48b-4930fe74f710/volumes" Nov 27 17:20:16 crc kubenswrapper[4707]: I1127 17:20:16.195668 4707 scope.go:117] "RemoveContainer" containerID="2e1b0c0f7b1da7c68aecd1e64e838a92d9529dbeb83303ba51235175d9145fd2" Nov 27 17:20:16 crc kubenswrapper[4707]: E1127 17:20:16.196807 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-c995m_openshift-machine-config-operator(a83beb0d-8dd1-434a-ace2-933f98e3956f)\"" pod="openshift-machine-config-operator/machine-config-daemon-c995m" podUID="a83beb0d-8dd1-434a-ace2-933f98e3956f" Nov 27 17:20:30 crc kubenswrapper[4707]: I1127 17:20:30.195793 4707 scope.go:117] "RemoveContainer" containerID="2e1b0c0f7b1da7c68aecd1e64e838a92d9529dbeb83303ba51235175d9145fd2" Nov 27 17:20:30 crc kubenswrapper[4707]: E1127 17:20:30.196620 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed 
to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-c995m_openshift-machine-config-operator(a83beb0d-8dd1-434a-ace2-933f98e3956f)\"" pod="openshift-machine-config-operator/machine-config-daemon-c995m" podUID="a83beb0d-8dd1-434a-ace2-933f98e3956f" Nov 27 17:20:45 crc kubenswrapper[4707]: I1127 17:20:45.207544 4707 scope.go:117] "RemoveContainer" containerID="2e1b0c0f7b1da7c68aecd1e64e838a92d9529dbeb83303ba51235175d9145fd2" Nov 27 17:20:45 crc kubenswrapper[4707]: E1127 17:20:45.208872 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-c995m_openshift-machine-config-operator(a83beb0d-8dd1-434a-ace2-933f98e3956f)\"" pod="openshift-machine-config-operator/machine-config-daemon-c995m" podUID="a83beb0d-8dd1-434a-ace2-933f98e3956f" Nov 27 17:20:58 crc kubenswrapper[4707]: I1127 17:20:58.195821 4707 scope.go:117] "RemoveContainer" containerID="2e1b0c0f7b1da7c68aecd1e64e838a92d9529dbeb83303ba51235175d9145fd2" Nov 27 17:20:58 crc kubenswrapper[4707]: E1127 17:20:58.198021 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-c995m_openshift-machine-config-operator(a83beb0d-8dd1-434a-ace2-933f98e3956f)\"" pod="openshift-machine-config-operator/machine-config-daemon-c995m" podUID="a83beb0d-8dd1-434a-ace2-933f98e3956f" Nov 27 17:21:13 crc kubenswrapper[4707]: I1127 17:21:13.195988 4707 scope.go:117] "RemoveContainer" containerID="2e1b0c0f7b1da7c68aecd1e64e838a92d9529dbeb83303ba51235175d9145fd2" Nov 27 17:21:13 crc kubenswrapper[4707]: E1127 17:21:13.197104 4707 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-c995m_openshift-machine-config-operator(a83beb0d-8dd1-434a-ace2-933f98e3956f)\"" pod="openshift-machine-config-operator/machine-config-daemon-c995m" podUID="a83beb0d-8dd1-434a-ace2-933f98e3956f" Nov 27 17:21:27 crc kubenswrapper[4707]: I1127 17:21:27.196774 4707 scope.go:117] "RemoveContainer" containerID="2e1b0c0f7b1da7c68aecd1e64e838a92d9529dbeb83303ba51235175d9145fd2" Nov 27 17:21:27 crc kubenswrapper[4707]: E1127 17:21:27.198302 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-c995m_openshift-machine-config-operator(a83beb0d-8dd1-434a-ace2-933f98e3956f)\"" pod="openshift-machine-config-operator/machine-config-daemon-c995m" podUID="a83beb0d-8dd1-434a-ace2-933f98e3956f" Nov 27 17:21:40 crc kubenswrapper[4707]: I1127 17:21:40.195720 4707 scope.go:117] "RemoveContainer" containerID="2e1b0c0f7b1da7c68aecd1e64e838a92d9529dbeb83303ba51235175d9145fd2" Nov 27 17:21:40 crc kubenswrapper[4707]: E1127 17:21:40.196624 4707 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-c995m_openshift-machine-config-operator(a83beb0d-8dd1-434a-ace2-933f98e3956f)\"" pod="openshift-machine-config-operator/machine-config-daemon-c995m" podUID="a83beb0d-8dd1-434a-ace2-933f98e3956f" Nov 27 17:21:54 crc kubenswrapper[4707]: I1127 17:21:54.195473 4707 scope.go:117] "RemoveContainer" containerID="2e1b0c0f7b1da7c68aecd1e64e838a92d9529dbeb83303ba51235175d9145fd2" Nov 27 17:21:54 crc kubenswrapper[4707]: E1127 17:21:54.196374 4707 pod_workers.go:1301] 
"Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-c995m_openshift-machine-config-operator(a83beb0d-8dd1-434a-ace2-933f98e3956f)\"" pod="openshift-machine-config-operator/machine-config-daemon-c995m" podUID="a83beb0d-8dd1-434a-ace2-933f98e3956f" Nov 27 17:22:09 crc kubenswrapper[4707]: I1127 17:22:09.197296 4707 scope.go:117] "RemoveContainer" containerID="2e1b0c0f7b1da7c68aecd1e64e838a92d9529dbeb83303ba51235175d9145fd2" Nov 27 17:22:09 crc kubenswrapper[4707]: I1127 17:22:09.803938 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-c995m" event={"ID":"a83beb0d-8dd1-434a-ace2-933f98e3956f","Type":"ContainerStarted","Data":"c3743566ffa4325aef8687cac54492107ca2c87e50b0b4081fe36bb1c097d7f0"} Nov 27 17:22:29 crc kubenswrapper[4707]: I1127 17:22:29.214907 4707 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-nnf8p"] Nov 27 17:22:29 crc kubenswrapper[4707]: E1127 17:22:29.215987 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="86a2c4de-6496-46a9-b73d-26c769e34f3f" containerName="registry-server" Nov 27 17:22:29 crc kubenswrapper[4707]: I1127 17:22:29.216003 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="86a2c4de-6496-46a9-b73d-26c769e34f3f" containerName="registry-server" Nov 27 17:22:29 crc kubenswrapper[4707]: E1127 17:22:29.216019 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e2bda7b9-df89-4f33-b48b-4930fe74f710" containerName="gather" Nov 27 17:22:29 crc kubenswrapper[4707]: I1127 17:22:29.216027 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="e2bda7b9-df89-4f33-b48b-4930fe74f710" containerName="gather" Nov 27 17:22:29 crc kubenswrapper[4707]: E1127 17:22:29.216040 4707 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="86a2c4de-6496-46a9-b73d-26c769e34f3f" containerName="extract-utilities" Nov 27 17:22:29 crc kubenswrapper[4707]: I1127 17:22:29.216048 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="86a2c4de-6496-46a9-b73d-26c769e34f3f" containerName="extract-utilities" Nov 27 17:22:29 crc kubenswrapper[4707]: E1127 17:22:29.216074 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e2bda7b9-df89-4f33-b48b-4930fe74f710" containerName="copy" Nov 27 17:22:29 crc kubenswrapper[4707]: I1127 17:22:29.216081 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="e2bda7b9-df89-4f33-b48b-4930fe74f710" containerName="copy" Nov 27 17:22:29 crc kubenswrapper[4707]: E1127 17:22:29.216102 4707 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="86a2c4de-6496-46a9-b73d-26c769e34f3f" containerName="extract-content" Nov 27 17:22:29 crc kubenswrapper[4707]: I1127 17:22:29.216109 4707 state_mem.go:107] "Deleted CPUSet assignment" podUID="86a2c4de-6496-46a9-b73d-26c769e34f3f" containerName="extract-content" Nov 27 17:22:29 crc kubenswrapper[4707]: I1127 17:22:29.216348 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="86a2c4de-6496-46a9-b73d-26c769e34f3f" containerName="registry-server" Nov 27 17:22:29 crc kubenswrapper[4707]: I1127 17:22:29.216440 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="e2bda7b9-df89-4f33-b48b-4930fe74f710" containerName="copy" Nov 27 17:22:29 crc kubenswrapper[4707]: I1127 17:22:29.216458 4707 memory_manager.go:354] "RemoveStaleState removing state" podUID="e2bda7b9-df89-4f33-b48b-4930fe74f710" containerName="gather" Nov 27 17:22:29 crc kubenswrapper[4707]: I1127 17:22:29.219095 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-nnf8p" Nov 27 17:22:29 crc kubenswrapper[4707]: I1127 17:22:29.235497 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-nnf8p"] Nov 27 17:22:29 crc kubenswrapper[4707]: I1127 17:22:29.252868 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4b0793e3-9b96-4eec-8684-13738a55dd44-utilities\") pod \"certified-operators-nnf8p\" (UID: \"4b0793e3-9b96-4eec-8684-13738a55dd44\") " pod="openshift-marketplace/certified-operators-nnf8p" Nov 27 17:22:29 crc kubenswrapper[4707]: I1127 17:22:29.252978 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hdbp6\" (UniqueName: \"kubernetes.io/projected/4b0793e3-9b96-4eec-8684-13738a55dd44-kube-api-access-hdbp6\") pod \"certified-operators-nnf8p\" (UID: \"4b0793e3-9b96-4eec-8684-13738a55dd44\") " pod="openshift-marketplace/certified-operators-nnf8p" Nov 27 17:22:29 crc kubenswrapper[4707]: I1127 17:22:29.253215 4707 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4b0793e3-9b96-4eec-8684-13738a55dd44-catalog-content\") pod \"certified-operators-nnf8p\" (UID: \"4b0793e3-9b96-4eec-8684-13738a55dd44\") " pod="openshift-marketplace/certified-operators-nnf8p" Nov 27 17:22:29 crc kubenswrapper[4707]: I1127 17:22:29.355179 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4b0793e3-9b96-4eec-8684-13738a55dd44-utilities\") pod \"certified-operators-nnf8p\" (UID: \"4b0793e3-9b96-4eec-8684-13738a55dd44\") " pod="openshift-marketplace/certified-operators-nnf8p" Nov 27 17:22:29 crc kubenswrapper[4707]: I1127 17:22:29.355278 4707 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-hdbp6\" (UniqueName: \"kubernetes.io/projected/4b0793e3-9b96-4eec-8684-13738a55dd44-kube-api-access-hdbp6\") pod \"certified-operators-nnf8p\" (UID: \"4b0793e3-9b96-4eec-8684-13738a55dd44\") " pod="openshift-marketplace/certified-operators-nnf8p" Nov 27 17:22:29 crc kubenswrapper[4707]: I1127 17:22:29.355395 4707 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4b0793e3-9b96-4eec-8684-13738a55dd44-catalog-content\") pod \"certified-operators-nnf8p\" (UID: \"4b0793e3-9b96-4eec-8684-13738a55dd44\") " pod="openshift-marketplace/certified-operators-nnf8p" Nov 27 17:22:29 crc kubenswrapper[4707]: I1127 17:22:29.355850 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4b0793e3-9b96-4eec-8684-13738a55dd44-utilities\") pod \"certified-operators-nnf8p\" (UID: \"4b0793e3-9b96-4eec-8684-13738a55dd44\") " pod="openshift-marketplace/certified-operators-nnf8p" Nov 27 17:22:29 crc kubenswrapper[4707]: I1127 17:22:29.355874 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4b0793e3-9b96-4eec-8684-13738a55dd44-catalog-content\") pod \"certified-operators-nnf8p\" (UID: \"4b0793e3-9b96-4eec-8684-13738a55dd44\") " pod="openshift-marketplace/certified-operators-nnf8p" Nov 27 17:22:29 crc kubenswrapper[4707]: I1127 17:22:29.381934 4707 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hdbp6\" (UniqueName: \"kubernetes.io/projected/4b0793e3-9b96-4eec-8684-13738a55dd44-kube-api-access-hdbp6\") pod \"certified-operators-nnf8p\" (UID: \"4b0793e3-9b96-4eec-8684-13738a55dd44\") " pod="openshift-marketplace/certified-operators-nnf8p" Nov 27 17:22:29 crc kubenswrapper[4707]: I1127 17:22:29.547282 4707 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-nnf8p" Nov 27 17:22:30 crc kubenswrapper[4707]: I1127 17:22:30.113002 4707 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-nnf8p"] Nov 27 17:22:30 crc kubenswrapper[4707]: W1127 17:22:30.115886 4707 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4b0793e3_9b96_4eec_8684_13738a55dd44.slice/crio-7101fd9a9f0d2d1f56b2d2ed0dc8c0d91bf1ac482d35d6172688431b398fe2b3 WatchSource:0}: Error finding container 7101fd9a9f0d2d1f56b2d2ed0dc8c0d91bf1ac482d35d6172688431b398fe2b3: Status 404 returned error can't find the container with id 7101fd9a9f0d2d1f56b2d2ed0dc8c0d91bf1ac482d35d6172688431b398fe2b3 Nov 27 17:22:31 crc kubenswrapper[4707]: I1127 17:22:31.028199 4707 generic.go:334] "Generic (PLEG): container finished" podID="4b0793e3-9b96-4eec-8684-13738a55dd44" containerID="cf956448ea93c63d6efeff8da739ad48cd8c02b3174ff745275e6d5f310fa462" exitCode=0 Nov 27 17:22:31 crc kubenswrapper[4707]: I1127 17:22:31.028324 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nnf8p" event={"ID":"4b0793e3-9b96-4eec-8684-13738a55dd44","Type":"ContainerDied","Data":"cf956448ea93c63d6efeff8da739ad48cd8c02b3174ff745275e6d5f310fa462"} Nov 27 17:22:31 crc kubenswrapper[4707]: I1127 17:22:31.028580 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nnf8p" event={"ID":"4b0793e3-9b96-4eec-8684-13738a55dd44","Type":"ContainerStarted","Data":"7101fd9a9f0d2d1f56b2d2ed0dc8c0d91bf1ac482d35d6172688431b398fe2b3"} Nov 27 17:22:31 crc kubenswrapper[4707]: I1127 17:22:31.030469 4707 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Nov 27 17:22:33 crc kubenswrapper[4707]: I1127 17:22:33.049918 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/certified-operators-nnf8p" event={"ID":"4b0793e3-9b96-4eec-8684-13738a55dd44","Type":"ContainerStarted","Data":"1cf01f90883f16475811338411dcd2cc38142416a237b0ee97465a5bfa25e34e"} Nov 27 17:22:34 crc kubenswrapper[4707]: I1127 17:22:34.062169 4707 generic.go:334] "Generic (PLEG): container finished" podID="4b0793e3-9b96-4eec-8684-13738a55dd44" containerID="1cf01f90883f16475811338411dcd2cc38142416a237b0ee97465a5bfa25e34e" exitCode=0 Nov 27 17:22:34 crc kubenswrapper[4707]: I1127 17:22:34.062292 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nnf8p" event={"ID":"4b0793e3-9b96-4eec-8684-13738a55dd44","Type":"ContainerDied","Data":"1cf01f90883f16475811338411dcd2cc38142416a237b0ee97465a5bfa25e34e"} Nov 27 17:22:36 crc kubenswrapper[4707]: I1127 17:22:36.282445 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nnf8p" event={"ID":"4b0793e3-9b96-4eec-8684-13738a55dd44","Type":"ContainerStarted","Data":"454f9a1a230d759856d640299fa86653f148d7140979e70c111614120fd0a686"} Nov 27 17:22:36 crc kubenswrapper[4707]: I1127 17:22:36.307417 4707 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-nnf8p" podStartSLOduration=3.735814766 podStartE2EDuration="7.307394371s" podCreationTimestamp="2025-11-27 17:22:29 +0000 UTC" firstStartedPulling="2025-11-27 17:22:31.030213788 +0000 UTC m=+4726.661662556" lastFinishedPulling="2025-11-27 17:22:34.601793403 +0000 UTC m=+4730.233242161" observedRunningTime="2025-11-27 17:22:36.297322805 +0000 UTC m=+4731.928771603" watchObservedRunningTime="2025-11-27 17:22:36.307394371 +0000 UTC m=+4731.938843159" Nov 27 17:22:39 crc kubenswrapper[4707]: I1127 17:22:39.547440 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-nnf8p" Nov 27 17:22:39 crc kubenswrapper[4707]: I1127 17:22:39.547800 4707 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-nnf8p" Nov 27 17:22:39 crc kubenswrapper[4707]: I1127 17:22:39.592928 4707 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-nnf8p" Nov 27 17:22:40 crc kubenswrapper[4707]: I1127 17:22:40.392518 4707 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-nnf8p" Nov 27 17:22:40 crc kubenswrapper[4707]: I1127 17:22:40.452197 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-nnf8p"] Nov 27 17:22:42 crc kubenswrapper[4707]: I1127 17:22:42.351512 4707 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-nnf8p" podUID="4b0793e3-9b96-4eec-8684-13738a55dd44" containerName="registry-server" containerID="cri-o://454f9a1a230d759856d640299fa86653f148d7140979e70c111614120fd0a686" gracePeriod=2 Nov 27 17:22:42 crc kubenswrapper[4707]: I1127 17:22:42.876053 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-nnf8p" Nov 27 17:22:42 crc kubenswrapper[4707]: I1127 17:22:42.921765 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hdbp6\" (UniqueName: \"kubernetes.io/projected/4b0793e3-9b96-4eec-8684-13738a55dd44-kube-api-access-hdbp6\") pod \"4b0793e3-9b96-4eec-8684-13738a55dd44\" (UID: \"4b0793e3-9b96-4eec-8684-13738a55dd44\") " Nov 27 17:22:42 crc kubenswrapper[4707]: I1127 17:22:42.921952 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4b0793e3-9b96-4eec-8684-13738a55dd44-utilities\") pod \"4b0793e3-9b96-4eec-8684-13738a55dd44\" (UID: \"4b0793e3-9b96-4eec-8684-13738a55dd44\") " Nov 27 17:22:42 crc kubenswrapper[4707]: I1127 17:22:42.922141 4707 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4b0793e3-9b96-4eec-8684-13738a55dd44-catalog-content\") pod \"4b0793e3-9b96-4eec-8684-13738a55dd44\" (UID: \"4b0793e3-9b96-4eec-8684-13738a55dd44\") " Nov 27 17:22:42 crc kubenswrapper[4707]: I1127 17:22:42.922762 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4b0793e3-9b96-4eec-8684-13738a55dd44-utilities" (OuterVolumeSpecName: "utilities") pod "4b0793e3-9b96-4eec-8684-13738a55dd44" (UID: "4b0793e3-9b96-4eec-8684-13738a55dd44"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 27 17:22:42 crc kubenswrapper[4707]: I1127 17:22:42.922989 4707 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4b0793e3-9b96-4eec-8684-13738a55dd44-utilities\") on node \"crc\" DevicePath \"\"" Nov 27 17:22:42 crc kubenswrapper[4707]: I1127 17:22:42.932085 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4b0793e3-9b96-4eec-8684-13738a55dd44-kube-api-access-hdbp6" (OuterVolumeSpecName: "kube-api-access-hdbp6") pod "4b0793e3-9b96-4eec-8684-13738a55dd44" (UID: "4b0793e3-9b96-4eec-8684-13738a55dd44"). InnerVolumeSpecName "kube-api-access-hdbp6". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 17:22:42 crc kubenswrapper[4707]: I1127 17:22:42.978766 4707 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4b0793e3-9b96-4eec-8684-13738a55dd44-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4b0793e3-9b96-4eec-8684-13738a55dd44" (UID: "4b0793e3-9b96-4eec-8684-13738a55dd44"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 27 17:22:43 crc kubenswrapper[4707]: I1127 17:22:43.025684 4707 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hdbp6\" (UniqueName: \"kubernetes.io/projected/4b0793e3-9b96-4eec-8684-13738a55dd44-kube-api-access-hdbp6\") on node \"crc\" DevicePath \"\"" Nov 27 17:22:43 crc kubenswrapper[4707]: I1127 17:22:43.025729 4707 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4b0793e3-9b96-4eec-8684-13738a55dd44-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 27 17:22:43 crc kubenswrapper[4707]: I1127 17:22:43.369507 4707 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-nnf8p" Nov 27 17:22:43 crc kubenswrapper[4707]: I1127 17:22:43.369355 4707 generic.go:334] "Generic (PLEG): container finished" podID="4b0793e3-9b96-4eec-8684-13738a55dd44" containerID="454f9a1a230d759856d640299fa86653f148d7140979e70c111614120fd0a686" exitCode=0 Nov 27 17:22:43 crc kubenswrapper[4707]: I1127 17:22:43.369514 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nnf8p" event={"ID":"4b0793e3-9b96-4eec-8684-13738a55dd44","Type":"ContainerDied","Data":"454f9a1a230d759856d640299fa86653f148d7140979e70c111614120fd0a686"} Nov 27 17:22:43 crc kubenswrapper[4707]: I1127 17:22:43.369686 4707 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nnf8p" event={"ID":"4b0793e3-9b96-4eec-8684-13738a55dd44","Type":"ContainerDied","Data":"7101fd9a9f0d2d1f56b2d2ed0dc8c0d91bf1ac482d35d6172688431b398fe2b3"} Nov 27 17:22:43 crc kubenswrapper[4707]: I1127 17:22:43.369749 4707 scope.go:117] "RemoveContainer" containerID="454f9a1a230d759856d640299fa86653f148d7140979e70c111614120fd0a686" Nov 27 17:22:43 crc kubenswrapper[4707]: I1127 17:22:43.406116 4707 scope.go:117] "RemoveContainer" containerID="1cf01f90883f16475811338411dcd2cc38142416a237b0ee97465a5bfa25e34e" Nov 27 17:22:43 crc kubenswrapper[4707]: I1127 17:22:43.409878 4707 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-nnf8p"] Nov 27 17:22:43 crc kubenswrapper[4707]: I1127 17:22:43.421002 4707 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-nnf8p"] Nov 27 17:22:43 crc kubenswrapper[4707]: I1127 17:22:43.780869 4707 scope.go:117] "RemoveContainer" containerID="cf956448ea93c63d6efeff8da739ad48cd8c02b3174ff745275e6d5f310fa462" Nov 27 17:22:43 crc kubenswrapper[4707]: I1127 17:22:43.838260 4707 scope.go:117] "RemoveContainer" 
containerID="454f9a1a230d759856d640299fa86653f148d7140979e70c111614120fd0a686" Nov 27 17:22:43 crc kubenswrapper[4707]: E1127 17:22:43.838797 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"454f9a1a230d759856d640299fa86653f148d7140979e70c111614120fd0a686\": container with ID starting with 454f9a1a230d759856d640299fa86653f148d7140979e70c111614120fd0a686 not found: ID does not exist" containerID="454f9a1a230d759856d640299fa86653f148d7140979e70c111614120fd0a686" Nov 27 17:22:43 crc kubenswrapper[4707]: I1127 17:22:43.838857 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"454f9a1a230d759856d640299fa86653f148d7140979e70c111614120fd0a686"} err="failed to get container status \"454f9a1a230d759856d640299fa86653f148d7140979e70c111614120fd0a686\": rpc error: code = NotFound desc = could not find container \"454f9a1a230d759856d640299fa86653f148d7140979e70c111614120fd0a686\": container with ID starting with 454f9a1a230d759856d640299fa86653f148d7140979e70c111614120fd0a686 not found: ID does not exist" Nov 27 17:22:43 crc kubenswrapper[4707]: I1127 17:22:43.838891 4707 scope.go:117] "RemoveContainer" containerID="1cf01f90883f16475811338411dcd2cc38142416a237b0ee97465a5bfa25e34e" Nov 27 17:22:43 crc kubenswrapper[4707]: E1127 17:22:43.839443 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1cf01f90883f16475811338411dcd2cc38142416a237b0ee97465a5bfa25e34e\": container with ID starting with 1cf01f90883f16475811338411dcd2cc38142416a237b0ee97465a5bfa25e34e not found: ID does not exist" containerID="1cf01f90883f16475811338411dcd2cc38142416a237b0ee97465a5bfa25e34e" Nov 27 17:22:43 crc kubenswrapper[4707]: I1127 17:22:43.839485 4707 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"1cf01f90883f16475811338411dcd2cc38142416a237b0ee97465a5bfa25e34e"} err="failed to get container status \"1cf01f90883f16475811338411dcd2cc38142416a237b0ee97465a5bfa25e34e\": rpc error: code = NotFound desc = could not find container \"1cf01f90883f16475811338411dcd2cc38142416a237b0ee97465a5bfa25e34e\": container with ID starting with 1cf01f90883f16475811338411dcd2cc38142416a237b0ee97465a5bfa25e34e not found: ID does not exist" Nov 27 17:22:43 crc kubenswrapper[4707]: I1127 17:22:43.839511 4707 scope.go:117] "RemoveContainer" containerID="cf956448ea93c63d6efeff8da739ad48cd8c02b3174ff745275e6d5f310fa462" Nov 27 17:22:43 crc kubenswrapper[4707]: E1127 17:22:43.839809 4707 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cf956448ea93c63d6efeff8da739ad48cd8c02b3174ff745275e6d5f310fa462\": container with ID starting with cf956448ea93c63d6efeff8da739ad48cd8c02b3174ff745275e6d5f310fa462 not found: ID does not exist" containerID="cf956448ea93c63d6efeff8da739ad48cd8c02b3174ff745275e6d5f310fa462" Nov 27 17:22:43 crc kubenswrapper[4707]: I1127 17:22:43.839835 4707 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cf956448ea93c63d6efeff8da739ad48cd8c02b3174ff745275e6d5f310fa462"} err="failed to get container status \"cf956448ea93c63d6efeff8da739ad48cd8c02b3174ff745275e6d5f310fa462\": rpc error: code = NotFound desc = could not find container \"cf956448ea93c63d6efeff8da739ad48cd8c02b3174ff745275e6d5f310fa462\": container with ID starting with cf956448ea93c63d6efeff8da739ad48cd8c02b3174ff745275e6d5f310fa462 not found: ID does not exist" Nov 27 17:22:45 crc kubenswrapper[4707]: I1127 17:22:45.211098 4707 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4b0793e3-9b96-4eec-8684-13738a55dd44" path="/var/lib/kubelet/pods/4b0793e3-9b96-4eec-8684-13738a55dd44/volumes"